An Explainable and Reliable Facial Expression Recognition System for Remote Health Monitoring
Abstract
Remote Health Monitoring (RHM) has emerged as a promising solution for continuous and improved healthcare, especially in rural areas where basic medical facilities are scarce or often out of reach. Existing systems generally focus on monitoring physiological signals (e.g., ECG, EEG) or activities, while the overall emotional state of remotely monitored patients is often overlooked because of the complexity of this task. It is therefore crucial to propose RHM-suitable technologies for explainable and reliable emotional state recognition. In this work, a free-position model for detecting emotions and facial action units (AUs) is proposed. The proposed method has been validated on the public CK+ dataset and shows very promising results in terms of the explainability of the obtained results, with performance competitive with the state of the art (an average accuracy of 93.4% over all AUs). In addition, the lightweight architecture of the proposed method and its low number of parameters, compared with other FER models, make it suitable for resource-constrained embedded systems and open the way for its wide adoption in remote health monitoring applications.