Facial emotion recognition from feature loss media: human versus machine learning algorithms
Dube, D. Y., Sannasi, M. V., Kyritsis, M.
DOI: 10.1016/j.chb.2025.108806

Abstract: The automatic identification of human emotion from low-resolution cameras is important for remote monitoring, interactive software, proactive marketing, and dynamic customer experience management. Although facial identification and emotion classification are active fields of research, no studies, to the best of our knowledge, have compared the performance of humans and Machine Learning Algorithms (MLAs) when classifying facial emotions from media suffering from systematic feature loss. In this study, we used singular value decomposition (SVD) to systematically reduce the number of features contained within facial emotion images. Human participants were asked to identify the facial emotion in onscreen images whose granularity was varied in a stepwise manner (from low to high). By clicking a button, participants added feature vectors until they were confident that they could categorise the emotion. The results of the human performance trials were compared against those of a Convolutional Neural Network (CNN), which classified facial emotions from the same images. Findings showed that human participants were able to cope with significantly greater levels of granularity, achieving 85% accuracy with only three singular image vectors; humans were also faster when classifying happy faces. CNNs matched human accuracy on mid- and high-resolution images, reaching 80% accuracy at twelve or more singular image vectors. The authors believe that this comparison of the differences and limitations of humans and MLAs is critical to (i) the effective use of CNNs with lower-resolution video, and (ii) the development of usable facial recognition heuristics.
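The stepwise feature reduction described in the abstract can be illustrated with a truncated SVD: an image matrix is decomposed, and only the top-k singular vectors are kept for reconstruction. This is a minimal sketch of the general technique, not the authors' actual pipeline; the image sizes and function names are illustrative assumptions.

```python
import numpy as np

def truncate_svd(image: np.ndarray, k: int) -> np.ndarray:
    """Reconstruct a grayscale image from its top-k singular vectors.

    Small k -> coarse, feature-poor image; larger k -> finer detail,
    mirroring the stepwise granularity increase described in the study.
    """
    U, s, Vt = np.linalg.svd(image, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Illustrative example: degrade a random 48x48 "image" (size assumed)
# to three singular vectors, the level at which humans reportedly
# reached 85% accuracy.
rng = np.random.default_rng(0)
img = rng.random((48, 48))
low_rank = truncate_svd(img, 3)
```

Raising k (e.g. to twelve, the level at which the CNN reportedly reached 80% accuracy) restores progressively more of the original image's variance.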