
Emotion recognition across visual and auditory modalities in autism spectrum disorder: a systematic review and meta-analysis

Yik Nam Leung, F., Sin, J., Dawson, C., Ong, J. H. ORCID: https://orcid.org/0000-0003-1503-8311, Zhao, C., Veić, A. and Liu, F. ORCID: https://orcid.org/0000-0002-7776-0222 (2022) Emotion recognition across visual and auditory modalities in autism spectrum disorder: a systematic review and meta-analysis. Developmental Review, 63. 101000. ISSN 0273-2297

Text - Accepted Version
· Available under License Creative Commons Attribution Non-commercial No Derivatives.


It is advisable to refer to the publisher's version if you intend to cite from this work.

To link to this item, use DOI: 10.1016/j.dr.2021.101000

Abstract/Summary

An expanding literature has investigated emotion recognition across visual and auditory modalities in autism spectrum disorder (ASD). Findings, however, have been highly variable. The present work systematically reviewed and quantitatively synthesised a large body of literature to determine whether autistic individuals differ from their neurotypical (NT) counterparts in emotion recognition across human face, nonhuman face, speech, and music domains. To identify eligible studies, the literature was searched using Embase, Medline, PubMed, Web of Science, and Google Scholar. Synthesising data from 72 papers, results showed a general difficulty with emotion recognition accuracy in ASD, while autistic individuals also showed longer response times than their NT counterparts for a subset of emotions (i.e., anger, fear, sadness, and the six-emotion composite). These impairments were robust, as they were not driven by differences in stimulus presentation time restriction or IQ matching, though they were less pronounced for a subset of emotions when full-scale IQ matching (i.e., anger, fear, happiness, sadness, and disgust) or verbal IQ matching (i.e., anger, fear, sadness, and disgust) had been undertaken. The heterogeneity among studies arose from a combination of sample characteristics (i.e., age but not IQ) and experimental design (i.e., stimulus domain and task demand) parameters. Specifically, we show that (i) impairments were more pronounced in autistic adults; (ii) full-scale, verbal, and nonverbal IQ did not moderate impairments; (iii) emotion-general impairments were found for human faces but emotion-specific impairments were observed for speech prosody (i.e., anger, happiness, and disgust) and music (i.e., fear and sadness), while no impairment was observed for nonhuman faces; and (iv) impairments were found across emotions for verbal but not nonverbal tasks.
Importantly, further research on the recognition of prosodic, musical, and nonhuman facial emotions is warranted, as the current findings are disproportionately influenced by studies on human faces. Future studies should also continue to explore the different emotion processing strategies employed by autistic individuals, which could be fundamental to promoting fulfilling emotional experiences in real life.

Item Type: Article
Refereed: Yes
Divisions: Interdisciplinary Research Centres (IDRCs) > Centre for Integrative Neuroscience and Neurodynamics (CINN)
Interdisciplinary centres and themes > ASD (Autism Spectrum Disorders) Research Network
Life Sciences > School of Psychology and Clinical Language Sciences > Department of Psychology
Life Sciences > School of Psychology and Clinical Language Sciences > Development
Life Sciences > School of Psychology and Clinical Language Sciences > Perception and Action
ID Code: 101079
Publisher: Elsevier

