
FACSGen: a tool to synthesize emotional facial expressions through systematic manipulation of facial action units


Roesch, E. B., Tamarit, L., Reveret, L., Grandjean, D., Sander, D. and Scherer, K. R. (2011) FACSGen: a tool to synthesize emotional facial expressions through systematic manipulation of facial action units. Journal of Nonverbal Behavior, 35 (1). pp. 1-16. ISSN 0191-5886


DOI: 10.1007/s10919-010-0095-9

Abstract/Summary

To investigate the perception of emotional facial expressions, researchers rely on shared sets of photos or videos, most often generated by actor portrayals. The drawback of such standardized material is a lack of flexibility and controllability, as it does not allow the systematic parametric manipulation of specific features of facial expressions on the one hand, and of more general properties of the facial identity (age, ethnicity, gender) on the other. To remedy this problem, we developed FACSGen: a novel tool that allows the creation of realistic synthetic 3D facial stimuli, both static and dynamic, based on the Facial Action Coding System. FACSGen provides researchers with total control over facial action units, and corresponding informational cues in 3D synthetic faces. We present four studies validating both the software and the general methodology of systematically generating controlled facial expression patterns for stimulus presentation.
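The abstract describes specifying facial expressions as combinations of Facial Action Coding System action units (AUs), each with a controllable intensity, and generating both static and dynamic stimuli. A minimal sketch of that idea is shown below; this is a hypothetical illustration, not FACSGen's actual API — the function and field names are invented, and the 0–1 intensity scale and linear onset ramp are assumptions.

```python
# Hypothetical sketch (not FACSGen's actual API): an expression as a set
# of FACS action-unit intensities, plus a linear onset ramp that turns a
# static target expression into per-frame values for a dynamic stimulus.

def au_ramp(target_intensity, n_frames):
    """Linearly interpolate an AU intensity from neutral (0) to target."""
    if n_frames < 2:
        return [target_intensity]
    step = target_intensity / (n_frames - 1)
    return [round(i * step, 4) for i in range(n_frames)]

# AU 6 (cheek raiser) + AU 12 (lip corner puller), a common smile
# configuration, each on an assumed 0-1 intensity scale.
expression = {"AU6": 0.7, "AU12": 1.0}

# Per-frame intensities for a 5-frame onset of the dynamic stimulus.
dynamic = {au: au_ramp(level, 5) for au, level in expression.items()}
```

The point of the sketch is only that parametric control means every AU's intensity and time course is an explicit number the experimenter sets, rather than a property of an actor's portrayal.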

Item Type: Article
Refereed: Yes
Divisions: Interdisciplinary centres and themes > Centre for Integrative Neuroscience and Neurodynamics (CINN); Faculty of Life Sciences > School of Psychology and Clinical Language Sciences > Department of Psychology
ID Code: 18474
Publisher: Springer


