Human observers and automated assessment of dynamic emotional facial expressions: KDEF-dyn database validation

dc.contributor.author: Calvo, Manuel G.
dc.contributor.author: Fernández-Martín, Andrés
dc.contributor.author: Recio, Guillermo
dc.contributor.author: Lundqvist, Daniel
dc.date.accessioned: 2022-02-22T13:46:32Z
dc.date.available: 2022-02-22T13:46:32Z
dc.date.issued: 2018-10-26
dc.date.updated: 2022-02-22T13:46:32Z
dc.description.abstract: Most experimental studies of facial expression processing have used static stimuli (photographs), yet facial expressions in daily life are generally dynamic. In its original photographic format, the Karolinska Directed Emotional Faces (KDEF) database has been frequently utilized. In the current study, we validate a dynamic version of this database, the KDEF-dyn. To this end, we applied animation between neutral and emotional expressions (happy, sad, angry, fearful, disgusted, and surprised; 1,033-ms unfolding) to 40 KDEF models with morphing software. Ninety-six human observers categorized the expressions of the resulting 240 video-clip stimuli, and automated face analysis assessed the evidence for 6 expressions and 20 facial action units (AUs) at 31 intensities. Low-level image properties (luminance, signal-to-noise ratio, etc.) and other purely perceptual factors (e.g., size, unfolding speed) were controlled. Patterns of human recognition performance (accuracy, efficiency, and confusions) were consistent with prior research using static and other dynamic expressions. Automated assessment of expressions and AUs was sensitive to intensity manipulations. Significant correlations emerged between human observers' categorization and automated classification. The KDEF-dyn database aims to provide a balance between experimental control and ecological validity for research on emotional facial expression processing. The stimuli and the validation data are available to the scientific community.
dc.format.extent: 12 p.
dc.format.mimetype: application/pdf
dc.identifier.idgrec: 719222
dc.identifier.issn: 1664-1078
dc.identifier.uri: https://hdl.handle.net/2445/183420
dc.language.iso: eng
dc.publisher: Frontiers Media
dc.relation.isformatof: Reproduction of the document published at: https://doi.org/10.3389/fpsyg.2018.02052
dc.relation.ispartof: Frontiers in Psychology, 2018, vol. 9, p. 2052
dc.relation.uri: https://doi.org/10.3389/fpsyg.2018.02052
dc.rights: cc-by (c) Calvo, Manuel G. et al., 2018
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.source: Articles publicats en revistes (Psicologia Clínica i Psicobiologia)
dc.subject.classification: Expressió facial
dc.subject.classification: Reconeixement facial (Informàtica)
dc.subject.other: Facial expression
dc.subject.other: Human face recognition (Computer science)
dc.title: Human observers and automated assessment of dynamic emotional facial expressions: KDEF-dyn database validation
dc.type: info:eu-repo/semantics/article
dc.type: info:eu-repo/semantics/publishedVersion
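
The abstract above reports significant correlations between human observers' categorization and the automated classification of the same 240 video clips. Purely as an illustrative sketch, and not the authors' analysis code, the following Python snippet shows how such a clip-level Pearson correlation could be computed; the data and variable names are placeholders, not values from the KDEF-dyn release.

    import numpy as np
    from scipy.stats import pearsonr

    # Placeholder data: one row per video clip (240 clips: 6 expressions x 40 models).
    rng = np.random.default_rng(seed=0)
    n_clips = 240

    # Hypothetical measures (random placeholders, NOT the published data):
    # the proportion of observers who chose the intended label for each clip,
    # and the automated classifier's evidence score for that label.
    human_accuracy = rng.uniform(0.5, 1.0, n_clips)
    automated_evidence = np.clip(human_accuracy + rng.normal(0.0, 0.1, n_clips), 0.0, 1.0)

    # Clip-level Pearson correlation between human and automated measures.
    r, p = pearsonr(human_accuracy, automated_evidence)
    print(f"Pearson r = {r:.2f}, p = {p:.3g}")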

Files

Original bundle

Name: 719222.pdf
Size: 1.21 MB
Format: Adobe Portable Document Format