Benda and Scherf (2020)25 Benda MS, Scherf KS. The complex emotion expression database: a validated stimulus set of trained actors. PLoS One. 2020;15(2):e0228248. https://doi.org/10.1371/journal.pone.0228248
Complex Emotion Expression Database (CEED)
1) Presentation of an equivalent photograph expressing the emotion; 2) Emotions elicited from specific situations
– Background: White
– Clothes: ND
– Distractors removed: ND
Accuracy ≥50%
796 volunteers recruited through MTurk
– Age: 34 years; SD=11.6
– Gender: M=403; F=388
– Race: ND
Chung et al. (2019)26 Chung KM, Kim S, Jung WH, Kim Y. Development and validation of the Yonsei face database (YFace DB). Front Psychol. 2019;10:2626. https://doi.org/10.3389/fpsyg.2019.02626
Yonsei Face Database (YFace DB)
1) Presentation of an equivalent photograph expressing the emotion; 2) Instruction on muscle movement of the emotions based on the FACS; 3) Emotions elicited from specific situations
Accuracy, intensity, and naturalness
212 students from Seoul University
– Age: 18-28 years
– Gender: M=97; F=115
– Race: ND
– Analysis of the items: Accuracy
– Precision: Accuracy
– Validity evidence: Content-based: Accuracy. Based on the relationship with other variables: ANOVA for difference in precision between genders of the stimuli and evaluators, t-test for difference in mean accuracy between genders and emotions, and post-hoc Bonferroni analysis for items with significant differences‡
Conley et al. (2018)16 Conley MI, Dellarco DV, Rubien-Thomas E, Cohen AO, Cervera A, Tottenham N, et al. The racially diverse affective expression (RADIATE) face stimulus set. Psychiatry Res. 2018;270:1059-67. https://doi.org/10.1016/j.psychres.2018.04.066
The Racially Diverse Affective Expression (RADIATE) face stimulus set
Presentation of an equivalent photograph expressing the emotion
Accuracy and Cohen's kappa
662 participants recruited through MTurk
– Age: 18-35 years (27.6 years; SD=3.8)
– Gender: M=402; F=260
– Race: Asian (n=48), Black/African-American (n=70), Caucasian (n=470), Hispanic (n=63), and others (n=11)
Dalrymple et al. (2013)27 Dalrymple KA, Gomez J, Duchaine B. The Dartmouth database of children's faces: acquisition and validation of a new face stimulus set. PLoS One. 2013;8(11):e79131. https://doi.org/10.1371/journal.pone.0079131
The Dartmouth Database of Children's Faces
Emotions elicited from specific situations
Images recognized with ≥70% accuracy
163 students and members of the Dartmouth College academic community
– Age: 19.6 years; SD=4.15
– Gender: M=67; F=96
– Race: ND
– Precision: Accuracy and Cohen's kappa among the evaluators
– Validity evidence: Content-based: Accuracy and Cohen's kappa among the evaluators. Based on the relationship with other variables: ANOVA for difference in precision between gender of the stimuli and evaluators‡
Donadon et al. (2019)28 Donadon MF, Martin-Santos R, Osório FL. Baby faces: development and psychometric study of a stimuli set based on babies' emotions. J Neurosci Methods. 2019;311:178-85. https://doi.org/10.1016/j.jneumeth.2018.10.021
Baby Faces
The parents were instructed and trained to provoke the intended emotions
ND
Rasch model to minimize floor and ceiling effects, with values from 0.50 to 1.50; rate of correct answers according to Kringelbach et al. (2008)64 Kringelbach ML, Lehtonen A, Squire S, Harvey AG, Craske MG, Holliday IE, et al. A specific and rapid neural signature for parental instinct. PLoS One. 2008;3(2):e1664. https://doi.org/10.1371/journal.pone.0001664
Validation: 119 volunteers from the community
– Age: 36 years; SD=12.8
– Gender: M=36.1%; F=63.9%
– Race: Caucasian (69.7%), Black (26.1%), and Japanese (4.2%)
Retest: 31 volunteers from the community
– Age: 38.06 years; SD=11.57
– Gender: M=35.5%; F=64.5%
– Race: Caucasian (74%), Black (19.5%), and Japanese (6.5%)
– Analysis of the items: Adjustment and difficulty of the items by the Rasch model
– Precision: Reliability (test-retest)
– Validity evidence: Content-based: Accuracy. Based on the relationship with other variables: ANCOVA to assess the differences between groups considering the sociodemographic variables (gender, race, schooling level of the adults, and gender and race of the faces in the stimulus)‡
Ebner et al. (2010)13 Ebner NC, Riediger M, Lindenberger U. FACES – a database of facial expressions in young, middle-aged, and older women and men: development and validation. Behav Res Methods. 2010;42(1):351-62. https://doi.org/10.3758/BRM.42.1.351
FACES – a life-span database of facial expressions
1) Emotion induction through photographs and videos; 2) Emotions elicited from specific situations
Agreement among evaluators for (1) purity of the facial expression and (2) high intensity of the facial expression
154 students
– Age: 20-81 years
– Gender: M=78; F=76
– Race: Caucasian
– Precision: Accuracy and consensus among the evaluators
– Validity evidence: Content-based: Accuracy and consensus among the evaluators. Based on the relationship with other variables: ANOVA for face age × evaluator's age × emotion expressed‡
Egger et al. (2011)29 Egger HL, Pine DS, Nelson E, Leibenluft E, Ernst M, Towbin KE, et al. The NIMH Child Emotional Faces Picture Set (NIMH-ChEFS): a new set of children's facial emotion stimuli. Int J Methods Psychiatr Res. 2011;20(3):145-56. https://doi.org/10.1002/mpr.343
NIMH Child Emotional Faces Picture Set (NIMH-ChEFS)
– Background: Gray
– Clothes: ND
– Distractors removed: ND
The cutoff point for the image to be included was that ≥15 evaluators identified the intended emotion
20 professors and employees of the Duke University Medical Center
– Age: 38.3 years
– Gender: M=7; F=13
– Race: ND
– Analysis of the items: Accuracy
– Difficulty of the items: Intensity and representativeness scores
– Precision: Agreement among the evaluators
– Validity evidence: Content-based: Accuracy and agreement among the evaluators
Ekman and Friesen (1976)30 Ekman P, Friesen WV. Pictures of facial affect. Palo Alto: Consulting Psychologists Press; 1976.
Pictures of Facial Affect (POFA)
Instruction on muscle movement of the emotions based on FACS
ND
ND
ND
ND
Fujimura and Umemura (2018)31 Fujimura T, Umemura H. Development and validation of a facial expression database based on the dimensional and categorical model of emotions. Cogn Emot. 2018;32(8):1663-70. https://doi.org/10.1080/02699931.2017.1419936
A facial expression database based on the dimensional and categorical model of emotions
1) Emotions elicited from specific situations; 2) Instruction on muscle movement of the emotions based on FACS
Agreement among the evaluators: mean of 69% agreement (SD=21%)
39 university students
Franz et al. (2021)32 Franz M, Müller T, Hahn S, Lundqvist D, Rampoldt D, Westermann JF, et al. Creation and validation of the Picture-Set of Young Children's Affective Facial Expressions (PSYCAFE). PLoS One. 2021;16(12):e0260871. https://doi.org/10.1371/journal.pone.0260871
Picture-Set of Young Children's Affective Facial Expressions (PSYCAFE)
1) Guidance of emotions in theater workshops; 2) Directed Facial Action Task used to guide the movement of anatomical landmarks
Step 1: Confirmatory hierarchical cluster analysis by Ward's method. Step 2: Intensity, authenticity, and likeability; accuracy (77-100%) and AFFDEX software
Step 1: 197 volunteers from the community
Garrido et al. (2017)33 Garrido MV, Lopes D, Prada M, Rodrigues D, Jerónimo R, Mourão RP. The many faces of a face: comparing stills and videos of facial expressions in eight dimensions (SAVE database). Behav Res Methods. 2017;49(4):1343-60. https://doi.org/10.3758/s13428-016-0790-5
Stills and Videos of facial Expressions (SAVE database)
Emotions elicited from specific situations
Stimuli with an assessment of 2.5 SD above or below the mean
120 university students
– Precision: Accuracy
– Validity evidence: Content-based: Accuracy and interest dimensions (valence, excitement, clarity, intensity, appeal, similarity, and familiarity). Based on the relationship with other variables: Accuracy × gender of the model and the participant
Giuliani et al. (2017)15 Giuliani NR, Flournoy JC, Ivie EJ, Von Hippel A, Pfeifer JH. Presentation and validation of the DuckEES child and adolescent dynamic facial expressions stimulus set. Int J Methods Psychiatr Res. 2017;26(1):e1553. https://doi.org/10.1002/mpr.1553
The DuckEES child and adolescent dynamic facial expressions stimulus set
Emotions elicited from specific situations
– Background: White
– Clothes: ND
– Distractors removed: ND
Images recognized with ≥70% accuracy
36 volunteers from the University of Oregon
– Age: 19.5 years; SD=1.95
– Gender: M=14; F=22
– Race: ND
Happy et al. (2015)34 Happy SL, Patnaik P, Routray A, Guha R. The Indian spontaneous expression database for emotion recognition. IEEE Transactions on Affective Computing. 2015;8(1):131-42. https://doi.org/10.1109/TAFFC.2015.2498174
The Indian Spontaneous Expression Database for Emotion Recognition (ISED)
Emotion induction through videos
– Background: ND
– Clothes: ND
– Distractors removed: ND
Agreement among the evaluators (Fleiss' kappa)
Four trained evaluators
– Age: ND
– Gender: M=2; F=2
– Race: ND
Kaulard et al. (2012)35 Kaulard K, Cunningham DW, Bülthoff HH, Wallraven C. The MPI facial expression database – a validated database of emotional and conversational facial expressions. PLoS One. 2012;7(3):e32321. https://doi.org/10.1371/journal.pone.0032321
The MPI Facial Expression Database
Emotions elicited from specific situations
Consistency among the evaluators (Fleiss' kappa)
20 German natives
– Age: 19-33 years
– Gender: M=10; F=10
– Race: ND
Keutmann et al. (2015)36 Keutmann MK, Moore SL, Savitt A, Gur RC. Generating an item pool for translational social cognition research: methodology and initial validation. Behav Res Methods. 2015;47(1):228-34. https://doi.org/10.3758/s13428-014-0464-0
Visual and vocal emotional expressions of adult and child actors
Emotions elicited from specific situations
– Background: Green
– Clothes: ND
– Distractors removed: ND
Accuracy
510 students: 226 from Drexel University and 284 from the University of Central Florida
– Age: ND
– Gender: ND
– Race: ND
Kim et al. (2017)37 Kim SM, Kwon YJ, Jung SY, Kim MJ, Cho YS, Kim HT, et al. Development of the Korean facial emotion stimuli: Korea University facial expression collection 2nd edition. Front Psychol. 2017;8:769. https://doi.org/10.3389/fpsyg.2017.00769
Korea University Facial Expression Collection – Second Edition (KUFEC-II)
Instruction on muscle movement of the emotions based on FACS
Internal consistency and accuracy
75 evaluators
– Precision: Accuracy
– Validity evidence: Content-based: Accuracy; agreement among the evaluators and scores for purity, valence, and intensity. Based on the relationship with other variables: ANOVA to test the effects of gender on recognition‡ and correlations between the participant's emotional state and task performance
Langner et al. (2010)38 Langner O, Dotsch R, Bijlstra G, Wigboldus DHJ, Hawk ST, van Knippenberg A. Presentation and validation of the Radboud Faces Database. Cognition and Emotion. 2010;24(8):1377-88. https://doi.org/10.1080/02699930903485076
Radboud Faces Database
Instruction on muscle movement of the emotions based on FACS
Accuracy
276 students from Radboud University
– Age: 21.2 years; SD=4.0
– Gender: M=38; F=238
– Race: ND
– Precision: Accuracy
– Validity evidence: Content-based: Accuracy and dimensions of interest (type of expression, intensity, clarity, genuineness, and valence). Based on the relationship with other variables: ANOVA comparing each of the precision variables with age, gender, expression, and gaze direction‡
LoBue and Thrasher (2015)14 LoBue V, Thrasher C. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults. Front Psychol. 2015;5:1532. https://doi.org/10.3389/fpsyg.2014.01532
The Child Affective Facial Expression (CAFE) set
Instruction on muscle movement of the emotions based on FACS was carried out during improvised games
– Background: White
– Clothes: White sheet
– Distractors removed: ND
Images recognized with ≥60% accuracy
100 undergraduate students from Rutgers University
– Age: ND
– Gender: M=50; F=50
– Race: African-American (17%), Asian (27%), White (30%), Latin (17%), and others (9%)
– Analysis of the items: Difficulty of the items by the Rasch model
– Precision: Test-retest reliability and accuracy
– Validity evidence: Content-based: Accuracy
Lundqvist et al. (1998)39 Lundqvist D, Flykt A, Öhman A. The Karolinska Directed Emotional Faces – KDEF (CD-ROM). Stockholm: Karolinska Institute, Department of Clinical Neuroscience, Psychology Section; 1998.
Karolinska Directed Emotional Faces (KDEF) Database
The participants were free to express the emotion as they wished
– Background: Neutral
– Clothes: Gray T-shirt
– Distractors removed: Beard, mustache, earrings, glasses, and makeup
ND
ND
ND
Ma et al. (2020)40 Ma J, Yang B, Luo R, Ding X. Development of a facial-expression database of Chinese Han, Hui and Tibetan people. Int J Psychol. 2020;55(3):456-64. https://doi.org/10.1002/ijop.12602
Han, Hui, and Tibetan Chinese facial expression database
1) Emotion induction through photographs and videos; 2) Instruction on muscle movement of the emotions based on FACS
Images recognized with ≥60% accuracy
Ma et al. (2015)41 Ma DS, Correll J, Wittenbrink B. The Chicago face database: a free stimulus set of faces and norming data. Behav Res Methods. 2015;47(4):1122-35. https://doi.org/10.3758/s13428-014-0532-5
Chicago Face Database (CFD)
1) Emotions expressed from verbal instructions; 2) Presentation of an equivalent photograph expressing the emotion
– Background: White
– Clothes: Gray T-shirt
– Distractors removed: ND
Two independent judges assessed how believable the expression was on a Likert scale from 1 to 9 (1=not at all believable; 9=very believable)
1,087 evaluators (convenience sample)
– Age: 26.7 years; SD=10.5
– Gender: M=308; F=552
– Race: White (n=516), Asian (n=117), Black (n=74), bi- or multi-race (n=72), Latin (n=57), others (n=18), and did not report (n=233)
Maack et al. (2017)42 Maack JK, Bohne A, Nordahl D, Livsdatter L, Lindahl ÅAW, Øvervoll M, et al. The Tromso Infant Faces Database (TIF): development, validation and application to assess parenting experience on clarity and intensity ratings. Front Psychol. 2017;8:409. https://doi.org/10.3389/fpsyg.2017.00409
The Tromso Infant Faces Database (TIF)
The parents were instructed to elicit the intended emotions with games and specific stimuli
The photographs with the best agreement among the evaluators were selected; mean classification of clarity and intensity below 2.5. Validation: (a) expression portrayed, (b) clarity of the expression, (c) intensity of the expression, and (d) valence of the expression
720 participants
– Precision: Accuracy
– Validity evidence: Content-based: dimensions of interest (type of expression, clarity, intensity, and valence). Based on the relationship with other variables: ANOVA to compare performance × child-rearing stage × gender × mood
Meuwissen et al. (2017)43 Meuwissen AS, Anderson JE, Zelazo PD. The creation and validation of the developmental emotional faces stimulus set. Behav Res Methods. 2017;49(3):960-6. https://doi.org/10.3758/s13428-016-0756-7
Developmental Emotional Faces Stimulus Set (DEFSS)
1) Emotions elicited from specific situations; 2) Presentation of an equivalent photograph expressing the emotion
Images recognized by fewer than 55% of the evaluators were excluded
228 university students at undergraduate and graduate levels, and children preappointed by the family via the Internet
Minear and Park (2004)44 Minear M, Park DC. A lifespan database of adult facial stimuli. Behav Res Methods Instrum Comput. 2004;36(4):630-3. https://doi.org/10.3758/bf03206543
A lifespan database of adult facial stimuli
Emotions expressed from verbal instructions
– Background: Gray
– Clothes: ND
– Distractors removed: ND
ND
ND
ND
Negrão et al. (2021)45 Negrão JG, Osorio AAC, Siciliano RF, Lederman VRG, Kozasa EH, D'Antino MEF, et al. The child emotion facial expression set: a database for emotion recognition in children. Front Psychol. 2021;12:666245. https://doi.org/10.3389/fpsyg.2021.666245
The Child Emotion Facial Expression Set
1) Presentation of an equivalent photograph expressing the emotion; 2) Emotions elicited from specific situations
– Background: White
– Clothes: White
– Distractors removed: ND
Step 1: 100% agreement between two evaluators. Step 2: 100% agreement between two other evaluators
Four judges (two at each step)
– Age: ND
– Gender: ND
– Race: ND
Novello et al. (2018)46 Novello B, Renner A, Maurer G, Musse S, Arteche A. Development of the youth emotion picture set. Perception. 2018;47(10-11):1029-42. https://doi.org/10.1177/0301006618797226
Youth Emotion Picture Set
1) Emotions elicited from specific situations; 2) Presentation of an equivalent photograph expressing the emotion; 3) Presentation of videos and a game to specifically elicit the emotion of anger
Images recognized with ≥75% accuracy
Adults: 101 volunteers recruited through the snowball method
O'Reilly et al. (2016)47 O'Reilly H, Pigat D, Fridenson S, Berggren S, Tal S, Golan O, et al. The EU-emotion stimulus set: a validation study. Behav Res Methods. 2016;48(2):567-76. https://doi.org/10.3758/s13428-015-0601-4
The EU-Emotion Stimulus Set
Emotions elicited from specific situations
– Background: White
– Clothes: ND
– Distractors removed: ND
Accuracy
1,231 volunteers
– Age: 44 years; SD=16.7
– Gender: M=428; F=803
– Race: ND
– Precision: Accuracy and Cohen's kappa
– Validity evidence: Content-based: performance comparison by expression type, valence, and excitation
Olszanowski et al. (2015)48 Olszanowski M, Pochwatko G, Kuklinski K, Scibor-Rylski M, Lewinski P, Ohme RK. Warsaw set of emotional facial expression pictures: a validation study of facial display photographs. Front Psychol. 2015;5:1516. https://doi.org/10.3389/fpsyg.2014.01516
Warsaw Set of Emotional Facial Expression Pictures (WSEFEP)
Instruction on muscle movement of the emotions based on FACS
Agreement in recognition
1,362 participants
– Age: 26.6 years; SD=11.6
– Gender: M=261; F=1,101
– Race: ND
Passarelli et al. (2018)49 Passarelli M, Masini M, Bracco F, Petrosino M, Chiorri C. Development and validation of the Facial Expression Recognition Test (FERT). Psychol Assess. 2018;30(11):1479-90. https://doi.org/10.1037/pas0000595
Facial Expression Recognition Test (FERT)
Presentation of an equivalent photograph expressing the emotion
– Background: Black
– Clothes: Black T-shirt
– Distractors removed: ND
Unidimensional model
794 volunteers from the community
– Validity evidence: Based on the internal structure: factor analysis through the two-parameter Bayesian model. Based on the relationship with other variables: performance comparison between gender and age‡
– Analysis of the items: Discrimination and difficulty through Item Response Theory (IRT)
Romani-Sponchiado et al. (2015)50 Romani-Sponchiado A, Sanvicente-Vieira B, Mottin C, Hertzog-Fonini D, Arteche A. Child Emotions Picture Set (CEPS): development of a database of children's emotional expressions. Psychology & Neuroscience. 2015;8(4):467-78. https://doi.org/10.1037/h0101430
Child Emotions Picture Set (CEPS)
Emotion induction through videos
– Background: ND
– Clothes: ND
– Distractors removed: ND
Images recognized with ≥60% accuracy
30 psychologists with experience in child development
– Age: ND
– Gender: ND
– Race: ND
– Precision: Accuracy and Fleiss' kappa
– Analysis of the items: Accuracy
– Validity evidence: Content-based: Fleiss' kappa; chi-square to compare the proportion of posed and spontaneous photographs
Samuelsson et al. (2012)51 Samuelsson H, Jarnvik K, Henningsson H, Andersson J, Carlbring P. The Umeå university database of facial expressions: a validation study. J Med Internet Res. 2012;14(5):e136. https://doi.org/10.2196/jmir.2196
Umeå University Database of Facial Expressions
Instruction on muscle movement of the emotions based on FACS
Accuracy
526 participants
Sharma and Bhushan (2019)52 Sharma U, Bhushan B. Development and validation of Indian Affective Picture Database. Int J Psychol. 2019;54(4):462-7. https://doi.org/10.1002/ijop.12471
Indian Affective Picture Database
1) Presentation of an equivalent photograph expressing the emotion; 2) Emotions elicited from specific situations
Accuracy and intensity (9-point scale)
350 undergraduate students
Tottenham et al. (2009)12 Tottenham N, Tanaka JW, Leon AC, McCarry T, Nurse M, Hare TA, et al. The NimStim set of facial expressions: judgments from untrained research participants. Psychiatry Res. 2009;168(3):242-9. https://doi.org/10.1016/j.psychres.2008.05.006
The NimStim set of facial expressions
Emotions expressed from verbal instructions
Validity (accuracy and Cohen's kappa) and reliability
Group 1: 47 university students
– Age: 19.4 years (SD=1.2)
– Gender: M=39; F=47
– Race: European-American (81%), African-American (6%), Asian-American (9%), and Hispanic-American (4%)
Group 2: 34 volunteers from the community
– Age: 25.8 years (SD=4.1)
– Gender: M=22; F=12
– Race: European-American (59%), African-American (18%), Asian-American (6%), Hispanic-American (6%), and other races (12%)
Tracy et al. (2009)53 Tracy JL, Robins RW, Schriber RA. Development of a FACS-verified set of basic and self-conscious emotion expressions. Emotion. 2009;9(4):554-9. https://doi.org/10.1037/a0015766
University of California, Davis, Set of Emotion Expressions (UCDS)
Instruction on muscle movement of the emotions based on FACS
Accuracy (the most recognized emotion of each expression was included in the final database)
Study 1: 175 undergraduate students
Vaiman et al. (2017)54 Vaiman M, Wagner MA, Caicedo E, Pereno GL. Development and validation of an Argentine set of facial expressions of emotion. Cogn Emot. 2017;31(2):249-60. https://doi.org/10.1080/02699931.2015.1098590
Argentine set of facial expressions of emotion
Emotions elicited from specific situations
Images recognized with ≥70% accuracy
466 students from the Psychology School of the National University of Córdoba
Yang et al. (2020)55 Yang T, Yang Z, Xu G, Gao D, Zhang Z, Wang H, et al. Tsinghua facial expression database – a database of facial expressions in Chinese young and older women and men: development and validation. PLoS One. 2020;15(4):e0231304. https://doi.org/10.1371/journal.pone.0231304
Tsinghua facial expression database
1) Emotions elicited from specific situations; 2) Instruction on muscle movement of the emotions based on FACS
– Background: White
– Clothes: ND
– Distractors removed: Tattoos, piercings, jewelry, glasses, and makeup
Images recognized with ≥70% accuracy
34 young individuals and 31 older adults, all Chinese
Young individuals:
– Age: 19-35 years (23.50 years; SD=4.41)
– Gender: M=19; F=15
– Race: Chinese
Older adults:
– Age: 58-72 years (65.06 years; SD=3.50)
– Gender: M=13; F=18
– Race: Chinese