EmBody/EmFace

About the EmBody/EmFace

The EmBody/EmFace is a novel tool to measure how well a person recognizes emotion from body expressions (EmBody) and facial expressions (EmFace).


When to use this test

The EmBody/EmFace can be used to explore emotion recognition in healthy subjects and clinical populations such as:

  • autism spectrum disorders
  • schizophrenia
  • affective disorders such as depression
  • borderline personality disorder
  • social anxiety disorder
  • … and many more.

We recommend including a suitable control group when administering the EmBody/EmFace.


Administration time

10 minutes (5 minutes per task)


Task format

The EmBody/EmFace uses a three-option, forced-choice response format: for each stimulus, subjects indicate whether the body or facial expression shown was angry, happy, or neutral. The test program logs each response.


Measures

The EmBody/EmFace automatically generates various measures, including the following (illustrated by the brief sketch after this list):

  • raw hits X: number of trials on which the subject chose answer X and X was the correct answer
  • omissions X: number of trials on which X was the correct answer but the subject did not choose it
  • false alarms X: number of trials on which the subject chose answer X although the correct answer was Y or Z
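
As an illustration of how these three counts relate, here is a minimal sketch in Python that tallies hits, omissions, and false alarms from a list of (correct answer, chosen answer) pairs. The trial data shown are made up for illustration; the actual EmBody/EmFace log format may differ.

    from collections import Counter

    # Hypothetical trial log: each entry pairs the correct answer with the
    # subject's chosen answer (three-option forced choice).
    CATEGORIES = ("angry", "happy", "neutral")
    trials = [
        ("angry", "angry"),
        ("happy", "neutral"),
        ("neutral", "neutral"),
        # ... one entry per trial
    ]

    hits = Counter()          # subject chose X and X was correct
    omissions = Counter()     # X was correct but the subject chose something else
    false_alarms = Counter()  # subject chose X although Y or Z was correct

    for correct, response in trials:
        if response == correct:
            hits[correct] += 1
        else:
            omissions[correct] += 1
            false_alarms[response] += 1

    for category in CATEGORIES:
        print(f"{category}: hits={hits[category]}, omissions={omissions[category]}, "
              f"false alarms={false_alarms[category]}")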


Data analysis

To analyze results from the EmBody/EmFace, users have two options:

  1. Save the automatically generated results table

    Results are computed automatically and displayed on the screen at the end of the EmBody/EmFace. The results table is not saved automatically. Users are therefore advised to save the results by printing the respective page or copying the table into an empty document (e.g., a Microsoft Word or Excel file).
    Alternatively, the data shown in the output table and other outcomes of interest can be computed from the raw data (see 2 below).


  2. Download raw data

    A CSV file that logs all experiment data, such as button presses and response times, can be downloaded at the end of the experiment; a minimal example of reading this file is sketched below.
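
    As a sketch only: the Python snippet below loads a downloaded raw-data file and computes the number of correct responses and the mean response time. The file name and the column names correct_answer, response, and response_time are assumptions for illustration; check the header of the actual CSV and adapt them accordingly.

        import csv

        # Hypothetical column names -- inspect the header of the downloaded
        # file and adjust these constants to match the actual export.
        CORRECT_COL = "correct_answer"
        RESPONSE_COL = "response"
        RT_COL = "response_time"

        with open("embody_emface_raw.csv", newline="") as f:
            rows = list(csv.DictReader(f))

        # Count trials on which the chosen answer matched the correct answer,
        # and average the response times across all trials.
        n_correct = sum(1 for row in rows if row[RESPONSE_COL] == row[CORRECT_COL])
        mean_rt = sum(float(row[RT_COL]) for row in rows) / len(rows)

        print(f"Trials: {len(rows)}")
        print(f"Correct responses: {n_correct}")
        print(f"Mean response time: {mean_rt:.3f}")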



Publication

Lott, L.L.*, Spengler, F.B.*, Stächele, T., Schiller, B., & Heinrichs, M. (2022). EmBody/EmFace as a new open tool to assess emotion recognition from body and face expressions. Scientific Reports, 12, 14165. doi: 10.1038/s41598-022-17866-w

* shared first co-authorship






