The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software.

  • Additional Information
    • Author-Supplied Keywords:
      Emotion expression
      Measurement of facial movement
      Open-source software
      Optical measurement
      Video data analysis
    • Abstract:
      This article proposes an optical measurement of movement applied to data from video recordings of facial expressions of emotion. The approach offers a way to capture motion adapted from the film industry in which markers placed on the skin of the face can be tracked with a pattern-matching algorithm. The method records and postprocesses raw facial movement data (coordinates per frame) of distinctly placed markers and is intended for use in facial expression research (e.g., microexpressions) in laboratory settings. Due to the explicit use of specifically placed, artificial markers, the procedure offers the simultaneous measurement of several emotionally relevant markers in a (psychometrically) objective and artifact-free way, even for facial regions without natural landmarks (e.g., the cheeks). In addition, the proposed procedure is fully based on open-source software and is transparent at every step of data processing. Two worked examples demonstrate the practicability of the proposed procedure: In Study 1 (N = 39), the participants were instructed to show the emotions happiness, sadness, disgust, and anger, and in Study 2 (N = 113), they were asked to present both a neutral face and the emotions happiness, disgust, and fear. Study 2 involved the simultaneous tracking of 14 markers for approximately 12 min per participant with a time resolution of 33 ms. The measured facial movements corresponded closely to the assumptions of established measurement instruments (EMFACS, FACSAID, Friesen & Ekman, 1983; Ekman & Hager, 2002). In addition, the measurement was found to be very precise with sub-second, sub-pixel, and sub-millimeter accuracy. [ABSTRACT FROM AUTHOR]
    • Copyright:
      Copyright of Behavior Research Methods is the property of Springer Nature and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
    • Author Affiliations:
      1 Personality, Psychological Assessment, and Psychological Methods, University of Koblenz-Landau, Fortstr. 7, 76829, Landau, Germany
      2 Abteilung für Angewandte Psychologie und Methodenforschung, Alpen-Adria-Universität Klagenfurt, Universitätsstraße 65-67, 9020, Klagenfurt am Wörthersee, Austria
    • Full Text Word Count:
      15997
    • ISSN:
      1554-351X
    • DOI:
      10.3758/s13428-018-1085-9
    • Accession Number:
      136128622
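
    • Illustrative postprocessing sketch:
      The abstract above describes tracking facial markers frame by frame and postprocessing the raw coordinates. As a purely illustrative sketch (not the authors' published pipeline), the Python snippet below shows one way such per-frame pixel coordinates could be expressed as displacement from a neutral-face baseline and scaled to millimeters via a known reference distance; all function names, parameters, and values are hypothetical.

      import numpy as np

      def marker_displacement_mm(xy_pixels, baseline_xy, px_per_mm):
          """Per-frame displacement of one tracked marker from its neutral-face
          position, converted from pixels to millimeters.

          xy_pixels   : (n_frames, 2) tracked x/y pixel coordinates
          baseline_xy : (2,) pixel coordinates of the marker in the neutral face
          px_per_mm   : e.g., pixel distance between two reference markers divided
                        by their known physical distance in mm (hypothetical scaling)
          """
          deltas = np.asarray(xy_pixels, dtype=float) - np.asarray(baseline_xy, dtype=float)
          return deltas / px_per_mm

      # Example: one marker drifting upward over four frames recorded at ~33-ms resolution
      coords = [(120.0, 200.0), (120.2, 198.9), (120.5, 197.4), (120.6, 196.1)]
      disp = marker_displacement_mm(coords, baseline_xy=(120.0, 200.0), px_per_mm=8.0)
      print(disp)  # x/y displacement in mm for each frame

      Such a baseline-and-scaling step is one plausible reading of how raw marker coordinates can be made comparable across participants; the original article and its accompanying open-source materials should be consulted for the actual procedure.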
  • Citations
    • ABNT:
      ZINKERNAGEL, A. et al. The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software. Behavior Research Methods, [s. l.], v. 51, n. 2, p. 747–768, 2019. DOI 10.3758/s13428-018-1085-9. Disponível em: http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=cxh&AN=136128622. Acesso em: 27 out. 2020.
    • AMA:
      Zinkernagel A, Alexandrowicz RW, Lischetzke T, Schmitt M. The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software. Behavior Research Methods. 2019;51(2):747-768. doi:10.3758/s13428-018-1085-9
    • APA:
      Zinkernagel, A., Alexandrowicz, R. W., Lischetzke, T., & Schmitt, M. (2019). The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software. Behavior Research Methods, 51(2), 747–768. https://doi.org/10.3758/s13428-018-1085-9
    • Chicago/Turabian: Author-Date:
      Zinkernagel, Axel, Rainer W. Alexandrowicz, Tanja Lischetzke, and Manfred Schmitt. 2019. “The BlenderFace Method: Video-Based Measurement of Raw Movement Data during Facial Expressions of Emotion Using Open-Source Software.” Behavior Research Methods 51 (2): 747–68. doi:10.3758/s13428-018-1085-9.
    • Harvard:
      Zinkernagel, A. et al. (2019) ‘The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software’, Behavior Research Methods, 51(2), pp. 747–768. doi: 10.3758/s13428-018-1085-9.
    • Harvard: Australian:
      Zinkernagel, A, Alexandrowicz, RW, Lischetzke, T & Schmitt, M 2019, ‘The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software’, Behavior Research Methods, vol. 51, no. 2, pp. 747–768, viewed 27 October 2020.
    • MLA:
      Zinkernagel, Axel, et al. “The BlenderFace Method: Video-Based Measurement of Raw Movement Data during Facial Expressions of Emotion Using Open-Source Software.” Behavior Research Methods, vol. 51, no. 2, Apr. 2019, pp. 747–768. EBSCOhost, doi:10.3758/s13428-018-1085-9.
    • Chicago/Turabian: Humanities:
      Zinkernagel, Axel, Rainer W. Alexandrowicz, Tanja Lischetzke, and Manfred Schmitt. “The BlenderFace Method: Video-Based Measurement of Raw Movement Data during Facial Expressions of Emotion Using Open-Source Software.” Behavior Research Methods 51, no. 2 (April 2019): 747–68. doi:10.3758/s13428-018-1085-9.
    • Vancouver/ICMJE:
      Zinkernagel A, Alexandrowicz RW, Lischetzke T, Schmitt M. The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software. Behavior Research Methods [Internet]. 2019 Apr [cited 2020 Oct 27];51(2):747–68. Available from: http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=cxh&AN=136128622