Comparing Computer-Face-Based Emotion Recognition with Human Emotion Perception

I am presenting a short talk at the 40th Conference of Research and Development by Tecnologico de Monterrey (CIDTEC) in Monterrey, Nuevo Leon, Mexico, January 20–22, 2010.

Summary

Emotions are a form of nonverbal communication that reflects our physiological and mental state. We express emotions when dealing with everything around us, even with our computers. As we become more dependent on computers in our lives, we need to design more interactive systems. In other words, we need to adapt computers to our needs as well as to our behavior: we need to make computers emotionally intelligent, so that they can detect our mood and make appropriate decisions.

For this project, our team developed an interactive museum application that incorporates facial gesture recognition and human affect recognition to detect and display a subject's emotions. Built on software from the MIT Media Lab, the MindReader API enables the real-time analysis, tagging, and inference of cognitive-affective mental states from facial video. The API presents a computational model of "mind reading" as a framework for machine perception and mental state recognition. This framework combines bottom-up, vision-based processing of the face with top-down predictions from mental state models to interpret the meaning underlying head and facial signals over time.

Our application was exhibited in the museum for several months. The exhibit requires two simultaneous users: a subject and an observer. The subject is asked to complete certain computer-based activities while being recorded with a webcam, so that his or her expressions can be captured and analyzed. The observer is asked to indicate each time he or she thinks the subject is expressing a certain emotion, e.g., concentration. At the end of the experience, the exhibit shows a screen that displays the video of the subject together with the opinions of both the observer and the computer regarding the subject's emotions throughout the activity. Through an analysis of the recorded data, we are comparing the accuracy of the computer's perception with the perception of the human observer. Our efforts with the Exploratorium are ongoing, and we expect to publish conclusions about these results in the near future.
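To make the bottom-up/top-down idea more concrete, here is a minimal sketch of that style of inference. It is not the MindReader API itself: the state names, facial displays, and probability tables are hypothetical illustrations, and the real system models much richer temporal dynamics.

```python
# A minimal sketch (not the MindReader API) of bottom-up/top-down fusion:
# per-frame facial-display likelihoods (bottom-up evidence) are combined with
# a belief carried over from previous frames (top-down prediction) to score
# mental states. All names and numbers below are hypothetical.

STATES = ["concentrating", "interested", "confused"]

# P(observed facial display | mental state) -- hypothetical likelihoods.
LIKELIHOOD = {
    "brow_furrow": {"concentrating": 0.7, "interested": 0.2, "confused": 0.6},
    "smile":       {"concentrating": 0.1, "interested": 0.6, "confused": 0.1},
    "head_tilt":   {"concentrating": 0.2, "interested": 0.4, "confused": 0.7},
}

def update(prior, displays):
    """One inference step: weight the running belief (top-down) by the
    likelihood of the facial displays detected this frame (bottom-up)."""
    posterior = {}
    for state in STATES:
        p = prior[state]
        for d in displays:
            p *= LIKELIHOOD[d][state]
        posterior[state] = p
    total = sum(posterior.values()) or 1.0
    return {s: p / total for s, p in posterior.items()}

# Start from a uniform belief and process two frames of detected displays.
belief = {s: 1.0 / len(STATES) for s in STATES}
for frame_displays in [["brow_furrow"], ["brow_furrow", "head_tilt"]]:
    belief = update(belief, frame_displays)
print(max(belief, key=belief.get), belief)
```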
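And here is a rough sketch of one way such a comparison could be quantified (the abstract does not specify the measure we use): align the observer's indications and the computer's inferences on a common per-second timeline, then compute Cohen's kappa, a chance-corrected agreement score. The label sequences below are made up purely for illustration.

```python
# One way to compare human and computer perception: align both opinion
# streams on a shared timeline and measure chance-corrected agreement.
from collections import Counter

def cohens_kappa(a, b):
    """Chance-corrected agreement between two equal-length label sequences."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[l] * cb[l] for l in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected) if expected < 1 else 1.0

# One label per second: what the observer marked vs. what the computer
# inferred (hypothetical data).
observer = ["neutral", "concentrating", "concentrating", "neutral", "concentrating"]
computer = ["neutral", "concentrating", "neutral", "neutral", "concentrating"]
print(f"kappa = {cohens_kappa(observer, computer):.2f}")
```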

Slides

These are my slides for the short paper presentation; any comments are more than welcome.

Source

Research Project.

Companion

40th Conference of Research and Development by Tecnologico de Monterrey

Reference

Burleson, W., Muldner, K., Gonzalez Sanchez, J., Chavez Echeagaray, M., Lu, P., and Freed, N. (2010). Comparing computer-face-based emotion recognition with human emotion perception. In Companion of the 40th Conference of Research and Development by Tecnologico de Monterrey, Monterrey, Nuevo Leon, Mexico, January 20–22, 2010, p. 594. ISBN: 978-607-501-0007.