Atkinson R., Christopherson R., Gonzalez Sanchez J., Chavez Echeagaray M. “Automated Detection of Affective States to Measure Learning Experience”. Companion of the 41st Conference of Research and Development by Tecnologico de Monterrey (Monterrey, Nuevo Leon, Mexico, January 19 – 21, 2011). January 2011.
Developed by the Learning Science Research Lab at Arizona State University.
Principal Investigator: Dr. Robert Atkinson.
Researchers: M. Robert Christopherson, MSc. Javier Gonzalez Sanchez, MSc. Maria Elena Chavez Echeagaray.
Research shows that learning is enhanced when empathy or support is present, and various studies have linked interpersonal relationships between teachers and students to increased student motivation over the long term. There is therefore great interest in developing systems that embed affective support into tutoring applications. Systems and devices that sense and perceive emotion (affect recognition) can provide customized instruction or feedback to students without the aid of human beings.
Our research focuses on the automated detection of affective states in a variety of learning contexts. This allows us to develop educational software that can perceive and interpret a student's emotional state and adapt its feedback and behavior with appropriate responses; in other words, educational software that creates an empathic relationship with the user. Here, "empathy" means an intellectual identification with, and vicarious experiencing of, the feelings, thoughts, or attitudes of the student. Being empathic implies that the computer has, or can acquire, the ability to attribute mental states (beliefs, desires, intentions, and knowledge) to the student and to understand the implications of those states.
Providing the computer with the ability to "perceive" feelings, thoughts, or attitudes requires additional sensing and perception mechanisms such as biofeedback and brain-computer interfaces, face-based emotion recognition systems, and eye-tracking systems. Using the information these mechanisms provide as input, it is possible to measure the user experience objectively, build user models that predict user behavior, and, in consequence, create educational software that interacts in a more human-like manner.
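To make the idea of combining several sensor channels into a single objective measure concrete, here is a minimal sketch. The channel names, calibration ranges, and weights are purely illustrative assumptions, not values from our study; a real system would learn these from data rather than fix them by hand.

```python
# Hypothetical sketch: fusing several affective-sensor channels into one
# engagement score. All channel names, ranges, and weights are made up
# for illustration; they do not come from the study described above.

def normalize(value, low, high):
    """Clamp a raw sensor reading into the [0, 1] range."""
    if high == low:
        return 0.0
    return min(max((value - low) / (high - low), 0.0), 1.0)

def engagement_score(readings, ranges, weights):
    """Weighted average of normalized sensor channels.

    readings: raw value per channel, e.g. {"eeg": 0.7, "gsr": 4.2}
    ranges:   (low, high) calibration bounds per channel
    weights:  relative importance per channel
    """
    total = sum(weights[ch] for ch in readings)
    score = sum(
        weights[ch] * normalize(readings[ch], *ranges[ch])
        for ch in readings
    )
    return score / total

# Example with made-up calibration values:
readings = {"eeg": 0.7, "gsr": 4.2, "pressure": 12.0}
ranges = {"eeg": (0.0, 1.0), "gsr": (1.0, 10.0), "pressure": (0.0, 20.0)}
weights = {"eeg": 0.5, "gsr": 0.3, "pressure": 0.2}
print(round(engagement_score(readings, ranges, weights), 3))
```

A weighted average is only the simplest possible fusion rule; in practice one would validate each channel against ground-truth annotations before trusting any combined score.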
These are my slides for the short paper presentation; any comments are more than welcome.
This video shows our study. Students were asked to play Guitar Hero© while we measured their emotional states using different affective sensors, including an Emotiv EEG headset, a face-based emotion recognition system, a skin conductance sensor, and pressure-sensitive buttons.