Advancing Next Generation Learning Environments Lab

Research Assistant, Learning Science Research Lab, ASU
February 2009 – May 2016
Principal Investigator: Dr. Robert Atkinson, Associate Professor.
Project: Automated Detection of Affective States to Measure Learning Experience


Abstract

Research shows that learning is enhanced when empathy or support is present. Various studies have linked interpersonal relationships between teachers and students to increased student motivation over the long term. There is therefore great interest in developing systems that embed affective support into tutoring applications. Systems and devices that sense and perceive a student's affect (affect recognition) can provide customized instruction or feedback without the aid of a human being.
Our research focuses on the automated detection of affective states in a variety of learning contexts. This allows us to develop educational software that can perceive and interpret the emotional state of a student and adapt its feedback and behavior with appropriate responses; in other words, educational software that creates an empathic relationship with the user. Here "empathy" can be described as an intellectual identification with, and vicarious experiencing of, the feelings, thoughts, or attitudes of the student. Being empathic implies that the computer has, or is able to acquire, the ability to attribute mental states (beliefs, desires, intentions, and knowledge) to the student and to understand the implications of those states.
Providing the computer with the ability to "perceive" feelings, thoughts, or attitudes requires additional sensing and perception mechanisms such as biofeedback devices, brain-computer interfaces, face-based emotion recognition systems, and eye-tracking systems. Using the information these mechanisms provide as input, it is possible to measure the user experience objectively, to build user models that predict user behavior, and, as a consequence, to create educational software that interacts in a more human-like manner.
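As a concrete illustration of this multimodal input, the sketch below (in Python, with entirely hypothetical channel names, weights, and thresholds; it is not our actual pipeline) fuses normalized readings from the four sensor types into a coarse affective label:

```python
from dataclasses import dataclass


@dataclass
class SensorFrame:
    """One synchronized sample from the (hypothetical) sensor channels."""
    skin_conductance: float   # biofeedback: arousal proxy, normalized 0..1
    engagement_index: float   # brain-computer interface, normalized 0..1
    face_valence: float       # face-based emotion recognition, -1..1
    fixation_on_task: float   # eye tracking: fraction of gaze on the task, 0..1


def estimate_affect(frame: SensorFrame) -> str:
    """Fuse the channels into a coarse affective label.

    Simple rule-based fusion for illustration only; a deployed detector
    would use a trained classifier over time windows, not single frames.
    """
    arousal = 0.6 * frame.skin_conductance + 0.4 * frame.engagement_index
    if frame.fixation_on_task < 0.3:
        return "disengaged"
    if arousal > 0.7 and frame.face_valence < 0.0:
        return "frustrated"
    if arousal > 0.5:
        return "engaged"
    return "neutral"


# Example: a high-arousal, negative-valence frame reads as frustration.
frame = SensorFrame(skin_conductance=0.8, engagement_index=0.7,
                    face_valence=-0.4, fixation_on_task=0.9)
print(estimate_affect(frame))  # -> frustrated
```

Whatever the internals, the input/output contract stays the same: synchronized sensor channels in, an affective state out.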

Research Team

Dr. Robert Atkinson, Robert Christopherson, Javier Gonzalez-Sanchez, Maria-Elena Chavez-Echeagaray, Angela Barrus, Kent Sabo, Andre Denham, Lin Lijia, Stacey Schink.

Video

This video, developed by ONR, describes the research focus of our group:



My work

My work focuses on the creation of learning environments that integrate biofeedback sensors, an eye-tracking system, and brain-computer interfaces to gather information on both the cognitive and affective dimensions of the user's experience, and to adjust the environment accordingly; a sketch of the adaptation step follows.
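The sketch below shows one way the environment could react to the kind of affective label produced by the detector sketched above (the action strings are placeholders, not our environment's actual behavior):

```python
def adapt_environment(affect: str) -> str:
    """Map a detected affective state to a tutoring-system response.

    The action strings are placeholders; an actual learning environment
    would change pacing, hints, or content difficulty via its own API.
    """
    actions = {
        "frustrated": "offer a hint and lower the difficulty of the next problem",
        "disengaged": "switch to a more interactive activity",
        "engaged": "maintain the current difficulty",
    }
    return actions.get(affect, "continue as planned")


# Example: react to the label produced by the affect detector.
print(adapt_environment("frustrated"))
```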

Web Page

You can read more about this project on our lab's website: lsrl.lab.asu.edu.