Short Talk (CUbiC 2011): Building an Emotion Recognition Framework

Center for Cognitive Ubiquitous Computing (CUbiC) Seminar Series.
Tempe, Arizona, USA. April 2011.
Short Talk
Friday (April 15) — ASU Tempe Main Campus. Room 380 of the Brickyard building.


Abstract

The ability of computers to recognize human emotional states from physiological signals is increasingly being applied to create empathetic systems such as learning environments, health-care systems, and video games. Despite that, there are few frameworks, libraries, architectures, or software tools that allow system developers to easily integrate emotion recognition into their software projects.

In that context, our work (in collaboration with the Motivational Environment Group, the Learning Science Research Lab, and the Affective Metatutor Group) addresses the construction of a multimodal emotion recognition framework that helps third-party systems become empathetic. Our approach combines several modalities: brain-computer interfaces, eye tracking, face-based emotion recognition, and sensors that measure physiological signals such as skin conductivity, posture, and finger pressure.
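
To make the idea concrete, here is a minimal sketch of how per-modality emotion estimates might be fused into a single label for a client application. All names below are hypothetical illustrations chosen for this example, not our framework's actual API.

from dataclasses import dataclass

EMOTIONS = ("boredom", "frustration", "engagement", "neutral")

@dataclass
class ModalityEstimate:
    """One modality's probability distribution over a shared emotion label set."""
    source: str       # e.g. "eeg", "eye_tracker", "face", "skin_conductivity"
    probs: dict       # emotion label -> probability
    confidence: float # how much to trust this modality right now

def fuse(estimates):
    """Confidence-weighted average of the modality distributions."""
    fused = {emotion: 0.0 for emotion in EMOTIONS}
    total = sum(est.confidence for est in estimates) or 1.0
    for est in estimates:
        for emotion in EMOTIONS:
            fused[emotion] += est.confidence * est.probs.get(emotion, 0.0) / total
    # Return the single most likely emotion label.
    return max(fused, key=fused.get)

# Example: the face channel is confident, the EEG channel less so.
face = ModalityEstimate("face", {"frustration": 0.7, "neutral": 0.3}, confidence=0.9)
eeg = ModalityEstimate("eeg", {"engagement": 0.6, "neutral": 0.4}, confidence=0.4)
print(fuse([face, eeg]))  # -> "frustration"

A confidence-weighted average is only the simplest fusion scheme; in practice, each modality's confidence would be derived from signal quality (for instance, eye-tracker calibration drift or EEG noise).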

Come to our talk and let us share some hands-on demos and ideas with you.

Slides

These are my slides for the short talk; any comments are more than welcome.