I am teaching a new course titled Applied Affective Computing (CSC 570) at California Polytechnic State University this Spring. This is a graduate-level course for students in Computer Science and Software Engineering programs.
In this course, we will design and implement affect-driven intelligent systems. The course includes hands-on use of devices to detect emotions, including brain-computer interfaces, gesture- and posture-based affect recognition, eye-tracking, and physiological sensors, along with the machine learning workflow needed to build emotion awareness: gathering, filtering, and integrating affective data. Affect (emotions and moods) is inextricably related to human cognitive processes and expresses a great deal about human necessities; affect signals what matters to us and what we care about. Furthermore, affect impacts our rational decision-making and action selection. Providing computers with the capability to recognize, understand, and respond to human affective states would narrow the communication gap between the highly emotional human and the emotionally detached computer, enhancing their interactions. Computer applications in learning, healthcare, and entertainment stand to benefit from such capabilities. This course provides hands-on experience using devices and methodologies for automatically detecting affective states with a multimodal approach. Students will review data samples collected in experimental studies and build affect-driven intelligent applications for the learning, entertainment, or healthcare domains.
This is an opportunity for students to apply their software engineering skills, learn about machine learning techniques applied to process human physiological data, understand the importance of human-centered software applications, and be exposed to a variety of sensors and human-computer interaction concepts.
- Recognize the importance of human cognitive and affective factors while building software applications.
- Explore the use of devices to detect affective states, including brain-computer interfaces, gesture-based and posture-based affect recognition, eye-tracking, and physiological sensors.
- Compare the pros and cons of the various devices, the data gathered from each, and their characteristics.
- Explain what it takes to gather, filter, and integrate affective data from a variety of sources and the challenges of Machine Learning applied to physiological data.
- Integrate affective data from a variety of sources and correlated stimuli for user modeling.
- Develop affect-driven intelligent systems.
- Produce new ideas based on a systematic review of milestone papers on User Modeling.
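A recurring challenge behind these objectives is fusing affective signals sampled at different rates before any user modeling can happen. Below is a minimal sketch of nearest-timestamp alignment in pure Python; the sensor names and sampling rates (EDA at ~4 Hz, heart rate at ~1 Hz) are illustrative assumptions, not a prescribed pipeline:

```python
import bisect

def align_nearest(timestamps, values, query_ts):
    """For each query timestamp, return the sensor value whose
    timestamp is closest (nearest-neighbor alignment).
    `timestamps` must be sorted ascending."""
    aligned = []
    for t in query_ts:
        i = bisect.bisect_left(timestamps, t)
        # Candidates: the sample just before and just after t.
        candidates = []
        if i > 0:
            candidates.append(i - 1)
        if i < len(timestamps):
            candidates.append(i)
        best = min(candidates, key=lambda j: abs(timestamps[j] - t))
        aligned.append(values[best])
    return aligned

# Hypothetical streams: electrodermal activity (EDA) at ~4 Hz,
# heart rate (HR) at ~1 Hz, timestamps in seconds.
eda_ts = [0.00, 0.25, 0.50, 0.75, 1.00, 1.25, 1.50, 1.75, 2.00]
eda    = [0.31, 0.33, 0.35, 0.40, 0.42, 0.41, 0.39, 0.38, 0.37]
hr_ts  = [0.0, 1.0, 2.0]
hr     = [72, 75, 74]

# Project heart rate onto the EDA timeline, then fuse per-sample.
hr_on_eda = align_nearest(hr_ts, hr, eda_ts)
fused = list(zip(eda_ts, eda, hr_on_eda))
```

Real multimodal pipelines typically add interpolation, filtering, and artifact rejection on top of this kind of temporal alignment, which is part of what makes machine learning on physiological data challenging.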
This course originated from a tutorial that I presented years ago at CHI, the ACM's premier international Conference on Human Factors in Computing Systems. It intersects strongly with Human-Computer Interaction, but its focus is on engineering intelligent software systems, and therefore on the application of machine learning and software engineering.
This course includes 13 two-hour lectures, as follows (lecture slides are available):
- Course Presentation
- Models of Emotions
- Pleasure-Arousal-Dominance Vector
- Ensemble Methods
- Gaze Tracking
- Face Gestures
- Facial Emotion Recognition
- Connecting the Dots
- Physiological Sensors
- Final Review
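To give a flavor of the "Pleasure-Arousal-Dominance Vector" lecture: the PAD model represents an affective state as a point in a three-dimensional space, so discrete emotion labels can be assigned by nearest-neighbor lookup against labeled prototypes. A minimal sketch follows; the coordinate values are illustrative assumptions, not calibrated PAD norms:

```python
import math

# Illustrative PAD prototypes in [-1, 1] per axis
# (pleasure, arousal, dominance); real systems use calibrated values.
PROTOTYPES = {
    "joy":     ( 0.8,  0.5,  0.4),
    "anger":   (-0.6,  0.6,  0.3),
    "fear":    (-0.6,  0.6, -0.4),
    "sadness": (-0.6, -0.3, -0.3),
    "calm":    ( 0.5, -0.4,  0.2),
}

def nearest_emotion(p, a, d):
    """Return the label whose PAD prototype is closest (Euclidean)."""
    return min(PROTOTYPES,
               key=lambda e: math.dist((p, a, d), PROTOTYPES[e]))

print(nearest_emotion(0.7, 0.4, 0.3))  # a point near the "joy" prototype
```

Note how anger and fear share pleasure and arousal coordinates and differ only in dominance, which is precisely why the third axis is part of the model.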