Emotion Recognition for Human-Robot Interaction via Multimodal Fusion and Deep Learning
On February 9, 2024, UMBC Ph.D. student Farshad Safavi presented his dissertation proposal on recognizing emotions for better human-robot interaction.
Emotion Recognition via Multimodal Fusion for Human-Robot Interaction Using Deep Learning
One of the primary challenges in Human-Robot Interaction (HRI) is enabling robots to effectively understand and respond to human emotions. Humans express emotions through verbal and non-verbal cues, while robots typically rely on pre-programmed algorithms and physical gestures. Our research aims to bridge this gap by developing HRI systems that leverage multimodal emotion detection. Emotions play a crucial role in human communication and decision-making and significantly influence human-robot interactions; we therefore aim for robots to understand and respond to human emotions by integrating neurophysiological and behavioral channels.

Initially, we examine unimodal facial expression recognition using Convolutional Neural Networks (CNNs) and Vision Transformers (ViTs). Next, we enhance the model with a Mixture of Transformers (MiT), and using this enhanced model we develop a human-robot interaction perception system.

Subsequently, we investigate multimodal emotion recognition for conveying emotions in HRI. While unimodal techniques have been used to recognize emotions from individual sources, research indicates that emotion recognition is inherently multimodal: fusion representations provide a more comprehensive view of the emotional state and thereby improve recognition accuracy. It is therefore essential to explore the role of multimodal fusion through computational models and neurophysiological experiments. Our framework uses machine learning and deep learning to interpret complex physiological and facial-expression data, enabling nuanced human-robot interactions. We focus on offline fusion of multimodal methods, combining brain and behavioral models, and explore real-time fusion solutions. These emotion-aware human-robot interactions will be validated through neurophysiological experiments, aiming for seamless, intuitive interaction grounded in a thorough understanding of human emotions.
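To make the fusion idea concrete, below is a minimal, illustrative sketch (not the proposal's actual architecture) of offline late fusion in PyTorch: a small CNN branch embeds a face crop, a 1-D CNN branch embeds a window of physiological signals (e.g., EEG channels), and a classifier operates on the concatenated embeddings. All module choices, feature dimensions, channel counts, and the seven emotion classes are assumptions made for illustration only.

```python
# Illustrative late-fusion sketch; module choices and dimensions are assumptions,
# not the architecture described in the dissertation proposal.
import torch
import torch.nn as nn

class FacialBranch(nn.Module):
    """Small CNN that maps a face crop to a feature vector."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, x):                 # x: (batch, 3, H, W)
        return self.fc(self.conv(x).flatten(1))

class PhysioBranch(nn.Module):
    """1-D CNN that maps a multichannel physiological window (e.g., EEG) to a feature vector."""
    def __init__(self, in_channels=8, feat_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 64, 7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(64, 64, 7, stride=2, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, x):                 # x: (batch, channels, time)
        return self.fc(self.conv(x).flatten(1))

class LateFusionEmotionNet(nn.Module):
    """Concatenates the two unimodal embeddings and classifies the emotion."""
    def __init__(self, num_classes=7, feat_dim=128):
        super().__init__()
        self.face = FacialBranch(feat_dim)
        self.physio = PhysioBranch(feat_dim=feat_dim)
        self.head = nn.Sequential(
            nn.Linear(2 * feat_dim, 128), nn.ReLU(), nn.Linear(128, num_classes)
        )

    def forward(self, face_img, physio_sig):
        fused = torch.cat([self.face(face_img), self.physio(physio_sig)], dim=1)
        return self.head(fused)

# Example forward pass with dummy data.
model = LateFusionEmotionNet()
logits = model(torch.randn(4, 3, 224, 224), torch.randn(4, 8, 512))
print(logits.shape)  # torch.Size([4, 7])
```

In practice, the facial branch could be replaced by a ViT backbone and the simple concatenation by an attention-based fusion layer; the sketch only illustrates the general pattern of combining behavioral and neurophysiological features described above.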
Committee: Drs. Ramana Kumar Vinjamuri (Chair/Advisor), Tulay Adali, Nilanjan Banerjee, Justin Brooks, Scott Kerick
Posted: February 10, 2024, 10:15 AM