Research

  • Emotion-driven human-computer interactions
Human-Machine Interaction nowadays takes full advantage of the rapid developments in the area of Natural Interaction, drawing on tools such as Computer Vision and Artificial Intelligence. Visual cues related to facial analysis or body posture can be a valuable source of information about a person's emotional and/or cognitive state. At the same time, the recent explosion in computational power has paved the way for a new boost in machines' ability to employ well-established algorithms, such as Active Shape Models and their state-of-the-art variants, for tracking the location of facial features, as well as for inferring bodily expressivity features. Such instruments can be of great use for robust, accurate and real-time Facial Expression Recognition (FER) and/or Visual Focus of Attention (VFoA) estimation in Assisted Living environments, Serious Games applications, e-Learning, etc., thus allowing an AI-endowed machine (robot, computer, mobile phone, etc.) to react promptly to everyday human needs.
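To make the landmark-tracking step more concrete, below is a minimal sketch of facial landmark localization, the typical precursor to FER and VFoA estimation. It assumes dlib and its publicly available pre-trained 68-point model (shape_predictor_68_face_landmarks.dat); the input image and the landmark-based follow-up step are illustrative assumptions, not a description of our actual pipeline.

```python
import cv2
import dlib

# Face detector and 68-point landmark predictor (assumes the pre-trained
# model file has been downloaded separately from the dlib model zoo).
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# Illustrative input; in a live setting, frames would come from a camera.
frame = cv2.imread("face.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

for face_rect in detector(gray):
    shape = predictor(gray, face_rect)
    # 68 (x, y) facial landmarks: eyes, brows, nose, mouth, jawline.
    landmarks = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
    # A FER or VFoA module would consume these landmarks (or features
    # derived from them, e.g. eye and mouth geometry) in a later stage.
    print(f"Detected face with {len(landmarks)} landmarks")
```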

Currently, my team and I are conducting research on emotion-driven adaptive learning environments, collaborative games and the role of automatically retrieved human emotion in them, human activity recognition in indoor settings, and multimodality and cognitive analysis in human-computer interaction.