Humans can perceive social cues and the interaction context of another person to infer internal states, including cognitive and emotional states, empathy, and intention. This ability to infer internal states enables effective social interaction between humans. The same ability is desirable for intelligent systems such as robots, virtual agents, and human-machine interfaces, so that they can collaborate and interact seamlessly with humans in the era of Industry 5.0, where intelligent systems must work alongside humans to perform a variety of tasks at home, in factories, in offices, in transit, and elsewhere.
The underlying technologies for efficient and intelligent collaboration between humans and ubiquitous intelligent systems can be realized through cooperative intelligence, an interdisciplinary endeavor spanning robotics, AI, human-robot and human-computer interaction, computer vision, cognitive science, and related fields.
This workshop is dedicated to computational methods for sensing and recognizing nonverbal cues and internal states in the wild, toward realizing cooperative intelligence between humans and intelligent systems. Specifically, we are interested in cognition-aware computing that integrates environmental context with multi-modal nonverbal social cues, including but not limited to gaze, body language, and paralanguage.
More importantly, we extend multi-modal human behavior research to infer the internal states of humans and to generate empathetic interactions between humans and intelligent systems. We gather researchers with different expertise who share a common goal, motivation, and resolve to explore and tackle this challenging problem with the practicality of industrial applications in mind. We call for papers presenting novel methods that realize human-robot cooperative intelligence by sensing and understanding human behavior and internal states, and by generating empathetic interactions.