Multimodal emotion recognition is an indispensable facet of human-robot interaction (HRI), and it is becoming one of the key factors in the successful design and development of social robots. As a result, numerous HRI systems based on multimodal emotion recognition have been developed and evaluated, both in research laboratories and in real-world settings, across a range of service areas. Although these emotion-based HRI systems have evolved through several generations and made impressive progress, they are still far from operating as real-world applications.
Recently, the A*STAR Social Robotics Laboratory (ASORO) of A*STAR (Agency for Science, Technology and Research) demonstrated a Polar Bear Robot that recognises a user's emotions from verbal, non-verbal, and facial cues. Based on the recognised emotion, the bear responds with emotional expressions of its own, drawing the user into social interaction.
So the next big question is where such a Polar Bear Robot could be used. Robotics scientists suggest that it could serve as a companion robot for the elderly or as a therapeutic robot for children with autism. It seems we are not very far from being cared for by robots like the Polar Bear Robot.