A Framework for Affect-Based Natural Human-Robot Interaction
In this paper we present a general framework for affective human-robot interaction that allows users to interact intuitively with a robot while accounting for their mental fatigue, simplifying the task or providing assistance when the user is stressed.
Interaction with the robot is achieved by naturally mapping the user's forearm motion, detected with a smartwatch, into the robot's motion. High-level commands can be issued through gestures.
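As an illustration of this mapping, the following sketch converts forearm orientation (as a smartwatch IMU might report it) into velocity commands for a mobile robot. The function name, gains, and limits are hypothetical, not taken from the paper:

```python
# Hypothetical sketch: map smartwatch-measured forearm pitch/roll (radians)
# to mobile-robot velocity commands. Gains and limits are illustrative only.

def clamp(x, lo, hi):
    """Restrict x to the interval [lo, hi]."""
    return max(lo, min(hi, x))

def forearm_to_velocity(pitch, roll,
                        k_lin=0.5, k_ang=1.0,
                        v_max=0.3, w_max=1.0):
    """Tilting the forearm forward/back sets linear speed;
    rolling it left/right sets angular speed."""
    v = clamp(k_lin * pitch, -v_max, v_max)   # linear velocity, m/s
    w = clamp(k_ang * roll, -w_max, w_max)    # angular velocity, rad/s
    return v, w
```

A neutral forearm pose yields zero velocity, and the clamp keeps extreme tilts from commanding speeds beyond the robot's limits.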
An approach based on affective robotics is used to adapt the robot's level of autonomy to the cognitive workload of the user. The user's mental fatigue is inferred from heart-rate analysis, with heart rate also measured by the smartwatch.
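One simple way to realize such an adaptation, shown here purely as an assumed sketch (the thresholds and three-mode scheme are illustrative, not the paper's method), is to derive a workload index from heart-rate elevation over a resting baseline and switch autonomy modes accordingly:

```python
# Hypothetical sketch: adapt the robot's autonomy to a heart-rate-based
# workload estimate. Thresholds and mode names are illustrative only.

def workload_index(heart_rate, baseline_hr):
    """Normalized elevation of heart rate over the user's resting baseline."""
    return max(0.0, (heart_rate - baseline_hr) / baseline_hr)

def autonomy_level(index, low=0.10, high=0.25):
    """Map the workload index to one of three autonomy modes."""
    if index < low:
        return "manual"        # full teleoperation by the user
    elif index < high:
        return "shared"        # robot assists, e.g. obstacle avoidance
    return "autonomous"        # robot takes over the task
```

In practice a robust system would smooth the heart-rate signal and add hysteresis between modes to avoid rapid switching near the thresholds.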
The framework is general and can be applied to different robotic systems; here, we validate it experimentally on a wheeled mobile robot.