Our project was accepted at the International Conference on Human-Robot Interaction.
Abstract
We address the challenge of understanding and responding to variations in human cognitive and emotional states during Human-Robot Interaction in manufacturing settings. We investigate the use of consumer-grade EEG devices to capture the operator's brain signals and infer their emotional and cognitive states, which are then communicated to the robot. A Raspberry Pi-controlled robotic arm, programmed with an adaptive algorithm, reacts in real time to changes in the operator's stress and concentration levels. To facilitate smoother interactions, the robot dynamically adjusts its motor speed according to the operator's concentration and stress levels. Additionally, it uses RGB lighting to notify the operator when stress rises above a predefined threshold, suggesting a break to maintain well-being. This exploration paves the way for more complex scenarios involving multiple users and robots collaborating simultaneously.
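The adaptive behavior described above can be sketched as a simple control mapping. The snippet below is a hypothetical illustration, not the paper's actual implementation: the function names, the specific speed formula, and the threshold value are all assumptions. It assumes stress and concentration arrive already normalized to [0, 1] from the EEG inference stage.

```python
# Hypothetical sketch of the adaptive loop: map inferred stress and
# concentration (normalized to [0, 1]) to a motor speed, and raise an
# RGB alert when stress crosses a predefined threshold. The mapping
# and threshold below are illustrative assumptions.

STRESS_THRESHOLD = 0.7  # assumed break-suggestion threshold

def motor_speed(concentration: float, stress: float,
                base_speed: float = 100.0) -> float:
    """Scale speed up with concentration and down with stress,
    with a floor so the arm never fully stalls."""
    factor = max(0.2, concentration * (1.0 - stress))
    return base_speed * factor

def rgb_alert(stress: float) -> tuple[int, int, int]:
    """Red when stress exceeds the threshold, green otherwise."""
    return (255, 0, 0) if stress > STRESS_THRESHOLD else (0, 255, 0)

# Example: a calm, focused operator gets near-full speed and a green light.
print(motor_speed(concentration=0.9, stress=0.1))  # 81.0
print(rgb_alert(0.1))  # (0, 255, 0)
```

On real hardware, `motor_speed` would feed a PWM duty cycle on the Raspberry Pi and `rgb_alert` would drive the RGB LEDs; here both are kept as pure functions so the mapping is easy to inspect.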
Reference
Canete A., Gonzalez-Sanchez J., and Guerra-Silva R. "Exploring Cognition and Affect during Human-Cobot Interaction." In Companion of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2024), pp. 288–291. doi: 10.1145/3610978.3641082.