
Gaze detection as a social cue to initiate natural human-robot collaboration in an assembly task

Mondellini, Marta
2024

Abstract

Introduction: In this work, we explore a potential approach to improving the human-robot collaboration experience by adapting cobot behavior based on natural cues from the operator. Methods: Inspired by the literature on human-human interactions, we conducted a Wizard-of-Oz study to examine whether a gaze towards the cobot can serve as a trigger for initiating joint activities in collaborative sessions. In this study, 37 participants engaged in an assembly task while their gaze behavior was analyzed. We employed a gaze-based attention recognition model to identify when the participants looked at the cobot. Results: Our results indicate that in most cases (83.74%), the joint activity was preceded by a gaze towards the cobot. Furthermore, during the entire assembly cycle, the participants tended to look at the cobot mostly around the time of the joint activity. Given these results, a fully integrated system that triggers the joint action only when the gaze is directed towards the cobot was piloted with 10 volunteers, one of whom was characterized by high-functioning Autism Spectrum Disorder. Even though they had never interacted with the robot and did not know about the gaze-based triggering system, most of them successfully collaborated with the cobot and reported a smooth and natural interaction experience. Discussion: To the best of our knowledge, this is the first study to analyze the natural gaze behavior of participants working on a joint activity with a robot during a collaborative assembly task and to attempt the full integration of an automated gaze-based triggering system.
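
To illustrate the triggering principle described in the abstract, the following minimal Python sketch shows a loop that starts the cobot's joint activity only after the attention model reports a sustained gaze at the robot. This is not the authors' implementation: the label names, the 5-frame debounce window, and the random stand-in classifier are illustrative assumptions.

import random
import time

GAZE_AT_ROBOT = "robot"        # hypothetical attention label
GAZE_ELSEWHERE = "elsewhere"
CONSECUTIVE_FRAMES = 5         # debounce window before triggering

def classify_gaze(frame):
    """Stand-in for a trained gaze-based attention recognition model.

    Here we draw a random label; a real system would classify the
    operator's face crop from a camera frame.
    """
    return random.choice([GAZE_AT_ROBOT, GAZE_ELSEWHERE])

def start_joint_activity():
    print("Trigger: cobot starts the joint activity (e.g., a part handover).")

def run_trigger_loop(num_frames=200, fps=30):
    streak = 0
    for _ in range(num_frames):
        frame = None  # placeholder for a real camera frame
        # Count consecutive frames in which the operator looks at the cobot
        streak = streak + 1 if classify_gaze(frame) == GAZE_AT_ROBOT else 0
        if streak >= CONSECUTIVE_FRAMES:
            start_joint_activity()
            streak = 0  # reset so one sustained glance yields one trigger
        time.sleep(1 / fps)

if __name__ == "__main__":
    run_trigger_loop()

In a deployed system, classify_gaze would wrap the trained attention-recognition model running on camera frames of the operator, and start_joint_activity would command the cobot; the debounce window guards against spurious single-frame detections.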
English
Lavit Nicora, M., Prajod, P., Mondellini, M., Tauro, G., Vertechy, R., Andre, E., Malosio, M., Gaze detection as a social cue to initiate natural human-robot collaboration in an assembly task, Frontiers in Robotics and AI, 2024; 11 (July): 1-12. [doi:10.3389/frobt.2024.1394379] [https://hdl.handle.net/10807/313611]
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10807/313611
Citations
  • PubMed Central: n/a
  • Scopus: 1
  • Web of Science (ISI): 0