
Combining Virtual Reality and Machine Learning for Leadership Styles Recognition

Chicchi Giglioli, Irene Alice Margherita
2022

Abstract

The aim of this study was to evaluate the viability of a new selection procedure based on machine learning (ML) and virtual reality (VR). Specifically, decision-making behaviours and eye-gaze patterns were used to classify individuals according to their leadership styles while they were immersed in virtual environments representing social workplace situations. The virtual environments were designed using an evidence-centred design approach. Interaction and gaze patterns were recorded for 83 subjects, who were classified as having either a high or a low leadership style as assessed by the Multifactor Leadership Questionnaire (MLQ). An ML model combining behavioural outputs and eye-gaze patterns was developed to predict subjects' leadership styles (high vs. low). The results indicated that the different styles could be differentiated by the eye-gaze patterns and behaviours exhibited during immersive VR, with eye-tracking measures contributing more to this differentiation than behavioural metrics. Although the results should be interpreted with caution, since the small sample does not allow the findings to be generalised, this study illustrates the potential of a future research roadmap that combines VR, implicit measures, and ML for personnel selection.
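
The abstract describes the classification task only at a high level. Purely as an illustration, the sketch below shows one way behavioural and eye-gaze features could be combined to predict a high vs. low leadership label; the synthetic feature matrices, the random-forest classifier, and the cross-validation scheme are assumptions made for this sketch, not the pipeline reported in the paper.

    # Hypothetical sketch only: feature names and model choice are illustrative.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_subjects = 83  # sample size reported in the abstract

    # Placeholder feature matrices; in the study these would come from the VR logs
    # (decision-making behaviours) and the eye tracker (gaze patterns).
    behaviour_features = rng.normal(size=(n_subjects, 10))  # e.g. choices, response times
    gaze_features = rng.normal(size=(n_subjects, 20))       # e.g. fixations, dwell times
    X = np.hstack([behaviour_features, gaze_features])      # combined feature set
    y = rng.integers(0, 2, size=n_subjects)                 # 1 = high, 0 = low leadership style

    # Standardise the features and fit a classifier, estimating accuracy with
    # stratified k-fold cross-validation to preserve the class balance in each fold.
    model = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
    scores = cross_val_score(model, X, y,
                             cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0))
    print(f"Mean cross-validated accuracy: {scores.mean():.2f}")

Stratified folds are used in this sketch because, with only 83 subjects, preserving the high/low class balance in each fold keeps the accuracy estimate from being skewed by uneven splits.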
English
Parra, E., García Delgado, A., Carrasco-Ribelles, L. A., Chicchi Giglioli, I. A. M., Marín-Morales, J., Giglio, C., Alcañiz Raya, M. (2022). Combining Virtual Reality and Machine Learning for Leadership Styles Recognition. Frontiers in Psychology, 13. doi:10.3389/fpsyg.2022.864266. https://hdl.handle.net/10807/268237
Files in this record:

File: 2022_Combining Virtual Reality and Machine Learning for Leadership Styles Recognition.pdf
  • Access: open access
  • File type: Published version (PDF)
  • Licence: Creative Commons
  • Size: 1.37 MB
  • Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10807/268237
Citations
  • PubMed Central: 1
  • Scopus: 2
  • Web of Science: 1