
Smart collaboration assistant for medical training
How can we improve collaborative learning via (automated) nudges in more immersive XR experiences?
Combining XR technology, physiological sensors and learning analytics, a platform is being developed that uses multimodal sensing to automate the analysis of students' behaviour in collaborative learning simulations.
In addition, a dashboard allows the teacher to intervene in the scenario and deliver nudges that support the student's learning.
This research aims to uncover reliable and valid indicators for assessing the quality of collaboration and to identify which training strategies are effective for developing collaboration skills. The ultimate goal is to automate nudging so that both trainees and trainers are supported.
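
As a concrete illustration of what such an indicator might look like, the sketch below computes a speaking-time balance score from voice-activity segments. The indicator, its name, and the segment format are hypothetical examples rather than the project's actual measures, since identifying valid indicators is precisely the open research question.

```python
# Illustrative only: a hypothetical speaking-time balance indicator.
# The project's actual indicators and their validity are open research questions.
from collections import defaultdict
from math import log


def speaking_time_balance(voice_segments):
    """Return a 0..1 balance score from (speaker_id, start_s, end_s) segments.

    1.0 means all team members spoke equally long; values near 0 mean one
    person dominated the conversation (normalised Shannon entropy).
    """
    totals = defaultdict(float)
    for speaker, start, end in voice_segments:
        totals[speaker] += max(0.0, end - start)

    duration = sum(totals.values())
    if duration == 0 or len(totals) < 2:
        return 0.0

    shares = [t / duration for t in totals.values() if t > 0]
    entropy = -sum(p * log(p) for p in shares)
    return entropy / log(len(totals))


# Example: trainee A dominates the conversation, so the balance score is low.
segments = [("A", 0, 40), ("B", 40, 45), ("A", 45, 90), ("C", 90, 95)]
print(f"balance = {speaking_time_balance(segments):.2f}")
```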
The ExperienceTwin framework is used to build immersive multi-user XR scenarios with realistic verbal and non-verbal interactions. The project further investigates how students' behaviour can be captured and analysed in a more automated way, and how a mentor can guide the learning process through an interactive dashboard that triggers actions and nudges in the XR scenarios.
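
The sketch below illustrates one way such a dashboard could hand a nudge to a running XR scenario as a small structured event. The event fields, names, and transport are assumptions made for illustration and do not reflect the actual ExperienceTwin API.

```python
# Hypothetical sketch of a dashboard-to-XR nudge event; the field names and
# the way the event reaches the scenario are assumptions for illustration only.
import json
import time
from dataclasses import dataclass, asdict


@dataclass
class NudgeEvent:
    scenario_id: str      # which running XR scenario should react
    target_trainee: str   # trainee the nudge is aimed at
    nudge_type: str       # e.g. "prompt_handover" or "highlight_monitor"
    message: str          # text shown or spoken inside the scenario
    issued_by: str        # "teacher" for manual, "system" for automated nudges
    timestamp: float = 0.0

    def to_json(self) -> str:
        payload = asdict(self)
        payload["timestamp"] = self.timestamp or time.time()
        return json.dumps(payload)


# A teacher pressing a dashboard button could produce an event like this,
# which the XR scenario would consume and turn into an in-world cue.
event = NudgeEvent(
    scenario_id="resus-room-3",
    target_trainee="trainee-02",
    nudge_type="prompt_handover",
    message="Ask the nurse to summarise the patient's vitals.",
    issued_by="teacher",
)
print(event.to_json())
```

The same event format could later be emitted by the analytics pipeline instead of the teacher, which is the sense in which nudging can gradually be automated.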