Operator support in Industry 5.0

How can we measure and improve (cognitive) ergonomics in human-robot collaboration during assembly work?

With this project, we aim to develop:

  • An ergonomic model (digital twin) of human-cobot interaction for industrial (assembly) contexts

  • Adaptive, self-learning cobot control based on this ergonomic model and fueled by interaction data from different contexts (a minimal control-loop sketch follows this list):

      ◦ Automatically generated/captured (video) data from different operators and contexts

  • A predictive control model focusing on HRI assembly tasks

  • Remote training/simulation/control for operators and (future) assembly lines
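The sketch below is an illustration only, not the project's actual control stack: it assumes a hypothetical ErgonomicState produced by the ergonomic model and a hypothetical adapt_cobot_speed policy, and shows the general idea of a cobot backing off when the model predicts elevated operator load.

```python
from dataclasses import dataclass


@dataclass
class ErgonomicState:
    """Hypothetical digital-twin snapshot of the operator during an assembly step."""
    predicted_load: float   # 0 (relaxed) .. 1 (overloaded), as output by the ergonomic model
    hesitation_s: float     # seconds of hesitation observed before the last pick


def adapt_cobot_speed(state: ErgonomicState, base_speed: float,
                      load_threshold: float = 0.7) -> float:
    """Reduce cobot speed when the ergonomic model predicts high operator load.

    base_speed, load_threshold and the 50% back-off are illustrative choices only.
    """
    if state.predicted_load > load_threshold or state.hesitation_s > 2.0:
        return base_speed * 0.5  # give the operator room to recover
    return base_speed
```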

We focus on the use of our ExperienceTwin framework to determine cognitive/psychological markers (i.e., exploring HMD eye tracking, hand tracking, and object interactions to measure hesitation, doubt, load, fatigue, and risk behaviour). In addition, we aim to track these markers during human-robot collaboration to build models of cognitive interaction between robots and humans.
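As a rough illustration of how such markers could be operationalised from HMD streams, the sketch below computes a hesitation proxy and a cognitive-load proxy; the helper functions, the grasp-onset signal, and the pupil-dilation load measure are assumptions made for this sketch, not the ExperienceTwin implementation.

```python
import numpy as np


def hesitation_time(gaze_on_part_s: np.ndarray, grasp_onset_s: float) -> float:
    """Hesitation proxy: delay between first fixation on a part and the grasp.

    gaze_on_part_s: timestamps (s) of gaze samples that landed on the target part.
    grasp_onset_s:  timestamp (s) at which the hand-tracking stream reports a grasp.
    """
    if gaze_on_part_s.size == 0:
        return float("nan")
    return max(0.0, grasp_onset_s - float(gaze_on_part_s.min()))


def load_proxy(pupil_diameter_mm: np.ndarray, baseline_mm: float) -> float:
    """Cognitive-load proxy: mean pupil dilation relative to a resting baseline."""
    return float(np.mean(pupil_diameter_mm) - baseline_mm)
```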

Publications

Billast, M., De Bruyne, J., Bombeke, K., De Schepper, T., & Mets, K. (2024). Physical ergonomics anticipation with human motion prediction. Proceedings of the 19th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (Volume 4), 539–548. https://doi.org/10.5220/0012392500003660

Participants

  • imec-mict-UGent

  • imec-IDLab-UA

  • imec-Brubotics-VUB

Funding

imec AAA

Contact
