
A Virtual Reality Simulation Demo

Feb 14, 2024

by Jérôme Guzzi *

Genoa, 28 February 2024

organised by AITEK, CNR IEIIT, School of Robotics

at School of Robotics

via Balbi 1A, 16126 Genova

10:00–12:00; 15:00–17:00

The Horizon Europe project REXASI-PRO is developing trustworthy-by-design tools to help people with reduced mobility, such as smart wheelchairs that navigate autonomously among people. In this context, we are designing socially compliant behaviours that are comfortable for the user sitting in the wheelchair and perceived as friendly by the people sharing the same spaces. We present a demonstration of the setup we use to test navigation algorithms with humans in the loop: subjects immerse themselves in a virtual reality simulation populated with virtual wheelchairs and pedestrians (equipped with bio-inspired navigation algorithms [3]), where they are free to move around and experience how the virtual agents behave and react to their actions.
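To give a flavour of what a bio-inspired navigation behaviour looks like, here is a minimal, hypothetical Python sketch in the spirit of time-to-collision-based heading selection: the agent samples candidate headings and prefers the one closest to its goal direction among those that stay collision-free over a safety horizon. All names and parameters are illustrative; this is not the actual navground implementation.

```python
import math


def time_to_collision(p, v, q, u, radius):
    """Time until two disc agents of radius `radius`, at positions p and q
    with velocities v and u, come within 2*radius of each other; inf if never."""
    rx, ry = q[0] - p[0], q[1] - p[1]          # relative position
    vx, vy = u[0] - v[0], u[1] - v[1]          # relative velocity
    a = vx * vx + vy * vy
    b = 2 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry - (2 * radius) ** 2
    if c <= 0:
        return 0.0                              # already in contact
    if a == 0:
        return math.inf                         # no relative motion
    disc = b * b - 4 * a * c
    if disc < 0:
        return math.inf                         # paths never intersect
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else math.inf            # collision only in the future


def choose_heading(p, goal, others, speed=1.0, radius=0.3,
                   horizon=5.0, samples=36):
    """Pick the heading that deviates least from the goal direction among those
    whose minimal time to collision with the other (position, velocity) agents
    exceeds the safety horizon; otherwise maximize the time to collision."""
    goal_angle = math.atan2(goal[1] - p[1], goal[0] - p[0])
    best, best_score = goal_angle, (-1.0, 0)
    for i in range(samples):
        alpha = goal_angle + (i - samples // 2) * (2 * math.pi / samples)
        v = (speed * math.cos(alpha), speed * math.sin(alpha))
        ttc = min((time_to_collision(p, v, q, u, radius) for q, u in others),
                  default=math.inf)
        deviation = abs(i - samples // 2)
        # lexicographic score: first safety (capped at the horizon),
        # then smallest deviation from the goal direction
        score = (min(ttc, horizon), -deviation)
        if score > best_score:
            best, best_score = alpha, score
    return best
```

With no other agents around, the chosen heading points straight at the goal; a stationary pedestrian blocking the way makes the agent swerve to the first heading that clears the obstacle. Heuristics of this kind, run for every virtual pedestrian at each simulation step, produce the reactive crowd behaviour the demo immerses users in.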

Virtual reality is well suited to experimenting with cognitive human-robot interaction, such as using pointing gestures to select objects or destinations [1]. Simulation speeds up development, for instance by isolating control algorithms from perception. Moreover, it lets us experiment with robots that we don't have in our labs. Virtual reality extends simulation to include real humans, enabling human-driven development. In our context, we can measure people's behaviour during the simulation and gather their feedback to iteratively improve wheelchair navigation in critical situations, such as negotiating narrow passages with people. Such experiments complement large-scale offline simulations, which instead measure navigation safety and efficiency [2].

In this demonstration, users wear a VR headset (Meta Quest 3) connected wirelessly to a robotics simulator (CoppeliaSim) running different navigation scenes with virtual wheelchairs and pedestrians. While users move in an empty area of about 6 × 6 meters, their movements in the virtual scene are visualized on a screen, so that spectators can follow what is happening in the simulation. Users can experience two modalities: in virtual reality, they are fully immersed and perceive virtual agents in a virtual environment; in mixed reality, they perceive virtual agents in the real environment instead. A video featuring a previous version of the setup, limited to virtual reality, is available online: Testing wheelchair navigation in a Virtual Reality


[1] Jérôme Guzzi, Gabriele Abbate, Antonio Paolillo, and Alessandro Giusti. Interacting with a conveyor belt in virtual reality using pointing gestures. In 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pages 1194–1195. IEEE, 2022.

[2] Jérôme Guzzi, Alessandro Giusti, Luca M. Gambardella, Guy Theraulaz, and Gianni A. Di Caro. Human-friendly robot navigation in dynamic environments. In 2013 IEEE International Conference on Robotics and Automation (ICRA), pages 423–430. IEEE, 2013.

[3] Jérôme Guzzi. Navground. https://github.com/idsia-robotics/navground, 2023.

* Jérôme Guzzi holds a Master's degree in Physics from ETH Zurich and a PhD in Informatics from USI Lugano. He is a senior researcher at the Dalle Molle Institute for Artificial Intelligence (IDSIA, SUPSI-USI) in Lugano, Switzerland. His research spans several topics in mobile robotics: perception, path planning and navigation, swarm robotics, human-robot interaction, and educational robotics. Currently, his primary interest is in the relationships between communication, coordination, and complexity in multi-human/multi-robot systems.