Can you imagine a truly immersive virtual reality experience without a sense of social presence with the virtual characters populating the environment? For Geoffrey Gorisse, a lecturer and researcher at the Arts et Métiers campus in Laval, the visual and behavioral consistency of avatars is crucial to fostering social interaction in a virtual reality application.
Perfecting the human aspect of avatars
Geoffrey placed the visual fidelity of avatars at the heart of his research during his thesis, then during his postdoctoral studies at the University of Barcelona's Event Lab in 2019. A year later, this quest for realism came to fruition. Taking advantage of the time "assigned" to working from home (like everyone else during lockdown), he developed the Virtual Human Project (VHP) in Unity, one of the most widely used real-time 3D engines for video games and virtual reality applications: a toolkit designed to provide virtual reality developers with a set of optimized functions for procedurally animating avatars. The goal is to perfect the human aspect of avatars by focusing on three main features: gaze management (multiple strategies, from static to probabilistic), lip synchronization (based on pre-recorded audio tracks or real-time audio from a microphone), and emotional expression (anger, disgust, fear, happiness, sadness, and surprise).
One tool, two skills
Making characters more realistic genuinely enhances the headset experience, and developers know how much the success of their projects depends on it. The need is pressing, but the resources required to develop such tools are not always available within a single team. Computer science labs have the technical skills to build this type of tool, but they do not necessarily have the social science expertise found in psychology labs, for example. For the development of VHP, this dual approach was essential. Through his background (a master's degree in virtual engineering followed by a doctorate at LAMPA on avatar fidelity and the sense of embodiment in immersive virtual environments), Geoffrey contributed both his technical skills (programming and 3D modeling) and his expertise in virtual humans to build a scalable, easy-to-integrate, and optimized tool.
Serving the scientific community
At this stage, the project primarily serves the scientific community as part of a knowledge-sharing initiative. Open source (public and accessible to all), it is available on GitHub, a platform for hosting the source code of software projects. Unveiled on YouTube a few months ago, the project has already brought people together: it has initiated a collaboration with researchers from the Memory and Cognition Laboratory (LMC2) at Paris Descartes University and was presented at a meeting of the CNRS virtual reality working group.