Rebecca FRIBOURG
24 January 2024
Stage M2: Developing Avatar Control with Eye-Tracking in VR (LS2N)

Supervisors:

Rebecca FRIBOURG, Associate Professor at LS2N laboratory, Ecole Centrale de Nantes

  • Rebecca.Fribourg@ec-nantes.fr

Jean-Marie NORMAND, Professor at LS2N laboratory, Ecole Centrale de Nantes

  • Jean-Marie.Normand@ec-nantes.fr

Geoffrey GORISSE, Associate Professor at LS2N laboratory, Ecole Centrale de Nantes

  • geoffrey.gorisse@ensam.eu

Structure:

  • City: Nantes (France)
  • Category of hosting institution: Research Laboratory
  • Host institution: Ecole Centrale de Nantes

Keywords: Virtual Reality, Avatar Embodiment, Eye Tracking

Duration and start date: 6 months

Internship context:

Virtual Reality allows users to be immersed in virtual environments while interacting with their content (e.g., virtual objects or other users). In many applications, users are represented in the virtual environment by a virtual body, also called an avatar, which is animated in real time according to their own movements. This can be done with motion capture suits, or with only a few sensors (e.g., on the head, hands and feet) whose data are used to estimate full-body movements with inverse kinematics algorithms [1]. However, this assumes that users are able to move, which is not always the case. VR applications tend to become more and more inclusive, which implies being accessible to people who may suffer from paraplegia or tetraplegia and therefore cannot move their own body. Furthermore, VR applications can also provide medical solutions for limb rehabilitation of stroke patients with limited limb control [2].

While some of these applications only provide visual feedback without giving participants control (e.g., an animation of a virtual body walking, seen from a first-person point of view [3]), others try to involve a source of control from the participants despite their lack of mobility. Indeed, even a very low amount of control over a virtual body can greatly improve the sense of embodiment towards it [4] and have positive effects on rehabilitation and on the overall VR experience. For instance, several works have explored the use of EEG-based brain-computer interfaces to provide control over the virtual body [5]. Yet such systems raise many technical challenges, require training, and are time-consuming to set up. Overall, very few studies explore avatar control schemes that do not involve the physical body.
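To make the sparse-sensor setup above concrete, here is a minimal analytic two-bone IK solver (law of cosines) sketched in plain Python rather than in a game engine, under the simplifying assumption of a planar limb with known segment lengths. The function name and parameters are illustrative, not taken from [1].

```python
import math

def two_bone_ik(upper_len, lower_len, target_dist):
    """Analytic two-bone IK via the law of cosines: given two limb
    segment lengths and the root-to-target distance, return the angle
    at the root joint (relative to the root-target line) and the
    interior angle at the middle joint, both in radians."""
    # Clamp the target distance to the reachable range of the limb.
    d = max(abs(upper_len - lower_len), min(upper_len + lower_len, target_dist))
    cos_root = (upper_len**2 + d**2 - lower_len**2) / (2 * upper_len * d)
    cos_mid = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    # Guard against floating-point values slightly outside [-1, 1].
    clamp = lambda c: max(-1.0, min(1.0, c))
    return math.acos(clamp(cos_root)), math.acos(clamp(cos_mid))

# A fully extended arm: the middle joint (elbow) is straight, i.e. pi radians.
root, elbow = two_bone_ik(0.30, 0.30, 0.60)
```

Full-body solvers used with head/hand/feet trackers generalize this idea to limb chains in 3D, with additional heuristics for joint limits and pole vectors.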

The aim of this internship is therefore to explore new ways of controlling an avatar in VR that involve little or no movement of the user's physical body. In particular, we intend to explore the use of eye tracking, potentially combined with other inputs (voice, or a single button), to control an avatar in VR.
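A gaze-dwell selection loop is one plausible low-movement control primitive of the kind described above. The sketch below is engine-agnostic plain Python with hypothetical names: it counts consecutive frames in which the gaze ray hits the same target and triggers a selection once a dwell threshold is reached.

```python
def dwell_select(gaze_hits, dwell_ms, frame_ms):
    """Return the id of the first target the gaze rests on for at
    least `dwell_ms`, or None. `gaze_hits` is the target id (or None)
    hit by the gaze ray on each successive frame; `frame_ms` is the
    frame interval in milliseconds."""
    needed = max(1, dwell_ms // frame_ms)  # consecutive frames required
    current, streak = None, 0
    for hit in gaze_hits:
        streak = streak + 1 if hit == current else 1
        current = hit
        if current is not None and streak >= needed:
            return current
    return None

# Gaze flicks over 'A' briefly, then fixates on 'B' long enough to select it.
hits = ["A"] * 5 + ["B"] * 20
selected = dwell_select(hits, dwell_ms=150, frame_ms=10)
```

Combining dwell with a confirmation input (a voice command or a single button, as mentioned above) is a common way to avoid the "Midas touch" problem of unintended gaze selections.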


The goal of the internship will be to develop an immersive environment in Unity 3D and to implement different eye-tracking-based control methods for an avatar. The internship may include a user study on healthy participants to compare the different control methods and how they impact several dimensions of the user experience: the sense of embodiment towards the avatar (i.e., the feeling that the avatar really is the user's body), and in particular the sense of agency (the sense of being in control of one's avatar's movements); ease of use; the performance of the method on a specific task; etc. If the results are conclusive, the intern will also have the opportunity to write a research paper with the help of the supervisors.
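For the comparison stage, per-condition descriptive statistics are a natural first pass over the study data. The snippet below uses illustrative field names (a hypothetical 1-7 agency score and a task completion time), not a prescribed analysis protocol.

```python
from statistics import mean

def summarize_by_condition(trials):
    """Group trial records by control condition and compute the mean of
    each measured dimension (here: a hypothetical 'agency' questionnaire
    score and a task completion 'time' in seconds)."""
    grouped = {}
    for t in trials:
        grouped.setdefault(t["condition"], []).append(t)
    return {
        cond: {
            "agency_mean": mean(r["agency"] for r in rows),
            "time_mean": mean(r["time"] for r in rows),
        }
        for cond, rows in grouped.items()
    }

trials = [
    {"condition": "gaze_only", "agency": 4.0, "time": 12.0},
    {"condition": "gaze_only", "agency": 5.0, "time": 10.0},
    {"condition": "gaze_button", "agency": 6.0, "time": 8.0},
]
summary = summarize_by_condition(trials)
```

A real study would of course add inferential statistics and validated embodiment questionnaires such as those discussed in [4].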

Suitable equipment will be provided (VR-ready computer, VR HMD, trackers, etc.).

The internship is offered for a duration of 6 months, with a possible continuation as a PhD depending on funding opportunities.

Expected skills:

  • Object-oriented programming (C# or C++)
  • Unity 3D
  • Autonomy
  • Interest in user studies and data analysis (not essential)

References:

[1] D. Roth, J.-L. Lugrin, J. Büser, G. Bente, A. Fuhrmann and M. E. Latoschik, "A simplified inverse kinematic approach for embodied VR applications," 2016 IEEE Virtual Reality (VR), 2016, pp. 275-276. doi: 10.1109/VR.2016.7504760.

[2] J. Bui, J. Luaute and A. Farnè, "Enhancing Upper Limb Rehabilitation of Stroke Patients With Virtual Reality: A Mini Review," Frontiers in Virtual Reality, vol. 2, 595771, 2021. doi: 10.3389/frvir.2021.595771.

[3] J. Saint-Aubert, M. Cogne, I. Bonan, Y. Launey and A. Lecuyer, "Influence of user posture and virtual exercise on impression of locomotion during VR observation," IEEE Transactions on Visualization and Computer Graphics. doi: 10.1109/TVCG.2022.3161130.

[4] K. Kilteni, R. Groten and M. Slater, "The Sense of Embodiment in Virtual Reality," Presence: Teleoperators and Virtual Environments, vol. 21, no. 4, pp. 373-387, 2012. doi: 10.1162/PRES_a_00124.

[5] T. P. Luu, S. Nakagome, Y. He et al., "Real-time EEG-based brain-computer interface to a virtual avatar enhances cortical involvement in human treadmill walking," Scientific Reports, vol. 7, 8895, 2017. doi: 10.1038/s41598-017-09187-0.

Documents
  • SujetStage_EyeTracking_23.pdf