The VR/AR lab's mission is to develop novel mixed-reality systems and experiences, focusing on the following aspects:

  • Immersion
  • Interactivity
  • Collaboration
  • Adaptivity
  • Agency
  • Representation
  • Context-Awareness
  • Socio-Technical Design

Since its inception, the VR/AR Lab has been involved in various national and international research projects and collaborations:


Virtual Reality Body Image Intervention and Assessment Suite
The VR BIAS project explores novel virtual-reality-based approaches to improving body image, i.e., how someone perceives, feels, and thinks about their body. Employing different variations of body illusions and distinct ways to assess distorted body perception, the project measures how strongly participants experience avatar embodiment, how this improves their body image, and how they generally feel during the experience, combining self-report and behavioural assessments with state-of-the-art psychophysiology. VR BIAS will develop innovative tools and techniques for research and clinical practice and contribute to current psychological and neurobiological theories of body image disturbance and eating disorders.


Modulating Human Subjective Time Experience
The ChronoPilot project focuses on the different dimensions of time perception in individuals and groups of humans, as well as in hybrid systems consisting of humans and machines, such as software agents and robots. The project's goal is to create a time modulation toolkit and prototype technology capable of improving both the quality and the process of decision-making by extending/compressing human subjective time adaptively, whenever required. Exploring novel methods in cognitive science and applying mediated-reality technologies such as virtual/augmented reality (VR/AR) and body sensors to different human sensory channels, the ChronoPilot team will develop innovative solutions to control time perception plasticity.


Delegation of Decision-Making to Autonomous Agents in Socio-Technical Systems
With digital assistance and interconnectivity ubiquitously available, humans will increasingly delegate their social, economic, or data-related transactions to artificial autonomous agents, whether for reasons of convenience or complexity. This multidisciplinary project investigates possible designs and questions of trust and acceptance, using computational agent models validated in behavioural experiments, as well as virtual and augmented reality technologies, to examine aspects of representation and socio-technical acceptance.

Légionnaires Rallye
Interactive Treasure Hunt about the Luxembourgish Légionnaires
The Légionnaires Rallye is an interactive game that engages the players in a digital treasure hunt around Luxembourg City. The game concept was developed to promote the Légionnaires exhibition (June 30 - November 28, 2021) at the Musée Dräi Eechelen in collaboration with the Luxembourg Centre for Contemporary and Digital History (C2DH). Following the trail of the Luxembourgish légionnaires outside of the museum walls, players can see the cityscape in a different light and discover the history of the légionnaires in the places that marked their passage. For a maximum degree of accessibility, we realised the game as a mobile web application together with a dedicated backend platform.


Collaborative Context-Aware Mobile Training and Exploration
The CollaTrEx project aims to provide an integrated framework for collaborative context-aware mobile training and exploration, with a focus on in-situ collaboration within groups of learners engaged in varied educational activities. The advantages and opportunities of context-aware mobile learning are widely recognised, but modern mobile devices offer enormous potential beyond establishing a precise spatio-temporal context: their multitude of sensors, as well as advanced recording and networking capabilities, call for increased interactivity and collaboration. However, these capabilities are often left unexploited instead of being harnessed to full effect for a genuinely collaborative and interactive mobile learning experience.


Immersive Robotic Telepresence
Interaction with social robots can lead to cognitive outcomes similar to those of human involvement. While many research efforts concentrate on autonomous and cognitive robotics, enabling operators to directly control a social robot and interact immersively with users and bystanders opens up further possibilities, for instance in therapeutic or educational contexts. The telepresence framework developed in this project focuses on direct and immersive control via different interaction modes, including motion, emotion, and voice output.
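The separation into motion, emotion, and voice modes can be sketched as a small command dispatcher. This is a minimal illustration of the routing idea only; the mode names, `Command` shape, and handler signatures below are assumptions for the sketch, not the project's actual API, and real handlers would wrap the robot's SDK.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    MOTION = auto()   # e.g. operator head/hand tracking mapped to robot pose
    EMOTION = auto()  # e.g. selecting a facial expression or gesture
    VOICE = auto()    # e.g. relaying or synthesising operator speech


@dataclass
class Command:
    mode: Mode
    payload: dict


class TelepresenceController:
    """Routes operator commands to per-mode handlers.

    Handlers are plain callables here so the routing logic stays
    self-contained; a real system would register robot-API wrappers."""

    def __init__(self):
        self._handlers = {}

    def register(self, mode, handler):
        self._handlers[mode] = handler

    def dispatch(self, command):
        handler = self._handlers.get(command.mode)
        if handler is None:
            raise ValueError(f"no handler registered for {command.mode.name}")
        return handler(command.payload)
```

Keeping each mode behind its own handler lets motion streaming, emotion selection, and voice output evolve independently while the operator-facing command format stays uniform.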

Forest SaVR
Raising Awareness of Deforestation through VR
Deforestation is a serious issue that shapes the climate and contributes to global warming. The virtual reality application developed in this project lets users interact with the environment to discover and immersively experience the effects of deforestation.


Procedural Environment Generation for Virtual Realities
Procedural generation of three-dimensional environments is key to creating large numbers of detailed yet varied models and textures automatically and efficiently while keeping asset file sizes low. The focus of this project is a framework for generating realistic, potentially infinite environments aimed specifically at virtual reality exploration, from large terrain maps down to the particularities of natural vegetation.
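One common building block for such terrain generation is fractal value noise: a seeded heightmap that is deterministic (so only the seed needs to be stored, keeping asset sizes low) and can be sampled at any coordinate (so terrain is potentially infinite). The sketch below illustrates the general technique under these assumptions; it is not the project's actual framework.

```python
import math


def lattice_value(x, y, seed):
    """Deterministic pseudo-random value in [-1, 1] for integer lattice point (x, y)."""
    h = hash((x, y, seed)) & 0xFFFFFFFF
    return (h / 0xFFFFFFFF) * 2.0 - 1.0


def smooth_noise(x, y, seed):
    """Bilinearly interpolate lattice values with smoothstep weights."""
    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0
    sx = fx * fx * (3 - 2 * fx)  # smoothstep avoids visible grid creases
    sy = fy * fy * (3 - 2 * fy)
    n00 = lattice_value(x0, y0, seed)
    n10 = lattice_value(x0 + 1, y0, seed)
    n01 = lattice_value(x0, y0 + 1, seed)
    n11 = lattice_value(x0 + 1, y0 + 1, seed)
    top = n00 + sx * (n10 - n00)
    bottom = n01 + sx * (n11 - n01)
    return top + sy * (bottom - top)


def fractal_height(x, y, seed=0, octaves=4, lacunarity=2.0, gain=0.5):
    """Sum several octaves: each adds finer detail at lower amplitude."""
    amplitude, frequency, total, norm = 1.0, 1.0, 0.0, 0.0
    for i in range(octaves):
        total += amplitude * smooth_noise(x * frequency, y * frequency, seed + i)
        norm += amplitude
        amplitude *= gain
        frequency *= lacunarity
    return total / norm  # normalised to [-1, 1]


def heightmap(width, depth, seed=42, scale=0.1):
    """Sample a width x depth grid of heights for one terrain tile."""
    return [[fractal_height(i * scale, j * scale, seed) for j in range(depth)]
            for i in range(width)]
```

Because adjacent tiles sample the same continuous noise field, they join seamlessly, which is what makes "potentially infinite" environments practical.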


AR for Robotic Control
This project revolves around augmented reality solutions that extend the capabilities of mobile applications for robotic control. Using a mixture of computer-vision-based analysis of camera feeds and interactive visualisation of a robot's sensory data, a wide range of data can be provided as superimposed layers tailored to different use cases.
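Superimposing sensor readings on a camera feed requires mapping 3D positions into image coordinates. A minimal sketch using the standard pinhole camera model follows; the intrinsics (`fx`, `fy`, `cx`, `cy`) and the `overlay_labels` helper are illustrative values and names, not taken from the project.

```python
def project_point(point_cam, fx, fy, cx, cy):
    """Project a 3D point (metres, camera frame: x right, y down, z forward)
    to pixel coordinates with the pinhole model; None if behind the camera."""
    x, y, z = point_cam
    if z <= 0:
        return None
    return (fx * x / z + cx, fy * y / z + cy)


def overlay_labels(readings, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Turn (label, 3D position) sensor readings into (label, pixel) overlay
    entries, dropping points that fall behind the camera."""
    overlay = []
    for label, pos in readings:
        pixel = project_point(pos, fx, fy, cx, cy)
        if pixel is not None:
            overlay.append((label, pixel))
    return overlay
```

A rendering layer would then draw each label at its projected pixel on top of the live camera image, giving the superimposed data layers described above.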


Mobile Asset Placement for AR Applications
Sharing georeferenced information between users is common to many mobile applications and services; far less common is the location-based sharing of virtual objects. The goal of this project is to create an augmented-reality mobile application that enables users to place digital objects in the real world and interact with them, sharing them with and making them visible to fellow users.
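The core data problem here is storing geo-anchored objects and answering "which shared objects are near me?" queries. The sketch below illustrates this with a haversine distance check over an in-memory store; `PlacedAsset` and `AssetStore` are hypothetical names standing in for the app's actual backend.

```python
import math
from dataclasses import dataclass

EARTH_RADIUS_M = 6_371_000.0


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude pairs."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))


@dataclass
class PlacedAsset:
    asset_id: str
    owner: str
    lat: float
    lon: float
    model_url: str  # reference to the 3D model to render at this anchor


class AssetStore:
    """In-memory store of shared, geo-anchored virtual objects."""

    def __init__(self):
        self._assets = []

    def place(self, asset):
        self._assets.append(asset)

    def nearby(self, lat, lon, radius_m=100.0):
        """Assets within radius_m of the query position, nearest first."""
        hits = [(haversine_m(lat, lon, a.lat, a.lon), a) for a in self._assets]
        return [a for d, a in sorted(hits, key=lambda t: t[0]) if d <= radius_m]
```

A client would call `nearby` with the device's GPS fix and render the returned models as AR anchors; a production backend would replace the linear scan with a spatial index.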