The VR/AR lab's mission is to develop novel mixed-reality systems and experiences, focusing on the following aspects:

  • Immersion
  • Interactivity
  • Collaboration
  • Adaptivity
  • Agency
  • Representation
  • Context-Awareness
  • Socio-Technical Design

Since its inception, the VR/AR Lab has been involved in different national and international research projects and collaborations:

DELICIOS

Delegation of Decision-Making to Autonomous Agents in Socio-Technical Systems
With digital assistance and interconnectivity ubiquitously available, humans will increasingly delegate their social, economic and data-related transactions to artificial autonomous agents, whether for convenience or because the transactions are too complex to handle themselves. This multidisciplinary project investigates possible designs and questions of trust and acceptance, using computational agent models validated in behavioural experiments, as well as virtual and augmented reality technologies, to examine aspects of representation and socio-technical acceptance.

CollaTrEx

Collaborative Context-Aware Mobile Training and Exploration
The CollaTrEx project aims to provide an integrated framework for collaborative context-aware mobile training and exploration, focusing on in-situ collaboration within groups of learners engaged in varied educational activities. The advantages and opportunities of context-aware mobile learning are widely recognised, but modern mobile devices offer potential far beyond establishing a precise spatio-temporal context: their multitude of sensors and advanced recording and networking capabilities call for increased interactivity and collaboration. Yet these capabilities are often left unexploited rather than harnessed for a genuinely collaborative and interactive mobile learning experience.

RoboTP

Immersive Robotic Telepresence
Interaction with social robots can yield cognitive outcomes similar to those of direct human involvement. While much research concentrates on autonomous and cognitive robotics, enabling operators to directly control a social robot and interact immersively with users and bystanders opens up further possibilities, for instance in therapeutic or educational contexts. The telepresence framework developed in this project focuses on direct, immersive control via different interaction modes, including motion, emotion and voice output.

Forest SaVR

Raising Awareness of Deforestation through VR
Deforestation is a serious issue that shapes climate and contributes to global warming. The virtual reality application developed in this project lets users interact with a virtual environment to discover and immersively experience the effects of deforestation.

PEnGen

Procedural Environment Generation for Virtual Realities
Procedural generation of three-dimensional environments is key to creating large numbers of detailed yet varied models and textures automatically and efficiently while keeping asset file sizes low. The focus of this project is a framework for generating realistic, potentially infinite environments aimed specifically at virtual reality exploration, from large terrain maps down to the particularities of natural vegetation.
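
The core idea behind procedurally generated terrain can be illustrated with a minimal sketch: summing octaves of smoothly interpolated random values yields a heightmap that is deterministic for a given seed, so large terrains need not be stored as assets. This is an illustrative example only, not the project's actual framework, which would typically use gradient noise such as Perlin or simplex and stream tiles on demand.

```python
import random

def value_noise_2d(size, octaves=4, seed=42):
    """Generate a size x size heightmap in [0, 1] by summing octaves
    of smoothly interpolated random lattice values (value noise).
    Deterministic for a fixed seed, so terrain can be regenerated
    instead of shipped as a large asset."""
    rng = random.Random(seed)

    def lattice(freq):
        # random values on a coarse (freq + 1) x (freq + 1) grid
        return [[rng.random() for _ in range(freq + 1)] for _ in range(freq + 1)]

    def smoothstep(t):
        # ease curve so cell borders blend without visible creases
        return t * t * (3 - 2 * t)

    height = [[0.0] * size for _ in range(size)]
    amplitude, total_amp = 1.0, 0.0
    for octave in range(octaves):
        freq = 2 ** (octave + 1)  # lattice cells per axis for this octave
        grid = lattice(freq)
        for y in range(size):
            for x in range(size):
                gx = x / (size - 1) * freq
                gy = y / (size - 1) * freq
                x0, y0 = int(gx), int(gy)
                x1, y1 = min(x0 + 1, freq), min(y0 + 1, freq)
                tx, ty = smoothstep(gx - x0), smoothstep(gy - y0)
                # bilinear interpolation between the four lattice corners
                top = grid[y0][x0] + tx * (grid[y0][x1] - grid[y0][x0])
                bot = grid[y1][x0] + tx * (grid[y1][x1] - grid[y1][x0])
                height[y][x] += amplitude * (top + ty * (bot - top))
        total_amp += amplitude
        amplitude *= 0.5  # each octave adds finer but fainter detail
    # normalise accumulated octaves back to [0, 1]
    return [[h / total_amp for h in row] for row in height]
```

Each successive octave doubles the lattice frequency while halving its amplitude, which is what produces the mix of broad landforms and fine surface detail typical of natural terrain.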

AR2C

AR for Robotic Control
This project revolves around augmented reality solutions that extend the capabilities of mobile applications for robotic control. Combining computer vision-based analysis of camera feeds with the interactive visualisation of a robot's sensory data, a wide array of information can be provided as superimposed layers tailored to different use cases.
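
At its simplest, a superimposed data layer is an alpha blend of a rendered sensor visualisation over the camera frame. The sketch below shows the per-pixel blend with plain nested lists of RGB tuples; this is a hypothetical illustration of the compositing step only, and a real pipeline would operate on vectorised image buffers (e.g. GPU textures or NumPy arrays).

```python
def blend_overlay(frame, overlay, alpha=0.5):
    """Alpha-blend a sensor-data overlay onto a camera frame.
    Both images are equally sized nested lists of (r, g, b) tuples;
    alpha controls how strongly the overlay layer shows through."""
    blended = []
    for frame_row, overlay_row in zip(frame, overlay):
        blended.append([
            tuple(round((1 - alpha) * f + alpha * o)
                  for f, o in zip(frame_px, overlay_px))
            for frame_px, overlay_px in zip(frame_row, overlay_row)
        ])
    return blended
```

Per-layer alpha values let different use cases (e.g. obstacle maps vs. battery telemetry) be emphasised or faded without changing the underlying camera feed.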

ObDrop

Mobile Asset Placement for AR Applications
Sharing georeferenced information between users is common to many mobile applications and services; far less common is the location-based sharing of virtual objects. The goal of this project is an augmented reality mobile application that enables users to place digital objects in the real world and interact with them, with the placed objects shared with and visible to fellow users.
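
Placing a shared object for another viewer requires converting its stored geographic coordinates into an offset in the viewer's local frame. A minimal sketch of that conversion, using a local equirectangular approximation (adequate at the short ranges typical of AR placement; the function name and simplifications are this sketch's own, and a real app would also account for altitude and compass heading):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in metres

def geo_to_local_enu(viewer_lat, viewer_lon, obj_lat, obj_lon):
    """Approximate east/north offset (in metres) of a georeferenced
    object relative to the viewer, via a local equirectangular
    projection centred on the viewer's position."""
    lat0 = math.radians(viewer_lat)
    d_lat = math.radians(obj_lat - viewer_lat)
    d_lon = math.radians(obj_lon - viewer_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(lat0)  # longitude lines converge with latitude
    north = EARTH_RADIUS_M * d_lat
    return east, north
```

The resulting east/north offset can then be mapped onto the AR session's world axes so the object appears at the same real-world spot for every user.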