Fidelity describes how closely a replication resembles the original. It can be helpful to analyze how faithful interactions in virtual reality (VR) are to a reference interaction. In prior research, fidelity has been restricted to the simulation of reality – also called realism. Our definition includes other reference interactions, such as superpowers or fiction. Interaction fidelity is a multilayered concept. Unfortunately, different aspects of fidelity have either not been distinguished in scientific discourse or referred to with inconsistent terminology.
Therefore, we present the Interaction Fidelity Model (IntFi Model). Based on the human-computer interaction loop, it systematically covers all stages of VR interactions. The conceptual model establishes a clear structure and precise definitions of eight distinct components. As a communication tool, it helps teams to understand and discuss fidelity in VR. It was reviewed through workshops with fourteen VR experts. We provide guidelines, diverse examples, and educational material to apply the IntFi Model universally to any VR experience and propose foundational research opportunities.
With this publication, we provide free educational posters and slide decks for teaching and teamwork. The material is distributed under the Creative Commons BY 4.0 license: it may be shared, adapted, and printed as long as the original publication and authors are appropriately cited and any modifications are indicated. The editable presentation slides and the print-quality posters are included in the supplemental material of the publication.
Large A0 Poster · Medium A1 Poster · Slide Deck (light design)
During the COVID-19 pandemic, online meetings became a routine part of daily teamwork when working from home. To understand the opportunities and challenges of meeting in virtual reality (VR) compared to videoconferences, we conducted the weekly team meetings of our human-computer interaction research lab on five off-the-shelf online meeting platforms over four months. After each of the 12 meetings, we asked the participants (N = 32) to share their experiences, resulting in 200 completed online questionnaires. We evaluated the ratings of the overall meeting experience and conducted an exploratory factor analysis of the quantitative data to compare VR meetings and video calls in terms of meeting involvement and co-presence. In addition, a thematic analysis of the qualitative data revealed genuine insights covering five themes: spatial aspects, meeting atmosphere, expression of emotions, meeting productivity, and user needs. We reflect on our findings gained under authentic working conditions, derive lessons learned for running successful team meetings in VR supporting different kinds of meeting formats, and discuss the team’s long-term platform choice.
We propose an approach to facilitate adjustable grip for object interaction in virtual reality. It enables the user to handle objects with a loose and firm grip using conventional controllers. Pivotal design properties were identified and evaluated in a qualitative pilot study. Two revised interaction designs with variable grip were compared to the status quo of invariable grip in a quantitative study. The users performed placing actions with all interaction modes. Performance, clutching, task load, and usability were measured. While the handling time increased slightly using variable grip, the usability score was significantly higher. No substantial differences were measured in positioning accuracy. The results lead to the conclusion that variable grip can be useful and improve realism depending on tasks, goals, and user preferences.
Some virtual reality (VR) applications require true-to-life object manipulation, such as for training or teleoperation. We investigate an interaction technique that replicates the variable grip strength applied to a held object when using force-feedback gloves in VR. We map the exerted finger pressure to the rotational freedom of the virtual object. With a firm grip, the object’s orientation is fixed to the hand. With a loose grip, the user can allow the object to rotate freely within the hand. A user study (N = 21) showed how challenging it was for participants to control the object’s rotation with our prototype employing the SenseGlove DK1. Despite high action fidelity, the grip variability led to poorer performance and increased task load compared to the default fixed rotation. We suspect low haptic fidelity as an explanation as only kinesthetic forces but no cutaneous cues are rendered. We discuss the system design limitations and how to overcome them in future haptic interfaces for physics-based multi-finger object manipulation.
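The core idea of mapping exerted finger pressure to the rotational freedom of the held object can be sketched as a simple transfer function. The thresholds and names below are illustrative assumptions, not the study's actual implementation:

```python
# Hypothetical sketch of a grip-to-rotation mapping: firm grip locks the
# object's orientation to the hand, loose grip lets it rotate freely.
# Thresholds are illustrative, not taken from the published system.

def rotational_freedom(grip_force: float,
                       firm_threshold: float = 0.8,
                       loose_threshold: float = 0.2) -> float:
    """Map a normalized finger pressure (0..1) to rotational freedom (0..1).

    0.0 -> orientation fixed to the hand (firm grip)
    1.0 -> object may rotate freely within the hand (loose grip)
    Between the thresholds, freedom is interpolated linearly.
    """
    grip_force = max(0.0, min(1.0, grip_force))  # clamp sensor input
    if grip_force >= firm_threshold:
        return 0.0
    if grip_force <= loose_threshold:
        return 1.0
    return (firm_threshold - grip_force) / (firm_threshold - loose_threshold)
```

In a physics engine, the returned value could, for example, scale the angular constraint strength of the joint attaching the object to the hand.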
It is challenging to provide users with a haptic weight sensation of virtual objects in VR since current consumer VR controllers and software-based approaches such as pseudo-haptics cannot render appropriate haptic stimuli. To overcome these limitations, we developed a haptic VR controller named Triggermuscle that adjusts its trigger resistance according to the weight of a virtual object. As a result, users need to adapt their index finger force to grab objects of different virtual weights. Dynamic and continuous adjustment is enabled by a spring mechanism inside the casing of an HTC Vive controller. In two user studies, we explored the effect on weight perception and found large differences between participants for sensing a change in trigger resistance and, thus, for discriminating virtual weights. Some participants easily distinguished the variations and associated them with weight, while others did not notice them at all. We discuss possible limitations and confounding factors, how to overcome them in future research, and the pros and cons of this novel technology.
Hand interaction plays a key role in virtual reality (VR) sports. While in reality, athletes mostly rely on haptic perception when holding and throwing objects, these sensory cues can be missing or differ in virtual environments. In this work, we investigated how the visibility of a virtual hand can support players when throwing and what impact it has on the overall experience. We developed a Frisbee simulation in VR and asked 29 study participants to hit a target. We measured the throwing accuracy and self-reports of presence, disc control, and body ownership. The results show a subtle advantage of hand visibility in terms of accuracy. Visible hands further improved the subjective impression of realism, body ownership, and subjective control over the disc.
When playing sports in virtual reality, foot interaction is crucial for many disciplines. We investigated how the visibility of the foot influences penalty shooting in soccer. In a between-group experiment, we asked 28 players to hit eight targets with a virtual ball. We measured the performance, task load, presence, ball control, and body ownership of inexperienced to advanced soccer players. In one condition, the players saw a visual representation of their tracked foot, which significantly improved the accuracy of the shots. Players with an invisible foot needed 58% more attempts. Furthermore, with foot visibility, the self-reported body ownership was higher.
Questionnaires are among the most common research tools in virtual reality (VR) user studies. Transitioning from virtuality to reality for giving self-reports on VR experiences can lead to systematic biases. VR allows the embedding of questionnaires into the virtual environment, which may ease participation and avoid biases. To provide a cohesive picture of methods and design choices for questionnaires in VR (inVRQ), we discuss 15 inVRQ studies from the literature and present a survey with 67 VR experts from academia and industry. Based on the outcomes, we conducted two user studies in which we tested different presentation and interaction methods of inVRQs and evaluated the usability and practicality of our design. We observed comparable completion times between inVRQs and questionnaires outside VR (nonVRQs) with higher enjoyment but lower usability for inVRQs. These findings advocate the application of inVRQs and provide an overview of methods and considerations that lay the groundwork for inVRQ design.
Providing haptic feedback in virtual reality to make the experience more realistic has become a strong focus of research in recent years. The resulting haptic feedback systems differ greatly in their technologies, feedback possibilities, and overall realism, making it challenging to compare different systems. We propose the Haptic Fidelity Framework, providing the means to describe, understand and compare haptic feedback systems. The framework locates a system in the spectrum of providing realistic or abstract haptic feedback using the Haptic Fidelity dimension. It comprises 14 criteria that either describe foundational or limiting factors. A second Versatility dimension captures the current trade-off between highly realistic but application-specific and more abstract but widely applicable feedback. To validate the framework, we compared the Haptic Fidelity score to the perceived feedback realism of evaluations from 38 papers and found a strong correlation, suggesting the framework accurately describes the realism of haptic feedback.
Non-Intrusive Feedback for Virtual Walls in VR Environments with Room-Scale Mapping
Mette Boldt, Michael Bonfert, Inga Lehne, Melina Cahnbley, Kim Korsching, Ioannis Bikas, Stefan Finke, Martin Hanci, Valentin Kraft, Boxuan Liu, Tram Nguyen, Alina Panova, Ramneek Singh, Alexander Steenbergen, Rainer Malaka & Jan Smeddinck (2018)
Room-scale mapping facilitates natural locomotion in virtual reality (VR), but it creates a problem when encountering virtual walls. In traditional video games, player avatars can simply be prevented from moving through walls. This is not possible in VR with room-scale mapping due to the lack of physical boundaries. Game design is either limited by avoiding walls, or the players might ignore them, which undermines immersion and the overall game experience.
To prevent players from walking through walls, we propose a combination of auditory, visual, and vibrotactile feedback for wall collisions. This solution can be implemented with standard game engine features, does not require any additional hardware or sensors, and is independent of game concept and narrative. A between-group study with 46 participants showed that a large majority of players without the feedback did pass through virtual walls, while 87% of the participants with the feedback refrained from walking through walls. The study found no notable differences in game experience.
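A minimal sketch of how such collision feedback could be wired up with standard engine features; the `Wall`, `Feedback`, and engine calls below are illustrative stand-ins, not the project's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Wall:
    """Axis-aligned wall volume on the floor plan: (min_x, min_y)..(max_x, max_y)."""
    min_x: float
    min_y: float
    max_x: float
    max_y: float

    def contains(self, pos):
        x, y = pos
        return self.min_x <= x <= self.max_x and self.min_y <= y <= self.max_y

@dataclass
class Feedback:
    """Placeholder for the engine's audio/visual/haptic channels."""
    events: list = field(default_factory=list)

    def play_sound(self, name):
        self.events.append(("audio", name))

    def flash_vignette(self):
        self.events.append(("visual", "vignette"))

    def vibrate_controllers(self, duration_ms):
        self.events.append(("haptic", duration_ms))

def on_player_moved(player_pos, walls, feedback):
    """Fire auditory, visual, and vibrotactile feedback when the tracked
    position penetrates a virtual wall volume."""
    for wall in walls:
        if wall.contains(player_pos):
            feedback.play_sound("bump")
            feedback.flash_vignette()
            feedback.vibrate_controllers(duration_ms=150)
            return True  # all three feedback channels fired
    return False
```

Because the check only needs the tracked head position and the wall geometry, no extra hardware or sensors are required, matching the constraint stated above.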
That’s all for the Virtual Reality category. Scroll up to see projects on Voice Interfaces, Games, and Apps & Experiences.
The next major evolutionary stage for voice assistants will be their capability to initiate interactions by themselves. However, to design proactive interactions, it is crucial to understand whether and when this behaviour is considered useful and how desirable it is perceived to be in different social contexts or during ongoing activities. To investigate people’s perspectives on proactivity and appropriate circumstances for it, we designed a set of storyboards depicting a variety of proactive actions in everyday situations and social settings and presented them to 15 participants in interactive interviews. Our findings suggest that, although many participants see benefits in agent proactivity, such as for urgent or critical issues, there are concerns about interference with social activities in multi-party settings, potential loss of agency, and intrusiveness. We discuss the implications for designing voice assistants with desirable proactive features.
Find a video introduction of the CUI talk by Nima Zargham on YouTube
Digital home assistants have an increasing influence on our everyday lives. The media now reports how children adopt the resulting imperious language style when talking to real people. As a response to this behavior, we considered a digital assistant rebuking impolite language. We then investigated how adult users react when being rebuked by the AI. In a between-group study (N = 20), the participants were rejected by our fictional speech assistant “Eliza” when they made impolite requests. As a result, we observed more polite behavior. Most test subjects accepted the AI’s demand and said “please” significantly more often. However, many participants retrospectively denied Eliza the entitlement to politeness and criticized her attitude or refusal of service.
Smart displays augment the concept of a smart home speaker with a touchscreen. Although the visual modality is added in this device variant, the virtual agent is still only represented through auditory output and remains invisible in most current products. We present an empirical study on users’ interactions with a smart display on which the agent is embodied with a humanoid representation.
Three different conditions are compared in a between-group experiment: no agent embodiment, a digitally rendered character, and a photorealistic representation performed by a human actress. Our quantitative data do not indicate that agent visualization on a smart display affects the user experience significantly. On the other hand, our qualitative findings revealed differentiated perspectives by the users. We discuss the potential and challenges of embodying agents on smart displays, reflect on their continuous on-screen presence, present user considerations regarding their appearance, and examine how the visualization influenced the politeness of the users.
That’s all for the Voice Interfaces category. Scroll up to see projects on Virtual Reality, Games, and Apps & Experiences.
Many virtual and mixed-reality games focus on single-player experiences. This paper describes the concept and prototype implementation of a mixed-reality multiplayer game that can be played with a smartphone and an HMD in outdoor environments. Players can team up to fight against attacking alien drones. The relative positions between the players are tracked using GPS, and the rear camera of the smartphone is used to augment the environment and teammates with virtual objects. The combination of multiplayer, mixed reality, the use of geographical location and outdoor action together with affordable, mobile equipment enables a novel strategic and social game experience.
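The relative positions between players, derived from GPS fixes, can be sketched with the haversine great-circle distance. This is the standard formula for the task, not the game's published code:

```python
import math

# Hedged sketch: deriving the distance between two players from their
# GPS fixes (latitude/longitude in degrees) via the haversine formula.

def haversine_m(lat1, lon1, lat2, lon2, radius_m=6_371_000.0):
    """Great-circle distance in meters between two GPS fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_m * math.asin(math.sqrt(a))
```

At the scale of an outdoor play area, such distances (combined with a compass bearing) are enough to place teammates and drones relative to the local player before the camera view is augmented.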
Space Project Y
A Virtual Reality Exergame in Outer Space
Michael Bonfert, Mette Boldt, Inga Lehne, Melina Cahnbley, Kim Korsching, Ioannis Bikas, Stefan Finke, Martin Hanci, Valentin Kraft, Boxuan Liu, Tram Nguyen, Alina Panova, Ramneek Singh, Alexander Steenbergen, Rainer Malaka & Jan Smeddinck (2018)
Master’s Project in Digital Media at the University of Bremen, Winter Semester 2016/17
S.P.Y is a sci-fi virtual reality exergame – a motion-based game incorporating physical exercises – that aims to treat and prevent back pain. It was designed and developed by a group of 14 master’s students. By incorporating exercises, the game supports back training and general movement while keeping play fun.
In S.P.Y, the players assume the role of a spy working for the intergalactic, top-secret enterprise “Space Project Y,” which specializes in infiltrating alien cultures. They do so by wearing a special space suit that makes them look like one of the aliens – as long as they mimic the alien’s movement and behavior. This ensures the correct execution of the back exercises. The player moves through the virtual world by walking in the real world using VR room-scale technology, which greatly enhances immersion. All exercises used in the game were discussed with a physiotherapist to facilitate optimal training for the back. The group created the game concept, characters, 3D models and textures, animations, level designs, sound design, and technical implementation.
Hell No Barbie
An Interactive Exhibit
Michael Bonfert & Florian Schröder (2016)
Student Project in Digital Media at the University of the Arts Bremen, Summer Semester 2016, awarded with the ADC Junior Award Bronze Nail
Mattel sent a spy into the children’s room with “Hello Barbie.” The inquisitive doll can talk to children. Recordings are stored on US servers, analyzed and shared with third parties. “Hello Barbie” pretends to be a best friend and obtains secrets by asking targeted questions. Parents can eavesdrop on their child’s every conversation. The exhibition aims to bring to life how “Hello Barbie” can manipulate children, probe their privacy and feign empathy.
The installation effectively stages Barbie using light. Visitors select sample dialogs via a menu. The subtitles are projected onto the wall. When Barbie speaks, she is illuminated from the front and casts an oversized shadow. When the child speaks, Barbie disappears into the darkness of the child’s silhouette. Each voice is recorded on its own stereo channel. Visitors can also talk to Barbie themselves and listen to these recordings directly via the web interface with headphones.
Thurid is a Viking warrior who was captured by the god-like Jotuns. She escapes and fights her way through Midgard, the world of humans. Her next quest is to find the roots of Yggdrasil, a huge magical tree. If she climbs all the way up to Asgard, the Gods will have answers for her. There are dangerous animals in Thurid’s way. Can you guide her all the way up?
This game prototype was developed in a Master’s course at the University of Bremen in 2016.
This is the story of two walking bats who wish to find each other in a pitch-black maze. Many dangers lurk for a bat that cannot fly, but their echolocation helps them find the way. As each has only one wing, they can only escape together. Step in and help them find each other!
This cooperative multiplayer game was developed at the Global Game Jam 2017 at the University of Bremen.
That’s all for the Games category. Scroll up to see projects on Virtual Reality, Voice Interfaces, and Apps & Experiences.
When I started my Bachelor’s studies in Augsburg in 2011, there was no mobile app for students. So, I started a student initiative to create the first version of the CampusApp with a team of talented volunteers. The number of supported features, platforms, and active users grew quickly. I led the group as a product owner from 2011 to 2013.
Our student initiative was honored with a Special Award for Student Commitment in 2014. The project is still being developed by the University of Augsburg and helps students with their everyday planning.
Other Projects & Prototypes
That’s all for the Apps & Experiences category. Scroll up to see projects on Virtual Reality, Voice Interfaces, and Games.