Amit Garg
UX Research and Design

ARES VR

ARES: Exploring new VR interactions via interactive narrative

Executive Summary

Room-scale VR systems like the HTC Vive have the potential to provide extraordinarily immersive experiences. Within a 4m x 3m tracking space, interactors can walk, crawl, climb, and jump in VR as the virtual world responds contextually to their actions. As part of the eTV Lab and in collaboration with an industry partner, this project explored the space of 3D interaction design by applying UX design principles, processes, and prototyping methods to interactive VRXs (virtual reality experiences).

My Contributions

Storyboarding

3D Interaction Design and Wireframing

Rapid VR Prototyping (Unity3D)

Agile Development

UX Research and Evaluation

 

Crafting the Narrative: Designing for 3D Interaction

With room-scale VR, the possible interactions are nearly limitless. In our ideation sessions, we pitched interactions within a story context. Narratives ranged from an implicit bias training simulation for police, to being a sea turtle near the shores of Miami Beach, to being alone in a room with a cat, to transferring your soul into other beings.

Each narrative was pitched using storyboards -- a highly useful tool for ideating in VR. We settled on a subterranean interplanetary experience featuring novel interactions like crawling, climbing, and shimmying, all bound together by a narrative of survival.

 
Sketches by Pranav Nair

 

Rapid Prototyping: A Live Demo

Unity3D and SteamVR, our weapons of choice, provided a useful framework for rapidly prototyping our interaction concepts. Prototyping in VR leans heavily on the physics engine to simulate real-world behaviors like gravity, collisions, and grabbing. With help from the HTC Vive community, we created a low-fidelity version of our narrative, ready for feedback from keen gamers at Georgia Tech.
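
To give a sense of how little code a first pass can take, here is a minimal sketch of a climbing interaction of the kind we prototyped. It assumes a Unity scene where this script sits on a hand object with a trigger collider and kinematic Rigidbody, climbable surfaces carry a "Climbable" tag, and the play-area rig is referenced by cameraRig; the rig field, the tag, and the IsGripPressed() helper are illustrative placeholders, not our actual ARES code.

```csharp
using UnityEngine;

// Sketch of a Vive-style climbing interaction: while the grip is held against a
// climbable surface, the play-area rig is dragged opposite to the hand's motion,
// so the player appears to pull themselves up the wall.
public class ClimbHand : MonoBehaviour
{
    public Transform cameraRig;        // root of the tracked play area (moves the player)

    private bool climbing;             // currently holding onto a climbable surface?
    private Vector3 lastLocalHandPos;  // hand position in rig-local space, last frame

    private void OnTriggerStay(Collider other)
    {
        // Begin a climb when the grip is held while the hand overlaps climbable geometry.
        if (!climbing && other.CompareTag("Climbable") && IsGripPressed())
        {
            climbing = true;
            lastLocalHandPos = cameraRig.InverseTransformPoint(transform.position);
        }
    }

    private void Update()
    {
        if (!climbing) return;

        if (!IsGripPressed())
        {
            climbing = false;          // let go of the wall
            return;
        }

        // Rig-local hand position ignores the rig's own motion, so the delta is
        // purely the player's physical hand movement since last frame.
        Vector3 localHandPos = cameraRig.InverseTransformPoint(transform.position);
        Vector3 delta = localHandPos - lastLocalHandPos;

        // Move the rig the opposite way in world space: pulling your hand down
        // hoists your body (and viewpoint) up.
        cameraRig.position -= cameraRig.TransformVector(delta);
        lastLocalHandPos = localHandPos;
    }

    // Placeholder input check; a SteamVR project would read the controller's grip
    // button here. A keyboard key stands in so the sketch runs without hardware.
    private bool IsGripPressed()
    {
        return Input.GetKey(KeyCode.Space);
    }
}
```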

 

Wireframing for VR

To move around the virtual world, most VR games provide a teleportation feature or take you on a 'magic carpet ride' through the virtual environment. But to achieve truly immersive experiences, people need to be able to walk around the environment -- what we call natural locomotion. It's how we humans explore the affordances of a space and make sense of interacting within it. Unfortunately, the HTC Vive only provides a 4m x 3m tracking space, while virtual environments can span thousands of meters. How, then, can we enable natural locomotion? By making people think they're exploring a space larger than the room they're in, using Impossible Spaces -- self-overlapping rooms that occupy the same physical footprint but are never visible at the same time.
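
Here is a minimal sketch of that swap, assuming a Unity scene with two room meshes that overlap in the physical play area and a trigger volume in the doorway connecting them; the roomA/roomB references, the "Player" tag, and the doorway's forward axis are illustrative assumptions, not our production setup.

```csharp
using UnityEngine;

// Sketch of an "impossible space": two rooms occupy the same physical footprint,
// but only one is active at a time, so the player never notices the overlap and
// the building feels far larger than the 4m x 3m tracked area.
public class ImpossibleDoorway : MonoBehaviour
{
    public GameObject roomA;   // room on the doorway's back side
    public GameObject roomB;   // overlapping room on the doorway's forward side

    private void OnTriggerExit(Collider other)
    {
        // The player's head/body collider needs a kinematic Rigidbody for this to fire.
        if (!other.CompareTag("Player")) return;

        // Which side of the doorway did the player step out toward?
        bool steppedIntoB = Vector3.Dot(
            other.transform.position - transform.position, transform.forward) > 0f;

        // Swap the overlapping geometry so only the room being entered is visible.
        roomA.SetActive(!steppedIntoB);
        roomB.SetActive(steppedIntoB);
    }
}
```

Because the swap happens behind the player, out of sight, the overlap goes unnoticed and the walking metaphor never breaks.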

Analogous to UX wireframes, the wireframes below map how users navigate our VR experience and how they flow between its different phases. Using nothing but tracing paper and felt-tip pens, I created these architectural wireframes before actualizing them in a high-fidelity VR prototype.

 
ARES wireframes
 

How did we do?

Recently, we designed and ran a 15-person evaluation of ARES. The results are still forthcoming. We will be presenting our work at IEEE VR in March and at CHI in May!
