Grad Student’s Research Aims to Enhance VR Gaming Experience
Wearing an Oculus Rift virtual reality headset, you’re immersed in a chaotic prehistoric landscape, playing the role of a recently hatched dinosaur attempting to complete four tasks. It’s an incredible sensation to dodge an attacking dragonfly and have an up-close encounter with a roaring adult T. rex.
Amid all this excitement, the real world is calling. You’re thirsty and want to grab a drink. But how can you do that without pulling off the goggles and interrupting the gaming experience? CS graduate student Pulkit Budhiraja thinks he has a solution.
For his master’s thesis, Budhiraja designed several mixed reality renderings that selectively incorporate the physical world into the virtual world for interactions with physical objects. He then conducted a user study to compare his techniques, which balance immersion in a virtual world with ease of interaction with the physical world.
“It is a genuine usability issue because you become completely blind to the world when you’re wearing the display,” said Budhiraja, who is conducting research in CS Professor David Forsyth’s group. “We wanted a solution that doesn’t hamper the sense of immersion you have in a virtual world.”
Budhiraja’s solution included attaching two cameras to an Oculus Rift, creating a stereo view of objects in front of the user. “The cameras are your proxy eyes,” said Budhiraja, who wrote a color segmentation algorithm to selectively feed content into the virtual world scenario. “We chose to show all the colors in the skin color range, so users could see their hands. We also color segmented for the color of the cup [in our experiment] so that, too, would be visible.”
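The idea of color segmentation can be illustrated with a minimal sketch: threshold each camera pixel against a color range, then composite the matching pixels (hands, cup) over the rendered virtual frame. The thesis does not publish its exact thresholds or algorithm, so the function names and RGB ranges below are illustrative assumptions, not Budhiraja’s implementation.

```python
import numpy as np

def segment_by_color(camera_frame, lower, upper):
    """Boolean mask of pixels whose RGB values fall within [lower, upper]."""
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    return np.all((camera_frame >= lower) & (camera_frame <= upper), axis=-1)

def composite(virtual_frame, camera_frame, mask):
    """Overlay the masked camera pixels onto the virtual scene."""
    out = virtual_frame.copy()
    out[mask] = camera_frame[mask]
    return out

# Toy 2x2 RGB frames (uint8). The "skin" range here is made up for the demo.
virtual = np.zeros((2, 2, 3), dtype=np.uint8)          # all-black virtual scene
camera = np.array([[[200, 150, 120], [10, 10, 10]],
                   [[210, 160, 130], [0, 255, 0]]], dtype=np.uint8)

skin_mask = segment_by_color(camera, lower=(180, 120, 100), upper=(255, 200, 170))
blended = composite(virtual, camera, skin_mask)
# The two skin-toned pixels show through; the rest stays virtual.
```

In a real pipeline the same masking would run per eye on each stereo camera frame before it is fed to the headset’s display.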
In addition, he added the edges of other objects like the keyboard and desk to give the user some context of where the drink was located. This approach, which he called Objects-Hands & Context (OHC), was the preferred method among 10 gamers who tried all four visual rendering methods he and his team developed.
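Rendering only the edges of surrounding objects can be sketched with a simple finite-difference gradient test: wherever neighboring pixel intensities differ sharply, mark an edge, and draw just those pixels into the scene. The article does not say which edge detector was used, so this is an assumed, minimal stand-in.

```python
import numpy as np

def edge_mask(gray, threshold=30):
    """Mark pixels where the horizontal or vertical intensity jump is large."""
    gray = gray.astype(np.int32)
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, :-1] = np.abs(gray[:, 1:] - gray[:, :-1])   # horizontal differences
    gy[:-1, :] = np.abs(gray[1:, :] - gray[:-1, :])   # vertical differences
    return (gx > threshold) | (gy > threshold)

# Toy grayscale camera frame: a bright "desk" against a dark background,
# with a sharp vertical boundary between columns 1 and 2.
frame = np.zeros((4, 4), dtype=np.uint8)
frame[:, 2:] = 200

edges = edge_mask(frame)
# Only the boundary column is marked, which is exactly what gets drawn
# into the virtual scene as context.
```

Overlaying only these edge pixels keeps the virtual scene dominant while still telling the user roughly where the desk and keyboard sit.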
According to Budhiraja, OHC allowed gamers to quickly re-acclimate themselves to the physical environment, especially when they were moving their head and body a lot as part of the VR experience.
While his color rendering technique was fairly simple, Budhiraja may pursue more sophisticated methods that could be applied to other objects of any color or size. For example, he could develop a system where a user shows a mouse or joystick to the external cameras from many angles, training the computer to recognize the physical object as something the user wants to interact with while immersed in the VR scene.
In September, Budhiraja demonstrated his technology to Illinois alumnus Richard Yao, an experimental perceptual psychologist at Oculus, who was on campus to present an ACM lecture. According to Budhiraja, Oculus is interested in exploring the idea further.
More recently, Budhiraja posted the results of his project on arXiv, the electronic preprint repository. Discover Magazine and MIT Technology Review picked up the story. He is continuing to explore the boundaries between physical and virtual realities as part of his doctoral research at Illinois.