Nahrstedt on Team Receiving $1.2M NSF Grant to Develop Pioneering System for 360-Degree Video Creation and Delivery

12/4/2019

Creating and delivering 360-degree video to a large audience of viewers over the Internet remains an unsolved scientific problem.


Imagine watching a University of Illinois football game from the comfort of your own living room, but experiencing the game as if you were on the field with the quarterback. A form of virtual reality known as “360 videos” allows a viewer wearing a headset to do exactly that. 360 videos let viewers experience media content in an immersive fashion. But creating and delivering 360 videos to a large audience of viewers over the Internet remains an unsolved scientific problem.

Klara Nahrstedt

A team of researchers that includes Illinois Computer Science Professor Klara Nahrstedt, along with the University of Massachusetts Amherst's Michael Zink, Professor of Electrical and Computer Engineering, and Ramesh Sitaraman, Professor in the College of Information and Computer Sciences, has been awarded a $1.2M grant from the National Science Foundation (NSF) to develop a revolutionary new system, called mi360World, for creating, delivering, and navigating 360 videos at scale over the Internet.

In contrast to traditional video, the collaborators explain, 360 video is recorded with a special camera that captures the complete surroundings from almost all directions. Viewers can select their viewing direction using a pointing device on a regular display, or through head movement with a head-mounted display. Because the format lets a viewer change viewing direction while watching, a viewer can, for instance, watch a sporting event from multiple perspectives on the field.

“In recent years,” the researchers explain, “virtual and augmented reality applications have seen a significant increase in popularity. However, despite these technological advances, major challenges remain with respect to effective representation, storage and distribution of 360 videos on the Internet.”

An additional challenge the group must address is cyber sickness, a phenomenon in which a viewer’s interaction with a virtual environment triggers symptoms similar to motion sickness. It arises when there is a time lag of more than 20 milliseconds between a head movement and the rendering of the new scene.
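To make the 20-millisecond figure concrete, here is a minimal Python sketch of how a renderer might check its motion-to-photon latency against that budget. The function and variable names are hypothetical illustrations, not drawn from the project.

    import time

    # Illustrative sketch only: flag frames whose motion-to-photon latency
    # exceeds the roughly 20 ms budget associated with cyber sickness.
    LATENCY_BUDGET_S = 0.020  # 20 milliseconds

    def render_within_budget(head_pose_timestamp, render_scene):
        """Render the view for a new head pose and report whether the update
        stayed within the 20 ms budget (timestamps from time.monotonic())."""
        render_scene()  # draw the scene for the new head pose
        latency = time.monotonic() - head_pose_timestamp
        return latency <= LATENCY_BUDGET_S  # False signals cyber-sickness risk

    # e.g. ok = render_within_budget(time.monotonic(), lambda: None)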

Preventing that sickness is a concern that runs through the three major research thrusts the proposed project comprises.

The first thrust focuses on video creation, enabling personalized viewing by generating innovative navigation graphs and cinematographic rules while maintaining a high quality of experience to reduce cyber sickness.

The second thrust focuses on scalable distribution of 360 videos to a global set of diverse viewers, using navigation graphs and cinematographic rules for highly efficient prefetching and caching. Nahrstedt, the Ralph M. and Catherine V. Fisher Professor of Computer Science, has contributed foundational research to the fields of multimedia distributed systems, 3D tele-immersion, and novel multimedia applications. That experience makes her a perfect fit for this project, with a specific focus on the first and second research thrusts.
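As a rough illustration of how a navigation graph could drive prefetching in that second thrust, the Python sketch below maps each view of a video to the views audiences most often move to next and selects the most likely ones to fetch ahead of time. The data structure and names are assumptions made for illustration, not the mi360World design.

    from collections import defaultdict

    # Illustrative sketch only: a toy navigation graph mapping each video
    # segment/view to the views viewers most often move to next, used to
    # decide what a client or cache should prefetch.
    navigation_graph = defaultdict(dict)
    navigation_graph["t0_front"] = {"t1_front": 0.7, "t1_left": 0.2, "t1_right": 0.1}

    def segments_to_prefetch(current_segment, budget):
        """Return up to `budget` likely next segments, most probable first."""
        candidates = navigation_graph.get(current_segment, {})
        ranked = sorted(candidates, key=candidates.get, reverse=True)
        return ranked[:budget]

    # While playing "t0_front", prefetch the two most likely next views.
    print(segments_to_prefetch("t0_front", budget=2))  # ['t1_front', 't1_left']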

The third thrust focuses on quality of experience, with the goal of devising novel metrics and evaluation methods to assess cyber sickness. The researchers say the system architectures and algorithms will be evaluated extensively through simulation, emulation, and benchmarking on testbeds to gauge the success of the proposed research.

The researchers explain that, if successful, their work will transform 360 video creation and delivery and enable new and much richer educational, training, and entertainment experiences. It will also help train a new class of multimedia systems researchers and practitioners.

“In addition to the obvious industrial impact of the mi360World project, it will have major impact on students,” said Nahrstedt. “We will be educating a new class of multimedia systems researchers who will be in high demand in academia and the media industry.”


NSF Award Abstract #1900875, "CNS Core: Medium: Collaborative Research: Scalable Dissemination and Navigation of Video 360 Content for Personalized Viewing"

See the original CSL story.



This story was published December 4, 2019.