Nahrstedt to Shake Off Zoom Fatigue With an Augmented-Reality System for Virtual Meetings

9/30/2021 | Jenny Applequist, Coordinated Science Lab

Under a new grant from the National Science Foundation, Illinois CS professor and CSL director Klara Nahrstedt will lead a timely effort to create a next-generation, mixed-reality, immersive meeting environment.

The COVID-19 pandemic has forced hundreds of millions of people to interact with each other over online videoconferencing systems instead of meeting face-to-face—and no one would deny that tools like Zoom have made the pandemic far easier to cope with. However, such services provide only an unnatural-seeming meeting environment that emphasizes participants’ isolation and potentially leaves them feeling marginalized, unseen, uncomfortable, and less able to focus, resulting in less productive conversations. 

Now, under a new grant from the National Science Foundation, Illinois CS professor and Coordinated Science Laboratory director Klara Nahrstedt will lead a timely effort to create a next-generation, mixed-reality, immersive meeting environment that offers attendees a vivid experience that better simulates the feeling of in-person conversations.

The project was inspired by the faculty PIs’ less-than-ideal experiences when switching to remote lecturing as COVID-19 hit in March 2020: participating students might or might not be visible, and it was never clear where to look when talking. Group work and interaction were difficult, and became even harder when a meeting mixed people who were together in person with others joining via Zoom.

“It is clear that the COVID-19 virus will stay with us for some time,” says Nahrstedt, who is also a Grainger Distinguished Chair in Engineering. “Even if a cure is found, hybrid gatherings—in which in-person and virtual groups come together—will happen because of other diseases, unforeseen circumstances, and other needs.”

Indeed, emergency use during crises is far from being the only valuable application of the proposed virtual meeting environment. To give just one example, if superior online learning options are enabled, a range of educational and workforce development activities could be pursued more effectively, by more people, and at lower cost.

Nahrstedt’s colleagues in the project include Profs. Ramesh K. Sitaraman and Michael H. Zink of the University of Massachusetts Amherst and Prof. Jacob Chakareski of the New Jersey Institute of Technology. Together, the team will build and evaluate an augmented-reality distributed system called miVirtualSeat that will closely simulate the immersive experience of in-person meetings. 

[Graphic: axis diagram for an augmented-reality headset]

The vision is that miVirtualSeat will enable “hybrid” teleconferencing, wherein some people are locally present—for example, around a table in a physical meeting room—while virtual participants attend the meeting remotely, potentially from locations that have only limited compute and network resources. The locally present people will wear augmented-reality headsets through which it will appear as if the remote people (in the form of volumetric videos) are also locally present, sitting in physical chairs around the local table. Similarly, the remote participants will see the meeting room, with the physical participants, displayed in their own virtual-reality devices. This type of system would, for example, allow a course instructor to engage in a natural-feeling “classroom” discussion with a group of students, some of whom are physically present in the same room and some of whom are remotely occupying “virtual seats.”
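
To make the “virtual seat” idea concrete, here is a minimal illustrative sketch, in Python, of how an AR client might pin each remote attendee’s volumetric stream to an empty physical chair. It is not the project’s code: the SeatAnchor and RemoteParticipant types, the assign_seats logic, and the draw_volumetric call are all hypothetical assumptions for illustration.

    # Illustrative sketch only: hypothetical types showing how remote
    # attendees' volumetric streams could be pinned to physical chairs.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class SeatAnchor:
        """A physical chair's pose in the meeting room's coordinate frame."""
        seat_id: str
        position: Tuple[float, float, float]  # (x, y, z) in meters
        facing_deg: float                     # rotation about the vertical axis
        occupied: bool = False

    @dataclass
    class RemoteParticipant:
        """A virtual attendee delivered as a volumetric-video stream."""
        name: str
        stream_url: str
        seat: Optional[SeatAnchor] = None

    def assign_seats(participants, anchors):
        """Give each remote participant an unoccupied chair to 'sit' in."""
        free = (a for a in anchors if not a.occupied)
        for person, seat in zip(participants, free):
            seat.occupied = True
            person.seat = seat

    def render_frame(participants, ar_scene):
        """Each frame, composite every volumetric avatar at its chair's pose.
        (ar_scene.draw_volumetric is a stand-in for a real rendering API.)"""
        for p in participants:
            if p.seat is not None:
                ar_scene.draw_volumetric(p.stream_url,
                                         at=p.seat.position,
                                         yaw=p.seat.facing_deg)

Under this kind of scheme, a local headset only needs the room’s seat poses plus each remote stream; the remote side would do the mirror-image composition, rendering the physical room into the virtual participants’ displays.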

The NSF project, entitled “CNS Core: Medium: miVirtualSeat: Semantics-aware Content Distribution for Immersive Meeting Environments,” will focus on key research challenges that must be overcome to achieve that vision. Specifically, the team will develop ways to detect, track, and localize distributed physical and virtual 360-degree avatars and objects in a joint immersive scene in real time; to reduce the bandwidth and latency of delivering integrated and synchronized 360-degree, volumetric, and 2D/3D video and ambisonics audio; and to ensure a good quality of experience in the form of natural interactions between physical and virtual participants.
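
The announcement doesn’t explain how “semantics-aware content distribution” will actually work, but the underlying bandwidth idea can be illustrated with a toy example. In the hypothetical Python sketch below, every name, weight, and bitrate is an assumption: streams are ranked by simple meeting semantics (who is speaking, who is being looked at), and a limited link budget is spent on the most important streams first.

    # Toy illustration (not project code): rank each participant's stream by a
    # simple semantic score, then spend a limited bandwidth budget greedily.

    # Hypothetical per-stream bitrates in Mbps for three quality tiers.
    TIERS = [("full", 25.0), ("reduced", 8.0), ("thumbnail", 1.5)]

    def semantic_score(stream):
        """Assumed heuristic: speakers matter most, then anyone being looked at."""
        score = 0.0
        if stream["is_speaking"]:
            score += 2.0
        score += stream["gaze_hits"] * 0.5   # how many attendees look at them
        return score

    def allocate(streams, budget_mbps):
        """Assign the best tier each stream can afford, most important first."""
        plan = {}
        for s in sorted(streams, key=semantic_score, reverse=True):
            for tier, cost in TIERS:
                if cost <= budget_mbps:
                    plan[s["name"]] = tier
                    budget_mbps -= cost
                    break
            else:
                plan[s["name"]] = "audio_only"   # video dropped entirely
        return plan

    streams = [
        {"name": "instructor", "is_speaking": True,  "gaze_hits": 6},
        {"name": "student_a",  "is_speaking": False, "gaze_hits": 2},
        {"name": "student_b",  "is_speaking": False, "gaze_hits": 0},
    ]
    print(allocate(streams, budget_mbps=36.0))
    # -> {'instructor': 'full', 'student_a': 'reduced', 'student_b': 'thumbnail'}

The real system would face a much harder version of this problem, since it must also keep the 360-degree, volumetric, and audio components of each participant synchronized while staying within the compute and network limits of each remote site.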

Nahrstedt and her collaborators are looking forward to putting together a prototype miVirtualSeat system on the UIUC campus. “I am very excited to work on this system,” she says. “We are going to create and experiment with the teleconferencing system across all three universities, working with AR/VR/360 cameras, 3D cameras, et cetera. Should be really fun!”

The $1.2 million project will run for three years starting in October 2021. More information can be found on the project’s website.


See the original Coordinated Science Lab story.

