A Solution for ‘Zoom Burnout’: Sitaraman and Zink to Build an Augmented Reality System for Virtual Meetings
As the global pandemic closed the doors of schools, offices and public buildings worldwide, online teleconference platforms such as Zoom, WebEx and Teams have become the places where people gather to get work done. Yet, as anyone who has dragged themselves through back-to-back meetings or wrestled with the mute button knows, online conferencing software brings its own malady — so-called Zoom burnout. Worse, the move to online work can exacerbate feelings of isolation.
Now, a team of researchers, including Ramesh Sitaraman of the Manning College of Information and Computer Sciences (CICS) and Michael Zink of the College of Engineering, is working on the next generation of teleconferencing, thanks to a $1.2 million grant from the National Science Foundation (NSF). They are part of a larger team, led by the University of Illinois’s Klara Nahrstedt and including researchers at the New Jersey Institute of Technology, that is designing miVirtualSeat — an immersive meeting environment that better simulates the real-life feeling of in-person get-togethers.
“How can we make this experience as pleasant and lively as possible?” asks Zink. “How can science teams and classes meet in a way most conducive to education and research?”
The vision is that miVirtualSeat will enable “hybrid” teleconferencing: a mix of in-person and remote participation. The locally present people will wear augmented-reality headsets through which the remote people (in the form of volumetric videos) will appear to be locally present as well, sitting in physical chairs around the local table. Similarly, the remote participants will see the meeting room, with the physical participants, displayed in their own virtual-reality devices.
There are a number of enormously complicated scientific problems that the team has to solve, including developing ways to detect, track, and localize distributed physical and virtual 360-degree avatars and objects in a joint immersive scene in real time; reducing the bandwidth and latency of delivering integrated and synchronized 360-degree, volumetric, and 2D/3D video and ambisonics audio; and ensuring good quality-of-experience in the form of natural interactions between physical and virtual participants.
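To give a sense of why reducing the bandwidth of volumetric video is such a hard problem, here is a rough back-of-envelope estimate. All figures below (point-cloud density, bytes per point, frame rate) are illustrative assumptions, not specifications from the miVirtualSeat project:

```python
# Back-of-envelope estimate of raw (uncompressed) bandwidth for one
# volumetric-video participant captured as a point cloud.
# All parameter values are assumptions for illustration only.

points_per_frame = 300_000   # assumed point-cloud density per frame
bytes_per_point = 15         # assumed: 3 x 32-bit coordinates + 3 x 8-bit colors
frames_per_second = 30       # assumed capture rate

bits_per_second = points_per_frame * bytes_per_point * 8 * frames_per_second
mbps = bits_per_second / 1e6

print(f"Uncompressed: ~{mbps:.0f} Mbit/s per participant")
```

Under these assumptions a single uncompressed participant stream would exceed a gigabit per second — orders of magnitude beyond a typical home connection — which is why aggressive compression and clever delivery are central research problems for the team.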
This is where the UMass team comes in. “We bring expertise in two critical areas,” says Sitaraman. “Sophisticated video delivery, or how we move enormous amounts of information over the internet nearly instantaneously, and the quality of experience, or how we create a virtual experience that is nearly as good as the in-person.”
It’s a challenge that Zink and Sitaraman, who have been close colleagues for a decade, are looking forward to. They’ll spend the next three years, along with their collaborators and students, tackling the scientific problems of providing an immersive virtual meeting experience.
Originally published by the UMass Amherst Office of News and Media Relations.