June 13, 2022
Tarek Abdelzaher received an Outstanding Paper Award at RTAS'22 and the Test of Time Influential Paper award.
It's a great achievement for researchers to be recognized at a conference for their work. One Illinois researcher's work received attention not once but twice at the same conference, for two different papers.
Tarek Abdelzaher, a professor of computer science at the University of Illinois Urbana-Champaign and the Coordinated Science Laboratory, along with other researchers from UIUC, Massachusetts Institute of Technology, and George Mason University, received an Outstanding Paper Award at RTAS'22 for their work, “Self-Cueing Real-Time Attention Scheduling in Criticality-Aware Visual Machine Perception.” At the same conference, a paper Abdelzaher helped author 20 years ago, “RAP: A Real-Time Communication Architecture for Large-Scale Wireless Sensor Networks,” received the Test of Time Influential Paper award.
"Both of these papers have been a great team effort, and it wouldn't have been possible without all of the authors," said Abdelzaher. "It's lucky to have a great team, and I think that was a big factor in us winning both awards."
Both projects focus on improving algorithms that automate the processing of sensor data, but in very different ways. In the most recent paper, the researchers demonstrated how to shrink the computational footprint of perception-based AI so that it can run on smaller hardware. Perception algorithms, such as those that let drones and autonomous cars see, typically try to process everything in view, and all that computing demands a bigger processor. By reducing the cost of the algorithm, the researchers could build systems that are lighter both physically and computationally. To get there, they took inspiration from how humans perceive the world.
"If I'm sitting in an IMAX theater, I'm not actually looking at all the pixels on that screen and paying the same attention to them. I'm looking at where the action is," said Abdelzaher. "We don't need a machine that's powerful enough to keep up with all the pixels because humans aren't powerful enough to keep up with all the pixels. We just needed to know where to focus." The research was funded by the Army Research Laboratory through the Internet of Battlefield Things (IoBT) REIGN project.
Rather than process everything at once, the AI uses an algorithm to self-cue its attention, determining the most significant part of the scene at any given moment. This reduced the computational load without sacrificing a significant amount of quality. The team is hopeful the approach will be woven into daily life in the future.
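The idea of self-cued, criticality-aware attention can be illustrated with a minimal sketch. The function and variable names below are hypothetical, not from the paper: detections in one frame raise the "criticality" of the tiles where they appeared, and a fixed compute budget is then spent greedily on the most critical tiles of the next frame.

```python
def self_cue(prev_scores, detections, decay=0.5):
    """Update per-tile criticality scores from the last frame's detections.

    Old scores decay over time; tiles where objects were just detected
    get a boost. This is the "self-cue": the system's own output tells
    it where to look next. (Illustrative policy, not the paper's.)
    """
    return {tile: prev_scores.get(tile, 0.0) * decay
                  + (1.0 if tile in detections else 0.0)
            for tile in prev_scores.keys() | detections}


def schedule_attention(scores, costs, budget):
    """Greedily select the most critical tiles that fit the compute budget."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    selected, used = [], 0
    for tile in ranked:
        if used + costs[tile] <= budget:
            selected.append(tile)
            used += costs[tile]
    return selected
```

With a budget that covers only one tile per frame, the scheduler skips quiet regions and spends its cycles where the action was last seen, which is the intuition behind running perception on smaller processors.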
“I think this promises a lot more diverse and interesting applications in smart homes, smart hospitals, and just everyday life where more and more computing devices are equipped with intelligent capabilities,” said Abdelzaher. “I think some of this work will empower more applications in that space.”
Abdelzaher’s other paper, “RAP: A Real-Time Communication Architecture for Large-Scale Wireless Sensor Networks,” has gained significant recognition among researchers since it was published 20 years ago. The article has been cited several hundred times and was honored at the conference for its lasting contribution to the field.
In 2002, the paper’s authors wrote that RAP provided a novel prioritization protocol for distributed micro-sensing applications. It was beneficial for communication scheduling in sensor networks, in which many wireless devices are seamlessly integrated into a physical space to perform real-time monitoring or control, such as a surveillance system.
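The flavor of deadline-aware prioritization in sensor networks can be sketched as follows. This is a simplified illustration, not RAP's implementation: each packet is ranked by the velocity it needs to sustain (remaining distance divided by time to deadline), and a node forwards the packet with the highest required velocity first. All names here are illustrative.

```python
import heapq


def required_velocity(remaining_distance, time_to_deadline):
    """A packet that must travel farther in less time needs higher velocity."""
    return remaining_distance / time_to_deadline


class VelocityQueue:
    """Forwarding queue that serves the highest-required-velocity packet first."""

    def __init__(self):
        self._heap = []

    def push(self, packet, remaining_distance, time_to_deadline):
        v = required_velocity(remaining_distance, time_to_deadline)
        # heapq is a min-heap, so negate velocity to pop the largest first.
        heapq.heappush(self._heap, (-v, packet))

    def pop(self):
        return heapq.heappop(self._heap)[1]
```

The appeal of a velocity-style metric is that it is purely local: a node needs only a packet's destination distance and deadline to make a globally meaningful scheduling decision, which is what makes such policies practical in very large networks.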
“Our paper was the first to propose a scheduling algorithm that meets real-time constraints on end-to-end tasks that run in very large distributed sensing systems,” said Abdelzaher. “It’s definitely flattering to hear that a paper that we wrote 20 years ago won an influential paper award. That’s a very good compliment.”
Read the original story from the Coordinated Science Laboratory.