8/19/2022 8:58:16 AM
His team’s solution can prevent nefarious use of health-monitoring devices as spying tools.
When you’re preparing for a trip, perhaps you do a few things to make your house look occupied when you’re gone. Maybe you put a lamp on a timer or ask the Post Office to hold your mail.
But have you ever considered recruiting a bunch of helpful ghosts to scare off intruders by appearing to live—even breathe—inside your empty home?
Thanks to Deepak Vasisht and his team, that far-fetched-sounding tactic will soon be an option—and with good reason.
Recent years have seen dramatic advances in remote health-monitoring technology. Data on a range of health metrics, such as respiration patterns and heart rate, can now be collected by a sensor installed in a user’s home.
These sensors work a bit like the sonar of dolphins: they emit radio signals that reflect off people’s bodies and return to the sensor subtly altered, carrying information about each person’s physiological state. The sensors can easily “see” users even through walls, and there’s no need for users to wear clunky devices or remember to record health data.
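The article doesn’t give the sensing math, but the basic physics it describes can be sketched in a few lines: chest motion from breathing changes the round-trip path of the reflected radio wave, which shifts the signal’s phase, and the breathing rate can be read off that phase over time. Everything below (sampling rate, wavelength, chest displacement) is an assumed illustrative value, not a detail from the RF-Protect paper.

```python
import numpy as np

# Illustrative sketch (assumed parameters, not the actual sensor firmware):
# breathing moves the chest a few millimeters, which modulates the phase
# of a reflected RF signal; a Fourier transform recovers the rate.

fs = 20.0                    # samples per second (assumed)
t = np.arange(0, 60, 1 / fs) # one minute of measurements
wavelength = 0.06            # ~5 GHz carrier, in meters (assumed)

breaths_per_min = 15
chest_motion = 0.005 * np.sin(2 * np.pi * (breaths_per_min / 60) * t)  # ~5 mm

# Round-trip path-length change of 2*d shifts the phase by 4*pi*d/lambda
phase = 4 * np.pi * chest_motion / wavelength
signal = np.exp(1j * phase)

# Find the dominant frequency of the recovered phase
centered = np.angle(signal) - np.mean(np.angle(signal))
spectrum = np.abs(np.fft.rfft(centered))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
estimated_bpm = 60 * freqs[np.argmax(spectrum[1:]) + 1]
print(round(estimated_bpm, 1))  # → 15.0
```

The same phase-tracking idea extends to heart rate (a smaller, faster motion) and to detecting whether anyone is moving at all, which is exactly what makes a misused sensor a privacy risk.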
In short, these sensors offer great benefits, but they have a dark side: what if someone else places such a device just outside your home to spy on you? While the sensors were intended for uses like monitoring sleep quality, an unintended consequence is that they can easily tell whether anyone is home, how many people are present, and even whether they’re awake. A defense against wrongful use of these devices has therefore been needed.
“That’s where we come in,” says Vasisht, an Illinois Computer Science professor.
To thwart the rogue use of health sensors, Vasisht and his team—including his students Jayanth “Jay” Shenoy, Zikun Liu, and Bill Tao, as well as Zachary Kabelac of a startup called Hedron—took inspiration from certain web browser privacy solutions.
“Google and Facebook, all of these companies, track every click you make,” says Vasisht. “And what some privacy solutions do is inject fake clicks, so [trackers] are confused about what your real preferences are. So that’s exactly what we’re doing here: we are injecting fake humans in the environment, which look real to this technology, but they don’t exist.”
Realistic “fake humans” may sound like a tall order, but Vasisht and his team have found a way to conjure them up. The trick involves complicated math, but the essential idea is simple: you make your home look occupied by installing a reflecting device that can precisely mimic human bodies’ reflection of signals. When the reflector bounces back the malicious sensor’s signal, it tweaks that signal to create the illusion that some desired number of humans are present. Further, it can do so with such sophistication that it provides realistic-seeming physiological data, such as breathing patterns, on behalf of the nonexistent occupants, and makes them appear to be moving around in the house.
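The injection idea described above can be sketched in code: each “ghost” is an extra synthetic reflection whose phase carries realistic breathing motion, so a snooping sensor receives the true scene plus fabricated occupants it can’t separate out. This is a hedged illustration with assumed parameter ranges, not the team’s actual RF-Protect implementation.

```python
import numpy as np

def ghost_reflection(t, wavelength=0.06, rng=None):
    """Synthesize the complex reflection of one fake occupant.

    Breathing rate and chest displacement are drawn from plausible
    resting-human ranges (assumed values, for illustration only)."""
    if rng is None:
        rng = np.random.default_rng()
    bpm = rng.uniform(12, 20)           # resting respiration rate
    depth = rng.uniform(0.003, 0.008)   # chest motion, in meters
    motion = depth * np.sin(2 * np.pi * (bpm / 60) * t)
    return np.exp(1j * 4 * np.pi * motion / wavelength)

fs = 20.0
t = np.arange(0, 30, 1 / fs)

# What the attacker's sensor would receive: the real scene plus
# three "occupants" who do not exist.
ghosts = [ghost_reflection(t) for _ in range(3)]
received = sum(ghosts)
```

Because each ghost’s breathing pattern is drawn from the same statistical ranges as real people’s, its reflection is indistinguishable from a genuine occupant’s to a sensor that only sees the summed return signal.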
“If somebody tries to track you in your home, they will not see just you, but a bunch of ghosts that they can’t distinguish from humans,” says Vasisht.
Vasisht and his colleagues are eager to commercialize the technology, which they call RF-Protect. However, they are currently undecided on what form it should take.
Vasisht’s students Shenoy, Liu, and Tao led the writing of a paper describing the technical details of RF-Protect, which will be presented on August 25 at ACM SIGCOMM, the flagship annual conference of the ACM Special Interest Group on Data Communication.