MIT’s AI Can Track Humans Through Walls With Just a Wifi Signal
Researchers at the Massachusetts Institute of Technology have developed a new piece of software that uses wifi signals to monitor the movements, breathing, and heartbeats of humans on the other side of walls. While the researchers say this new tech could be used in areas like remote healthcare, it could in theory also enable far more dystopian applications.
“We actually are tracking 14 different joints on the body […] the head, the neck, the shoulders, the elbows, the wrists, the hips, the knees, and the feet,” Dina Katabi, an electrical engineering and computer science professor at MIT, said. “So you can get the full stick-figure that is dynamically moving with the individuals that are obstructed from you — and that’s something new that was not possible before.” The technology works a little bit like radar, but to teach their neural network how to interpret these granular bits of human activity, the team at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) had to create two separate A.I.s: a student and a teacher.
The team developed one A.I. program that monitored human movements with a camera on one side of a wall, and fed that information to their wifi-sensing A.I., called RF-Pose, as it struggled to make sense of the radio waves passing through the wall from the other side. The research builds on a longstanding project at CSAIL led by Katabi, which hopes to use this wifi tracking to passively monitor the elderly and automatically alert EMTs and medical professionals if they fall or suffer some other injury.
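The cross-modal training setup described above can be sketched in toy form: a camera-side “teacher” model produces joint-coordinate labels for each scene, and a radio-side “student” is trained to reproduce those labels from RF features of the same scene. Everything in this sketch is an illustrative assumption — the linear models, the synthetic paired data, and the feature sizes are made up for demonstration and bear no relation to MIT’s actual RF-Pose code.

```python
import numpy as np

rng = np.random.default_rng(0)

N_JOINTS = 14              # the 14 tracked joints mentioned in the article
CAM_DIM, RF_DIM = 32, 48   # hypothetical feature sizes for camera / radio input

def teacher_predict(cam_features, W_teacher):
    """Camera-side 'teacher': maps visual features to (x, y) joint coordinates."""
    return cam_features @ W_teacher

def train_student(cam_batch, rf_batch, W_teacher, lr=0.05, epochs=5000):
    """Fit the RF-side 'student' so its joint predictions match the teacher's
    labels, using plain least-squares gradient descent."""
    W_student = np.zeros((RF_DIM, N_JOINTS * 2))
    targets = teacher_predict(cam_batch, W_teacher)  # teacher supplies labels
    for _ in range(epochs):
        preds = rf_batch @ W_student
        grad = rf_batch.T @ (preds - targets) / len(rf_batch)
        W_student -= lr * grad
    return W_student

# Synthetic paired data: each sample has a camera view and an RF view of the
# same scene (linked here by a random linear map -- purely an assumption).
cam = rng.normal(size=(256, CAM_DIM))
rf = cam @ rng.normal(size=(CAM_DIM, RF_DIM)) / np.sqrt(CAM_DIM)
W_teacher = rng.normal(size=(CAM_DIM, N_JOINTS * 2))

W_student = train_student(cam, rf, W_teacher)
err = np.abs(rf @ W_student - teacher_predict(cam, W_teacher)).mean()
print(f"mean joint-coordinate error after training: {err:.4f}")
```

The point of the toy example is only the supervision structure: the student never sees ground-truth poses, only the teacher's camera-derived outputs, which mirrors how a vision model can label data for a sensor that has no labels of its own.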
The MIT lab behind the innovation has previously received funding from the Pentagon’s Defense Advanced Research Projects Agency (DARPA), and it has also presented work at a security research symposium curated by a C-suite member of In-Q-Tel, the CIA’s high-tech venture capital firm.