AnhPhu Nguyen, one of the two students, shared a video demonstrating the technology, which was later picked up by 404 Media. The tech, called I-XRAY, uses Meta's smart glasses to live-stream video to Instagram. A computer program monitors the stream and uses AI to recognize faces, pulling photos and running them through public databases to uncover names, addresses, phone numbers, and even relatives. That information is then relayed back to a phone app. Fortunately, the students are not releasing the tool to the public; they built it to show how dangerous facial recognition technology can become when paired with AI.
“The purpose of building this tool is not for misuse, and we are not releasing it,” Nguyen and Caine Ardafiyo explained in a project document. Their goal, they wrote, is to raise awareness that this isn’t a distant dystopian future: it’s already possible with current technology. They note that I-XRAY is unique because it uses large language models (LLMs) to automatically connect names and photos drawn from massive data sources.
In response to a request for comment from The Verge, a Meta spokesperson cited a few lines from the company’s policy documents. Meta’s privacy policy warns smart glasses users not to “use your glasses to engage in harmful activities like harassment, infringing on privacy rights, or capturing sensitive information like pin codes.” It also advises users to “Show others how the capture LED works so they know when you’re recording. If the capture LED is covered, you’ll be notified to clear it before taking a photo or video or going live.”
With an AI-powered future on the horizon, both developers and users need to remain vigilant about how these tools are used and ensure that innovation doesn't come at the cost of personal security and trust. For now, there are steps individuals can take to protect their privacy: in their document, Nguyen and Ardafiyo provide a list of reverse face search engines and people-search databases that offer the option to opt out.