AI-Powered Privacy Breach: Engineering Students Develop App to Expose Personal Information Using Smart Glasses
- Oct 04, 2024
A pair of engineering students at Harvard has built an application that works with Ray-Ban Meta smart glasses and can reveal sensitive information about people without their knowledge. The students posted a video demonstration on X (formerly Twitter) showing the app in action. Notably, the app is not intended for public release; its purpose is to highlight the risks posed by AI-driven wearable gadgets equipped with discreet cameras that can photograph and record people.
The application, known as I-XRAY, uses AI-based facial recognition and then processes the resulting visual data to expose personal details. The practice is a form of doxxing, a slang term derived from "dropping dox" (documents), meaning the release of someone's private information without their consent. The app was built around the Ray-Ban Meta smart glasses but would work with any smart glasses that have a discreet camera. It relies on reverse facial recognition similar to that offered by services such as PimEyes and FaceCheck, which match a person's face against publicly available images online and return the associated URLs.
A large language model then analyzes those URLs, using automated prompts to extract details such as the person's name, job title, and home address. The app also draws on publicly accessible government data, including voter registration records, and uses the online people-search service FastPeopleSearch for the same purpose.
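The students have not published their code, but the pipeline described above can be sketched roughly as follows. Everything in this sketch is illustrative only: the function names, the Profile structure, and the assumption that reverse-face-search and people-search services expose programmatic interfaces are placeholders, not details taken from I-XRAY itself.

```python
# Illustrative sketch of the described pipeline, not the students' code.
# All service-facing functions below are hypothetical placeholders; real
# reverse-face-search services (e.g. PimEyes, FaceCheck) and people-search
# sites do not necessarily offer public APIs like these.

from dataclasses import dataclass, field
from urllib.request import urlopen


@dataclass
class Profile:
    name: str | None = None
    occupation: str | None = None
    address: str | None = None
    source_urls: list[str] = field(default_factory=list)


def reverse_face_search(image_bytes: bytes) -> list[str]:
    """Placeholder: match a face against publicly indexed photos and
    return URLs of pages where similar faces appear."""
    raise NotImplementedError("hypothetical reverse-face-search step")


def fetch_page_text(url: str) -> str:
    """Download raw page content for later analysis."""
    with urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")


def extract_details_with_llm(page_text: str) -> dict:
    """Placeholder: prompt a large language model to pull a name and
    occupation out of scraped page text."""
    raise NotImplementedError("hypothetical LLM extraction step")


def lookup_public_records(name: str) -> dict:
    """Placeholder: query people-search / voter-registration style
    sources (e.g. FastPeopleSearch) for address data."""
    raise NotImplementedError("hypothetical public-records step")


def build_profile(image_bytes: bytes) -> Profile:
    profile = Profile()
    # 1. Reverse facial recognition: face -> candidate web pages.
    profile.source_urls = reverse_face_search(image_bytes)
    # 2. LLM extraction: page text -> structured personal details.
    for url in profile.source_urls:
        details = extract_details_with_llm(fetch_page_text(url))
        profile.name = profile.name or details.get("name")
        profile.occupation = profile.occupation or details.get("occupation")
    # 3. Public-records lookup: name -> address and related data.
    if profile.name:
        profile.address = lookup_public_records(profile.name).get("address")
    return profile
```

The point of the sketch is the chaining itself: each stage feeds the next, so a single face capture can be turned into a structured profile with no manual searching, which is precisely the automation the developers warn about.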
In their brief video demo, students AnhPhu Nguyen and Caine Ardayfio showed how the app works: they approached strangers with the camera active and asked for their names, and the AI-driven application quickly retrieved personal information about them.
The developers noted that combining large language models with reverse facial recognition enables a level of comprehensive, automatic data collection that conventional methods cannot match. Although the students say they will not release the app to the public, its existence underscores the risks of AI-infused wearable devices that can record people discreetly, and it does not rule out the possibility that malicious actors could build similar applications using comparable techniques.