Software engineer Vishnu Mohandas decided he would quit Google in more ways than one when he learned the tech giant had briefly helped the US military develop AI to study drone footage. In 2020, he left his job working on Google Assistant and also stopped backing up all of his images to Google Photos. He feared that his content could be used to train AI systems, even ones that weren’t tied to the Pentagon project. “I don’t control any of the future outcomes that this will enable,” Mohandas thought. “So now, shouldn’t I be more responsible?”
Mohandas, who taught himself programming and is based in Bengaluru, India, decided he wanted to develop an alternative service for storing and sharing photos that is open source and end-to-end encrypted. Something “more private, wholesome, and trustworthy,” he says. The paid service he designed, Ente, is profitable and says it has over 100,000 users, many of whom are already part of the privacy-obsessed crowd. But Mohandas struggled to articulate to wider audiences why they should reconsider relying on Google Photos, despite all the conveniences it offers.
Then one weekend in May, an intern at Ente came up with an idea: Give people a sense of what some of Google’s AI models can learn from studying images. Last month, Ente launched https://Theyseeyourphotos.com, a website and marketing stunt designed to turn Google’s technology against itself. People can upload any photo they want to the website, which is then sent to a Google Cloud computer vision program that writes a startlingly thorough three-paragraph description of it. (Ente prompts the AI model to document small details in the uploaded images.)
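The mechanics described above — an uploaded photo forwarded to a Google vision model along with a prompt asking for an exhaustive description — can be sketched roughly as follows. This is an illustrative guess at the shape of such a request, not Ente’s actual code; the prompt wording and the request structure are assumptions.

```python
# Hypothetical sketch of how a site like Theyseeyourphotos might package an
# uploaded photo and a prompt for a vision-capable model. The prompt text and
# the request shape are illustrative assumptions, not Ente's implementation.
import base64


def build_description_prompt(tone: str = "objective") -> str:
    """Assemble the instruction sent alongside the image."""
    return (
        f"Write a short, {tone} three-paragraph description of this photo. "
        "Document small details such as clothing, objects, and setting."
    )


def build_request(image_bytes: bytes, tone: str = "objective") -> dict:
    """Package the image and prompt in the general shape a REST vision
    endpoint might expect (the field names here are made up)."""
    return {
        "prompt": build_description_prompt(tone),
        "image": base64.b64encode(image_bytes).decode("ascii"),
    }


# A real deployment would POST this payload to the model's API endpoint.
request = build_request(b"...raw image bytes...")
print(request["prompt"])
```

Tweaking the `tone` argument mirrors the adjustment Ente made after its first tests: the instruction sent to the model, not the model itself, largely determines whether the output reads as clinical, dark, or “wholesome but still spooky.”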
One of the first photos Mohandas tried uploading was a selfie with his wife and daughter in front of a temple in Indonesia. Google’s analysis was exhaustive, even documenting the specific watch model that his wife was wearing, a Casio F-91W. But then, Mohandas says, the AI did something strange: It noted that Casio F-91W watches are commonly associated with Islamic extremists. “We had to tweak the prompts to make it slightly more wholesome but still spooky,” Mohandas says. Ente started asking the model to produce short, objective outputs—nothing dark.
The same family photo uploaded to Theyseeyourphotos now returns a more generic result that includes the name of the temple and the “partly cloudy sky and lush greenery” surrounding it. But the AI still makes a number of assumptions about Mohandas and his family, like that their faces are expressing “joint contentment” and the “parents are likely of South Asian descent, middle class.” It judges their clothing (“appropriate for sightseeing”) and notes that “the woman’s watch displays a time as approximately 2 pm, which corroborates with the image metadata.”
Google spokesperson Colin Smith declined to comment directly on Ente’s project. He directed WIRED to support pages that state uploads to Google Photos are only used to train generative AI models that help people manage their image libraries, like those that analyze the age and location of photo subjects. The company says it doesn’t sell the content stored in Google Photos to third parties or use it for advertising purposes. Users can turn off some of the analysis features in Photos, but they can’t prevent Google from accessing their images entirely, because the data are not end-to-end encrypted.
For people who would rather not upload their own pictures, Ente offers the option to experiment on Theyseeyourphotos with several preselected stock images. Google’s computer vision is able to pick up on subtle details in them, like a person’s tattoo that appears to be of the letter “G,” and a child’s temporary tattoo of a leaf. “The whole point is that it is just a single photo,” Mohandas says. He hopes the website prompts people to imagine how much Google—or any AI company—can learn about them from analyzing thousands of their photos in the cloud in the same way.
If Theyseeyourphotos motivates you to switch from Google Photos to another image storage service, the transition might not be totally smooth. Mohandas says that Google makes it difficult for people to transfer their photo library elsewhere by breaking up files and compressing them. He also alleges that Google Play, the company’s Android app store, has flagged Ente’s app multiple times for issues such as non-transparent pricing, which Mohandas says are bogus. Google’s Smith says the feedback is appreciated and that the company is constantly making improvements to its services.
Ente, which means “mine” in Mohandas’ native Malayalam, isn’t without its own downsides. Since the service is smaller and open source, features like file sharing and search may not be as advanced yet. If a user loses or forgets their password, which doubles as an encryption key, they could lose access to their photo library. Mohandas says he trusts his own family photos with Ente, which keeps two separate private backups for users. But Google has decades more experience ensuring photos don’t disappear in a poof.
In some ways, though, that’s exactly what concerns Mohandas. He’s worried humanity’s visual archive will be mined in the future in ways he can’t predict or control. “Google is a company which I believe will be there 20 years from now,” he says. Photos snapped of his daughter today reveal who she is and what makes her happy or sad. “This information could be used to manipulate her decades from now by anyone who has access to this data—advertisers, dating websites, employers, and industries that don’t exist yet but will benefit from psychological profiles,” Mohandas says.
He recognizes that he might appear overly paranoid to some people, but, “We don’t know how the future will turn out, and it doesn’t hurt to be cautious, and it doesn’t hurt to have an option.”