AI Search Engine PimEyes Facilitates Image-Based Sexual Abuse of Women… Then Sells Them the Solution
Fin Strathern reveals how PimEyes profits from vulnerable women, providing the solution to a problem it helps to create.
Google for faces. That is the unnerving selling point of PimEyes, a facial recognition search engine that holds more than two billion faces in its database. Like a supercharged reverse image search, it gives any paying customer the power to track a person’s face online and uncover potentially private information.
The site is popular in the darker corners of the internet as a tool to identify women and dig for sexually explicit photos of them online.
But most PimEyes subscribers are women who have paid to discover this online underbelly for the first time and are desperate to hide what they have found.
“Asian girl checking in, didn’t realise there were so many online. How’s everyone’s night?”
Amy* sat up in bed. She never wrote that. Why was the comment sitting there next to a selfie of her from years ago?
Moments earlier, scrolling through TikTok after a long day at work, she had come across a woman warning about the dangers of PimEyes. Immediately concerned, Amy headed to the site and added some photos of herself. As the loading bar filled, PimEyes analysed the biometric features of her face, matching the distance between her eyes, nose, and mouth to faces in its database.
It pulled up more than 100 results. Most were profile pictures from social media platforms reposted on other sites, but one result linking to an adult website stood out to her.
Confused, she clicked the link, only to find that PimEyes locks direct access to its results behind monthly subscription paywalls. Prices range from £30 to more than £300.
After reluctantly paying for the starter plan, Amy found that an old photo she had posted on Reddit was being used to catfish men on adult forums. The responses ranged from compliments to racist slurs. “I felt violated and exploited. By the people using images of me without my permission, and by PimEyes for making me pay to find it,” she told Byline Supplement.
Women like Amy are using PimEyes to discover an array of image-based sexual abuse (IBSA) online. This includes revenge porn posted by ex-partners, deepfake pornography using their faces, and even scenes from sex trafficking. The site requires them to pay a subscription to access the location of the photos and offers a paid service to help remove them from the web.
PimEyes sees this as doing its job. The company markets itself as a pro-privacy tool that helps women find and remove unwanted photos from the internet. It claims to have performed more than 12,000 image takedowns since January, saying that “up to 91 per cent” of subscribers to one of its monthly payment plans are women and girls.
Jake Moore, global cybersecurity advisor at internet security company ESET, said: “PimEyes poses a completely new danger to women due to the speeds at which it can be abused to find all sorts of personal information online without even needing a name.”
Web traffic analysis from data firm Similarweb also shows that, despite this largely female subscriber base, more than two-thirds of PimEyes’ visitors are men. Big Brother Watch, a privacy rights group, told Byline Supplement that this disparity between male users and female subscribers suggests an “exploitative business model”.
“No one should be forced to pay PimEyes to protect them from the risk of stalking and harassment that the company itself has created,” said Madeleine Stone, legal and policy officer at Big Brother Watch.
“Facial recognition tools like PimEyes are ushering in a new era of high-tech sexual harassment, allowing predatory men to track women across the internet at the click of a button.”
Last November, Big Brother Watch filed a legal complaint with the Information Commissioner's Office (ICO) against PimEyes, calling the site “unlawful” and saying it facilitates stalking “on a scale previously unimaginable”.
But the ICO has since ended its inquiry into the search engine, and PimEyes CEO Giorgi Gobronidze said any accusations that the site facilitates image-based sexual abuse of women are “baseless” and “ethically bankrupt”.
“There is no evidence that our company has been somehow facilitating such activities,” he told Byline Supplement.
Down the 4chan Rabbit Hole: A World of Image-Based Sexual Abuse (IBSA)
Most of the women Byline Supplement spoke to about their use of PimEyes found their faces being used on 4chan, an anonymous message board notoriously frequented by internet trolls.
Here, anything goes. Amy found old photos of her being used to catfish men online with a fake identity.
Some found private images that had been shared by ex-partners. Others found photos of themselves artificially edited to appear nude, or their faces inserted into pornographic videos, a process known as deepfaking.
PimEyes is itself a popular tool among 4chan’s user base. Every day on the site, users can be found offering out their PimEyes subscriptions for others to use, crowdsourcing the identities of women and searching for explicit photos of them online – any results are referred to as a ‘win’.
Byline Supplement trawled through hundreds of these posts over several weeks and did not find a single request to search for a man’s face. While some users look to identify women they have seen in porn and find their personal information, most requests relate to women the users appear to know in real life.
We asked anonymous 4chan users offering out their PimEyes accounts if it bothered them that the info they share could be used to stalk or harass women. “Who cares? People can just pay £30 and do it themselves if they really want,” one replied. “I don’t give a f**k, I want to see what these wh*res have online,” another said.
We asked another if they think PimEyes is a tool that should be available to the public. “Probably not, this sh*t is OP,” they answered, gamer terminology for something being ‘overpowered’.
Gobronidze told Byline Supplement that PimEyes has a “several step security protocol” to ensure its terms of service, which include not sharing accounts or looking for others’ faces, are “respected by the user”. The company claims to have blocked more than 500 accounts that broke these rules in recent months.
How PimEyes identifies misconduct or detects the hundreds of 4chan users sharing accounts every day was not made clear.
Uncovering Sexual Trauma: The Extent of PimEyes’ Reach
For Cher Scarlett, an American software engineer and outspoken critic of facial recognition technology, PimEyes revealed more than just online abuse. Last February, Scarlett used the site and discovered footage from a repressed incident in 2005 in which she was coerced into performing violent sex acts on camera. Despite being unable to afford it, she immediately paid $300 for PimEyes’ most advanced subscription plan to try to erase the photos.
“I couldn’t believe what I was seeing. I was horrified that other people had watched this and terrified that other people would find it,” she told Byline Supplement. “I didn’t have $300 to spend on hiding photos, but I did it instantly. There was no other response in that moment that made sense. You think: ‘How do I get rid of this? $300? Sure! Here! Take it!’ It’s a panic decision.”