A user on Weibo, a Chinese social network, is claiming to have “successfully identified more than 100,000 young ladies” in the adult industry “on a global scale” by using newly developed facial recognition software that collects images from social media and “cross-references” them against faces seen in (mostly amateur) adult videos.
A Germany-based Chinese programmer said he and some friends have identified 100k porn actresses from around the world, cross-referencing faces in porn videos with social media profile pictures. The goal is to help others check whether their girlfriends ever acted in those films. pic.twitter.com/TOuUBTqXOP
— Yiqin Fu (@yiqinfu) May 28, 2019
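To be clear, nothing is known about how the Weibo user's software actually works, and the claim itself is unverified. As a rough illustration of what “cross-referencing” faces between two image sets typically involves, the sketch below uses the open-source face_recognition library to compare one reference photo against faces found in a folder of video frames. The file paths, directory layout, and distance threshold are all hypothetical.

```python
# Illustrative sketch only: a generic face cross-referencing pipeline using
# off-the-shelf tools. Paths, the 0.5 threshold, and the directory layout are
# hypothetical; the Weibo user's software remains unverified and its actual
# workings are unknown.
from pathlib import Path

import face_recognition  # open-source wrapper around dlib's face embeddings

# Encode the reference face from a (hypothetical) social media profile photo.
profile_image = face_recognition.load_image_file("profile_photo.jpg")
profile_encodings = face_recognition.face_encodings(profile_image)
if not profile_encodings:
    raise SystemExit("No face found in the profile photo.")
reference = profile_encodings[0]  # 128-dimensional face embedding

# Compare it against faces found in a (hypothetical) folder of video frames.
MATCH_THRESHOLD = 0.5  # lower distance = more similar; value chosen arbitrarily
for frame_path in Path("video_frames").glob("*.jpg"):
    frame = face_recognition.load_image_file(frame_path)
    for candidate in face_recognition.face_encodings(frame):
        distance = face_recognition.face_distance([reference], candidate)[0]
        if distance < MATCH_THRESHOLD:
            print(f"Possible match in {frame_path.name} (distance {distance:.2f})")
```

Even this toy version hints at where the hard problems lie: crawling and extracting faces at enormous scale, and picking a threshold that does not bury real matches under false ones.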
So far, the user has offered no actual evidence of having accomplished this, beyond a barren GitLab page.
Tech website Motherboard contacted the man over Weibo’s chat feature and asked for proof of his claims. He responded, saying a “database schema” and “technical details” would be released the following week, but did not comment further.
Soraya Chemaly, the author of ‘Rage Becomes Her,’ chimed in on Twitter and described software being used to identify adult film actresses, many of whom use pseudonyms or otherwise wish to remain anonymous, as “horrendous”:
This is horrendous and a pitch-perfect example of how these systems, globally, enable male dominance. Surveillance, impersonation, extortion, misinformation all happen to women first and then move to the public sphere, where, once men are affected, it starts to get attention. https://t.co/oPhe4WMi06
— Soraya Chemaly (@schemaly) May 28, 2019
Academics in feminist studies and machine learning have decried the alleged development as “algorithmically-targeted harassment.”
In discussions of the Weibo user’s software, the topic of deepfakes, an AI-based technique for synthesizing human images, came up because the technology has already been used to swap the faces of female celebrities onto the bodies of adult performers.
Motherboard argues that using machine learning in this way violates “women’s bodily autonomy” and amounts to “deep misogyny.”
University of Maryland law professor Danielle Citron called the idea of using tech to essentially doxx adult film performers a “painfully bad idea”:
This is a painfully bad idea—surveillance and control of women’s bodies taken to new low @ma_franks @CCRInitiative @HoLLyCCRi @ariezrawaldman @cagoldberglaw https://t.co/PyiKV5x3Qu
— Danielle Citron (@daniellecitron) May 28, 2019
While it is entirely possible for someone well-versed in programming and machine learning to have built such facial recognition software, experts say it would take a tremendous amount of effort and offer “no guarantee of quality.”
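A rough back-of-envelope calculation suggests why quality would be hard to guarantee at this scale. The figures below are assumptions chosen purely for illustration, not measurements of any real system: even a matcher with a seemingly low false-positive rate produces a flood of spurious matches when one photo is compared against millions of faces.

```python
# Back-of-envelope illustration (assumed figures, not measurements): a low
# per-comparison false-positive rate still yields many spurious matches when
# one profile photo is compared against millions of faces from videos.
candidate_faces = 10_000_000   # assumed number of faces extracted from videos
false_positive_rate = 1e-4     # assumed per-comparison false match rate

expected_false_matches = candidate_faces * false_positive_rate
print(f"Expected false matches per query: {expected_false_matches:,.0f}")
# -> Expected false matches per query: 1,000
# True matches for any given person are rare or zero, so the results would be
# dominated by noise without extensive manual review.
```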
We will just have to wait and see whether the Weibo user comes through with solid proof of something that would undoubtedly affect the adult film industry one way or another.
Main Image Credit: Smart Cities World