Etsy hosts pornographic celebrity deepfakes – Forbes France

AI-generated pornographic images of at least 55 celebrities were available for purchase on Etsy until Forbes informed the company. Meanwhile, Etsy’s recommendations and “related searches” feature directed users to even more such items.

Article by Cyrus Farivar for Forbes USA – translated by Flora Lucas

On Etsy, anyone can buy Olivia Munn t-shirts and photos signed by the actress herself. There are also personalizable mugs for fans of the American actress and model. And until Tuesday, you could also buy AI-generated deepfake images of Olivia Munn, some of them pornographic.

“With a resolution of 300 DPI, Olivia brings details to life, making it ideal for digital art, design and print. Olivia celebrates the elegance of the female form in an artistic and tasteful way, making it suitable for a variety of creative applications,” read the description of one image, on sale for $2.49.

Bryan Sullivan, the lawyer for Olivia Munn, who starred in The Newsroom and several superhero movies, said his client felt “insulted” by the images. “It’s outrageous and it violates my client’s rights and, more importantly, her dignity,” he told Forbes. “We will take steps to have this content removed and to prevent it from happening again. And I have already started that process with Etsy.”

Although Bryan Sullivan said he notified Etsy immediately after being contacted by Forbes on December 13, Etsy did not remove the images until six days later, when Forbes contacted the company directly.

Other sellers offered similar images at slightly higher prices ($5.51) and even got creative: “Jenna Ortega just took a bath. Shaved private parts, too big chest.” Another seller offered to create images of “any celebrity you like, naked in various positions…whether they’re naked, having sex, or just in their underwear” for just $0.92.


“I’ll be honest, it’s on Etsy, dammit! That’s how we know this practice has become mainstream: we find it on Etsy.”

Hany Farid, Professor, UC Berkeley


Etsy removed the accounts flagged by Forbes, but the problem is not solved

Etsy removed these accounts (16 identified by Forbes) but left up thousands of other listings containing AI-generated pornography, all easily found with the most basic search terms. “We are deeply committed to the safety of our marketplace and community and removed the listings in question as soon as they were brought to our attention,” Alice Wu, head of security at Etsy, said in an email to Forbes. “Imitations of nude celebrities are prohibited. As this is an emerging issue across the Internet, we are actively working to strengthen our enforcement efforts in this area.”

Etsy, which has grown to a market capitalization of nearly $10 billion by brokering the sale of trinkets and crafts, is grappling with the problem of deepfake pornography brought on by the mainstreaming of artificial intelligence.

Etsy declined to explain its policy on AI-generated images of real, non-famous people, or to explain why there are so many AI-generated pornographic images on its site. “While some adult content is allowed on Etsy, pornography is prohibited,” Alice Wu added.

Despite this policy, a search for the keywords “deepfake porn” returned more than 1,600 results as of December 18. Some of these results were not pornographic, simply offering non-explicit services to “create your own deepfake video.” After Forbes contacted Etsy, the number of results for that search term dropped to just under 1,500. Similarly, on Monday, a search for “ai nude” returned more than 4,000 results; after Forbes contacted Etsy, that number dropped to fewer than 3,700.

Some of these AI-generated images were deepfakes of female celebrities such as Olivia Munn, Jenna Ortega and Ariana Grande. Others depicted entirely invented people, mostly women.

The listings were very clear about what they were selling. “This pack contains 40 high-quality, uncensored JPG images of many beautiful, completely nude, AI-generated young women in various positions and locations. There are no duplicates with other packs,” one listing read.

A massive problem on Etsy

Although Etsy’s Prohibited Items policy at the time of publication banned the sale of pornography (defined as “material that explicitly describes or depicts sexual acts, genitalia, or other erotic behavior for the purpose of sexual arousal or stimulation”), plenty of pornography is sold on the site.

At the time of publication, most direct searches for phrases like “AI porn” returned explicit images, including AI-generated collections of “goth cour,” “naughty nurses,” and “winter flashing,” as well as a decorative pillow depicting oral sex.

Meanwhile, Etsy’s recommendation algorithms directed users to similar images. At the bottom of the now-deleted listing for the fake Olivia Munn image were several other AI-generated erotic images of famous women sold by the same seller, along with suggestions to “explore related searches” using terms such as “nsfw ai art” and “olivia munn nude”.

According to Hany Farid, a professor of computer science at UC Berkeley and an expert on generative artificial intelligence technologies, there is “no technical reason” why Etsy could not do a better job of filtering these materials. Searches for the same phrase (“deepfake porn”) on other e-commerce platforms, including Amazon and eBay, do not yield similar results.

Officially, Etsy distinguishes between nude images, which it allows, and pornography, which it does not. Etsy adheres to the common legal definition of pornography, prohibiting images that depict genitalia or sexual acts “for the purpose of sexual arousal.”

“We are still working to determine the place of AI-generated products in our marketplace, but AI-generated listings that violate our long-standing policies will be removed,” Alice Wu said in a statement. She added that while sellers are “responsible for following our policies,” Etsy monitors the site “both manually and through automated checks.” She declined to explain exactly what this entails, or why, with such measures in place, a simple search for “AI porn” continues to return pornographic deepfakes of famous actresses.

Pornographic deepfakes are facilitated by the mainstreaming of AI

In recent years, deepfake pornography, which disproportionately targets women, has become much more sophisticated and far easier to create, and it has now spread to unprecedented levels, experts say.

Easily available software can take almost any image and render it pornographic, often in near-realistic detail. Websites dedicated to using AI to strip clothing from images of real women already allow anyone to create endless pornographic images in seconds, but those sites are not publicly traded e-commerce platforms.

As Hany Farid put it: “I’ll be honest, it’s on Etsy, dammit! That’s how we know this practice has become mainstream: we find it on Etsy.”

Etsy, founded in 2005, went public in 2015. In 2022, the company reported a profit of $643 million, up from $627 million the previous year. At the beginning of December, it laid off 11% of its employees. It is also struggling to cope with a flood of AI-generated content, from bizarre coloring pages to an abundance of cheap coffee mugs with funny sayings, The Atlantic reported earlier this year.

According to one Etsy seller who offered these fake celebrity images, “they’re not particularly popular” because “anyone can create AI-generated explicit images these days.” The other sellers did not respond to Forbes’ interview requests.

Rebecca Delfino, a law professor at Loyola Marymount University in Los Angeles who has spent years studying the intersection of deepfakes and the law, told Forbes that there is no US federal law protecting victims of deepfakes, though some laws exist at the state level. “When you sell something commercially and sell it in bulk, you’re exposed to a whole range of legal claims, from appropriation of likeness to defamation to false light, and in some states like California and New York there are now civil causes of action,” she said, pointing to the states where many of these celebrities are based.

The First Amendment

According to Rebecca Delfino, most celebrity lawyers would send Etsy a demand letter to protect their clients. To date, she adds, no major case has tested the new state laws on deepfake pornography. However, a case involving a non-pornographic deepfake is currently being tried in federal court in Los Angeles.

Earlier this year, Kyland Young, a California reality TV personality, successfully sued Neocortext, the Ukrainian company behind the face-swapping app Reface. Kyland Young argued that Neocortext violated his right of publicity under California state law by allowing users to pay to insert themselves into still images with him, or even to swap their faces onto his body. This month, Neocortext appealed the ruling to the 9th US Circuit Court of Appeals, arguing that its app is protected by the First Amendment.

Given the limited legal recourse (not to mention the time and resources required to pursue it), the responsibility for controlling the spread of pornographic deepfakes rests largely with the technology platforms that host this content or make it easy to find, whether Etsy or Google, whose search results point to other sites publishing pornographic deepfakes.

“I don’t think this problem can be solved with formal notices and lawyers,” Hany Farid said. “It’s the Internet. You need to know where the checkpoints are.”

