AI “nudify” websites are raking in millions of dollars

For years, so-called “nudify” apps and websites have been emerging online, allowing people to create nonconsensual and abusive images of women and girls, including child sexual abuse material. Although some lawmakers and tech companies have taken steps to limit these harmful services, millions of people are still visiting the websites every month, and their creators may be making millions of dollars each year.
An analysis of 85 nudify and “undress” websites, which allow people to upload photos and use AI to generate “nude” images of the subjects with just a few clicks, found that most of the sites rely on technical services from Google, Amazon, and Cloudflare to operate and stay online. The findings, revealed by Indicator, a publication investigating digital deception, show the sites averaged 18.5 million visitors a month over the past six months and may collectively be making up to $36 million per year.
Alexios Mantzarlis, a cofounder of Indicator and an online safety researcher, says the murky nudifier ecosystem has become a “lucrative business” sustained by “Silicon Valley’s laissez-faire approach to generative AI.” “When their only use case is sexual harassment, they should stop providing any services to AI nudifiers,” Mantzarlis says of the tech companies. Creating or sharing explicit deepfakes is increasingly becoming illegal.
According to the research, Amazon and Cloudflare provide hosting or content delivery services for 62 of the 85 websites, while Google’s sign-on system is used on 54 of the sites. The nudify websites also use a host of other services, such as payment systems, provided by mainstream companies.
Amazon Web Services spokesperson Ryan Walsh says AWS has clear terms of service that require customers to comply with “applicable” laws. “When we receive reports of potential violations of our terms, we act quickly to review them and take steps to disable prohibited content,” Walsh says, adding that people can report issues directly to its safety teams.
“Some of these sites violate our terms, and our teams are taking action to address these violations, as well as working on longer-term solutions,” says Google spokesperson Karl Ryan.
Cloudflare had not responded to WIRED’s request for comment at the time of writing. WIRED is not naming the nudify websites in this story, so as not to give them further exposure.
Nudify and undress websites and bots have existed since at least 2019, initially emerging from the tools and processes used to create the first explicit “deepfakes.” As Bellingcat has reported, networks of interconnected companies have since appeared online, providing the technology and making money from the systems.
Broadly speaking, the services use AI to transform photos into nonconsensual explicit images. They often make money by selling “credits” or subscriptions that can be used to generate the photos, and they have been supercharged by the wave of generative AI image generators that have emerged in the past few years. Their output is hugely damaging: social media photos have been stolen and used to create abusive images, and, in a new form of cyberbullying and abuse, teenage boys around the world have created images of their classmates. This intimate image abuse is harrowing for victims, and the images can be difficult to scrub from the internet.