- AI ‘Nudify’ Websites Are Raking in Millions of Dollars - WIRED
Millions of people are accessing harmful AI “nudify” websites. New analysis says the sites are making millions and rely on tech from US companies.
- Analyzing the AI Nudification Application Ecosystem
First, by researching the features of AI nudification applications, we may develop and communicate knowledge about these applications that could be used by bad actors.
- Nudify Apps Are Proliferating Despite Illegality, Lawsuits
One lamentable use of artificial intelligence involves so-called "nudify" apps. As the name suggests, these apps take ordinary pictures and generate nude images from them. They have been around since at least 2019 and are one of the first uses of deepfakes.
- Combating Nudify Apps with a Lawsuit and New Technology | Meta
Across the internet, we’re seeing a concerning growth of so-called ‘nudify’ apps, which use AI to create fake non-consensual nude or sexually explicit images.
- Meta urged to go further in crackdown on nudify apps
Meta has taken legal action against a company which runs ads on its platforms promoting so-called "nudify" apps, which typically use artificial intelligence (AI) to create fake nude images.
- Meta’s platforms showed hundreds of nudify deepfake ads, CBS News …
Meta has removed a number of ads promoting "nudify" apps — AI tools used to create sexually explicit deepfakes using images of real people — after a CBS News investigation found hundreds of such ads.
- Minnesota considers blocking nudify apps that use AI to . . . - AP News
Minnesota considers blocking ‘nudify’ apps that use AI to make explicit images without consent. Minnesota Democratic State Senator Erin Maye Quade, center, discusses her bill to crack down on “nudification” apps during a news conference at the Minnesota State Capitol in St. Paul on Monday, Feb. 24, 2025.
- This country is banning ‘nudify’ apps. How will it actually work?
Nudify or “undress” tools are available on app stores and websites. They use artificial intelligence (AI) methods to create realistic but fake sexually explicit images of people.