Nudify app's strategy to dominate deepfake porn depends upon Reddit, docs show

generate fake nudes of a famous singer. "There are enough pictures of her on the Internet as it is," the user reasoned. However, that user draws the line at generating fake nudes of private individuals, insisting, "If I ever learned of someone producing such photos of my daughter, I would be horrified."

For young boys who appear flippant about creating fake nude images of their classmates, the consequences have ranged from suspensions to juvenile criminal charges, and for some, there could be other costs.
In the lawsuit where the high schooler is attempting to sue a boy who used Clothoff to bully her, there's currently resistance from boys who participated in group chats to share what evidence they have on their phones. If she wins her fight, she's asking for $150,000 in damages per image shared, so sharing chat logs could potentially increase the price tag.

Since she and the San Francisco city attorney each filed their lawsuits, the Take It Down Act has passed. That law makes it easier to force platforms to remove AI-generated fake nudes.
But experts expect the law to face legal challenges over censorship fears, so the very limited legal tool might not withstand scrutiny.

Either way, the Take It Down Act is a safeguard that came too late for the earliest victims of nudify apps in the US, only some of whom are turning to courts to seek justice, in part because opaque laws made it unclear whether generating a fake nude was even illegal.

"Jane Doe
is one of many girls and women who have been and will continue to be exploited, abused, and victimized by non-consensual pornography
generated through artificial intelligence," the high schooler's complaint noted, "because the governmental institutions that are supposed to protect women and children from being violated and exploited by the use of AI to generate child pornography and nonconsensual nude images failed to do so."