- cross-posted to:
- technology@piefed.social
Users were prompting Grok, the platform’s built-in AI feature, to “nudify” women’s images.
That reluctance only heightens the risks of Grok’s “nudifying” capabilities. “When one company is able to do something and is not held fully accountable for it,” Winters says, “it sends a signal to other Big Tech giants that they can do the next thing.”



Bozo take.
Up until recently, you would have needed thousands of hours of Photoshop or visual-effects experience to get even a mediocre result.
With current AI, the barrier to entry is basically nothing and the results can often be indistinguishable from reality.
The solution is obvious: governments need to make non-consensual reproduction of an individual’s likeness illegal and actively enforce it.
The tools are already out there. Regulating them is a lost cause at this point…
This is a non-starter. If you’re in public, people can take your picture. It’s rude to do so without asking permission but it’s still perfectly legal. If it were not legal, all those people recording ICE doing horrible things wouldn’t be able to do so.
Even if there were an exception for recording people working for the public, you’d still end up recording bystanders in the background. It’s too dangerous—from a societal perspective—to place roadblocks in the way of recording people or taking their picture. People do bad things every day and one of the only ways to deal with that is doing stuff like taking pictures without consent.
You could pass a law banning this very specific use case: distributing fake nudes. But that’s the thing: they’re fake. They’re just guesses (by a computer). From a legal perspective, it’s no different from paying an artist to paint a nude portrait of someone based on a (non-nude) photo, which is not illegal.
Not only that, but you’d need a law like this to exist in every country, everywhere. Then you’d need some sort of enforcement mechanism. How would that even work?
“This person uploaded their likeness to the Internet, which was reproduced by Twitter 1 million times but then someone used it to make a fake nude of them which was reproduced by Twitter 10 million times. The Twitter TOS says that when you upload an image, they have no responsibility over it and retain the right to modify it in any way they see fit.”
Do you go after the user, who might be in a completely different country? It might even be a bot, which would further complicate things.
I stand by what I said: There’s no realistic way to prevent this other than to not upload pictures of yourself to the Internet.
We need people to understand that there are bad actors out there who will do things like make fake nudes of them, and there’s nothing that can be done to stop them. Once they have the image on their computer, it’s game over.
In regards to Xitter, specifically, STOP USING IT. If you haven’t figured out that it’s a total shithole by now it’s time to wake up.
You can’t stop people from distributing CSAM. How would you possibly enforce that? Might as well not even try.
If the child didn’t want sexual materials of them distributed around, they shouldn’t have taken them in the first place.
If you don’t want some creep to sexualize your children, then keep them locked inside your house, dummy. Your child has no right to privacy in public.
/s
Taking a photo of someone in the background is vastly different from following a private citizen to record them covertly, then posting the recording online to single them out and get people to harass them.
Taking a photo of a child is not illegal, but posting said photo online with the intent to sexualize them is.
Taking a photo of a person is not illegal, but manipulating that photo to make them appear nude and posting it online, or doxxing/harassing them, should be.
The key here is intent. And that’s how it could easily be enforced by law.
In case I didn’t make it obvious, most of your arguments can be ripped apart simply by changing the focus of the argument from ‘non-consensual derogatory use of likeness’ to CSAM.
YES: If you don’t want creeps on the other side of the globe generating nudes of your children DO NOT POST IMAGES OF THEM TO THE PUBLIC! How is this not obvious‽
Also, NO: No one has a right to privacy in public. That’s how that works!
If someone takes a picture of your child, posts a fake nude of them to the Internet, that is already illegal. That’s CSAM. And because they took the picture that means they’re local and can be found and prosecuted.
Trying to stop random creeps across an entire Internet of billions of people is impossible. It’s like searching for aliens in the cosmos.