Users were prompting Grok, the platform’s built-in AI feature, to “nudify” women’s images.

That reluctance only heightens the risks of Grok’s “nudifying” capabilities. “When one company is able to do something and is not held fully accountable for it,” Winters says, “it sends a signal to other Big Tech giants that they can do the next thing.”

  • Riskable@programming.dev · 3 days ago

    "Anything they post now has the ability to be undressed, in any way a person wants, and they can do whatever they want with that photo of you.”

    I don’t want to defend the “nudify” thing, but this statement was true ten years ago, twenty years ago, and even all the way back to shortly after the invention of photography.

    I remember when Photoshop began to be widely pirated and people were pasting the faces of girls onto porn images. There was news about it back then, too.

    The lesson here is this: if you don’t want people to manipulate your images, do not post them. That is the one and only sure way to prevent something like this from happening. Nothing else will work.

    There’s no way to put this genie back in the bottle. People are just going to have to learn to live with the possibility that public images of themselves, images they themselves posted, will be manipulated by the public.

    When you post an image to the Internet, you’re posting it to the entire world. That’s roughly six billion people with Internet access who can see it. That’s far too big a number to expect that there will be no bad actors.

    • brownmustardminion@lemmy.ml · 3 days ago

      Bozo take.

      Until recently, you would have needed thousands of hours of Photoshop or visual-effects experience to get even a mediocre result.

      With current AI, the barrier to entry is basically nothing, and the results can often be indistinguishable from reality.

      The solution is obvious…governments need to make non-consensual reproduction of an individual’s likeness illegal and actively enforce it.

      The tools are already out there. Regulating them is a lost cause at this point…

      • Riskable@programming.dev · 2 days ago

        The solution is obvious…governments need to make non-consensual reproduction of an individual’s likeness illegal and actively enforce it.

        This is a non-starter. If you’re in public, people can take your picture. It’s rude to do so without asking permission, but it’s still perfectly legal. If it were not legal, all those people recording ICE doing horrible things wouldn’t be able to do so.

        Even if there were an exception for recording people working for the public, you’d still end up recording bystanders in the background. It’s too dangerous—from a societal perspective—to place roadblocks in the way of recording people or taking their picture. People do bad things every day and one of the only ways to deal with that is doing stuff like taking pictures without consent.

        You could pass a law that bans this very specific use case, distributing fake nudes, but that’s the thing: they’re fake. They’re just guesses by a computer. From a legal perspective, it’s no different from paying an artist to paint a nude of someone from a (non-nude) photo, which is not illegal.

        Not only that, but you’d need a law like this to exist in every country, everywhere. Then you’d need some sort of enforcement mechanism. How would that even work?

        “This person uploaded their likeness to the Internet, which was reproduced by Twitter 1 million times but then someone used it to make a fake nude of them which was reproduced by Twitter 10 million times. The Twitter TOS says that when you upload an image, they have no responsibility over it and retain the right to modify it in any way they see fit.”

        Do you go after the user, who might be in a completely different country? It might even be a bot, which would further complicate things.

        I stand by what I said: There’s no realistic way to prevent this other than to not upload pictures of yourself to the Internet.

        We need people to understand that there are bad actors out there who will do things like make fake nudes of them, and that there’s nothing that can be done to stop them. Once they have the image on their computer, it’s game over.

        In regard to Xitter specifically: STOP USING IT. If you haven’t figured out that it’s a total shithole by now, it’s time to wake up.

        • brownmustardminion@lemmy.ml · 2 days ago

          You can’t stop people from distributing CSAM. How would you possibly enforce that? Might as well not even try.

          If the child didn’t want sexual materials of them distributed around, they shouldn’t have taken them in the first place.

          If you don’t want some creep to sexualize your children, then keep them locked inside your house, dummy. Your child has no right to privacy in public.

          /s

          Taking a photo of someone in the background is vastly different from following a private citizen to record them covertly, then posting the recording online to single them out and get people to harass them.

          Taking a photo of a child is not illegal, but posting said photo online with the intent to sexualize them is.

          Taking a photo of a person is not illegal, but posting said photo online manipulated to make them nude or doxxing/harassing them should be.

          The key here is intent. And that’s how it could easily be enforced by law.

          In case I didn’t make it obvious, most of your arguments can be ripped apart simply by shifting the focus of the argument from ‘non-consensual derogatory use of likeness’ to CSAM.

          • Riskable@programming.dev · 1 day ago

            YES: If you don’t want creeps on the other side of the globe generating nudes of your children, DO NOT POST IMAGES OF THEM TO THE PUBLIC! How is this not obvious‽

            Also, NO: No one has a right to privacy in public. That’s how that works!

            If someone takes a picture of your child and posts a fake nude of them to the Internet, that is already illegal. That’s CSAM. And because they took the picture themselves, that means they’re local and can be found and prosecuted.

            Trying to stop random creeps across the entire Internet of six billion people is impossible. It’s like searching for aliens in the cosmos.

    • Doug Holland@lemmy.world · 3 days ago

      OK, but that’s also not the point when there’s suddenly free software that allows anyone, anywhere, to nudify any photo just by typing a few words.

      • Riskable@programming.dev · 2 days ago

        Ok. What do we do about it?

        As an IT person, I always saw this situation as inevitable. Even before AI, software kept getting better, gaining features, and becoming easier to use over time. It’s the natural progression of things. There are hiccups here and there, but the bar always moves forward.

        You can’t stop the software from existing. That’s like trying to make water not wet. It’s just bits that are easily copied or re-created from scratch using the knowledge that now exists everywhere.

        You could make it illegal to distribute fake nude (or nude-ish) photos of people without their permission, but that won’t actually stop this, because anyone with a GPU can do it with local AI software running on their PC, and people will just upload the images to a server in a country that doesn’t have that law.

        This isn’t a technical problem. It’s a human problem. You can’t fix human problems with technical solutions.

        I don’t think there’s a fix. We’re just going to have to live with it. Just like we live with other things we don’t like.

        • Doug Holland@lemmy.world · 2 days ago

          Luckily I’m not a legislator and don’t need to propose watertight solutions for every problem, but the answer certainly isn’t a shrug. Learning to live with it (free fake porn of anyone, indistinguishable from true porn) is not a tenable future for a civilized society (if we’re ever planning to have one of those).

          • Riskable@programming.dev · 1 day ago

            Ok, there’s actually a bit to unpack there…

            Free fake porn of anyone

            This isn’t actually the problem. You know that, right? If I generate a fake image of you in the nude and just keep it to myself on my own hard drive, that has no impact on anything. It harms no one. It’s no different from drawing your picture and keeping it in a drawer.

            Also, fake porn is still porn. However, mere nudity is not porn. For something to be porn, it has to depict a sex act. I know a lot of people think that’s untrue, but it’s reality. Nudity is not porn! Is it an “adults only” image? Maybe. But porn? Not necessarily.

            Example: The New York Times ran a famous front-page image of a little girl who had torn off her burning clothes as she fled a napalm attack in Vietnam. Definitely not porn.

            People who associate nudity with porn need to get their minds out of the gutter.

            Also, I really, really need you to explain what “true porn” is. Give me your expert opinion!