Users were prompting Grok, the platform’s built-in AI feature, to “nudify” women’s images.

That reluctance only heightens the risks of Grok’s “nudifying” capabilities. “When one company is able to do something and is not held fully accountable for it,” Winters says, “it sends a signal to other Big Tech giants that they can do the next thing.”

  • Doug Holland@lemmy.world · 8 days ago

    OK, but also not the point, when there’s suddenly free software that allows anyone anywhere to nudify any photo just by typing a few words.

    • Riskable@programming.dev · 7 days ago

      Ok. What do we do about it?

      As an IT person, this situation always seemed inevitable. Even before AI, software gets better over time, with more features and easier to use. It's the natural progression of things. There are hiccups here and there, but the arrow of time always moves the bar forward.

      You can’t stop the software from existing. That’s like trying to make water not wet. It’s just bits that are easily copied or re-created from scratch using the knowledge that now exists everywhere.

      You could make it illegal to distribute fake nude (or nude-ish) photos of people without their permission, but that won't actually stop this. Anyone with a GPU can do this using local AI software running on their PC, and people will just upload the images to a server in a country that doesn't have that law.

      This isn’t a technical problem. It’s a human problem. You can’t fix human problems with technical solutions.

      I don’t think there’s a fix. We’re just going to have to live with it. Just like we live with other things we don’t like.

      • Doug Holland@lemmy.world · 7 days ago

        Luckily I'm not a legislator, so I don't need watertight solutions to propose for every problem, but the answer certainly isn't a shrug. Learning to live with it (free fake porn of anyone, indistinguishable from true porn) is not a tenable future for a civilized society (if we're ever planning to have one of those).

        • Riskable@programming.dev · 6 days ago

          Ok, there’s actually a bit to unpack there…

          “Free fake porn of anyone”

          This isn't actually the problem. You know that, right? If I generate a fake nude image of you and just keep it to myself on my own hard drive, that has no impact on anything. It harms no one. It's no different than drawing a picture of you and keeping it in a drawer.

          Also, fake porn is still porn. However, mere nudity is not porn. For something to be porn, it has to depict a sex act. I know a lot of people think that's untrue, but it's reality. Nudity is not porn! Is it an "adults-only image"? Maybe. But porn? Not necessarily.

          Example: The New York Times ran a famous front-page photo of a little girl who tore off her burning clothes as she fled a napalm attack in Vietnam. Definitely not porn.

          People who associate nudity with porn need to get their minds out of the gutter.

          Also, I really, really need you to explain what “true porn” is. Give me your expert opinion!