Users were prompting Grok, the platform’s built-in AI feature, to “nudify” women’s images.

That reluctance only heightens the risks of Grok’s “nudifying” capabilities. “When one company is able to do something and is not held fully accountable for it,” Winters says, “it sends a signal to other Big Tech giants that they can do the next thing.”

  • Doug Holland@lemmy.world · 18 days ago

    Luckily I’m not a legislator, don’t need to have watertight solutions to propose for every problem, but the answer certainly isn’t a shrug. Learning to live with it — free fake porn of anyone, indistinguishable from true porn — is not a tenable future for a civilized society (if we’re ever planning to have one of those).

    • Riskable@programming.dev · 17 days ago

      Ok, there’s actually a bit to unpack there…

      Free fake porn of anyone

      This isn’t actually the problem. You know that, right? If I generate a fake nude image of you and just keep it to myself on my own hard drive, that has no impact on anything. It harms no one. It’s no different than drawing your picture and keeping it in a drawer.

      Also, fake porn is still porn. However, mere nudity is not porn. For something to be porn, it has to depict a sex act. I know a lot of people think that’s untrue, but it’s reality. Nudity is not porn! Is it an “adults only image”? Maybe. But porn? Not necessarily.

      Example: The New York Times had a famous front-page image of a little girl who had torn off her burning clothes, running from a napalm attack in Vietnam. Definitely not porn.

      People who associate nudity with porn need to get their minds out of the gutter.

      Also, I really, really need you to explain what “true porn” is. Give me your expert opinion!