Users were prompting Grok, the platform’s built-in AI feature, to “nudify” women’s images.

That reluctance only heightens the risks of Grok’s “nudifying” capabilities. “When one company is able to do something and is not held fully accountable for it,” Winters says, “it sends a signal to other Big Tech giants that they can do the next thing.”

  • brownmustardminion@lemmy.ml · 2 days ago

    You can’t stop people from distributing CSAM. How would you possibly enforce that? Might as well not even try.

    If the child didn’t want sexual materials of them distributed around, they shouldn’t have taken them in the first place.

    If you don’t want some creep to sexualize your children, then keep them locked inside your house, dummy. Your child has no right to privacy in public.

    /s

    Taking a photo of someone in the background is vastly different from following a private citizen to record them covertly, then posting the recording online to single them out and get people to harass them.

    Taking a photo of a child is not illegal, but posting said photo online with the intent to sexualize them is.

    Taking a photo of a person is not illegal, but posting that photo online after manipulating it to make them appear nude, or using it to dox or harass them, should be.

    The key here is intent. And that’s how it could easily be enforced by law.

    In case I didn’t make it obvious, most of your arguments can be ripped apart simply by swapping the focus of the argument from ‘nonconsensual derogatory use of likeness’ to CSAM.

    • Riskable@programming.dev · 23 hours ago

      YES: If you don’t want creeps on the other side of the globe generating nudes of your children DO NOT POST IMAGES OF THEM TO THE PUBLIC! How is this not obvious‽

      Also, NO: No one has a right to privacy in public. That’s how that works!

      If someone takes a picture of your child and posts a fake nude of them to the Internet, that is already illegal. That’s CSAM. And because they took the picture, that means they’re local and can be found and prosecuted.

      Trying to stop random creeps across the entire Internet of six billion people is impossible. It’s like searching for aliens in the cosmos.