Greg Rutkowski, a digital artist known for his painterly fantasy style, opposes AI art, but his name and style have frequently been used by AI art generators without his consent. In response, the developers of Stable Diffusion removed his work from the training dataset for version 2.0. The community, however, has since created a LoRA model that emulates Rutkowski's style against his wishes. While some argue this is unethical, others justify it on the grounds that Rutkowski's art was already widely used in Stable Diffusion 1.5. The debate highlights the blurry line between innovation and infringement in the emerging field of AI art.

  • Pulse@dormi.zone · 1 year ago

    You keep comparing what one person, given MONTHS or YEARS of their life, could do with one artist's work to what a machine doing NOT THE SAME THING can do with thousands of artists' work.

    The machine is not learning their style, it's taking pieces of the work and dropping them in with other people's work, then trying to blend it into a cohesive whole.

    The analogy fails all over the place.

    And I don’t care about copyright, I’m not an artist or an IP lawyer, or whatever. I can just look at a company stealing the labor of an entire industry and see it as bad.

    • FaceDeer@kbin.social · 1 year ago

      The speed doesn’t factor into it. Modern machines can stamp out metal parts vastly faster than blacksmiths with a hammer and anvil can — are those machines doing something wrong?

      • Pulse@dormi.zone · 1 year ago

        The machine didn’t take the blacksmith’s work product and flood the market with copies.

        The machine wasn’t fed 10,000 blacksmith-made hammers and then told to, sorta, copy those.

        Justify this all you want, throw all the bad analogies at it you want, it’s still bad.

        Again, if this wasn’t bad, the companies would have asked for permission. They didn’t.

        • FaceDeer@kbin.social · 1 year ago

          That’s not the aspect you were arguing about in the comment I’m responding to. You said:

          You keep comparing what one person, given MONTHS or YEARS of their life, could do with one artist's work to what a machine doing NOT THE SAME THING can do with thousands of artists' work.

          And that’s what I’m talking about here. The speed with which the machine does its work is immaterial.

          Though frankly, if the machine stamping out parts had somehow “learned” how to do it by looking at thousands of existing parts, that would be fine too. So I don’t see any problem here.

          • Pulse@dormi.zone · edited · 1 year ago

            And that’s where we have a fundamental difference of opinion.

            A company hiring an engineer to design a machine that makes hammers, then hiring one (or more) people to build that machine, is the company benefiting from the work product of people they hired. While this may impact the blacksmith, they did not steal from the blacksmith.

            A company taking someone else’s work product to then build their product, without compensation or consent, is theft of labor.

            I don’t see those as equivalent situations.

            • FaceDeer@kbin.social · 1 year ago

              At least now you’re admitting that it’s a difference of opinion, that’s progress.

              You think it should be illegal to do this stuff. Fine. I think copyright duration has been extended ridiculously long and should be a flat 30 years at most. But in both cases our opinions differ from what the law actually says. Right now there’s nothing illegal about training an AI off of someone’s lawfully-obtained published work, which is what was done here.

    • TwilightVulpine@kbin.social · 1 year ago

      Speed aside, machines don’t have the same rights as humans do, so the idea that they are “learning like a person, so it’s fine” is like saying a photocopier’s output ought to be treated as an independent work because it replicated some other work, and it’s just so good and fast at it. AIs may not output identical work, but they still rely on taking an artist’s work as input, something the creator ought to have a say over.