• JohnBrownNote [comrade/them, des/pair]@hexbear.net

    ???

    how is it an enclosure? the “original” works are all exactly where they were when they were scraped. (and the “original” of a digital thing is a real fucken weird concept too: the first time that image existed was in the computer’s RAM, or maybe as pixels in a particular state on the artist’s monitor, and the one you see on the internet is already like 5 generations of copies)

    like fuck these companies and the Peter Thiel types behind them to death, but Superman III-ing fractions of pennies from the take-a-penny tray isn’t theft just because you do it a billion times, and copying isn’t theft at all.

    • FanonFan [comrade/them, any]@hexbear.net

      I think we’re talking past each other. Your argument applies if we’re talking about a liberal concept of ownership, or maybe judging the morality of AI, but that’s unrelated to a material analysis of it. Generative technology requires massive datasets, which are the result of millions of hours of labor. This isn’t a moral claim at all; it’s simply an attempt to describe the mechanism at play.

      Enclosure in the digital space isn’t an exact parallel to the enclosure acts in England. I usually see the term applied to the open-source ecosystem: the products of volunteer labor are enclosed, harvested, or adopted by private corporations. Google, Microsoft, and Apple all built empires on this mechanism.

      I mentioned a “mature” stage because I think the next step is more forceful enclosure and hoarding of datasets. The usability of the internet is rapidly decreasing in lockstep with the development of AI, a dialectical evolution: the technology is eating away at the very foundation it builds itself on.