scy,
@scy@chaos.social

I'm old enough to remember how @creativecommons was founded as a way for independent creators to safely share their work and build upon each other.

In 2024, their take is now "billion dollar companies plagiarizing your art is fair use".

https://creativecommons.org/2023/02/17/fair-use-training-generative-ai/

Hats off to the author, you don't see that kind of, uh, skillful rhetorical chicanery every day. Like "generative AI doesn't compete with artists because artists are not in the data market". 😬

ArneBab,
@ArneBab@rollenspiel.social

@scy this sounds like the lax vs. copyleft free software argument again.

Some people use lax licenses because they want to allow all to benefit from their work. And then complain when those users don't give back.

Others use copyleft licenses, because they don’t want their work used to limit the freedom of others.

I am in the second group. I use cc by-sa, because I don’t want my creations used in works that prohibit modification and sharing.

AI enables proprietarization anyway.
@creativecommons

nofollownoindex,
@nofollownoindex@deppenkessel.de

@scy @creativecommons I publish using the https://creativecommons.org/licenses/by-nc-sa/4.0/ license - any use of my works in an AI engine violates both the NC clause and the SA clause if the resulting works are not released with the same license. I do not see how any such licensed material can legally be used for AI-generated output.

scy,
@scy@chaos.social

@nofollownoindex @creativecommons The license grants people additional rights, but it's based on your copyright. The "fair use" rule in the US defines additional exceptions to copyright, no matter the terms of your license.

ArneBab,
@ArneBab@rollenspiel.social

@nofollownoindex the key is "exception of copyright" (in the EU) or "fair use" (in the USA).

Those can be good when they permit more uses of works that are otherwise restricted.

They are bad if they permit restricting more than would otherwise be allowed.
@scy @creativecommons

nofollownoindex,
@nofollownoindex@deppenkessel.de

@ArneBab @scy @creativecommons looks like I have some legal studies on "fair use" to schedule...

ArneBab,
@ArneBab@rollenspiel.social

@nofollownoindex good luck — worse than the complex legal theory is that the cultural practice is even more complex, because things which are obvious fair use can be torn down by bogus complaints from powerful rightsholders.

Back to reality: I took a video offline because someone claimed the proceeds from a CC BY song I had used, and YouTube rejected my counterclaim (so they wanted to add ads and give the money to someone who didn't even create the song).
@scy @creativecommons

tshirtman,
@tshirtman@mas.to

@scy @creativecommons I think it's saying that training the model is fine and not plagiarism; using the model is quite different, though, as it will absolutely produce things competing in the aesthetic market of the originals.

🤦‍♂️

scy,
@scy@chaos.social

@tshirtman @creativecommons "Building the Torment Nexus is fine, as long as you don't switch it on"

tshirtman,
@tshirtman@mas.to

@scy @creativecommons oh i'm not saying you are wrong, i'm saying the sentence in the text is either incredibly naive, or cynically misleading.

And possibly both.

alexanderhay,
@alexanderhay@mastodon.social

@scy @creativecommons Creative Commons can now go fuck itself.

Merovius,
@Merovius@chaos.social

@scy TBQH I never really found claims that using art as training data constitutes copyright infringement convincing. That use might be immoral, antisocial and bad for society. But copyright doesn't seem a great framework to talk about any of that.

Of course that goes both ways. I think "using art to train generative software should be fair use" is just as much a category error as "[…] is copyright infringement".

jhwgh1968,
@jhwgh1968@chaos.social

@scy I feel like the block quote you picked is the weakest part of the entire @creativecommons article

I'm no fan of "AI", but I also hate it when people fall for anti-hype:

taking the view of "AI" proponents to explain how dangerous it is -- rather than a view that recognizes it's marketing BS and that these systems shouldn't be treated the same in law.

As someone who has watched copyright expand to absurdity, any precedent adding "running an algo" to copy rights would be terrible for everyone

scy,
@scy@chaos.social

@jhwgh1968 What about the part where the author claims that models "do not store images, they do not reproduce images in their data sets, and they do not piece together new images from bits of images from their training data. Instead, they learn what images represent and create new images based on what they learn about the associations of text and images"?

That's highly debatable imho. While the models might have fractured and mixed the pieces beyond recognition, they are still "stored".

jhwgh1968,
@jhwgh1968@chaos.social

@scy I think that is a pretty good "lay" description of models I've trained and tinkered with offline in various domains

  1. The data doesn't contain any fragments. It's a hierarchy of probabilities

  2. Shannon's limit proves there's no information storage *

  3. A bit literary, but reasonable description of correlative self-convolution -- merely a different algo for how Amazon Recommends Products for You all the time

  • Except maybe very large text models, e.g. the GH Copilot lawsuit I support

jhwgh1968,
@jhwgh1968@chaos.social

@scy the best way to see these points in action is what I've done: make models that are "too small to be good" and see what patterns appear

Or to ask larger models for outputs whose inputs have low probabilities

Then you can see what I think of as the "probability cone" of each piece of input's correlation affecting the output -- creating random visual smudges or that text model exploit where repeating words makes it spew junk

Both are "leaving the correlation cone" and showing its contours

scy,
@scy@chaos.social

@jhwgh1968 So what you're saying is that if I trained a model on all of the Marvel Cinematic Universe movies and series, and then used it to generate and sell new movies, Disney would not sue me five miles into the ground

jhwgh1968,
@jhwgh1968@chaos.social

@scy they should be able to sue you into the ground exactly as hard as if you had watched every Marvel Cinematic Universe movie, got drunk/high/whatever, and came up with a "totally original idea dude" on a napkin and then actually made it

And you should have the same defenses based on the specifics of the content itself

That's my position: copyright should be about output, not process

Someone hurt under the law -- financially or e.g. impersonation -- should fight that on its effects

juandesant,
@juandesant@astrodon.social

@scy @creativecommons how can any Share-Alike license be used by a company like OpenAI or Anthropic for training? And of course, none of the non-commercial ones should be used in that way. You can force the model to reproduce the output but not the source…

Maybe we need to create a Sensible Commons…

schizanon,
@schizanon@mastodon.social

@scy @creativecommons

You: "anyone can use my content"

AI: uses content

You: "No, not like that!"

scy,
@scy@chaos.social

@schizanon @creativecommons No, the full quote is "anyone can use my content with proper attribution".

schizanon,
@schizanon@mastodon.social

@scy @creativecommons so if I look at your content I have to attribute you with everything that it inspires me to create thereafter?

scy,
@scy@chaos.social

@schizanon You know, that's a valid point – for someone who can be inspired.

Generative AI is not inspired. It's just remixing. Tiny snippets, often unrecognizable, but if I do the same with, for example, the music of contemporary artists, I'll get sued by billion dollar record companies.

That imbalance of power is what makes me opposed to the whole thing so much. They take the creations of countless individuals, make money from it, give nothing back, and forbid us from doing the same.

scy,
@scy@chaos.social

@schizanon Also, I find it funny that someone with "Follow me before sharing your contrary opinion or I will block you" in their bio just slides into someone's replies and plays devil's advocate. Imbalance of power seems to be your thing.

paezha,
@paezha@mastodon.online

@scy @schizanon @creativecommons

It's not like AI is someone

ArneBab,
@ArneBab@rollenspiel.social

@schizanon

Me: "anyone can use my content, if they also allow that with what they create"

: "removes that notice"

Someone else: You may not modify what I created with AI!

Me: "how dare you forbid me to change the work that you created with my work!"

@scy @creativecommons
