thelinuxEXP,
@thelinuxEXP@mastodon.social

All it would take for AI to completely collapse is a ruling in the US saying these companies have to license the content they used to train these tools.

They simply would never reach a sustainable business model if they had to fairly compensate all the people who wrote, drew, edited, sang or just created the content they use.

Simply being forced to respect attribution and licenses would kill them. Will that ruling ever happen? Maybe not. Should it? I think so.

the_q,
@the_q@mastodon.social

@thelinuxEXP I agree, but I don't think it will happen. The LLMs have all already been trained on stolen data. It's a knot that can't be undone at this point. There will be a lot of hand wringing and yelling, but in the end the corporations and their government lackeys will just hand-wave any grievances and then "promise" not to do it again in the future knowing full well they absolutely will.

In the end we're all to blame though. We clicked "I agree" on every social media platform.

dpino,
@dpino@mastodon.social

AI has destroyed the symbiotic relationship that existed between content creators and search engines; there's no compensation loop anymore. The current state of AI is parasitism. Without incentives for creating new content, who is going to create it in the future? The compensation loop needs to be restored somehow.

enthusiast101,

@thelinuxEXP
I feel like it's getting too late at this point. Many companies have started adding weird clauses where if you post anything on their website, they own all intellectual property rights to that content.

So while it would be a bit more expensive, the AI companies will still get your data by licensing it from those companies (which will still be cheaper than fair compensation). Of course, the added expense will simply be passed on to the consumer, and all blame will be placed on the regulations.

mahbub,
@mahbub@fosstodon.org

@thelinuxEXP

Big companies when they see someone using their 57-year-old, 2-second-long sound effect: GO TO JAIL

Big companies stealing every bit of creative content from the internet without permission from the small creators: :ageblobcat:

thelinuxEXP,
@thelinuxEXP@mastodon.social

@mahbub « It’s different, we’re not copying the content, we’re creating something derivative so it’s ok », they say, as they refuse to acknowledge licenses

mahbub,
@mahbub@fosstodon.org

@thelinuxEXP Ironically, US copyright law punishes people for making derivatives, but somehow AI companies are exempt.

"First, the derivative work has protection under the copyright of the original work. Copyright protection for the owner of the original copyright extends to derivative works. This means that the copyright owner of the original work also owns the rights to derivative works." - LegalZoom (22 MAR 2023)

visone,
@visone@fosstodon.org

@thelinuxEXP

Must happen!!

Jtuchel,
@Jtuchel@mastodon.social

@thelinuxEXP doesn’t most of this somewhat apply to search engines as well?

tshirtman,
@tshirtman@mas.to

@thelinuxEXP they already can't produce a sustainable model, but this would hasten the downfall pretty fast, yes, which is reason enough to do it.

ainmosni,
@ainmosni@berlin.social

@thelinuxEXP I fully agree, but AFAIK, the current state is already unsustainable. They are running at a huge loss trying to make a product that isn't catching on as much as they're trying to make it.

This is so similar to the crypto hype, except that the tech is actually useful in some cases.

Crell,
@Crell@phpc.social

@thelinuxEXP Or train them entirely on public domain works. We'd have an AI that acted like it's 1899.

duco,
@duco@norden.social

@thelinuxEXP I understand that you aren't happy about them using such content, but where do they violate licenses? Aren't they using material that is publicly available on the internet? Licenses may forbid copying or distributing it, but reading it or learning from it? I don't think any license forbids that.

thelinuxEXP,
@thelinuxEXP@mastodon.social

@duco The GPL says that all code built upon it needs to be GPL. I would argue all Copilot-generated code should thus be GPL.

Some licenses require attribution even for derivative works. No AI provides any attribution.

thelinuxEXP,
@thelinuxEXP@mastodon.social

@duco Basically, « publicly available » doesn’t mean free of charge or free of restrictions on use.

YouTube videos are publicly available, yet you’re not allowed to download them; that breaches the ToS. I can find a Getty image in a Google search; that doesn’t mean I can use it freely on my website ;)

pavel,

@thelinuxEXP Come on, copyright is overreaching enough already. Plus, this would effectively give Facebook and China a monopoly on large language models. Does not sound great.

thelinuxEXP,
@thelinuxEXP@mastodon.social

@pavel That’s not a good reason to allow companies to leech off people, though

feld,
@feld@bikeshed.party

@thelinuxEXP > They simply would never reach a sustainable business model if they had to fairly compensate all the people who wrote, drew, edited, sang or just created the content they use.

Unless they're mega rich. The big players (Apple, Google, Amazon, etc.) are doing licensing deals to get ahead. OpenAI, for example, could not afford to do this.

https://www.reuters.com/technology/inside-big-techs-underground-race-buy-ai-training-data-2024-04-05/

not2b,
@not2b@sfba.social

@thelinuxEXP That would collapse OpenAI, but companies could obtain enough legally licensed and useful data to build new models.

thelinuxEXP,
@thelinuxEXP@mastodon.social

@not2b And that would be much better!

mral,

@thelinuxEXP
I think this is an example of the rule: steal big, steal a lot, steal everything, steal from everyone, steal all the time, and it will become normalized.

Or, more simply:

anything done while concentrating wealth is OK.

It's more than how we vote!

jadugar63,
@jadugar63@mastodon.social

@mral @thelinuxEXP
Exactly!
But what’s hardly ever mentioned in all these years is stealing your time and mindfulness and sanity and positivity. He’s destroyed the mental health of citizens and of the

elengale,
@elengale@mastodon.social

@thelinuxEXP An alternative would be if the US Copyright Office decided that generated content could not be copyrighted.

No company or VC firm would touch the stuff ever again. It'd live on, but in a very diminished manner.

LouisIngenthron,
@LouisIngenthron@qoto.org

@thelinuxEXP You sure about that? Most people will sign away their life rights to save $0.05 on gas or get a free t-shirt. We're already seeing major content sites like Photobucket creating lower licensing rates for AI training.

All that would do is slow them down a little.

crazyeddie,
@crazyeddie@mastodon.social

@thelinuxEXP I would be surprised if it doesn't fall under the "fair use" doctrine. We wouldn't want to do away with fair use, which lets us quote each other and learn and apply new techniques without asking permission. Requiring licensing and such for AI training would need to show that the output of that training is derivative, and seeing as it's learning in ways very similar to the way we do... that could be problematic. It's a big, complex issue.

mikkergp,

@thelinuxEXP One of the sometimes positive things about Capitalism is that it is an adversarial system, so these decisions don't happen in a vacuum, and it is interesting to wonder whether and why these new AI companies have more leverage/influence/power than media companies.

Openhuman,
@Openhuman@mastodon.online

@thelinuxEXP Copyright is only for them, not ordinary people

AngryAnt,
@AngryAnt@mastodon.gamedev.place

@Openhuman @thelinuxEXP Seeing as modern-day Western copyright is a Disney invention, as long as you're a corporation and you only tread on non-Disney copyright, you're probably fine?

Openhuman,
@Openhuman@mastodon.online

@AngryAnt @thelinuxEXP lol, Disney has been exploiting European creativity since long before that

AngryAnt,
@AngryAnt@mastodon.gamedev.place

@Openhuman @thelinuxEXP Are you suggesting that as a Dane I might have heard a story about a little mermaid before the US merchandiser boiled off the moral and stuffed in some comic relief fish & crustaceans? I find that highly unlikely and DMCA-adjacent ;)

Neblib,
@Neblib@mastodo.neoliber.al

@thelinuxEXP Ethical AIs that use properly licensed training data and give attribution for significant copies can exist and be useful/profitable, even if today's major players don't, and I totally agree this should be mandated.

scottytrees,
@scottytrees@mastodon.social

@thelinuxEXP I've already grown very tired of this whole "AI as a marketing gimmick"; most people don't even realize what AI, machine learning, etc. actually are.

gardiner_bryant,
@gardiner_bryant@mastodon.online

@thelinuxEXP what will end up killing them is when they cause the power grid to collapse. It's coming.

AngryAnt,
@AngryAnt@mastodon.gamedev.place

@thelinuxEXP While the power draw is downright offensive and IMO ought to at minimum see progressive taxation, it is worth pointing out that some grids are built ridiculously weaker than others - the assertion of collapse is as valid in one location as it is fantasy in another.

apemantus,
@apemantus@ieji.de

@thelinuxEXP The “content creator” bubble is bursting.

thelinuxEXP,
@thelinuxEXP@mastodon.social

@apemantus It’s not though. There was never any bubble in the first place. There were people who made content for ridiculously small payouts, and a really tiny fraction making a lot of money.

hirad,
@hirad@hirad.it

@thelinuxEXP honestly, I don't think that's necessary. Training an LLM isn't the same as using copyrighted material. That's like saying that copy-pasting this post of yours into a text file on my computer requires me to pay you for it!
Instead, I'd argue for giving companies incentives to release their LLMs publicly, like Meta and Mistral do.
Unless you are truly looking to kill generative AI, in which case we can't have any discussion. But I can say that throughout history, every new technology has faced people who thought it was their duty to destroy it, no matter the cost.

thelinuxEXP,
@thelinuxEXP@mastodon.social

@hirad I don’t want to destroy it, I want these tools to respect what they were trained on, which currently they don’t.

I’m not even affected yet, AFAIK, but the argument that it’s just like copying a file doesn’t work, and never did. A company selling a product doesn’t play by the same rules as an individual using something for their own use; that’s never been the case :)

thelinuxEXP,
@thelinuxEXP@mastodon.social

@hirad That’s not the same at all, though, is it? They’re not just copying content, they’re selling access to a tool built on content they grabbed without attribution and without respecting licensing.

It’s not the same as personal use by an individual ;)

ikanreed,

@thelinuxEXP eh, they'll just make a licensing clearinghouse like they do for public performances, so there's an easy way for big companies to smash poor people dodging the system, but they never have to pay out to the small players, because the structure disincentivizes claiming what's yours.

thelinuxEXP,
@thelinuxEXP@mastodon.social

@ikanreed That’s very likely
