@fox This "Training for AI" is just another version of selling your info to 3rd parties.
Let's bet that a year from now, those companies will be selling that data to 3rd parties, whether in the form of training data or trained models.
Even the fucking mental health and suicide hotline apps are selling your data to whatever the fuck AI / 3rd party advertisers.
@fox Bullshit explanations aside, what they do is still use their users' data, irrespective of whether they hide their tracks after the fact; and under the GDPR, this use requires informing users and collecting their consent beforehand -- i.e., opt-in. Any other way is a breach of the GDPR. If you're in Europe, ask them to provide proof that you were informed and consented before the fact.
@fox What a bunch of neo-luddites the author and the people in the replies are. AI will learn from all of us and will make most of us useless. And you can do nothing against progress but complain.
@fox This feels like a case of something that isn't actually legal, where they're just counting on people not taking it to court. I'm no expert, mind you, but I don't think we've reached the point where it's actually legal to hold people's data hostage against their expressly stated will.
@fox I'm running into the same thing as I near publication of my own audiobook. Almost all of the big (non-Audible) markets say, "We publish through Findaway Voices!" Meanwhile Findaway uses content for machine learning. Rights holders can opt-out, but narrators can't.
@fox I feel like this isn't new, though. Before training for AI, it was data gathering for analytical and marketing purposes. That's how most free services could offer what they did, and Grammarly was no different. I didn't trust them from the get-go, precisely because they offered it for free. As the old saying goes, "if you're not paying for the product, you are the product." That holds just as true now as it did 5+ years ago during the start of the "Big Data" craze.