ZachWeinersmith,
@ZachWeinersmith@mastodon.social

Where do you draw the line between AI you like and dislike? My sense is there's a serious disdain for e.g. generative art or text, but not for other machine-learning stuff, like plant ID apps or speech to text software.

brentg,
@brentg@mastodon.social

@ZachWeinersmith Like most tools, it's not so much what they do, but how people use them.

LLMs and generative algorithms are the real problem: they are trained on other people's work without permission, while their nonsense output is sold dishonestly.

Like other faux solutions to non-problems, they are also vast sinkholes for exploitation capital that should have been used better.

ftl,
@ftl@wandering.shop

@ZachWeinersmith the biggest line for me is when AI is used to "do something worse than humans do, but it's cheaper so let's fire all the humans". Where it makes life worse for everyone around to make a quick buck for company owners.

Generated news articles aren't as good as human-written ones, but they sure are cheaper than paying a reporter! An actual artist is expensive but using an AI trained on that artist's previous work - it's not as good, but it's cheap! Ugh.

Swampwulf,

@ZachWeinersmith easy! If it’s AI I want nothing to do with it. I believe the quote that sold me is “Why would I be bothered to read something no one could be bothered to write.”

jmcs,
@jmcs@jsantos.eu

@ZachWeinersmith the line is when they replace creativity with a random generator, or when they present a random bullshit factory as if it were Deep Thought from The Hitchhiker's Guide to the Galaxy.

tykayn,
@tykayn@mastodon.cipherbliss.com

@ZachWeinersmith
I dislike that anything is called general AI when it is just a regular program or a Markov tree.

Tomscimyt,

@ZachWeinersmith Good AI does stuff people don't want to or can't do anyway. Plant identification is okay, since it's not really possible to give every person their own botanist.

villasbc,
@villasbc@mastodon.social

@ZachWeinersmith The line is: does it remind me of any project I see in https://github.com/daviddao/awful-ai#readme, or does it not?

dryak,
@dryak@mstdn.science

@ZachWeinersmith Can I run it on my own hardware, vs. a big corporation trying to get me locked into their cloud?

s427,
@s427@lou.lt

@ZachWeinersmith A distinction I haven't seen mentioned: an AI that doesn't pretend to be anything other than a tool for a specific job, vs. an AI that pretends to be some kind of vague, all-encompassing revolution (true general AI, etc.).
(Of course I mean "that people pretend")
In other words, being upfront about what it is vs. bullshitting.

dos,

@ZachWeinersmith I don't dislike AI, it's a super cool tech, especially generative. I dislike people in power using it to make the world worse.

evannakita,
@evannakita@mastodon.online

@ZachWeinersmith I think it’s about the harm it can cause, and not just the obvious effects on artists and writers. Chatbots are spreading dangerously false information, image generators are used to make deepfakes, facial recognition algorithms are getting people of color arrested for crimes they didn’t commit, algorithms used to process résumés are leading to discriminatory hiring practices…while accessibility features aren’t harming anyone.

raineyday,
@raineyday@mstdn.games

@ZachWeinersmith

I forget who said it, but machines are supposed to handle the drudgeries of life so that mankind can be free to pursue creative endeavors. Machines were never meant to be creative for us.

kionay,

@ZachWeinersmith
Simple. It's bad when it decreases demand for a human to do something they enjoy and get paid for.
The AI that takes out the trash is fine. AI that cries for me in the corner of my bedroom is OK, because nobody is paying per tear.
People like making art, and the only way AI art would work is if it didn't decrease that demand-for-human-happiness by proxy.

Ryanxiety,

@ZachWeinersmith If it wasn't for the massive wholesale art theft and proposing to use it instead of paying artists for work, I think I would really like the generative art. That's really it and where the line is for me. Is it built on massive wholesale content theft, and is it being used or proposed to replace a human element? If yes: bad, if no: good.

Crell,
@Crell@phpc.social

@ZachWeinersmith Any machine learning that is put in a creative or decision making role is horrible. That includes suggestion algorithms.

When it's purely in a support role for a human making decisions, it's acceptable.

Caveat: Assuming the training data is ethically sourced.

gwennpetrichor,
@gwennpetrichor@eldritch.cafe

@ZachWeinersmith Tools humans use, or humans used as tools.

BigTheDave,
@BigTheDave@mastodon.gamedev.place

@ZachWeinersmith So long as the training data was procured ethically, i.e. the original owners gave consent and/or were paid fairly.

My main beef is that these generative AIs are just this meme.

b_age,
@b_age@troet.cafe

@ZachWeinersmith Yeah, you're right: there are the ones that are helpful tools, and there are the ones used to fake things. I think that's the difference.

Orb2069,
@Orb2069@mastodon.online

@ZachWeinersmith My line is "what's the worst thing that could happen if it's wrong?"
Plant identifier: great. Mushroom identifier: terrible.
Guide your Aibo around the kitchen: awesome. Guide your car around the parking lot: hell no.
Predict which kids might need tutoring support: awesome. Predict who's going to commit a crime: nightmarish.

malba,

@ZachWeinersmith If giving an ML technology to the masses is going to cause them to behave badly, or in a way that I think is bad, then that's the line. That includes the case where the technology gives bad information.

Plant ID apps are something I would have disdain for - what if the app is wrong about whether or not a plant is poisonous?

The one exception to that rule is algorithmic recommendations, because that's a bit of a wash in my opinion - humans will fuck it up either way.

joey,
@joey@mathstodon.xyz

@ZachWeinersmith Does it make life better or worse? Does it lead us to Star Trek utopia or capitalist hellscape?

tjradcliffe, (edited)
@tjradcliffe@mastodon.scot

@ZachWeinersmith

Anything designed to imitate a human is bad. Anything designed to replace a handbook or field guide is good.

gatesvp,
@gatesvp@mstdn.ca

@ZachWeinersmith

Who is this AI working for and who is it working on?

I don't mind the generative AI tools. But they are being sold in a manner that is both copyright-infringing and not fit for purpose. Their outputs are also designed to enrich the owners rather than the creators.

Other software both promises and delivers less. Much of it has also been available as paid software for a long time: Dragon NaturallySpeaking is two decades old. We have ethical options here.

Amr1ta,
@Amr1ta@mastodon.social

@ZachWeinersmith I'm open to anything that's an enabler, which could be tools that simplify a certain job or enable one to do their job better. I have my skepticism where this can become an impostor or, worse, a dictator.

f4grx,
@f4grx@chaos.social

@ZachWeinersmith easy:
Like - 0%
Dislike - 100%
