thomasfuchs,
@thomasfuchs@hachyderm.io avatar

“But AI is cheap!”

It’s not: it has horrendous hardware, server housing, water, and power requirements. It’s just that VCs are financing it now so you get in on the hype; later they will charge you rent, and it will cost you way more, with inferior results, than, you know, hiring the writers and artists it’s stealing from. But those will be gone by then.

WeiMingKai,

@thomasfuchs

This reminds me of the rush to put everything in the cloud even if it makes more $$$ sense for some use cases to remain on servers in data centers.

AI is currently hyped and touted as being able to solve and do everything. Mostly it seems to give executives some plausible reason to lay off workers and declare themselves geniuses … until the effect of shoving all that work onto the surviving workers starts to show, but by then the AI arsonists will have fled.

Adventurer,
@Adventurer@sfba.social avatar

@WeiMingKai @thomasfuchs
I was just realizing I don't have any hard copies of the photos I have taken over the last 20 years. Most are stored on Google. They have total control to end hosting, delete them, whatever. I need to work on a reasonable solution now.

WhiteCatTamer,
@WhiteCatTamer@mastodon.online avatar

@Adventurer @WeiMingKai @thomasfuchs Depending on how many photos you have stored, a 2-4 TB external hard drive can run you $70-250 US.

And it keeps them from using your photos and tags to train their AI.
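
For anyone who wants a bit more than dragging files across, here is a minimal Python sketch of that local-backup idea, assuming the photos have already been exported to a local folder (for example via Google Takeout); the source and destination paths are placeholders, not anyone's actual setup.

```python
# Minimal sketch: mirror an exported photo folder onto an external drive
# and verify each file by checksum. Paths below are placeholders.
import hashlib
import shutil
from pathlib import Path

SOURCE = Path("~/Takeout/Google Photos").expanduser()  # assumed export location
DEST = Path("/mnt/external/photo-backup")              # assumed external drive mount

def sha256(path: Path) -> str:
    """Hash a file in 1 MiB chunks so large photos/videos don't blow up memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

for src in SOURCE.rglob("*"):
    if not src.is_file():
        continue
    dst = DEST / src.relative_to(SOURCE)
    dst.parent.mkdir(parents=True, exist_ok=True)
    # Copy only if missing or the checksums disagree, so re-runs are cheap.
    if not dst.exists() or sha256(dst) != sha256(src):
        shutil.copy2(src, dst)  # copies file data plus timestamps
```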

Npars01,
@Npars01@mstdn.social avatar

@thomasfuchs

The "because it's cheap" usually has some caveats.

For example, American corporations offshored jobs to Asia with the "because it's cheap" excuse, then when covid hit and shut down China's factories, there were no local suppliers left to fill enormous demand for PPE or silicon chips.

Penny wise, pound foolish.
(Of course, American corporations exploited supply chain issues to price gouge & profiteer).

kiki,
@kiki@fnordon.de avatar

@thomasfuchs A.I. is costing artists everything and society as a whole our soul.

suldrew,
@suldrew@xoxo.zone avatar

@thomasfuchs it depends on cheap compute, or at least not-my-billing-account compute. At some point the major vendors may decide to use that compute for something else (for example, earning $).

derrickb,

@thomasfuchs Really insane that this has to be stated.

LLMs ARE IN PHASE 1 OF

JoshMurray,

@thomasfuchs “1kJ for a typical Google Search vs 3-36 kJ for a ChatGPT query” was mentioned in a recent presentation I attended.

Hypx,
@Hypx@mastodon.social avatar

@JoshMurray @thomasfuchs The 1 kJ figure is from 15 years ago BTW.
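
Taking the figures quoted above at face value (and keeping the caveat that the 1 kJ number is old), a quick back-of-envelope conversion shows what a per-query energy cost means at scale; the daily query volume below is purely hypothetical, chosen only to make the arithmetic concrete.

```python
# Back-of-envelope energy arithmetic using the per-query figures quoted
# in the thread. The daily query volume is a made-up illustration.
KJ_PER_KWH = 3_600  # 1 kWh = 3.6 MJ = 3,600 kJ

def daily_energy_kwh(kj_per_query: float, queries_per_day: float) -> float:
    """Turn a per-query energy figure into a daily total in kilowatt-hours."""
    return kj_per_query * queries_per_day / KJ_PER_KWH

HYPOTHETICAL_QUERIES_PER_DAY = 100_000_000  # illustrative only, not real traffic

for label, kj in [("search query (quoted)", 1),
                  ("LLM query, low end (quoted)", 3),
                  ("LLM query, high end (quoted)", 36)]:
    kwh = daily_energy_kwh(kj, HYPOTHETICAL_QUERIES_PER_DAY)
    print(f"{label}: {kwh:,.0f} kWh/day")
```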

SenseException,
@SenseException@phpc.social avatar

@thomasfuchs It's cheap for the individual. Otherwise it would have disappeared pretty fast.

hakfoo,

@thomasfuchs The argument I find interesting is the "condom/tampon vending machine" perspective.

Someone might be embarrassed to ask for their desired product from an actual human, and have to go through rounds of refinement and elicitation with them.

"You're a well known captain of industry, you can't just slink into the Artist's Alley at Comicon and plop down $75 to commission a fursona."

Radeon Instinct cards might be more expensive, but they'll never make eye contact.

thomasfuchs,
@thomasfuchs@hachyderm.io avatar

@hakfoo That’s a complete non-argument because it’s easy to buy art/writing anonymously. Create a burner email and pay via bitcoin if you must.

danhulton,
@danhulton@hachyderm.io avatar

@thomasfuchs It's more correct to say that "AI is SUBSIDIZED." This gives it the appearance of being cheap, when in fact there's the whole rest of your post proving that naw, it ain't.

sibylle,
@sibylle@troet.cafe avatar

@thomasfuchs I came to the conclusion that if something is cheap, someone else is paying the price.
Someone is exploited. In most cases, nature pays quite a share of the price, too.

LordCaramac,
@LordCaramac@discordian.social avatar

@thomasfuchs It all depends on which kind of machine learning application we're talking about. Many artificial neural networks are small enough to run on a halfway decent PC, some slightly bigger ones need a powerful graphics workstation or gaming rig to be really useful, some are small enough to deliver decent performance even on a laptop computer, and some are even small enough to run on a tablet, smartphone, or Raspberry Pi.
The problem is the really huge LLMs like GPT-4.
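
As a concrete illustration of that small-model point, here is a minimal Python sketch that runs a compact classifier entirely on a local CPU; it assumes the Hugging Face transformers library, and the checkpoint named below is just one small public example, not a recommendation.

```python
# Small models really do run on ordinary hardware: a compact text
# classifier loaded and executed on CPU only.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
    device=-1,  # -1 = CPU; no GPU or data-center hardware involved
)

print(classifier("Running a compact model on a plain laptop works fine."))
```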

atomicpoet,
@atomicpoet@atomicpoet.org avatar

@LordCaramac @thomasfuchs I don’t want ChatGPT. I just want something installed on my machine that can assist me with things like tone.

Basically, I want a better version of spelling and grammar check.
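
One possible local approach to that kind of spelling/grammar assistant (a sketch only, not what the poster actually uses) is the open-source LanguageTool checker driven from Python via the language_tool_python package, which runs the checker on your own machine and needs a Java runtime installed.

```python
# Sketch of a local, no-cloud grammar/style check with LanguageTool.
# Requires: pip install language_tool_python (plus a local Java runtime).
import language_tool_python

tool = language_tool_python.LanguageTool("en-US")  # runs locally, no cloud API

text = "This sentense have a couple of small mistake."
for match in tool.check(text):
    print(match.ruleId, "->", match.message)  # each flagged issue and its rule

print(tool.correct(text))  # the text with suggested fixes applied
tool.close()
```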

krzyzanowskim,
@krzyzanowskim@mastodon.social avatar

@thomasfuchs AI tech is on the same straight path as serverless: the real charges come in the future.
