N_Crow,
@N_Crow@leminal.space avatar

Can’t wait for the 94% unemployed to raid the banks and eat the Bankers.

roofuskit,

AI will remove 41% of execs, say 100% of people who know what AI is.

Daft_ish,

Lol, this is how you enshittify the workforce.

Fedizen,

🎵Dumb Dumb Dumb Dumb Dumb🎶

cabron_offsets,

“41% of execs display anomalous sexual prowess, 9” dongs thought to play a role.”

AngryCommieKender,

My money is on cyberdongs

AllNewTypeFace,
@AllNewTypeFace@leminal.space avatar

It’ll reduce the workforce from well-remunerated professionals who perform tasks to a larger number of disposable minimum-wage labourers who clean up botshit.

roofuskit,

Pretty sure the entire Republican party and the ruling class they serve just orgasmed at that thought.

PriorityMotif,
@PriorityMotif@lemmy.world avatar

Bye bye middle management!

But seriously, work will always expand to the available workforce. That’s why there are so many stupid industries. They always tank during a recession, but other industries will expand to use the excess labor.

ICastFist,
@ICastFist@programming.dev avatar

If it isn’t already, AI will be calling the shots for the actual money owners (the big investment companies like Blackrock). Invest here, invest there, demand more from elsewhere. Said AI will then dictate who should be appointed CEO, director, etc., because it will be asked to name “a human”, and little Timmy McMeritocracy, son of a higher-up elsewhere, needs his first job, never mind that putting an AI in his place would be more profitable.

menemen,
@menemen@lemmy.world avatar

And as a result the remaining workforce will have to work more.

interdimensionalmeme,

Eventually the entire economy will be just one overworked Australian man.

markon,

Oi did I hear my name, mate?

menemen,
@menemen@lemmy.world avatar

And he’ll still be underpaid.

w3dd1e,

Quiet, mate. Pulling the empty carts is the closest thing we get to sleep.

Lucidlethargy,

People here keep belittling AI. You’re all wrong, at least when considering the long run… We can’t beat it. We need to outlaw it.

Train it to replace CEO’s.

markon,

Y’all are dumbass doomers. Have some fun with AI while you can, you aged peasants. We were always fucked.

echodot,

It’s Schrödinger’s AI. It is both useless and will replace everyone. Depending on the agenda the particular person is trying to push.

We need to outlaw it.
Train it to replace CEO’s.

Oh, there it goes again.

afraid_of_zombies,

I know, it’s getting boring. I am tired of people telling me how ChatGPT and friends are toys that just spit back website data, and in the same comment telling me how they are basically angry gods ready to end the human race.

Fucking make up your mind!

Blackmist,

“Smash the looms” is the wrong idea.

“Eat the rich” might have some merit though.

captainlezbian,

Yeah, don’t smash the looms, seize them. The ability to make labor easier and more efficient is a positive if we don’t allow it to be a means to impoverish the workers

Buttons,
@Buttons@programming.dev avatar

Outlawing it is a very dangerous aim, because outlawing it completely will enable other countries to out-compete us, and outlawing it completely is right next to “outlaw it for normal people, but allow companies to exploit it for profit” on the dart board of possibilities.

Better path all around is “allow everyone to use AI and establish strong social safety nets and move towards enabling people to work less”.

MajinBlayze,

If AI is outlawed, only outlaws will have AI

TwilightVulpine,

Haven’t I been hearing that since the rise of computing and the internet? And it’s probably been around even longer. Seems like this sort of stuff only gets going when a lot of workers start putting up a fight.

But hey, maybe 41% of jobs lost might be the tipping point. Because people aren’t just gonna sit on the sidewalk and starve.

sugar_in_your_tea,

Nah, I disagree on both counts.

We can’t beat it. We need to outlaw it.

Is the intent here to preserve jobs even if it’s less productive? That’s solving the wrong problem. Instead of banning it, we should be adapting to it. If AI is more efficient than people, the jobs people take should change.

I think there’s a solid case that if something would devolve into rent-seeking because competition is unproductive, it should be provided as a public service. Do you need a job if all of your basic needs are met by AI? At that point, any work you do would be optional, so people would follow their passions instead of working to make ends meet (see: Star Trek universe).

Think of it like Basic Income, but instead of cash, you’d get services at-cost. I think there’s room for non-profits (or maybe the government) to provide these AI-services at-cost.

WallEx,

This is why not every business is successful, I guess.

febra,

Can’t wait for AI to replace all those useless execs and CEOs. It’s not like they even do much anyway, except fondling their stocks. They could probably be automated by a Markov chain.
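
For the joke’s sake, here’s roughly what “automated by a Markov chain” could look like; a minimal, purely illustrative Python sketch, with the training “exec statements” made up for the bit:

```python
# Toy word-level Markov chain "exec", trained on made-up example
# statements (purely illustrative, not real quotes).
import random
from collections import defaultdict

corpus = [
    "we will leverage synergies to drive shareholder value",
    "we will drive growth and leverage our core competencies",
    "our strategy is to drive value for our shareholders",
]

# Map each word to the words that have been seen following it.
chain = defaultdict(list)
for sentence in corpus:
    words = sentence.split()
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)

def generate(start="we", length=10):
    """Walk the chain, picking a random successor at each step."""
    word, output = start, [start]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

print(generate())  # e.g. "we will leverage our core competencies"
```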

echodot,

If they could replace project managers, that would be nice. In theory it’s an important job, but in practice it’s just done by someone’s mate, who is most productive when they don’t actually turn up.

ICastFist,
@ICastFist@programming.dev avatar

The Paranoia RPG has a very realistic way of determining who gets to be the leader of a group. First, you pick who’ll do what kind of job (electronics, brute force, etc). Whoever didn’t get picked becomes the leader, as that person is too dumb to do anything useful.

echodot,

Yes, that’s quite a funny and satirical way of doing it, but it’s probably not actually the best way in real life.

I think Boeing have proven this quite nicely for everyone: the company was much better off when they had actual engineers in charge. When they got corporate paper pushers, everything went downhill.

afraid_of_zombies, (edited )

I have been on enough projects where engineers were in charge that went to hell to know that isn’t always a solution. And yes, I am an engineer.

On one of the projects I am on now, the main lead is a full PE civil engineer, and it’s a man-made clusterfuck: well behind schedule, over budget, and several corporate bridges burned. We haven’t even started digging yet.

By far the very biggest cluster fuck I was ever on was run by a Chemical Engineer. A 40 million dollar disaster that never should have been even considered.

Being good at technical problems (which frankly most of us aren’t) doesn’t mean you know how to do anything else.

afraid_of_zombies,

I have had good ones and not so good ones.

Wanderer,

I swear people don’t know the difference between a good project manager, a bad one, or none at all.

Everyone on here is on about how the board has no idea what the bottom rungs of the ladder do, and they’re all “haha, they are so stupid, they think we do nothing”. Then in the next sentence they say they don’t know what the board does and that it just does nothing.

echodot,

Project managers or board members? What the hell are you on about?

Wanderer,

People slagging off jobs they don’t understand.

Both project managers, whom they probably have experience dealing with but don’t understand, and board members, whom they probably don’t have any experience with and also don’t understand.

afraid_of_zombies,

Board members don’t do shit

Wanderer,

I see.

What is this judgment based on?

afraid_of_zombies,

First hand experience

afraid_of_zombies,

Don’t get a job in government contracting. Pretty much I do the work and around 5 people have suggestions. None of whom I can tell to fuck off directly.

Submit the drawing. Get asked to make a change to align with a spec. Point out that we took exception to the spec during bid. Get asked to make the change anyway. Make the change. Get asked to make another change by someone higher up the chain of five. Point out change will add delays and cost. Told to do it anyway. Make the next change…

Meanwhile, every social scientist: “we don’t know what is causing cost disease.”

SomeGuy69, (edited )

I never had the impression that there were enough people for the amount of work anyway. I don’t see jobs going away, so much as shifting. Most developers will be fine, because the work never ends; AI is just a tool that speeds things up. And not by that much: someone who is good with Google and git is only a bit slower to find the same answers. AI needs verification too, even if it links you directly to the issue at hand via a source URL.

AI will create new issues. Some of the low-requirement jobs will go, like first-level support, but only if companies train the AI on their own data; otherwise it’s too generic. We’re not there yet where companies train their own LLMs; some outliers are trying.

We have to understand that there’s still a human layer, and a lot of people might prefer calling a human, even if the result is worse, simply because we’re social beings. This can cost a lot of customers if companies believe they can just shove an AI in front of them.

No one really knows how good AI will get. As the technology advances, we find more and more hard-to-solve issues, for instance that AI will make things up or give wrong answers, despite “knowing” the real answer, if you pressure it hard enough.

Also for security reasons you can’t add AI everywhere, unless you want to send all secrets directly to Microsoft, Google or Facebook.

My 5 cents.

JimboDHimbo,

I find the way that you write peculiar, in a good way. I mean no offense, but is English your second language?

SomeGuy69,

Yeah, it’s my second language. Sorry, I wrote it a minute before bed; sometimes sentences become even weirder then. I went back and added some more commas. Haha

Eccitaze,
@Eccitaze@yiffit.net avatar

After reading this article that got posted on Lemmy a few days ago, I honestly think we’re approaching the soft cap for how good LLMs can get. Improving on the current state of the art would require feeding it more data, but that’s not really feasible. We’ve already scraped pretty much the entire internet to get to where we are now, and it’s nigh-impossible to manually curate a higher-quality dataset because of the sheer scale of the task involved.

We also can’t ask AI to curate its own dataset, because that runs into model collapse issues. Even if we don’t have AI explicitly curate its own dataset, it’s highly likely going to be a problem in the near future with the tide of AI-generated spam. I have a feeling that companies like Reddit signing licensing deals with AI companies are going to find that they mostly want data from 2022 and earlier, similar to manufacturers looking for low-background steel to make particle detectors.

We also can’t just throw more processing power at it because current LLMs are already nearly cost-prohibitive in terms of processing power per query (it’s just being masked by VC money subsidizing the cost). Even if cost wasn’t an issue, we’re also starting to approach hard limits in physics like waste heat in terms of how much faster we can run current technology.

So we already have a pretty good idea what the answer to “how good AI will get” is, and it’s “not very.” At best, it’ll get a little more efficient with AI-specific chips, and some specially-trained models may provide some decent results. But as it stands, pretty much any organization that tries to use AI in any public-facing role (including merely using AI to write code that is exposed to the public) is just asking for bad publicity when the AI inevitably makes a glaringly obvious error. It’s marginally better than the old memes about “I trained an AI on X episodes of this show and asked it to make a script,” but not by much.

As it stands, I only see two outcomes: 1) OpenAI manages to come up with a breakthrough–something game-changing, like a technique that drastically increases the efficiency of current models so they can be run cheaply, or something entirely new that could feasibly be called AGI, 2) The AI companies hit a brick wall, and the flow of VC money gradually slows down, forcing the companies to raise prices and cut costs, resulting in a product that’s even worse-performing and more expensive than what we have today. In the second case, the AI bubble will likely pop, and most people will abandon AI in general–the only people still using it at large will be the ones trying to push disinfo (either in politics or in Google rankings) along with the odd person playing with image generation.

In the meantime, what I’m most worried for are the people working for idiot CEOs who buy into the hype, but most of all I’m worried for artists doing professional graphic design or video production–they’re going to have their lunch eaten by Stable Diffusion and Midjourney taking all the bread-and-butter logo design jobs that many artists rely on for their living. But hey, they can always do furry porn instead, I’ve heard that pays well~

melpomenesclevage,

Missing the point.

AI won’t so much replace labor as make it more fungible, and thus exploitable/abusable.

Except where it’s used as an excuse to just… not. “Yes, we have customer service; it’s just all ChatGPT with no permissions,” so nobody can ever return shit that was delivered broken.

Leate_Wonceslace,
@Leate_Wonceslace@lemmy.dbzer0.com avatar

59% of execs are wrong.

melpomenesclevage,

I think that’s a little low.

UnderpantsWeevil,
@UnderpantsWeevil@lemmy.world avatar

They’ll be replaced with AI

Harbinger01173430,

Thankfully I don’t even wanna work. I just wanna live and if that’s not possible, exist.

Punk_face,

Same. I welcome our AI overlords as long as that means I can just stay at home and fully embrace my autism by not giving a fuck about the workforce while studying all of the thousands of subjects I enjoy learning about.

Ultragigagigantic,
@Ultragigagigantic@lemmy.world avatar

I say AI overlords might be an improvement over the human overlords that have persisted throughout human history.

menemen,
@menemen@lemmy.world avatar

The AI overlords will be trained on data based on human overlords decisions and justifications. We are fucked, my man.

echodot,

They won’t be, though, because the managers don’t know anything about AI. The people who actually train the AI will be some poor sap in IT who’s been lumbered with a job they don’t want, because AI is computers, right?

So I’m going to train it on good stuff written by professionals, Star Trek episodes, and make it watch War Games.

The managers don’t even have any data sets the AI could absorb anyway because most of their BS is in person, and so not recorded for analysis.

menemen,
@menemen@lemmy.world avatar

Oh my. I see you don’t know much about the hell called key performance indicators…

Key performance indicators are what will turn our AI overlords into AI tyrants. And there is so, so much data available for training the AIs.

melpomenesclevage,

Not a thing til the revolution, dear.

echodot,

The autism is not required. No one cares about their jobs, especially people who work in jobs where “everyone is a family”. People care about those jobs the least.

markon,

I will never care if AI takes mandatory work from me, but I want income replacement lol. Seriously though, I hate working so much that every job I’ve ever had has made me suicidal at some point. I’m glad there’s at least a chance I won’t have nothing but work and death ahead of me. If that’s all that’s left, it’s okay; a little disappointing, but it is what it is.

melpomenesclevage,

Not allowed. Work or die, I’m afraid.
