Am I the only software engineer greatly worried and disturbed by AI?

OK, let’s give a little bit of context. I’ll turn 40 in a couple of months, and I’ve been a C++ software developer for more than 18 years. I enjoy coding, and I enjoy writing “good” code: readable and so on.

However, for the past few months, I’ve become really afraid for the future of the job I love, given the progress of artificial intelligence. Very often I can’t sleep at night because of this.

I fear that my job, while not completely disappearing, will become a very boring one consisting of debugging automatically generated code, or that the job will disappear altogether.

For now, I’m not using AI. A few colleagues do, but I don’t want to, for two reasons: one, it removes a part of the coding I like, and two, I have the feeling that using it is sawing off the branch I’m sitting on, if you see what I mean. I fear that in the near future, people not using it will be fired because management will see them as less productive…

Am I the only one feeling this way? I have the feeling all tech people are enthusiastic about AI.

grue,

[Image: the Gartner hype cycle]
Radicaldog,

Kind of nice to see NFTs breaking through the floor at the trough of disillusionment, never to return.

pineapplelover,

Currently at the crossroads between the trough of disillusionment and the slope of enlightenment.

ForestOrca,

Betteridge's law of headlines: No.

sunbrrnslapper,

The trough of disillusionment is my favorite.

BolexForSoup,

To answer your question directly: the debate has been going on in the broader public since ChatGPT dropped.

To answer how you’re feeling: that’s valid, because a lot of big pockets seem to not care at all about the ethical considerations.

Hestia,

I’ve been messing around with running my own LLMs at home using LM Studio, and I’ve got to say it really helps me write code. I’m using Code Llama 13B, and it works pretty well as a programmer assistant. What I like about using a chatbot is that I go from writing code to reviewing it, and for some reason this keeps me incredibly mentally engaged. This tech has been wonderful for undoing some of my professional burnout.

If what keeps you mentally engaged does not include a bot, then I don’t think you need any other reason to not use one. As much as I really like the tech, anyone that uses it is still going to need to know the language and enough about the libraries to fix the inevitable issues that come up. I can definitely see this tech getting better to the point of being unavoidable, though. You hear that Microsoft is planning on adding an AI button to their upcoming keyboards? Like that kind of unavoidable.

purpleprophy,

This might cheer you up: visualstudiomagazine.com/…/copilot-research.aspx

I don’t think we have anything to worry about just yet. LLMs are nothing but well-trained parrots. They can’t analyse problems or have intuitions about what will work for your particular situation. They’ll either give you something general copied and pasted from elsewhere or spin you a yarn that sounds plausible but doesn’t stand up to scrutiny.

Getting an AI to produce functional large-scale software requires someone to explain precisely the problem domain: each requirement, business rule, edge case, etc. At which point that person is basically a developer, because I’ve never met a project manager who thinks that granularly.

They could be good for generating boilerplate, inserting well-known algorithms, generating models from metadata, that sort of grunt work. I certainly wouldn’t trust them with business logic.
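To make the “models from metadata” case concrete, here’s a hypothetical Python sketch (the field names are invented): it’s exactly the kind of mechanical translation that’s easy to check, which is why it’s safer to delegate than business logic.

```python
from dataclasses import make_dataclass

# Field metadata as it might come out of a schema dump or an ORM
# (names and types here are invented for the example).
fields_meta = [
    ("id", int),
    ("name", str),
    ("email", str),
]

# make_dataclass writes out the class the same way a code generator
# (human or LLM) would type it by hand: pure mechanical translation.
User = make_dataclass("User", fields_meta)

print(User(id=1, name="Ada", email="ada@example.com"))
# -> User(id=1, name='Ada', email='ada@example.com')
```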

fievel,

I think you raise a very good point about explaining the problem… Even we “smart humans” often have great difficulty seeing the point while reading PM specs…

spez_,

Yeh

cobra89,

I’m gonna sum up my feelings on this with a (probably bad) analogy.

AI taking software developer jobs is the same thinking as microwaves taking chefs’ jobs.

They’re both just tools to help you achieve the same goal easier/faster. And sometimes the experts will decide to forego the tool and do it by hand for better quality control or high complexity that the tool can’t do a good job at.

LordGimp,

As a welder, I’ve been hearing for 20 years that “robots are going to replace you” and “automation is going to put you out of a job”, yadda yadda. None of you code monkeys gave a fuck about me and my job, but now it’s a problem because it affects you and your paycheck? Fuck you lmao, good riddance to bad garbage.

IphtashuFitz,

I’m a 50+ year old IT guy who started out as a C/C++ programmer in the ’90s, and I’m not that worried.

The thing is, all this talk about AI isn’t very accurate. There is a huge difference between the LLM stuff that ChatGPT etc. are built on and true AI. These LLMs are only as good as the data fed into them. The adage “garbage in, garbage out” comes to mind. Anybody who blindly relies on them is a fool. Just ask the lawyer who used ChatGPT to write a legal brief. The “AI” made up references to non-existent cases that looked and sounded legitimate, and the lawyer didn’t bother to check for accuracy. He filed the brief, and it was the judge who discovered it was a work of fiction.

Now, I know there’s a huge difference between programming and the law, but there are still a lot of similarities here. An AI-generated program is only going to be as good as the samples provided to it, and you’ll probably want a human to review that code to ensure it’s truly doing what you want, at the very least.

I also have concerns that programming LLMs could be targeted by scammers and the like: train the LLM to harvest sensitive information and obfuscate the code that does it, so that it’s difficult for a human to spot the malicious code without a highly detailed analysis of the generated code. That’s another reason to want to know exactly what the LLM was trained on.

l0st_scr1b3,

Yes of course you are.

fievel,

I probably should have used an LLM to help me write a clearer question :D

csm10495,

I use GitHub Copilot from work. I generally use Python. It doesn’t take away anything at least for me. It’s big thing is tab completion; it saves me from finishing some lines and adding else clauses. Like I’ll start writing a docstring and it’ll finish it.

Once in a while I can’t think of exactly what I want so I write a comment describing it and Copilot tries to figure out what I’m asking for. It’s literally a Copilot.

Now if I go and describe a big system or interfacing with existing code, it quickly gets confused and tends to get in the weeds. But man if I need someone to describe a regex, it’s awesome.
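To give a feel for it, here’s a made-up example of that comment-driven flow (not literal Copilot output): you type the comment and the signature, and it proposes the body.

```python
import re

# You write the comment and the signature; a Copilot-style assistant
# typically fills in the body. (Invented example, not real output.)

# Extract all ISO dates (YYYY-MM-DD) from a string.
def extract_iso_dates(text: str) -> list[str]:
    """Return every YYYY-MM-DD date found in text."""
    return re.findall(r"\b\d{4}-\d{2}-\d{2}\b", text)

print(extract_iso_dates("Released 2024-01-09, patched 2024-02-01."))
# -> ['2024-01-09', '2024-02-01']
```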

Anyways, I think there are free alternatives out there that probably work as well. At the end of the day, it’s up to you. Though I’d say don’t knock it till you try it. If you don’t like it, stop using it.

feoh,

This. I’ve seen SO much hype and FUD and all the while there are thousands of developers grinding out code using these tools.

Does code quality suffer? ONLY in my experience if they have belt wielding bean counters forcing them to ship well before it’s actually ready for prime time :)

The tools aren’t perfect, and they most DEFINITELY aren’t a panacea. The industry is in a huge contraction phase right now, so I think we have a while before we have to worry about AI-induced layoffs, and if that happens, the folks doing the laying off are being incredibly short-sighted and are likely to have a high-impact date with a wall in the near future anyway.

itsnotits,

Its* big thing

L0rdMathias,

It doesn’t matter what you think about AI. It’s very clear that this technology is here to stay and will only improve. From this point on, AI will become deeply integrated into human culture and technology; after all, we’ve been fetishizing it for almost 100 years now. Your only logical option as a developer is to learn how to use and abuse it. Choosing not to do so is career suicide, possibly even societal suicide, depending on how quickly adoption happens.

You’re probably right that in the near future, people who can’t use it will be fired. And to that point, they should be fired. Why the fuck would I allow my accountants to do their financial work on paper when Excel exists?

Welcome to the future.

daniyeg,

i’m still in uni so i can’t really comment on how the job market is reacting, or is going to react, to generative AI. what i can tell you is that it has never been easier to half-ass a degree. any code, report or essay written has almost certainly come from an LLM, and none of it makes sense, or it barely works. the only people not using AI are the ones without access to it.

i feel like it was always like this and everyone slacked as much as they could, but i just can’t believe it, it’s shocking. the lack of fundamental, basic knowledge has made working with anyone on anything such a pain in the ass. group assignments are dead. almost everyone else’s work comes from a chatgpt prompt that didn’t describe their part of the assignment correctly; as a result, not only is it buggy as hell, but when you actually decide to debug it you realize it doesn’t even do what it’s supposed to do, and now you have to spend two full days implementing every single part of the assignment yourself because “we’ve done our part”.

everyone’s excuse is “oh well, university doesn’t teach anything useful, why should i bother when i’m learning <insert js framework>?” and then you look at their project and it’s just another boilerplate react calculator app in which, you guessed it, most of the code is generated by AI. i’m not saying everything in college is useful and you’re a sinner for using somebody else’s code; indeed, be my guest, dodge classes and copy-paste stuff when you don’t feel like doing it, but at least give a damn about the degree you’re putting your time into and don’t dump your work on somebody else.

i hope no one carries this kind of sentiment towards their work into the job market. if most members of a team are using AI as their primary tool to generate code, i don’t know how anyone can trust anyone else on that team, which means more and longer code reviews and meetings, and thus slower production. with that, bootcamps getting scammier, and most companies giving up on junior devs, i really don’t think the software industry is headed in a good direction.

shasta,

I think I will ask people if they use AI to write code when I am interviewing them for a job and reject anyone who does.

olbaidiablo,

AI allows us to do more with less, just like any other tool. It’s no different from an electric drill or a powered saw. Perhaps in the future we will see more immersive-environment games, because much of the immersive environment can be made with AI doing the grunt work.

Yerbouti,

I’m a composer. My Facebook is filled with ads like “Never pay for music again!”. It’s fucking depressing.

cobra89,

Good thing there’s no Spotify for sheet music yet… I probably shouldn’t give them ideas.

suction,

As someone with deep knowledge of the field, quite frankly, you should know that AI isn’t going to replace programmers. Whoever says that is either selling a snake-oil product or their expertise as a “futurologist”.

viralJ,

Could you elaborate? I don’t have deep knowledge of the field; I only write rudimentary scripts to make some parts of my job easier. But from the few videos on the subject that I’ve seen, and from the few times I’ve asked AI to write a piece of code for me, I’d say I share the OP’s worry. What would you say is something that humans add to programming that can’t (and can never) be replaced by AI?

Ludrol,

It can’t reason. It can’t write novel high-quality, high-complexity code. It can only parrot what others have said.

sunbeam60,

90% of code is something already solved elsewhere though.

Ludrol,

AI doesn’t know whether the code it copies is correct. It will straight up hallucinate non-existent libraries just because they look good at first glance.

sunbeam60,

Depends on how you set it up. A RAG (retrieval-augmented generation) LLM verifies against a set of sources, so that would be very unlikely with the state of the art.
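For anyone who hasn’t met the term, here’s a toy sketch of the retrieval half (word overlap stands in for real embeddings, and the actual model call is deliberately omitted): fetch the best-matching sources, then build a prompt that forces the answer to come from them.

```python
# Toy RAG sketch: retrieve the best-matching source snippets, then
# build a prompt grounding the model in them. The scoring is a toy
# word-overlap measure standing in for real embeddings.

def score(query: str, doc: str) -> int:
    """Toy relevance score: number of words shared with the query."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k highest-scoring documents."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

docs = [
    "std::vector::reserve preallocates capacity without changing size.",
    "Python lists grow in amortized O(1) time via over-allocation.",
    "RAII ties resource lifetime to object scope in C++.",
]

query = "how does vector reserve affect capacity"
context = "\n".join(retrieve(query, docs))
prompt = f"Answer using ONLY these sources:\n{context}\n\nQuestion: {query}"
print(prompt)  # this is what would be sent to the model
```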

ggwithgg,

I think the need for programmers will always be there, but there might be a transition towards higher abstraction levels. This has actually always been happening: we started with a heavy focus on assembly languages, where we practically wrote machine code, but nowadays a much smaller portion of programmers works at that level; most do their stuff in Python, Java or whatever. It is not essential to know about garbage collection when you are writing an application, because the runtime already does that for you.

Programmers are there to tell a computer what to do. That includes telling a computer how to construct its own commands accordingly. So, giving instructions to an AI is also programming.

LarmyOfLone,

Yeah, that’s what I was just thinking. Once we somehow synthesize these LLMs into a new type of programming language, it gets interesting. Maybe a more natural language that gets the gist of what you’re trying to do, then a unit test to see if it works, and then you verify. Not sure if that can work.
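Sketched in Python (all names invented, just to make the loop concrete): the human-written unit test acts as the spec, and the generated function is accepted only if the test passes.

```python
import unittest

# The human-written "spec": a unit test pinning down what was asked
# for in natural language ("sum only the even numbers").

def sum_evens(numbers):  # imagine this body came back from an LLM
    return sum(n for n in numbers if n % 2 == 0)

class TestGeneratedCode(unittest.TestCase):
    def test_sum_evens(self):
        self.assertEqual(sum_evens([1, 2, 3, 4]), 6)
        self.assertEqual(sum_evens([]), 0)
        self.assertEqual(sum_evens([-2, 5]), -2)

if __name__ == "__main__":
    unittest.main()  # generated code is only trusted if this passes
```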

TBH I’m a bit shocked that programmers are already using AI to generate code; I only program as a hobby anymore. But it sounds interesting. If I could get more of my ideas done with less work, I’d love it.

I think that, fundamentally and philosophically, there are limits. Ultimately you need language to describe what you want to do. You need to understand the problem the “customer” has, formulate a solution, and then break it down into solvable steps. AI could help with that, but fundamentally it’s a question of description and the limits of language.

Or maybe we’ll see brain interfaces that can capture some of the subtleties of intent from the programmer.

So maybe we’ll see the productivity of programmers rise by 500% or something. But something tells me (Jevons paradox) that the economy would just use that increased productivity for more apps or more features. But maybe the required qualifications for programmers will be lowered.

Or maybe we’ll see AI generating programming libraries and development suites that are more generalized. Or existing crusty libraries rewritten to be more versatile and easier for AI-powered programmers to use. Maybe AI could help us create a vast library of more abstract, standard problems and solutions.

knightly,

Generative neural networks are the latest tech bubble, and they’ll only be decreasing in quality from this point on as the human-generated text used to train them becomes more difficult to access.

One cannot trust the output of an LLM, so any programming task of note is still going to require a developer for proofreading and bugfixing. And if you have to pay a developer anyway, why bother paying for ChatGPT?

It’s the same logic as Tesla’s “self-driving” cars, if you need a human in the loop then it isn’t really automation, just sparkling cruise control that isn’t worth the price tag.

I’m really looking forward to the bubble popping this year.

viralJ,

This year? Bold prediction.
