LeFantome,

Implementing old standards does not magically result in unstable software. I can create software today that implements decades-old standards using whatever whiz-bang tech is in vogue.

I do not accept that “old bases” have to succumb to any of the things you suggest either. Refactoring is a thing. You can remove dead code, you can adopt new patterns, you can make code modular, you can even extend with new tech if you want.
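
To make “extend with new tech” concrete, here is a minimal sketch of the in-place approach: one routine of a legacy C codebase rewritten in Rust behind the same C ABI, so existing callers never notice. The `checksum` function and its signature are hypothetical, purely for illustration.

```rust
// Sketch: replacing a single legacy C routine with Rust, keeping the
// old ABI. The existing C callers keep using the same prototype:
//   uint32_t checksum(const uint8_t *buf, size_t len);

#[no_mangle]
pub extern "C" fn checksum(buf: *const u8, len: usize) -> u32 {
    // Guard against a null pointer coming from the C side.
    if buf.is_null() {
        return 0;
    }
    // SAFETY: the caller guarantees `buf` points to `len` valid bytes.
    let bytes = unsafe { std::slice::from_raw_parts(buf, len) };
    // Simple additive checksum; a real port would mirror the legacy
    // implementation's semantics exactly.
    bytes.iter().fold(0u32, |acc, &b| acc.wrapping_add(b as u32))
}
```

Build it as a `staticlib`, link it in, delete the old C function, and the rest of the codebase is untouched. That is modernization without a rewrite.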

Linux is 30 years old (the basic design is decades older). Should we throw it out? I vote no, but allowing Rust into the kernel seems like a good idea. How old is GCC? How old is Microsoft Office? How old is Firefox? This is software you may use every day. Trust me, your life relies on software that is much, much older. How often do you think they rewrite air traffic control systems or core financial software to make them more “stable”, as you suggest?

I mostly hear your argument when devs want to try new tech and cannot justify it any other way. Most often the result is something that is far buggier and missing many features. By the time the features return, the new code is at least as bloated as the original. Around then, somebody usually suggests a total rewrite.

Old architectures are a different story. Sometimes things are not worth fixing in place. In my experience though, this is fairly rare. Even then, in-place migration to something else often makes more sense.

In my view, if you cannot modernize an old code base, it is a skills issue.
