AmenZwa,
@AmenZwa@mathstodon.xyz avatar

In software, the #C language rules, and for many good reasons, too. I have loved C for more than four decades. But I admit that C is woefully dated, inherently unsafe, and imposes a high cognitive load upon the programmer.

Newer languages vying to knock C off its perch (Rust, Nim, Zig, Odin, etc.) are too cute and complicated to suit real-time, embedded work.

The embedded sector needs a new language with C's simplicity, efficiency, semantics, and determinism, and Haskell's safety, expressiveness, syntax, and dash.

qqmrichter,
@qqmrichter@mastodon.world avatar

@AmenZwa Personally, I would kill my grandmother to get a decent Ada job in the embedded space.

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@qqmrichter Ah mate, you're an idealist.🤣

Jokes aside, Ada is still used in prominent places within the DoD, and the DoD is desperately trying to "modernise" it away and displace it with Python.🤷‍♂️

dougmerritt,
@dougmerritt@mathstodon.xyz avatar

@AmenZwa @qqmrichter
Replacing Ada with Python? I thought you said "jokes aside"??

Anyway yeah I've been under the impression that there are always some jobs in embedded Ada.

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@dougmerritt @qqmrichter Yeah, no joke—Python is very good at the strangle-swallow-shitout job.

In the federal sector, the government is desperately trying to replace FORTRAN, Ada, and COBOL. They can no longer find new programmers to work on old systems. Depending on which shitty contractor is doing the job, the replacement language is usually C# or Python.

But I draw the line when they start proposing to replace Verilog and VHDL with Python.

dougmerritt,
@dougmerritt@mathstodon.xyz avatar

@AmenZwa @qqmrichter
IIRC Fortran eventually added some cutting-edge-for-the-1970s features like goto-less control constructs, but no doubt the primary need is maintenance of 1965-era Fortran programs. 🙄

(I had a student job doing maintenance on a one-million-line Fortran program that simulated passive solar buildings over a year's worth of weather; some ancient cruddy things are actually worthwhile...)

"But I draw the line, when they start proposing to replace Verilog and VHDL with Python."

Ha! It was probably proposed and only reluctantly un-proposed.

Oh wait. Sigh, I should have known. The HDL "Amaranth" is based on Python. https://github.com/amaranth-lang/amaranth

Isn't there some ML-based HDL?

Let's see...Bluespec, based on Haskell https://en.wikipedia.org/wiki/Bluespec

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@dougmerritt @qqmrichter Oh, mate, "will no one rid [us] of this [pesky Python]?".🤣

dougmerritt,
@dougmerritt@mathstodon.xyz avatar

@AmenZwa @qqmrichter
I don't hate Python (maybe because I've only used it lightly?) but there's such a thing as the right tool for the job.

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@dougmerritt @qqmrichter I don't hate Python, either; in fact, I quite like it—for what it is. But when those priests started delivering "Python everywhere" sermons, it soured for me.

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter funnily enough, I don't like Python either--Amaranth is written in Python because I needed a HDL that
(a) uses a real language for a metalanguage (SV's metalanguage is emphatically not one, and the support for it is still nonexistent in some toolchains)
(b) is accessible to people who have neither learned Ada nor spent five years dealing with every wart of SV

if not for (b), I'd have based it on OCaml

dougmerritt,
@dougmerritt@mathstodon.xyz avatar

@whitequark @AmenZwa @qqmrichter
Very interesting, thanks for your note!

whitequark,
@whitequark@mastodon.social avatar

@dougmerritt @AmenZwa @qqmrichter several years into it, I can say that the accessibility aspect has worked out great, and by putting a lot of effort into tooling--an amount far higher than goes into your typical new HDL, and comparable with Rust--the end product is quite nice to use

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@whitequark @dougmerritt @qqmrichter Creating an HDL is a praiseworthy achievement, indeed. And without detracting from your hard work, I'd like to ask if you've explored Xtext, MPS, and other "language workbenches" that are specifically designed for creating external DSLs.

Xtext is built on the Eclipse EMF, and a DSL created in Xtext inherits Eclipse IDE integration. MPS, too, is open source; it comes from JetBrains. A DSL created in MPS gets IntelliJ IDEA integration.

Both frameworks produce external DSLs with complete tooling, so they offer the language designer complete freedom to design the syntax and semantics that best match the specific task at hand.

Not a suggestion, just a question from a curious bloke.

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter I don't think that would serve the goal (b) in this case--it is easy to find programmers who can somewhat do Python, and reasonably feasible to teach them the basics of a HDL (when the HDL doesn't feel like it's actively making it more difficult for you to write correct code), but it's much, much harder to find people who will commit to using your completely new HDL metalanguage.

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter (You might interject, "random Python programmers can't be expected to understand all the intricacies of FPGA development offhand!". And you'd be right, they can't. But it is much easier to teach them all that stuff when e.g. the entirety of arithmetic in the HDL can be described as "exactly the same as in Python, but assignment truncates to a known bit width".)
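
As a concrete illustration of that arithmetic rule, here is a minimal Amaranth sketch (the counter module is invented for illustration, not taken from the thread): the addition works like Python's `+`, and assigning the result into an 8-bit signal truncates it.

```python
from amaranth import Elaboratable, Module, Signal

class Counter(Elaboratable):
    """An 8-bit counter: Python-style arithmetic, hardware truncation."""
    def __init__(self):
        self.count = Signal(8)

    def elaborate(self, platform):
        m = Module()
        # `self.count + 1` behaves exactly like Python addition; assigning
        # it back into the 8-bit signal truncates, so 255 wraps to 0.
        m.d.sync += self.count.eq(self.count + 1)
        return m
```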

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter As a professional language designer whose career in computing was spent doing mostly that, I find it very easy to pick up new languages--I don't even notice it anymore--but equally I recognize how many people find it profoundly difficult, and that providing familiar semantics as a base lowers the entry barrier to the point where almost anyone can pick up enough to write passable HDL in a few days.

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter This enables applications such as the Glasgow Interface Explorer https://glasgow-embedded.org/latest/intro.html, which is an FPGA-based tool that doesn't come with a bitstream--instead it generates a bitstream on the fly to fit the exact configuration you want. The entire thing is in Python and even mechanical engineers can pick it up and write decent code with minimal training. I don't see that being enabled by almost any other solution.

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter To add to this from a different perspective, I come from a programming culture that emphasizes batch processing over IDEs (with a language server here and there), so having Eclipse and IntelliJ integration is just not something I find inherently desirable. I do put in a lot of work to get good diagnostics (clang-style) and I write all the tooling for that myself. You get used to it after the second or third time :)

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@whitequark @dougmerritt @qqmrichter There is only so much "bar lowering" one can do, though, at least in the hardware arena. It becomes a losing proposition (and a dangerous one) to admit too many low-bar programmers into the ranks.

But I'm sure there is a Q-point in this process.

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter I fundamentally disagree. Amaranth doesn't try to hide the complexities of RTL development from you (it's actually slightly lower level than Verilog, though not in a way that matters a lot). You still have to understand synchronous logic, CDC, IO primitives, all that. But what you don't have to deal with are the language quirks that people make entire careers of teaching others to avoid, and which require linters to get remotely acceptable code. Also,

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter you get a real package manager, a real build system, integration with most of the toolchains on the market, and other things that hardware developers for some reason tolerate the absence of.

So it's not that it aims to admit 'worse' programmers--rather, it aims to admit programmers who rightly don't have time for bullshit that the rest of the industry stopped considering acceptable in the '70s and '80s.
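
As an aside on the CDC point above: Amaranth doesn't hide clock-domain crossing, but it does ship the standard building block for it in `amaranth.lib.cdc`. A minimal sketch with invented signal names:

```python
from amaranth import Elaboratable, Module, Signal
from amaranth.lib.cdc import FFSynchronizer

class AsyncInput(Elaboratable):
    """Bring an asynchronous input safely into the sync clock domain."""
    def __init__(self):
        self.raw = Signal()     # asynchronous external input
        self.synced = Signal()  # safe to use in the sync domain

    def elaborate(self, platform):
        m = Module()
        # Two-flip-flop synchronizer: the textbook CDC primitive.
        m.submodules.sync_raw = FFSynchronizer(self.raw, self.synced)
        return m
```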

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@whitequark @dougmerritt @qqmrichter Good on ya. Have you sensed much resistance from the establishment in this sector?

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter It really depends on who you're talking to. Some people have an open mind and some don't. I'm in an odd position though where I've never even really advertised or promoted it--there's been more interest in the project organically than I've been able to reasonably support until recently. (I'm disabled and spent a few years not able to work full time.)

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter Oh, and I'm also working on CXXRTL, a new simulation engine that provides Verilator-like performance while also having IDE integration, 100% debug coverage (i.e. whenever you stop the simulation you can examine any signal, even those that were optimized out), a checkpoint-based record/replay system that obsoletes huge VCD files, mixed-language support, and other things you normally only get at a very high price tag.

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@whitequark @dougmerritt @qqmrichter That's very true, of course. But every programmer, even a junior, deals with tonnes of DSLs on a daily basis: HTML, CSS, SQL, GraphQL; the list keeps growing. And in hardware design, programmers tend to be of a higher calibre, because the job demands it. As such, they may well see the value of a well-put-together DSL.

But then, every hardware designer I know is hyper-conservative, again because the job demands it. Perhaps they'd object to a "new" language on that ground, too.

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter They do! One thing that is useful to point out is that Amaranth isn't targeting the SystemVerilog die-hards. It is targeting the less conservative people, yes, but also to a large extent people who would otherwise never do HDL at all because it's too difficult or time-consuming to learn all the ins and outs of a toolchain.

As for the DSLs people deal with: haven't we dealt with decades of SQL injection because people could not be bothered...

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter to learn even the absolute basics of SQL beyond what was sufficient to get something working? That was bad enough, but with a HDL metalanguage, we don't even have the luxury of having the domain and scope be limited; instead, this language will be tasked with doing just about anything, from file I/O to parsing XML to doing math to integrating with build systems to ... the list continues. If your metalanguage is powerful enough for all that, it is...

whitequark,
@whitequark@mastodon.social avatar

@AmenZwa @dougmerritt @qqmrichter also correspondingly difficult enough to get started with, much less become proficient in. (I think SystemVerilog itself demonstrates this pretty well! SystemVerilog, plus the Perl scripts and Emacs macros that do the heavy lifting of maintaining that SystemVerilog at every ASIC company.)

Hardware design is difficult enough without asking people to learn an entirely new metalanguage that is definitely not tested or documented nearly as well as e.g. Python.
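
The SQL-injection point above is easy to make concrete. A minimal Python sketch using the standard-library `sqlite3` module; the table and the hostile input are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")

name = "Robert'); DROP TABLE users; --"  # hostile "user input"

# What decades of injection bugs look like: splicing input into the query.
# conn.execute(f"SELECT * FROM users WHERE name = '{name}'")

# The absolute basics that were skipped: let the driver bind the parameter.
rows = conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
```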

paninid,
@paninid@mastodon.world avatar

@AmenZwa @dougmerritt @qqmrichter
Is it feasible to train an LLM on a FORTRAN, Ada, or COBOL codebase and have it “translate” into a different, newer language?

dougmerritt,
@dougmerritt@mathstodon.xyz avatar

@paninid @AmenZwa @qqmrichter
That's absolutely within the state of the art.

People are still arguing about how trustworthy LLMs are with programming in general, but that's not quite the same topic.

paninid,
@paninid@mastodon.world avatar

@dougmerritt @AmenZwa @qqmrichter

If it can do 80% of the “lift and shift” from one language to another, then the remaining 20% is still on a software engineer to identify the “plausible, but inaccurate” pieces?

dpflug,
@dpflug@hachyderm.io avatar

@paninid
I'm not up to date with the state of the art, but it seems like there'd be little chance of the output being idiomatic?

Not that these projects were typically given the time budget to achieve that even before AI.
@dougmerritt @AmenZwa @qqmrichter

ai6yr,

@dpflug @paninid @dougmerritt @AmenZwa @qqmrichter Okay, having written assembly / firmware / C (plus a ton of other languages) and now doing a lot of Python: there are some very specific (extremely specific) things you'd do in C to manipulate register values in embedded programming which cannot be translated into other languages nearly as efficiently or clearly. Having an LLM translate that kind of C code into Python (or anything else) is likely to result in something just like taking English, translating it to Japanese, translating it to German, and then back into English... there's drift, and at worst, the translation errors multiply...
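
To make that concrete: in C, the idiom is a volatile read-modify-write on a memory-mapped register, e.g. `*(volatile uint32_t *)addr |= 1u << pin`. The nearest Python analogue is a sketch like the one below, using MicroPython's `machine.mem32`; the register address is a hypothetical placeholder, not a real device's register map.

```python
# MicroPython sketch; runs only on a microcontroller port of MicroPython.
import machine

GPIO_ODR = 0x48000014  # hypothetical memory-mapped GPIO output data register

def set_pin(pin):
    # Read-modify-write: set one bit without disturbing the others.
    machine.mem32[GPIO_ODR] |= 1 << pin
```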

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@paninid @dougmerritt @qqmrichter In recent years, I've been asked that very question by many government officials. This is generally my response.

LLMs and other modern DL-type ML algorithms do very well with natural languages. But neural networks and statistical techniques will always have measurable error rates. That is acceptable for natural languages, because they are inherently ambiguous and humans have evolved to cope with the vagaries of life.

But programming languages are formal languages, and computers are Turing machines. As such, both hardware and software are deterministic, to a fault. So a single-bit error in the translation of a programme, be it stock trading, tax calculation, missile control, whatever, will incur serious but indeterminate harm. That is why I won't recommend using the current generation of AI to translate FORTRAN, COBOL, and other legacy code ("modernisation", as the government calls this effort).

The current approach is to use a repurposed compiler front-end to generate equivalent code in a new language. But this process is painful. Machine-generated code is ugly, incomprehensible, and unmaintainable. And translators invariably falter, because old code always contains special-case exploits that were necessary given the technological limitations of the past. As such, translation efforts require many old, expert programmers, whom we no longer have in abundance.

Another approach, which was popular in the 1990s before machine translation existed, was to replicate the old system using modern technologies. This, too, is flawed: modern practitioners have neither the original requirements nor the skills to analyse the old implementations.

anca,
@anca@mastodon.xyz avatar

@AmenZwa @paninid @dougmerritt @qqmrichter it’s kind of good that we can’t automatically perpetuate all of these systems. Some of them deserve to be rethought and reimplemented completely.

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@anca @paninid @dougmerritt @qqmrichter Yeah, that's a good point. But replicating old systems is fraught with its own problems. I know of only two such projects, both from the 1990s, and both failed.

The common problems I observed were these:

• These large, mainframe-based systems are always mission-critical applications, often even life-critical
• Even 30 years ago, those "old" systems were already far too old for the then-current owners and users to have insight into their "original intent"
• When the original systems were built in the 1970s, the builders observed the manual processes and built a decent digital replica, but by the 1990s the users only knew digital versions of the business processes
• The consultants hired to modernise them weren't mainframe folk and they had palpable disdain for mainframes
• The technical divide between mainframes and open systems was much greater then than it is today

That was a witch's brew of failure factors.
