AmenZwa
@AmenZwa@mathstodon.xyz

I am an ordinary man. So, I am unique up to isomorphism.
🇺🇸 JD ⊥ MSCS Ω BSEE 🇺🇦
Washington, DC

AmenZwa, to random

Since I learned to drive many decades ago, I've always adjusted the three rear-view mirrors (inside and outside) to cover the rear 180° view—no blind spots. But most drivers I know adjust their outside mirrors so that they can see the rear end of their car in those mirrors, which creates blind spots between the mirrors' coverage and their peripheral vision.

AmenZwa, to random

Almost every practising horologist, luthier, or craftsman takes delight in his work. He takes pride in mastering the tools and techniques of the field. He makes it a point to study the history and background of the field, the tools, the techniques, and the thought leaders.

As programmers, it behooves us to follow suit. We must know the languages, the algorithms, and the exponents of our field.

AmenZwa, to IT

Most programmers have never heard of the Curry–Howard isomorphism between type theory and proof theory (type \(\equiv\) proposition, programme \(\equiv\) proof), which practitioners of functional programming and formal verification have exploited for decades.

Knowing the techniques is well and good, but understanding the theories matters at least as much.
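
For instance, here is a minimal sketch of the correspondence in Haskell (one plausible choice of language, not the only one): each type below reads as a proposition, and a total function inhabiting that type is a proof of it.

-- Proposition: A ∧ B → A.  Proof: project the first component.
andElimL :: (a, b) -> a
andElimL (x, _) = x

-- Proposition: (A → B) ∧ (B → C) → (A → C).  Proof: compose the two functions.
compose :: (a -> b, b -> c) -> (a -> c)
compose (f, g) = g . f

An uninhabitable type, by the same reading, is an unprovable proposition.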

AmenZwa,

@dougmerritt Depressing, isn't it....

AmenZwa, to Futurology

I get ridiculed by young JavaScript and Python coders whenever I say that parallel processing is essential to the future of computing.

The seasoned among them point out to me that the idea of #supercomputers is almost as old as me, that their iPhone can run rings round a typical supercomputer I may have used in my grad school days, and that their Python programmes running on laptops can beat anything I may have written on a CRAY in Fortran or C. Those points seem valid, but they miss the mark.

First, just outrunning a 30-year-old system is not a legitimate measure of current performance.

Secondly, if modern hardware performance has reached a level where a naïve implementation of an algorithm in a slow scripting language can beat a hand-tuned parallel programme running on an old supercomputer, then today's programmers have the ethical responsibility to optimise their software implementations by exploiting those newer, greater hardware capabilities available to them.

Thirdly, if there is so much excess hardware capacity, the software should soak that up by striving for more accuracy, more precision, more features, whatever, but not by mining bitcoins.

Lastly, just about every consumer-grade machine today—server, desktop, laptop, tablet, phone, single-board computer—is a multicore, multiprocessor monster. Programmers should be exploiting those readily available parallel resources, now. The automatic performance upgrades that sequential code once got from Moore's law and Dennard scaling are dead and gone. And fully automatic parallelisation of sequential code by compilers is still a distant dream.

#Parallel #programming matters—especially today.
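
To make that concrete, here is a minimal sketch in Haskell (one choice among many; the workload and names are merely illustrative) in which a single combinator spreads an embarrassingly parallel map across the cores. Compile with -threaded and run with +RTS -N, and the same pure code that would have run on one core now uses them all.

import Control.Parallel.Strategies (parMap, rdeepseq)

-- A deliberately expensive pure function: the number of Collatz steps to reach 1.
collatzLength :: Int -> Int
collatzLength = go 0
  where
    go acc 1 = acc
    go acc k
      | even k    = go (acc + 1) (k `div` 2)
      | otherwise = go (acc + 1) (3 * k + 1)

-- parMap evaluates the mapped results in parallel across the available cores.
main :: IO ()
main = print (sum (parMap rdeepseq collatzLength [1 .. 200000]))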

AmenZwa, to technology

I would never think of walking up to a neurosurgeon and lecturing him on the best use of a micro Kerrison. As an engineer, I know well how that delicate instrument is designed and how it functions. But I know even better that, given my complete lack of medical knowledge, I should never go anywhere near a live patient with a split-open skull, let alone flail that instrument about near his head.

Yet, on several occasions, I've been accosted by doctors, lawyers, and businessmen alike on how best I should employ a particular technology in my work—VoIP, Bluetooth, Java, you name it, with LLMs being their current favourite tech of the day.

I know some businessmen who fly on a weekly basis, practically living on an airliner. But none would dare deliver a sermon to a gathering of airline pilots on how best to fly an approach into Class Bravo airspace. So then, why would these businessmen feel comfortable enough to come tell me how I should do my job?

What is it about our field that makes every non-techie who saw a Reddit post on some technology believe he's now an expert in it?

AmenZwa, to Turkey

Holy smoked ....

One down, many more to go.

https://bbc.com/news/world-europe-68704375

AmenZwa, to Watches

Speaking as an old fancier of mechanical watches, back in the day we owned one good piece, usually Swiss or Japanese with a black or a white dial, that we wore for decades, until we were gifted another watch on a birthday or some significant event. But today, every watch enthusiast owns tens of watches, each colour-coordinated to the wardrobe. What's up with that?

AmenZwa, to Engineering

A friend of mine, who is an airline #pilot, asked me why I took a rather harsh stance against #Boeing in their recent "troubles". The short answer is that I loathe their profit-driven #management philosophy.

The long answer is this. #Engineering inevitably affects lives, and engineering disasters kill not by ones and twos but by the thousands. A company that has been manufacturing aircraft for almost a century must hold itself to a higher standard than a fly-by-night web development sweatshop when it comes to processes, especially those that affect safety. But Boeing, for the past 30 or so years, has raised #profits above #safety. This, as an engineer, I detest.

AmenZwa, to LLMs

Many young programmers I know are using LLMs to generate code. But none of them is contemplating the consequences.

In the past decade, AI has taken an exponential leap forward. And this appears to be just the beginning. If this field continues to progress even just linearly, AI will soon learn to program all machines. Human programmers will then follow the path of human computers—oblivion.

And AI, being a machine, does not need high-level programming languages; AI can program the hardware in its native language—machine code.

So, if humanity were to use AI to program machines, it is eminently more sensible to devote the resources to training AI to generate machine code, not snippets of Python code.

AmenZwa,

@promovicz Right, there are many issues that the IT industry is not considering, not the least of which are copyright and licensing. Moreover, many programmers aren't even checking the correctness of the generated code snippets; they are relying on testing instead.

Even if we don't use AI to program machines directly, in the near future humans who lean on code generators will no longer know how to write programmes well. Then humanity will have no alternative but to use AI to program machines directly.

AmenZwa, to random

The first time I heard Julian Lage was when he was backing Gary Burton in their Tiny Desk Concert. Gary was being Gary, of course. But Julian, only 25 at the time, shone through from the back.
https://youtu.be/fXjfvEcAV6w?si=T7c602xDRaTQrXuo&t=274

These days, I don't see Julian playing much of his Manzer jazz box; he seems to play his '54 Tele often. What tone....
https://youtu.be/q5ggv-5s4bs?feature=shared

AmenZwa,

@dougmerritt Yea, guitarists quibble over tone factors like programmers battle over languages.

Guitar tone is not truly in the tone wood, effects, amp, amp box wood, mic, cables, etc.; it's primarily in the fingers. I was, of course, referring to Julian's touch, not his wood, when I remarked about his tone.

Yeah, we agree that those famous disputes over tone are as pointless as those tiresome disputes over the best programming language. I once met a kid who walked into a jazz gig with his large pedalboard and a Tele, and he promptly engaged the others in a debate about tone wood. I told him that once the guitar signal has passed through a phaser, flanger, echo, fuzz, wah, compressor, etc., talk of tone wood is moot. He dismissed me with an "OK, Boomer".

You're absolutely right. That $100 Squier is better than the real Fender I used to play in the 1970s. But then, that was a CBS Fender, so.... It's mighty impressive what Mexico, Indonesia, and Korea are doing these days. They're even better than the 1980s Japanese guitars. And at that price!

By the way, have you seen Fender's take on the Les Paul? I'm a Strat guy, and I most likely won't pick up a Tele or anything Gibson. But this Tele tempts me.

https://www.fender.com/en-US/electric-guitars/telecaster/special-edition-custom-telecaster-fmt-hh/0262004520.html

AmenZwa,

@dougmerritt I like your thought process—no surprise there—birthday. Hm....😀

I assure you, I'm no great guitarist, but I've played the guitar longer than I've written programmes. The difference is that I made sure I improved incrementally at programming through the years, whereas I just noodle on the fretboard. But boy, was it fun—on both fronts.

Some of the guitarists (jazz, naturally) I know have ditched their 30 kg amps and have gone portable. They're very happy with laptops or modelling boxes. I don't blame them. Most of the time, they had to carry the amp and then mic it up to the PA, which destroys the tone. Now, they pull out a palm-sized modelling amp, plug directly into the PA, and they're no worse off.

I grew up on the AC30, and it's still my favourite. And no, I don't use pedals; I use the knobs on the Strats and on the amp. Of the portable amps, I like the DV Mark Jazz—nope, not valve; it's transistor. Try it out at your local guitar shop; it's great for clean tone.

Yeah, guitar instructors—we could always use one, regardless of our playing experience and our age. Look up old John; see if you could reconnect with him, if he's still out and about. I hope so.

AmenZwa, to random

Fortran and Cobol are dead.
Long live Fortran and Cobol.

Modern Fortran is indispensable for high-performance, scientific computing, like weather simulation on supercomputers. Modern Cobol is indispensable for high-throughput, business computing, like financial transaction processing on mainframes.

But Fortran and Cobol suffer from an image problem. Young programmers will not devote their careers to these seemingly dead languages. As such, many Fortran and Cobol shops are desperately trying to "modernise" their codebases by translating them into C++, Java, Python, etc.

This is a mistake. A weather forecast that takes a couple of hours for a Fortran implementation that runs on a 1000-CPU supercomputer will take months for a Python version that runs in an enterprise cloud. Analogous examples abound for Cobol. These niche systems are cloud-proof—they will not bend to the charms of cloud computing.

New language features and implementation techniques are continuously, albeit gradually, being integrated into Fortran and Cobol, and new supercomputers and mainframes are still being designed and manufactured. Yet, there is no injection of new programmers into these specialised domains.

A sensible approach, then, is this. Instead of converting pieces of code written in 60-year-old languages into code written in 30-year-old languages, design brand-new languages—with a dependent type system, algebraic data types, type inference, memory safety, and other accoutrements of modernity—that target standardised Fortran and Cobol, much like TypeScript and ReScript target standardised JavaScript to "modernise" web development. And if these new languages become established, retarget them to binary.
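
To make the idea concrete, here is a toy sketch, written in Haskell purely for illustration (the mini-language, names, and types are invented for this sketch, not an existing tool): a tiny typed expression form that is type-checked first and only then pretty-printed as Fortran source text, in the same spirit as TypeScript emitting JavaScript.

-- A two-type toy language: integers and reals, mirroring Fortran's INTEGER and REAL.
data Ty = TInt | TReal deriving (Eq, Show)

data Expr
  = IntLit Int
  | RealLit Double
  | Var String Ty
  | Add Expr Expr
  deriving Show

-- A rudimentary checker: addition requires both operands to have the same type.
typeOf :: Expr -> Either String Ty
typeOf (IntLit _)  = Right TInt
typeOf (RealLit _) = Right TReal
typeOf (Var _ t)   = Right t
typeOf (Add a b)   = do
  ta <- typeOf a
  tb <- typeOf b
  if ta == tb then Right ta else Left "type mismatch in addition"

-- Emit Fortran expression text only for terms that pass the checker.
emitFortran :: Expr -> Either String String
emitFortran e = typeOf e >> Right (go e)
  where
    go (IntLit n)  = show n
    go (RealLit x) = show x
    go (Var v _)   = v
    go (Add a b)   = go a ++ " + " ++ go b

A richer front end (type inference, algebraic and dependent types) would sit above this, but the emitted target stays plain, standardised Fortran that existing compilers and supercomputers already understand.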

AmenZwa, to random

Nature has "survival of the species". The Party has "survival of the ".

AmenZwa, to ComputerScience

The Curry–Howard correspondence is a foundational principle of modern type theory and programme verification in CS.

Its evil twin is Hurry–Coward, which states that bold programmers who hurry their type designs turn into cowards on the go-live date.

AmenZwa, to cs

How many confusing, callous, condescending ways are there to teach the concept of to the incoming undergrads?

FeralRobots, to random

Finally figured out who Boebert's been reminding me of: Edith Prickley. & now I want to find Andrea Martin & apologize to her.

AmenZwa,

@FeralRobots

Prickley is a legend in her own time.

Boebert is a legend in her own mind.

Funny how little things make big differences....

AmenZwa,

@FeralRobots Yeah, Andrea is the true legend in her own time.

AmenZwa, to random

The new SIM-swapping attack against two-factor authentication applies the oldest technique: social engineering.

https://youtu.be/A73BdBxnYl0?si=X6zWs4S-UCksbw3o

AmenZwa, to cs

It is common knowledge that it takes 10,000 hours of sustained practice to become proficient in any substantive endeavour. That is, to become an expert, one must work on just one thing for 5 years—50 work weeks per year at 40 work hours per week is 2,000 hours per year, hence 10,000 hours over five years.

Let us apply this aphorism to CS and IT. Those four years of undergrad studies do not count toward the 10,000 hours, because the undergrad has neither the ability nor the opportunity to focus on one thing. So, to become an expert in one area of study or one segment of technology, one must earn a PhD in one narrow academic research area or, equivalently for someone with a BS, work in industry for at least five years applying just one technology exclusively.

But in today's climate of technological churn, that level of devotion to one area of technology is tantamount to career suicide. Sad....

AmenZwa, to programming

Like others of my ilk, I've used many different languages, through the decades. Of those, a handful managed to blow my mind upon first encounter:

• 6502 assembly—it was a shock to my system, because it was my first language, which I learned in the early 1980s as an undergrad learning to incorporate microprocessors into electronic circuits
• C—I was stunned by the friendliness of a high-level PP language, compared to assembly (depending on one's perspective, C is high level)
• LISP—it wasn't the "lost in stupid parentheses" syntax that shook me, because in the company of assembly any syntax is good syntax, but it was the power of the LISP macros, compared to the criminally insane C macros, that blew me away (another plus was that I came across 𝜆-calculus by way of LISP)
• Smalltalk—forty years ago, OO wasn't yet the thing it is today, but what a refreshing take on design and organisation Smalltalk was, compared to the then-prevailing PP approaches
• ML—the Hindley–Milner type system and its inference swept me away (it was my first encounter with Type Theory), and later so did the Standard ML functors (SML modules introduced me to Category Theory)
• Haskell—a professor of mine in grad school introduced me to Haskell in the early 1990s, and I'd just say that it was a stunner on many levels

I pity today's youngsters. Many of them just learn Python and are done. Learning a high-level, interpreted scripting language (one that has picked up loads of new features over a three-decade lifespan, has accumulated tonnes of technical debt, and is now being used on an enterprise, nay global, scale) as one's first, and possibly only, language is unkind to the mind of a programmer.
