Posts


AmenZwa, to random
@AmenZwa@mathstodon.xyz avatar

Since I learned to drive many decades ago, I've always adjusted the three rear-view mirrors (inside and outside) to cover the full rear 180° view—no blind spots. But most drivers I know adjust their outside mirrors so that they can see the rear end of their car in those mirrors, which creates blind spots between the mirrors' coverage and their peripheral vision.

rjblaskiewicz,
@rjblaskiewicz@mstdn.social avatar

@AmenZwa My parents have a feature on their newish car where a warning light flashes on the outside mirror whenever someone is in your blind spot. It's damned handy.

AmenZwa, to random
@AmenZwa@mathstodon.xyz avatar

Almost every practising horologist, luthier, or craftsman takes delight in his work. He takes pride in mastering the tools and techniques of the field. He makes it a point to study the history and background of the field, the tools, the techniques, and the thought leaders.

As programmers, it behooves us to follow suit. We must know the languages, the algorithms, and the exponents of our field.

AmenZwa, to IT
@AmenZwa@mathstodon.xyz avatar

Most programmers have never heard of the Curry-Howard isomorphism between type theory and proof theory (type \(\equiv\) proposition, programme \(\equiv\) proof), which the functional programming and formal methods communities have exploited for decades.

Knowing the techniques is well and good, but understanding the theories matters, at least as much.
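
To make the correspondence concrete, here is a minimal sketch in Haskell (my own illustration, not from the thread): each type below reads as a logical proposition, and each total function inhabiting it is a proof of that proposition.

```haskell
-- Under Curry-Howard, a type is a proposition and a total programme
-- inhabiting that type is a proof of it.

-- Proposition: ((A implies B) and A) implies B (modus ponens).
-- Proof: apply the function to the argument.
modusPonens :: (a -> b, a) -> b
modusPonens (f, x) = f x

-- Proposition: (A and B) implies (B and A): conjunction commutes.
-- Proof: swap the components of the pair.
andComm :: (a, b) -> (b, a)
andComm (x, y) = (y, x)

-- Proposition: (A or B) implies (B or A): disjunction commutes.
-- Proof: case analysis on which disjunct holds.
orComm :: Either a b -> Either b a
orComm (Left x)  = Right x
orComm (Right y) = Left y

-- The proofs also compute: modus ponens applied to (+ 1) and 41.
main :: IO ()
main = print (modusPonens ((+ 1), 41))  -- prints 42
```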

dougmerritt,
@dougmerritt@mathstodon.xyz avatar

@AmenZwa
Plain old coders don't know any theory, let alone advanced theory. CS degrees exist for a reason.

You and I may self-teach theory, but you already know that almost no one does that, percentage-wise.

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@dougmerritt Depressing, isn't it....

AmenZwa, to Futurology
@AmenZwa@mathstodon.xyz avatar

I get ridiculed by young JavaScript and Python coders, whenever I say that parallel processing is essential to the future of computing.

The seasoned among them point out to me that the idea of parallelism is almost as old as me, that their iPhone can run rings round a typical supercomputer I may have used in my grad school days, and that their Python programmes running on laptops can beat anything I may have written on a Cray in Fortran or C. Those points seem valid, but they miss the mark.

First, just outrunning a 30-year-old system is not a legitimate measure of current performance.

Secondly, if modern hardware performance has reached a level where a naïve implementation of an algorithm in a slow scripting language can beat a hand-tuned parallel programme running on an old supercomputer, then today's programmers have the ethical responsibility to optimise their software implementations by exploiting those newer, greater hardware capabilities available to them.

Thirdly, if there is so much excess hardware capacity, the software should soak that up by striving for more accuracy, more precision, more features, whatever, but not by mining bitcoins.

Lastly, just about every consumer-grade machine today—server, desktop, laptop, tablet, phone, single-board computer—is a multicore, multiprocessor monster. Programmers should be exploiting those readily available parallel resources, now. Automatic performance upgrade of sequential code by Moore's law and Dennard scaling is dead and gone. And fully automatic parallelisation of sequential code by compilers is still a distant dream.
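
To make that concrete, here is a minimal sketch of tapping those cores in Haskell (my own example, not the poster's; it assumes GHC with the parallel package, compiled with -threaded and run with +RTS -N):

```haskell
import Control.Parallel.Strategies (parMap, rdeepseq)

-- A deliberately expensive pure function standing in for real work.
heavy :: Int -> Int
heavy n = sum [(i * i) `mod` (n + 1) | i <- [1 .. 200000]]

main :: IO ()
main = do
  -- parMap sparks one parallel evaluation per list element, spreading
  -- the work across however many cores the runtime is given (+RTS -N).
  let results = parMap rdeepseq heavy [1 .. 64]
  print (sum results)
```

The sequential version is just `map heavy [1 .. 64]`; the one-word change to `parMap` is the point: the parallel resources are sitting there waiting to be used.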

Parallelism matters—especially today.

AmenZwa, to technology
@AmenZwa@mathstodon.xyz avatar

I would never think of walking up to a neurosurgeon and lecturing him on the best use of a micro Kerrison. As an engineer, I know well how that delicate instrument is designed and how it functions. But I know even better that, given my complete lack of medical knowledge, I should never go anywhere near a live patient with a split-open skull, let alone flail that instrument about near his head.

Yet, on several occasions, I've been accosted by doctors, lawyers, and businessmen alike, lecturing me on how best I should employ a particular #technology in my work—VoIP, Bluetooth, Java, you name it—with LLMs being their favourite tech-of-the-day at the moment.

I know some businessmen who fly on a weekly basis, practically living on an airliner. But none would dare deliver to a gathering of airline pilots a sermon on how best to perform an approach into Class Bravo airspace. So then, why would these businessmen feel comfortable enough to come tell me how I should do my job?

What is it about our field that makes every non-techie who has seen a Reddit post on some #technology believe he's now an expert in it?

thomy2000,
@thomy2000@fosstodon.org avatar

@AmenZwa Can I boost this twice somehow?

JohnJBurnsIII,
@JohnJBurnsIII@kzoo.to avatar

@thomy2000 @AmenZwa

Stick around. Someone from payroll, or the purchasing director's staff, will be by to help you out.

🤣

AmenZwa, to Turkey
@AmenZwa@mathstodon.xyz avatar

Holy smoked turkey....

One down, many more to go.

https://bbc.com/news/world-europe-68704375

rjblaskiewicz,
@rjblaskiewicz@mstdn.social avatar

@AmenZwa Woah. Nice

AmenZwa, to Watches
@AmenZwa@mathstodon.xyz avatar

Speaking as an old fancier of mechanical watches: back in the day, we owned one good piece, usually Swiss or Japanese with a black or a white dial, which we wore for decades, until we were gifted another watch on a birthday or some other significant event. But today, every watch enthusiast owns at least tens of watches, each colour-coordinated to the wardrobe. What's up with that!

ppn,
@ppn@mastodon.online avatar

@AmenZwa it reflects the fact that watches nowadays are less tools that happen to be pretty and more jewelry that happens to tell time.

AmenZwa, to Engineering
@AmenZwa@mathstodon.xyz avatar

A friend of mine, who is an airline pilot, asked me why I took a rather harsh stance against Boeing in their recent "troubles". The short answer is that I loathe their profit-driven philosophy.

The long answer is this. Engineering inevitably affects lives, and engineering disasters kill not by ones and twos but by the thousands. A company that has, for almost a century, been manufacturing aircraft must hold itself to a higher standard than a fly-by-night web development sweatshop when it comes to processes, especially those that affect safety. But Boeing, for the past 30 or so years, has raised profit above safety. This I detest, as an engineer.

AmenZwa, to LLMs
@AmenZwa@mathstodon.xyz avatar

Many young programmers I know are using LLMs to generate code. But none of them are contemplating the consequences.

In the past decade, AI has taken an exponential leap forward. And this appears to be just the beginning. If this field continues to progress even only linearly, AI will soon learn to program all machines. Human programmers will then follow the path of human computers—oblivion.

And AI, being a machine, does not need high-level programming languages; AI can program the hardware in its native language—machine code.

So, if humanity were to use AI to program machines, it is eminently more sensible to devote the resources to training AI to generate machine code, not snippets of Python code.

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@promovicz Right, there are many issues that the IT industry is not considering, not the least of which are copyright and licensing. Moreover, many programmers aren't even checking the correctness of the generated code snippets; they are relying on testing, instead.

Even if we don't use AI to program machines directly, in the near future code-generator-using humans will no longer know how to write programmes well. Then, humanity would have no alternative but to use AI to program machines directly.

promovicz,
@promovicz@chaos.social avatar

@AmenZwa IDK about those consequences. This is certainly possible, and some people will go down that route - but then, I don't buy the hype that much, and others won't either. But it's not a clear-cut situation (yet). I wouldn't want to decide yet if we can use the generators in a way that is actually beneficial in the long term, because the whole development situation around AI is not stable.

AmenZwa, to random
@AmenZwa@mathstodon.xyz avatar

The first time I heard Julian Lage was when he was backing Gary Burton in their Tiny Desk Concert. Gary was being Gary, of course. But Julian, only 25 at the time, shone through from the back.
https://youtu.be/fXjfvEcAV6w?si=T7c602xDRaTQrXuo&t=274

These days, I don't see Julian playing much of his Manzer jazz box; he seems to play his '54 Tele often. What tone....
https://youtu.be/q5ggv-5s4bs?feature=shared

dougmerritt,
@dougmerritt@mathstodon.xyz avatar

@AmenZwa I respect classic instruments and collectibles, and a 1954 Telecaster is both, but I eventually discovered that, for a solid body, tone lies partly in the effects + amp + speakers + ambient environment (obviously), then in the coils (which may or may not dominate the first set of factors, depending on the characteristics of both), but potentially mostly in the fingers.

(Hollow body tone does include the body, but not as much as people think: many people have made violins that are competitive with Stradivarius in tone in blind tests. It's largely not about e.g. some magic varnish whose recipe was lost long ago -- which is not to say it's easy to make master-class instruments, just that there's lots of disproven mythology out there)

People have put identical strings from a standard Strat, say, on a cast-concrete body, and listeners in blind tests can't hear the difference. There's one data point.

Another data point is exemplified by a video I saw long ago, where a teenager is unhappy with the cheap $100 Fender his musician dad bought him, and is visited by his dad's friend, a famous guitarist (name forgotten). The guitarist grabbed the cheapo guitar and proceeded to play with beautiful tone -- not as good as on his $5K guitar he said, but it still proved the point; the kid needed to get good (git gud) first, then worry about high end equipment.

So Julian clearly has the magic fingers. :)

Incidentally, those $100 Fender guitars used to be made in Indonesia (I've got one) and in Japan, long long ago, and those were actually good quality, unlike a few years later.

AmenZwa,
@AmenZwa@mathstodon.xyz avatar

@dougmerritt I like your thought process—no surprise there—birthday. Hm....😀

I assure you, I'm no great guitarist, but I've played the guitar longer than I've written programmes. The difference is that I made sure I improved incrementally at programming through the years, whereas I just noodle on the fretboard. But boy, was it fun—on both fronts.

Some of the guitarists (jazz, naturally) I know have ditched their 30 kg amps and gone portable. They're very happy with laptops or modelling boxes. I don't blame them. Most of the time, they had to carry the amp and then mic it up to the PA, which destroys the tone. Now, they pull out a palm-sized modelling amp and plug directly into the PA, and they're no worse off.

I grew up on the AC30, and it's still my favourite. And no, I don't use pedals; I use the knobs on the Strats and on the amp. Of the portable amps, I like the DV Mark Jazz—nope, not valve; it's transistor. Try it out at your local guitar shop; it's great for clean tone.

Yeah, guitar instructors—we could always use one, regardless of our playing experience and our age. Look up old John; see if you could reconnect with him, if he's still out and about. I hope so.

AmenZwa, to random
@AmenZwa@mathstodon.xyz avatar

Fortran and Cobol are dead.
Long live Fortran and Cobol.

Modern Fortran is indispensable for high-performance, scientific computing, like weather simulation on supercomputers. Modern Cobol is indispensable for high-throughput, business computing, like financial transaction processing on mainframes.

But Fortran and Cobol suffer from an image problem. Young programmers will not devote their careers to these seemingly dead languages. As such, many Fortran and Cobol shops are desperately trying to "modernise" their codebases by translating them into C++, Java, Python, etc.

This is a mistake. A weather forecast that takes a couple of hours for a Fortran implementation that runs on a 1000-CPU supercomputer will take months for a Python version that runs in an enterprise cloud. Analogous examples abound for Cobol. These niche systems are cloud-proof—they will not bend to the charms of cloud computing.

New language features and implementation techniques are continuously, albeit gradually, being integrated into Fortran and Cobol, and new supercomputers and mainframes are still being designed and manufactured. Yet, there is no injection of new programmers into these specialised domains.

A sensible approach, then, is this. Instead of converting pieces of code written in 60-year-old languages into code written in 30-year-old languages, design brand-new languages—with dependent types, algebraic data types, type inference, memory safety, and other accoutrements of modernity—that target standardised Fortran and Cobol, much like TypeScript and ReScript target standardised JavaScript to "modernise" web development. And if these new languages become established, retarget them to binary.
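
To make the proposal concrete, here is a toy sketch in Haskell (every name in it is hypothetical, and a real frontend would be vastly richer): a minuscule typed expression language whose backend emits standard Fortran source, in the way TypeScript's compiler emits JavaScript.

```haskell
-- A toy frontend that compiles to Fortran source, in the spirit of
-- TypeScript compiling to JavaScript. All names here are hypothetical.

data Ty = TInt | TReal deriving (Eq, Show)

-- Abstract syntax of a tiny typed expression language.
data Expr
  = IntLit Integer
  | RealLit Double
  | Add Expr Expr
  | Mul Expr Expr

-- The "modern" part: the frontend type-checks before emitting anything.
typeOf :: Expr -> Maybe Ty
typeOf (IntLit _)  = Just TInt
typeOf (RealLit _) = Just TReal
typeOf (Add a b)   = sameTy a b
typeOf (Mul a b)   = sameTy a b

sameTy :: Expr -> Expr -> Maybe Ty
sameTy a b = do
  ta <- typeOf a
  tb <- typeOf b
  if ta == tb then Just ta else Nothing

-- The backend: emit Fortran source text for a well-typed expression.
emit :: Expr -> String
emit (IntLit n)  = show n
emit (RealLit r) = show r
emit (Add a b)   = "(" ++ emit a ++ " + " ++ emit b ++ ")"
emit (Mul a b)   = "(" ++ emit a ++ " * " ++ emit b ++ ")"

-- Wrap an expression in a minimal Fortran program that prints it.
compile :: Expr -> Either String String
compile e = case typeOf e of
  Nothing -> Left "type error: mixed integer and real operands"
  Just _  -> Right (unlines
    [ "program main"
    , "  implicit none"
    , "  print *, " ++ emit e
    , "end program main"
    ])

main :: IO ()
main = either putStrLn putStr
  (compile (Add (IntLit 2) (Mul (IntLit 3) (IntLit 4))))
```

The frontend refuses to emit anything for ill-typed input; the generated Fortran then runs on the existing, battle-tested toolchain untouched, which is the whole appeal of the approach.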

AmenZwa, to random
@AmenZwa@mathstodon.xyz avatar

Nature has "survival of the species". The Party has "survival of the Party".

AmenZwa, to ComputerScience
@AmenZwa@mathstodon.xyz avatar

Curry-Howard correspondence is a foundational principle of modern type theory and programme verification in computer science.

Its evil twin is Hurry-Coward, which states that bold programmers who hurry their type designs turn into cowards on go-live date.
