"The benchmark results show that #Egison pattern-matching embedded in #Gauche#Scheme is faster than the original Egison interpreter written in [#Haskell]"
@rml The published egison package shows that they use String extensively, and then transformer stacks in the interpreter. I wonder if there are a few orders of magnitude of improvement left on the table.
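A minimal sketch of the String concern, assuming the `text` package is available (the function names here are illustrative, not from the egison codebase): `String` is a lazy linked list of `Char`, so every traversal pays pointer-chasing and per-character allocation costs that `Data.Text`'s packed representation avoids.

```haskell
-- Hypothetical micro-comparison: the same workload over String vs Data.Text.
-- String = [Char], a lazy cons list; Text is a packed array under the hood.
import qualified Data.Text as T

-- Build a large value by repetition, then measure its length.
-- With String, `concat` walks and reallocates cons cells;
-- with Text, `T.replicate` works over contiguous buffers.
viaString :: String -> Int
viaString s = length (concat (replicate 1000 s))

viaText :: T.Text -> Int
viaText t = T.length (T.replicate 1000 t)

main :: IO ()
main = do
  print (viaString "pattern")          -- 7000
  print (viaText (T.pack "pattern"))   -- 7000
```

Same answer either way; the difference only shows up in allocation and cache behavior under a real benchmark harness (e.g. criterion), which is where interpreter hot loops would feel it.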
This sucks so much. ChatGPT took their jobs. Now they walk dogs and fix air conditioners. Technology used to automate dirty and repetitive jobs. Now, artificial intelligence chatbots are coming after high-paid ones. https://www.washingtonpost.com/technology/2023/06/02/ai-taking-jobs/
@travisfw I’ve seen a lot of humongous illegible diagrams that could have been a few lines of code. If there’s a tool to make it legible, I don’t know about it.
I’m not against no-code or anything like that. It is visual coding that makes my eyes hurt.
To be fair, I think Haskell will continue to fill the niche it filled ~10 years ago, around the time it started to get mainstream hype. Small teams of skilled devs delivering robust products that would normally require much larger teams to maintain will continue to prevail. Purely functional lazy programming was never bound for world domination in an economy which is antagonistic to curiosity, creativity and truth.
On the other hand, I have the feeling that we're going to see more and more Haskellers-turned-Rustaceans come to realize that #Rust does little to alleviate the primary barrier to Haskell's wider success -- fast and predictable turnaround time for projects developing cutting-edge technologies -- and will wind up going the same route as some major Haskell projects such as #Unison and #Idris have in recent years, which is to try #Chez Scheme, only to discover that it allows them to release blazing fast functional programs on a generic foundation where major breaking changes are practically non-existent, providing incredible flexibility while significantly reducing dependencies by dint of the ad-hoc tooling that falls out of the bottom of #scheme. Not to mention the joys that come from near-instant startup times, some of the fastest compile times you've ever encountered, fully-customizable interactive development and a surgical #debugger that rivals Haskell in sheer fun. Yesterday's naysayers will become tomorrow's enthusiastic bootstrappers. Or at least a boy can dream.
That said, in all seriousness I don't think Scheme will ever reach the heights of Haskell's moderate commercial success. But I do think that projects built on Scheme, like Unison, will get a leg up and eventually surpass it, and interest in #lisp will only grow.
The AMD guy says that "the Ally is an inflection point for portable gaming". How many blowjobs did he get to avoid mentioning the Steam Deck, which went live more than a year ago? #rogally #corruption #amd
If you feel the urge to explain to President #Biden that AI CEOs are the wrong people to talk with about #AI #ethics, I want you to understand that the president is not clueless. The real reason for such meetings is always this: "If voters get mad enough for us to have to regulate you, please tell us how we can do that without endangering your profits."
Same as Comcast were experts on net neutrality and Disney were experts on copyright.
@isagalaev If that is indeed their blueprint for the comprehensive regulation touted in that interview, then there would be no voters left to get mad. Alas, no profits either.
The “open source” models are parasitic on their behind-closed-doors overseers. I doubt it is even allowed under those APIs’ usage terms, but that isn’t relevant in the end.
Google has a moat here: they simply don’t (?) have a public API. It is OpenAI that has to sell away its core to remain afloat.
The incentive for the “foundational models” business here is to sell API access under tight contracts, with progressively steep fines for breaches, making them accessible only to progressively bigger B2B peers. And whack-a-mole any leaks, of course. “Intellectual property” gets a new ring to it.
But then there’s fundamental research, like the Google paper that brought us transformers. Even with more performance per dollar gains, the open source community is stuck with the published models until they collectively start doing their own research. This further incentivizes labs going dark.
Actually, this may even be good for AI Notkillingeveryoneism, as it would create more incentives for non-proliferation of capabilities.
But then, there’s this “commoditize your complement” drive that forces hardware vendors into fundamental research and open-sourcing capability gains, so that clients will buy their chips to run the newest and hottest models.
And this is worrying: even if AI labs go dark or go extinct, the hardware vendors would be happy to plunge us into the AIpocalypse.