Also I realized that using the transparency channel meant I had no need for gamma correction! But it did require constructing the image from parts. That’s what made it more fun, though.
"The focus of my research is applying #fp, in particular #chez#scheme, to low-level problems — the type of situations that usually call for #rust or #c"
— highly recommended talk on programming with serialized data from @vollmerm @ #ELSconf
@rml @theruran @vollmerm
I love how the minimal #Chez interpreter (without the compiler) on x86_64 Linux is only 346,800 bytes. That is genuinely small. And the Chez compiler is best-in-class, producing among the fastest and smallest binaries of any Lisp.
That said, Guile's #GPL licensing, its use in #GuixOS, and its extensive collection of libraries and SRFI support make #Guile a superior choice for practical applications (IMHO). Also, I now know of someone working on porting the "PreScheme" compiler from Scheme48 (a Scheme subset with no garbage collector) to Guile, for use in building low-level, high-performance binaries: https://gitlab.com/flatwhatson/guile-prescheme
> "I promise I'll convince everyone here that types are good by the end of this talk"
As a Haskeller, I do not need convincing at all. One thing that got me to even pay attention to #Scheme, however, was a conversation with a friend, William Byrd -- whose dissertation, by the way, is in relational logic programming from Indiana University under Dan Friedman, the same school as the presenter in this video. He explained to me that the power of Scheme comes both from its minimalism and from its macro system, which you can use to implement any type system you might want. Byrd told me he is frustrated by the world gravitating toward the Hindley-Milner type checking algorithm used by OCaml, F#, Haskell, Typed Racket, Coalton, Carp, and PreScheme, as if it were the end-all-be-all of type systems.
So anyway, Will Byrd convinced me how cool it is to be able to use any type system at all in Scheme. Hindley-Milner, CSP, the pi calculus, the Calculus of Constructions, separation logic, the location calculus (which I just now learned about!), or maybe even more exotic constraint systems modeled on physics -- use whatever is best for your problem domain.
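To give a toy flavor of what Byrd means: this is my own sketch, not his, and it's only runtime predicate checking rather than a real static type system, but the extension point -- a macro rewriting plain `lambda` -- is the same place where full checkers get layered onto Scheme.

```scheme
;; Toy sketch: a "typed lambda" macro that inserts runtime predicate
;; checks on its arguments. A real type system would analyze the code
;; statically, but it would hook into Scheme at this same macro layer.
(define-syntax typed-lambda
  (syntax-rules ()
    ((_ ((arg pred) ...) body ...)
     (lambda (arg ...)
       (unless (pred arg)
         (error "type error:" 'arg arg)) ...
       body ...))))

(define add1 (typed-lambda ((n number?)) (+ n 1)))

(add1 2)    ; => 3
(add1 "x")  ; => error: type error: n "x"
```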
Guix maintainers Janneke Nieuwenhuizen @janneke and Ludovic Courtès @civodul announced just today that their "seed" C compiler "Mes" is now in production in Guix OS. Mes can, after several bootstrapping stages, eventually compile GCC, which in turn compiles Linux, Guile, and Guix. The bootstrap program (as I understand it) is written in Guile Scheme and compiles to a 357-byte binary. Now when you do `guix pull`, you will see that the entirety of the core operating system (some 22,000 expressions) depends on that single 357-byte bootstrap program. The idea is to eliminate the footprint of trusted binaries that build the software for the OS and compiler toolchain -- the famous "Trusting Trust" problem outlined by Ken Thompson in the lecture he gave while receiving his Turing Award. Thanks to their hard work, we now have an operating system for which every stage of the build can be verified by a human. https://guix.gnu.org/blog/2023/the-full-source-bootstrap-building-from-source-all-the-way-down/
NixOS people need not feel left out: a new pull request on the NixOS GitHub page announces the start of a similar project. https://github.com/NixOS/nixpkgs/pull/227914
We are one month away from the next Lisp Game Jam! Make a dating sim in Emacs Lisp. Or make a Souls-like in Chicken Scheme (aka Chicken Scheme for the Souls). Or make a kart racer in Fennel. Or make a post-apocalyptic action platformer in Common Lisp. Or make a roguelike in Racket. Or make a farming sim in Guile. Or make a strand-type game in Clojure.
"Something I'm curious about working on is an imperative dependently typed programming language that uses linear types and #TypeTheory to keep the mutation in line. Something I have to admit is that I'm not actually interested in #FunctionalProgramming. I'm simply interested in #types."
Wow, Marc Nieper-Wisskirchen's new #scheme macros tutorial looks epic... perhaps the first detailed deep dive since JRM's famous syntax-rules primer for the merely eccentric. #lisp content :chart-with-upwards-trend:
the one good thing to come out of covid is a whole slew of great courses were shared online. just found Jeremy Siek's lectures from the famous IU #compiler course here:
"#Gauche tracks source code location information and shows it in the stack trace. However, what if the source is generated by macros? In 0.9.12, the macro expander re-attached the original source info to the outermost form of the macro output. However, if a runtime error occurred in constructed code other than the outermost one, stack trace couldn't find the info and had to show '[unknown location]'. It was annoying especially when the code was the result of nested macro expansions, that you didn't get a clue about where the error came from."
I've been learning about delimited continuations lately, because I recently learned that several programming language theory heavyweights, including Oleg Kiselyov, now believe the classic continuation control construct, e.g. "call/cc" in the #Scheme language, was a big mistake that makes program optimization needlessly difficult, though I don't fully understand why this would be the case. Unfortunately, "call/cc" is baked into the Scheme language standard, so compiler authors need to come up with workarounds -- who knew at the time of the first report (1975) that it would be a bad idea?
Delimited continuations solve the problems inherent in continuations (again, I don't understand why), and they are composable, meaning it is easy to write modular pieces of code that each use continuations in isolation and can still be made to fit nicely together with a few simple higher-order functions.
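For a small taste of that composability, here's my own toy example using `shift`/`reset` from Guile's `(ice-9 control)` module: the captured continuation only extends up to the enclosing `reset`, so it behaves like an ordinary function you can call as many times as you like.

```scheme
(use-modules (ice-9 control))  ; provides shift/reset in Guile

;; The continuation captured by `shift` is delimited by `reset`,
;; so k here is simply (lambda (v) (+ 1 v)) -- an ordinary function
;; we can invoke zero, one, or many times and collect the results.
(reset (+ 1 (shift k (list (k 10) (k 100)))))
;; => (11 101)
```

With undelimited `call/cc`, by contrast, invoking the captured continuation would abandon the rest of the program instead of just returning a value.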
I think they are pretty similar actually, with the major differences imo being that #scheme implementations typically reify stack frames as continuations, allowing you to step through program execution live without necessarily needing a macro stepper, while #CommonLisp offers the SLIME/Sly experience on top of countless battle-hardened tools & techniques developed over decades -- with, of course, the downsides of dynamic lexical environments and a lack of hygiene that can lead to particularly funky debugging situations.
Most Schemes have let-syntax, which I believe is like macrolet but with syntax objects, which is another distinguishing difference. Syntax objects are like records carrying an AST & source location information. It's worth noting that Robert Smith said that Common #Lisp's lack of a means to perform transformations over source locations is one of the biggest obstacles to improving #coalton's user experience: https://twitter.com/stylewarning/status/1574868014855380992
But overall, Scheme systems are typically very bare-bones: you're often expected to roll the ad-hoc debugging tools that I believe Common Lisp ships with. The attraction of this is that our systems are easy to decompose and mold into whatever you need, and tools you simply can't imagine elsewhere (because of the lack of first-class continuations) simply fall out the bottom once you get the hang of it. But I'll admit, moving from #racket to pure Scheme was at times daunting and very challenging, whereas I could pretty much pick up Racket and roll with it.
Today I received the first payment for my Free and Open Source Software work. It's of course not a sustainable business yet, but it's a good step towards one.
#guix tip of the day:
upstream server issues got you retriggering a command until the download succeeds? fire up a #guile repl and have it automate the process for you with just a few lines of #scheme.
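Something like this, for instance -- a minimal sketch where the command string and the 5-second pause are just placeholders for whatever you're fighting with:

```scheme
;; Minimal sketch: keep re-running a shell command until it exits 0.
;; `system` returns the raw wait status; `status:exit-val` extracts
;; the exit code from it.
(define (retry-until-ok cmd)
  (let loop ()
    (unless (zero? (status:exit-val (system cmd)))
      (sleep 5)  ; pause before hammering the mirror again
      (loop))))

(retry-until-ok "guix build hello")  ; example command
```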