@sgraf@mastodon.online avatar

sgraf

@sgraf@mastodon.online

Talk to me about Haskell, GHC, static analysis, PL design and theory, general CS, or my trumpet


sgraf, to random German
@sgraf@mastodon.online avatar

I'm currently evaluating which hygiene system would be best to implement in Template Haskell, because the current one is not even hygienic: https://gitlab.haskell.org/ghc/ghc/-/issues/24713

I must say, I found the literature a bit confusing, for lack of succinct examples distinguishing new systems from prior and contemporaneous work. Good thing there is https://dl.acm.org/doi/10.1145/3386330, which at least helps me trace the decisions that led to the different systems and provides a few examples as well.
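To illustrate what's at stake (my own minimal example, not one from the linked issue): hygiene means a binder inside a quote gets a fresh unique, so it can neither capture nor be captured by an identically-spelled name at the splice site.

```haskell
{-# LANGUAGE TemplateHaskell #-}
import Language.Haskell.TH

-- The x bound in this quote is renamed to a fresh name by the quoter,
-- so splicing addOne next to another x cannot cause accidental capture.
addOne :: Q Exp
addOne = [| \x -> x + 1 |]

main :: IO ()
main = do
  e <- runQ addOne
  print e  -- the lambda's binder shows up freshly renamed, e.g. x_0
```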

kha, to random
@kha@functional.cafe avatar

Omg. In Munich, even the dentist is posh as hell

sgraf,
@sgraf@mastodon.online avatar

@kha yaay

joey, to random
@joey@mathstodon.xyz avatar

Anyone have recommendations for a good ergonomic keyboard? My old Microsoft 4000 is wearing out

sgraf,
@sgraf@mastodon.online avatar

@joey The Logitech Ergo K860 is quite decent. (Although I had one giving up on me after one year of use.)

brokenix, to haskell

quoting @prophet
mli files are mostly used to constrain the visibility of definitions whereas hs-boot files are about allowing mutual recursion between modules (which OCaml doesn't support, even with mli files!)
But the mechanism by which they achieve their goals is nearly identical even though the perception of it is so vastly different.

I guess the conclusion to draw from this is that both sides are wrong: IMO, mli files are not nearly as good as OCamlers think they are, but hs-boot files aren't as ugly as Haskellers think either.
-- prettySrcLoc and prettyCallStack are defined here to avoid hs-boot
-- files. See Note [Definition of CallStack]

Backpack's design is primarily driven by compatibility considerations (“how do we build upon GHC's existing foundation?”), rather than elegance. In particular, Backpack doesn't eliminate those ugly .hs-boot files, it just automates and hides their generation and processing.

For all their faults, Standard ML and OCaml have pretty good support for modular programming. And, as the Modular Type Classes paper you linked shows, type classes can be built elegantly on top of a good modular foundation.
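To make the analogy concrete, here is a minimal sketch (module names and definitions mine) of how an .hs-boot file breaks a module cycle, playing much the same interface role an .mli plays in OCaml:

```haskell
-- A.hs-boot: a hand-written "interface" for A, compiled before B.
module A where
data T          -- abstract here, like an opaque type in an .mli
f :: T -> Int

-- B.hs: imports the boot interface instead of A itself.
module B where
import {-# SOURCE #-} A (T, f)
g :: T -> Int
g t = f t + 1

-- A.hs: the real module; it may import B, closing the cycle.
module A where
import B (g)
data T = MkT Int
f :: T -> Int
f (MkT n) = n
```

The difference in perception the post describes: in OCaml every module may have such an interface and it is considered good style, while in GHC the boot file exists only to serve the recursion and duplicates declarations, so it reads as a wart.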

https://cohost.org/prophet/post/3251638-it-s-really-interest
https://haskell.fi.muni.cz/doc/base/src/GHC-Exception.html
https://twitter.com/lexi_lambda/status/1172629363730333697
https://news.ycombinator.com/item?id=11371130

sgraf, to random German
@sgraf@mastodon.online avatar

This paper (or rather, the research field it provides an overview of) could have saved me a lot of head-scratching if I had known about it a year ago: https://www.cambridge.org/core/journals/journal-of-functional-programming/article/impact-of-higherorder-state-and-control-effects-on-local-relational-reasoning/DA921680527426F49ED79644C8C6A565

What is your favorite case of "Related Work that I wish I had known about before reinventing the wheel"?

zanzi, to random
@zanzi@mathstodon.xyz avatar

I kinda feel like there should be a link between monadic parsers and monad comprehensions, since both are based on similar structures (MonadPlus, Alternative). Has anyone written about this?

sgraf,
@sgraf@mastodon.online avatar

@zanzi @jer_gib I think the question is: What sort of comprehension syntax should desugar to MonadPlus operations? That is, what kind of list comprehension syntax do you have in mind that desugars to (++)?
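One concrete link does exist in GHC today: with -XMonadComprehensions, comprehension guards desugar to Control.Monad.guard, i.e. to mzero from MonadPlus. A small sketch (examples mine):

```haskell
{-# LANGUAGE MonadComprehensions #-}
import Control.Monad (guard)

-- A list comprehension and its MonadPlus desugaring agree:
compr :: [Int]
compr = [ x * 2 | x <- [1..5], even x ]

desugared :: [Int]
desugared = [1..5] >>= \x -> guard (even x) >> return (x * 2)

-- The same syntax works at any MonadPlus, e.g. Maybe,
-- where a failing guard produces mzero = Nothing:
comprM :: Maybe Int
comprM = [ x + 1 | x <- Just 4, even x ]

main :: IO ()
main = print (compr, desugared, comprM)  -- ([4,8],[4,8],Just 5)
```

Note that (++) itself only shows up via mplus, which plain comprehension syntax never emits; that is the gap the question about desugaring to (++) points at.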

brokenix, to haskell

The pseq combinator is used for sequencing; informally, it evaluates its first argument to weak-head normal form, and then evaluates its second argument, returning the value of its second argument. Consider this definition of parMap:

    parMap f [] = []
    parMap f (x:xs) = y `par` (ys `pseq` y:ys)
      where y  = f x
            ys = parMap f xs

The intention here is to spark the evaluation of f x, and then evaluate parMap f xs, before returning the new list y:ys. The programmer is hoping to express an ordering of the evaluation: first spark y, then evaluate ys.

The obvious question is this: why not use Haskell's built-in seq operator instead of pseq? The only guarantee made by seq is that it is strict in both arguments; that is, seq a ⊥ = ⊥ and seq ⊥ a = ⊥. But this semantic property makes no operational guarantee about order of evaluation. An implementation could impose this operational guarantee on seq, but that turns out to limit the optimisations that can be applied when the programmer only cares about the semantic behaviour. Instead, we provide both pseq and seq (with and without an order-of-evaluation guarantee), to allow the programmer to say what she wants while leaving the compiler with as much scope for optimisation as possible.
https://simonmar.github.io/bib/papers/multicore-ghc.pdf
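The quoted parMap runs as-is with par and pseq from GHC.Conc in base (Control.Parallel from the parallel package re-exports equivalents); a self-contained sketch, with the mapped function and input list chosen by me:

```haskell
import GHC.Conc (par, pseq)

parMap :: (a -> b) -> [a] -> [b]
parMap f []     = []
parMap f (x:xs) = y `par` (ys `pseq` y:ys)
  where y  = f x
        ys = parMap f xs

main :: IO ()
main =
  -- Without -threaded the sparks are never run in parallel,
  -- but the result is the same either way.
  print (sum (parMap (*2) [1..100 :: Int]))  -- 10100
```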

sgraf,
@sgraf@mastodon.online avatar

@brokenix ironically, the implementation of pseq is broken today https://gitlab.haskell.org/ghc/ghc/-/issues/23233 and probably has been broken since inception

anuytstt, to random
sgraf,
@sgraf@mastodon.online avatar

@anuytstt ... Great minds think alike? 🙂

jesper, to random
@jesper@agda.club avatar

Is there anything known about the efficiency implications of using HOAS for representing the (internal) syntax of a lambda calculus instead of de Bruijn syntax? Assuming one can get away with it, it seems like it could theoretically speed things up compared to looking up things in lists. OTOH functions can do arbitrary things, so I imagine much would depend on how you actually construct these.

Maybe @AndrasKovacs has an idea?

sgraf,
@sgraf@mastodon.online avatar

@wouter @jesper I think HOAS is the "staged" form of de Bruijn indices, so that matches what I expected. It's also why NbE is faster than entirely symbolic interpretation (and thus useful).

sgraf,
@sgraf@mastodon.online avatar

@jesper @wouter what I don't quite understand is why Stephanie Weirich benchmarked a whole lot of other implementations that would probably all be slower than HOAS. I mean, HOAS beat the competition by a factor of at least 10 according to Lennart! Why bother benchmarking locally-nameless, bound etc. when those approaches will reasonably be slower than normalisation-by-evaluation anyway??

sgraf, (edited )
@sgraf@mastodon.online avatar

@jesper @wouter There are probably other reasons why HOAS often isn't an option, but when substitution/normalisation performance is important, you can always convert to HOAS just for NbE and then back. No need to benchmark other approaches that are bound to be slower
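That round trip is exactly NbE. A minimal sketch (representation and names my own, not from the thread): convert de Bruijn terms into a HOAS-style semantic domain, evaluate there, then read back into syntax.

```haskell
-- Syntax: untyped lambda terms with de Bruijn indices.
data Tm = Var Int | Lam Tm | App Tm Tm
  deriving (Eq, Show)

-- Semantic domain: HOAS closures, plus neutral terms
-- (a free variable, as a de Bruijn *level*, applied to a spine).
data Val = VLam (Val -> Val) | VNeu Int [Val]

vapp :: Val -> Val -> Val
vapp (VLam f)    a = f a
vapp (VNeu l sp) a = VNeu l (sp ++ [a])

eval :: [Val] -> Tm -> Val
eval env (Var i)   = env !! i
eval env (Lam b)   = VLam (\v -> eval (v : env) b)
eval env (App f a) = vapp (eval env f) (eval env a)

-- Read back a value into a term, converting levels to indices.
quote :: Int -> Val -> Tm
quote d (VLam f)    = Lam (quote (d + 1) (f (VNeu d [])))
quote d (VNeu l sp) = foldl App (Var (d - l - 1)) (map (quote d) sp)

normalize :: Tm -> Tm
normalize = quote 0 . eval []

main :: IO ()
main = print (normalize (App (Lam (Var 0)) (Lam (Var 0))))  -- Lam (Var 0)
```

Substitution never walks a term here; it is a single environment extension under VLam, which is where the speedup over symbolic normalisers comes from.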

sgraf,
@sgraf@mastodon.online avatar

@kosmikus @sgraf @jesper @wouter Then I'd rather want to see benchmarks for a concrete use case (simplification? type-checking?) than for a use case (normalisation) that apparently is better done using NbE

underlap, to haskell
@underlap@fosstodon.org avatar

There are plenty of descriptions of Haskell's do notation as syntactic sugar for stringing together sequences of operators such as >>=.

There are also "laws" that define the semantics of the do notation.

But I haven't found any decent description of how to use do notation as if it was a first class language construct. That would be especially useful for beginners.

Anyone know of such a description?

Boosts appreciated!

sgraf,
@sgraf@mastodon.online avatar

@underlap What do you mean by "first-class language construct"?

Do you mean that you want to give semantics to do-notation independent of the underlying Monad desugaring?

Provided you are OK with treating return as ... a keyword, you can certainly describe the Monad laws in terms of do-notation and return; doing so is actually pretty helpful. Tom Ellis posted about it on Reddit: https://old.reddit.com/r/haskell/comments/1ag88q4/monad_laws_in_terms_of_do_notation/.

sgraf,
@sgraf@mastodon.online avatar

@underlap

I suppose you can then go on and describe Monad instances in terms of what their do-notation does. Here is what would be called the "equational theory" on do-notation at Maybe:

  • (do return x) == Just x
  • (do x <- Nothing; k x) == Nothing
  • (do x <- Just y; k x) == k y

And so on, for [a], Except, State, etc.
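The three Maybe equations can be checked directly in GHC, with a concrete k and y of my own choosing:

```haskell
-- An arbitrary continuation and argument for the equations:
k :: Int -> Maybe Int
k n = Just (n + 1)

y :: Int
y = 4

eq1, eq2, eq3 :: Bool
eq1 = (do { return 5 })          == Just (5 :: Int)
eq2 = (do { x <- Nothing; k x }) == Nothing
eq3 = (do { x <- Just y; k x })  == k y

main :: IO ()
main = print (eq1 && eq2 && eq3)  -- True
```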

mangoiv, (edited ) to random
@mangoiv@functional.cafe avatar

random stuff:
I have long ago started writing Haskell types like: newtype A = MkA {unA :: B}, to avoid punning. So far, so good. However, someone told me that writing this: f (MkA b) = ... is somewhat odd because you don't MkA, you only do that the other way around. So I have started doing this very cursed thing everywhere I have a MkA-like constructor: f (unA -> b) = ...

sgraf,
@sgraf@mastodon.online avatar

@mangoiv f (Cons x xs) = ... does not "do Cons" either. I wouldn't bother and simply write MkA.
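For comparison, both spellings from the thread side by side (a minimal sketch; the field type and functions are mine):

```haskell
{-# LANGUAGE ViewPatterns #-}

newtype A = MkA { unA :: Int }

-- Plain constructor pattern: "un-applies" MkA on the left-hand side.
f1 :: A -> Int
f1 (MkA b) = b + 1

-- View-pattern alternative from the post: applies unA to the argument.
f2 :: A -> Int
f2 (unA -> b) = b + 1

main :: IO ()
main = print (f1 (MkA 41), f2 (MkA 41))  -- (42,42)
```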

sgraf, to random German
@sgraf@mastodon.online avatar

How do people (well, semanticists) say in short terms that some syntactic restriction is necessary to "work around the lack of full abstraction" in their semantic domain?

E.g., I have some lemma f(d) <= g(f,d) that I need to prove correct, and I can only do so if I assume that f = [[λx.e]] for some e. I say "this syntactic restriction is necessary due to a lack of full abstraction", which seems very indirect.
Is there a more succinct term to describe this?

https://www.pls-lab.org/en/Full_abstraction
