
terrorjack

@terrorjack@functional.cafe


terrorjack, to haskell

ghc wasm backend jsffi has finally been implemented! what remains is source notes as well as user-facing documentation. not posting a discourse thread yet, but for the curious eyes, here's what's supported so far (and this also doubles as an ama thread):

  1. calling sync js function

foreign import javascript unsafe "console.log($1)" js_log :: JSVal -> IO ()

the js src text can be any valid js expression or statements, using $1 etc. to refer to the haskell arguments. JSVals are first-class haskell values and are garbage collected.
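a hedged sketch of a statement-style import (js_setText and the dom calls are made up for illustration; the exact module providing JSVal/JSString is assumed to be in scope):

-- $1 and $2 refer to the two haskell arguments; the src text here is js statements
foreign import javascript unsafe "let el = document.getElementById($1); el.textContent = $2"
  js_setText :: JSString -> JSString -> IO ()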

  2. calling async js function

foreign import javascript "fetch($1)" js_fetch :: JSString -> IO JSVal

await is supported in the js src text. calling the import initiates the side effect immediately and returns a thunk that only blocks on promise resolution when forced, allowing concurrency without even needing to fork haskell threads. promise rejections are captured as haskell exceptions.
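for instance, a hedged sketch (js_fetchJson is a made-up name; i'm assuming a promise chain in the src text works as described above):

-- the fetch starts immediately; the returned JSVal only blocks when forced
foreign import javascript "await fetch($1).then(r => r.json())"
  js_fetchJson :: JSString -> IO JSVal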

  3. js callback -> haskell function

foreign import javascript "dynamic" js2hs :: JSVal -> Arg1 -> Arg2 -> Result

pretty much the same thing as "dynamic" ccalls, except a JSVal represents the js callback instead of a FunPtr.
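a minimal sketch with a made-up name, sticking to JSVal for the argument and result as shown above:

-- the first JSVal is the js function value, the second is the argument it's applied to
foreign import javascript "dynamic"
  js_apply1 :: JSVal -> JSVal -> IO JSVal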

  4. hs function -> js callback

foreign import javascript "wrapper" hs2hs :: (Arg1 -> Arg2 -> Result) -> IO JSVal

yup, this converts any haskell function closure to a js callback that you can pass to 3rd-party frameworks. it's garbage collected on the js side as well: the haskell closure will be dropped once the callback becomes unreachable in js.
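a small sketch of the round trip (names invented; assuming a unary IO handler is an acceptable wrapped type):

-- turn a haskell handler into a js function value
foreign import javascript "wrapper"
  js_mkHandler :: (JSVal -> IO ()) -> IO JSVal

-- then hand it to some js api
foreign import javascript unsafe "$1.addEventListener('click', $2)"
  js_onClick :: JSVal -> JSVal -> IO ()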

  5. foreign exports

foreign export javascript "js_func_name" hs_func_name :: Arg1 -> Arg2 -> Result

this ends up as a wasm export named "js_func_name", directly callable from js; it returns a promise of the final result that can be awaited.
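a tiny sketch (hsAdd/hs_add are made up, and i'm assuming plain lifted types like Int marshal in exports):

hsAdd :: Int -> Int -> IO Int
hsAdd x y = pure (x + y)

foreign export javascript "hs_add"
  hsAdd :: Int -> Int -> IO Int

on the js side that's roughly const r = await instance.exports.hs_add(1, 2), though how you get hold of the instance depends on how the module is loaded.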

the hardest part of all the above work is concurrency & re-entrancy: calling async js should be cheap, should not block the runtime, and should play well with the existing threading model in haskell. and hs can call into js that calls back into hs, indefinitely. i'll write up a more detailed explanation of how this is delivered.

terrorjack, to random

for the past few days i've been adding asan support to the rts. motivation: the rts is a c monolith that does complex memory management; segfaults are very rare, but they do occur, as people sometimes reach for help in the issue tracker and matrix channel. so i'm really hoping the rts development workflow can be backed by sanitizers and fuzzers to make this monolith more rock solid than it currently is.

terrorjack, to random

after landing jsffi for the ghc wasm backend, the next thing i'll work on is template haskell support, and after that, the threaded rts. yes, it'll be possible to run a haskell app in your browser that eats all your cpu cores (for a better purpose than crypto stuff, i personally hope)

terrorjack, to random

i can fuzz haskell programs with afl++ now, let's see how well it works

terrorjack, to random

hmm, someone worked on a parallel quickcheck (paper accepted for ifl): https://github.com/Rewbert/quickcheck

terrorjack, to random

added a few popl'24 papers to my rarely shrinking read later list. if i have to pick a single one, that term generator thing looks really useful

terrorjack, to random

this might be the first haskell program in the world that natively runs on arm64 windows

terrorjack, to random

somebody named "pls fix " is a gold sponsor of godot engine

terrorjack, to random

why am i staring at greek letters in the ghc coercion paper. i'm just supposed to generate a few c functions that don't crash at runtime

terrorjack, to random

comfy night, comfy drink, comfy panic! (the 'impossible' happened)

terrorjack, to random

join the petition to tax higher order functions from programming! don't let devs get away without paying maintenance tax when they casually add a continuation parameter and pass a function defined 100k lines away in the codebase

terrorjack, to random

evil little trick learned from fosdem: to debug systemd, you can use a shell script as a stub init that spawns a gdbserver to debug itself, then execs into systemd

terrorjack, to haskell

ok. sparks are indeed a nice way to get work-stealing nested parallelism for free in haskell, as long as you work with spark# directly and don't use par, pseq or anything built upon these combinators

terrorjack,

@leftpaddotpy spark# is State#-passing, so you can explicitly spawn a spark as a monadic operation in io or st, which feels natural. par's purity, on the other hand, is an undesired burden, because you now have evaluation order to worry about and litter your code with pseq. the entire "Strategy" thing and the "Eval" monad in the parallel package are a huge distraction in the same sense
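a rough sketch of what "using spark# directly" can look like (my own illustration, not code from this thread):

import GHC.Exts (spark#)
import GHC.IO (IO (..))

-- spawn a spark for a thunk as a plain IO action; the runtime's work-stealing
-- scheduler may evaluate it on another capability
sparkIO :: a -> IO a
sparkIO a = IO (spark# a)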

terrorjack, to haskell

one aspect that still sucks is build parallelism:

  1. vanilla cabal builds are coarse-grained and have component-level build dependencies

  2. cabal/ghc supports multiple home units now, but only for the repl for the time being

  3. cabal/ghc has semaphores now, so multiple ghc --make -jsem processes can share cpu cores without oversubscribing. which is nicer, but not nice enough

  4. the semaphore format is home-brew rather than something more standard like the make jobserver, so it's hard to fit into external build systems

  5. external build systems resort to oneshot mode instead of make mode, so one ghc invocation produces one .hi/.o pair, and a fair amount of cpu cycles is wasted compared to make mode due to repeatedly rebuilding context that could have been shared

  6. more importantly, once an upstream module's .hi is emitted, the downstream module should be queued for compilation immediately, before that ghc -c even exits. but this is tricky to implement and often omitted

  7. ironically, the cpu cycles wasted in ghc oneshot mode are often compensated by increased parallelism, because external build systems parse cabal metadata but break through the cabal component-level dependency wall

  8. but then there's the cabal custom setup: for those packages you have to resort to actually respecting Setup.hs, and they can easily become bottlenecks of a build

  9. the people equipped with the knowledge to fix the situation thoroughly have tons of more important issues on their plate

terrorjack, to random

jira is finally happening yay i will agile and 10x

terrorjack, to random

oh yes, one case where i find laziness rocks in my current wip branch: calling an async import invokes the side effect (e.g. fetch()) immediately, but it returns a thunk and doesn't block the current thread unless forced. one does not even need to fork 10k haskell threads to do 10k fetch() calls concurrently; just a plain mapM and you get concurrency for free :)
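a rough illustration of that pattern (using a js_fetch import like the one earlier in this profile; evaluate is just one way to force the results later):

import Control.Exception (evaluate)

fetchAll :: [JSString] -> IO [JSVal]
fetchAll urls = do
  rs <- mapM js_fetch urls  -- every fetch() is issued here, nothing blocks yet
  mapM evaluate rs          -- forcing blocks on each promise as needed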

haskell, to haskell
terrorjack,

@haskell https://gitlab.haskell.org/ghc/ghc/-/issues/24603 for the discussion thread and instructions for reproducing the result

terrorjack, to random

i guess formally breaking up with my only real friend irl on a sleep-deprived day might not be the best way to celebrate a birthday

terrorjack,

@sanityinc thank you :)

terrorjack,

@Profpatsch i'm afraid it's a "couldn't care less" on his side. c'est la vie

terrorjack,

@GZGavinZhao thanks, though i'm already a greasy middle-aged uncle,,,

terrorjack,

@sinkerine thanks

wingo, to random
terrorjack,

@wingo another example of wasm stack switching probably worth mentioning is wasmtime, which supports async host rust functions as wasm imports and works similarly to js promise integration in v8
