MicroHs binaries are ~100× smaller and ~5–10× slower for this workload; for many data-wrangling tasks that’s a great swap
Under which conditions is that a great swap? A 5x increase in processing time is absolutely huge, and even at moderate data volumes it could make a data processing pipeline completely non-viable (a one-hour batch job becomes a five-hour one).
This is what I'm thinking. There are still use cases, I'd say, where small binaries really matter. But then Haskell is really the wrong tool for the job, and I say this as a Haskell stan. I expect an optimized C binary is still much, much smaller.
Local environments, embedded applications, client-side processing via Wasm... It's a cool project! We can figure out what to do with it later.
Well, this is only "a great swap" in cases where the time taken is already so low that you won't notice a 10x.
But the tradeoff would actually pay off if compile time improved by a similar factor to the binary size.
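To put rough numbers on that, here's a minimal back-of-envelope sketch. The workload names and baseline times are invented for illustration (nothing in the thread gives real measurements), and the 10x factor is just the upper end of the quoted 5-10x range:

```haskell
import Text.Printf (printf)

-- Hypothetical GHC baseline runtimes, in seconds (illustrative, not measured).
workloads :: [(String, Double)]
workloads =
  [ ("short CLI data-wrangling run", 0.05) -- 50 ms
  , ("moderate batch pipeline",      3600) -- 1 hour
  ]

-- Upper end of the claimed 5-10x MicroHs slowdown.
slowdown :: Double
slowdown = 10

main :: IO ()
main = mapM_ report workloads
  where
    report :: (String, Double) -> IO ()
    report (name, t) =
      printf "%-30s %10.2fs -> %10.2fs at %.0fx\n" name t (t * slowdown) slowdown
```

At 50 ms a 10x slowdown is imperceptible, so trading it for a 100x smaller binary is easy; at an hour, it's the difference between a nightly job and one that misses its window.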