But then it has other weird features too, like they seem to be really emphasising "friendliness" (great!) but then it has weird syntax like `\` for anonymous functions (I dunno where that dumb syntax came from but Nix also uses it and it's pretty awful). Omitting brackets and commas for function calls is also a bad decision if you care about friendliness. I have yet to find a language where that doesn't make the code harder to read and understand.
Are the Roc people really doing everything they can to ruin everything they had going for them? Appealing to those who know nothing and won't be willing to touch anything FP at the cost of annoying those who actually want to try it is just so stupid
I say this as someone who enjoys reading Rust more than Haskell or Elm -- that looks like a really bad idea for aesthetic reasons anyway. I mean if you want the syntax to look like Zig or Rust, perhaps go all the way there instead of making a kind of a mutant hybrid like this. Syntax is superficial and the semantics actually matter, but that doesn't mean the syntax can be just anything.
Is there some deeper technical reason for making such changes?
I feel that he got a lot of pressure from the FP community and wrote a bunch of nonsense instead of being straightforward with them.
The only relevant reason he lists is point-free, but he doesn't go far enough. Point-free very often turns into write-only balls of unmaintainable nastiness. Wanting to discourage this behavior is a perfectly reasonable position. Unfortunately, this one true argument is given the most tepid treatment of all the reasons.
Everything else doesn't hold water.
As he knows way better than most, Elm has auto-curry and has been the inspiration for several other languages getting better error messages.
Any language with higher-order functions can give a function as a result, and if you haven't read the docs or checked the type, you won't expect it. He left higher-order functions in, so even he doesn't really believe this complaint.
The argument about currying and pipe isn't really true. The pipe is static syntax known to the compiler at compile time. You could just decide that the left argument is applied/curried to the function before the right argument.
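To make that concrete, here is a minimal sketch in Haskell over a toy expression type of my own (not Roc's actual AST or implementation), showing how `|>` can be desugared purely syntactically, with the piped value inserted as the first argument and no currying anywhere:

```haskell
-- Toy AST for illustration only; names are made up, not from any real compiler.
data Expr
  = Var String
  | Call Expr [Expr]   -- f(a, b, ...): all arguments at once, no currying
  | Pipe Expr Expr     -- lhs |> rhs: static syntax the compiler sees directly
  deriving (Show)

-- `lhs |> f(a, b)` becomes `f(lhs, a, b)`; a bare `lhs |> f` becomes `f(lhs)`.
desugarPipe :: Expr -> Expr
desugarPipe (Pipe lhs (Call f args)) =
  Call (desugarPipe f) (desugarPipe lhs : map desugarPipe args)
desugarPipe (Pipe lhs f) =
  Call (desugarPipe f) [desugarPipe lhs]
desugarPipe (Call f args) =
  Call (desugarPipe f) (map desugarPipe args)
desugarPipe e = e

main :: IO ()
main = print (desugarPipe (Pipe (Var "xs") (Call (Var "map") [Var "inc"])))
-- prints: Call (Var "map") [Var "xs",Var "inc"]
```

Whether the piped value lands in the first or the last argument slot is an independent design choice; the point is that none of it requires curried functions.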
I particularly hate the learning curve argument. Lots of great and necessary things are hard to learn. The only question is a value judgement about whether the learning is worth the payoff. I'd guess that most of the Roc users already learned about currying with a more popular FP language before ever looking at Roc, so I don't think this argument really applies here (though I wouldn't really care if he still believed it wasn't worth the learning payoff for the fraction of remaining users).
To reiterate, I agree with his conclusion to exclude currying, but I wish he were more straightforward with his one good answer that would tick off a lot of FP users rather than resorting to a ton of strawman arguments.
No trolling/nitpicking from me: You wrote <<The only relevant reason he lists is point-free>>. What do you mean by "point-free"... or did you write "point three" and it was auto-corrected on a mobile phone?
I also tried Googling for that term (never heard before), and I found these:

If you really meant "point-free", can you tell me where in his post he mentions it? I would like to learn more.
We recently changed Roc's lambda syntax from the syntax that languages like Elm and Haskell use...
foo = \arg1, arg2 ->
    body
...to this:
foo = |arg1, arg2|
    body
The reason for this change was that we have a new and extremely well-received language feature (landed but not yet formally announced) which results in `->` and `=>` having different meanings in the type system. This made it confusing to have `->` in the syntax for anonymous functions, because it seemed to suggest a connection with the type-level `->` that wasn't actually there.
The most popular syntax that mainstream languages use today for anonymous functions is something like `(arg1, arg2) => body` but of course that has the same problem with having an arrow in it, so changing to that wouldn't have solved the problem.
Rust uses `|arg1, arg2| body` (and Ruby kinda uses it too for blocks), and we'd all had fine experiences using that syntax in Rust, so we chose it as the new lambda syntax. You can see the new syntax in the code example at the top of roc-lang.org.
Yeah Rust compiles slowly, so we need two more half-baked languages - Zig and Roc, both of which I couldn't care less about.
Rust's slow compilation comes from lots of features and excellent generated machine code quality. Both Zig and Roc will be equally slow or slower if they match what Rust offers.
If all they want is fast compilation, they can just try Pascal.
this didn't faze me in the least because it's just a more easily typed λ, the lambda character, which has for a long time now (many decades) been used to describe anonymous functions (i.e. "lambdas")
did you never take any formal CS education? if not, that might explain it
so before you jump to calling it "dumb", maybe next time lean on Chesterton's Fence for a bit.

https://sproutsschools.com/chesterton-fence-dont-destroy-wha...

That said, granted, the fact that it's also the escape character is problematic. Maybe /\ might have been better but that's even harder to type.
It's nice to see Zig continuing to gain support. I have no idea why I've ended up siding with Zig in the low-level language wars.
I used to root for the D programming language but it seems to have got stuck and never gained a good ecosystem. I've disliked Rust from the first time I saw it and have never warmed up to its particular trade-offs. C feels unergonomic these days and C++ is overfull with complexity. Zig feels like a nice pragmatic middle ground.
I actually think Rust is probably perfectly suited to a number of tasks, but I feel I would default to choosing Zig unless I was certain beyond doubt that I needed specific Rust safety features.
I wanted to like Rust (memory safety and performance - what's not to like?) but both of my experiments with it ended in frustration. It seemed a way too complex language for what I needed.
Recently, a coworker of mine made a great observation that made everything clear to me. I was looking for a replacement for C, but Rust is actually a replacement for C++. Totally different beast - powerful but complex. I need to see if Zig is any closer to C in spirit.
> I was looking for a replacement for C, but Rust is actually a replacement for C++. Totally different beast - powerful but complex.
I've seen this sentiment a lot, and I have to say it's always puzzled me. The difference between Rust and basically any other popular language is that the former has memory safety without GC†. The difference between C++ and C is that the former is a large multi-paradigm language, while the latter is a minimalist language. These are completely different axes.
There is no corresponding popular replacement for C that's more minimalist than Rust and memory safe.
† Reference counting is a form of garbage collection.
This is actually accurate. While Rust is a great replacement for C, so is C++. But Rust has tons of extra features that C doesn't have, just like C++. Moving from C++ to Rust is a sideways move. Moving from C to Rust is definitely an increase in complexity and abstraction.
Rust is a great replacement for C++ as it fits into the same place in the stack of tools.
Somewhat related, even when I work with C++ I use it as "C with RAII". What I actually want is a Scheme (R5RS) with manual memory management and a borrow checker. I don't know how well such a monstrosity would actually work in practice, but it's what I've convinced myself that I want.
Last I checked, Rust touted itself as a systems programming language, which C++ kinda is but mostly isn't (many C++ features just aren't appropriate or are poorly suited for systems programming).
I would never choose Rust over C++ for scientific programming, because Rust metaprogramming features are so poor.
However I'd probably choose Rust over C in many areas where C is best suited.
So to me the Venn diagram of utility shows Rust to overlap with C far more than with C++.
I've played with Hare, Zig, and Odin. Odin is my favorite. It's a fair bit faster to compile (similar to Hare), and has the nicest syntax (subjectively speaking). I wish it would get more traction. Looking forward to trying Jai if it ever makes it to GA.
Interestingly enough, it is one of the only "talked about" languages I have almost no experience with. Even Roc I've watched a few YouTube videos on. I've only really seen Odin mentioned on X, not even a HN post.
I suppose there is also Jai in a similar space as well, although I'm not a devotee to Jonathan Blow and I don't share much of the excitement his followers seem to have.
I do feel Zig has the current trend moving in its favor, with projects like Ghostty and Bun gaining prominence. I think Odin would need something like that to really capture attention.
It's questionable how attractive Zig really is, outside of people getting caught in the wave generated by its foundation, where there is a clear financial self-interest. A Zig language review indicates there are many questions as to its usefulness, and why it hasn't hit v1 yet, over just using C and the many available libraries of that ecosystem. Famous C programmers like Casey Muratori (of Handmade Hero) do not like Zig nor recommend it [1].
Rust is primarily aimed at more specific use cases, revolving around memory safety and low(er) level programming. So where people might dislike the confusing syntax and difficulty of learning either Zig or Rust (among other inconveniences), it's harder to make arguments against Rust's usefulness for safety, maturity (v1 plus), or present job market popularity. Zig does not have the luxury of those bonus points.
When it comes to general-purpose programming, including for hobbyists or students, there are many other alternative languages that are arguably much more attractive, easier to use, and/or easier to learn. Golang, Vlang, Odin, Jai, C3, etc...

[1]: https://www.youtube.com/watch?v=uVVhwALd0o4 ("Language Perf and Picking A Lang Stream" from 29:50)
I'm happy for Zig to gain support for one major reason: Zig does not engage in "woke" culture the way Rust does.
Zig seems to take a neutral, technical-first stance, while Rust has a strong focus on inclusivity, strict moderation, and social policies like with the foundation drama (which is just nonsense for a programming language).
I really wish Nim would have won the language wars though.
Roc couldn't be optimized for writing the Roc compiler without sacrificing some of its own goals. For example, Roc is completely memory-safe, but the compiler needs to do memory-unsafe things. Introducing memory-unsafety into Roc would just make it worse. Roc has excellent performance, but it will never be as fast as a systems language that allows you to do manual memory management. This is by design and is what you want for the vast majority of applications.
There are a number of new imperative features that have been (or will be) added to the language that capture a lot of the convenience of imperative languages without losing functional guarantees. Richard gave a talk about it here: https://youtu.be/42TUAKhzlRI?feature=shared.
It still feels kinda weird. Parsers, compilers etc are traditionally considered one of the "natural" applications for functional programming languages.
FP is bad for computing the Fibonacci series unless you have the compiler optimization to turn it into a loop (as seen in imperative languages).
To be fair, most practical FP languages have that, but I never saw the appeal for a strictly functional general purpose language. The situations where I wished one could not use imperative constructs are very domain specific.
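For what it's worth, the "loop" in question is just tail recursion with strict accumulators; a rough Haskell sketch (my own toy example, not from the thread):

```haskell
{-# LANGUAGE BangPatterns #-}

-- Naive version: follows the math, but the tree recursion is exponential and
-- there is no loop structure for the compiler to recover.
fibNaive :: Integer -> Integer
fibNaive n
  | n < 2     = n
  | otherwise = fibNaive (n - 1) + fibNaive (n - 2)

-- Accumulator version: a tail call with strict arguments, which GHC compiles
-- down to the same two-variable loop you would write imperatively.
fibIter :: Integer -> Integer
fibIter n = go n 0 1
  where
    go 0 !a _  = a
    go k !a !b = go (k - 1) b (a + b)

main :: IO ()
main = print (fibIter 10)  -- 55
```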
That's an interesting point, and something I thought of when reading the parser combinator vs. recursive descent point
Around 2014, I did some experiments with OCaml, and liked it very much
Then I went to do lexing and parsing in OCaml, and my experience was that Python/C++ are actually better for that.
Lexing and parsing are inherently stateful; it's natural to express those algorithms imperatively. I never found parser combinators compelling, and I don't think there are many big / "real" language implementations that use them, if any. They are probably OK for small languages and DSLs
I use regular expressions as much as possible, so it's more declarative/functional. But you still need imperative logic around them IME [1], even in the lexer, and also in the parser.
---
So yeah I think that functional languages ARE good for writing or at least prototyping compilers -- there are lots of examples I've seen, and sometimes I'm jealous of the expressiveness
But as far as writing lexers and parsers, they don't seem like an improvement, and are probably a little worse

[1] e.g. lexer modes - https://www.oilshell.org/blog/2017/12/17.html
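As a toy illustration of the "lexer modes" idea just linked, here is a sketch in Haskell where the mode is simply an extra argument threaded through the recursion - the same state you would keep in a variable in an imperative lexer. The token set and rules are invented for the example:

```haskell
data Mode  = Normal | InString          deriving (Show)
data Token = Word String | Str String   deriving (Show)

-- The Mode argument plays the role of the mutable "current mode" variable.
lexToks :: Mode -> String -> [Token]
lexToks _        []         = []
lexToks Normal   ('"' : cs) = lexToks InString cs          -- opening quote: switch mode
lexToks Normal   (' ' : cs) = lexToks Normal cs            -- skip whitespace
lexToks Normal   cs         = let (w, rest) = span (`notElem` " \"") cs
                              in  Word w : lexToks Normal rest
lexToks InString cs         = let (s, rest) = break (== '"') cs
                              in  Str s : lexToks Normal (drop 1 rest)  -- closing quote: back to Normal

main :: IO ()
main = print (lexToks Normal "echo \"hello world\" again")
-- [Word "echo",Str "hello world",Word "again"]
```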
> Zig came up with a way around this: compile directly to LLVM bitcode (LLVM has strong backwards-compatibility on its bitcode but not on its public-facing API) and then upgrades become trivial because we can keep our existing code generation the same.
i'm glad that zig helps offset what i find a rather worrying trend -- replacing protocols with libraries (and as a consequence designing new underlying protocols in view of their only being used through the vendor library).
protocols are traditionally designed to facilitate an independent implementation; in fact, many standards ratification processes require several independent implementations as a prerequisite.
libraries (or worse, frameworks) intermingle the actual api with their own design, and necessitate a single implementation.
just the other day i wanted to display a notification on a linux desktop where it wasn't convenient to depend on a library (it was a game, so not a traditional application expected to have many dependencies). the protocol (there is one, wrapped by the library) is very unpleasant, but i got it working out of spite.
and of course, when there is a perfectly nice protocol available (llvm ir, in either the bitcode or text representation) why not choose it? at least on unix, where starting processes and interprocess communication is cheap. (and as an added bonus, you won't crash when llvm does.)
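As a sketch of what that decoupling can look like: emit the IR as data (textual IR below for readability; bitcode is the representation with the stronger stability guarantee) and hand it to an external tool, instead of linking the LLVM C++ API into your compiler. The IR here is hand-written purely for illustration:

```haskell
-- Illustrative sketch only: produce LLVM IR as plain text for a function that
-- adds two 64-bit integers, write it to a file, and let an external tool
-- (e.g. `clang add.ll -c -o add.o`) consume it in a separate process.
addModule :: String
addModule = unlines
  [ "define i64 @add(i64 %a, i64 %b) {"
  , "entry:"
  , "  %sum = add i64 %a, %b"
  , "  ret i64 %sum"
  , "}"
  ]

main :: IO ()
main = writeFile "add.ll" addModule
-- The compiler never links LLVM itself, so an LLVM upgrade (or crash) stays
-- outside our process, as the comment above suggests.
```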
Go was built while waiting for C++ to compile - fast compilation was an implicit design goal.
Rust on the other hand didn’t prioritize compile times and ended up making design decisions that make faster compilation difficult to achieve. To me it’s the biggest pain point with Rust for a large code base and that seems to be the sentiment here as well.
The Rust ecosystem has long downplayed the importance of compile times.
Many foundational crates, serde for example, contribute much more to compile times than they need to.
I spent a long time reinventing many foundational Rust crates for my game engine, and I proved it's possible to attain similar features in a fraction of the compile time, but it's a losing battle to forgo most of the ecosystem.
Thank you for your work! Rust is still a great language.
I think a significant portion of our pain with rust compile times is self-inflicted due to the natural growth of our crate organization and stages.
I still think the rewrite in zig is the right choice for us for various reasons, but I think at least a chunk of our compile times issues are self-inflicted (though this happens to any software project that grows organically and is 300k LOC)
But were any _language_ decisions discarded due to compile time concerns? I don't think anyone would claim the folks working on the rust compiler don't care.
On that note, thank you for your part! I sure enjoy your work! :)
From listening to Feldman's podcast, this doesn't really come as a surprise to me. The rigor that Rust demands seems not to jibe with his 'worse is better' approach. That coupled with the fact they already switched the stdlib from Rust to Zig. The real question I have is why he chose Rust in the first place.
Zig was not ready or nearly as popular back in 2019 when the compiler was started.
Not to mention, Richard has a background mostly doing higher level programming. So jumping all the way to something like C or Zig would have been a very big step.
Sometimes you need a stepping stone to learn and figure out what you really want.
> The real question I have is why he chose Rust in the first place.
If you read the linked post carefully you will know.
> Compile times aside, the strengths and weaknesses of Rust and Zig today are much different than they were when I wrote the first line of code in Roc's Rust compiler in 2019. Back then, Rust was relatively mature and Zig was far from where it is today.
This is more about simplicity, maintainability, and the possibility for new contributors to easily jump in and fix things. Our current parser is not fun for new contributors to learn. Also, I think parser combinators obfuscate a lot of the tracking required for robust error messages.
Personally, I find parser combinators nice for small things but painful for large and robust things.
When I used parser combinators in Rust years ago the compile times were really long. I also think it's a strange reason to move away from Rust as a language.
I recommend reading Roc's FAQ too - it's got some really great points. E.g. I'm internally screaming YESSS! to this: https://www.roc-lang.org/faq.html#curried-functions
> it has weird syntax like `\` for anonymous functions (I dunno where that dumb syntax came from but Nix also uses it and it's pretty awful)
It’s a syntax that’s several decades old at this point.
It’s different, but not harder. If you’d learned ML first, you’d have found Algol/C-like syntax equally strange.
That's not ML syntax. Haskell got it from Miranda, I guess?
In SML you use the `fn` keyword to create an anonymous function; in OCaml, it's `fun` instead.
https://github.com/roc-lang/roc/releases/tag/0.0.0-alpha2-ro...
A one-character ASCII rendering of the Greek lowercase letter lambda: λ
λx → x + 5
\x -> x + 5
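For a small runnable Haskell example of that reading:

```haskell
-- `\` is read as λ, so `\x -> x + 5` is the anonymous function λx. x + 5.
main :: IO ()
main = print (map (\x -> x + 5) [1, 2, 3])  -- [6,7,8]
```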
If anything I don't think Haskell goes far enough with the automatic currying, point-free stuff. If you're going to be declarative, don't half-ass it.
> Rust is a great replacement for C++ as it fits into the same place in the stack of tools.
Go is not a C or Python replacement.
Zig is a good replacement for C.
I would disagree. C++ provides way more features than Rust and to me Rust feels way more constrained comparatively.
Odin has the best approach for "standard library" by blessing/vendoring immensely useful libraries
Odin also has the best approach for Vector Math with native Vector and Matrix types
And now they are doubling down on that by moving from "OCaml meets C++" to "C, the good parts"!
If FP isn't good for writing a compiler, what is it good for?
sorry, but why?
Summing the Fibonacci sequence I guess.
[1] https://www.roc-lang.org/faq#self-hosted-compiler
I assume that statement will need updating.
https://ocaml.github.io/ocamlunix/ocamlunix.html
You get to decide what part of your code should be imperative, and which should be functional.
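Haskell's ST is a small, concrete example of that: mutation on the inside, a pure function on the outside (toy code, my own example):

```haskell
import Control.Monad.ST (runST)
import Data.STRef (modifySTRef', newSTRef, readSTRef)

-- Imperative on the inside (a mutable accumulator), pure from the outside:
-- callers only see a plain Int; no visible side effects escape.
sumST :: [Int] -> Int
sumST xs = runST $ do
  acc <- newSTRef 0
  mapM_ (\x -> modifySTRef' acc (+ x)) xs
  readSTRef acc

main :: IO ()
main = print (sumST [1 .. 10])  -- 55
```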
> Rust on the other hand didn’t prioritize compile times and ended up making design decisions that make faster compilation difficult to achieve.
(Source: I wrote much of the Rust compiler.)
Why aren't people designing modern languages to make it easier to keep a stable ABI, rather than giving up entirely?
Almost all parser combinators are recursive descent with backtracking, they just add higher-order plumbing.
I have a feeling that whatever issue they've encountered with the combinator approach could have been worked around by handwriting only a few pieces.
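A minimal sketch of that claim - a toy combinator core (not any real library's API) where the combinators are visibly just recursive descent plus backtracking:

```haskell
-- Toy parser-combinator core, for illustration only.
newtype Parser a = Parser { runParser :: String -> Maybe (a, String) }

-- Match one specific character: a leaf of the recursive descent.
char :: Char -> Parser Char
char c = Parser $ \s -> case s of
  (c' : rest) | c' == c -> Just (c, rest)
  _                     -> Nothing

-- Sequencing: run p, then feed its result to f. This is just nested function
-- calls, i.e. ordinary recursive descent.
andThen :: Parser a -> (a -> Parser b) -> Parser b
andThen p f = Parser $ \s -> case runParser p s of
  Just (x, rest) -> runParser (f x) rest
  Nothing        -> Nothing

-- Choice with backtracking: if p fails, rerun q on the *original* input.
orElse :: Parser a -> Parser a -> Parser a
orElse p q = Parser $ \s -> case runParser p s of
  Nothing -> runParser q s
  ok      -> ok

-- Lift a value into a parser that consumes nothing.
yield :: a -> Parser a
yield x = Parser $ \s -> Just (x, s)

-- Example: 'a' or 'b', followed by 'c'.
demo :: Maybe ((Char, Char), String)
demo = runParser
  ((char 'a' `orElse` char 'b') `andThen` \first ->
     char 'c' `andThen` \second ->
       yield (first, second))
  "bc!"

main :: IO ()
main = print demo  -- Just (('b','c'),"!")
```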