Congrats to the fish team! Great writeup with lots of interesting detail.
I wonder if this is the biggest project that has moved from C++ entirely to Rust (or maybe even C to Rust?) It probably has useful lessons for other projects.
If I'm reading this right, it looks like fish was not released as a hybrid C++ / Rust program, with the autocxx-generated bindings. There was a release during that time, but it says "fish 3.7 remains a C++ program" [1]
It sounds like they could have released if they wanted to, but there was a last stage of testing that didn't happen until the end.
Some people didn't quite get the motivation for adding C++ features to Rust [2], and vice versa, to enable inter-op. But perhaps this is a good case study.
It would be nice if you could just write new Rust code in a C++ codebase without writing/generating bindings and then throwing them away, which is what's described in this post.
---
Also the #1 gripe with Rust seems to be that it supports version detection, not feature detection.
But feature detection is better for distros, web browsers, and compilers:
Feature Detection Is Better than Version Detection - https://github.com/oils-for-unix/oils/wiki/Feature-Detection...
Version/name detection is why Chrome and IE pretend to be Mozilla, and why Clang pretends to be GCC. Feature detection (e.g. ./configure and eval() ) doesn't cause this problem!
[1] https://github.com/fish-shell/fish-shell/releases
[2] e.g. https://news.ycombinator.com/from?site=safecpp.org
To clarify, work on the rust rewrite started after 3.7.0, but the C++ code remained in a working branch on the git repo. Midway through the rewrite, we backported additions and improvements to fish scripts (most observable being new and improved completions) and a couple of important bugfixes from the rust-containing `master` branch to the C++ branch and released that as 3.7.1.
We never considered releasing anything with a hybrid codebase; aside from the philosophical purity of fully making the switch to rust, it would have been a complete distribution nightmare (we take package maintainer requirements very seriously). Moreover, the code itself was not in a very pretty state - the port was very much like trying to undo a knot: you had to make it much uglier in order to get it properly undone. There were proverbial tons of SLoC that were introduced only for transitional interop purposes and later removed; this code was never held to the same quality standards (in terms of maintainability; it was still intended to be bug-free and required to pass all our unit and integration tests, however).
As mentioned in the article, we prefer to do feature detection when and where needed/possible. The old codebase was purely feature-detected via the CMake build system, but we ended up writing our own feature detection crate for rust, invoked via build.rs (maintained here [0]), though we just defer to libc for a lot (and it doesn't do feature detection yet). One side effect of the libc issue is that we're beholden to their minimum supported targets (though I'm not sure if that's strictly the case if we don't use the specific APIs that cause that restriction?), which are higher than what we would have liked, because we were fine with feature detecting and implementing using both older and newer APIs where needed.
[0]: https://github.com/mqudsi/rsconf
> Feature Detection Is Better than Version Detection
The problem with feature detection (normally referred to as configuration probing), at least the way it's done in ./configure and similar, is that it relies on compiling (and potentially linking, and sometimes even running, which doesn't work when cross-compiling) a test program, and then assuming that if compilation/linking fails, the feature is not available.
But the compilation/linking can fail for a myriad of other reasons: misconfigured toolchain, bug in test, etc. For example, there were a bunch of recent threads on this website where both GCC and Clang stopped accepting certain invalid C constructs which in turn broke a bunch of ./configure tests. And "broke" doesn't mean you get an error, it means your build now thinks the latest Fedora and Ubuntu all of a sudden don't have strlen().
IMHO a broken toolchain is a broken toolchain and that's kind of outside the scope of autoconf -- and I say this despite having banged my head against the wall only too many times as a result of an odd toolchain misconfiguration leading me into chasing autoconf gremlins.
One thing about rust is that it has always treated cross-compiling as a first-class citizen. Cargo is very intentional about the difference between the HOST and TARGET triplets and you can't mix them up unless you are doing so intentionally.
The rsconf feature detection crate was similarly designed with cross-compilation in mind from the start and eschews running binaries in favor of some clever hacks to exfiltrate values during the cross-compilation process.
There is only one rsconf feature (retrieving compile-time constants) that is currently labeled caveat emptor as it does not support cross-compilation; perhaps I can nerdsnipe someone here into figuring out a workaround: https://github.com/mqudsi/rsconf/issues/3
I generally think autoconf etc should be defined to expect certain things by default (keyed by OS), and fail loudly rather than auto-disabling those features. If you really don't want those features, pass in --disable-foo.
I re-did Firefox's autoconf to do this back around 2010 (was contracting for Mozilla as a part-time job in college), after running into one too many features that were automatically disabled because of a missing library. There was at least one Firefox nightly that was missing an important feature because the build machine didn't have the required library.
Hm what's an example of those invalid C constructs? I'd be interested in seeing what happened
One answer is the __has_feature tests mentioned in a sibling comment. Then you are using a supported API, not arbitrary code. Browsers should probably support something like that, if they don't already.
But the arbitrary code is still a useful fallback, for when the platform itself doesn't support config probing
I think you're saying that "writing good ./configure is hard", which is absolutely true. But it's still true that feature detection is better than version detection.
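Even at the shell-config level the same principle holds. A toy fish-config sketch of the difference (the specific commands here are just illustrations, not anything fish itself does):

```
# Version/name detection: parse a version string and guess what it implies
if string match -qr '^fish, version 3\.[0-6]' (fish --version)
    echo "guessing this fish lacks newer builtins"
end

# Feature detection: probe for the capability you actually need
if command -q fzf
    echo "fzf found, wiring up fuzzy history search"
end
if type -q fish_add_path
    fish_add_path ~/bin
end
```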
Although Clang does set the `__GNUC__` macro and you have to distinguish it using the `__clang__` macro, Clang and GCC also both have very fine-grained feature detection features as well, both at the CLI level and in the preprocessor (such as the `__has_feature` family of builtins).
I remember switching from bash to zsh a few years back and thinking I was the bee's knees. After the switch, trying other shells seemed like bike-shedding because, I mean, what more could a shell? Then I got a new computer and decided to start from scratch with my tooling and downloaded fish. I was shocked how it instantly made zsh feel cumbersome and ancient.
Heartily recommend others give it a try as a daily driver for a couple of weeks. I liken it to Sublime Text: an excellent “out of the box” tool. Just the right amount of features, with the option to add more if you want. But you also don’t feel like you’re missing out if you keep it bare bones. A great tool in and of itself.
> what more could a shell?
Is quite good. It could almost be the tag line for fish shell.
Same here. I used it for about 3 days before I installed it on all my systems and permanently switched. For me, it was like the first time I learned a non-Latin language, and my eyes were opened to how much stuff I took for granted was completely arbitrary.
For example, here's how you write an autoloaded function "foo" in Fish: you make a file called "foo.fish" in its config directory. Inside that, you write "function foo ..." to implement it. There's no step 3. That's it.
Want to customize your shell prompt? Follow the process above to write a function called "fish_prompt" that uses normal scripting things like echo, pwd, git, or whatever to write your prompt to the screen. There's no step 2. That's it.
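For example, a minimal prompt function could look something like this (the colors and layout are just an arbitrary illustration):

```
# ~/.config/fish/functions/fish_prompt.fish
function fish_prompt
    # current directory in green, then a plain arrow
    set_color green
    echo -n (prompt_pwd)
    set_color normal
    echo -n ' > '
end
```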
Fish was revelatory. Other shells of the same vintage feel hopelessly outdated to me now. For context, I was the maintainer of FreeBSD's "bash-completion" port for a few years way back when. It's not that I don't have experience with other shells. I have plenty. I just don't want to use any of the others now.
Interesting, I went the other way about 7 years ago - switched from fish to zsh (initially with oh-my-zsh). The interactive experience was similar enough on both shells, and the performance was great on fish and okay-ish on zsh, but two things won me over:
1. With zsh, I can copy-paste some bash snippet and in 99% of cases it will just work. Aside of copy-pasting from StackExchange, I also know a lot of bash syntax by heart by now, and can write some clever one-liners. With zsh, I didn't need to learn everything from scratch. (I guess this matters less now that you can ask AI to convert a bash one-liner into fish one-liner?)
2. For standalone scripts... well, I think it's best to reach for a proper programming language (e.g. Python) instead of any shell language, but if I had to use one, I would pick bash. Sure, it has many footguns, but I know them pretty well. And fish language is also not ideal - e.g. IIRC it doesn't have an equivalent of `set -e`, you have to add `; or return 1` to each line.
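To illustrate that last point, here's the shape of it in a made-up fish function (bash would instead just put `set -e` at the top of the script):

```
function backup_dotfiles
    # no global errexit in fish, so each step spells out its own failure handling
    mkdir -p ~/backup; or return 1
    cp ~/.config/fish/config.fish ~/backup/; or return 1
    echo "dotfiles backed up"
end
```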
I use fish and on the very, very rare occasion I need to copy and paste bash from the internet it's pretty easy to just type 'bash' into fish and paste it in. It's not like bash and fish conflict; you can have them both installed.
FWIW, fish is much more bash-compatible these days. We've introduced support for a lot of bash-isms that don't completely break the fish spirit or clash with its syntax in the last few releases.
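For instance, off the top of my head, things like `&&`/`||` and `$(...)` command substitution now parse fine in reasonably recent fish releases:

```
# bash-isms that modern fish accepts as-is
mkdir -p build && cd build
echo "today is $(date +%F)"
test -f Cargo.toml || echo "not a cargo project"
```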
> 2. For standalone scripts... well, I think it's best to reach for a proper programming language (e.g. Python) instead of any shell language, but if I had to use one, I would pick bash. Sure, it has many footguns, but I know them pretty well. And fish language is also not ideal - e.g. IIRC it doesn't have an equivalent of `set -e`, you have to add `; or return 1` to each line.
I'm sure you know this, but: no particular reason the interactive shell you use has to match the shell you use for scripts. All of my scripts are in bash, but I haven't used bash interactively in decades now, at least on purpose.
I “devolved” mostly along the same path. Bespoke shell to OMZSH to Zsh to Bash.
Zsh has a few nasty Bashism footgun incompatibilities. If I remember correctly the worst one is with how globbing / “*” works, which is why that is guarded with an option.
My main reason for sticking with Bash is that it’s everywhere, and the places where it isn’t try very hard to support the most-used featureset of Bash.
A stock Bash shell does feel a little naked without my dotfiles though :)
Reading the associated issue (https://github.com/fish-shell/fish-shell/issues/510) about the lack of "set -e" was interesting as it highlighted how weird Bash, and shell scripting in general, is from a programming language perspective. Imagine programming in any other environment where every function you call could either succeed or fail catastrophically. There's some talk about adding exception handling to Fish, but maybe the sensible thing to do is to have a mode where Fish ensures that you've dealt with each possible error before moving on. Which is what you would do anyway if you were invoking external programs from a non-shell language (like Python's subprocess.check_call).
In any case the discussion in that issue made a convincing (to me) argument that if you're doing the sort of scripting for which "set -e" makes sense, which is most of it, you should be using Bash. That doesn't mean you need to use Bash interactively though, as others have pointed out.
I think that oilshell is aimed at people like you. I’ve never used it, but their website does make some interesting points about how a shell ought to work and how this could be compatible with bash.
The main reason I switched is because zsh can (often) source bash scripts and can use bash completion scripts (usually), and I was tired of having to translate things from bash to fish. I also ran into a few things where something that was relatively easy to do in bash was impossible to do with fish. But that was years ago so maybe that is less of an issue now, and I don't remember exactly what it was.
Having used zsh, I think a big advantage it has over fish is the completions. There are completions available for more programs in zsh, and the zsh completions are sometimes higher quality.
But I do generally like the syntax, and good out of the box experience of fish. I wish it had a bash or even posix compatibility mode and more available completions.
I could relate to your comment a few years ago, but the situation has since gotten drastically better, though it's still not perfect (e.g. I still need a custom autocomplete function for aws). You might want to give it a try again now.
I used bash for ages, and never really saw what zsh offered in comparison: I would have had to customize it almost as much as bash, and it didn’t really give me anything new.
Fish was so much better than either out of the box, and I still have done virtually no configuration other than setting it up to use my common starship prompt, which is supported in bash as well.
I don’t understand personally the argument about not having bash syntax. If I want it, I just run `bash`.
It's xonsh [1], a Bash-like shell written in Python. It has significant overlap with the awesomeness of fish, and has the advantage of letting you write your shell scripts in a Python dialect. So if you know Python, the mental burden is much lower.
On top of that, it's cross platform, since Python is. No WSL needed.
I switched to it in 2018 and haven't looked back. Originally it was just because I wanted a better command prompt environment in Windows for work, but I liked it so much I switched to it in Linux as well.
(And yes, you can type any Python statement right in the command prompt).
[1] https://xon.sh/
Fish has a lot of features out of the box I find really useful:
* Command auto suggestions as you type based on your history
* History search (using up arrow) based on a partial command
* Helpful completions and descriptions when you hit TAB
* Multi-line command editing
* Syntax highlighting
You can get all those same features in Zsh by using plugins, but those features work out-of-the-box with Fish with zero configuration. Zsh is a bit of a pain to configure, and pretty anemic without plugins. Fish makes configuration optional because it works how you'd hope your shell would out of the box. Even though Zsh has those features as plugins, they're kinda janky, not well maintained, and often conflict with other plugins.
Additionally, Fish also has:
* Excellent built-in commands (string, math, argparse; see the small sketch after this comment)
* Sane scripting (word parsing where you don't need to quote everything, etc)
* Great documentation
* A web-based configuration if you're into that sort of thing (it's a bit of a gimmick for beginners)
The main reason I use Zsh (or Bash) at all is for POSIX/portability, or for when I can't install something else. But for an interactive shell on a machine I control, it's hard to compete with Fish for speed, features, and ease of use.
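As a quick taste of those built-ins, here's a toy function (entirely made up, just to show the shape of argparse, string, and math in fish):

```
function shout --description "toy demo of argparse, string, and math"
    # argparse declares the accepted flags; parsed values land in $_flag_* variables
    argparse 'n/name=' 'c/count=' -- $argv; or return
    set -l name (string upper -- $_flag_name)
    # math evaluates arithmetic; double the requested count just to show it off
    set -l times (math "$_flag_count * 2")
    for i in (seq $times)
        echo "HELLO, $name!"
    end
end
# usage: shout --name world --count 2   # prints the greeting 4 times
```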
For me, it's that the ergonomics are straightforward, and everything works out of the box. If I find myself on a new machine, just installing fish gives me an ergonomic setup without having to install too many additional tools or mess with configuration.
Being able to avoid OMZ and the entire cargo cult of zsh configuration performance “hacks” that litter the internet.
Really, not needing to pull in other people’s janky scripts because the built-in features work well is huge. I still configure fish and use a few scripts, but it’s the lack of the massive cottage industry that is the primary draw for me.
Of course, many devs see that as a failing: “how could a shell do its job well without a thousand knobs to tweak?”
My only issue with Fish is pasting things from the web that assume Bash: a lot of the time it just works, but now and then I get screwed. I don't know nearly enough Fish or Bash to switch. Still though, I prefer Fish ultimately.
It’s interesting how many folks in the comments have essentially this complaint, of not being able to paste bash from the internet. I just run `bash`, paste the thing, and then exit bash.
People don't realize they don't have to stick to a single shell for both scripting and terminal use.
I use zsh with plugins, which pretty much gives it fish's convenience, but one could also use fish for shell scripting while keeping "bash" compatibility by continuing to use zsh or bash in the terminal.
> The one platform we care about a bit that it does not currently seem to have enough support for is Cygwin, which is sad, but we have to make a cut somewhere.
> We’re also losing Cygwin as a supported platform for the time being, because there is no Rust target for Cygwin and so no way to build binaries targeting it. We hope that this situation changes in future, but we had also hoped it would improve during the almost two years of the port. For now, the only way to run fish on Windows is to use WSL.
I understand, but this is indeed incredibly sad. To this day I still use Cygwin, and in fact prefer it to WSL depending on what I'm doing. Cygwin is an incredible project that is borderline miraculous for what it accomplished and provides. Without Cygwin I may not have any sanity left. I can't exude enough love for the Cygwin team.
Hopefully rust will support cygwin as a build target in the future!
https://github.com/rust-lang/rust/issues/5526
(this feature request has been open for 12 years)
It's strange how the article starts off complaining about C++'s platform "issues":
> We’ve experienced some pain with C++. In short:
> tools and compiler/platform differences
before conceding that, because of Rust, they 1) are actually dropping support for a platform they previously supported and 2) can only support (in theory) a small fraction of those platforms supported by g++, but that that's OK because those are the only platforms which really matter. I get that it's a trade-off, but it would have been more intellectually honest to just admit this is one area (portability, backwards compatibility, and ABI stability) where C++ mops the floor with Rust, instead of pretending it's another pain point Rust avoids.
I don't see how the article is pretending anything. They had platform issues with C++ (portability and usability on the platforms they supported), and switching to Rust fixed those issues but gave them a different set of platform issues (they could no longer support Cygwin).
Neither c++ nor rust is a clear winner in portability and platform support. C++ is available on more platforms, but in some ways rust makes it easier to support multiple platforms than it is in c++, for example using rustup to install the latest version of the compiler.
What they got from this isn't that they can now support more platforms, but that they now don't have to spend as much effort dealing with differences between the platforms they do support.
Yeah, it's somewhat interesting that they point to Debian's popcon (which is opt-in), when those statistics basically come from amd64. I think it would be much more interesting (if possible) to see the number of fish installs on openwrt (and other embedded distros). Currently the openwrt fish install is ~2MB (which is massive on a router); I wonder what the new install size will be with the rust version, and whether in practice they've dropped everything except desktop/server linux and macOS?
Corporate jobs are nearly always on Windows machines. Cygwin+GitBash can usually sneak past the gate without raising too many eyebrows. WSL is still a voodoo dark mark that can require conversations with IT before it's allowed.
> The one goal of the port we did not succeed in was removing CMake.
> That’s because, while cargo is great at building things, it is very simplistic at installing them. Cargo wants everything in a few neat binaries, and that isn’t our use case. Fish has about 1200 .fish scripts (961 completions, 217 associated functions), as well as about 130 pages of documentation (as html and man pages), and the web-config tool and the man page generator (both written in python).
Our issue for this is https://github.com/rust-lang/cargo/issues/2729
Personally, I lean away from Cargo expanding into these use cases and prefer another tool being implemented on top. I've written more about this at https://epage.github.io/blog/2023/08/are-we-gui-build-yet/
I would definitely love to see Cargo have the ability to do this -- it means that `cargo install --locked` stays as a viable approach. It probably won't apply to fish, but I think being able to run a post-install command from the binary you just installed would suffice for my needs.
We've actually added support to make single-binary fish deployments possible by (optionally) bundling static resources that would be part of the CMake-based deployment into the binary itself and having it unwrap those on first execution. The limitations of Cargo and the idiomatic `cargo install` usage primarily motivated this.
As a decade-long user and as a professional C++ developer, I'm so happy they've managed to successfully port the shell to Rust. While I have a lot of fun writing C++ (and Rust), I must admit that Rust is vastly nicer to use.
People can complain as much as they want about the borrow checker, but you basically have to be as strict as Rust is in C++ if you want to really avoid use-after-free issues, ... I've been writing "Rusty C++" since before Rust was a thing, because that's the only sane approach to memory safety. I'd rather have a program check that I don't fumble up instead of running sanitizers when things go awry (often years later). The best bug is a bug that can't happen at all.
Static analyzers are sadly too limited compared to what a borrow checker can do in my experience. Some bad stuff will always slip in in C/C++.
Surprised to see the line count go up so much, 56K LOC of C++ to 75K of Rust. The blog attributes it to rustfmt using fewer one-liners. Even so, I would have thought that's a small factor compared to the heaps of duplicated code you get from C++ header files and all the other syntax ergonomics Rust gives you.
Is this typical for such a translation? They also mention the addition of new features contributing to more code; how much of the addition was new features vs. pure translation?
Would be interesting to see the line count of the C++ version if it was run through a formatter with a similar configuration.
It may be just down to rustfmt. It really adds a lot of vertical sprawl. I personally can't stand how much rustfmt makes multi-line code explode.
The tone in the "The Timeline" section seems apologetic:
> The initial PR had a timeline of “handwaving, half a year”. It was clear to all of us that it might very well be entirely off, and we’re not disappointed that it was.
I'm amazed that you estimated it at so little time originally, and I'm amazed you shipped it in full in just 2 years. Congrats!
It's actually not fair to judge this one way or the other at the two year mark.
We technically removed the last C++ code from the core project in January 2024 (~a year ago), the last C++ code altogether (a test helper) in June 2024 (six months ago). We only decided to push out a release now because we've added enough new features (not counting the rewrite as a feature) to warrant a release.
But at the same time, someone could argue that the current codebase is still far from being fully idiomatic rust, there are various C++-isms ranging from the use of UTF-32 (historical from the nature of std::wchar/std::wstring under *nix) to still passing around file descriptors rather than rust `File` objects (that will take a lot of rearchitecting to make mut-safe).
Ultimately, a project is never "done" and we're not being paid at all let alone contingent upon completion of the port, so there's no real use in saying it took precisely this long or that long. We're releasing now because we want to, but I wouldn't tie the release cadence with the port timespan.
Very nice to see Rust being used where it is actually appropriate! Hopefully Rust's "easy" multi-threading will allow more parts of fish to be async, even though it's already much better in that regard than bash (or any other shell I've seen).
One weird thing I'd also like to see is more bash integration, since others have pointed to that as their main reason for not switching to fish full-time. My use case is mostly sourcing bashrc/bashenv files, and if I understand correctly it should theoretically be possible in fish: you need to be able to import, e.g., every env variable that changed after sourcing a bash script via real bash.
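Roughly, the idea would be something like this crude sketch of my own (not a fish feature; real plugins like bass do this more carefully, including PATH conversion and multi-line values):

```
function import_bash_env --argument-names script
    # have real bash source the script, then print the resulting environment
    for line in (bash -c "source $script >/dev/null 2>&1 && env")
        # split each NAME=VALUE line once on the first '='
        set -l kv (string split -m 1 '=' -- $line)
        if test (count $kv) -eq 2
            # copy the variable into the fish session (values spanning
            # multiple lines are not handled by this naive loop)
            set -gx $kv[1] $kv[2]
        end
    end
end
```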