Interesting list (not very surprising to me personally, though it's been a long time since I saw $REPLY mentioned anywhere). One that surprised me recently (in a how-come-I-didn't-know-about-it-for-so-long way) was $PIPESTATUS: if you are running a pipeline "cmd1 | cmd2" and would like to know the return status of cmd1, you can use ${PIPESTATUS[0]} to get it. Useful, e.g., when you are tee-ing the output.
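A minimal sketch of the tee case (make and build.log are just stand-in names):

    # tee almost always succeeds, so $? reflects tee rather than the real command;
    # ${PIPESTATUS[0]} preserves the status of the first stage. Read it immediately,
    # because the next command overwrites it.
    make 2>&1 | tee build.log
    status=${PIPESTATUS[0]}
    if [ "$status" -ne 0 ]; then
        echo "make failed with status $status" >&2
        exit "$status"
    fi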
I would recommend against using $REPLY, for two related reasons:
1. It’s confusing to do a plain “read” and later have a magic $REPLY variable show up, seemingly from nowhere. It has shades of Perl, and I mean nothing positive with that statement.
2. Names are good. By setting a name to the thing which is read, you make the later code easier to read. It’s easier to understand “echo $filename” than “echo $REPLY”, so if you’re reading filenames, do “read filename”. (Or, rather, you probably want to do “read -r filename”, but that’s a different subject.)
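For instance, a sketch of the named-variable version (files.txt is a made-up input file):

    # -r stops backslash mangling; IFS= stops leading/trailing whitespace trimming
    while IFS= read -r filename; do
        printf 'processing %s\n' "$filename"
    done < files.txt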
I don't do a lot of bash programming, but I imagine using $REPLY also increases the danger of having the value overwritten before you use it as your code changes.
Yeah, I don't use it in practice - I remembered it because it was part of my bash trivia collection when I started learning more about shell scripts many years ago. Named variables are obviously a cleaner choice.
Why do you find $PIPESTATUS useful? It is in theory, but in practice I've never found a situation where I wouldn't just want to fail the whole pipe if one part of it failed, which is what "set -o pipefail" is for.
It's good practice to start your scripts with "set -euo pipefail" and then list exceptions to that scoped with "set +..." as needed. That's the "unofficial bash strict mode"[1].
I don't use the IFS recommendation in that article. I prefer to just keep one set of shell inanities in my head, and being used to the non-default IFS behavior doesn't help when debugging other people's shell scripts.
1. http://redsymbol.net/articles/unofficial-bash-strict-mode/
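A sketch of that header plus one scoped exception (the grep line and input.txt are made-up examples):

    set -euo pipefail

    # grep exits 1 when nothing matches, which would kill the script under -e,
    # so suspend -e around the call and capture the status ourselves.
    set +e
    grep -c 'pattern' input.txt
    match_status=$?
    set -e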
Because sometimes you want to take some action in case of non-zero $PIPESTATUS instead of just failing everything immediately. This provides full control.
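For example, a sketch of inspecting every stage instead of failing outright (cmd1, cmd2 and cmd3 are placeholders):

    cmd1 | cmd2 | cmd3
    statuses=("${PIPESTATUS[@]}")   # copy immediately; the next command clobbers it
    for i in "${!statuses[@]}"; do
        if [ "${statuses[$i]}" -ne 0 ]; then
            echo "pipeline stage $i exited with ${statuses[$i]}" >&2
        fi
    done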
WARNING: this page has a dangerous typo. Do not put '.' in your PATH.
"This is similar to the confusion I felt when I realised the dot folder was not included in my more familiar PATH variable… but you should [not] do that in the PATH variable because you can get tricked into running a ‘fake’ command from some downloaded code."
Today, the threat model for a personal computer makes this a lot less important: to exploit it, someone needs to be able to create a file on your computer and set its mode to +x, but if someone has that kind of access to one’s personal computer, it’s already game over.
It's too late, the damage is irreparable. By now, hundreds of people have probably already put . in their path, mounted their repos and /home noexec nodev and nosuid, and mounted a separate file system on ~/Downloads, just to spite you. ;) Or were you trying to trick them into doing that all along, so you could hack into their systems? I can never tell with you slippery cyber security types...
If you set $CDPATH, then for goodness’ sake don’t export it. It changes the behaviour of the cd command to make it output the absolute path of the directory changed to, which breaks a common shell-scripting pattern for converting relative directory paths to absolute paths, viz:
absolute=$(cd "$relative" && pwd)
Conversely, if you’re writing a bash script and it needs to be robust against people who do export CDPATH, one common version of the pattern looks like this:
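    # Clear CDPATH for this one command so cd falls back to its default behaviour;
    # a sketch of one common form of the pattern, not necessarily the only one.
    absolute=$(CDPATH= cd -- "$relative" && pwd)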
That’s a very Linux-centric view. Neither of these works on BSD. It’s probably fine if you’re writing scripts that only need to run on Linux, but I wouldn’t call it the “right method”.
As much as people use the shell, it’s regrettable how few have actually read the manual. Anything you use for extended periods of time on a regular basis is worth sitting down and reading the manual for.
"the manual" is a form of communication for humans. Something nix has been de-prioritizing for thirty years! By experience, the nix culture has been "if you do not understand this, you should not be here" and a low-level hostility towards questions and learning, comes from some science gestalt in the 1800's German or something.
"*nix is friendly, it is just selective about who its friends are" .. is a dot-signature from that era.
A manual is a reading experience and a working reference system, and it relies on the visual context it is presented in. There are vast differences in the quantitative and qualitative content of manuals covering the same material! The bash man page is .. improving?
Specifically, compare an alphabetized list of every option, where "change the preferred shortcut" and "let the sea-water in, thereby killing everyone if you are underwater" sit right next to each other, to modern manuals in, say, the Python culture. Or compare the huge amount of text supplied for some obscure option that is hardly ever needed to the terse one-liner for something crucial that is used every day.
The parent comment cheerfully "blames the victim", with implied guilt for "not reading the manual", as if more time and effort on the part of the user were the answer to the communication problem posed by such a dense and subtle realm.
Last thing in this rant: a small, interesting article on something specific is an antidote to the largely unsolvable challenge here, so good on that.
When I had to start dealing with bash a lot about 5 years ago I consulted "Greg's Bash Wiki" at https://mywiki.wooledge.org/BashGuide and it pretty comprehensively treats best practices for writing bash scripts. I often go back to consult the guide.
I don't think you can generalize that. E.g. some of the BSDs pride themselves in great documentation. The shell manuals are not just alphabetical lists, but sort-of guides to their programming languages. etc.
> Something *nix has been de-prioritizing for thirty years! In my experience, the *nix culture has been "if you do not understand this, you should not be here"
This is an open-source thing. Hardly anyone wants to write documentation that isn’t paid to do so. There’s no glory in it. I first learned Unix on SunOS/Solaris and the man pages were excellent. You could start with “man intro” and learn your way around the entire system from there.
BTW, if you think you’re any good at documentation, open-source needs your help. It’s a great way to contribute.
With a manual as labyrinthine and disorganized as Bash's, in my mind they are forgiven for not being able to struggle through it. It makes even the Zsh manual seem delightfully lucid by comparison.
I think that's a testament to how successful shell scripting is. Somehow you can be really productive and write useful tools without really ever diving deep into the docs.
I've written literally thousands of lines of shell scripts (maybe even 10k+) across dozens of scripts over the years and I've only really looked at the docs a couple of times. Of course I'm Googling various syntax things that are humanly impossible to remember, but I never once really sat there and combed through all of the docs or even read a book on Bash.
You can't do that with most programming languages.
Some of us actually have to USE shell scripts written by other people who like to brag about only looking at the doc a couple of times, who just copy and paste things they found in outdated pages on google into hodge-podges of frankenscripts, and who then run them as root at regular intervals on unattended servers.
It may be care free, fun, and effortless for you, but try being on the receiving end of that behavior some time.
RTFM! And if the manual sucks (like the Bash manual), then learn to use another language.
Instead of learning to write shell scripts better, people should learn NOT to write shell scripts, and learn a decent scripting language instead.
You will find that most decent scripting languages like Python are actually a hell of a lot easier to learn and program and maintain than shell scripts.
The Bash manual page is concise because it is a Unix manual page. Unix manual pages are supposed to be concise, because they are meant to be reference documents, not tutorials. In the GNU project, this is what the Info documentation is for. However, the bash project has elected not to maintain such a document, so the TLDP (The Linux Documentation Project) guides linked elsewhere in this thread are the next best thing.
The bash man page is simply way too long, and IMHO not organized very clearly. And man pages don't even have a table of contents, never mind hyperlinks. The info version is naturally better in those respects, though.
The bash manpage ... runs longish: 110 pages as "man bash | pr". I do read it, frequently, though it's hard to assimilate in total, and I constantly find new or unexpected things (after 20+ years).
Checking zsh by comparison: 6 pages, though with a long "SEE ALSO" list of 10 zsh-specific entries, totalling 374 pages in all.
Bash is painful and awful, IMO. The last thing I want to do (if I were even capable of comprehending the man pages) is really learn it and internalize its awfulness (same with, say, Vim); I want to be able to do what I want to with some syntax quickly glued on from StackOverflow.
Actually, I don't want to do that either.
What I really want to do is use a Python script instead. :3
But using Python is very painful as well for the shell use-case. Do you really want to open files, spawn subprocesses, and create FIFOs manually instead of typing "cmd1 | cmd2 > output"?
Then you have Xonsh, but in my opinion, switching between the two contexts, shell and Python, at each part of a command was too difficult. I was totally lost even writing very basic commands.
My problem with bash is that things are not very discoverable. There is so much cool stuff you find out even after ten years of using it daily. I wonder if there is a way to have all its features in a less obscure language.
Could that be a property of command-line things in general? Is that why GUIs are so popular with non-professionals who don't want to take the time to learn to use a system properly? There is so much in terms of data manipulation that you can do with basic tools, but people will open up Excel to find a graphical button they need instead of using, I don't know, maybe `man -K` or `apt search` to find the right tool.
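As a sketch of that kind of discovery from the shell (the query word is made up; man -k searches page names and descriptions, while the slower -K greps the full text of every page):

    # find man pages whose name or description mentions "spreadsheet"
    man -k spreadsheet
    # search Debian/Ubuntu package descriptions for the same word
    apt search spreadsheet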
For years I've been trying to figure out what exactly the difference is. Terminals are seen as difficult and outdated, yet the more nerdy you get, the more likely it is that you end up using (and the more time you will spend in) a terminal. There has to be a reason terminals haven't replaced GUIs altogether, and GUIs haven't replaced terminals altogether, despite the near-complete overlap in things you can do with them. The only thing that really needs a mouse/touchscreen is when you have something spatial like photo editing, but conversely, the extreme end of the power user tools almost always use a terminal and can't really work with a GUI.
Possible. I would say that the shell is even worse than most command-line tools. I have started using PowerShell on my Mac and I find it easier to find stuff there than in bash. As soon as I don't use bash for a month I forget all the little obscure tricks.
>4) SHLVL: [...] This can be very useful in scripts where you’re not sure whether you should exit or not, or keeping track of where you are in a nest of scripts.
If you're writing recursive shell scripts where you're not sure whether you should exit or not, you shouldn't be writing shell scripts.
Leave it to bash to go out of its way to make it easier to write scripts that break encapsulation and implement spooky mysterious action-at-a-distance by implicitly depending on how they were run, and purposefully behave differently whether they're invoked from a command line or another script, and are so confused about what they should do that they have to make guesses about whether or not to exit.
What about a simple "die" function that echoes a message and returns an error code? It could be used both in scripts and in the interactive shell. It would use "exit" when SHLVL is more than 1, and "return" otherwise.
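A minimal sketch of that idea, using the SHLVL test exactly as described (which, as the replies point out, is a fragile heuristic):

    # die MESSAGE [STATUS] - exit when nested in another shell, return at top level
    die() {
        echo "$1" >&2
        if [ "${SHLVL:-1}" -gt 1 ]; then
            exit "${2:-1}"
        else
            return "${2:-1}"
        fi
    }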
It's not about the best way to behave differently at runtime and sometimes call exit depending on how the script was invoked, it's about the idea that a scripting language would SUPPORT and ENCOURAGE writing that kind of script, with a special magic built-in variable and a section in the already-complex documentation encouraging its use.
Scripts should not behave differently depending on if they were invoked from other scripts, and modular reusable libraries should never take it upon themselves to call "exit" when there's an error.
The way to control the flow of recursive functions is by passing explicit parameters (like the recursion depth, or a data structure to walk, or an error callback), not reflecting on the depth of the runtime stack or looking at how the function was called.
If bash is too weak to handle errors, or exceptions, or recursion, or passing functional callback parameters properly, or even file names with spaces in them, then use a better scripting language, don't just throw up your hands and exit.
I know about that read behavior because I once needed to strip whitespace and only found out through IRC; the difference is only somewhat hinted at in the manual.
"This is similar to the confusion I felt when I realised the dot folder was not included in my more familiar PATH variable… but you should [not] do that in the PATH variable because you can get tricked into running a ‘fake’ command from some downloaded code."
https://en.wikipedia.org/wiki/Wikipedia:Don%27t_stuff_beans_...
A bit of scripting is then needed to coerce the output of 'dirs' into a readable format.
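For instance (a sketch; dirs -p is a real bash option that prints one stack entry per line):

    # print the directory stack one entry per line, with line numbers
    dirs -p | cat -n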
An oddly specific comparison.
I often describe my bash book as an attempt to carefully demystify the man page so that you can approach it with confidence.
http://tldp.org/guides.html#abs
Beginners might prefer the Bash Guide for Beginners:
http://tldp.org/guides.html#bbg
FWIW I don't think I've ever seen this done in any system anywhere. My usual expectation is that local executables are run as "./foo".
https://books.google.com/books?id=KJQRAwAAQBAJ&pg=PA36&lpg=P...
http://tldp.org/LDP/abs/html/exit-status.html (HTML)
http://tldp.org/LDP/abs/abs-guide.pdf#page=57 (PDF)
https://stackoverflow.com/questions/14199689/how-can-i-handl...
https://github.com/texane/stlink/issues/634