Readit News
toyg · 7 years ago
The Powershell idea to push structured data through the pipe, which this project fundamentally replicates with a relational structure, has its merits; but it will always displease someone, and carries the burden of having to rewrite millions of utilities in order to unlock the real potential.

What if, instead, we pushed the approach “upstream”, asking systems to have an additional stream that all shells can access? We have stdout and stderr, we could add “stddata” - a pipe that expects and produces structured data. Then it would become trivial to add support around the userland, without displeasing anyone - because it’s just an additional interface, not a replacement. The pipe should support two or three formats to start with (csv, json, and maybe xml, so you can handle both tabular and child-parent relations with a varying degree of precision) and shells could have a special character for piping stddata (I like ¥) so everything else about them would stay the same.
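A rough sketch of how this could work even today, since POSIX shells can already redirect arbitrary file descriptors; the fd number 3 and the `producer` function here are made up purely for illustration:

```shell
# Hypothetical "stddata" sketch: a program writes human text on stdout
# and structured data on an extra fd, which the caller wires up.
producer() {
  echo "two files found"                    # human-readable stdout
  echo '[{"name":"a.txt","size":12}]' >&3   # structured data on fd 3
}

out=$(mktemp)
producer 3>"$out"   # stdout still goes to the terminal; fd 3 is captured
cat "$out"
```

Nothing about stdout or stderr changes, which is the point: the structured stream is purely additive.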

StreamBright · 7 years ago
I am at the stage that I am more than happy to rewrite whatever it takes to get better experience. This includes the OS. I think we have sacrificed too much on the altar of backward compatibility. I guess I am just tired of breakage because a file has a space in its name.
alxlaz · 7 years ago
> I am at the stage that I am more than happy to rewrite whatever it takes to get better experience. This includes the OS

Ever tried to rewrite one (and the software on top of it, since we're "including" the OS) :)?

arendtio · 7 years ago
Why not just push all that data through stdout and interpret it at the other end accordingly?

Another option would be to just use the 4th fd.

tlamponi · 7 years ago
Using the 4th fd is exactly what OP proposed?
toxik · 7 years ago
The problem with adding a new standard fd is that lots and lots of code relies on first three fds being stdin out err. It could of course have a high fd number. Would be a great thing!
dragonsh · 7 years ago
I am not sure how long "ls" will take with 1 million entries in a directory, given this paradigm. This has been a problem, so I use "ls -1" to list files in the bash shell; otherwise it takes forever.

It will probably benefit some. Good to see competition in this area with sh, bash, zsh, fish, conch and many others.

virtualritz · 7 years ago
Why not add a new parameter, --nushell, to any tool that supports the 2-dimensional format? These tools would just be stored in nushell's config file, and the shell would silently add --nushell to any command it ran that was registered to support it.
pixelmonkey · 7 years ago
The compelling idea here is that they convert the output of common shell commands into tabular data that can be manipulated using common operators, so that you don't have to remember sorting/filtering/grouping flags that may be different for every shell command. So, imagine being able to use the same sorting/filtering logic on the output of `ls` as you might on the output of `ps`, without relying on hacky solutions like pipelines to `grep`, `cut`, and `sort`.

It also means shell command output can be easily transposed into JSON and CSV. Actually pretty clever!
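For contrast, a minimal sketch of the status-quo hack this replaces: scraping `ls -l` into CSV with awk. The column positions ($5 for size, $NF for name) are an assumption about conventional long-listing output, which is exactly the brittleness being criticized:

```shell
# Hand-rolled "structured output": skip the "total" line, then emit
# name,size by assuming fixed column positions in `ls -l` output.
ls -l | awk 'NR > 1 { printf "%s,%s\n", $NF, $5 }'
```

This breaks on filenames with spaces, locale differences, and non-standard `ls` implementations, which is why built-in tabular output is attractive.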

entangledqubit · 7 years ago
I've been delving into bash scripting a bit more than I'd like as of late and the lack of universally available consistent structured output for the CLI really got to me. Most of the script contents end up being these obfuscating and brittle "hacky solutions" that never should have been necessary. When I thought about pursuing fixing this I felt like the task was a bit overwhelming. I'm delighted that these developers are working on this!
bigtrakzapzap · 7 years ago
The key flaw of the UNIX philosophy is that everything is destructured, serialized, and reparsed based on lines, necessitating all manner of argument escaping and field delimiters, when pipelines should be streams of typed messages that encapsulate data in typed fields. Logs are an especially bad case: logging to files is a terrible idea, because it creates log rotation headaches, and each program requires its own log parser because of the loss of structured information. Line-oriented pipeline processing is fundamentally too simple. The key is settling on a common, simple, universal data format robust enough for all purposes (including very large data sets) to exchange between programs, one just complicated enough to eliminate escaping and delimiter headaches without throwing away flexibility, backed by a new or refined set of processing tools and commands.
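A concrete instance of the delimiter problem: a single file whose name contains a newline looks like two records to line-oriented tools, while NUL-delimited output (`find -print0`) keeps it as one:

```shell
# One file, but its name contains an embedded newline.
d=$(mktemp -d)
touch "$d/one
two"

find "$d" -type f | wc -l                         # line-oriented: 2 "records"
find "$d" -type f -print0 | tr -cd '\0' | wc -c   # NUL-delimited: 1 record
```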
munchbunny · 7 years ago
You can also just use PowerShell. Its chief crime is its verbosity in naming, but it has the "universally available consistent structured output for the CLI" you're asking for.
kbenson · 7 years ago
I've generally found you can get most of what you need done with head, tail, grep, cut, paste, sed and tr (and maybe a couple others I'm missing). Oh, and test (both bash's and coreutil's). It can be a little hacky, but once you realize the crazy output you sometimes have to parse from third party programs, you realize that the ability to handle any free form text is pretty essential.

That said, since I'm a Perl developer, I'll generally just go straight to perl's inlined implicit loop mode (-p or -n) if I need to use more than a couple of those utils, unless I want it to be portable with a minimum of requirements (as even though more binaries are used in the shell version, they're all essentially guaranteed to exist on a Linux/Unix).

j88439h84 · 7 years ago
I agree. The idea of Mario is to use Python objects instead of text lines. This means it's much more structured and has a familiar syntax as well.

https://github.com/python-mario/mario

z3t4 · 7 years ago
That's when you switch to a real programming language. Relying on output from other programs will always be brittle, as error messages etc. will change with every new distro release.
yread · 7 years ago
Yeah just iterating through a directory with filenames with spaces..
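A minimal demonstration of that trap: unquoted `$(ls ...)` output is word-split on spaces, so one file becomes two loop iterations, while globbing handles it correctly:

```shell
d=$(mktemp -d)
touch "$d/a file.txt"

# Broken: word splitting turns "a file.txt" into two items.
for f in $(ls "$d"); do echo "broken: $f"; done

# Correct: the glob expands to one properly-quoted path.
for f in "$d"/*; do echo "ok: $f"; done
```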
geophile · 7 years ago
I think they took one step in the right direction, where two more were needed.

The UNIX idea of piping text output from one command to the next was fine at the time, but it means that the downstream command has to parse text. With nushell, you are parsing slightly more structured text.

The two steps they could have taken in addition:

1. Instead of generating rows of a table, why not lists of strings? E.g. ("abc", "def", 123) instead of "abc | def | 123". Much easier to deal with in the command receiving this input.

2. And instead of just lists, why not objects? E.g. a process object, a file object, or, if you'd like a list as above.

I have implemented this idea: https://github.com/geophile/osh. For example, if you want to find processes in state 'S' with more than 3 descendents, and print the pid and command line of each, sorted by pid:

    osh ps ^ select 'p: p.state == "S" and len(p.descendents) > 3' ^ f 'p: (p.pid, p.commandline)'  ^ sort $
^ is used to denote piping. ps yields Process objects. Select examines those processes and keeps those with the right characteristics. Then f applies a function, mapping each process to the pid and commandline in a list, and then sort sorts the lists by pid (the first field of each list). $ does output.

mkl · 7 years ago
The "abc | def | 123" is just a textual display of the data, not the data itself. Each row of the table is a structured record, like you suggest.
nailer · 7 years ago
I think in nushell and powershell you are parsing objects.

'where' in pwsh / nush literally looks for keys with particular values.

j88439h84 · 7 years ago
I agree objects are way better than text for manipulating structured data. Python is great at this, so I use Mario to do it in the shell. https://github.com/python-mario/mario

For example, to sort filenames by length

    $ ls | mario apply 'sorted(x, key=len)' chain
    doc
    src
    bench
    extra
    AUTHORS
    LICENSE

jayd16 · 7 years ago
Seems it's only one level deep; no tables of tables are shown. PowerShell has a lot of issues, but one cool thing is that you can access output as nested object data. This seems like the logical evolution.
jntrnr1 · 7 years ago
You can have tables of tables. To get to the inner table data, you can use the 'get' command to retrieve it and work with it.
spullara · 7 years ago
They have an example in their video.
devwastaken · 7 years ago
This is pretty much what I've idealized for a new shell: every program can just output a standard table of information for each action. No need to query the program itself for it every time, or find all the right options. You just get all the data, and filter what you want programmatically.
vthriller · 7 years ago
> You just get all the data, and filter what you want programmatically.

There are definitely cases when you're better off with producer-side filtering. For example, `find` allows you to not traverse certain directories with -prune, which might save a lot of resources.
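A small example of that producer-side saving: `-prune` keeps `find` from ever descending into a directory (here `.git`, chosen just for illustration), rather than listing everything and filtering afterwards:

```shell
# Skip the whole .git subtree at traversal time, then print regular files.
# -prune short-circuits the descent; the -o branch handles everything else.
find . -name .git -prune -o -type f -print
```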

sciurus · 7 years ago
Do they actually convert the output of the existing commands? Or are they reimplementing them one by one? It looks like the latter.

In their examples it looks like 'ls' is built-in to their shell instead of from e.g. coreutils.

JohnBooty · 7 years ago
I love the core idea of nushell, but reimplementing them seems rather insane, given the incalculable number of programmer-hours that have gone into coreutils.

That seems like a real mistake, rather than simply having nushell's "ls" be a wrapper for coreutils "ls -la" or some such.

I understand the benefit of reimplementing everything from scratch, as that way you have a more consistent nushell experience everywhere, regardless of which version of "native" ls your system has. And allowing "^ls" is a useful and necessary hedge.

But, wow reimplementing nearly everything seems like an enormous undertaking.

(It is of course possible that I'm completely misunderstanding things! Perhaps the devs can comment?)

sha666sum · 7 years ago
ls is a shell builtin. When he types ^ls he uses the one from coreutils.
xz0r · 7 years ago
This idea is already present in xonsh.
svd4anything · 7 years ago
How do you think it compares to xonsh? I ask because I had been thinking recently about investing some effort into xonsh and now nushell has appeared. I’d be very curious what any serious xonsh users think about this newcomer.
dima55 · 7 years ago
There are already numerous ways to do this without rewriting the world. For instance

https://github.com/dkogan/vnlog#powershell-style-filtering-o...

astrobe_ · 7 years ago
> So, imagine being able to use the same sorting/filtering logic on the output of `ls` as you might on the output of `ps`, and without relying on hacky solutions like pipelines to `grep`, `cut`, and `sort`

That would awksome.

ecnahc515 · 7 years ago
Seems similar to the purpose of `osquery` which is SQL based.

Deleted Comment

mamcx · 7 years ago
This is almost exactly what I have dreamed of for a shell. I'm building a relational language (http://tablam.org) with the plan of doing something similar.

Some extra ideas:

- Tables are amazing. But I think we need support for at least these "shapes" of data: lists, tables, trees, and maybe graphs.

- I think it would be nice to have an equivalent of request/response pairs, like HTTP, and then a way to introspect what a command does, like auto-generated Swagger.

- Having widgets like table-browser, graphs, etc AS COMPONENTS.

In FoxPro you could say "browse" and it would display a DataTable grid with full editing capabilities on the current table (or ask for a table to open). In this case:

    ls | ui_browse
The key here is that the widgets work stand-alone, so it's not necessary to build a full terminal app just to see data nicely. It looks like https://goaccess.io, and even more like this idea: https://github.com/sqshq/sampler

This means we could get a Jupyter-like experience here!

- Some way to do currying and maybe environment aliases: you could set up an environment for the system, the user, or the current directory. The environment is a .toml file with vars and stuff.

The other thing where I think shells fail... is that they are still old-style terminals. Having text output is cool, but a richer widget system would unlock the potential better.

Per_Bothner · 7 years ago
"The other thing where I think shells fail... is that are still old-style terminal. Having text output is cool, but having more rich widget system will unlock better the potential"

You might find DomTerm (http://domterm.org) interesting. It combines solid traditional-terminal support (including solid xterm compatibility) with a data model based on HTML/DOM. So you can "print" images, HTML tables, and more. DomTerm "understands" shell/repl commands as consisting of prompts, inputs, and outputs. You can add markers to group the output into logical groups, so it is automatically re-formatted on window re-size. And more.

felixfbecker · 7 years ago
Like PowerShell's different output formats (table, list, wide, ...)?
mamcx · 7 years ago
Not output formats. I mean declaring whether the input/output is in the shape of a table, list, or tree.

Like when an HTTP response says it is json, text, csv, etc. This information unlocks certain capabilities and would allow efficient execution of queries.

If psql (the postgresql terminal cli) declared its data to be in the "tables" shape, then it could take the `| where name = ""` query itself and execute it, instead of outputting all the rows...

roryrjb · 7 years ago
There are a lot of comments here already so I don't know if this has been addressed, but I don't get this shell. It seems to be fixing a problem that doesn't exist, and a lot of comments are echoing it. Current shells are not deficient: as Doug McIlroy originally stated, the universal interface is text, and tools should take input and generate output based on this. You can then pipe things together, with these tools not necessarily knowing in advance how they will be used in a pipeline. You can get very far with sed, awk, etc.; you just have to spend a little bit of time learning them, but they are not complicated. You can of course then introduce a command like jq for specialised purposes. But the point is the shell isn't providing this; they are separate tools that you orchestrate using the shell.

There is no flaw with how it works at the moment. I truly believe (and this isn't a dig at this project or anyone in particular) that people do not spend enough time learning these tools that have been there for decades. The re-inventions that appear on GitHub are not a product of the deficiency of the existing tools; I don't know what they are a product of, but it's not that. Unless I am missing something.
arendtio · 7 years ago
You state the problem yourself

> [...] you just have to spend a little bit of time learning them

I can still remember the many times I tried to learn shell scripting and gave up too quickly. So 'little bit of time' might be an understatement. Actually, you need a good use-case, sufficient motivation and enough time at hand to bite through the first problems...

And show me a serious, modern programming language that is so susceptible to whitespace usage. Yes, there are some that care about whitespace (e.g. Python), but it will rarely cause a syntax error to not place a space before a bracket.

Another problem is that those text formats are a bit brittle. I am talking about having to use IFS="" and ls giving back something different to humans than to pipes. Most of the time you can solve it one way or the other but having to search frequently for solutions to such problems just sucks.
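For readers who haven't hit it, this is the `IFS=""` incantation being referred to: without it, `read` trims leading and trailing whitespace, and `-r` stops backslash interpretation:

```shell
# IFS= disables field splitting for read; -r disables backslash escapes.
# The leading spaces survive intact.
printf '  two leading spaces\n' | while IFS= read -r line; do
  printf '[%s]\n' "$line"
done
```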

I am not saying that the old tools aren't worth learning. In fact, I love writing shell scripts (for the good parts) and can't remember the last day I haven't used a shell. But saying there are no problems to be solved is just wrong. So I think a little discussion about what could be done and some prototyping shouldn't hurt anybody.

roryrjb · 7 years ago
I've now read a lot more of the comments, and people are really excited about this; it has taken me a bit by surprise. There are a lot of comments in the vein of "bash sucks but it's everywhere". My dream shell is pdksh, which already exists; I would only favour bash because it has more advanced completion logic. Aside from interactive use, I write everything in POSIX shell. I think popularity is being conflated with portability, and that's a property that isn't valued enough.

I have no problem at all using Bourne-compatible shells. I can't quite formulate it in my mind, but I think they are probably misunderstood. To me, who hasn't been doing this too long, bash doesn't suck; I don't find bash/POSIX shell to be a bad language. In fact, in terms of different things I write (not LOC), I think I write more things in POSIX shell than anything else. Writing wrappers in shell is not to account for a deficiency; it's utilising that power to change your shell experience and workflow into what you want, into what fits your brain. This is my experience. I don't understand (it would be good to hear from others) why people have such a bad opinion of the current state of shells.

EDIT: wording.

furyofantares · 7 years ago
When PowerShell first came out, I was very excited by the idea and then barely even gave it a chance. Instead I gained an appreciation for input and output being just text. Yeah, I might have to awk’wardly massage the data a lot of the time, but the fact that I can just run a command and see the output makes the flow of incrementally solving a problem really nice.

I’m gonna give nushell a chance, because tabular textual data seems like it still fits the bill while also making it less awk’ward to massage the data as needed for the pipeline.

lottin · 7 years ago
I too don't understand the excitement these object-oriented shells elicit in some circles. It's exactly like you say, the job of a shell is to manage jobs and run programs. The only data processing that it performs is the processing of the command line arguments and nothing else, as far as I can tell.
neop1x · 7 years ago
Yes, I have a feeling that Windows people are coming to Linux through "Windows Subsystem for Linux" to reinvent an almost 50-year-old concept that has been working well. Instead of using small bash, which is everywhere (even in initrd), the more convenient zsh, or Python, they are going to set up a Rust environment, download gigabytes of crate dependencies, and build a new shell in order to run their web development workflow script. Maybe in 10 years there will be a Node.js/Electron-based shell with fancy CSS animations and primarily mouse control. And tables will be sortable by a mouse click on a column header, searchable in real time, and there will be speech commands for common tasks...
nwah1 · 7 years ago
The point of this project is standardized inputs and outputs, that's all. Working with commands is not standardized on traditional posix systems, in terms of either input or output.

The other comments about adding a "stddata" and handing this responsibility to userspace is a good one.

You shouldn't need to know a specialized DSL for each tiny core util.

oblio · 7 years ago
You are missing something. Current shells are brittle, and they're just bad tech in 2019. We've made them half-decent by brute-forcing tools and hacks for them.

Everyone has to reinvent the same basic tool output parsing because ls (or top or ps or...) can't spit out real structured data.

roryrjb · 7 years ago
I disagree. I'm not saying that everyone should think the way I do, I guess realistically all I mean is that when I'm confronted with a parsing issue in this context I just write a shell script. Ok, this can be taken as reinventing, which I'm advocating against, but I see this as utilising the UNIX philosophy and by sticking to POSIX shell specifically (and using sed, awk, tr, et al) it is still a built-in and portable way to deal with this problem. Also I am not saying this because of some irrational submission to an old philosophy, I advocate this style of working because I find it easy, convenient and fast, and it works for me in the real world. But of course people have different ways of working and I respect that. This is just what works for me and how I see things. Definitely not meant as an attack against this project or different ways of working.
roryrjb · 7 years ago
One downvote and counting. This is just my opinion. Let's crank this up a notch. Those advocating for more advanced interfaces are missing the point. Text is all you need, no graphics or objects or widgets, just text. I have no problem with this, I enjoy it like this, we don't need an improvement. There I said it, any downvotes will be worth it, because I get to voice my opinion. In the end I will still be happily using my POSIX shell scripts, awk, sed and enjoy my life.
Siyo · 7 years ago
"Goto is all you need, no ifs or loops or dynamic dispatch, just jumps. I have no problem with this, I enjoy it like this, we don't need an improvement. There I said it, any downvotes will be worth it, because I get to voice my opinion. In the end I will still be happily using my assembler and enjoy my life."

I'm sorry, I couldn't help myself. Your comment reminds me of an anecdote I heard from the early times of structured programming. When structured programming was just gaining its feet, there was a certain class of programmers who just could not understand why people would want to write structured code. You can do everything in assembly they said, you have much more control over performance, etc. They looked down on structured programming as not "real programming".

There's a lot of benefits to adding some structure to text. I don't think that Nushell's approach is the best one, but to say that there are no problems and we shouldn't look to improve things is just backwards. We should always look to improve our tools and our craft, otherwise we would still be stuck writing assembly.

kryptiskt · 7 years ago
"Text" is a useless specification when building a program that produces data that other programs consume, or consumes data that other programs produce. There is nothing actionable in knowing that you have text to work on, other than that you can presumably discount the appearance of nulls in it.
CJefferson · 7 years ago
I find it basically impossible to compose programs where I have things like a list of filenames with spaces. It turns out some programs support some kind of option to separate with nulls, but they are inconsistent and not everything supports it.

So I end up having to put everything in one giant call to 'find', which defeats the compositionality of shells.

stinos · 7 years ago
> Text is all you need

In essence, yes. You can get pretty far with sed and the likes. But just because text is all we really need doesn't mean that other things (objects, in this case) don't offer an improvement and make things better/easier/more convenient. Just like we could theoretically get along fine with only the CLI, that doesn't mean it is the single best thing in the world for all possible occasions.

sixothree · 7 years ago
Text is awful and annoying to deal with.
rvz · 7 years ago
A brand new shell, especially in Rust is a great thing to see and also how it tackles the problems that exist in other shells.

However, it's interesting to see so many dependencies required to build it (73 crates last time I checked), and since shells have always been portable across many operating systems, I wonder how portable nushell will be, given that it wishes to replace my current shell.

smt88 · 7 years ago
I think the JavaScript/npm world has trained a generation of devs that high dependency counts are a code smell. This can be true, of course, but that's largely because of external factors.

In a perfect world, a language's ecosystem would have a tiny stdlib and allow importing of libraries for everything else. The Linux philosophy would be followed strictly.

The problem is just that the overhead in securing, maintaining, and organizing all those libraries is pretty large, as we've seen by npm's repeated failures. Of course the *nix community seems to have largely solved the problem, but there's also a massive culture of volunteerism benefiting them.

wwright · 7 years ago
I think there’s definitely room for mirrors of package repos that focus on a stablized and audited subset of packages, with frequent security and bug backports and medium-length cycles for new features — like what Fedora and Debian do, but for language ecosystems.
ekc · 7 years ago
UNIX philosophy. You mean the UNIX philosophy.
sha666sum · 7 years ago
It is safe to say that it is unrealistic to expect that developers in the real world audit their dependencies.

I'd imagine a feasible solution being a permission-based system (which might already exist), where programs and their dependencies do not have access to certain facilities, like the filesystem, other processes or the network, without them being declared in a manifest file of sorts. Permissions should be inheritable when explicitly specified, and a sub-dependency could not gain a permission it didn't inherit from its parent. Unfortunately this does not work so well with a shell, since the shell needs the ability to spawn arbitrary processes. At least the shell itself would not have network access, forcing it to rely on tools already installed on the system.

Also, if we resort to some magical wishful thinking, then all the tools in the system follow this permission model and the package manager is aware of the permissions. You could then query the package manager for all installed software with a certain capability, like network access, disabling tools like curl and netcat.

umanwizard · 7 years ago
The key difference between typical GNU/Linux distributions and the npm/cargo model is that the set of packages available in the distributions is to some extent curated.
orf · 7 years ago
There are 464 dependencies in the complete tree when doing `cargo install`.

That's a bit too many. But it's not often you compile your shell from scratch.

jntrnr1 · 7 years ago
It's definitely a lot. We're looking into making more of the features optional, so you can streamline it if you want.
seaish · 7 years ago
And 484 when installing with rawkey and clipboard.
saghm · 7 years ago
> shells have always been portable across many operating systems

Most shells work on just one of either Unix or Windows; the blog post specifically mentions Windows support as being on the radar of the developers, which Rust is arguably a better fit for than C/C++ (which most shells are written in), due to there being a single compiler, standard library, and build tool with first-class support for both Unix and Windows in the Rust toolchain.

johnisgood · 7 years ago
Which shells are those? Bash works fine on both Unix and Windows.
j88439h84 · 7 years ago
Mario gives a lot of the benefits of using objects instead of text, but since it doesn't control the whole shell it's less of a commitment.

https://github.com/python-mario/mario

_urga · 7 years ago
The risk of a supply chain attack, roughly speaking, is then multiplied by all 73 dependencies. If any of those are compromised, the shell and system are compromised.

To ameliorate the risk, the security of all 73 dependencies would need to be at least an order of magnitude greater, just to catch up to shells with no dependencies.

The irony is that this lack of "dependency-safety" is far more easily exploitable than any previous lack of "memory-safety".

Of course, this can be easily fixed if developers stop multiplying third-party dependencies, so that importing a dependency involves O(1) risk, not O(?).

perlgeek · 7 years ago
> But this can be easily fixed. Developers need to stop using third-party dependencies, so that importing a dependency should involve O(1) risk, not O(N^2).

How is this an easy fix? Now developers need to develop and maintain O(N) instead of O(1) software projects.

I guess that if nushell had a zero-dependency policy, it would never have happened.

And there is a significant risk to spreading your resources thin when reimplementing sensitive dependencies, like crypto or network / http libraries.

nailer · 7 years ago
What shells, if any, have no dependencies?
seaish · 7 years ago
This isn't a problem if versions are pinned.
bsder · 7 years ago
> However, its interesting to see so many dependencies that are required to build it (73 crates last time I checked)

The problem is that Rust has two issues:

1) In Rust, you don't pay for what you don't use. This means that a lot of stuff tends to be an external library since not using something means you can exclude it COMPLETELY. The issue is that things tend to get atomized more finely than most languages would do.

2) Rust is trying to not bake things into a standard library too early. This is a good comment about standard libraries from Python:

"The standard libraries constitute one of Python’s great features, but they tend to suck up all the oxygen. Programmers are reluctant to write libraries that duplicate their functions, so poor libraries in the standard set persist. Only a few people, like Reitz, are willing to write modules that compete with the standards."

https://leancrew.com/all-this/2012/04/where-modules-go-to-di...

felixfbecker · 7 years ago
This looks great, but what exactly are the benefits over PowerShell? The article mentions PowerShell, then says:

> What if we could take the ideas of a structured shell and make it more functional? What if it worked on Windows, Linux, and macOS? What if it had great error messages?

- Any shell with piping is already very functional, and PowerShell has script blocks (higher-order functions). Or does this mean "functional" in the literal sense? What functionality is missing?

- PowerShell works on macOS, Linux and Windows.

- The examples of error messages further down look almost exactly like PowerShell's error messages (underlining the command in the pipeline that failed).

It is not clear to me what exactly the authors sought out to do different

wvenable · 7 years ago
I wanted to like PowerShell but I find the syntax choices absolutely exhausting. This looks far more comfortable to me even as someone who mostly uses Windows but also knows Unix shell.
felixfbecker · 7 years ago
I know the long cased commands like Foo-Bar can be alienating to Unix folks at first, but have you tried just using aliases and not caring about casing? PowerShell will accept any unambiguous case-insensitive abbreviation of commands and parameters, and has almost all frequently used commands aliased (see this thread, it's literally the same as nushell).

I often wonder if people just don't like it because all they see is PowerShell scripts, where people like to spell out commands so it's clearer what they do. Imo it's a nightmare that we write bash scripts with all these 2-3 letter commands and flags that no beginner could ever understand. (but if you want to write that way even in scripts, nothing is stopping you)

qwerty456127 · 7 years ago
That's why I wish this shell could replace both bash and PowerShell in real life. It's a pity that this is highly improbable.

Deleted Comment

kjksf · 7 years ago
They clearly know about PowerShell. They acknowledge it as an inspiration and the core concept is the same.

That doesn't mean there aren't ways to improve upon PowerShell.

As a light user of PowerShell I consider it a poor implementation of a great idea.

Commands are way too long, control structures have poor syntax etc.

To make this very concrete, take this example from nu:

    ls | where size > 4kb
Now give me an equivalent in PowerShell and then let's talk about how different Nu is from PowerShell.

dragonwriter · 7 years ago
The equivalent, btw, is:

  ls | where length -gt 4kb
The only difference is powershell uses “>” and “<” for redirection, and so uses mnemonic operators for comparisons.

Nu doesn't seem to support redirection, or even to have anything other than one input and one output stream. Metadata might serve the purposes otherwise served by additional streams, though it's not clear how a program would return something Nu would treat as metadata.

I guess that with a structured-data shell, error type of info could just be returned in-band under a different tag, in any case.

jodrellblank · 7 years ago
Take your pick:

    # most common in script, vs at shell
    Get-ChildItem | Where-Object { $_.Length -gt 4kb }
    gci | where length -gt 4kb


    # longest most restrictive with namespaces, 
    # most stable over time, 
    # least risk of hitting the wrong command 
    # or wrong property as surroundings change
    Microsoft.PowerShell.Management\Get-ChildItem | Microsoft.PowerShell.Core\Where-Object -Property Length -gt 4kb


    # shortest(?)
    gci|? le* -gt 4kb


    # because you can
    switch(gci){{$_.length-gt4kb}{$_}}
    gci | & { process{ if($_.Length -gt 4kb) { $_ } } }
    $larges, $smalls = (gci).where({$_.length -gt 4kb}, 'split')
    [linq.enumerable]::where([string[]][io.directory]::EnumerateFiles($pwd), [func[string,bool]]{param($f) [io.fileinfo]::new($f).Length -gt 4kb})
In Powershell you can tab complete "Length" in the filter, from the available properties of the objects coming out of gci, which is neat.

hobs · 7 years ago
Complaining about command length when PS explicitly had full fledged named commands (for readability) and default aliases (for terseness) doesn't make sense imo.
chokolad · 7 years ago

  dir | where length -gt 4kb
Let's talk.

yyyk · 7 years ago
PowerShell has some great ideas, but I can think of quite a few things that can be improved in (or over) PowerShell: its abysmal speed, its verbose syntax, verb-noun structure which is much less useful than noun-verb would have been...
qznc · 7 years ago
Is there any noun-verb shell?
perlgeek · 7 years ago
Last I tried Powershell (on Linux, that is) it was so slow that I simply gave up, frustrated.

I really like the concept, but the slowness ruined the user experience for me.

dragonwriter · 7 years ago
> This looks great, but what exactly are the benefits over PowerShell?

To my perspective, making parsing for a variety of text-based formats a default action is an advantage, because it means tools don't have to use a particular runtime (e.g., .NET) to feed objects into shell pipelines without the shell user doing explicit conversion.

vips7L · 7 years ago
Maybe I'm misunderstanding you, but isn't this what the ConvertFrom-{Json, Xml} etc are for?
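A sketch of the explicit-conversion step being discussed (PowerShell; `package.json` is just a hypothetical input file):

```powershell
# PowerShell: parsing structured text is an explicit pipeline step
Get-Content package.json | ConvertFrom-Json | Select-Object -ExpandProperty dependencies

# nushell, for comparison: `open` infers the format from the file
# extension, so no conversion command is needed:
#   open package.json | get dependencies
```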
unixhero · 7 years ago
For one, PowerShell isn't libre as in freedom, am I right?

Deleted Comment

massung · 7 years ago
Pretty cool idea and project. Hope you take it further and it turns into something truly spectacular!

My only comment is on the article. This line:

> Nu would not have been possible without Rust.

That's just patently false.

Perhaps you wouldn't have chosen to code it in another language. Or perhaps Rust had one or two benefits that assisted in the development of Nu. But you don't word it that way, or go into any detail about what made Rust special or better than language X, Y or Z (not that you need to, but if you're going to make a wild statement, you should be prepared to back it up). The only three points made following that were about features that have widely existed in other languages well before Rust came along: async/await, async streams, and serde.

Rust is cool and is still exploring some really great ideas in making programs safer and more maintainable. But, it's not the greatest language ever (at least not yet), nor does it make the impossible possible. It can merely make some things easier at the cost of something else.

I'm glad you were able to use it effectively to create a tool you - and others - find useful.

canadaduane · 7 years ago
It may be that they intended the statement more like "Rust's collection of capabilities uniquely inspired us" rather than "Rust is more Turing complete than all the others."

For example, the error reporting style in nushell appears to be a direct descendant of Rust's visual error message style.

TomasSedovic · 7 years ago
I was about to say a similar thing. Yes, in a CS sense, anything you can do in one Turing-complete language you can do in any other Turing-complete language. They almost certainly were not disputing that.

If you factor in the real world, the number shrinks as you get to things like performance, cross-platform support, ease of packaging and distribution.

But still, Nu could plausibly have been written in Assembly, C, C++, D, Go, Nim, Zig, Haskell and so on.

You would have lost some of Rust's strengths and weaknesses and imported another set (regarding the language, tooling, performance, ease of debugging and so on).

I took the comment to mean that as far as they (all people intimately familiar with the language, tooling and ecosystem) are concerned, this would be a trade-off that would be negative enough to not begin or finish the project in the first place.

I don't know what plans and requirements the Nu authors had. But for my own game (a roguelike written in Rust) the requirements on the performance, distribution, stability, ease of development and dependency management meant that Rust was (at the time) pretty much the only viable choice for me.

Could someone have written it in C? Sure. Could I have written it in C? Likely, yes. Would I ever have finished the game in C? Almost certainly not. For various reasons, none of the other languages would have done in those circumstances, not really.

Human language is an imperfect method for transferring information and sometimes one trades off brevity for nuance.

felixfbecker · 7 years ago
The error reporting style also looks almost exactly like PowerShell's (underlining the command in the pipeline that failed), which is written in C#.
simag · 7 years ago
It's a 3 person side-project over 3 months. I suppose he means that Rust is to thank for this level of polish and reuse of 3rd party libraries. My experience with Rust is similar and there are many projects I would consider impossible (practically not technically) without Rust.

Programs like shells or servers are expected to perform well and usually cannot afford the GC tax (more memory usage and pauses). I wouldn't use a shell written in a GC'ed language. That would leave them with non-GC'ed languages. C and C++ simply don't have a usable equivalent of Cargo/crates.io, which makes third-party reuse slower, and they have much weaker execution safety guarantees.

qaq · 7 years ago
"That's just patently false." Why? Projects have constraints including time and skillset of partcipants.
moogly · 7 years ago
Yeah I agree. That was intellectually dishonest.

I really like the idea and look of this project. To me it feels like PowerShell done better (and implemented by non-aliens).

lokedhs · 7 years ago
I was about to post the same thing.

I could put this together in Common Lisp. I know precisely what CL technologies I'd use to do it, and perhaps after finishing the work I'd say "this wouldn't have been possible without Lisp", which would be true in some respect, but just as dishonest as the original article.

That doesn't mean I don't respect the work done. It's an interesting project and most definitely something that deserves attention. The language used to develop it is the least interesting aspect of this.

mongol · 7 years ago
Maybe it should be read as "my child would not be possible without me".
elcomet · 7 years ago
I think you got it wrong. They meant "It would have been impossible for us without Rust".

Of course it is possible in any language you can think of, but not with their skills and time.

Legogris · 7 years ago
Interesting! I am also looking forward to setting aside time to play around with https://elv.sh/, which takes a completely different approach with a similar philosophy.
xiaq · 7 years ago
Elvish author here, thanks for the plug :)