mrkeen · 2 years ago
> Then, it parses the JSON with the aforementioned Unix staples in a for loop.

Show the damn code. Otherwise I'm just going to presume that the awk & sed reimplemented JSON parsing in an indecipherable, buggy way.

mid-kid · 2 years ago
My assumption is that he probably just made assumptions about the format. Stuff like "my strings won't be represented by hexadecimal escape sequences" and "my json file will be split up line by line".

It's really convenient when you can get away with stuff like that, and even if it's not a "proper" solution, at the end of the day it really doesn't always have to be.
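
E.g. if the JSON happens to be pretty-printed one key per line, a quick hack like this (field name made up) usually gets you there:

    grep '"name":' data.json | sed 's/.*"name": *"\([^"]*\)".*/\1/'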

umvi · 2 years ago
I too can never remember jq syntax when I need to. I usually just end up writing a Python script to extract the part of the JSON I need.
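
Usually just a throwaway one-liner along these lines (key names made up):

    python3 -c 'import json,sys; print(json.load(sys.stdin)["items"][0]["name"])' < data.json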
defrost · 2 years ago
If jq is too much, see gron &| Miller

gron transforms JSON into discrete assignments to make it easier to grep for what you want https://github.com/tomnomnom/gron
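
Its output is a flat list of assignments, so everything is one grep away, roughly:

    $ echo '{"a": {"b": [1, 2]}}' | gron
    json = {};
    json.a = {};
    json.a.b = [];
    json.a.b[0] = 1;
    json.a.b[1] = 2;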

Miller is like awk, sed, cut, join, and sort for data formats such as CSV, TSV, and JSON https://github.com/johnkerl/miller

eternityforest · 2 years ago
Really cool but... Python exists. If I am gonna do anything with JSON, that is probably well into "Time to use a full programming language" territory.
kitd · 2 years ago
Nushell is also good for this, especially when you need shell capabilities too.

https://www.nushell.sh/
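
E.g., assuming a hypothetical data.json holding an array of records:

    open data.json | where price > 10 | get name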

e40 · 2 years ago
Gron looks interesting, but I wish there was an option for bash output!
elesiuta · 2 years ago
> I too can never remember jq syntax when I need to. I usually just end up writing a Python script

Same here! That's why for small things I made pyxargs [1] to use python in the shell. In another thread I also just learned of pyp [2] which I haven't tried yet but looks like it's even better for this use case.

[1] https://github.com/elesiuta/pyxargs

[2] https://github.com/hauntsaninja/pyp

Grimburger · 2 years ago
I tend to frustrate colleagues with this but it's honestly so much more manageable.

A python script is the sane extensible choice rather than some esoteric bash incantation that you have no clue about 6 months after writing it.

paiute · 2 years ago
I would say the same thing about sed and awk. I have a few basics down solid, but anything complex is best done with python.
raincole · 2 years ago
There is nothing wrong with choosing a language like Python/Ruby over "command line sorcery".
ElCapitanMarkla · 2 years ago
ChatGPT is a fantastic use case for this. Last week I had to extract a bunch of data that was in some nested arrays; I pasted a JSON sample into ChatGPT and asked it to use jq to extract certain fields when certain conditions were met, and a couple of tweaks later I had exactly what I needed.
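
Something in roughly this shape (field names made up):

    jq -r '.data.items[] | select(.status == "active") | .id' sample.json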
fragmede · 2 years ago
Honestly, whenever I need to use jq, I just search my bash history (fzf ftw) for the last time I used it. How'd I get it to work the first time? Lost to the sands of time...
alexwasserman · 2 years ago
jq is definitely tough to learn, I can never remember it.

But the whole argument that jq is a unitasker not worth learning and that traditional Unix tools are better is weird. The traditional Unix mentality was “do one thing only and do it well”, then pipe things together. jq fits perfectly into the Unix toolset.
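
It composes like any other filter, e.g. (endpoint and field are hypothetical):

    curl -s https://api.example.com/users | jq -r '.[].country' | sort | uniq -c | sort -rn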

mongol · 2 years ago
Yes, jq is definitely a tool like that. The problem nowadays is maybe more that there are many competing options. jq seems most popular for JSON parsing, but once in a while another tool is used that maybe has better ergonomics but a different syntax, etc. You can master jq but it will not be enough, since there is no consensus. This used to be easier, I believe; the tools in common use were fewer. So let's imagine awk got JSON support. Would that be for the better or worse?
bigstrat2003 · 2 years ago
Also, Alton Brown doesn't advise against unitaskers because he's some kind of hater. The reason you don't have unitaskers in the kitchen is because they take up physical space. You have to decide which tools are worth the space they take up, and it's hard to justify that for a tool you will only use once in a while. That's not a concern with computers any more, so the reasoning doesn't apply here.
geocrasher · 2 years ago
Replace kitchen space with head space.
mongol · 2 years ago
I recall a tool that rewrites JSON to a dot notation that is easily grep-able. It prefixed each value with all of its parents, something like a.b.c=d

But I have forgotten the name...

Edit: was already mentioned in the thread! Gron

ninjin · 2 years ago
There is also json2tsv [1] that follows a similar philosophy and I have had some fun combining it with awk(1) recently for database ingestion.

[1]: https://www.codemadness.org/json2tsv.html
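
It emits one nodename/type/value row per JSON node, if I remember its output right, so extraction is roughly a plain awk filter (file and field made up):

    json2tsv < data.json | awk -F '\t' '$1 ~ /\.name$/ { print $3 }'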

1vuio0pswjnm7 · 2 years ago
"Those tried and true commands we were referring to? None other than the usual awk sed cut grep and of course the Unix pipe | to glue them all together. Really, why use a JSON parsing program that only could only do one function (parse JSON) when I could use a combination of tools that, when piped together, could do far more?"

IMHO, drinking the UNIX Kool-Aid means not only using coreutils and BSD userlands but also using the language in which almost all of those programs are written: C. For me, that means gcc and binutils are amongst the "tried and true commands". Also among them is flex. These are found on all the UNIX varieties I use, usually because they are used in compiling the OS. As such, no special installation is needed.

When I looked at jq source code in 2013, I noticed it used flex and possibly yacc/bison. No idea if it still does.

Using UNIX text processing utilities to manipulate JSON is easy enough. However if I am repeatedly processing JSON from the same source, e.g., YouTube, then I use flex instead of sed, etc. It's faster.

jq uses flex in the creation of a language interpreter intended^1 to process any JSON. I use flex not to create a language interpreter but to process only JSON from a single source. The blog author uses shell script to process JSON from a single source.^2 I think of the use I make of flex as like a compiled shell script. It's faster.
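
A minimal sketch of what I mean, assuming one known feed where title values always appear exactly as "title":"..." with no spaces (the default main from -lfl does the rest):

    /* titles.l - hypothetical: pull "title" values out of JSON from one known source */
    %%
    \"title\":\"[^\"]*    { printf("%s\n", yytext + 9); }  /* print the value, skipping the "title":" prefix */
    .|\n                  ;                                 /* discard everything else */
    %%

    $ flex titles.l && cc lex.yy.c -lfl && ./a.out < feed.json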

The blog author states that jq is specific to one type of text processing input: JSON. I write a utility that is specific to one source of JSON.

1. Sometimes it's not used as intended, e.g., https://github.com/makenowjust/bf.jq

2. I also used flex to make a simple utility that reformats JSON from any source so it's easier to read and process with line-oriented UNIX utilities. Unlike jq and other JSON reformatters it does not require 100% correct JSON; e.g., it can accept JSON that is mixed in with HTML, which I find is quite common in today's web pages.

foul · 2 years ago
> When I looked at jq source code in 2013, I noticed it used flex and possibly yacc/bison. No idea if it still does.

It has bison and flex files in the source code currently.

> jq uses flex in the creation of a language interpreter intended^1 to process any JSON. I use flex not to create a language interpeter but to process only JSON from a single source. The blog author uses shell script to process JSON from a single source.^2 I think of the use I make of flex as like a compiled shell script. It's faster.

Like, you have a flex template and you fill it in with ad-hoc C code? Nice, I would find it more readable than a jq script, although a basic jq script is just a mix of bash and JavaScript, and when I grok it for the 123rd time (because it's unnatural, odd, inside a shell) it gets better.

zoom6628 · 2 years ago
Still use awk since I learnt it in 1991 and only use the others when I can't make awk do all I need, which means leaning on Python for more complex logic when required. Tried jq a few times and it did my head in too (yes, I'm a lazy idiot obviously - guilty as charged, because if I wasn't lazy I would write a JSON parser I could actually use!).

I naturally gravitate to the simplest solution with the most well-known and proven tools. Not a fan of boiling oceans or immersing in obscurity when there is a deadline to meet.

pointlessone · 2 years ago
> If I can’t do it with grep sed awk and cut and a for or while loop then maybe I’m not the right guy to do it. Oh sure, I could learn Python. But I don’t need Python on a daily basis like I do bash.

I get the sentiment but to me it looks like everything looks like a nail when you only have a hammer.

Those are fine tools for an ad-hoc one-liner. But if you're building something that doesn't fit into a line or two in the terminal, you're better off with a proper scripting language. I don't care what it is — Python, Perl, Ruby, JavaScript, whatever — anything's better than shell script. Shell is just too brittle, with too many gotchas and edge cases. It's extremely hard to write more than about 10 lines of shell script without bugs. It's even harder if you don't know your exact shell. The only exception is when you know your script has to be executed on a system that only has a shell on it and you absolutely can not install/require any other interpreter.
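
The classic kind of gotcha I mean (file name hypothetical):

    f='my file.txt'
    rm $f     # unquoted: the shell word-splits this into "my" and "file.txt"
    rm "$f"   # quoted: removes the file actually named "my file.txt"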

baz00 · 2 years ago
The trick here is to get the hell away from JSON quickly. I only ever use jq to turn JSON into text and then use my crusty old tools on that.
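
Usually a one-off flatten like this (fields made up), and from there it's all plain text:

    jq -r '.items[] | [.name, .size] | @tsv' data.json | awk -F '\t' '$2 > 1000 { print $1 }'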

If I’m working on my own thing I’ll use a text based format.

Although recently I parsed json out with head, tail and sed. It was numeric data nicely formatted in lines so it was easier to just remove all the semantic structure than actually parse it.