Readit News
einichi · 3 years ago
My number one requirement for a tool like this is that the JSON content never leaves the machine it's on.

I can only imagine the kind of personal information or proprietary internal data that has been unwittingly transmitted due to tools like this.

If my objective was to gain the secrets of various worldwide entities, one of the first things I would do is set up seemingly innocent Pastebins, JSON checkers, and online file format converters, and permanently retain all submitted data.

alias_neo · 3 years ago
Personal requirements aside (I have the same requirements), just using this would, at the very least, constitute misconduct at my place of work.

Yes, it's a cool-looking tool, but there are certain requirements that ignorance doesn't exempt us from.

My pet gripe is all of the seemingly local (open source) tools that phone home with opt-out metrics that aren't mentioned in the "getting started" docs and take some obscure flag to disable, which is just that little bit more complex to do when running the de facto (containerised) build.

Jenk · 3 years ago
> My pet gripe is all of the seemingly local (open source) tools that phone home with opt-out metrics that aren't mentioned in the "getting started" docs and take some obscure flag to disable, which is just that little bit more complex to do when running the de facto (containerised) build.

Exhibit A: DotNet! https://learn.microsoft.com/en-us/dotnet/core/tools/telemetr...
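
For what it's worth, the opt-out is an environment variable you have to know to set before you ever run the CLI; something like this in your shell or container env (bash syntax assumed):

    # opt out of .NET SDK telemetry for this shell session
    export DOTNET_CLI_TELEMETRY_OPTOUT=1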

briandear · 3 years ago
I worked at a $massive_tech_company_with_extreme_secrecy and using these tools was expressly forbidden because of the risk. Maybe one exists, but I would gladly pay $20 for a Mac app that could do all of this locally: like a Markdown Pro-type app but for JSON formatting and validation. I want to simply open the app, paste in some JSON and have it format it to my requirements (spaces/tabs/pretty/etc.).
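
(For plain reformatting I know I can already stay offline on the command line, roughly like the below with made-up file names, but I'd happily pay for a polished GUI that does it.)

    # pretty-print locally with jq, two-space indent (file names are just examples)
    jq --indent 2 . input.json > pretty.json

    # or with a stock Python 3.9+ install
    python3 -m json.tool --indent 2 input.json pretty.json
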
Mogzol · 3 years ago
Completely agree. I could actually get a lot of use out of a tool like this, but the fact that even the VSCode extension sends the JSON to their servers and opens it at a publicly accessible URL makes this a no-go for me. I wouldn't recommend anyone use this for any remotely sensitive data.
chii · 3 years ago
the extension apparently can be configured to use a locally running instance of the server. But yes, by default it uses the remote version, and thus you post the JSON publicly, which may or may not be acceptable depending on what you're doing.
eallam · 3 years ago
Eric here (one of the creators of JSON Hero) and this is a really good point. We built JSON Hero earlier this year and partly wanted to use it to try out Cloudflare Workers and Remix, hence the decision to store in KV and use that kind of architecture. We're keen to update JSON Hero with better local-only support for this reason, and to make it easier to self-host or run locally.
jessikat · 3 years ago
If the vscode extension did it all locally, I'd 100% install in an instant!
epaulson · 3 years ago
There are instructions in the readme to 'run locally' - are you saying that even that version (running on localhost:8787) is sending something back to y'all, either from the client in the browser or via the locally running server?

I was totally about to clone this repo and run it locally so I can play with some internal JSON.

T3RMINATED · 3 years ago
> If the vscode extension did it all locally, I'd 100% install in an instant!

DITTO
cryptonector · 3 years ago
Try WASM.
wohfab · 3 years ago
This reminds me of an "Online HTML Minifier" website that analyzed the text and included affiliate links for random words within the text.

And they operated for years, until someone noticed links on their own website that they hadn't added themselves and tried to figure out how it happened, because nobody else had access to the site.

(Will update with a link, if I find it.)

naan_bread · 3 years ago
I agree.

My tool flatterer (https://lite.flatterer.dev/) converts deeply nested JSON to CSV/XLSX and runs entirely in WebAssembly in the browser.

It's hard to prove that it isn't sending data to a server and therefore can be trusted. I know people could check dev tools, but that is error-prone and some users may not be able to do it.

I wish there was an easy way to prove this to users as it would make online tools like this much more attractive.

ljw1004 · 3 years ago
I think there is an easy way to prove this to users. Make your thing a single-page, self-contained HTML file which they save to their hard disk. Then they can trust the restricted permissions with which Chrome runs such local files.

If you have a tech-savvy audience, they can also view your thing in an iframe with only sandbox="allow-scripts" to prove that it's not making network requests.

I wrote an html/js log viewer with those security models: https://GitHub.com/ljw1004/seaoflogs - it handles up to 10k-line log files decently, all locally.

alpaca128 · 3 years ago
Would be nice to have the option to switch tabs into offline mode, just like we can mute them.
jspash · 3 years ago
Turn off wifi? Unplug the ethernet cable? Try it from my garden shed where there never seems to be connectivity no matter what I try.
veltas · 3 years ago
No Dave, you can't upload this export-controlled document to this web tool. I don't care how convenient it is.
plusminusplus · 3 years ago
There's an issue on the GitHub repo requesting a local version:

https://github.com/apihero-run/jsonhero-web/issues/134

kQq9oHeAz6wLLS · 3 years ago
Just set up an online HL7 or better yet CCDA parser and let the PHI roll in.
throw903290 · 3 years ago
Even more, it has to work completely offline! And if it makes ANY network calls, it is a huge red flag for some!

ottoflux · 3 years ago
100% literally came here to make sure someone said this.
syngrog66 · 3 years ago
Yep, if I were a Bad Guy with nation-state resources, I'd be salivating over trying to get "in" at JetBrains, GitHub and the like.
kwertyoowiyop · 3 years ago
All those free online .PSD utilities make my spidey-sense tingle.
2devnull · 3 years ago
Better yet, build an operating system and link it to the cloud.
aembleton · 3 years ago
If anyone wants to try it out but doesn't want to send them your JSON, here's an example with some real-world data: https://jsonhero.io/j/t0Vp6NafO2p2

For me, this is harder to use than reading the JSON in a text editor with syntax colouring, such as VSCode. I'm getting less information on the page, and it's harder to scan, but that might be because I'm used to reading JSON.

lampe3 · 3 years ago
Thanks for the link!

And yes, I feel the same. For me it's also easier to read it either in raw form or in VSCode.

_____-___ · 3 years ago
Yeah, it looks cool, but anything I'd use it for I'd use jq for and much, much faster.
tfsh · 3 years ago
See also jsoncrack [1], which visualises JSON as n-ary tree data structures.

This project takes a different approach, in that it displays JSON leaf data in a more human way, e.g. showing a colour picker for hex colours or a date picker for dates.

What sets this tool apart, however, is the static analysis of the JSON data, which can uncover divergences or outliers in the data: e.g. a single null value somewhere, or data which deviates from the majority data type (i.e. a number where every other value is a string).
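
(Though the lone-null case you can approximate with jq alone, if you already know to go looking; rough sketch, file name made up:)

    # print the path to every null value in the document
    jq -c 'paths(. == null)' data.json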

I think there's a value proposition in edge-case detection alone. Datasets can be massive, and with something like JSON there is no formal type verification. Although, to be honest, I don't see a valid reason to use JSON as a backend given that graph-based databases with type-safe schemas exist.

1: https://news.ycombinator.com/item?id=32626873

rurban · 3 years ago
jsoncrack cannot even open the simplest of my JSON files (600K: too large), whilst this handled it easily.
vincnetas · 3 years ago
what is the use case for having 600K (lines? bytes?) JSON? I'm a bit shocked and curious at the same time :)
mvindahl · 3 years ago
Tried it out on some REST response from a local test server.

And, well, as much as I applaud the effort, I also think that I'll stick to my text editor for browsing JSON data and to jq for extracting data from it.

My text editor because it's easy to perform free-text search and to fold sections, and that's all that I need to get an overview.

Jq because it's such a brilliantly sharp knife for carving out the exact data that you want. Say I had to iterate a JSON array of company departments, each with a nested array of employees, and collect everyone's email. A navigational tool doesn't help a whole lot, but it's a jq one-liner. Jq scales to large data structures in a way that no navigational tool ever would.
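
(Something like this, with made-up field names, assuming the root is that array of departments:)

    # collect every employee's email across all departments into one array
    jq '[.[].employees[].email]' departments.json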

Also, there is the security issue of pasting potentially sensitive data into a website.

alin23 · 3 years ago
Also check out jqp (jq REPL) for when you need a few tries to get the right jq selector: https://github.com/noahgorstein/jqp

Looks a bit like fzf combined with jq.

irrational · 3 years ago
The first thing I see when I go to the site: JSON SUCKS

Uh... It does? I remember when XML was the main data interchange format of the web. That sucked. JSON is amazing, terrific, wonderful, etc. in comparison.

KronisLV · 3 years ago
> I remember when XML was the main data interchange format of the web. That sucked.

I wonder why - apart from the "Should this be an element or an attribute?" issues and oddities in various implementations, XML doesn't seem like the worst thing ever.

Actually, in a web development context, I'd argue that the WSDL used with SOAP was superior to how most people worked with REST (and how some still do), since it's taken OpenAPI years to catch up and codegen is still not quite as widespread, despite notable progress: https://openapi-generator.tech/

What does leave a sour taste, however, is the fact that configuration turned into XML hell (not in a web context, but for apps locally), much like we have YAML hell nowadays, and that being able to lean on codegen absolved people of the need to pay much attention to how intuitive their data structures are.

That said, JSON also seems okay, and it being simpler is a good thing. Though personally I feel JSON5 addresses a few things that some might find missing: https://json5.org/ (despite it being a non-starter for many, due to limited popularity/support).

orthoxerox · 3 years ago
Namespaces. I know why they were introduced, but they still were an incredible pain to use, especially with SOAP. You want to pass a <Customer> to the update method? No, it must be <Customer xmlns="http://example.com/api/customers/v2"> that is wrapped in a <soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">.

Oh, you're writing a service? You can't just XPath your way to that <Customer>, because it's a namespaced <Customer>: your XML parser will claim there's no <Customer> in the message until you register the namespaces "http://example.com/api/customers/v2" and "http://www.w3.org/2003/05/soap-envelope" and look for /soap:Envelope/soap:Body/c:Customer instead.

JSON is annoyingly anal about its commas, but at least it has a single global namespace and I have never encountered a situation where I wished I could disambiguate between two different "customer" objects in my JSON payload.

irrational · 3 years ago
For me, it is because I can do JSON.parse() and boom, it is plain JavaScript objects, arrays, strings, etc. XML was never that simple.
IshKebab · 3 years ago
It's not the worst ever (that would be YAML) but it does have an accumulation of annoying features.

* Elements and attributes (as you said).

* Text children mixed up with elements. These two are both good for writing documents by hand (e.g. HTML) but really annoying to process.

* Namespaces are frankly confusing. I understand them now but I didn't for years - why is the namespace a URL but there's nothing actually at that URL? 99% of the time you don't even need namespaces.

* The tooling around XML is pretty good but it's all very over-engineered just like XML.

* The syntax is overly complicated and verbose. Repeated tag names everywhere. Several different kinds of quoting.

* XML schema is nice but it would be good if there was at least some support for basic types in the document. The lack of bool attributes is annoying, and there's no standard way to create a map.

JSON is better by almost every metric. It is missing namespaces but I can't think of a single time I've needed that in JSON. Mixing up elements from different schemas in the same place is arguably a terrible idea anyway.

The only bad things about JSON are the lack of comments and trailing commas (which are both fixed by JSON5) and its general inefficiency.

The inefficiency can sometimes be solved by using a binary JSON-style format, e.g. CBOR or Protobuf. With very large documents I've found it better to use SQLite.

998244353 · 3 years ago
It's always strange to think that we went through formats like XML (and even earlier, XDR) before inventing something as seemingly simple and obvious as JSON.
eru · 3 years ago
We had S-Expressions before we had JSON. (And JavaScript originally wanted to be a Lisp, too.)

It's not that we had XML and SGML and XDR because nobody had yet invented something as simple as JSON. The real reason is some complicated social hodgepodge that made those complicated beasts more accepted than the already-invented simpler approaches.

simplotek · 3 years ago
> It's always strange to think that we went through formats like XML (and even earlier, XDR) before inventing something as seemingly simple and obvious as JSON.

It's my understanding that JSON was not invented. It's just the necessary and sufficient parts of JavaScript to define data structures, and could be parsed/imported in a browser with a call to eval().

People who complain about JSON completely miss the whole point. It's not that it's great or trouble-free, it's that it solved the need to exchange data with a browser without requiring any library or framework.

s3000 · 3 years ago
Why does React use JSX when it could be all JavaScript and JSON?

My guess is that XML is good for situations where text and data are mixed.

stevoski · 3 years ago
Indeed. I’m (sorta) old and cranky and even I love JSON as a data interchange format. Because I had to use XML for years, including SOAP.
eallam · 3 years ago
Yea I really wanted to write "JSON is worse is better" because JSON is great and simple. We really just mean "reading complicated JSON files sucks"
paco3346 · 3 years ago
So why not say that? When I see a tool or service make a claim as simple as "JSON SUCKS" it makes me think I'm not the target audience.

I can read a 10-line file without a parser. What I don't like is something 7 layers deep, nested, and 890 lines long.

andy800 · 3 years ago
If we're talking JSON document visualization, it's essential to point out this brilliant page:

https://altearius.github.io/tools/json/index.html

Was formerly hosted at http://chris.photobooks.com/json

aembleton · 3 years ago
Fantastic, that is far more useful
kondro · 3 years ago
I like it, but don't love that it's a web app.

I guess I could fork it myself, but I don't particularly want to have to run a web app to browse JSON either.

I wonder how easy it would be to port to Electron.

tfsh · 3 years ago
Forking perfectly functional browser-based web apps into Electron apps is an irritating trend with very limited benefits. Some apps exist as Electron apps because they require native OS access; this app does not, so there is no reason to do so.

Porting to Electron would be trivial but in doing so you incur the following ramifications:

- the user has yet another instance of Chromium running on their device.

- they can't interact with browser-based UIs as easily any longer (bookmarking, retaining in history, copying the URL, different cookie/login jars, etc.)

- it might fragment the user's workflow even more if they have to switch between Electron apps and their browser

- lack of user extensions and some important accessibility features

In Chrome you can create a shortcut for the page and select "open in a new window", which by and large emulates the workflow you request. I'm sure there's a similar process for Firefox.

nemosaltat · 3 years ago
Interestingly enough, when I want the compartmentalization experience that comes from an “App” on macOS, I turn to… Microsoft Edge. Edge has a nifty little feature that lets you “Appify” a website. I mostly find this useful for company-required PWAs and, most of all, Microsoft Teams. The Edge-“Appified” MS Teams on macOS is leaps and bounds more performant than the “native” (Electron) MS Teams app on macOS (consuming ~25MB of memory vs ~800MB). It has the nice benefit of your “Apps” being a Command-Space away.

edit: clarify & format

cobertos · 3 years ago
I think forking and hosting it yourself is at least a good middle ground short of forking it into an Electron app.

Who wants a tool they rely on to one day update with spyware, start returning HTTP 404, or get filled with ads (like Toptal did with keycode.info)?

zarzavat · 3 years ago
Browsers should make it easier to interact with the command line, then. I want to be able to run a command and have it open the result in a browser tab.
bakugo · 3 years ago
You are aware that Electron is just a stripped down browser, right?

joeyjojo · 3 years ago
It just opens your JSON in a publicly accessible page at https://jsonhero.io/
Timpy · 3 years ago
I get a lot of useful information from reading raw URLs; the exact thing they're advertising here is the thing I hate most about Jira. I even wrote a Chrome extension that prevents Jira from loading smart links because I hate it so much. I can't imagine ever wanting a YouTube video preview while I'm skimming JSON data.