But the nm sizing is so baked into the culture that it didn't take off.
Angstrom is kind of a fun word, so I’m sure a few nodes will be named after that.
It’s a marketing number and has been for many years.
EDIT: Also, "I know it was a challenge to review a book of its size..." comes off as insinuating that (1) the book is somehow "grand" and (2) maybe the reviewer didn't "get it".
Edit: just under 1200 pages on Amazon. I never got into it because I couldn’t figure out what the big revelation was supposed to be. It would take some serious dedication to go through such a large book for the sake of an unfavorable review.
I take Wolfram’s words at face value.
I understand the masses' need for ideas that are obviously practical.
Stephen Wolfram is more of an explorer, and he is documenting phenomena that I don't see anyone else documenting, because everyone else is so teleological.
I think we need to cut researchers doing this kind of original, non-teleological research some slack.
I don't understand why people find him "insufferable".
From what I gather from other people’s comments, they are often bothered by his apparently pervasive discussion of himself and his life.
I’ve never met the man, but the few interviews with him that I’ve seen or read were pretty interesting.
This may be the first HDL I've seen that attempts to move the needle on catching bugs at compile time. (I've worked with several engineers, on hardware bugs which turned out to be pipelining errors, who did not understand what I meant by "make this design error inexpressible.") I have several pages of notes on what I'd do differently if I designed my own HDL - the typical software engineer hubris - and this is the first language I've seen that starts to line up with what I was thinking.
Another perennial source of bugs is crossing clock and reset domains. The language ought to make it so that you simply can't commit many kinds of clock domain errors: trying to read a signal from the wrong clock domain shouldn't compile. Dedicated "unsafe" primitives from a stdlib should perform the crossing and the type conversion.
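For what it's worth, the compile-time guarantee described above resembles what phantom types give you in software. A minimal Rust sketch of the idea, with every name invented for illustration (this is not any real HDL's API):

```rust
// Sketch: tag each signal with a phantom clock-domain type so that
// reading it from the wrong domain is a type error, not a runtime bug.
use std::marker::PhantomData;

// Marker types standing in for two clock domains.
struct ClkA;
struct ClkB;

// A signal carries its clock domain in the type, at zero runtime cost.
struct Signal<D> {
    value: bool,
    _domain: PhantomData<D>,
}

impl<D> Signal<D> {
    fn new(value: bool) -> Self {
        Signal { value, _domain: PhantomData }
    }
}

// Logic clocked in domain A accepts only ClkA signals; handing it a
// Signal<ClkB> fails to compile.
fn and_gate_in_a(x: &Signal<ClkA>, y: &Signal<ClkA>) -> Signal<ClkA> {
    Signal::new(x.value && y.value)
}

// The only way across domains is an explicit crossing primitive
// (standing in for, say, a two-flop synchronizer from a stdlib).
fn synchronize_a_to_b(x: &Signal<ClkA>) -> Signal<ClkB> {
    Signal::new(x.value)
}

fn main() {
    let a1 = Signal::<ClkA>::new(true);
    let a2 = Signal::<ClkA>::new(false);
    let _anded = and_gate_in_a(&a1, &a2);   // fine: both in domain A
    let b = synchronize_a_to_b(&a1);        // explicit, visible crossing
    // and_gate_in_a(&a1, &b);              // rejected: wrong clock domain
    println!("crossed value: {}", b.value);
}
```

The point is that the crossing is still possible, but only through a named primitive you can grep for and review, which is roughly what the "unsafe stdlib primitives" idea amounts to.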
I’m all in favor of a better HDL. Verilog/SystemVerilog is loaded with completely non-obvious landmines. I’ve been doing this so long I forget they are there, but it’s pretty painful to watch someone new to the language step on them. And the alternative, VHDL, has largely fallen out of favor in the US.
You would be hard pressed to find a more strongly typed language than VHDL, but damn is it verbose. None of the footguns, but you might get RSI before you finish typing the code in. If you have ever given Ada a try, VHDL will look pretty familiar.
I know this may be a weird thing for software folks to think about, but writing HDL is a tiny part of digital design. Done with discipline, writing the HDL is an almost mechanical process of translating the design. In a design that might take a year, writing the code might take three weeks.
Done without discipline, you will spend all your time debugging: wondering why it worked in the lab an hour ago, but nothing works after lunch and you can’t make sense of it.
Understanding basic combinational logic, then sequential logic, followed by state machines (which are the bread and butter of digital design), followed by IO timing and timing constraints (a brain-damaged “language” unto itself) will take you far.
Domain crossing isn’t so bad if you have those fundamentals.
Then you can spend your time learning algorithms and other more interesting things, like writing low-power accelerators for neural nets and signal processing.
You can go through all the gyrations of language design in the world, but the language isn’t the hard part. There is a huge amount of improvement to be done, no doubt. But digital design is not the language.
If you want to make the world of digital design a better place, more open, easier to break into, work on tools, not languages. I’d give a kidney for an open source timing diagrammer that could do simple setup and hold checks, create derived signals through Boolean combinations of other signals, and emulate a flop.
I’d do it myself, but I’ve tried, and programming a GUI is about the most painful thing I’ve done on a computer. So much work for so little payoff.
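Setting the GUI aside, the checking engine underneath such a diagrammer is small. A hedged Rust sketch of one possible core, with waveforms as edge lists, derived signals via Boolean combination, and a D flop that flags setup/hold violations (all structures, names, and numbers here are invented, not from any existing tool):

```rust
// Sketch of a timing-check core: waveforms, derived signals, flop emulation.

struct Waveform {
    // (time_ns, value) transitions, sorted by time.
    edges: Vec<(f64, bool)>,
}

impl Waveform {
    // Value of the signal at time t (false before the first edge).
    fn value_at(&self, t: f64) -> bool {
        let mut v = false;
        for &(et, ev) in &self.edges {
            if et <= t { v = ev; } else { break; }
        }
        v
    }

    // Derived signal: a Boolean combination of two waveforms,
    // re-evaluated at every transition of either input.
    fn combine(&self, other: &Waveform, op: fn(bool, bool) -> bool) -> Waveform {
        let mut times: Vec<f64> =
            self.edges.iter().chain(&other.edges).map(|e| e.0).collect();
        times.sort_by(|a, b| a.partial_cmp(b).unwrap());
        times.dedup();
        let edges = times.iter()
            .map(|&t| (t, op(self.value_at(t), other.value_at(t))))
            .collect();
        Waveform { edges }
    }

    // Times of rising edges (for use as a clock).
    fn rising_edges(&self) -> Vec<f64> {
        let mut prev = false;
        let mut out = Vec::new();
        for &(t, v) in &self.edges {
            if v && !prev { out.push(t); }
            prev = v;
        }
        out
    }
}

// Emulate a D flop and report violations: D must be stable for `setup` ns
// before and `hold` ns after each rising clock edge.
fn flop(clk: &Waveform, d: &Waveform, setup: f64, hold: f64)
    -> (Waveform, Vec<String>)
{
    let mut q_edges = Vec::new();
    let mut violations = Vec::new();
    for t in clk.rising_edges() {
        for &(dt, _) in &d.edges {
            if dt > t - setup && dt < t {
                violations.push(format!("setup violation at {} ns", t));
            }
            if dt >= t && dt < t + hold {
                violations.push(format!("hold violation at {} ns", t));
            }
        }
        q_edges.push((t, d.value_at(t)));
    }
    (Waveform { edges: q_edges }, violations)
}

fn main() {
    // 10 ns clock, with a D transition 0.2 ns before the second rising edge.
    let clk = Waveform { edges: vec![(0.0, true), (5.0, false), (10.0, true)] };
    let d = Waveform { edges: vec![(1.0, true), (9.8, false)] };
    let gated = clk.combine(&d, |a, b| a && b); // derived signal: clk AND d
    println!("gated edges: {:?}", gated.edges);
    let (q, violations) = flop(&clk, &d, 0.5, 0.5);
    for v in &violations { println!("{}", v); }
    println!("q edges: {:?}", q.edges);
}
```

A real tool would need ternary X/Z states, per-edge uncertainty windows, and rendering, but the checking itself is a weekend's worth of logic; as the comment above says, the pain is all in the GUI.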