Also, given the choice between optimising for 21 x 21 modules or something a lot larger, what is recommended in this day and age? Is blocky best?
So a QR code with a high error-correction level can still be read even when a logo obliterates the middle of the code.
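A rough back-of-envelope way to think about it: the QR spec's error-correction levels can recover roughly 7% (L), 15% (M), 25% (Q), or 30% (H) of the codewords, so you can estimate how big a centered logo the budget tolerates. This sketch treats the budget as a fraction of raw area, which is an oversimplification (real recovery is over Reed-Solomon codewords, and finder/timing patterns aren't redundant), so keep logos well under the nominal limit:

```python
# Nominal recovery fractions per the QR error-correction levels.
# NOTE: area-based heuristic only -- the real budget is measured in
# Reed-Solomon codewords, not raw modules.
EC_RECOVERY = {"L": 0.07, "M": 0.15, "Q": 0.25, "H": 0.30}

def max_logo_side(modules: int, level: str) -> int:
    """Largest centered square logo (in modules) that stays under
    the nominal error-correction budget for an n x n code."""
    budget = EC_RECOVERY[level]
    side = 0
    while (side + 1) ** 2 / modules ** 2 <= budget:
        side += 1
    return side

# Compare a minimal 21x21 (version 1) code against a larger version 10.
for label, n in [("version 1", 21), ("version 10", 57)]:
    print(label, f"({n}x{n}), level H -> logo up to",
          max_logo_side(n, "H"), "modules wide")
```

By this heuristic a 21×21 code at level H tolerates roughly an 11-module logo (~27% of the area), which is one argument for larger versions if you want both a logo and real dirt/damage tolerance left over.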
https://www.eaa.org/eaa/news-and-publications/eaa-news-and-a...
In 2012 I created a killer prototype that demonstrated that you could accurately reconstruct most people's flight history at scale from social media and/or ad data. Probably the first of its kind. This has been possible for a long time.
A quick sketch of how it worked:
We filtered out all spatiotemporal edges in the entity graph with an implied speed below 300 kilometers per hour or a distance under 200 kilometers, IIRC. What remained was the proxy for "was on a plane", and each edge implicitly provided the origin and destination.
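The filter step can be sketched in a few lines. Everything here is illustrative: the edge representation, thresholds, and function names are my assumptions, not the original code.

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def looks_like_flight(edge, min_speed_kmh=300.0, min_dist_km=200.0):
    """Keep an edge only if its implied speed and distance are
    plausible for commercial air travel.
    edge = (lat1, lon1, t1, lat2, lon2, t2), t in UNIX seconds."""
    lat1, lon1, t1, lat2, lon2, t2 = edge
    dist = haversine_km(lat1, lon1, lat2, lon2)
    hours = abs(t2 - t1) / 3600.0
    if hours == 0:
        return False
    return dist >= min_dist_km and dist / hours >= min_speed_kmh
```

For example, two observations 5.5 hours apart at SFO and JFK imply ~750 km/h over ~4100 km and pass the filter; a same-day hop across a metro area fails on both thresholds.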
These edges can be correlated with both public flight data and maintenance IoT data from jet engines to put entities on a specific flight. People overlook the extent to which innocuous industrial IoT data can be used as a proxy for relationships in unrelated domains.
In rare cases, there was more than one plausible commercial flight. Because we had their flight history, we assumed in these cases that it was the primary airline they had used in the past, either generally or for that specific origin and destination. This almost always resolved perfectly.
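That tie-breaking rule amounts to a two-level preference: the airline the entity has flown most on this specific route, falling back to their most-flown airline overall. A minimal sketch, with data structures of my own invention:

```python
from collections import Counter

def pick_flight(candidates, history, origin, dest):
    """Resolve an ambiguous edge to one flight.
    candidates: [(airline, flight_number), ...] plausible flights
    history:    [(airline, origin, dest), ...] the entity's past flights
    Prefers the airline flown most on this route, then most overall."""
    route_counts = Counter(a for a, o, d in history if (o, d) == (origin, dest))
    overall_counts = Counter(a for a, o, d in history)

    def score(cand):
        airline, _ = cand
        # Lexicographic: route-specific loyalty first, general loyalty second.
        return (route_counts[airline], overall_counts[airline])

    return max(candidates, key=score)
```

So an entity with two past UA flights on SFO-JFK gets matched to the UA candidate on that route even if they fly AA more overall, which mirrors the "either generally or for that specific origin and destination" logic described above.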
This was impressively effective and it didn't require first-party data from airlines or particularly sophisticated analytics. Space and time are the primary keys of reality.
It is, and they are. It’s why Reagan fired ATC strikers and blackballed them. It’s why private enterprise stockpiled machine guns and chemical weapons against strikers back in the Gilded Age. It’s why companies will spend billions to block Unions rather than give workers the few million or so more they need over a decade just to maintain a standard of living. It’s why they’ll close stores and warehouses, offshore jobs, and outsource to contractors to penalize Unions.
Unions are a direct response to the inequality of Capital allocation and distribution.
It's just one more erratum in a language that's filled with horrible hacks from centuries of iterative development.
My hill to die on would be exactly one way (NOT the funky dictionary way!) of spelling words exactly as they should be pronounced, and reading them back the same way.
The hill-to-die-on part of that is they need to start with children: teach them ONLY the correct way of spelling words as used in school, and stick to it. While we're at it, FFS, do the metric measures conversion the same way. Force it cold turkey, and bleed in dual measures and spellings with a cutover plan that makes the new correct way the required larger text by the time the grade -2 kids graduate. (So about a 14-15 year plan.) That's to give all us adults time to bash the new spellings for old words into our heads too.
Why can't it be dictionary spelling? Offhand: 1) those phonetics aren't used quite like that anywhere else, and 2) those phonetics are more strongly based on the other languages of Europe, so the structure isn't what English speakers expect. I'd sooner force everyone to learn how to write TUNIC's shapes... though there are some coverage issues with that.
Effectively, I want different shapes for the chart ( https://en.wikipedia.org/wiki/International_Phonetic_Alphabe... ) that DO NOT MATCH EXISTING ENGLISH LETTERS, so that when I look at a 'new spelling' my old pronunciation-programmed brain doesn't index the wrong lookup table.