Different checksum algorithms can provide better error detection for specific channel error models (potentially even with fewer bits). Non-cryptographic checksums are typically designed around particular failure models, such as a burst of corrupted bits, trading off what they do and don't detect to better match the corruption expected in the data they protect.
For example, if you know that there will be at most one bit flip in your message, a single-bit checksum (a parity bit) is sufficient to detect that an error occurred, regardless of your message size. (Note that this is an illustrative example only: in practice, messages tend to have a certain number of errors per so many message bits, so the expected number of errors grows with the size of the message.)
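The parity-bit idea can be sketched in a few lines (a toy illustration, not any particular protocol):

```python
# Toy parity check: one extra bit detects any odd number of bit
# flips (in particular, exactly one), regardless of message size.

def parity(bits):
    """Return 0 if the number of 1-bits is even, else 1."""
    return sum(bits) % 2

message = [1, 0, 1, 1, 0, 0, 1]
check = parity(message)           # sender appends this bit

# Receiver side: flip one bit in transit to simulate corruption.
received = message.copy()
received[3] ^= 1

print(parity(received) == check)  # False -> single-bit error detected
```

Note what it *doesn't* catch: two flips cancel out, which is exactly the kind of trade-off against a channel model the comment above describes.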
Important real-life facts:
There was no "give-me-an-appropriate-hash" function.
There was:
md5sum yourfile.txt
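For what it's worth, the programmatic equivalent of that one-liner is about as simple (a sketch using Python's stdlib; the chunk size is arbitrary):

```python
import hashlib

def md5_of_file(path):
    """Equivalent of `md5sum path`: hex digest of the file's bytes."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

Which is the point: md5 won as the "give-me-an-appropriate-hash" function because it was one command (or a few lines) away, everywhere.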
Nobody wants to think about a "channel's bit error distribution" in a non-security-critical context. In fact, it's irrelevant, and possibly a usability issue.
Then why use a cryptographic hash at all? There are much better hashes out there that only strive for distribution/avalanche behavior.
https://en.wikipedia.org/wiki/Non-cryptographic_hash_functio...
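One of the simplest members of that family is FNV-1a; here's the 64-bit variant as a sketch (the offset basis and prime are the standard published constants):

```python
# FNV-1a, 64-bit: a classic non-cryptographic hash that aims only
# for speed and good bit dispersion (avalanche), not security.
FNV_OFFSET = 0xcbf29ce484222325
FNV_PRIME = 0x100000001b3

def fnv1a_64(data: bytes) -> int:
    h = FNV_OFFSET
    for byte in data:
        h ^= byte
        h = (h * FNV_PRIME) & 0xFFFFFFFFFFFFFFFF  # keep 64 bits
    return h
```

A few lines, fast, and single-character changes scatter the output; it just offers zero resistance to someone deliberately crafting collisions.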
Sure, there are better non-cryptographic hashes, but, again, the concern of lawyers and genomics folks is neither security nor efficiency: simplicity and "works most of the time" are the two metrics at stake.
If either lawyers or genomics folks cared about document forgery of this nature (spoiler: they don't), they would move to something like SHA3. If they had a need for highly scalable hash algorithms (spoiler: they don't), they would switch to another, faster algorithm.
This is a concept I gather security folks struggle with: sometimes we _just don't care_. And we never should.
Maybe something a struggling security enthusiast could understand: a video game.
If you implement e.g. a Caesar cipher, you can have a fun, accessible puzzle. Implementing AES in your game as a puzzle, while much harder, fails desperately on the "accessibility" metric. In your single-player game, if you want to display some "identifying hash", an md5 one is enough. No, you should not worry about people forging documents for your ad-hoc identification system if you don't have people attempting to forge in-game items. Maybe it's even a feature that you need to forge such a hash, as a way to solve a puzzle.
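The Caesar-cipher puzzle mentioned above fits in a dozen lines (the shift of 3 and the clue text are just illustrative):

```python
def caesar(text: str, shift: int) -> str:
    """Shift letters by `shift` positions; leave everything else alone."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

clue = caesar("the key is under the mat", 3)
print(clue)              # "wkh nhb lv xqghu wkh pdw"
print(caesar(clue, -3))  # players "solve" it by shifting back
```

Trivially breakable, which is exactly why it works as an in-game puzzle: the player *is* the attacker, and the attack is the fun part.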
Sorry for that; I have to context switch when I talk to people outside physics, and I always forget that. Also, the context of the convo was definitely about QCs breaking encryption, so my bad.
There is the cost to consider, yes, and there is also an energy cost to a stable QC system. Asymmetric/symmetric schemes are not unbeatable; breaking them just has an energy cost. Shor's algorithm is theoretically great, but rarely if ever have I seen an associated energy cost. Even beyond the answer to "will we build one", the question is "can you efficiently build one", i.e. what does a QC capable of executing Shor's algorithm look like, a small planet or star, perhaps?
Could I boost it to 2-5x the rate with a simple JS script? Sure. However, I figured rate limits were in place, and that wasn't the spirit of...whatever this is.
Reminds me of Peter Molyneux's Curiosity: https://en.wikipedia.org/wiki/Curiosity%3A_What%27s_Inside_t...