I know pip has its flaws with not-so-great package and dependency management. But does that warrant a whole new tool? Or does it warrant fixing the existing tool?
I wonder why this kind of behavior exists only in the JavaScript community. Could it be that the vast majority of JavaScript developers are really in high school? Are there any good sources of stats on this?
Pretty much all of them.
If you redefine the law like that, then sure, I agree that there are many uniform distributions, too, where the 1st digit is likely to be small. Here is another simple example: consider the distribution of positive integers from 1 to 2. If we pick a number at random from {1, 2}, then the 1st digit is likely to be small. This kind of analysis is boring.
But (fortunately!) that's not what Benford's law says. Benford's law provides a specific formula. Check https://en.wikipedia.org/wiki/Benford%27s_law#Definition to see the exact formula that must hold for a set of numbers to be said to obey Benford's law. That's what makes Benford's law so interesting, whereas your example ranges are degenerate cases where nothing new, surprising, or interesting is going on.
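For reference, the formula in that section is P(d) = log10(1 + 1/d) for leading digit d = 1 through 9. Here's a tiny Python snippet (my own, just to show the predicted proportions):

    import math

    # Benford's law prediction for the leading digit d = 1..9
    for d in range(1, 10):
        p = math.log10(1 + 1 / d)
        print(f"P(first digit = {d}) = {p:.3f}")

    # Prints roughly 0.301, 0.176, 0.125, 0.097, 0.079, 0.067, 0.058, 0.051, 0.046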
The law states that in many naturally occurring collections of numbers, the leading significant digit is likely to be small.
I have explained why that happens for the vast majority of UNIFORMLY DISTRIBUTED VARIABLES.
The vast majority. That implies that there is a collection of all possible uniformly distributed variables, and in particular those that are sampled from real world processes.
As long as they are uniformly distributed, with 0 as the minimum and M as the maximum, smaller first digits will appear more commonly than larger ones.
I explained it several times. Why are you still insisting that statements about the MAJORITY of uniform distributions are weird?
Yes, statements about collections of uniform distributions are not statements about ONE SPECIFIC uniform distribution. And?
How does that work? Who can explain this to me?
It is trivial to see that literally any range with min = 0 and max = any number other than a power of 10 makes it LESS likely that a 9 will come up as the first digit. For example, the range 0-300 has 1 and 2 come up as the first digit way more often than the rest. Don’t you think the same is true of 0-30000 and 0-300000000000000000000000? The size of the range doesn’t make your assertion, that for large ranges every leading digit begins to have an equal chance of appearing, any more true.
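If anyone wants to check that quickly, here's a throwaway Python snippet (my own) that counts leading digits over 1..300:

    from collections import Counter

    def first_digit(n):
        return int(str(n)[0])

    # Leading-digit counts for the integers 1..300 (0 has no leading digit)
    counts = Counter(first_digit(n) for n in range(1, 301))
    print(counts)
    # Counter({1: 111, 2: 111, 3: 12, 4: 11, 5: 11, 6: 11, 7: 11, 8: 11, 9: 11})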
My point is that, given a uniform distribution from 0 to some max, that max has to be somewhere. If we assume the max itself is uniformly distributed, then we derive the proportions you find in Benford’s law.
Look, to put it another way: Benford’s law comes from the numbers that have the same number of digits as the max. The rest are evenly distributed, but those numbers are the most numerous at that point, and they are what produce the phenomenon. Ok?
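Here's a quick Monte Carlo sketch of what I mean (my own construction, and only one possible reading of "the max is uniformly distributed"; the 10^6 cutoff and sample count are arbitrary choices). Anyone can run it and see how close the resulting leading-digit frequencies come to Benford's values:

    import math
    import random
    from collections import Counter

    random.seed(0)

    def first_digit(n):
        return int(str(n)[0])

    # Draw a max M uniformly from 1..10**6, then draw a value uniformly
    # from 1..M, and record the value's leading digit.
    counts = Counter()
    for _ in range(200_000):
        m = random.randint(1, 10**6)
        counts[first_digit(random.randint(1, m))] += 1

    total = sum(counts.values())
    for d in range(1, 10):
        print(f"d={d}: simulated {counts[d] / total:.3f} vs Benford {math.log10(1 + 1 / d):.3f}")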
Are you convinced?
PS: There has got to be someone who figured this out before 2020. Come on. Someone post a link to this derivation.
It is impossible on Hacker News for a new green account with less than 500 points to downvote someone else.
When the max is uniformly distributed, Benford’s law emerges. I mean, all you have to do is read the link where I derive it.
What exactly is the law? Please don’t handwave. If the law is those exact point values mentioned in the article, then I just showed you how we arrived at them.
What does "max is uniformly distributed" even mean? If you think that the Benford's law holds good for a set of uniformly distributed numbers, why not simply provide that set? It would be so easy to prove your claim if you just provide an example set of numbers that obeys Benford's law.
None of the sets of numbers you have presented so far (0-300, 0-30000, 0-300000000000000000000000) follows Benford's law. It is very simple to show. In all these sets, the probability that the first digit is 1 equals the probability that the first digit is 2, which contradicts Benford's law.
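To make the contradiction concrete, here's a short Python check (my own throwaway script) comparing the leading-digit frequencies of the integers 1..30000 with what Benford's formula predicts:

    import math
    from collections import Counter

    def first_digit(n):
        return int(str(n)[0])

    # Leading-digit frequencies for the uniform set 1..30000,
    # side by side with Benford's predicted probabilities
    counts = Counter(first_digit(n) for n in range(1, 30001))
    total = sum(counts.values())
    for d in range(1, 10):
        print(f"d={d}: uniform {counts[d] / total:.3f} vs Benford {math.log10(1 + 1 / d):.3f}")

    # Digits 1 and 2 both come out near 0.370, while Benford's law
    # predicts 0.301 for 1 and 0.176 for 2.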
Which programming languages' de facto thread implementations are not wrappers around pthreads? I think Go has its own thread implementation? Or am I mistaken?