Readit News
dsp1234 commented on Goto and the folly of dogma (2018)   manybutfinite.com/post/go... · Posted by u/luu
maxxxxx · 6 years ago
I hope you are kidding. But yes let’s worry about injection for even the simplest things. Maybe we should start with language injection where you write code and later on you inject a different language. That would be the ultimate maintainable system.
dsp1234 · 6 years ago
> language injection where you write code and later on you inject a different language.

I (still) work with a Classic ASP code base.

I kid you not, that there are VBScript functions, which call JScript (not JavaScript) functions, which in turn call VBScript functions.

It is a terrifying and glorious mess.

edit:

I also worked with a system at one time that used node.js to create C# files on the file system, then invoked the C# compiler to build an executable, and then ran it. It was... not great.

dsp1234 commented on CPU.fail   cpu.fail/... · Posted by u/razer6
dsp1234 · 6 years ago
The blog post is buried a bit deep, but it has the actual technical information on the topic:

https://www.cyberus-technology.de/posts/2019-05-14-zombieloa...

dsp1234 commented on Dear Client, Here’s Why That Change Took So Long   simplethread.com/dear-cli... · Posted by u/jetheredge
commandlinefan · 6 years ago
What gets me is that software has been a mainstay of modern business for _at least_ 30 years. And this whole time, every single professional software developer has been telling every single non-software developer the exact same thing, over and over: this takes longer to do than you think. If, say, 80% of developers were knocking things out, problem free, in an hour or two and the other 20% were hanging back like a 50’s union boss saying, “yeah, that’s going to be an all-day job easy”, then maybe I could understand why they STILL think we’re lying. Even if 20% of the devs could get things done in the time they seem to think it takes and the other 80% were hemming and hawing I could still understand this perspective. But that’s not the ratio. 0% of devs can reliably complete tasks in the time that MBA’s seem to think it should take and 100% of devs take longer than they “wish” it would take and THEY STILL AREN’T PAYING ATTENTION.
dsp1234 · 6 years ago
This isn't all on the non-developers. I've walked into two rooms this week, each containing several senior developers, and when I asked them about non-functional requirements, they had to ask what those were, and why they were important.

I've also seen teams where the Definition of Done doesn't include any steps at all towards deployment. Done is when someone approves the pull request.

Not surprisingly, it takes longer for those teams to 'complete' a change. In the former case, they are continuously surprised, and angered, by 'requirements' that 'no one' told them about. In the latter, they stop halfway, and wonder why everyone is waiting on them, because it's 'development complete'.

dsp1234 commented on India will soon overtake China to become the most populous country in the world   ourworldindata.org/india-... · Posted by u/okket
nickelcitymario · 6 years ago
This graph is awful and misleading. (The article is fine, I think, but the graph is ridiculous.)

A cursory look would imply that each color segment increments population count in some sort of consistent and logical way, but it doesn't.

The increments are like so:

0 to 1 million

1 million to 5 million (5x increase)

5 million to 10 million (2x)

10 million to 20 million (2x)

20 million to 50 million (2.5x)

50 million to 100 million (2x)

100 million to >500 million (5x+)

That last category is the most misleading, because it paints the US, Russia, China, India, and a bunch of other countries as being in the same league, when they're not even close.

The details show the US at 320 million people, Russia at 144 million, China at 1.4 BILLION, and India at 1.3 BILLION.

It's insane that these are grouped together. Totally nonsensical. The difference between China and Russia is about 10x. [UPDATE: My initial bad math said it was a 77x difference.]

</rant>

dsp1234 · 6 years ago
> The difference between China and Russia is 77x.

The population of Russia is ~146.7M [0]

The population of China is ~1,403M (~1.4B) [1]

This makes China ~9.6 times larger than Russia.

The US is ~327M [2], which makes China about ~4.3 times as large.

It's a big difference, but not as large as 77x.

[0] - https://en.wikipedia.org/wiki/Russia

[1] - https://en.wikipedia.org/wiki/China

[2] - https://en.wikipedia.org/wiki/United_States
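The ratios above can be checked with a quick calculation (population figures in millions, taken from the comment; they are approximate):

```python
# Approximate populations from the comment above, in millions.
russia = 146.7
china = 1403.0
us = 327.0

print(round(china / russia, 1))  # China vs. Russia: ~9.6x
print(round(china / us, 1))      # China vs. the US: ~4.3x
```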

dsp1234 commented on Type 2 diabetes: NHS to offer 800-calorie diet treatment   bbc.com/news/health-46363... · Posted by u/lxm
drenvuk · 7 years ago
Citations for the 5 to 7% would be nice. Here's a meta analysis that states otherwise.

https://academic.oup.com/ajcn/article/74/5/579/4737391

dsp1234 · 7 years ago
Per that study, "This analysis of 5-y weight-loss maintenance indicates, on average, that obese individuals maintained weight losses of 3.0 kg, representing a reduced weight of 3.2% below initial body weight. These individuals were successfully maintaining a weight loss averaging 23.4% of their initial weight loss at 5 y"

Put another way, after 5 years, the average weight loss was ~3 kg, with an average regain of 76.6% of the originally lost weight.

Also, in general, a person who was obese and is only ~3 kg lighter after 5 years is likely still obese.

So this study basically just says that the average person isn't successful at keeping off a large amount of weight, and gains most of it back. The specific statistics stated by the commenter above may not be correct, but the sentiment is definitely true.

That said, the study does confirm that very low energy diets (~800 kcal, meal replacement) beat out hypoenergetic balanced diets (~1,200-1,500 kcal, normal food) in long-term weight maintenance, which is what the article suggests implementing. So it's still not a great long-term solution, but it's the better of the two non-exercise-based solutions which this study evaluated.
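The regain figure follows directly from the maintained fraction the study reports:

```python
# At 5 years, 23.4% of the initial weight loss was still maintained,
# so the remainder of the lost weight was regained.
maintained_pct = 23.4
regained_pct = round(100 - maintained_pct, 1)
print(regained_pct)  # 76.6
```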

dsp1234 commented on “So the silence from Facebook over the weekend is.. deafening.”   twitter.com/gavinsblog/st... · Posted by u/vinnyglennon
joering2 · 7 years ago
From $219 to $160, that's a “measly 1.5%”? Maybe by Sesame Street’s Yellow Bird math standards.
dsp1234 · 7 years ago
The closing price on Thursday was $168.84. The close yesterday was $162.44. That's a 3.8% decline.

I'm not sure where you got the $219 price from, but the stock price was nowhere near that before the breach.
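The decline works out as follows, using the closing prices quoted above:

```python
def pct_change(old, new):
    """Percentage change from old to new; negative means a decline."""
    return (new - old) / old * 100

# Thursday close vs. yesterday's close.
print(round(pct_change(168.84, 162.44), 1))  # -3.8
```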

dsp1234 commented on Proving our universe is one among many would be a fourth Copernican revolution   nautil.us/issue/64/the-un... · Posted by u/pseudolus
tvmalsv · 7 years ago
I would say "yes" to that. But fortunately, with the load being so widely distributed, the load on our "local" quantum computers would effectively be zero (i.e. x/inf). Unless, of course, our universe is the oddball and most others are running at full capacity. That's a depressing possibility.
dsp1234 · 7 years ago
The load could be zero, it could be infinite, or anywhere in between. Infinite universes sending infinite work is inf/inf, an indeterminate form. It's not possible to know whether that tends towards something like 0 or something like positive infinity without some way to measure. But it's an error to just assume it's zero.
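A quick numerical illustration of why inf/inf is indeterminate: both numerator and denominator grow without bound, yet the two ratios head in opposite directions.

```python
# As n grows, n and n**2 both tend to infinity, but their ratios disagree:
# n / n**2 shrinks toward 0, while n**2 / n blows up.
for n in (10**2, 10**4, 10**6):
    print(n / n**2, n**2 / n)
```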
dsp1234 commented on Zoho.com CEO says domain with 40M users suspended for abuse complaint   twitter.com/svembu/status... · Posted by u/achynet
foo101 · 7 years ago
Honest question: What exactly does it mean for a registrar to block a domain? I believed so far that for my browser to successfully connect to a web server running on a domain or for a mail server to deliver email to a domain, there should only be valid A, AAAA, MX, and/or CNAME records in the DNS.

Was it really a block at the registrar level or was it a block at the DNS level, i.e., the registrar also ran DNS service and their DNS service refused to return responses for zoho.com domains?

At what layer or at which stage of the protocol can a registrar disrupt this and take a domain offline?

dsp1234 · 7 years ago
There are several layers where a registrar has control over DNS resolution.

Terms:

ICANN: The organization responsible for coordinating the maintenance of the domain name system (among other things).

Registrar: A company authorized to update the ICANN database on behalf of registrants. Google, GoDaddy, Enom, etc. are registrars.

Registrant: An entity that wants to register a domain name. In this case, Zoho is the registrant, but it could also be an individual. This is your role if you 'own' a domain.

Authoritative Name Server: A domain name server that is considered authoritative for a specific domain.

Stuff registrars can do (among other things):

1.) They can update the ICANN database to disable a domain completely[1]

2.) They can replace your authoritative name servers with their own or someone else's (ex: botnet domains being reassigned to a security company for dismantling via court order)[2]

3.) If the authoritative name servers for a domain are owned by the registrar, then the registrar can merely change the DNS entries themselves to point somewhere other than where the domain owner wishes.

[0] - https://en.wikipedia.org/wiki/ICANN

[1] - https://www.icann.org/resources/pages/epp-status-codes-2014-...

[2] - https://www.icann.org/en/system/files/files/guidance-domain-...
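A toy model of the three interventions above (not real DNS; the names, 'status' field, and addresses are all illustrative, with IPs from the RFC 5737 documentation ranges):

```python
# 'status' stands in for an EPP status code like serverHold; 'ns' stands
# in for the NS delegation recorded in the registry.
registry = {"example.com": {"status": "ok", "ns": "owner-ns"}}
zones = {
    "owner-ns":     {"example.com": "198.51.100.10"},  # owner's records
    "registrar-ns": {"example.com": "203.0.113.99"},   # registrar's records
}

def resolve(domain):
    entry = registry.get(domain)
    if entry is None or entry["status"] == "serverHold":
        return None  # 1.) domain disabled at the registry level
    # 2.)/3.) the answer depends entirely on which name servers are delegated
    return zones[entry["ns"]].get(domain)

print(resolve("example.com"))                     # owner's answer
registry["example.com"]["ns"] = "registrar-ns"    # 2.) name servers replaced
print(resolve("example.com"))                     # registrar's answer
registry["example.com"]["status"] = "serverHold"  # 1.) domain disabled
print(resolve("example.com"))                     # None
```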

dsp1234 commented on NewSQL databases fail to guarantee consistency and I blame Spanner   dbmsmusings.blogspot.com/... · Posted by u/evanweaver
Twirrim · 7 years ago
It's not really the default settings, per se. You don't have to change any bit of configuration about your database to get consistency. The DynamoDB API gives you the GetItem API call and a boolean property to choose to make it a consistent read.

It's left as a very simple task for developers leveraging DynamoDB to make the appropriate trade-offs between consistent and inconsistent reads.

source: Used to work for AWS on a service that heavily leveraged DynamoDB. Not _once_ did we experience any problems with consistency or reliability, despite them and us going through numerous network partitions in that time. The only major issue came towards the end of my time there when DynamoDB had that complete service collapse for several hours.

On the sheer scale that DynamoDB operates at, it's more likely to be a question of "How many did we automatically handle this week?" than "How often do we have to deal with network partitions?"

dsp1234 · 7 years ago
From the GetItem docs[0]

"GetItem provides an eventually consistent read by default."

This seems to meet the definition of "DynamoDB's default settings"

[0] - https://docs.aws.amazon.com/amazondynamodb/latest/APIReferen...
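For illustration, here is the shape of a GetItem request with and without the flag. The table and key names are hypothetical; the ConsistentRead parameter itself is from the GetItem API docs linked above.

```python
# Build a DynamoDB GetItem request body. Without ConsistentRead, the read
# is eventually consistent by default, per the quoted documentation.
def get_item_request(table, key, consistent_read=False):
    req = {"TableName": table, "Key": key}
    if consistent_read:
        req["ConsistentRead"] = True
    return req

default = get_item_request("Users", {"id": {"S": "42"}})
strong = get_item_request("Users", {"id": {"S": "42"}}, consistent_read=True)
print("ConsistentRead" in default)  # False: eventual consistency by default
print(strong["ConsistentRead"])     # True: strongly consistent read
```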

dsp1234 commented on Randomness in .NET   lowleveldesign.org/2018/0... · Posted by u/lowleveldesign
iainmerrick · 7 years ago
You should use your own PRNG in that case.

I understand not wanting to change the implementation now, but users should never have assumed it would be stable in the first place.

dsp1234 · 7 years ago
> users should never have assumed it would be stable in the first place.

It's not an assumption. It's directly in the documentation.

"If the same seed is used for separate Random objects, they will generate the same series of random numbers."[0]

[0] - https://docs.microsoft.com/en-us/dotnet/api/system.random
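The documented guarantee is easy to demonstrate with any seeded PRNG; here is a Python sketch of the behavior the .NET docs promise for System.Random (the seed value is arbitrary):

```python
import random

# Two generators constructed with the same seed produce the same series.
a = random.Random(1234)
b = random.Random(1234)
print([a.random() for _ in range(3)] == [b.random() for _ in range(3)])  # True
```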
