Readit News
onli commented on Dropbox announces new gen server hardware for higher efficiency and scalability   dropbox.tech/infrastructu... · Posted by u/juanviera23
theanonymousone · 14 days ago
How is Dropbox doing by the way? Are they in good shape?

They seem to be "out of headlines" for some time, which of course is not necessarily a bad thing...

onli · 14 days ago
Aren't they out of the headlines because they cut ties with the tech community? Hired Condoleezza Rice, spammed notifications, made sharing without an account impossible (or hard?), broke shared links on purpose, bought and shut down Mailbox. And all of that plus very high prices.

They do claim to be profitable though, if I read that correctly.

onli commented on The Framework Desktop is a beast   world.hey.com/dhh/the-fra... · Posted by u/lemonberry
epistasis · 15 days ago
I have had the system for eight years and at no point would upgrading RAM have increased performance.

Upgrading the RAM would have created more waste than properly sizing the RAM-to-CPU proportion from the beginning.

It is very odd to encounter someone who has such a narrow view of computing that they cannot imagine someone not upgrading their RAM.

I have not once, literally not once, had RAM break either. I have been part of the management of clusters of hundreds of compute nodes that would occasionally each have their failures, but not once was RAM the cause of failure. I'm fairly shocked to hear that anybody's RAM has failed, honestly, unless it's been overclocked or something else.

onli · 15 days ago
> It is very odd to encounter someone who has such a narrow view of computing that they cannot imagine someone not upgrading their RAM.

That's uncalled for and means the end of the discussion after this reaction. Of course I can imagine that; it's just usually a dumb decision.

That you did not have to upgrade the RAM means one of two things: either you had completely linear workloads, so unlike me you did not switch to a compiled programming language or experiment with local LLMs etc., or you bought a lot of RAM in the beginning, i.e. 8 years ago at a hefty premium.

That changes nothing about the fundamental disagreement with the existence of such machines, especially from a company that knows better. I do not expect ethical behaviour from a bottom-of-the-barrel company like Apple, but it was completely reasonable to expect better from Framework.

onli commented on The Framework Desktop is a beast   world.hey.com/dhh/the-fra... · Posted by u/lemonberry
epistasis · 15 days ago
I have been continuously baffled by the people who think that soldered-on RAM is somehow "throwaway". My last desktop build is eight years old and I have never upgraded the RAM. Never will. My next build will have an entirely new motherboard, RAM, and GPU, and the last set will end up at the e-waste recycler, because who could I find that wants that old hardware?

Soldered RAM, CPU, and GPU that give space and performance benefits are exactly what I want, and result in no more e-waste at all. In fact less e-waste, because if I had a smaller form factor I could justify keeping the older computer around for longer. The size of the thing is a bigger cause of waste for me than the ability to upgrade RAM.

Not everybody upgrades RAM, and those people deserve computers too. Framework's brand appears to be offering something that other suppliers are not, rather than expandability. That's a much better brand and niche overall.

onli · 15 days ago
> Not everybody upgrades RAM, and those people deserve computers too.

No. It's the end of the line for consumerism: either we start repairing and recycling or we die. Framework catered to people who agree with that, and this product is not in line with it.

I have no idea why you would not upgrade your memory; I have done so in every PC and laptop I have ever owned, and it's a very common (and cheap) upgrade. It reduces waste because people can then use their system longer, which means less garbage over the lifetime of a person. And as was already commented, it is not only about upgrades, but also about repairs. RAM breaks rather often.

Deleted Comment

onli commented on The Framework Desktop is a beast   world.hey.com/dhh/the-fra... · Posted by u/lemonberry
sethops1 · 15 days ago
The RAM is soldered on all Strix Halo platforms because physics is getting in the way. With pluggable DIMMs the memory bandwidth would be halved, at best.
onli · 15 days ago
He is still right. It is a desktop PC that is less repairable than all other desktop PCs, from a brand that is known to champion repairability. They had a reason for it, but could've chosen to not create more throwaway things.
onli commented on What's wrong with the JSON gem API?   byroot.github.io/ruby/jso... · Posted by u/ezekg
byroot · 15 days ago
> Don't feel obliged to further discuss

Just clarifying this one:

> I do not understand why the option to do so would not have helped in the hacker issue.

Because users aren't omniscient. When a user needs to parse some JSON, they'll reach for `JSON.parse`, which they either already know about or will find from a cursory search. That will have solved their problem, so they likely won't look in detail at the dozen or so options this method takes, and won't consider the possibility of duplicated keys or their potential nefarious impact.

Hence why the defaults are so important.

> it is important for me to try to argue against churn

It's alright, I'm with you on this in general, just not in this particular case.

onli · 15 days ago
Alright, thanks.
onli commented on What's wrong with the JSON gem API?   byroot.github.io/ruby/jso... · Posted by u/ezekg
byroot · 16 days ago
> but why not let devs set `allow_duplicated_key: false` manually in contexts where it matters?

Because that wouldn't have prevented the issue I linked to.

Default settings are particularly important, because most of the gem's users have no idea that duplicated keys are even a concern.

JSON is used a lot to parse untrusted data; as such, having strict and safe defaults is particularly valuable.

In your case the JSON documents are trusted, so I understand this change is of negative value for you, but I believe it has positive value overall when I account for all the users of the gem.

Additionally, even for the trusted configuration file or similar case, I believe most users would see it as valuable not to let duplicated keys go unnoticed, because duplicated keys are almost always a mistake and can be hard to track down.

e.g. a developer might have some `config.json` with:

    {
      "enabled": true,
      // many more keys,
      "enabled": false,
    }
And waste time figuring out why `JSON.parse(doc)["enabled"]` is `false` when they can see it's clearly `true` in their config.
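
For illustration, a minimal sketch of what the lenient default does with such a document (the `doc` string here is just a reduced stand-in for that config):

    require "json"

    doc = '{ "enabled": true, "enabled": false }'

    # Under the gem's historical lenient default, the later occurrence of a
    # duplicated key silently overwrites the earlier one.
    JSON.parse(doc)["enabled"]  # => false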

So again, I understand you are/will be annoyed, because your use case isn't the majority one, but I'm trying to cater to lots of different users.

If going over your code to add the option is really too much churn for your low-maintenance project, then, as I mention in the post, a totally OK solution for such cases is to monkey patch and move on:

    require "json"

    module JSONAllowDuplicateKey
      def parse(doc, options = nil)
        options = { allow_duplicate_key: true }.merge(options || {})
        super(doc, options)
      end
    end
    JSON.singleton_class.prepend(JSONAllowDuplicateKey)
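
With that prepended, plain `JSON.parse(doc)` calls throughout the codebase keep the lenient duplicate-key behaviour, while call sites that pass `allow_duplicate_key` explicitly still win, because the explicitly passed options are merged over the default.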

onli · 15 days ago
It is the other way around, I am convinced. In the config scenario you describe the input is also trusted, as is the bulk of JSON usage. In all those scenarios duplications will happen regularly, and developers rely on the JSON parser to work with the input regardless. Which means they will all have to go through the churn of the change required here to get a working program again, one that does not crash on input that passed before.

I get that the option to detect and prevent this is good. The warning is seriously useful. But it does not need to become the new default and forbid the previously valid inputs. And I do not understand why the option to do so would not have helped in the hacker issue.

Thanks for the code snippet. Actually, it is not a lot of code for me to change, and I likely do not have an affected intermediate dependency, I hope. The problem is that the dependencies vastly outnumber my own program, so work like this potentially adds up.

Don't feel obliged to further discuss if it is a waste of your time - but it is important for me to try to argue against churn.

onli commented on What's wrong with the JSON gem API?   byroot.github.io/ruby/jso... · Posted by u/ezekg
byroot · 16 days ago
If you don't care about duplicated keys, all you have to do is set the `allow_duplicated_key: true` option.

The whole point of the article is to explain why, while I have empathy for code owners who will be impacted by various changes, I sometimes believe the benefits outweigh the cost.

I even linked to an example of a very nasty security issue that would have been prevented if that change had been made sooner.

> Is the Ruby community breaking?

No, the community is trying to fix past mistakes. For instance the YAML change you are cursing about has been the source of numerous security vulnerabilities, so tenderlove had to do something to remove that massive footgun.

I totally get that it annoyed you; you can't imagine how much code I had to update to deal with that one.

But again, while I totally agree maintainers should have empathy for their users, and avoid needless deprecations, it'd be good if users had a bit of empathy for maintainers that are sometimes stuck in "damned if you do, damned if you don't" situations...

onli · 16 days ago
I will try to understand your position, and thanks for answering.

I appreciated the "avoiding breakage" sentiment included in the post (and commented accordingly below). I'm just really of the opinion that dependencies should not force me to do something like set `allow_duplicated_key: true` if it is not absolutely necessary, and I guess I do not see why it is absolutely necessary here. I saw the link to a security vulnerability, but why not let devs set `allow_duplicated_key: false` manually in contexts where it matters? Avoiding churn is more important - this is not a "the API will cause random code to be executed" situation, unlike the create_additions option situation you describe and possibly unlike the YAML situation. There I understand the need (even with YAML, there I'm just sad about the currently broken default code path).
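
For illustration, a minimal sketch of why `create_additions` is in that category (assuming the gem's bundled `json/add/range` extension is loaded; the document is made up):

    require "json"
    require "json/add/range"   # registers Range for create_additions round-tripping

    doc = '{"json_class":"Range","a":[1,5,false]}'

    # With additions enabled, the document itself decides which registered
    # class gets instantiated - the footgun with untrusted input.
    JSON.parse(doc, create_additions: true)  # => 1..5
    JSON.parse(doc)                          # => plain Hash, nothing instantiated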

Also, we saw with YAML how it wasn't viable to do such a change without breakage via the intermediate dependencies (and I was very happy you mentioned that API not being nice) and churn via the direct ones. The same is very likely to happen here: programs will break because their dependencies do not set the new option when they store JSON.

onli commented on What's wrong with the JSON gem API?   byroot.github.io/ruby/jso... · Posted by u/ezekg
ezekg · 19 days ago
The first thing we could do here is rename the JSON.parse :symbolize_names keyword to :symbolize_keys -- it always trips me up for some reason.
onli · 16 days ago
Breaking the API like that would be extremely hostile and exactly the kind of churn byroot thankfully claimed he does not want to produce. If anything, both options could be offered, as sketched below. Don't be disrespectful with developers' time.
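
For what it's worth, a minimal sketch of offering both keywords without breaking existing callers (the wrapper module name is made up):

    require "json"

    # Accept :symbolize_keys as an alias for the existing :symbolize_names
    # keyword, so neither spelling breaks callers.
    module JSONSymbolizeKeysAlias
      def parse(source, opts = nil)
        if opts&.key?(:symbolize_keys)
          opts = opts.dup
          opts[:symbolize_names] = opts.delete(:symbolize_keys)
        end
        opts ? super(source, opts) : super(source)
      end
    end
    JSON.singleton_class.prepend(JSONSymbolizeKeysAlias)

    JSON.parse('{"a":1}', symbolize_keys: true)   # => {:a=>1}
    JSON.parse('{"a":1}', symbolize_names: true)  # => {:a=>1}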
onli commented on What's wrong with the JSON gem API?   byroot.github.io/ruby/jso... · Posted by u/ezekg
jmull · 19 days ago
Changing the default behavior for duplicate keys is unnecessary... and therefore should not be done.

IMO, it's nuts to purposely introduce random bugs into the apps of everyone who uses your dependency.

onli · 16 days ago
I agree. This will actually cause me problems.

I have a huge selection of mostly handwritten JSON data that powers a benchmark collection. Sometimes mistakes happen and keys get duplicated. That is not a big problem: if e.g. a graphics card appears twice in an array, nothing bad can happen. Even if the actual value is wrong because of such a duplication (like a benchmark update gone wrong), that will usually not be a problem, because there is a lot of other data that will place the component at around the correct position regardless.

Now, what will happen is that a Ruby update forces me to update the gems, then the new JSON gem will crash on my data, and then I will have to take time I don't have to fix something that does not need fixing, for a project that does not really generate income. Awesome.

The right solution here is to specify the currently unspecified duplicate-key behaviour as this parser handles it today. That prevents applications from randomly running into new bugs. Then print a warning, which is already done now. And then maybe offer an opt-in to disallow input data that has duplicated keys. Not to waste developers' time by making the breakage the default.

And that's why breaking dependencies is unacceptable (if the old behaviour is not completely broken), and relying on invisible deprecation messages is not okay. The sqlite gem did the same thing: it completely broke the old API for how parameters could be supplied, did not revert the change even when the chaos it caused was reported, and then took it as a personal insult when I opened a discussion about how that's still a problem.

Another nice one is the YAML and psych gem, which just this month suddenly could not write a YAML file with YAML::Store.new anymore. I had to work around that with `file.write(Psych.dump(theyamlhash))`; https://gitlab.com/onli/sustaphones/-/commit/fecae4bb2ee36c8... is the commit. If I got that right, this is about `permitted_classes: [Date]` not being given, and not being givable, to psych in YAML::Store. Made me pause.
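
A rough sketch of that kind of workaround, assuming the failure really is Psych's safe-load defaults rejecting Date values (file name and data are made up):

    require "yaml"
    require "date"

    data = { "name" => "Fairphone 5", "released" => Date.new(2023, 8, 31) }

    # YAML::Store now goes through safe loading, which rejects Date unless it is
    # explicitly permitted - so dump the hash directly instead.
    File.write("store.yaml", Psych.dump(data))

    # Reading it back then needs the class permitted explicitly.
    loaded = YAML.safe_load(File.read("store.yaml"), permitted_classes: [Date])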

Is the Ruby community breaking? I'm not aware of gems producing developer-hostile situations like that before.

u/onli

Karma: 8974 · Cake day: July 18, 2011
About
onli@paskuda.biz

Current projects:

A PC hardware recommender, https://www.pc-kombo.com/

A Yahoo Pipes inspired feed programming editor, https://www.pipes.digital/

[ my public key: https://keybase.io/onli; my proof: https://keybase.io/onli/sigs/bNRi8ad_Yqzfnd89O_tkt-lXFypwn9PRbyxknVaQ4PI ]
