Readit News
mafuy commented on Microsoft keeps adding stuff into Windows we don't need   theregister.com/2025/08/1... · Posted by u/rntn
caztanj · 11 days ago
The title is misleading. It should be "Microsoft keeps adding crap into Windows that no one would ever want."

If Microsoft wanted to fix Windows it would be an easy task. Step 1: Delete everything added since Windows 7. Step 2: Delete all dotnet crap. Step 3: Make the APIs good by deleting almost everything and making new plain C89 APIs. Step 4: Realize we need a new operating system and delete all of Windows and start over.

mafuy · 10 days ago
Leave dotnet alone, it has nothing to do with this.

And you can't complain about the API. It's so good and established that games can now just use the Windows API to run on Linux.

This destructive attitude will just turn people off your cause.

mafuy commented on Microsoft keeps adding stuff into Windows we don't need   theregister.com/2025/08/1... · Posted by u/rntn
gcanyon · 11 days ago
Microsoft adds stuff "we" don't need to everything. Think about the ribbons of tools available in Office -- Excel and Word in particular have features for pretty much anything you can think of. I'm not arguing it's not a valid choice -- I'm sure someone needs all those features. But for me, personally, I prefer products that do a smaller set of things simply and perfectly, as opposed to products that do All The Things somewhat well. Even if it does everything very well, having all those extra features in the way makes it harder to get to the few things I actually want to do.
mafuy · 10 days ago
Well, that's what Ribbons are for. I don't get the hate for them - they're rather cool.
mafuy commented on Microsoft keeps adding stuff into Windows we don't need   theregister.com/2025/08/1... · Posted by u/rntn
mastry · 11 days ago
See the links I listed above. None of those features solved a language problem as large as the lack of sum types. It baffles me that they even spent time on them before providing a feature that is in such high demand (and has been for more than a decade).

I understand that you shouldn’t always give users what they ask for - but this is something that has picked up steam in other languages because it’s actually useful and makes code bases easier to maintain.

mafuy · 10 days ago
You're projecting your strong opinion on others.

I've used C# since 2008 for business software and high performance computing. I've not missed sum types at all. Most of what was added is something I see a lot of value in. I don't like that it, by design, obsoletes some older parts of the language, but that's about it.

I'm now using C# on Linux almost exclusively. No complaints from me!

mafuy commented on The future of large files in Git is Git   tylercipriani.com/blog/20... · Posted by u/thcipriani
ozim · 11 days ago
Yeah, but Git is not the tool for that.

That is why I don't understand why people "need to use Git".

You can still build something else, like keeping versions and keeping track of those versions in many different ways.

You can store a reference in the repo, like a link or whatever.
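The "store a reference in the repo" idea is roughly what Git LFS does: the repository tracks only a small pointer file keyed by a content hash, while the blob itself lives in an external store. A minimal sketch of the mechanism in Python (the pointer format here is illustrative, not LFS's actual spec):

```python
import hashlib
from pathlib import Path

def make_pointer(big_file: Path, store_dir: Path) -> str:
    """Copy a large file into a content-addressed store and return
    the small pointer text to commit in its place."""
    data = big_file.read_bytes()
    oid = hashlib.sha256(data).hexdigest()
    store_dir.mkdir(parents=True, exist_ok=True)
    (store_dir / oid).write_bytes(data)  # blob lives outside git history
    return f"oid sha256:{oid}\nsize {len(data)}\n"  # this tiny text goes in git

def resolve_pointer(pointer: str, store_dir: Path) -> bytes:
    """Look up the blob a pointer refers to."""
    oid = pointer.splitlines()[0].split("sha256:")[1]
    return (store_dir / oid).read_bytes()
```

Either way, git itself only ever sees the few-byte pointer, so history and clones stay small no matter how large the blobs get.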

mafuy · 11 days ago
Git is the right tool. It's just bad at this job.
mafuy commented on Leonardo Chiariglione – Co-founder of MPEG   leonardo.chiariglione.org... · Posted by u/eggspurt
mike_hearn · 20 days ago
> There is no way that a single company could develop a state of the art compiler without using an existing one. Intel had a good independent compiler and gave up because open source had become superior.

Not only can they do it but some companies have done it several times. Look at Oracle: there's HotSpot's C2 compiler, and the Graal compiler. Both state of the art, both developed by one company.

Not unique. Microsoft and Apple have built many compilers alone over their lifespan.

This whole thing is insanely subjective, but that's why I'm making fun of the "unsubstantiated claim" bit. How exactly are you meant to objectively compare this?

mafuy · 16 days ago
I've searched for some performance comparisons between Graal and equivalent GCC programs and it seems like Graal is not quite at the same level - unsurprisingly, it is probably more concerned with avoiding boxing than optimal use of SIMD. And as much as I love Roslyn, which is/was a Microsoft thing: it has the same issue. It only recently got serious about competing with C, and that's years after it was open sourced.
mafuy commented on Let's properly analyze an AI article for once   nibblestew.blogspot.com/2... · Posted by u/pabs3
thrown-0825 · 18 days ago
Stop kidding yourself, most CS grads are about as much of a scientist as your car mechanic.
mafuy · 16 days ago
A majority, yes. Though I know enough of them who went on to a PhD to say that it's far from universal.
mafuy commented on MCP overlooks hard-won lessons from distributed systems   julsimon.medium.com/why-m... · Posted by u/yodon
cnst · 18 days ago
I'd like to add that the culmination of USB-C failure was Apple's removal of USB-A ports from the latest M4 Mac mini, where an identical port on the exact same device now has vastly different capabilities, opaque to the final user of the system months past the initial hype of the release date.

Previously, you could reasonably expect a USB-C port on an Apple Silicon desktop or laptop to be USB4 40Gbps Thunderbolt, capable of anything and everything you might want to use it for.

Now, some of them are USB3 10Gbps. Which ones? Gotta look at the specs or tiny icons, I guess?

Apple could have chosen to have the self-documenting USB-A ports to signify the 10Gbps limitation of some of these ports (conveniently, USB-A is limited to exactly 10Gbps, making it perfect for the use-case of having a few extra "low-speed" ports at very little manufacturing cost), but instead, they've decided to further dilute the USB-C brand. Pure innovation!

With the end user likely still having to use a USB-C to USB-A adapter anyway, because the majority of thumb drives, keyboards and mice still require a USB-A port — even the USB-C ones that use USB-C on the kb/mice itself. (But, of course, that's all irrelevant because you can always spend 2x+ as much for a USB-C version of any of these devices, and the fact that the USB-C variants are less common or inferior to USB-A is of course irrelevant when hype and fanaticism are more important than utility and usability.)

mafuy · 16 days ago
As far as I know (please correct me if I'm wrong), the USB spec does not allow USB-C to C cables at all. The host side must always be type A. This avoids issues like your cellphone's power supply feeding not just your headphones but also your laptop.
mafuy commented on Let's properly analyze an AI article for once   nibblestew.blogspot.com/2... · Posted by u/pabs3
thrown-0825 · 18 days ago
Great, and if you got a job with us, I would have to explain how Docker works because you refused to learn it for some reason.

Point is that what is deemed important in academic circles is rarely important in practice, and when it is, I find it easier to explain a theory or algorithm than to teach a developer how to use an industry-standard tool set.

We should be training devs like welders and plumbers instead of like mathematicians, because practically speaking the vast majority of them will never use that knowledge and will have to develop an entirely new skill set the day they graduate.

mafuy · 18 days ago
You're simply wrong about CS. CS is a science. You are not looking for scientists, you are looking for apprentices. Don't hire an architect to paint your wall. Don't hire scientists to be code monkeys. Universities are not the right place to look.

Except if you just want smart people - yeah, they tend to aggregate at unis. There, you can hire an athlete to paint your wall, if that's what you need.

mafuy commented on Let's properly analyze an AI article for once   nibblestew.blogspot.com/2... · Posted by u/pabs3
thrown-0825 · 18 days ago
I agree with you.

SSH and even just managing your dev env in a sane manner are skills that I literally have to hand-hold people through on a regular basis, and would fully expect people to have coming out of a 4-year degree.

Git and basic SQL are next on the list.

mafuy · 18 days ago
SSH is absolutely not a core CS skill. I use it daily, and I think students should pick it up somewhere along the line. But it is still a mere tool, not a concept that should be taught in mandatory classes. Same goes for LaTeX and Git and C.

All of that is, or should be, vocational, because anyone can learn it given some time. Universities are about the hard stuff that is difficult to get right at even a mediocre level.

If your company requires it, include it in your regular training program. Don't dilute the material because you don't want to be bothered. If you think people spend too much time on hard stuff, hire BSc instead of MSc.

mafuy commented on Let's properly analyze an AI article for once   nibblestew.blogspot.com/2... · Posted by u/pabs3
Der_Einzige · 18 days ago
If this is true then I hope that AI kills both computer programming and computer science as fast as possible.

It’s not. If you’re a computer scientist who’s not coding, you are a bad computer scientist. “Those who cannot do, teach”

mafuy · 18 days ago
Sorry, but that's just nonsense. "Programming" has long been a low-level job. In Germany, it is taught as an apprenticeship-level program. CS at practically oriented universities will instead, or rather on top, teach program architecture and planning. At theory-focused universities, you'll learn how an LLM actually works, how to design high- or low-level protocols, how to design a new algorithm suited to your environment, etc. The programming required for these can be done for personal recreation, be outsourced, or be replaced with an existing library as desired.

u/mafuy

Karma: 440 · Cake day: September 11, 2014