Readit News
1000100_1000101 commented on Ask HN: Why hasn't x86 caught up with Apple M series?    · Posted by u/stephenheron
BirAdam · 6 days ago
1. Apple’s optimizations are one point in their favor. XNU is good, and Apple’s memory management is excellent.

2. X86 micro-ops vs ARM decode are not equivalent. X86’s variable length instructions make the whole process far more complicated than it is on something like ARM. This is a penalty due to legacy design.

3. The OP was talking about M1. AFAIK, M4 is now 10-wide, while most x86 is 6-wide (Zen 5 does some weird stuff with its split decode clusters). x86 was 4-wide at the time of M1's introduction.

4. M1 has over 600 reorder buffer entries… it's significantly larger than its competitors'.

5. Close relative to x86 competitors.

1000100_1000101 · 5 days ago
> 4. M1 has over 600 reorder buffer entries… it's significantly larger than its competitors'.

And? Are you saying neither Intel's nor AMD's engineers were able to determine that this was a bottleneck worth chasing? The point was, anybody could add more cache, rename, reorder or whatever buffers they wanted to... it's not Apple's secret sauce.

If all the competition knew they were leaving all this performance/efficiency on the table despite there being a relatively simple fix, that's on them. They got overtaken by a competitor with a better offering.

If all the competition didn't realize they were leaving all this performance/efficiency on the table despite there being a relatively simple fix, that's also on them. They got overtaken by a competitor with a better offering AND more effective engineers.

1000100_1000101 commented on Show HN: I built a word game. My mom thinks it's great. What do you think?   whatsit.today/... · Posted by u/mkate
y-curious · 5 months ago
I got the letters "AACNT" and even cheated online. The only word online dictionaries could make was "TACAN" and it wasn't accepted.
1000100_1000101 · 5 months ago
I got stuck on Catan for a while too, as I assumed it was like most games and didn't use names.

At some point I just moved on, and solved the next word, Chess, in like 2 seconds. The results claimed I spent a whole pile of time on Chess, since it has no idea which word I'm actually trying to solve (though all the wrong guesses for Catan should have been a clue that the time should have gone toward Catan).

I'm not sure the per-word time values are useful if they can't be trusted to be accurate.

1000100_1000101 commented on Please don't force dark mode   iamvishnu.com/posts/pleas... · Posted by u/vishnuharidas
sosowhitewhite · 7 months ago
> That transition was painful.

I was actually alive then, so I can speak to it.

Athena text widgets on X were black on white in the 80's. So were the Lisa, the Mac, NeXT, OS X, and SunOS's first GUI. Yes, amber on black was long-running, but since you weren't alive then, let me tell you something: it sucked. Moving from VT100 (VT104) terminals to actual Sun/AIX machines running X was a HUGE improvement in eye strain.

1000100_1000101 · 7 months ago
I was actually alive then, and the painfulness of the white background was real physical eye-strain pain.

I was alive for both amber and green screens too.

Please don't suggest people are making things up from times before their birth simply because you have a different view.

1000100_1000101 commented on Please don't force dark mode   iamvishnu.com/posts/pleas... · Posted by u/vishnuharidas
flockonus · 7 months ago
Traditional normal is not an absolute statement. Sure, DOS/Unix back in the early days of the PC displayed black backgrounds because the displays at the time worked better that way.

Before that, people shared information on white paper; and the beginning of the internet brought that back, with black text over a white background.

Therefore there is no canonical traditional normal; it all depends on when one joined.

1000100_1000101 · 7 months ago
Paper and paper-like writing surfaces were non-white for a long time before we got bleached white paper.

We haven't yet had a glowing-white paper.

Traditional-normal for computing was a dark background.

There was likely a technological limit in the use of pure white at the start when "emulating" paper. VGA 16-color mode likely meant that the choice was between bright white and medium grey, which was too dark. Configurability has lagged behind though.

1000100_1000101 commented on Please don't force dark mode   iamvishnu.com/posts/pleas... · Posted by u/vishnuharidas
amelius · 7 months ago
Wasn't it the original Mac that changed it?
1000100_1000101 · 7 months ago
Mac likely did use this scheme, and yes, copied it from Xerox. However, neither Macs nor Xerox machines saw mainstream use. I'd only actually seen 3 Macs in the wild before their switch to Intel, over 20 years later.

Windows adopting the "paper"-white background, and the whole world drooling over the arrival of Windows 3.1 and 95, is when it became the standard, I think.

1000100_1000101 commented on Please don't force dark mode   iamvishnu.com/posts/pleas... · Posted by u/vishnuharidas
1000100_1000101 · 7 months ago
Dark mode was the traditional normal.

From early green or amber text on black monochrome displays. Grey on black in DOS text mode. Light blue on dark blue on the C-64. The Apple II's grey/white (I don't recall which) on black. Even GUI-wise, the Amiga used a dark-blue background as the Workbench default, with user-selectable palettes for everything.

It was Microsoft Windows that changed the paradigm to default to a searing white display with black text in most apps, like Notepad, Word, etc., because "it's more like paper". Sure, paper is white, but it's not glowing white. That transition was painful.

I'm glad to see dark-modes return, I agree there needs to be an option, not just forced dark-mode. Preferably light mode options to use a not-as-bright-as-possible white too.

1000100_1000101 commented on Why is my CPU usage always 100%?   downtowndougbrown.com/202... · Posted by u/pncnmnp
nerdile · 8 months ago
As a former Windows OS engineer, based on the short statement here, my assumption would be that your programs are IO-bound, not CPU-bound, and that the next step would be to gather data (using a profiler) to investigate the bottlenecks. This is something any Win32 developer should learn how to do.

Although I can understand how "Please provide data to demonstrate that this is an OS scheduling issue, since app bottlenecks are much more likely in our experience" could come across as "denying and gaslighting" to less experienced engineers and layfolk.

1000100_1000101 · 8 months ago
I'm not the original poster, but I ran into something similar late in Win 7 (Win 8 was in beta at the time). We had some painting software, and we used OpenMP to work on each scan line of a brush in parallel.

It worked fine on Mac. On Windows, though, if you let it use as many threads as there were CPUs, it would fail nearly 100% of the time before making it through our test suite. Something in scheduling the work would deadlock. It was more likely to fail if anything was open besides the app. Basically, a brush stroke that should complete in a tenth of a second would stall. If you waited 30-60 minutes (yes, minutes), it would recover and continue.

I vaguely recall we used the Intel compiler implementation of OpenMP, not what comes with MSVC, so the fault wasn't necessarily a Microsoft issue, but could still be a kernel issue.

I left that company later that year, and MS rolled out Windows 8. No idea how long that bug stuck around.

1000100_1000101 commented on A rare alignment of 7 planets is about to take place   sciencealert.com/a-rare-a... · Posted by u/koolba
kadoban · 8 months ago
> Will a picture from a wide angle lens actually show the planets?

Yes.

> I thought planets just show up as a bright dot in the sky.

Correct. :)

There's no real way to get around that geometry problem: you can either see several at once, but they're pinpricks, or one at a time, but potentially somewhat more clearly.

1000100_1000101 · 8 months ago
To add to this, I'll try to give an idea of how much zoom (or focal length really) you'd need to get a picture with detail.

I took photos of both Jupiter and Saturn with a Canon R7 and the RF 100-500mm lens, with a 1.4x extender. The 1.4x extender makes the lens act like 700mm instead of 500mm. The R7 being an APS-C sensor adds another 1.6x factor, making the combo the equivalent of 1120mm. In these photos the planets are still just dots. The camera takes 32.5-megapixel photos. When zoomed in to the pixel level, both planets were still tiny, about 50 pixels wide. It was enough to see that Saturn had a ring and some color striping on Jupiter, but that's it.

The iPhone main camera is like 26mm (43x less zoom). The iPhone 13 Pro's telephoto lens is 77mm (14.5x less zoom), and the iPhone 15 Pro Max's is 120mm (9.3x less zoom)... so you're unlikely to get much more than what looks like an out-of-focus, few-pixel-wide dot even on the zoomiest of iPhones, but with that wider 26mm lens, you just might be able to capture them all in one shot.

To me, what's more technically impressive than the fact that I took pictures of the planets with readily available camera gear is that I did it with a 1/125s shutter speed, handheld, standing in my yard. The accuracy of the image stabilization needed to pull that off is what astounded me the most.

1000100_1000101 commented on Write Your Own Virtual Machine (2022)   jmeiners.com/lc3-vm/... · Posted by u/sebg
saithound · 8 months ago
I'm not talking about the instruction set, or teaching basic assembly (probably anything except Malbolge is suitable for that).

Let's look at just one thing every programmer has to deal with, memory.

On an LC-3, the address space is exactly 64KiB. There is no concept of missing memory, all addresses are assumed to exist, no memory detection is needed or possible, and memory mapped IO uses fixed addresses.

There are no memory management capabilities on the LC-3, no MMU, no paging, no segmentation. In turn there are no memory-related exceptions, page faults or protection faults.

When an x86 machine boots with 1MB of RAM, the 4GB address space still exists in full, but accessing certain addresses will cause bus timeouts or crashes. One must track and manage available memory. There's a BIOS, and manually probing memory locations may trash its critical structures. There's INT 0x15.

I picked memory arbitrarily, but you run into the same limitations no matter what you pick. Would a student who was educated on the LC-3 know how a computer keeps time? Of course not: there's no PIT, there's no CMOS clock. Would they have thought about caches? Nope.

Oh, but wouldn't a student who implements a timer-emulation extension for the LC-3 learn more about timers than somebody who just learned to use an x86 PIT? Alas, no. There are 20 equally easy and reasonable mathematical ways to implement a timer abstraction. A good 15 of these are physically impossible on real hardware; of the remaining 5, two would be prohibitively expensive for electrical-engineering reasons, one has never been implemented in real hardware due to historical accident, and two are designs that are actually in use. So to write timer emulation that teaches you anything at all about how actual timers work, you'll have to look at and understand a real architecture anyway.

That's why educational architectures are so counterproductive. They abstract away exactly the things that make modern computers modern computers. One comes away with fundamentally wrong ideas about what computers do and how they actually work, or could work.

It's like learning to drive in GTA: in principle, there could be plenty of skills that transfer to the real thing, but in practice you'd rather teach driving to the person who didn't play GTA at all.

1000100_1000101 · 8 months ago
> On an LC-3, the address space is exactly 64KiB. There is no concept of missing memory, all addresses are assumed to exist, no memory detection is needed or possible, and memory mapped IO uses fixed addresses.

> There are no memory management capabilities on the LC-3, no MMU, no paging, no segmentation. In turn there are no memory-related exceptions, page faults or protection faults.

Sounds an awful lot like a Commodore 64, where I got my start. There's plenty to learn before needing to worry about paging, protection, virtualization, device discovery, bus faults, etc.

It sounds like it's not teaching the wrong things, as in your GTA driving example, but teaching a valid subset, just not the subset you'd prefer.

1000100_1000101 commented on M4 MacBook Pro   apple.com/newsroom/2024/1... · Posted by u/tosh
talldayo · 10 months ago
What you see as anticompetitive payment processing on iOS, others may see as a friendly and harmless business model. HNers, be respectful when criticizing bigger companies like John Deere and Apple - it's important you don't hurt these customers' feelings and scare them off.
1000100_1000101 · 10 months ago
Your tractor doesn't (I would hope) contain your banking details and all your emails, contacts, browsing history, photos, etc. It deserves to be treated as the tool that it is.

Apple taking your data privacy seriously seems a worthy exception to me. You're free to disagree, and buy an Android.

u/1000100_1000101

Karma: 352 · Cake day: June 12, 2020