vessenes · 6 years ago
Commenters here seem dubious. I’ll take the contra-position. This feels to me like it’s going to be great; a big win for consumers and developers.

Current A12Z chips are highly performant; Apple is roughly one chip cycle ahead on performance/watt from any other manufacturer. I presume their consumer hardware will launch with an A13Z, or maybe an A14-type chip.

Apple has consistently shipped new chip designs on time; Intel's thrashing has cost them at least two significant update cycles on the MacBook line in the last six years. Search this fine site for complaints about how new Mac laptops don't have real performance benefits over old ones; those complaints are 100% down to being saddled with Intel.

Apple has a functional corporate culture that ships; adding complete control of the hardware stack in is going to make for better products, full stop.

Apple has to pay Intel and AMD profit margins for their mac systems. They are going to be able to put this margin back into a combination of profit and tech budget as they choose. In the early days they are likely to plow all this back into performance, a win for consumers.

So, I’m predicting an MBP 13 - 16 range with an extra three hours of battery life+, and 20-30% faster. Alternately a MacBook Air type with 16 hours plus strong 4K performance. You’re not going to want an Intel Mac even as of January of 2021, unless you have a very unusual set of requirements.

I think they may also start making a real push on the ML side in the next year, which will be very interesting; it’s exciting to imagine what Apple’s fully vertically integrated company could do controlling hardware, OS and ML stack.

One interesting question I think is outstanding - from parsing the video carefully, it seems to me that devs are going to want ARM linux virtualized, vs AMD64. I’m not highly conversant with ARM linux, but in my mind I imagine it’s still largely a second class citizen — I wonder if systems developers will get on board, deal with slower / higher battery draw intel virtualization, or move on from Apple.

Languages like Go with supremely simple cross architecture support might get a boost here. Rust seems behind on ARM, for instance; I bet that will change in the next year or two. I don’t imagine that developing Intel server binaries on an ARM laptop with Rust will be pleasant.
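For concreteness, cross-compiling for an Intel Linux server from an ARM laptop looks roughly like this today (a sketch; these are the standard toolchain invocations, and the Rust case still needs a cross-linker configured, which is part of the friction):

    # Go: set the target OS/arch and build
    GOOS=linux GOARCH=amd64 go build ./...

    # Rust: install the target's standard library, then build
    rustup target add x86_64-unknown-linux-gnu
    cargo build --release --target x86_64-unknown-linux-gnu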

ianhowson · 6 years ago
> So, I’m predicting an MBP 13 - 16 range with an extra three hours of battery life+, and 20-30% faster.

I'm predicting the opposite: you won't actually see any difference.

Once you look closely at power profiles on modern machines you'll see that most energy is going into display and GPU. CPUs mostly run idle. Even if you had a theoretical CPU using zero energy, most people are not going to get 30% battery life gains [1]. Not one thing that they demoed requires any meaningful CPU power.

Similarly, while ARM parts are more efficient than x86 per compute cycle, it's not a dramatic change.

The big changes, I think, are more mundane:

- Apple is going to save $200-$800 cost per Mac shipped

- Apple can start leaning on their specialized ML cores and accelerators. They will probably put that stuff in T2 for Intel Macs. If they're already shipping T2 on every machine, with a bunch of CPU cores, why not just make those CPU cores big enough for the main workload?

Doubling CPU perf is meaningless if you can ship the right accelerators that'll do 100x energy/perf for video compression, crypto and graphics.

[1] for a regular web browsing type user; obviously if you're compiling stuff this may not apply; if that is true you're almost certainly better off just getting a Linux desktop for the heavy lifting

JohnBooty · 6 years ago

    Apple can start leaning on their specialized ML 
    cores and accelerators
Thank you for mentioning this. I feel like many have missed it.

I think Apple sees this sort of thing as the future, and their true competitive advantage.

Most are focusing on Apple's potential edge over Intel when it comes to general compute performance/watt. Eventually Apple's likely to hit a wall there too though, like Intel.

Where Apple can really pull away is by leaning into custom compute units for specialized tasks. Apple and their full vertical integration will stand alone in the world here. Rather than hoping Intel's chips are good at the things it wants to do, it can specialize the silicon hardcore for the tasks it wants macOS to do in the future. It will potentially be a throwback to the Amiga days: a system with performance years ahead of competitors because of tight integration with custom hardware.

The questions are:

1. Will anybody notice? The initial ARM Macs may be underwhelming. I'm not sure the initial Mac ARM silicon will necessarily have a lot of special custom Mac-oriented compute goodies. And even if it does, I don't know that Mac software will be taking full advantage of it from day one. It will take a few product cycles (i.e., years) for this to really bear fruit.

2. Will developers bother to exploit these capabilities as Apple surfaces them? Aside from some flagship content-creation apps, native Mac apps are not exactly flourishing.

Teknoman117 · 6 years ago
I fully expect any reduction in costs for Apple will get sent to their shareholders, not the consumers.
etqwzutewzu · 6 years ago
I'm also predicting there will be no difference in battery life.

If you check the technical specifications of past MBPs, you'll notice one thing: battery capacity (in watt-hours) keeps decreasing while rated battery life stays constant (e.g., 10 hours of web scrolling).

Gains in power efficiency let Apple shrink the battery and other components, which enables ever-slimmer designs.

43920 · 6 years ago
> Once you look closely at power profiles on modern machines you'll see that most energy is going into display and GPU. CPUs mostly run idle. Even if you had a theoretical CPU using zero energy, most people are not going to get 30% battery life gains

This doesn't really seem to match my experience; at least on a 2015 MBP, the CPU is always consuming at least 0.5-1W, even with nothing running. If I open a webpage (or leave a site with a bunch of ads open), the CPU alone can easily start consuming 6-7 watts for a single core.

Apple claims 10 hours of battery life with a 70-something Wh battery, which would indicate they expect total average power consumption to be around 7W; even the idle number is a decent percentage of that.
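Back-of-the-envelope with those numbers (rough figures; idle draw varies by model):

    70 Wh / 10 h            ≈ 7 W average total draw
    0.5-1 W CPU idle        ≈ 7-14% of that budget
    6-7 W on one busy core  ≈ the whole budget by itself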

(Also, has anyone been able to measure the actual power consumption of the A-series CPUs?)

klelatti · 6 years ago
Seems to me very likely that Apple's graphics silicon is much more performant and power efficient than Intel's integrated GPUs. CPUs idling most of the time points to the advantage of a big.LITTLE-style design, which Apple has been using for iPads etc. for a while. So maybe not 30%, but not negligible either.

They demoed Lightroom and Photoshop, which are surely using meaningful CPU resources?

Agreed on the accelerators and the cost savings. Altogether, probably a compelling case for switching.

papito · 6 years ago
Try browsing the web on a semi-decent laptop from, say, 2008. It's a frustrating experience. It is obnoxious how much CPU power modern websites require.
bobdole12345 · 6 years ago
You know that Apple is going to be making the GPU with the same technology as the CPU right?

And those accelerators don't need to be discrete, Apple can add them to their CPUs.

So, it looks like your point is: Sure, Apple is going to jump a couple process nodes from where Intel is, but everything is somehow going to remain the same?

eugeniub · 6 years ago
> Once you look closely at power profiles on modern machines you'll see that most energy is going into display and GPU.

Hard to square this with the simple fact that my 2018 MacBook Pro 13" battery life goes from 8 hours while internet surfing to 1.5 hours for iOS development with frequent recompilations.

zanethomas · 6 years ago
I'm predicting a future where the OS is completely locked down and all software for Macs must be purchased from the App Store. Great revenue model for Apple.
an_opabinia · 6 years ago
> Apple is going to save $200-$800 cost per Mac shipped

Whom is Apple going to sell binned A14s to?

Where does everyone think margin comes from in the chip business?

ancientknight · 6 years ago
> CPUs mostly run idle. Even if you had a theoretical CPU using zero energy, most people are not going to get 30% battery life gains

I don't agree. Simply disabling Turbo Boost on an MBP16 nets me around 10-15% more battery life. Underclocking a CPU can even result in twice to thrice the battery life on a gaming laptop under the same workload.

Here are more details: https://www.extremetech.com/computing/304884-disabling-intel...

andy_ppp · 6 years ago
I actually think total battery life will go up a fair bit and compile times will be much faster, 20-30%, while giving everyone full power when not on the mains. The amount my MacBook throttles when on battery is startling and stopping that while still giving huge battery life, say 6h at 80% CPU will be a huge win. Apple wouldn’t bother unless they knew the benefits they can bring over the next 10 years will be huge.

All of this is complete speculation of course, but I don’t believe this one will be a financial decision; it’ll be about creating better products.

baddox · 6 years ago
Don’t iPads get significantly better battery life on similar casual workloads than a similar-form-factor Intel PC (like a MacBook Air)?
Xixi · 6 years ago
I was looking at the benchmarks of the latest MacBook Air here [1]. In GPU performance it's not competitive with the iPad Pro, and that's quite an understatement. For me the most obvious win of this migration to "Apple Silicon" will be that entry-level MacBook/iMac will have decent GPU performance, at long last...

[1] https://arstechnica.com/gadgets/2020/03/macbook-air-2020-rev...

deepGem · 6 years ago
> Apple can start leaning on their specialized ML cores and accelerators

Can you elaborate a bit more on this?

I was wondering how this would unfold and looks like things are moving in that direction https://blog.tensorflow.org/2020/04/tensorflow-lite-core-ml-...

If TF models can interoperate with Core ML, boy, that'll be a home run for Apple, because eventually all ML frameworks will follow suit.

Someone · 6 years ago
“Apple can start leaning on their specialized ML cores and accelerators“

I think that hits the nail on the head. Since I only cursorily listened to both the keynote and the State of the Union I may have missed it, but I heard them mention neither “CPU” nor “ARM”. The term they use is “Apple Silicon”, for the whole package.

I think they are, at the core, but from what they said, these things need not even be ARM CPUs.

acjohnson55 · 6 years ago
This doesn't match my experience. Web browsing kills my battery life, and my assumption is that it's driven by JS.
pier25 · 6 years ago
Also heat is greatly reduced with ARM, no?
hn_check · 6 years ago
To distill your post-

-the CPU will be a lot more powerful and faster, but it isn't really faster because it's like an accelerator or something.

-if you actually use your computer get some vague "Linux desktop" or something (which is farcical and borders on parody, completely detached from actual reality). Because in the real world people actually doing stuff know that their CPU, and its work, is a significant contributor to power consumption, but if we just dismiss all of those people we can easily argue its irrelevance.

My standards for comments on HN regarding Apple events are very low, but today's posts really punch below that standard. It's armies of ignorant malcontents pissing in the wind. All meaningless, and they're spraying themselves with piss, but it always happens.

In the end this noise doesn't matter whatsoever.

bluedino · 6 years ago
CPUs do not mostly run idle, if my CPU usage indicator is accurate.
greggman3 · 6 years ago
I don't know about MBP but the iPad Pro has a faster GPU than an MBA for most of the shaders on Shadertoy.com
raydev · 6 years ago
> most energy is going into display and GPU

You know Apple Silicon is going to handle this too, right?

taneq · 6 years ago
> Apple is going to save $200-$800 cost per Mac shipped

Does Apple actually have its own silicon fab now or are they outsourcing manufacture? If the former, those are /expensive/ and they'll still be paying it off.

julienb_sea · 6 years ago
This seems very inaccurate to me. Most laptops do not have discrete GPUs, so tasks like rendering a YouTube video do require CPU cycles. Zoom is very CPU-intensive on basically any Mac laptop, and people always have a ton of tabs open, which can be fairly CPU-intensive.

In other words, there are definitely gains to be had. My iPad Pro offers a generally smoother and more satisfying experience, with a silent and much cooler-running CPU, versus my MBP, and they offer similar battery life. Scale up to MBP battery size and I suspect we will be seeing a few hours of battery life gain.

jl6 · 6 years ago
Here's an analogy to help explain the skepticism: ants have amazing efficiency - they can lift multiples of their own body weight. So why can't an ant lift my car? Well, because it's too small. So let's just take the same ant design and scale it up? Unfortunately, it doesn't work like that. A creature capable of lifting my car wouldn't be much like an ant.

There is no guarantee that a phone-scale CPU can just become 4x faster by 4x'ing the power/TDP/die area. If it were that easy, Intel would already have done it (and no, the x86 architecture isn't so terribly inefficient that they are leaving triple-digit percentage improvements on the table).

What I expect we'll see are ARM chips that are power and performance competitive with x86 chips only for specific curated use cases. Apple will extract an advantage by putting custom hardware acceleration into them, to cater for those specific tasks. They will not be able to achieve general purpose performance improvements wildly beyond what Intel can already do.

This is how the current iDevices achieve their excellent performance and battery life. Not through raw general-purpose CPU horsepower, but by a finely tuned synergy between hardware and software. Apple are taking their desktop down the same route. This will be the ultimate competitive advantage for their own software - they will be able to move key software components into hardware, and make it look like magic. But as a developer, you won't be able to participate in this unless you target Apple-blessed hardware instructions/APIs. Your Python script isn't going to start running 4x faster unless you can convince Apple to implement its inner loop in custom silicon.

I have no doubt that Apple will be leaning hard into ASIC territory as they build out their new CPUs. The endgame? Every software function you need, baked into perfectly optimised silicon by the monovendor.

klelatti · 6 years ago
> What I expect we'll see are ARM chips that are power and performance competitive with x86 chips only for specific curated use cases.

Sorry but there is no justification for this. With the same thermal constraints there is every expectation that an Apple / Arm CPU would be more performant and efficient than a comparable x86. Why? Because aarch64 doesn't have the historical legacy that x86 has, and Apple has already shown what they can do in the iPad etc. Sure, they won't be triple digits, but it will be enough to be noticeable.

And, as you say, they will have the advantage of Apple's custom silicon for specific use cases. So best of both worlds.

fiblye · 6 years ago
Your wording makes it sound like ARM is still being used just for smaller devices and controllers with very well defined and limited uses. General purpose computing is already possible with iPads and iPhones. They're just artificially limited by the OS.

iDevices weren't really made with games in mind, but they can push out performance that beats handheld gaming devices. Artists (including myself) use iPads extensively, and the response time with the Apple Pencil beats out just about anything else on the market. The only limiting factor is the tiny memory, which limits the file size and layer count in some programs. They're just fine for watching video, and even multitasking with a video playing while working on something else. This is on a tiny device with no active cooling and long battery life, beating out most laptops in the same price range.

I don't believe there is any curated use case. They're already more than capable of being general purpose computers. I mean, Apple is already openly advertising that they're making iPad OS more desktop-like and operable with mice and keyboards. Literally the only things holding them back are the OS and Apple's refusal to put some decent memory inside.

occamschainsaw · 6 years ago
Here's a counterexample (stretching your analogy a bit): Ants lifting the heaviest car in the world.

[1]https://www.top500.org/news/japan-captures-top500-crown-arm-... [2]https://news.ycombinator.com/item?id=23601098

eanzenberg · 6 years ago
They already have a mobile chip that is as fast as an actively cooled notebook chip.
looping__lui · 6 years ago
Yeah, the last time they did that, with Nvidia graphics cards just as Adobe released its new rendering engine, everybody was really thrilled to learn that, because of that inherent Apple advantage, they could buy Apple video editing software that would not sht itself, instead of using the Adobe tools...

E.g., a nice article from 2010: https://nofilmschool.com/2010/07/apple-snubs-adobe-again-wit...

dzhiurgis · 6 years ago
Intel can't shrink the die size to what TSMC/Global Foundries/Samsung can and they will never let them manufacture due to IP/national security/etc reasons.
IOT_Apprentice · 6 years ago
Are you saying that Amazon AWS's Graviton2 EC2 instances can't handle server-level performance?
jimnotgym · 6 years ago
I wonder how that will pan out for the people running MS Office all day on their Macs...
kllrnohj · 6 years ago
> Apple is roughly one chip cycle ahead on performance/watt from any other manufacturer.

Eh? This is a flimsy claim. AMD's performance/watt is extremely impressive right now. Apple is ahead of Intel for sure, but Intel isn't the only other player here.

> So, I’m predicting an MBP 13 - 16 range with an extra three hours of battery life+, and 20-30% faster. Alternately a MacBook Air type with 16 hours plus strong 4K performance.

A slightly more efficient CPU doesn't get you this. You need significant efficiency improvements across a variety of aspects, including those Apple has already been optimizing for years like the display.

GeekyBear · 6 years ago
I'd say that you should take a look at a comparison of the power efficiency of Apple's "little" core in the A13 to a stock ARM little core.

>In the face-off against a Cortex-A55 implementation such as on the Snapdragon 855, the new Thunder cores represent a 2.5-3x performance lead while at the same time using less than half the energy.

https://www.anandtech.com/show/14892/the-apple-iphone-11-pro...

Apple has been hitting it out of the park lately.

StillBored · 6 years ago
AMD is tiny compared to Intel; the fact that they are besting Intel goes to show how stuck Intel has been for ~5 years.

The real problem, though, is that Apple is actually designing a core 100% focused on the target market, unlike Intel, for whatever reason, and AMD, which didn't have the funds to run a dedicated design team for laptops/desktops.

So, I would expect the engineering tradeoffs for said laptop/desktop processor to show. AKA, things like hyperthreading are quite a win for servers, but at best are a wash for a desktop use case focused on extremely high single thread perf at the expense of throughput.

Eric_WVGG · 6 years ago
You're correct that AMD’s offerings are impressive, but that's vs an uncooled A12 chip. Add active cooling and a few more watts, and there's no reason why they couldn't blow the doors off.
valuearb · 6 years ago
Apple SOCs are going to be a big part of the performance/watt gains.
MR4D · 6 years ago
>Eh? This is a flimsy claim.

Anandtech would like to disagree with your word choice.

https://www.anandtech.com/show/14892/the-apple-iphone-11-pro...

ascendantlogic · 6 years ago
AMD is small though. I have no data to back up my gut but anecdotally I feel like they don't have the manufacturing capacity to keep up with Apple's demands right now.
moduspol · 6 years ago
Agreed on all points.

I saw speculation elsewhere that this change, along with AWS's addition of Graviton-based (their own ARM processors) instances at much more competitive price points relative to x86, are bound to spearhead the change to "ARM by default."

If your devs are already using ARM, and ARM's notably cheaper in the cloud, that's a compelling case. If you're already using Kubernetes / Docker heavily, you're probably already 80% of the way there. Linuxes that aren't supporting ARM as a "first class citizen" will soon, and undoubtedly that will be a speed bump at worst.
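For what it's worth, the Docker side of that is already mostly a flag away. A sketch using buildx (bundled with recent Docker releases; the image name here is made up):

    # build and push one image manifest covering both architectures
    docker buildx build --platform linux/amd64,linux/arm64 \
        -t example/myapp:latest --push .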

I'm interested to see the specs relative to the x86 Macs, but the only open question to me was whether or not we'd see the x86 emulation layer. Well, we did, and it may not be perfect but it certainly looks like they put a lot of effort into it. If it works as well as it looks, I think this transition is borderline inevitable. I think I've bought my last x86 hardware.

kllrnohj · 6 years ago
This claim doesn't really hold up. The problem here is the vast majority of non-Apple laptops & desktops that are in use. THOSE will still all be x86 for the foreseeable future as ARM CPUs not made by Apple all have terrible per-core performance. Graviton2 compensates by just throwing 64 cores at the problem, but that's not going to do anything for your Electron-based text editor that struggles to use 2 CPU cores in the first place. Or for a typical webpage, which struggles to use more than a single CPU core.

But otherwise as a developer-focused example the 32-core Epyc Rome compiles Build2 faster than the 64-core Graviton 2: https://openbenchmarking.org/embed.php?i=2006047-PTS-EPYC2EC...

That's going to matter when a company is spec'ing out workstations to buy, which are unlikely to have an Apple option on the table at all in the first place, and Amazon isn't going to sell you Graviton2 CPUs to put under your desk, either.

This _could_ be the start of a bigger focus on ARM, definitely, but to really make inroads into what devs use you'll need someone other than Apple to step up to the plate. Or for Apple to become vastly larger than they are in the desktop space. Otherwise we'll all just keep cross-compiling like we have been for the last decade of mobile app development.

miohtama · 6 years ago
Does Docker work on ARM, or does the hypervisor have any Intel-specific features?
leonidasv · 6 years ago
Apple chips are fast mostly because they have a lot of cache to spare.

Take for example the A12Z. It has 8MB of L2 (not L3, it's L2!) cache. An Intel Core i7-1068NG7, present in the latest MacBook Pros (and performing akin to the A12Z according to Geekbench), has only 2MB of L2 cache.

No other ARM CPU has this level of L2 cache. Apple chips are not "magical", Apple just can afford packing up lots and lots of cache because they are not in the silicon "race to the bottom" like Qualcomm and Intel are, for example. L2 cache is very expensive and Apple is just hacking its way up by packing as much L2 cache as they can.

Don't take me wrong, it's not that Apple is "right" or "wrong" by doing it. They just can and did it. However, it's needless to say that their CPUs are not so different from other ARM ones; they just happen to have a budget and a business model that lets them ignore the price/performance ratio when designing chips in order to achieve maximum performance.

kllrnohj · 6 years ago
> Take for example the A12Z. It has 8MB of L2 (not L3, it's L2!)

It's not really that clear cut. You could also argue that the A12Z has 8MB of L3 and 0MB of L2. The L2 in the A12Z is shared while the L2 in the Intel CPU is not.

Similarly the latency of the A12Z's L2 is a lot higher than the latency of Intel's L2, but also then still lower than Intel's L3. https://images.anandtech.com/doci/13661/A12X-lat.png & https://images.anandtech.com/doci/14664/ICLlat.png

So it's not "traditional" L2 as you're familiar with it, it's more like an L2.5 or something. Although still accurately called L2 as it is the second level of cache, it's just that Apple went with a rather different cache hierarchy & latency structure than Intel did. But it's really not at all accurate to compare the 8MB of L2 on the A12Z to the 2MB of L2 on Ice Lake. Those are very different caches.

azinman2 · 6 years ago
In college my comp. architecture prof. used to say 'cache is king'.
hn_check · 6 years ago
Why do you quote "magical" as if someone said that?

Further, your point is that Apple chips are faster because of easily replicated reasons. Intel is charging hundreds of dollars for their chip -- you should charge them some consulting fees and tell them how easy this is to boost their performance! This isn't even considering that your whole analysis is flawed to begin with and you're comparing apples and oranges.

old-gregg · 6 years ago
I'll bite.

  > Current A12Z chips are highly performant; Apple is
  > roughly one chip cycle ahead on performance/watt from
  > any other manufacturer.
We haven't been able to compare them. Micro-benchmarks do not count because mobile versions of Apple chips haven't been designed for desktop requirements. People love comparing CPU cores with micro-benchmarks, but the hardest thing for a modern desktop/server chip is to feed data to many cores while maintaining cache coherence.

  > So, I’m predicting an MBP 13 - 16 range with an 
  > extra three hours of battery life+, and 20-30% faster.
Before agreeing with your estimates, I want to play with a true 8-core Apple CPU with a large multi-level cache first. Building the Linux kernel with -j16 will be a fun exercise. Look, AMD is not stupid, and they're on the same node Apple will be using, and they're not 30% faster than even a 5 y.o. Skylake.
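For reference, that exercise is just the stock kernel build invocations (assuming an aarch64 toolchain is installed for the cross case):

    # native build on an arm64 machine
    make defconfig && make -j16

    # cross-build from x86
    make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- defconfig
    make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- -j16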

  > Apple has to pay Intel and AMD profit margins for 
  > their mac systems. They are going to be able to put 
  > this margin back into a combination of profit and 
  > tech budget as they choose.
I wonder how their conversations with TSMC go. With Intel, at least they had AMD to use as a bargaining chip. With TSMC there's no alternative.

  > One interesting question I think is outstanding - 
  > from parsing the video carefully, it seems to me 
  > that devs are going to want ARM linux virtualized
  > vs AMD64.
That's the big one. The world's software is built for and runs in data centers, not laptops. Our machines are increasingly becoming nothing but thin clients, remote displays that happen to run Javascript. CPUs do not matter. And I suspect that's the real reason they're switching.

But from the developer's perspective, it's incredibly convenient to use the same platform (OS + instruction set) as the machine they're targeting, even for interpreted languages. Linus Torvalds wrote a well-articulated email about this a while ago; IIRC he was commenting on POWER, but I think his points are valid. At my company, devs keep struggling with Docker on a Mac. Add to that the ARM pain, and I wonder how many will finally get a Thinkpad. Developers will switch to ARM when the majority of AWS instance types go ARM.

P.S. I love how "old-tech" HN is, but for the love of god, give us a decent way to "reply with quote".

Stratoscope · 6 years ago
> P.S. I love how "old-tech" HN is, but for the love of god, give us a decent way to "reply with quote".

You have to copy and paste, but that's not too hard, even on mobile.

The main thing is to not use code formatting, and not break up a quoted sentence or paragraph into multiple lines.

Instead, do it the way I quoted your comment above, like this:

  > *Entire quoted paragraph.*
That will render nicely on all devices regardless of the length of the paragraph. If you quote multiple paragraphs, add a blank line between each paragraph so they don't run together.

imtringued · 6 years ago
Wasn't Linus' email implying that if you were to run something like Docker natively on ARM, then the images you build would be ARM-specific? You are not going to spend time and effort running the build on an x86 machine to then deploy on another x86 machine. You will just deploy your Docker images straight to an ARM server.
pier25 · 6 years ago
> Look, AMD is not stupid, and they're on the same node Apple will be using

Could you elaborate? What do you mean with the "same node"?

auggierose · 6 years ago
I am pretty sure they had those TSMC conversations BEFORE the announcement.
ChrisMarshallNY · 6 years ago
I think it will work out fine.

Apple has an absolutely top-shelf team, designing chips.

By hand (Qualifier: Not sure if they still do, but they did, while everyone else was using automation).

They also have a great deal of experience in repaving the highway while traffic is running at capacity. They mentioned it in their keynote. They've done it three times. I have been there, for each of those times.

I was also there for the one time they completely pooched it (Can anyone say "Copland"? Drop and give me twenty!).

It will be moderately painful. Not too bad. Quite manageable, and it will take at least a couple of years to transition.

I am in no hurry for one of those dev kits, though. They will be quite rough, and I have no compelling reason to use them.

I am looking forward to an entirely new Xcode. The current one is getting crashier every day. I'd also like to have one that can run on my iPad.

grandmczeb · 6 years ago
> By hand

Not using automation isn’t “badass”, it’s a sign of a deeply screwed up engineering culture. It’s on par with forcing software developers to program exclusively in machine code.

Luckily, Apple uses the standard EDA tools pretty extensively so I don’t really think this applies to them. I also agree that Apple hardware engineers are generally extremely good.

Teknoman117 · 6 years ago
Why do you feel that Rust is behind on ARM? I can't comment on the performance, but everything I used that was purely written in Rust compiled and ran perfectly on my PineBook Pro (with the exception of alacritty, but that's because the PBP doesn't support OpenGL 3.x).

Go does have the (platform) advantage (?) of preferring the "rewrite everything in Go" approach, so those projects just transition when the tooling supports a new architecture. Rust is intentionally going with an interop design, rather than telling people the only answer is to rewrite all their favorite libraries.

steveklabnik · 6 years ago
Rust and ARM is just fine. For example, Cloudflare famously keeps their entire stack cross-compilable to ARM, and even ships Rust on iPhones.

There are two areas where I believe you could call it second-class at the moment, though:

1. There are no ARM targets in Rust's "Tier 1" platform support.

2. std::arch doesn't have ARM intrinsics in stable.

For 1, ARM targets are in a weird space; they aren't Tier 1, but they're closer than most of the other Tier 2 targets. Several ARM targets are Tier 1 for Firefox, for example, so they get a bunch of work done there.

For 2, well, there hasn't been as much demand before. I expect that to change because of this.

... we'll see what the future holds :)
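To illustrate point 2, here's roughly what cfg-gated SIMD looks like; a sketch, with the NEON intrinsics living in core::arch::aarch64 (nightly-only as of this writing, per the point above):

    #[cfg(target_arch = "aarch64")]
    fn add4(a: &[f32; 4], b: &[f32; 4], out: &mut [f32; 4]) {
        use core::arch::aarch64::*;
        // NEON is baseline on aarch64, so no runtime feature detection needed
        unsafe {
            let va = vld1q_f32(a.as_ptr());
            let vb = vld1q_f32(b.as_ptr());
            vst1q_f32(out.as_mut_ptr(), vaddq_f32(va, vb));
        }
    }

    #[cfg(not(target_arch = "aarch64"))]
    fn add4(a: &[f32; 4], b: &[f32; 4], out: &mut [f32; 4]) {
        // portable fallback; LLVM will often autovectorize this anyway
        for i in 0..4 {
            out[i] = a[i] + b[i];
        }
    }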

monocasa · 6 years ago
I've been working on an OS in Rust that has an aarch64 port, and I've for sure seen some... questionable output. It's all been valid code for the input, but not nearly as optimized as I've come to expect out of LLVM based compilers. I'm sure there's some low hanging fruit that needs attention is all.
justinfrankel · 6 years ago
> I’m not highly conversant with ARM linux, but in my mind I imagine it’s still largely a second class citizen

IMO ARM linux is great, the real thing lacking is good hardware to run it.

swebs · 6 years ago
Yeah, I tried out a Raspberry Pi 4 as a desktop replacement and pretty much everything is supported except for proprietary stuff like games.
scythe · 6 years ago
I had an ARM Chromebook for a while (around 2016) that I customized with a 256GB SD card and Linux Mint. The software all worked well, but the WiFi card died after a year and effectively bricked the damn thing. Cross-compiling might be an issue, but that's not a primary use case.
dfox · 6 years ago
You mean all the Android handsets? The history of Linux on ARM is colorful, full of corporate missteps and giant brands (namely HTC and to lesser extent Samsung) that were created from that.
Delk · 6 years ago
> I’m not highly conversant with ARM linux, but in my mind I imagine it’s still largely a second class citizen

In terms of distros maybe yes. Most distros are targeted at laptops, desktops or servers, and few of those have ARM processors.

In terms of architectural support by the kernel and low-level infra, I see no reason for that to be true at all. Open source kernels and (at least lower-level) userspace have for decades paid more attention to compatibility with various hardware architectures than proprietary operating systems.

Of course you'll have fewer drivers for hardware associated with a particular architecture if there's less interest for the hardware, or if the hardware is less available in form factors that most developers are interested in. But that applies at least equally to non-open source platforms. If MS or Apple don't have a commercial interest in maintaining support for a particular platform (and they usually have only one or two in mind), nobody's going to do it.

Gibbon1 · 6 years ago
> Most distros are targeted at laptops, desktops or servers, and few of those have ARM processors.

I wouldn't bet my farm on that without a bunch of research and cross-checking, because embedded Linux is really common. Consider that manufacturers are producing very large numbers of embedded ARM micros with external memory interfaces. I think the majority of those are running Linux.

ksec · 6 years ago
I am predicting the opposite. Apple isn't extending its Mac line for performance's or battery's sake.

They are going about expanding its marketshare.

There are close to 1 billion iPhone users, most of whom have never used a Mac. Many will need a second device for some tasks, and that will be either an iPad or a Mac. Out of the 1.5B total PC market, Apple has 100M Mac users. I would say it is not too far-fetched to say Apple wants to double the number of Mac users to 200M.

For every $100 going to Intel, Apple could knock $200 off its retail price while keeping the same margin.
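To make the arithmetic explicit, with made-up round numbers: if retail is roughly 2x component cost to hold a ~50% gross margin, each dollar of component cost removed is two dollars off retail:

    $500 BOM incl. $100 Intel chip  ->  $1,000 retail at 50% margin
    $400 BOM with in-house chip     ->  $800 retail at the same margin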

A $799 MacBook (the same starting price as the iPad Pro) would be disruptive; the premium would then be small enough over the ~$500 PC notebook price.

In the longer term I think Apple is trying to reach 2 billion active devices, and it certainly can't do this with the iPhone alone. There is plenty of market space for the Mac to disrupt.

thewileyone · 6 years ago
> For every $100 going to Intel, Apple could knock $200 off its retail price while keeping the same margin.

They're not going to change their retail price. It'll be the same, but "better"/"different" ... that's how they'll market it.

tolmasky · 6 years ago
I haven't felt particularly constrained by CPU in a long time. My main issues have been with RAM (thankfully the new MacBooks finally started supporting 32GB of RAM), and GPU, which has been miserable ever since Apple got into a fight with NVIDIA. It's not just that Apple doesn't use NVIDIA; it's that they won't allow NVIDIA to ship their own drivers for it.

I just want to plug in an eGPU with an RTX 2080 card. Instead, you have this incredibly limited set of officially supported cards that are also hyper expensive. Blackmagic stopped making their eGPU Pro, so even if money is no object you can't get a great laptop GPU extension that supports their XDR displays.

Now, you might be saying: if you want a great GPU, why are you buying a laptop? Well, 1) even if I were to get the one model of Mac that allows me to do something interesting with GPUs (the Mac Pro), I still can't install the NVIDIA cards I want. And 2) a laptop + great eGPU is a great setup that is supported in the non-Mac space, so it is not a bizarre request.

All of this to say: the ARM stuff is fine, but it won't really move the needle for me, and doesn't address any of my performance issues, and I would argue a lot of the performance issues a lot of people actually have (especially graphics artists).

pcwalton · 6 years ago
Rust isn't behind on ARM, except for SIMD intrinsics on stable, which are not things most languages (e.g. Java, Go) have at all for any architecture.
Q6T46nT668w6i3m · 6 years ago
Apple doesn’t control their machine learning stack. The models they ship are likely created and trained on PCs running Linux and NVIDIA GPUs. It’s entirely possible they’ll extend the Neural Engine to be useful for training but they’d still need to contribute or convince others to contribute to the existing tooling.
nojito · 6 years ago
The enduser doesn't care about training. iOS devs just care about APIs that Apple has built for them.
zitterbewegung · 6 years ago
You can do some refinement right now already (from WWDC 2019).

But yes, most deep learning models are trained on NVIDIA GPUs. All of the other model types can already be trained on regular CPUs.

https://machinethink.net/blog/coreml-training-part1/

jimnotgym · 6 years ago
> those complaints are 100% down to being saddled with Intel.

And yet the new MacBook Pro base models feature an 8th-gen Core i5. That is two generations behind the bleeding edge. I think some people might be experiencing the speed problems you described because their machine has an old-gen processor in a shiny new box.

pathartl · 6 years ago
Not to mention that Apple setting a standard of throttling at 100°C, and not giving the machines adequate cooling, affects performance in a non-trivial fashion.
9fathom · 6 years ago
> One interesting question I think is outstanding - from parsing the video carefully, it seems to me that devs are going to want ARM linux virtualized, vs AMD64. I’m not highly conversant with ARM linux, but in my mind I imagine it’s still largely a second class citizen — I wonder if systems developers will get on board, deal with slower / higher battery draw intel virtualization, or move on from Apple.

It's in fairly good shape, and has an active community. With the current proliferation of IoT devices, both the kernel and userland are well-maintained, and plenty of distros are available. The kernel also benefits from much of the work done for Android and Chromebooks as well.

All of the usual FOSS software is ported and runs well. As of this moment, you could easily take your pick from any of Debian, Ubuntu, Fedora, Arch, Manjaro, Slackware, or Alpine just for starters, plus a whole mess of specialty distros. Many of those offer both 32- and 64-bit ARM versions.

Also bear in mind that recent ARM CPUs do support hardware-assisted virtualization as well. KVM and Xen are both available for ARM today, and I'd be shocked if Apple's Hypervisor.framework doesn't roll out with ARM support in the new macOS version as well.

(I'm writing this from Firefox in Manjaro ARM on a PineBook Pro, that I use as my daily driver)

rkagerer · 6 years ago
> it’s exciting to imagine what Apple’s fully vertically integrated company could do controlling hardware, OS and ML stack

So true! But from a market dominance perspective, it's also a bit terrifying.

2OEH8eoCRo0 · 6 years ago
Replace Apple with Google or Microsoft and it would get shit on around here. Apple gets a pass
zitterbewegung · 6 years ago
I agree with you, but I think we are going to see at least one A12Z consumer product (other than the iPad), probably a rereleased MacBook with a better keyboard.
jlarocco · 6 years ago
> One interesting question I think is outstanding - from parsing the video carefully, it seems to me that devs are going to want ARM linux virtualized, vs AMD64. I’m not highly conversant with ARM linux, but in my mind I imagine it’s still largely a second class citizen — I wonder if systems developers will get on board, deal with slower / higher battery draw intel virtualization, or move on from Apple.

Somewhat ironically, I think it's mostly the languages trying to be safer alternatives to C that are most behind on supporting ARM.

I've done a little bit of Lisp development on my Raspberry Pi (with SBCL and Emacs/Slime), and in most cases I don't have to change anything moving between my AMD64/Linux desktop, Intel/OSX MBP, and ARM64/Linux Raspberry Pi. And that's even when using CFFI bindings to C libraries.

I'm not sure SBCL's ARM backend is at the same level as the x86 backends, but it works well, and there's on going work on it.

stjohnswarts · 6 years ago
I'm not sure I agree with you on most of this, but I think more competition in the cpu biz is always a good thing.
greggman3 · 6 years ago
I'm excited. My #1 wish is a 16" MacBook Pro that weighs 3 lbs or less. Take an iPad Pro, make it 16" instead of 12", add a keyboard, run macOS. LG already makes 15.6" Intel notebooks that weigh 3 lbs. Apple can do it too!
larozin · 6 years ago
> One interesting question I think is outstanding - from parsing the video carefully, it seems to me that devs are going to want ARM linux virtualized, vs AMD64.

Hahaha, look for the user-agent at 1:44:26 :) They use an old Intel Mac for the virtualization demo.

jrobn · 6 years ago
The main thing Apple has done to improve their A-series chips has been adding massive L2 caches.

I still see major advantages of putting an A-series chip into a MacBook Pro:

1) There will be a much larger thermal and power draw envelope available to a new A-series chip. I suspect we will see insane “boosting” clock speeds.

2) Incredible “at idle” performance well beyond what x86 can provide with on-die GPU cores, which means a bit better battery life for that screen.

3) More opportunity for tightly integrated acceleration blocks on die for codecs, ML, and other hardware acceleration methods for Apple-only software libraries.

4) Easy porting between iOS, macOS, and tvOS.

#3 will be the most significant.

imtringued · 6 years ago
I don't want to repeat myself so I'll just link to my previous comment. https://news.ycombinator.com/item?id=23612245
mlindner · 6 years ago
> Rust seems behind on ARM, for instance; I bet that will change in the next year or two. I don’t imagine that developing Intel server binaries on an ARM laptop with Rust will be pleasant.

Umm what? Rust supports ARM as a first class citizen as does LLVM. They only list ARM as second class because it's not a desktop platform generally. https://forge.rust-lang.org/release/platform-support.html

Deleted Comment

csjr · 6 years ago
Agree with you on some points; I’m really excited to see what’s next. I’m also betting on a new, and faster, MacBook, maybe with a discounted price to incentivize the migration?

About the virtualization, they will probably make it more efficient, resource wise? Some cloud providers are also offering ARM so...

Anxious to check the GPU performance too!

To add: Control Center on macOS and some other UI improvements hint at a Mac w/ touchscreen?

Could we finally see a true BYOD, like Dex or using the improvements in Handoff?

secondcoming · 6 years ago
I'm not sure virtualising ARM on Intel platforms will ever be performant enough to be usable. They will probably have to ship an emulator, and even then there will be issues as it'll be very difficult to emulate the strictness of ARM CPUs on non-ARM architectures, for things like unaligned memory accesses and replicating the memory model.
geesejuggler · 6 years ago
> Languages like Go with supremely simple cross architecture support might get a boost here.

Go's crypto libraries are heavily optimized on x86, not so on ARM (see the Phoronix Graviton2 benches).
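That's easy to spot-check on any given machine with the standard library's own benchmarks, e.g.:

    go test -bench=. crypto/sha256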

We will have to wait and see how much of Apple's halo effect will contribute positively to optimized ARM code.

Sleaker · 6 years ago
Won't this move result in more software compatibility issues on the developer side, though? Like, why would you buy the update if developers don't want to move to the new platform, or decide it's too big of a change?
willtim · 6 years ago
You've missed possibly the biggest advantage of ditching Intel: a perfectly usable machine without jet-engine fans and a scalded lap. I don't use Macs, but I see this as good for the industry.
kovek · 6 years ago
I think MacBooks can run heavy ML processes, but why not run those on separate devices with specific hardware for that? I'm thinking any kind of job you'd want to run on the GPU.
thewileyone · 6 years ago
> Apple has a functional corporate culture that ships; adding complete control of the hardware stack in is going to make for better products, full stop.

One word, Catalina.

celloductor · 6 years ago
It's easy to redesign iPhone models every year; it's not as easy to increase chip performance every year. There's a lot of R&D involved. I think in the long run it'll be better, but Apple will have to devote more resources to it. You can't just wish for specs; the manufacturers actually get the hard job of trying to make them.
tdsamardzhiev · 6 years ago
Not a fan of Apple, but x86 is a mess and I'd love to see companies pushing for its replacement.
skohan · 6 years ago
What's the issue with x86? Genuinely asking
rock_artist · 6 years ago
> Apple has a functional corporate culture that ships;

Well, we've already seen things get delayed and even canceled (AirPower).

Cook's Apple is more about supply chain (where he excels), so if indeed the silicon design works, they would be able to 'ship'.

chooseaname · 6 years ago
> So, I’m predicting an MBP 13 - 16 range with an extra three hours of battery life+

I predict they'll have the same battery life. Any savings will be used to reduce the battery size.

abhorrence · 6 years ago
I think without Ive they'll take the battery life and market it as the incredible win of moving away from Intel.

Dead Comment

rkhassen9 · 6 years ago
Paid for by Apple?
looping__lui · 6 years ago
“Apple’s own pro apps will be updated to support the company’s new silicon in macOS Big Sur, and the company is hoping developers will update their apps. “The vast majority of developers can get their apps up and running in a matter of days,” claims Craig Federighi, Apple’s senior vice president of software engineering. [...]

Microsoft is working on Office updates for the new Mac silicon, and Word and Excel are already running natively on the new Mac processors, with PowerPoint even using Apple’s Metal tech for rendering. Apple has also been working with Adobe to get these pro apps up and running on these new chips.“

So the bottom line is: “your previous tools won’t work, will have to be rewritten, the burden is on the developers so we can rake in more cash”

Great, customer focused, and completely altruistic move back in the days when they killed Nvidia cards on high-performance rendering and simulation machines (and everywhere else).

So, Apple has performance libraries that are better than what Intel has to offer? So, cross-platform applications are now again “passé”?

I don’t use my iPhone for work. Why would I want iOS apps on my computer? So I can install Apple Mail instead of Outlook?

colejohnson66 · 6 years ago
> So the bottom line is: “your previous tools won’t work, will have to be rewritten, the burden is on the developers so we can rake in more cash”

Did you miss the part where there’s going to be a “Rosetta 2” AOT/JIT translation layer?

ksec · 6 years ago
Apple ported many pro apps to ARM, especially Logic Pro, and they were showcasing Photoshop and Maya on ARM. That is about as pro as it gets for the Mac.

That reads to me like Apple isn't going to keep Intel for some high-end pro machines; they intend to go all in with ARM, i.e., there will be a Mac Pro with a high-TDP ARM chip. I wonder how the owners of Mac Pros are feeling now, having just spent $5K+ on an Intel Mac Pro.

The questions are:

1. Are they going to design their own CPUs for the whole Mac range? Up to 10W for the MacBook, up to 45W for the MacBook Pro, ~150W for the iMac, ~250W for the Mac Pro? How is that financially feasible considering the volume of Mac Pros sold? Or do they intend to use those high-TDP chips in their server farms / iCloud?

2. What happens to the GPU? Will they have their own GPU for the iMac and Mac Pro as well? Dual-GPU options, with an Apple GPU for power efficiency? This feels like additional complexity.

3. Would it be like the PowerPC era, where you get a new iMac once you're finished with the development kit?

Finally, while I am excited for ARM Macs, at the same time I am also feeling a little sad. Goodbye, x86.

paulpan · 6 years ago
> What happens to the GPU? Will they have their own GPU for the iMac and Mac Pro as well? Dual-GPU options, with an Apple GPU for power efficiency? This feels like additional complexity.

I think GPU scaling will be much harder than CPU, so whereas Apple can surpass Intel CPUs for all but the highest segments, putting together a standalone GPU will be hard and very interesting to see. For an entry-level GPU? No issues. But what about a midrange part (AMD RX 5700 XT or Nvidia 2070S)? Not to mention the top-tier Nvidia 2080 Ti.

The other unspoken risk is that while Apple may be vertically integrating its SoC, it still relies on a fab like TSMC. Intel's recent problem is rooted in their inability to move off the legacy 14nm fabrication process. TSMC may have done great in the 7nm and now the 5nm transition, but what happens if/when they stumble? Would Apple also want to acquire them or build its own fabs to mitigate this risk?

liamness · 6 years ago
> The other unspoken risk is that while Apple may be vertically integrating its SoC, it still relies on a fab like TSMC. Intel's recent problem is rooted in their inability to move off the legacy 14nm fabrication process. TSMC may have done great in the 7nm and now the 5nm transition, but what happens if/when they stumble? Would Apple also want to acquire them or build its own fabs to mitigate this risk?

Surely this is an advantage to being fabless? If TSMC stumble, they can evaluate other options. Same for AMD, where would they be now if they were still tied to GlobalFoundries?

poyu · 6 years ago
I will tell you one thing for sure: it's impossible for Apple to acquire TSMC. TSMC has a lot of customers other than Apple. I think it's logical for Apple to come up with their own fab, but honestly that is incredibly hard. Maybe in 10 years, I would say.
GeekyBear · 6 years ago
>I think GPU scaling will be much harder than CPU

We only have mobile SOCs as a reference point so far, but Apple is doing very well on that metric.

>On the GPU side of things, Apple has also been hitting it out of the park; the last two GPU generations have brought tremendous efficiency upgrades which also allow for larger performance gains. I really had not expected Apple to make as large strides with the A13’s GPU this year, and the efficiency improvements really surprised me. The differences to Qualcomm’s Adreno architecture are now so big that even the newest Snapdragon 865 peak performance isn’t able to match Apple’s sustained performance figures. It’s no longer that Apple just leads in CPU, they are now also massively leading in GPU.

https://www.anandtech.com/show/15246/anandtech-year-in-revie...

notSupplied · 6 years ago
And what happens if China marches into Taiwan? This is a plausible consideration that must be one of Apple's worst nightmares.
voqv · 6 years ago
I heard from random sources that their GPUs are actually (relatively?) very powerful. Better sources/experience appreciated.

"Apple claims the GPU in the iPad Pro is equivalent to an Xbox One S, although how they came to thise conclusion is difficult to say since we know so little about the underpinnings of the GPU." [1]

[1] https://www.anandtech.com/show/13661/the-2018-apple-ipad-pro...

jwilliams · 6 years ago
TSMC is too intertwined with Taiwan and Taiwanese independence to make that feasible.

Building your own fab would be a gargantuan task.

A future JV with Intel seems more likely (despite it being very unlikely in absolute terms).

zymhan · 6 years ago
There's no reason why Apple can't use Nvidia or AMD GPUs with an ARM CPU.

Some examples here https://linustechtips.com/main/topic/917482-arm-and-pcie-lan...

foota · 6 years ago
I wonder if Apple had anything to do with TSMC's announcement of a fab in the US?
berkut · 6 years ago
It seems (it wasn't really that clear?) that Maya was running in emulation (as in, an x64 binary)? I don't think Maya's viewport on macOS actually runs on Metal (it's still OpenGL), so I doubt it was a native port.

Did it do any CPU-intensive stuff (skinning, deformation), or was it just GPU-intensive viewing?

High-end VFX will be interesting for this with Apple (Maya, Houdini, Nuke); there was already quite a lot of anger at OpenGL being deprecated and Vulkan not being officially supported. Another instruction set in the mix for highly optimised apps (lots of SIMD code) is going to be quite annoying, especially for the CPU renderers (Arnold, RenderMan, etc.)...

dindresto · 6 years ago
As Apple's own GPUs do not run full OpenGL, does this in turn mean they didn't only create an x86-to-ARM translation layer but also a full OpenGL implementation running on top of Metal? Similar to other projects implementing OpenGL on top of Vulkan? Or did they actually invest the time to implement OpenGL directly in their graphics drivers?

That seems a bit weird considering OpenGL has been deprecated in macOS already. I would have expected a full removal once the first ARM Macs ship.

tgv · 6 years ago
> Logic Pro

Logic isn't worth much without plugins, and I expect many smaller developers not to port to ARM, and there's nobody to fill the gap in the first years. If that is indeed the case, Apple will begin losing market share where it currently reigns. When there's no pro software, the mac will be just an iPad with a keyboard. It's quite a gamble.

But I've been too pessimistic before.

CharlesW · 6 years ago
> Logic isn't worth much without plugins…

Rosetta 2 cross-compiles Intel binaries to ARM. Why would VST/AU plug-in binaries be an exception?

jmmcd · 6 years ago
> Logic isn't worth much without plugins

A lot of people do say that, including a lot of professionals, but in my opinion Logic with its stock plugins is already absolutely great. Some people buy a lot of plugins because they don't know how to use the builtin plugins, and some because they enjoy playing with new stuff more than making music (and yes, some who know what they're doing too).

DrJokepu · 6 years ago
Professional users of Logic (but also Pro Tools) tend to be very conservative with their upgrades. People who use Logic for a living won’t be using this for years to come.
thirdsun · 6 years ago
> Logic isn't worth much without plugins, and I expect many smaller developers not to port to ARM

I don't think they can afford not to; Mac users make up a significant share of their target audience.

Furthermore, there's always the emulation option, although it remains to be seen how performant that would be in such a demanding, time-critical application as AU/VST instruments and effects.

mekster · 6 years ago
> When there's no pro software, the mac will be just an iPad with a keyboard.

Maybe in your segment, but I can't think of using Windows as a web developer, even without any of the "Pro" apps from Apple.

michelb · 6 years ago
>I wonder how the owners of Mac Pros are feeling now, having just spent $5K+ on an Intel Mac Pro.

Not much. If you bought that machine you didn't care about it much, as you bought older hardware at a premium price point. You bought it to use right now without fussing, and have accepted its obsolescence in 2-4 years, as is quite normal for studios.

If you bought it as an IT enthusiast, well... why would you even do that?

I don't see Apple coming out with an ARM Mac Pro within 3 years anyway. Why would they do that? No upside for them, that market has to be won back first. Slowly start with laptops and iMac, focusing on consumers, get the OS in shape and third-party vendors accustomed to the platform first.

iPad-like performance is already fast enough for almost all consumers, and I'm sure Apple doesn't want to break with Intel on everything right away.

mantap · 6 years ago
The Mac Pro was seen as an assurance that Apple was going to support their Pro customers for the foreseeable future. It's not just the Mac Pro itself. It's the software and supporting hardware.

Also crucially, due to Apple's spat with NVIDIA, the Mac Pro doesn't support CUDA. This means software has to be modified to use Metal Compute to support the Mac Pro.

If you make pro software, Apple just sent a huge signal that the future of the Mac Pro is at best uncertain. So maybe hold off on that Mac support for the next two years.

Apple is not going to make an ARM Xeon. The resulting computer would be so expensive, after you amortise all the R&D of creating a single workstation-class CPU, that nobody would be able to afford it. All the Pros who bought the Mac Pro got played hard.

01100011 · 6 years ago
AFAIK, they're going to make their own GPU. A recruiter from Apple reached out to me a few weeks ago trying to poach me from my job at a large GPU manufacturer.
gsnedders · 6 years ago
They already have their own, custom GPU on their A-Series SoCs, so them hiring people working on GPUs is hardly surprising? I'd expect them to replace where they use Intel's integrated parts with their own, which might push them a bit higher end than where they have been previously, but I doubt they're chasing after the dedicated side of things?
vonseel · 6 years ago
Re: the PPC era and whether this will be like that in the future - this is from a lowly web backend developer's perspective, so I might be naive, but hopefully we've learned to build generic solutions without sacrificing performance when it comes to things like this, so it doesn't devolve into a code-that-only-runs-on-X-hardware type of thing.

I think it's a more common problem to see things like the T2 security chip not being present on older or hackintosh (unsupported) hardware, so if you're not running the right hardware you can't take advantage of a feature (AR on certain iPhones) or you lose the performance advantage (FileVault encryption with the T2 chip vs. encryption by the CPU).
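
One pattern that keeps things generic, in the spirit the parent hopes for: branch on architecture at compile time, and probe for optional hardware features at runtime instead of hard-coding model checks. A minimal Swift sketch - sysctlbyname is the real API, but treat "hw.optional.arm64" (commonly used to detect Apple-silicon hardware) and the overall shape as illustrative assumptions:

```swift
import Foundation

// Compile-time: Swift's built-in arch() conditions pick a code path
// per architecture, so one codebase builds for Intel and ARM Macs.
#if arch(arm64)
let buildArch = "arm64"
#elseif arch(x86_64)
let buildArch = "x86_64"
#else
let buildArch = "unknown"
#endif

// Runtime: probe for a capability and fall back when it's absent,
// rather than assuming it from the model name.
func hasFeature(_ name: String) -> Bool {
    var value: Int32 = 0
    var size = MemoryLayout<Int32>.size
    // Unknown names fail the probe, so callers take the fallback path.
    return sysctlbyname(name, &value, &size, nil, 0) == 0 && value == 1
}

// "hw.optional.arm64" is assumed here as the Apple-silicon probe.
print(buildArch, hasFeature("hw.optional.arm64"))
```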

mkozlows · 6 years ago
1. They explicitly said yes, there would be a whole range of CPUs that they're making. This isn't just "throw the phone CPU on a desktop," they'll have laptop class and desktop class CPUs that are different.

2. They're saying integrated. Whether they also throw Radeons on high-end desktops is probably not yet determined. (They have a two year roadmap for this rollout, and I think it's pretty obvious they'll start with the portables/iMacs where the benefits of ARM are more likely to outweigh the pain of change for more people.)

vessenes · 6 years ago
Inferring from the video announce plus guessing:

Will they have a wide TDP range of designs? Yes. On feasibility, there are two major components to the cost - the first is the engineering cost of design. You can bet these costs are not massive; Apple has roughly $200 billion cash on hand right now, and most of the work will be absorbed into their overall design process.

The second cost component is the non-recurring engineering (NRE) cost of getting a mask set made for the chips. A 7nm start right now costs maybe in the range of $20mm? If the Mac Pro is required to stand on its own feet financially, perhaps that will push out timing / push up cost. On the other hand, if one considers the cost of the top-end chip to be a marketing cost, it's literally a rounding error.

GPU - Apple is going to have their own GPUs, and no longer worry about AMD and Nvidia. They’ll have complete control of the hardware and software stack and control their own destiny. I would be very surprised to see dual GPU options.

ianhowson · 6 years ago
> wide TDP range of designs

Keep in mind that this doesn't imply a different chip design. TDP is primarily a function of intended application and cooling capacity, not chip tech.

That said, they've already demonstrated the ability to do this with the A12/A12X/A12Z, which are the same design at different CPU/GPU core counts. Clock rates are unaffected except where thermal limits (TDP) come into play.

greatpatton · 6 years ago
$200 billion may seem huge, but at the same time they have $118 billion of debt, since they use debt to pay dividends without moving their cash.
013a · 6 years ago
They said "we expect the transition to take two years [...] we've still got some Intel Macs to show you soon [sic] [roughly what they said]" during the keynote.

This isn't just going to replace low-end machines; it's every machine they sell, within two years. Probably starting with the low end, but moving up.

nicoburns · 6 years ago
I'd assume that they'll still have AMD/NVidia GPUs as options if they truly plan to bring these CPUs to the Mac Pro market. There's no way that e.g. Animation studios will accept anything less than absolutely top-end performance. And upgradability too. I think Apple must know this.
liamness · 6 years ago
Here's some mad lad adding PCIe slots to a Raspberry Pi 4:

https://twitter.com/domipheus/status/1167566293861588992

It's definitely possible, you just need to expose the PCIe lanes in a sensible way (this has been rare to see on ARM-based machines so far) and have PCIe device manufacturers distribute drivers for ARM macOS.

dr1337 · 6 years ago
Apple could go down the Samsung route and license RDNA tech for their ARM GPUs - https://www.tomshardware.com/news/amd-rdna-exynos-samsung-so...
lhoff · 6 years ago
> Apple ported many Pro Apps to ARM , especially their Logic Pro, Photoshop and they were showcasing Maya on ARM. That is about as Pro as it gets for Mac. > That reads to me Apple isn't going to have Intel for some high end Pro machine.

That conclusion seems a bit too far-fetched from my point of view. There are many users who run Logic on an MBP, and with Photoshop it's even more common to use it on a laptop.

Sooner or later an ARM Mac Pro is coming, but if I had to guess, I'd say that will take a while. There were 6 years between the trashcan and the current cheese grater. So maybe 2025 then ;)

ancientknight · 6 years ago
> 2. What happens to GPU? Having their own GPU for iMac and Mac Pro as well? Dual GPU options where Apple GPU for power efficiency? This feels like additional complexity.

Nintendo Switch uses an Nvidia GPU with a very slow and outdated ARM processor. Yet this allows it to run Rocket League, Witcher 3, etc.

Apple might go with a discrete GPU from Nvidia/AMD to pair with their ARM processors.

lrem · 6 years ago
Re 1: I believe iCloud is hosted on Google Cloud, isn't it?
luhn · 6 years ago
Apple does use Google Cloud and AWS for some things, but is investing billions into building their own data centers.

https://www.datacenterknowledge.com/apple/apple-spend-10b-us...

dhosek · 6 years ago
Some services are in Google Cloud, some in AWS, some in Azure, plus Apple has eleven data centers of its own. I believe that Apple is working on bringing the pieces that have been offloaded to AWS/Azure/Google back in house.
josemanuel · 6 years ago
Maybe they'll eventually use some ARM server chip on the high end.
9OzUIvDwF4LvfUM · 6 years ago
Maya was running using Rosetta 2 translation rather than being ported.
DCKing · 6 years ago
I gotta say: really slick presentation.

1) Before the demo even starts: all Apple apps, pro or casual? Already running. Microsoft Office? Already running. Adobe Creative Cloud? Already running. That's the vast majority of the Mac userbase right there.

2. No apparent hard cuts on the legacy. I was expecting them not to support x86 backwards compatibility if they could get away with it, but apparently they're committed. Even naming the technologies "Universal Binaries 2" and "Rosetta 2" is a confident, been-there-done-that-will-do-it-again presentation. Unlike last time around, there also doesn't seem to be a major removal of macOS APIs? (See the universal-binary sketch after this comment.)

3) Acknowledging what kind of x86 stuff machines are used for by showing VMs right away, and (trying to?) show Docker right away. Is that the first Linux demo in an Apple keynote presentation? It was a Linux desktop environment, even.

Now it seems this ARM announcement was a bit rushed by design, flashing by the features without allowing a substantial look. So it's likely we're going to be disappointed by x86 performance and have to say goodbye to some APIs (this is for sure the end of OpenGL, right? edit: no [1]), but they do leave an impression of having their priorities of broad software support straight, and of delivering as seamless a transition as you can get.

[1]: https://developer.apple.com/documentation/xcode/porting_your...
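
On the Universal Binaries point: an app can ask Foundation which architecture slices its own executable carries. A small sketch - Bundle.executableArchitectures and the x86_64 constant are long-standing Foundation API, while the arm64 value (Mach-O CPU type 0x0100000c) is an assumption here, since no named constant for it was public at announcement time:

```swift
import Foundation

// Map Mach-O CPU types to readable names. The x86_64 constant ships
// in Foundation; 0x0100000c (CPU_TYPE_ARM64) is assumed.
let names: [Int: String] = [
    NSBundleExecutableArchitectureX86_64: "x86_64",
    0x0100000c: "arm64",
]

// executableArchitectures reflects the slices in the binary on disk.
let slices = Bundle.main.executableArchitectures?
    .map { names[$0.intValue] ?? "unknown(\($0))" } ?? []
print("Architectures in this binary:", slices)
```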

apexalpha · 6 years ago
>Acknowledging what kind of x86 stuff machines are used for by showing VMs right away

I do believe they showed a Linux ARM VM, not an x86 VM.

x86 VMs are probably going to pay a massive QEMU performance tax. The positive news is the boost this will give the Linux ARM space.

Games on Mac? Games on Mac with Windows Bootcamp? Yeah... Maybe buy a console or a second PC...

DCKing · 6 years ago
Your comment made me rewatch that section of the Keynote [1], and I believe you're right. Docker and Parallels were shown in a 'virtualization' subsection, and not the Rosetta subsection. So that must have been ARM64 Debian we saw there indeed. Did I say that their presentation was too fast :) ?

That's going to be interesting in the end. Being able to build/smoke-test x86 containers on macOS will be important, at least for a while. So it's up in the air whether that will be addressed, although it's worth noting that Docker already supports cross-building images [2].

[1]: https://www.youtube.com/watch?v=GEZhD3J89ZE&t=5918

[2]: https://docs.docker.com/buildx/working-with-buildx/

Another interesting note is that the user agent in their Apache logs still says "Intel Mac OS X". Wonder if they'll keep that.

marcthe12 · 6 years ago
I am frankly surprised they did not ditch OpenGL and Carbon.
Cu3PO42 · 6 years ago
I'm surprised we didn't get any performance numbers. Either raw power or at least power efficiency and projected battery life improvements. Seeing as this is a major reason for the transition (according to them), it feels very weird.

They're shipping a 'Development Transition Kit' Mac mini with an A12Z this week, so it's not like the numbers are going to stay private for a long time. Even if there's an NDA, someone's bound to break it.

m12k · 6 years ago
There's no indication that the A12Z will be the chip that ships to consumers at the end of the year. So honestly it'd be a bit out of character to boast about the specific performance numbers of a pre-release dev kit chip - especially when that chip has already had Geekbench run on it for a while: https://browser.geekbench.com/ios_devices/ipad-pro-12-9-inch...
wtallis · 6 years ago
Last time around, the dev kits had Pentium 4 processors but the Intel Macs that launched used Core and Core 2 processors— a totally different microarchitecture with drastically different performance and power characteristics. It's a pretty safe bet that the first ARM Macs will be using SoCs that are at least a generational improvement over the A12Z. The higher-power Macs that will probably be released toward the end of the transition will likely use chips that are more drastically different from what's in an iPad Pro.
newacct583 · 6 years ago
> So honestly it'd be a bit out of character

This is Apple we're talking about. They boast about unverifiable performance claims in pretty much every product they announce...

joakleaf · 6 years ago
It was pretty much the same with PowerPC to Intel...

Steve Jobs demoed OSX first. Then he surprised everybody by saying that OSX had lived a double life in a secret building for many years (with a photo), and that he had been running on an Intel Pentium 4 during the demo all morning. Nothing about performance.

There was also a developer system back then; an Intel Pentium 4 in a PowerMac case.

In a lot of ways, this is far more ambitious, and could mean a lot more for Apple long term, but...

... The one thing that hit me the most was how impressed I was with Apple back then, and how excited I was that a company could do this. Steve Jobs presented it really well, but this time it felt quite flat.

... I really wish they worked a bit on their showmanship. They rushed through so many small things, and the presentation felt unnatural. Like they all over-rehearsed it, but are still reading while presenting (you could even see the eye movements). It is just too smooth, too generic, and a bit too polished.

Please slow down, focus on only the most interesting bits, and give us time to digest it...

jedberg · 6 years ago
I think some of that stems from the fact that Steve Jobs was saying his own words, but everyone else is saying marketing's words. The most marketing could do to Steve was tell him he was using a competitor's brand name incorrectly. Everything else was his. So he could speak passionately in his own words.

Everyone else is acting, but they aren't actors.

cestith · 6 years ago
That wasn't their first processor change, either. It wasn't even the first in the Mac line.

The Apple I and II were MOS 6502 machines, except for the Apple IIGS, which was a 65C816. Then the early Macs were 680x0 machines. Then PowerPC. Then Intel.

They looked at Intel chips for the iPhone and settled on Arm before launch. I wouldn't be surprised if some very brittle, early development version of iOS was running on an Intel mobile platform at some point.

Yhippa · 6 years ago
Your "buts" are dead-on. Everything felt so distant and unauthentic. They should require their execs and presenters to not read from somewhere else and do it live.
kps · 6 years ago
It wasn't a surprise to anyone paying attention; NeXTSTEP had run on 68K/x86/SPARC/PA-RISC. Removing architecture support would have been remarkable.

What's important, for those paying attention, is that Apple promoted PowerPC emulation with the first x86 Macs in OS X 10.4 and then removed it after 10.6. If you think Apple won't screw you again, well, go ahead, it's your money.

[Dis]claimer: I have no long or short in AAPL. Anyone posting or voting in this thread should similarly disclose.

bredren · 6 years ago
It wasn't just lack of performance numbers, there were no actual products announced. They would have had to tip their hand on a lot of info that is not helpful to customers or their ability to keep selling Intel stuff.

One big question, though, will be how this devkit benchmarks against the current maxed-out Intel Mac mini. I'm curious if GPU performance beats the current Blackmagic eGPU (RX 580).

cbmuser · 6 years ago
> there were no actual products announced.

Correct me if I'm wrong, but I think they also didn't announce any actual hardware when they introduced Intel Macs.

Interestingly, for the PowerPC->Intel transition, they also had a Developer Transition Kit: https://www.macstories.net/stories/this-is-not-a-product-the...

Cu3PO42 · 6 years ago
Not divulging their hand may be a thing. But they could at least have said something (rehashed) about the A12Z: "it performs better than the CPUs currently shipping in the Mac mini by X% in Y benchmark".

I'm not intrinsically excited for a new Apple product, but if they could have told me "we can deliver 50% extra battery life in your new MacBook at comparable performance", that would build up some hype and maybe mindshare.

> not helpful to [...] their ability to keep selling Intel stuff.

I hope that that's it. If we're going through the pains of a platform transition, I'd like to get something out of it.

CoolGuySteve · 6 years ago
I think the lack of hardware and lack of benchmarks are related. Apple doesn't know yet what the thermal throttle will be on an A12Z MacBook until they start testing the cooling system.
dfox · 6 years ago
There is no reason why the RX 580 would not be supported on ARM, or why there would be any meaningful performance delta. AMD does not have any kind of "secret-sauce" driver for that; it is simply LLVM targeted at that architecture, converting HLSL/GLSL/SPIR-V into architecture-specific code.
hn_check · 6 years ago
It's an integrated GPU so it isn't going to compete against serious dedicated GPUs, and no one should expect that. I imagine much like existing Apple devices (and Windows laptops) with dedicated GPUs it will switch as necessary. But at least the integrated GPU will be better.
sudosysgen · 6 years ago
I can assure you that performance will not come anywhere close to an RX580, much less to a more modern Radeon 5700.
moduspol · 6 years ago
I don't think they even used the term "ARM" at any point. They're calling it "Apple's silicon," and they acknowledged it's the same as what the iPhone and iPad use. But I thought it was interesting how they seemed to avoid the term. It's probably just a matter of avoiding getting too "techy", plus marketing.
snazz · 6 years ago
The first guy in the "lab" scene (Johny maybe?) mentioned plenty of other "techy" terms. I think that they want to distance themselves from other ARM manufacturers and put the focus on Apple's advantages over Qualcomm and others.
bgorman · 6 years ago
ARM doesn't really matter very much to Apple - Apple designs the micro-architecture and many (most?) of the other SoC components themselves.

With the technology moves Apple has made, they could probably switch to RISC-V at this point. However, being able to use ARM devtools probably adds more value to Apple than any cost savings they would gain from moving away from ARM.

ogre_codes · 6 years ago
I don't know that Apple ever really talks about the fact that their chips are based on ARM during this kind of event.
enos_feedler · 6 years ago
Maybe that leaves them free to switch the internal instruction set to RISC-V down the road without a naming issue.
fomine3 · 6 years ago
But this is WWDC...
empressplay · 6 years ago
It's because the A stands for Acorn and old rivalries die hard.
theshrike79 · 6 years ago
The A12Z is already faster than every tablet and phone in existence. Depending on the benchmark, it smokes about 80% of off-the-shelf PC hardware.

They really don't need to do that much.

fomine3 · 6 years ago
I believe Apple can make processors for laptops and normal desktops the A-series way, but I'm curious how they'll do it for the Mac Pro. Adopting a chiplet-style architecture?
johnbellone · 6 years ago
I am interested in seeing real-world performance. I agree that there isn't a lot Apple needs to do here; the most curious bit will be how much they're able to automate behind the scenes with Rosetta to help out the development community. For most of my workload I'm sure it'll be completely transparent. The only bits likely to be less performant will be testing in virtual machines for x86, but it isn't like I care too much about that performance. I'd take the 3+ hours of battery.
wiremine · 6 years ago
> I'm surprised we didn't get any performance numbers.

It makes sense that they wouldn't:

1. It's dev hardware, not final production hardware. They'll want the A13 or A14 (or whatever SoC they finally ship) for benchmarks.

2. Until the apps are optimized, it doesn't make sense for Apple to put itself in the position of bad press with premature figures.

r00fus · 6 years ago
3. Osborne effect, avoided by not crowing about specs - those who are cautious or dependent on x86 will continue buying Intel kit. Once the new product is available and it kicks the daylights out of the legacy hardware, they will already have their Pareto split of interest in the new offerings.
Xixi · 6 years ago
The A12Z is already shipping in the latest iPad Pro, so it's not like its performance is some unknown quantity: benchmarks are aplenty... Although I guess this dev kit could run at a different frequency, and have more/less memory bandwidth. Performance should still roughly be that of the iPad Pro.

The A12Z is itself only a small update on the A12X from 2018, so it's basically two years behind whatever will ship in actual ARM Macs this fall...

fomine3 · 6 years ago
I expect the DTK to perform better than the iPad Pro because it can supply more power and has more thermal budget (it may even have an active cooling fan).
floatboth · 6 years ago
The devkit has 16GB of RAM; no way it's not at least dual-channel memory.
pwthornton · 6 years ago
Apple will almost assuredly have a dedicated event later this year where they will announce new hardware.

Apple has development kits running in modified Apple TVs. This is a chip that has essentially been out for a few years in iPad Pros. Why would Apple announce numbers based on this? It also assumes Apple will ship future laptops without fans or ports, which is how the development kit is coming out.

Apple will most likely have an A14X out later this year in at least one laptop. That's going to be significantly newer and more advanced than the A12Z in development kits.

TetOn · 6 years ago
The developer kit is in a Mac Mini. It has a full complement of the usual Mac Mini ports and, unless they've made major internal changes, a fan.
itchynosedev · 6 years ago
From what I understood, the transition is of a strategic nature: less dependency on third-party suppliers for vital components.

The performance and/or battery gains are almost incidental.

greatpatton · 6 years ago
No, it's not less dependency on third parties, it's more dependency on TSMC. Before, they were able to play Intel against TSMC; that's not the case anymore. Then you add the geopolitical issue of Taiwan vs. China, and the risk level keeps increasing.
gjsman-1000 · 6 years ago
I found the cost of the Transition Kit in the press release: $500. Not bad at all.
macintux · 6 years ago
Probably a rental, based on the last transition.
ogre_codes · 6 years ago
> I'm surprised we didn't get any performance numbers.

It's the CPU in the iPad Pro; performance numbers are out there in the wild. The only big change is the RAM. This isn't a retail product, it's a developer kit. When they release retail Macs I'm sure there will be some performance numbers.

minhazm · 6 years ago
There's still a huge difference in TDP. The iPad probably has a TDP of maybe 10 watts? The Intel CPUs in the Mac mini have 65-watt TDPs. They can deliver more power and cooling to the A12Z than in an iPad, and it should result in much higher performance.
fabiospampinato · 6 years ago
They said they are planning on making a family of SoCs for the Mac though; I doubt we are going to see iPad-level processors in the actual ARM-based Macs they will sell.
pier25 · 6 years ago
They said the first ARM Macs are coming at the end of this year, so even if there's an NDA it won't be long until we see the real numbers.
marricks · 6 years ago
I wonder if they're holding them for the actual hardware release in Fall? They could still be deciding the tradeoff between battery life and raw power.
goalieca · 6 years ago
Well, it appeared to be a developer version of the chip and not the final customer copy.
ianai · 6 years ago
Possibly. The developer kit is using an A12Z chip. It's possibly more a proof of concept than a tech demo of what's possible.
taylodl · 6 years ago
It's not a product release - it's an announcement of direction for the Mac product line and the macOS platform. Once they have hardware with ARM processors available for purchase, they'll speak to the processor specs and how much better they are at power management.
arrrg · 6 years ago
They had to announce this early to allow developers to get ready. If they could have gotten away with not announcing early, they would have. Obviously (if all apps could automatically run natively on ARM without any developer involvement) they would have first announced this with an actual new Mac.

That, however, was not an option. So they have to tread carefully in what they say and they also have to be a bit careful about showing off too much.

They only had to tout the benefits of ARM insofar as to placate the fears of consumers (their Rosetta story plus virtualization story helped there) and to provide some reasonable justification to actually make devs at least a bit excited, even though they have to do additional work.

Plus: No ARM Mac (except the transition kit) currently exists. It's not even clear if the first Mac they will announce is even finished yet, if only internally. And even if it is finished: do you think going on stage now and talking about a new MacBook Air that has twice the performance and 50% more battery life than the current MacBook Air – oh, and you can get one in December – would be a good idea?

This is Apple's tightrope walk to avoid too much of an Osborne effect. I think they are OK with some Osborne effect (if only because they know that even if no one buys an Intel Mac ever again during the two-year transition, they will not go bankrupt, far from it), but you don't have to provoke one, right?

I expect plenty of numbers and comparisons when they introduce the actual first ARM Mac.

hn_check · 6 years ago
It doesn't seem weird at all, and the fact that they're sending out devices with the A12Z seems to be intentional sandbagging: they know people are going to benchmark them, and the results will likely be merely comparable to current hardware (from a performance perspective... energy efficiency will clearly be much better). When they release the actual devices, with power and thermal profiles dramatically higher than an iPad Pro's, it will actually wow.
ianai · 6 years ago
That's my takeaway too. But imagine the impact if the speed is as good or faster? They're shipping with 16 GB of RAM too, so it's at least not the typical 8 GB minimum.
bredren · 6 years ago
I would also expect this hardware to be nerfed compared to what actually goes out to customers. That way, whatever is achieved on this kit, the real machines will be better for real users.
totalZero · 6 years ago
Those numbers would be meaningless without knowing what the actual geometry of the internals will be, because cooling is a major limiting factor for laptop processor performance.
rock_artist · 6 years ago
Don't expect performance. The Intel DTK was Prescott-based (while AMD had great dual cores and Intel was lagging). Then Intel released its Core series, which started from mobile and had great performance.

I guess they did some homework before ditching Intel. The big question is whether they have enough headroom to manufacture reliable chips with sustained high power.

stefan_ · 6 years ago
Have they ever given performance numbers that weren't like "up to 400% faster" in selected tasks?
nojito · 6 years ago
A12Z isn't a consumer chip for macOS. That's why you aren't going to get benchmarks from Apple.

They are also being very, very coy about what they have up their sleeves, because of how small an upgrade the A13 was over the A12.

The same thing happened during the Intel transition. The first consumer chips were dual cores, but the DTK used Pentium 4s.

rahkiin · 6 years ago
If the A12Z is based on the A12, they will never make consumer products with it but will instead use a new A14.
madeofpalk · 6 years ago
> I'm surprised we didn't get any performance numbers

That's not this presentation - when they actually ship hardware (transition kit not included) - then they'll talk performance.

e40 · 6 years ago
How do I get into the DTK program? We produce a compiler for macOS and obviously will need access to this.

I looked all over developer.apple.com and didn't find anything.

stephc_int13 · 6 years ago
They are probably still working on ARM specific optimizations and benchmarks, and they prefer to wait to show their results, to maximize the impact.
hendersoon · 6 years ago
We pretty much know it'll perform fine in native programs; the only real question is how well the x86 translation layer performs.
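
For what it's worth, Apple's porting notes describe a sysctl, sysctl.proc_translated, that lets a process ask whether it is currently running under translation. A minimal Swift sketch:

```swift
import Foundation

// true under Rosetta 2 translation, false when native, nil where the
// sysctl doesn't exist (e.g. systems without the translator).
func isTranslated() -> Bool? {
    var flag: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("sysctl.proc_translated", &flag, &size, nil, 0) == 0 else {
        return nil
    }
    return flag == 1
}

if let translated = isTranslated() {
    print(translated ? "Running under Rosetta 2" : "Running natively")
} else {
    print("Translation sysctl unavailable")
}
```

Tooling that behaves differently under emulation (profilers, benchmarks, installers picking the right binary) could plausibly use a check like this.
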
modmans2nd · 6 years ago
Why show performance numbers on a machine that will not be going to production?
modzu · 6 years ago
maybe it's about profits not performance?
oliveshell · 6 years ago
Why not both?
tpmx · 6 years ago
Ding-ding-ding. We have a winner.
jfkebwjsbx · 6 years ago
The major reason for the transition is higher margins, plain and simple.

For customers - both average users and developers - it will be a pain with little to be gained.

objclxt · 6 years ago
Unlikely - being able to have full control of your roadmap is a huge strategic advantage. Profits and revenue are nice, but if Apple were interested only in those, they could dual-source x86 from AMD and drive costs down.

You don’t think companies like Oculus are envious of Apple’s flexibility from not having to rely on Qualcomm for their mobile SoCs? It’s not just about profit margin.

JohnBooty · 6 years ago
No doubt profit margins are a big factor in their thinking.

However, performance/watt matters too.

lowmemcpu · 6 years ago
> I'm surprised we didn't get any performance numbers.

The fact that all the demos were on their top-of-the-line, most expensive machines felt very weird to me. "Look at this amazing performance" would be great if the demo were on a MacBook Air.

hn_check · 6 years ago
It was a Pro monitor, but they explicitly said that the demos actually ran on the Developer Transition Kit powered by an A12Z...
QuixoticQuibit · 6 years ago
I really wonder what this says about the x86 platform going forward.

Mobile completely passed it by.

I’ve been seeing more hype about ARM servers for a while with AWS Graviton instances, the new #1 supercomputer in the TOP500, etc.

And of course today we see that Apple plans to transition their Macs to their own ARM chips. Even Microsoft made an ARM-based Windows/Surface product, but it didn't seem to amount to much. I wonder if they'll want to take another stab at it, seeing Apple's direction with strong vertical integration.

While I don’t think x86 as a platform is going away anytime soon, I feel like its market share and by extension its relevance will slowly dwindle over the next decade or two. Interesting times.

klelatti · 6 years ago
I too don't see x86 disappearing soon but it feels like the world has changed and that change is not positive for Intel.

We've been used to x86 dominance on the desktop and in servers for so long that it's hard to imagine a future with two architectures at critical mass, one of which can be licensed by a number of firms.

There will be some short term effort and pain but it must surely be a better competitive environment than we have now.

The historic / current Intel and AMD duopoly has surely not been healthy.

ianai · 6 years ago
Intel has either squandered the tech and we will see drastic improvements soon, or they'll fall closer to the dustbin of history. I suspect the latter, since they've been stuck at various points of their specifications for a decade plus. Or they do something novel for once. I wouldn't entirely count them out of a breakthrough into new architectures/fab processes.
abvdasker · 6 years ago
It's astonishing to me how far Intel has fallen in so short a time. I feel like it was only a couple years ago that Intel was understood universally to be the "heavyweight champion of the world" so to speak.

They completely missed the boat on mobile and AMD has leapfrogged them very recently on desktop. On top of that there were the Spectre vulnerabilities which shook confidence even further. This announcement is another huge blow given the extent to which the entire consumer electronics industry tends to follow Apple. I would be interested to hear an insider's perspective on such a rapid decline.

gwd · 6 years ago
Well they have spent the last 4 years trying to fix wave after wave of side-channel vulnerabilities. A handful of those affected ARM and AMD as well, but a very large number of them were Intel-only, and were a direct side effect of Intel cutting corners on safety to get advances in their performance numbers.
QuixoticQuibit · 6 years ago
My comment was meant to be more general than Intel. I'm even including AMD here. x86 has no presence in mobile (that ship has sailed), and again I'm seeing quite a lot of innovation on the ARM side wrt server-class processors. Not sure how traditional PCs will play out, but Macs are going to split off from x86 within years.
kevin_thibedeau · 6 years ago
They couldn't even execute on Edison. Intel will be gone in 20 years. They couldn't manage their way out of a bag.
muro · 6 years ago
They still make more money than ever. It's mostly data-center driven, and many companies are probably looking at alternatives.
radiator · 6 years ago
It was about time. The architecture has so much technical baggage. I don't even think it was intended to last that long.
jasonhansel · 6 years ago
x86 is only dead because CISC is dead. Now we just have to wait for Intel to go "full IBM" and try to ban people from writing emulators...
habitue · 6 years ago
I don't think they can ban emulators without just straight up banning competitors like AMD that make compatible chips. (which presumably they would have done if they could)
yyyk · 6 years ago
I'd suggest avoiding either hyping or dismissing the new processors. We don't have performance numbers for a real model, and until we do, we know very little.

The thing we can talk about is Apple's strategic direction. The good version has Apple releasing a notably superior general purpose computer, maybe even gaining more marketshare in the process. The bad version has the Mac turning into an iOS development station. The fact they showed a game of all things does give some encouragement.

Key questions:

1. How open the new OS/models will be, and how much developer support we get. The more open, the more likely it will be a powerful general-purpose computer.

2. Whether Apple can keep riding the tiger regarding processors. I'm sure they did their due diligence, and the new processors will be powerful enough. But x86 isn't dead yet, AMD is capable and even Intel isn't dead - they still have a hand to play.

If Apple can keep at this, developers will flock in and we'll see nice stuff. If x86 (re)gains its momentum, Apple will be left behind, but they will be unlikely to switch back (unifying processors with the iPhone has a lot of advantages for Apple), and we end up with the bad future.

Apple took a chance today; we'll see whether it pays off.

copperx · 6 years ago
Wasn't it Linus who said that the reason ARM hasn't taken over the server market is that there are no ARM chips in developer machines?

I'm curious if Apple's move is going to finally help ARM succeed on the server side.

diroussel · 6 years ago
Yes I'm interested in the impact of an ARM laptop, ARM VMs/Docker, and AWS Graviton2. Quite an interesting toolset.
valuearb · 6 years ago
Docker and Linux virtualization. Fat binaries to make it easy to update Mac apps. iPhone and iPad apps. Rosetta to run Intel apps that aren’t updated.

Apple is gonna knock this out of the park.

hinkley · 6 years ago
There's gonna be a weird moment soon where developing Docker images for Armbian hardware (Pi and friends) is going to be more straightforward on a Mac than developing Docker images for Intel servers.
gwd · 6 years ago
And that may be the moment that ARM servers start picking up steam.
originalvichy · 6 years ago
I wonder what this will do to Electron. If iOS apps are really 1:1 on macOS, then the need to maintain an Electron app will probably diminish. As long as they both support the same OS APIs, I can see devs who are willing to learn a new language (Swift) ditching Electron.
jedieaston · 6 years ago
Apple had a list at the State of the Union of open source projects they had submitted pull requests to for ARM support. Electron was up there, as were Python 3, OpenJDK and Go, notably.
roneythomas6 · 6 years ago
Electron already has support for ARM64, but there are no official releases yet, and it needs to be built from an x86 machine - no native compilation on ARM64 yet. I think with Apple moving to ARM, Google will add native ARM64 compilation for Chromium, which will in turn be picked up by Electron. Chromium has been running on ARM for a long time on Android and Chrome OS, so it has all the optimizations.
drawfloat · 6 years ago
But if you're developing with Electron, the purpose is typically cross-platform desktop support. This won't change that?
tootie · 6 years ago
Mac and iOS aren't the only two platforms.
ghettoimp · 6 years ago
Hrmn. Wouldn't this be leaving out Android and Windows? It might be practical for some apps, but that's an awful lot of users.
paxys · 6 years ago
Is there anyone building on Electron and only targeting the Mac desktop environment? Windows is still the king there.
SahAssar · 6 years ago
The processor architecture is not that relevant if you are working at the abstraction level that Electron or SwiftUI or similar provides.
LeoPanthera · 6 years ago
Isn't this already the case?

iOS apps can be built, for Intel, using Catalyst, with very trivial changes in code, but we still see Electron hanging around today like a bad smell.

I can't imagine this will change anything.

tomaskafka · 6 years ago
I am so much looking forward to using native Slack and Teams instead of their horrendous Electron apps, which don't even use GPU acceleration on iGPU MacBooks!

minxomat · 6 years ago
It already has. I'll definitely try to get my hands on the A12Z Mac mini. The GPU performance should be vastly better than existing Mac mini options.
deeblering4 · 6 years ago
Hopefully macOS support for x86 won't be just 3-5 years from whenever the new ARM models come out.

I have invested quite a bit into the Mac/Apple ecosystem. A big part of that reason was the longevity of the hardware, along with good resale values.

I hope they do right by their existing Mac customers. As of right now I don't have a strong reason to switch away.

I also hope that Apple does not blow this transition from a quality perspective. Their design choices and attention to detail have left quite a few things to be desired in the past few generations of hardware.

chipotle_coyote · 6 years ago
My suspicion is that new macOS releases will continue to support Intel-based Macs for at least three years after the last Intel-based Mac ships. New OS X releases only supported PowerPC for about two years after the last PowerPC Mac shipped, which is shorter than I'd expected, but Apple under Tim Cook has been a little less aggressive about ending support for old hardware, despite what people often seem to think. macOS Big Sur supports hardware going back to 2013, and iOS 14 supports hardware back to 2015's iPhone 6s. (And iOS versions keep getting updates for a year after their replacements ship, while macOS versions tend to be updated for two.)

As for the quality, reply hazy, ask again later? The whole butterfly-keyboard era of laptops turned out to be a fiasco, and Apple's long-held tendency to push their industrial design right to the edge of thermal and material tolerances got kind of crazy-making in the last few years. Yet so far, I'm really liking my MacBook Air 2020, and the only thing I'd absolutely change about it if I were given a magic wand would be to add a third USB-C port on the right-hand side. I appreciated much of Jony Ive's design work, but I'm hopeful that with him gone, the drive to prioritize minimalism over functionality will be at least toned down.

nicoburns · 6 years ago
> My suspicion is that new macOS releases will continue to support Intel-based Macs for at least three years after the last Intel-based Mac ships. New OS X releases only supported PowerPC for about two years after the last PowerPC Mac shipped, which is shorter than I'd expected

I agree with this. There's a big difference between Intel -> ARM and PPC -> Intel too: With PPC -> Intel, they were moving from their own special architecture towards the mainstream architecture that everybody else was using. In this case they're leaving an architecture that is likely to remain highly relevant outside of the Apple bubble for a long time to come.

e40 · 6 years ago
I can't imagine how people who dropped serious cash for the 7,1 pro machines feel. I've used my 6,1 for 7 years, and I will use it until it is no longer receiving updates. So hopefully 10 years.
unilynx · 6 years ago
> I have invested quite a bit into the Mac/Apple ecosystem. A big part of that reason was the longevity of the hardware, along with good resale values.

Well, given that demand for x86 probably won't go away in 3-5 years and that Apple generally makes it impossible to downgrade OSX on new machines, dropping x86 support might actually increase the resale value of existing hardware.

hollandheese · 6 years ago
Hopefully, but PowerPC support lasted just 4 years.