I'm guessing they don't want to maintain, build, and test x86_64 versions of all the macOS libraries like AppKit and UIKit (including large changes like Liquid Glass) when they are no longer shipping x86_64 macOS versions. Which is not entirely unreasonable, as I'm sure it takes a lot of effort to keep the whole UI library stack working properly on multiple archs.
Perhaps that's what they're hinting about with the note about a "subset of Rosetta". So maybe there is hope that the core x86_64 binary translator will stick around for things like VM and emulation of generic (linux? wine?) binaries, but they don't want to maintain a whole x86_64 macOS userspace going forward.
The space savings from not shipping fat binaries for everything probably won't be insignificant either. Or it makes room for a new fat binary for a future "arm64v2" :)
Apple always phases out these kinds of technologies after some time to keep the ecosystem tidy and give a last push to developers to abandon legacy code.
In this iteration, it might also allow some simplification of the silicon, since the Mx chips have some black magic to mimic x86 (mostly the total-store-ordering memory model, IIRC) that lets Rosetta work that fast. IOW, Rosetta 2 is not software-only magic this time.
I remember using the first Rosetta to play StarCraft on my Intel Mac. It also got deprecated after a year or two.
So leaving things behind despite some pains is Apple's way to push people forward (e.g.: Optical media, ports, Rosetta 1, Adobe Flash, etc.).
> ...they don't want to maintain and build and test x86_64 versions...
This feels wrong. Apple sold Intel-based Macs until early June 2023. The last one was the 2019 Mac Pro model.
Ending support for Rosetta in macOS around 2028 also means ending support for any x86_64 versions of software. This means that those unfortunate users who bought an Intel Mac Pro in 2023 only got five years of active usability.
Just because the latest OS isn't able to be installed on older hardware does not mean the hardware is no longer usable. I know people to this day who still run the last 2012 cheese grater Mac Pros with Snow Leopard as daily work machines. They still use Final Cut 7 on them to capture content from tapes. At this point, they are very fancy dedicated video recorders, but they still run and are money-making devices.
Rosetta is the technology that allows Apple Silicon hardware to execute Intel software. When they introduced Apple Silicon with the M1 processor, not many binaries existed for Apple Silicon, so Rosetta 2 was a bridge for that problem.
They used a similar approach (Rosetta 1) when they switched from PowerPC to Intel.
Pretty much every binary for macOS is distributed as a "Universal Binary", which contains binaries for both x86 and Apple Silicon, so x86 isn't being abandoned, only the ability to run applications on Apple Silicon that haven't been redistributed / recompiled in 6-7 years.
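(Aside: if you want to check whether a particular process is actually being translated, Apple documents a sysctl for exactly that. A minimal Swift sketch; the helper name is made up, the sysctl is the documented one.)

```swift
import Foundation

// Sketch: ask the kernel whether the current process is being translated
// by Rosetta, via the documented "sysctl.proc_translated" flag.
func isRunningUnderRosetta() -> Bool? {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    if sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == -1 {
        // ENOENT means the flag doesn't exist at all, i.e. a native Intel Mac.
        return errno == ENOENT ? false : nil
    }
    return translated == 1
}

print(isRunningUnderRosetta() == true ? "translated by Rosetta" : "native")
```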
It's reasonable to say this is wrong. But really, this seems like a tiny subset of users. Who bought a Mac Pro in 2023 after Apple Silicon had been out for 3 years already? Almost nobody, because it wasn't a real performance improvement by that time. For those extremely niche folks for which it was somehow still beneficial, they definitely won't want to still be using such a machine in 2028. They will have moved on to something like an M5 Ultra Mac Studio or whatever form the Mac Pro takes next.
The largest impact would be that the reversion would only affect native macOS apps, while Catalyst apps, remote iPhone apps and locally installed iPad apps would still have the Liquid Glass UX.
> So maybe there is hope that the core x86_64 binary translator will stick around for things like VM and emulation of generic (linux? wine?) binaries
It's mostly for their game-porting toolkit. They have an active interest in Windows-centric game developers porting their games to Mac, and that generally doesn't happen without the compatibility layer.
I'm sure there's lots of x86_64-specific code in the macOS userland that is much more than just a recompile - things like the Safari/JavaScriptCore JIT, the Quartz Composer / Core Animation graphics stack and the video encoder/decoder libraries, as well as various low-level Objective-C pointer-tagging and message-passing ABI shenanigans, and so on. This is probably why 32-bit Intel Mac app support was dropped pretty hard and pretty fast, as the entire runtime and userland probably required a lot of upkeep. As just one example, 32-bit Intel Objective-C had "fragile instance variables", which was a can of worms.
It’s not like they were doing it to make me happy; they are doing it to sell Macs and lock people into the Apple ecosystem. Maybe there is only a negligible % of people still using it - possibly because the M1 is 6 yrs old, IIRC.
They barely just released Containerization Framework[0] and the new container[1] tool, and they are already scheduling a kneecapping of this two years down the line.
Realistically, people are still going to be deploying on x64 platforms for a long time, and given that Apple's whole shtick was to serve "professionals", it's really a shame that they're dropping the ball on developers like this. Their new containerization stuff was the best workflow improvement for me in quite a while.
[0] https://github.com/apple/containerization
[1] https://github.com/apple/container
Yeah, it kind of kills me that I am writing this on a Samsung Galaxy Book 3 Pro 360 running Windows 11 so that I can run Macromedia Freehand/MX (I was a beta-tester for that version), which lets me still access Altsys Virtuoso 2 files from my NeXT Cube (Virtuoso 2 ~= Macromedia Freehand 4) for a typeface design project I'm still working on: a digital revival of a hot metal typeface created by my favourite type designer/illustrator, who passed in 1991 but whose widow was gracious enough to give me permission to revive it.
I was _so_ hopeful when I asked the devs to revive the Nx-UI code so that FH/MX could have been a native "Cocoa" app....
That's why like 80%+(?) of the corporate world runs Windows client-side for their laptops/workstations. They don't want to have to rewrite their shit whenever the OS vendor pushes an update.
Granted, that's less of an issue now with most new SW being written in JS to run in any browser, but old institutions like banks, insurances, industrial, automation, retail chains, etc. still run some ancient Java/C#/C++ programs they don't want to, or can't, update for various reasons; it keeps the lights on.
Which is why I find it adorable when people in this bubble think all those industries will suddenly switch to Macs.
It seems to talk about Rosetta 2 as a whole, which is what the containerization framework depends on to support running amd64 binaries inside Linux VMs (even though the kernel still needs to be ARM).
Is there a separate part of Rosetta that is implemented for the VM stuff? I was under the impression Rosetta was some kind of XPC service that would translate executable pages for Hypervisor Framework as they were faulted in, did I just misunderstand how the thing works under the hood? Are there two Rosettas?
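For what it's worth, the Linux-VM path does go through its own public API: the Virtualization framework can share the translator into a guest as a directory, and the guest mounts it and registers it with binfmt_misc. A rough Swift sketch of the host side (macOS 13+; the helper name is made up, error handling and the rest of the VM configuration are omitted):

```swift
import Virtualization

// Sketch: expose Rosetta to a Linux guest via the Virtualization framework.
func attachRosettaShare(to config: VZVirtualMachineConfiguration) throws {
    switch VZLinuxRosettaDirectoryShare.availability {
    case .notSupported:
        return  // Intel Mac, or Rosetta can't be used for Linux guests here.
    case .notInstalled:
        // One-time install; same effect as `softwareupdate --install-rosetta`.
        VZLinuxRosettaDirectoryShare.installRosetta { _ in }
        return  // retry once installation has finished
    case .installed:
        break
    @unknown default:
        break
    }

    // The guest sees a virtio-fs share tagged "rosetta"; it mounts that share
    // and registers the contained rosetta binary with binfmt_misc, so x86_64
    // ELF executables are handed to the translator transparently.
    let device = VZVirtioFileSystemDeviceConfiguration(tag: "rosetta")
    device.share = try VZLinuxRosettaDirectoryShare()
    config.directorySharingDevices.append(device)
}
```

So at the API level, at least, the Linux path has its own entry point, separate from the one macOS apps go through.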
> and given that Apple's whole shtick was to serve "professionals",
When was the last time this was true? I think I gave up on the platform around the new keyboards, which clearly weren't made for typing, and the non-stop "Upgrade" and "Upgrade" notifications that you couldn't disable, only push forward into the future. Everything they've done since then seems to have been to impress the Average Joe, not to serve professionals.
My reasons for leaving Apple had nothing to do with this decision. I was already no longer working on Rosetta 2 in a day-to-day capacity, although I would still frequently chat with the team and give input on future directions.
Just went through that thread, I can't believe this wasn't a team of like 20 people.
It's crazy to me that Apple would put one guy on a project this important. At my company (another FAANG), I would have the CEO asking me for updates and roadmaps and everything. I know that stuff slows me down, but even without that, I don't think I could ever do something like this... I feel like I do when I watch guitar YouTubers - just terrible.
I hope you were at least compensated like a team of 20 engineers :P
It doesn’t say if that is going away. The message calls out another part as sticking around:
> Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
Since the Linux version of Rosetta requires even less from the host OS, I would expect it to stay around even longer.
Yes, that was my first thought as well, and as the images aren't designed to be run on a Mac specifically, like a native app might be, there is no expectation for the developers to create a native Apple Silicon version. This is going to be a pretty major issue for a lot of developers.
Case in point - Microsoft's SQL Server docker image, which is x86-only with no hint of ever being released as an aarch64 image.
I run that image (and a bunch of others) on my M3 dev machine in OrbStack, which I think provides the best docker and/or kubernetes container host experience on macOS.
I’ve worked in DevOps and companies I’ve worked for put the effort in when M1 came out, and now local images work fine. I honestly doubt it will have a huge impact. ARM instances on AWS, for example, are much cheaper, so there’s already lots of incentive to support ARM builds of images
It's not just images; any software the images pull down must also support ARM64 now as well. For example, the official Google Chrome binaries used by Puppeteer for headless browsing/scraping don't have a Linux ARM build.
How does this work currently? I was under the impression that Docker for Mac already ran containers in an x86 VM. Probably outdated info, but I’m curious when that changed.
Docker on Mac runs containers in a VM, but the VM is native to the CPU architecture and takes advantage of hardware virtualization.
You can of course always use QEMU inside that VM to run non-native code (e.g. x86 on Apple Silicon); however, this is perceived as much slower than using Rosetta (instead of QEMU).
Seems premature. My scanner software, SnapScan, which is still regularly updated, requires Rosetta. ABBYY FineReader, the best Mac OCR, requires Rosetta. Although they may be related, as the SnapScan software does OCR with the FineReader engine.
The M1 chip and Rosetta 2 were introduced in 2020. macOS 28 will be released in 2027. 7 years seems like plenty of time for software vendors to make the necessary updates. If Apple never discontinues Rosetta support, vendors will never update their software to run natively on Apple chips.
This is also consistent with Apple’s previous behavior with backwards compatibility, where Apple would provide a few years of support for the previous platform but will strongly nudge developers and users to move on. The Classic environment in Mac OS X that enabled classic Mac OS apps to run didn’t survive the Intel switch and was unavailable in Leopard even for PowerPC Macs, and the original Rosetta for PowerPC Mac OS X applications was not included starting with Lion, the release after Snow Leopard.
The main problem is not native software, but virtualization, since ARM64 hardware is still quite uncommon for Windows/Linux, and we need Rosetta for decent performance when running AMD64 in virtual machines.
There is lots of existing software (audio plugins, games, etc.) that will never see an update. All of that software will be lost. Most new software has ARM or universal binaries. If some vendors refuse to update their software, it's their problem. Windows still supports 32-bit applications, yet almost all new software is 64-bit.
I think this is exactly what they're issuing this notice to address. Rosetta performs so well that vendors are pretty okay just using it as long as possible, but a two year warning gives a clear signal that it's time to migrate.
I usually agree with Apple but I don't agree with this. Rosetta 2 is basically magic - why would they take away one of their own strongest features? If they want big-name apps to compile to Apple Silicon, why can't they exert pressure through their codesigning process instead?
The “big name apps” have already moved to Apple Silicon. Rosetta helped them with that process a few years ago. We’re down to the long tail apps now. At some point, Rosetta is only helping a couple people and it won’t make sense to support it. I just looked, and right now on my M1 Air, I have exactly one x86 app running, and I was honestly surprised to find that one (Safari plug-in). Everything else is running ARM. My workload is office, general productivity, and Java software development. I’m sure that if you allow your Mac to report back app usage to Apple, they know if you’re using Rosetta or not, and if so, which apps require it. I suspect that’s why they’re telegraphing that they are about ready to pull the plug.
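For anyone who wants to run the same check from code, AppKit exposes the architecture each running application is executing as; a small Swift sketch that lists apps currently running as x86_64 (which, on Apple Silicon, means under Rosetta):

```swift
import AppKit

// Sketch: list running GUI apps whose executing architecture is x86_64.
// On an Apple Silicon Mac, these are the ones being translated by Rosetta.
let translated = NSWorkspace.shared.runningApplications.filter {
    $0.executableArchitecture == NSBundleExecutableArchitectureX86_64
}

for app in translated {
    print(app.localizedName ?? "unknown", "-", app.bundleIdentifier ?? "?")
}
```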
How much die area does it use that could be used for performance? How much engineering time does it use? Does it make sense to keep it around, causing ~30% more power usage/less performance?
There are many acceptable opposing answers, depending on the perspective of backwards compatibility, cost, and performance.
My naive assumption is that, by the time 2027 comes around, they might have some sort of slow software emulation that is parity to, say, M1 Rosetta performance.
I spent what I would consider to be a lot of money for a unitasker Fujitsu scanner device and am just astounded by how unmaintained and primitive the software is. I only use it on a Windows machine though, so I'm not in the same boat.
They were pretty quick to sunset the PPC version of Rosetta as well. It forces developers to prioritize making the change, or to make it clear that their software isn't supported.
The one I have my eye on is Minecraft. While not mission-critical in any way, they were fairly quick to update the game itself, but failed to update the launcher. Last time I looked at the bug report, it was closed and someone had to re-open it. It’s almost like the devs installed Rosetta 2 and don’t realize their launcher is using it.
Owning a Mac has always meant not being able to rely on 3P software. Forget printer/scanner drivers. Even if they target macOS perfectly, there will come a day when you need to borrow a Windows PC or an old Mac to print.
It happens to be OK for me as a SWE with basic home uses, i.e. their exact target user. Given how many other people need their OS to do its primary job of running software, I don't know how they expect to gain customers this way. It's good that they don't junk up the OS with absolute legacy support, but at least provide some kind of emulation, even if it's slow.
Phasing out Rosetta 2 seems like a reasonable move. Maintaining backward compatibility indefinitely adds complexity and technical debt. Apple has supported Intel-based systems for a long time, and this step aligns with their goal of keeping macOS streamlined for Apple Silicon.
I wouldn't call 6 years a long time for support. Imagine if Microsoft announced that any software older than 2020 would no longer work. Not just be out of support or stop getting updates, but actually become unrunnable.
The problem I have with it is Apple unilaterally deciding that support ends. I don't see the harm in no longer supporting it but leaving it as an option for legacy use: no guarantee that anything will work with it, and no support for it. They've done this with their hardware before, but here it's just a cudgel to force devs to update their apps.
This seems to basically only apply to full-fledged GUI apps and excludes e.g. games, so potentially stuff like Rosetta for CLI isn't going anywhere either
But games are full fledged GUI apps. At a minimum they have a window.
It’s really unclear what it means to support old games but not old apps in general.
I would think the set of APIs used by the set of all existing Intel Mac games probably comes close to everything. Certainly nearly all of AppKit, OpenGL, and Metal 1 and 2, but also media stuff (audio, video), networking stuff, input stuff (IOHID etc).
So then why say only games when the minimum to support the games probably covers a lot of non games too?
I wonder if their plan is to artificially limit who can use the Intel slices of the system frameworks? Like hardcode a list of blessed and tested games? Or (horror) maybe their plan is to only support Rosetta for games that use Win32 — so they’re actually going to be closing the door on old native Mac games and only supporting Wine / Game Porting Toolkit?
Games use a very small portion of the native frameworks. Most would be covered by Foundation, which they have to keep working for Swift anyway (Foundation is being rewritten in Swift) and just enough to present a window + handle inputs. D3DMetal and the other translation layers remove the need to keep Metal around.
That’s a much smaller target of things to keep running on Intel than the whole shebang that they need to right now to support Rosetta.
If you'd like to see an interesting parallel, go look at how Microsoft announced supporting DirectX 12 on Windows 7 for a blessed apps list - basically because Blizzard whined hard enough and was a big enough gorilla to demand it.
I don't know if this is the situation with Rosetta 2.
It was five years, from 2006 to 2011. Rosetta 2 will have been there for seven years (currently at five).
https://en.wikipedia.org/wiki/Rosetta_(software)
They could just revert all that large change with no loss to the users.
Or, one can dream: RVA23
"CIOs say Apple is now mission critical for the enterprise" [1]
[1]: https://9to5mac.com/2025/10/25/cios-say-apple-is-now-mission...
https://news.ycombinator.com/item?id=42483895
They released this a while ago, which hints at supporting amd64 beyond the Rosetta end date.
Is it slow? Absolutely. But you'd be insane to run it in production anyway.
A test suite that becomes 10x slower is already a huge issue.
That said, it doesn't seem like Rosetta for container use is going anywhere. Rosetta for legacy Mac applications (the macOS-level layer) is.
If you're not willing to commit to supporting the latest and greatest, you shouldn't be developing for Apple.