What I really respect is the dedication to completing support for the M1/M2. Too many projects get dropped the moment the next shiny ray-traced toy comes along.
I know this type of work can be challenging to say the least. My own dabble with Zig and Pinephone hardware drivers reminded me of some of the pain of poorly documented hardware, but what a reward when it works.
My own M1 was only purchased because of this project and Alyssa's efforts with OpenGL and OpenGL ES. It only ever boots Asahi Linux. Thank you very much for your efforts.
One thing I noticed in the M4 MacBook announcement comments was, first, how many people were happy with their M1 laptops and, second, how many people kept their Macbooks for nearly a decade; these devices are built to last, and I applaud long-term support from Apple itself and the Linux community.
Also, since it's open source, Apple themselves are probably paying attention; I didn't read the whole thing because it goes over my head, but she discussed missing features in the chip that are being worked around.
I have the opposite experience. Apple is incredibly difficult and expensive to repair. But I have been pleasantly surprised by the longevity and repairability of ThinkPads. I like those Apple M processors, but I know where I'm spending my money.
> ...how many people kept their Macbooks for nearly a decade; these devices are built to last...
This is no longer true for me. I've been an Apple fan since the Apple ][ days, and reluctantly left the ecosystem last year. The hardware walled garden, with soldered-on components and components tied to specific units for ostensible privacy and security reasons (I don't buy those reasons), combined with the steady degradation of the OS's polish and fine attention to detail, meant that for me personally I could no longer justify the cognitive load of continuing with a Mac laptop as my daily driver. While others might point to a cost and/or value differential, I'm in the highly privileged position of being insensitive to those factors.
The last straw was a board-soldered SSD that quit well before I was willing to upgrade, and even Louis Rossmann's shop said it would cost far more to desolder it and solder a new one on than the entire laptop is worth. I bought a Framework the same day; when it arrived I restored my data files to it and have been running it as my daily driver ever since. The Mac laptop is still sitting here, as I keep hoping to find time to develop my wave-soldering skills and try my hand at saving it from the landfill, or break down and unsustainably pay for the repair (I do what I can to avoid perpetuating dark patterns, but it is a Sisyphean effort).
I found myself having to think more and more about working around the Mac ecosystem instead of working invisibly within it (like a fish in water not having to think about water), to the point that it no longer made sense to stick with it. It has definitively lost the "It Just Works" polish that bound me so tightly to the ecosystem in the past. I see no functional difference in my daily work patterns between using a Mac laptop and a Framework running Fedora.
To be sure, there are a lot of areas I have to work around on the Framework-Fedora daily driver, but for my personal work patterns and needs, I judged them to cost roughly the same time and cognitive load I was spending on the Mac. Maybe Framework-Fedora is slightly worse, but it's close enough that I'd rather throw my hat into the more open ring than into the increasingly closed walled garden Apple is clearly taking us toward, which does not align with my vision for our computing future. It does not hurt that experimenting with local LLMs and the various DevOps tooling for my work's Linux-based infrastructure is far easier and less friction-laden on Fedora for me, though YMMV for certain. It has been, and will continue to be, an interesting journey; it has been fun so far and has brought back fond memories of my early Apple ][, Macintosh 128K, and Mac OS X days.
Underrated point. Maybe it's the aluminum unibody or the more stable OS, but in my experience the average MBP lifetime is meaningfully longer than that of a Windows machine. My longest-lasting Windows machine was a T400 ThinkPad, which lasted 5 years before the Core 2 Duo architecture stopped being able to keep up. It got replaced with an HP Envy with great specs but made of plastic, which barely lasted 1.5 years before the screen (literally) fell off. That was replaced with a 17" 2014 MBP which is still alive after an SSD replacement.
People talk about Apple prices being higher but the longevity of their devices really erases that price difference for me.
I still have an old iPhone 8 that I test with, and it runs well; I've had numerous Android devices in that timeframe die, slow to a crawl, or at best deliver erratic performance.
> how many people kept their Macbooks for nearly a decade; these devices are built to last, and I applaud long-term support from Apple itself and the Linux community.
Anecdotal but I have a White Macbook from 2010. It's sitting on a shelf not because it doesn't work (minus the battery), but because it's too outdated to be of much use. And there's a small crack in the case.
I have a Macbook Pro from 2016. I still whip it out from time to time when I don't want to use my latest one for whatever reason (say, network troubleshooting). It works fine. Even the battery still holds a charge. If those two had USB-C (and charging over that port) I'd probably use them more often. Their keyboards are also pleasant (since they're from before the scissor-key nonsense).
My company has thousands of Macbooks. It's rare that I see anyone at all with issues. They aren't perfect, but the build quality is far better than most PC laptops and so is the attention to detail. The price premium kinda blows, but the M line made them way better.
I have a personal M1 13" Air, and a work M3 16" Pro, and other than the silly 8GB limitation, I don't notice much of a difference in what I do when using the Air.
I'm always surprised when people speak highly of Apple devices here. While they do have certain advantages, there are some issues that should be dealbreakers for tech literate people. (in my own, possibly biased opinion at least)
In case of Macbooks, it's the fact that they refuse to provide an official GPU driver for Linux and general poor support for things outside the walled garden. The Asahi stuff is cool and all, but come on, is a 3.4 trillion dollar company really going to just stand there and watch some volunteers struggling to provide support for their undocumented hardware without doing anything substantial to help? That sounds straight up insulting to me, especially for such a premium product.
For iPhones, it's the fact that you are not allowed to run your own code on YOUR OWN DEVICE without paying the Apple troll toll and passing the honestly ridiculous App Store requirements.
And of course, in both cases, they actively sabotage third party repairs of their devices.
As someone who's been coding for more than 20 years: the happiest and most depressing moments in my career both came during a hardware project I participated in for only 4 months.
Alyssa's solution to the 4KB vs. 16KB page size discrepancy by running everything in a virtual machine feels like both a clever hack and a potential performance bottleneck. It makes me wonder about the long-term implications of such workarounds. Are we reaching a point where the complexity of bridging these gaps outweighs the benefits, especially when dealing with proprietary hardware designed to be a closed ecosystem?
This also touches on a broader question about the future of open-source efforts on platforms that are inherently restrictive. While it's inspiring to see games like Control running at 45fps on an M1 Max with open-source drivers, it raises the question: Should the community continue to invest significant resources into making closed systems more open, or should efforts be redirected toward promoting more open hardware standards?
Apple's approach to hardware design warrants criticism. By creating GPUs with limitations that hinder standard functionalities like tessellation shaders and using non-standard page sizes, Apple places unnecessary obstacles in the path of developers. This closed ecosystem not only complicates the work for open-source contributors but also stifles innovation that could benefit all users.
> Apple's approach to hardware design warrants criticism. By creating GPUs with limitations that hinder standard functionalities like tessellation shaders and using non-standard page sizes, Apple places unnecessary obstacles in the path of developers. This closed ecosystem not only complicates the work for open-source contributors but also stifles innovation that could benefit all users.
Apple designs its hardware to suit its own ends, and its own ends only. It's obvious to everyone here that this supports their closed business model, which actually works for them very well - they make excellent hardware and (some software flakiness more recently notwithstanding) the median user will generally have an excellent time with their hardware + software as a result.
So they're not placing "unnecessary obstacles in the path of developers" at all by designing their hardware as they do - they're just focused on designing hardware to suit their own needs.
(Also note that if their hardware wasn't excellent, there wouldn't be such interest in using it in other, non-Apple-intended ways.)
I am genuinely curious if those barriers have technical justifications. There's a pretty stark difference (to me, at least) between ignoring standards in order to reinvent better wheels and intentionally diverging from standards to prevent compatibility.
It's a question of whether they're _not_ investing resources to maintain standard behavior or they are actively investing resources to diverge from it. If it's the former, I don't find any fault in it, personally speaking.
> Alyssa's solution to the 4KB vs. 16KB page size discrepancy by running everything in a virtual machine feels like both a clever hack and a potential performance bottleneck.
In her initial announcement, she mentions VM memory overhead as the reason that 16 Gigs of RAM will be the minimum requirement to emulate most Windows games.
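For anyone curious what that mismatch looks like from userspace, here's a tiny sketch of my own (not from the announcement) that just compares the running kernel's page size against the 4 KiB assumption most x86 Windows binaries bake in; on a 16K Asahi kernel the two disagree, which is why the games get a 4K-page guest kernel inside the VM:

    # Minimal sketch: compare the host kernel's page size with the 4 KiB
    # page size that most x86 Windows binaries assume. Nothing Asahi-specific;
    # it just makes the mismatch visible.
    import os

    kernel_page = os.sysconf("SC_PAGE_SIZE")   # 16384 on a 16K kernel
    assumed_page = 4 * 1024                    # what the games were built for

    print(f"kernel page size : {kernel_page} bytes")
    print(f"binary assumption: {assumed_page} bytes")

    if kernel_page != assumed_page:
        print("mismatch: 4 KiB-granularity mappings the game asks for can't "
              "be honored directly, hence the 4K-page guest kernel in the VM")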
> This closed ecosystem not only complicates the work for open-source contributors but also stifles innovation that could benefit all users.
From their perspective, motivated devs are doing all the heavy lifting for them. From this side of the ecosystem, they would mainly care about native app compatibility and a comparable AI (inference) experience.
Both of the above seem to be taken care of, sometimes through combined efforts. Other than this, they are happy to lock things down as much as they can get away with. The community unfortunately gravitates towards overall appeal rather than good open initiatives.
> Should the community continue to invest significant resources into making closed systems more open, or should efforts be redirected toward promoting more open hardware standards?
What do you mean by "should the community do X"? The community is not some form of organisation with a mission and objectives; it is a loose collection of individuals free to put their talents into exploring whatever interests them. You imply that this creative and inspiring work somehow stifles innovation and hurts users, which is frankly absurd.
Nice; that raises some interesting questions as to what Valve is planning here. Steam/Proton on Macs would make a lot of sense for them, hard as it may be. People booting Linux on their Macs to play games would probably really annoy Apple.
I have a bet going that their next Steam Deck is going to have an ARM processor.
Allegedly NVidia is working on a Windows-on-Arm (WoA) SoC, now that Qualcomm's contract with Microsoft has ended. If NVidia puts their recent capital gains into a performant ARM chip like Apple's (tapeout alone would likely run in the billion-dollar range), we can hopefully expect AMD to respond with something soon after. Once the chipsets are available, it's a matter of getting Linux to work with mixed page-size processes.
I have no idea how feasible that is, but if the emulation performance is anything like what Rosetta and the M1 proved possible, most users won't even notice.
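As a side note (my own illustration, not something from the talk or the parent comment): the per-binary half of that mixed-page-size problem is visible in the ELF program headers. Loadable segments carry a p_align field, and anything linked for a 4 KiB max page size reports 0x1000 there, which a 16 KiB-page kernel can't map as laid out. A rough Python sketch, assuming a little-endian ELF64 binary:

    # Report the largest PT_LOAD alignment in an ELF64 binary. Values below
    # 0x4000 mean the binary was linked assuming pages smaller than 16 KiB.
    import struct
    import sys

    def max_load_alignment(path):
        with open(path, "rb") as f:
            ident = f.read(16)
            if ident[:4] != b"\x7fELF" or ident[4] != 2:
                raise ValueError("not a 64-bit ELF file")
            f.seek(0x20)                                  # e_phoff
            (e_phoff,) = struct.unpack("<Q", f.read(8))
            f.seek(0x36)                                  # e_phentsize, e_phnum
            e_phentsize, e_phnum = struct.unpack("<HH", f.read(4))
            align = 0
            for i in range(e_phnum):
                f.seek(e_phoff + i * e_phentsize)
                p_type, _, _, _, _, _, _, p_align = struct.unpack(
                    "<IIQQQQQQ", f.read(56))
                if p_type == 1:                           # PT_LOAD
                    align = max(align, p_align)
            return align

    if __name__ == "__main__":
        a = max_load_alignment(sys.argv[1])
        ok = "fine for 16 KiB pages" if a >= 0x4000 else "assumes smaller pages"
        print(f"max PT_LOAD alignment: {a:#x} ({ok})")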
> People booting Linux on their Macs to play games would probably really annoy Apple.
You don't have to boot Linux to play PC games on a Mac.
Apple already provides the tools you need to build a Mac native equivalent to Proton.
There are several options built using those tools, both paid (CrossOver) and free/open source (Whisky).
Whisky combines the same open-source Wine project that Proton leverages with Apple's Rosetta x86 emulation and Apple's DirectX emulation layer (part of the Game Porting Toolkit) into a single easy-to-use tool.
The work by Alyssa R and Asahi Lina is great stuff. I have to say that a lot of this is really inscrutable unless you’re used to driver code. I wish it were much easier to write this stuff but hardware stuff is so idiosyncratic.
Have to say I do enjoy all the old school style whimsy with the witch costume and whatnot.
I've just been watching her recent talk. I noticed she appears to change slides with a wave of her wand. Is there a known piece of hardware one can purchase to do this?
I tried googling, but finding the specific result I'm interested in amongst all the blog-spam garbage related to PowerPoint is beyond me. Even Google's own AI couldn't help. Sad times!
She might be doing some OpenCV/MediaPipe gesture tracking? There are lots of tutorials out there, and it's not super difficult to get some basic gesture tracking going on your laptop.
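If anyone wants to try the DIY version, here's a minimal sketch of that idea (entirely hypothetical, nothing to do with how the talk was actually run): MediaPipe hand tracking plus pyautogui firing a key press on a quick sideways wave. The 0.15 threshold is an arbitrary illustrative value.

    # Hand-wave slide advancing: track the wrist landmark with MediaPipe and
    # press the right-arrow key when it moves quickly across the frame.
    import cv2
    import mediapipe as mp
    import pyautogui

    hands = mp.solutions.hands.Hands(max_num_hands=1)
    cap = cv2.VideoCapture(0)
    prev_x = None

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            x = result.multi_hand_landmarks[0].landmark[0].x   # wrist, 0..1
            if prev_x is not None and x - prev_x > 0.15:       # fast wave
                pyautogui.press("right")                       # next slide
            prev_x = x
        else:
            prev_x = None

    cap.release()

In practice you'd want some debouncing so one wave doesn't skip three slides, but that's the general shape of it.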
In this case there's probably a human doing the slides, but smart watches have apps that can control slideshows in a variety of ways. There's at least one WearOS app (WowMouse) that does some gesture based presentation control.
The graphics pipeline in modern GPUs is mostly a thin low-level Vulkan/Metal-like layer on top of a massively parallel CUDA-like compute architecture.
It's basically all emulated. One of the reasons GPU manufacturers are unwilling to open source their drivers is because a lot of their secret sauce actually happens in software in the drivers on top of the massively parallel CUDA-like compute architecture.
As a former insider, this is NOT remotely how I would describe reality.
I have signed NDAs and don't feel comfortable going into any detail, other than saying that there is a TON going on inside GPUs that is not "basically all emulated".
This statement isn't true at all. There are tons of fixed-function hardware units for graphics in GPUs in addition to the compute cores: triangle rasterizers, texture units, ray-tracing units, blitters, copy engines, video codecs, etc. They interact with the shader/compute cores, and it's becoming more common for the shader cores to drive the rasterizer and the rest rather than vice versa (mesh shaders and ray tracing, for example).
Calling it "all emulated" is very very far from the truth.
You can independently verify this by digging into open source graphics drivers.
I didn't read the article and don't know about Apple but that's definitely not true for everyone. Source: see amdgpu built on top of HSA.
EDIT: To be precise: yes, of course every chip is a massively parallel array of compute units, but CUDA has absolutely nothing to do with it, and no, not every company buries the functionality in the driver.
You know, it's not like Apple has a lot to lose in terms of market share if they opened up even just the specs of their hardware to make the lives of folks like the Asahi team easier. What does Apple have in terms of the desktop market? 5%?
If anything, fully baked, working desktop Linux on ARM Mac hardware would drive sales, and Apple would still get the profits. Besides, their services are mostly still available too: Tux-loving folks can subscribe to Apple TV, Music, etc.
A lot of it is software, but not necessarily in the driver. Nouveau folks pretty much gave up and use NVidia's firmware blob going forward. While that's mostly due to NVidia not cooperating in making crucial capabilities of the GPU available to non-signed firmware blobs, the upside is that it will hopefully significantly reduce the effort connected with wiring up new hardware components on the GPU.
The things being emulated are mostly legacy features that are barely used in modern software, if at all, so the overhead of emulating them for backward compatibility isn't the end of the world. I can't blame Apple for not supporting geometry shaders in hardware, when they're widely considered to be a mistake that never should have been standardized in the first place, and Metal never supported them at all so they could only ever come up in old OpenGL code on macOS.
I wouldn't go so far as to say "mistake that should never have been standardized". Their intended use was always pretty limited, though. There's zero reason for anything built in recent memory to use them.
Yes. Apple have their own graphics API. They were able to decide that, say, geometry shaders aren't worth the chip area or engineering effort to support. Other IHVs don't get that choice; for geometry shaders, for instance, they're part of both Vulkan and OpenGLES, there are games and benchmarks that use them, and customers (end-users, gamedevs, review/benchmark sites, SoC vendors) will evaluate GPUs based, in some small part, on how good their geometry shader support is. Same story for tessellation, transform feedback, and whatever else Apple dropped.
Idk, TIL. I've had a career in mobile for 15 years running and I didn't know this was a distinctive quality of mobile GPUs. (Makes sense! But all that to say, I'm very interested to hear more, and I'll trade you an answer to that question: "maybe not! Sounds like you've got some smart stuff to share :)")
Since bringing modern OpenGL and Vulkan onto Apple Silicon is impossible without an emulation layer anyway, could, theoretically, a native Metal API for Linux be created? Or is Metal too ingrained in macOS SDKs? MoltenVK is attempting to solve the same issues Alyssa was talking about in her talk [1, the last comment on the issue is hers]
Nothing is barring Apple from supporting Vulkan natively on macOS. This is essentially the closing statement of Alyssa Rosenzweig's talk.
With Apple's knowledge of the internal documentation, they are the best positioned to produce an even better low-level implementation.
At this point the main roadblock is the opinionated stance that Metal porting is the only officially supported way to go.
If Valve pulls off a witch-crafted way to run AAA games on Macs without Apple's support, that would be an interesting landscape. And maybe it would force Apple to reconsider their approach if they don't want to be cornered on their own platform...
> If Valve pulls off a witch-crafted way to run AAA games on Macs without Apple's support, that would be an interesting landscape. And maybe it would force Apple to reconsider their approach if they don't want to be cornered on their own platform...
Right, except that the Game Porting Toolkit and D3DMetal were an exact response to this scenario. Whether it's the right approach, time will tell, but Apple definitely already headed this one off at the pass.
I don't see why not. There are, after all, implementations of DirectX for Linux too, which is how Proton works. But I'm not sure if it would be better to build that API as a layer on top of Vulkan (completely "client side", like MoltenVK or dxvk do) or actually integrate it more deeply into Mesa. The first is certainly easier to start with, I guess.
I considered upgrading, but it's hard to care because my M1 is just so good for what I need it for.
Are your laptops not lasting 10 years? (battery swaps are a must though)
The only reason I switched laptops was that I wanted to do AI Art and local LLMs.
I have so many old laptops and desktops that each of my 5 kids has their own. They are even playing half-modern games on them.
Well.... (from the article):
>"frankly, I think ray tracing is a bit of a gimmick feature"
I couldn't agree more, on both counts.
False dichotomy. Do both!
[1] https://en.wikipedia.org/wiki/Alyssa_Rosenzweig#Career
https://getwhisky.app/
[1] https://www.tomshardware.com/video-games/pc-gaming/steam-lik...
Sounds like a win-win situation.
In fact, the company is so closed that Rosenzweig is the one we should consult when we encounter aliens.
https://x.com/pointinpolygon/status/1270695113967181827
It's truly stunning that anyone could do what she did, let alone a teenager (yes I know, she's not a teenager anymore, passage of time, etc :D)
[1] https://github.com/KhronosGroup/MoltenVK/issues/1524
Apple already provides its own native implementation of a DirectX to Metal emulation layer.