So they are serving lower resolutions to ARM devices with no GPU acceleration. Doesn't sound like deliberately crippling Asahi Macs to me; more like a reasonable default, assuming those devices are something like a Raspberry Pi. I'm assuming it is just a default and you can select whatever resolution you like.
Detecting whether the device has 20 cores or whatever, like he says, would (if it's even possible) be more invasive of privacy. They would be criticized for that too. There's just no winning lol
Figuring out good defaults for this is really hard. In general, "Linux + aarch64 = lower performance = let's do lower resolution by default" seems like a reasonable thing, although obviously not fool-proof. Also note that the "is_arm()" check is followed by an "|| is_android()" bit in his screenshot.
Asahi Linux is a comparatively small project with few real users. It's pretty arrogant to assume Google is intentionally targeting them specifically.
If anything, the bug here is that Chromium reports x86 on ARM systems. Although for Asahi specifically that's probably the right choice, it's a lot less clear to me that it's the right choice for all systems. (I'm guessing those report aarch64 correctly.)
> It's pretty arrogant to assume Google is intentionally targetting them specifically.
The Asahi Linux developers have a history of taking everything personally. If Apple adds a new way to start an ELF in macOS 12.3, it's because of them. If someone criticizes them publicly on Hacker News and isn't immediately shamed, it's because the actual leadership at Hacker News hates them (not an exaggeration considering their public stunt here where they blamed @dang for everything). If someone doesn't quite agree with their politics in perfect lockstep, they will publicly try to force you to resign (look at what happened with Dlang). And, more recently, threatening to keep changes downstream and out of the mainline Linux kernel as retaliation for what they call inappropriate conduct (which, well, if the COC of Linux hasn't been violated, I'm guessing it's probably them). And of course, if it's found out that the M2 has a technical bug with audio processing that literally nobody noticed until now, it's proof that Apple is full of incompetent idiots, unlike them.
It's also why, if I were running a corporation, I would almost require that anyone using Asahi on their Mac use a corporate fork for their own protection. I wouldn't rule out retaliation.
>Figuring out good defaults for this is really hard.
Just in general it's hard... for many things, I find. There's perpetually a bunch of seemingly reasonable "yeah, but"s for almost every default once you hit a certain number of variables, use cases, differing users, customers, and so on.
You are assuming incorrectly. The bug report the user filed with Mozilla (linked in the thread) indicates that 4K resolution wasn't available, yet once he changed the user agent string it was, and it worked without issue.
It sounds less like "let's help the poor Raspberry Pi users" or "let's make Asahi's experience worse" and more like "workaround for a buggy TV that nobody bothered to implement correctly".
If you want to know the media decoding capabilities of a Web Browser, you can use the MediaCapabilities API. In Firefox, when flipping the "resist fingerprinting" flag, it's spoofed appropriately based on a study of what the most common results are. This is available on all engines, desktop and mobile.
Depending on the implementation, what it returns can be based on the presence of optimized software decoders for a platform, presence of hardware, resolution and other characteristics of the video, it can also be based on a decoding benchmark that the web browser runs, etc.
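As a rough sketch of what such a query looks like (the specific codec string and 4K/20 Mbps numbers here are illustrative, not anything YouTube actually asks for):

```javascript
// Sketch: asking the MediaCapabilities API whether 4K VP9 playback is viable.
// The config values (codec, resolution, bitrate, framerate) are made up for
// illustration; decodingInfo() itself is the standard API.
function build4kVp9Config() {
  return {
    type: "file",
    video: {
      contentType: 'video/webm; codecs="vp09.00.10.08"',
      width: 3840,
      height: 2160,
      bitrate: 20_000_000,
      framerate: 60,
    },
  };
}

async function canPlay4k() {
  // navigator.mediaCapabilities only exists in browsers; outside one,
  // report "unknown" so a caller can pick a conservative default.
  if (typeof navigator === "undefined" || !navigator.mediaCapabilities) {
    return null;
  }
  const info = await navigator.mediaCapabilities.decodingInfo(build4kVp9Config());
  return info.supported && info.smooth && info.powerEfficient;
}
```

A site could then offer 4K only when `smooth` and `powerEfficient` both come back true, instead of guessing from the user agent string.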
Could it not tell if it's actually managing to play and keep up with time? Similarly to how (I believe) it'll degrade if the network is such that it's not loading 4k say quickly enough, it'll switch (if 'auto') to 1080p or whatever.
> Detecting if the device has 20 cores or whatever like he says if possible would be more invasive of privacy.
What he claims though is that they do already do that at least if it's not aarch64:
> Quality 1080 by default. If your machine has 2 or fewer cores, quality 480. If anything ARM, quality 240.
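If the quoted claim is accurate, the heuristic amounts to something like this. To be clear, this is a hypothetical reconstruction of the quoted behavior, not YouTube's actual code:

```javascript
// Hypothetical reconstruction of the quoted default-quality heuristic.
function defaultQuality(cores, isArmOrAndroid) {
  if (isArmOrAndroid) return 240; // "If anything ARM, quality 240"
  if (cores <= 2) return 480;     // "If your machine has 2 or fewer cores, quality 480"
  return 1080;                    // "Quality 1080 by default"
}
```

In a browser, the core count would presumably come from `navigator.hardwareConcurrency` and the ARM guess from the user agent string, which is exactly the fragile part being criticized here.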
> Could it not tell if it's actually managing to play and keep up with time? Similarly to how (I believe) it'll degrade if the network is such that it's not loading 4k say quickly enough, it'll switch (if 'auto') to 1080p or whatever.
Probably, but it's pretty poor UX if videos start super-choppy by default and slowly downscale to a playable resolution over several seconds. Having a good default is still important.
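For reference, the degrade-while-playing approach being discussed could look something like this. The ladder and the 10% dropped-frame threshold are invented for the sketch:

```javascript
// Step down one rung of a resolution ladder when too many frames are dropped.
// In a browser, droppedRatio could be computed from
// video.getVideoPlaybackQuality().droppedVideoFrames / totalVideoFrames.
const LADDER = [2160, 1440, 1080, 720, 480, 240];

function nextQuality(current, droppedRatio, threshold = 0.1) {
  const i = LADDER.indexOf(current);
  const canStepDown = i >= 0 && i < LADDER.length - 1;
  return droppedRatio > threshold && canStepDown ? LADDER[i + 1] : current;
}
```

This is exactly why the starting point matters: starting at 2160 on a weak device means several choppy step-downs before playback settles.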
Have to agree with this. Not saying the check should be there, but the majority of non-mobile, non-server ARM devices are generally underpowered, especially in per-core performance.
They keep serving me 380p by default, which I have to change, and I've got gigabit fiber with a modern Ryzen and a monster GPU running the latest Windows. So whatever they're doing auto-quality-wise has some screws loose.
That way you can fall back for anything too old or weird to support that API, and if the vendor complains there’s a simple response: implement the standard.
That sounds reasonable but it's weird that Chrome lies by pretending to be x86_64. The same performance concerns that apply to Firefox should apply to it.
Virtually all ARM devices do GPU decoding; they kind of have to.
This was a well-understood problem in the '80s: if two sides need to establish a commonality, one offers options and the other picks. Google controls both YouTube and the most common browser, making it uniquely positioned to handle this well.
True, but any competing operating system could just falsely report that it is running x86-64 and therefore trick YouTube into serving high-res content.
Disclosure: currently (as a personal project) slowly building an alternative to Android
Drives me crazy how regressive Apple is with computing. While it seemed to shake out that GPU AI was the only real solution, earlier this year people were (and still are) putting resources into CPU-based approaches.
I can't imagine someone @-ing me because, when they were 18 years old and status-insecure, someone sold them a corporate identity, and they now feel entitled to maintain their edge case. I need to stop giving away my stuff for free.
How many of these devices cater to large screens? IMHO most such devices cater to small screens, where 4K is already overkill. Even 720p is overkill on many of the smaller devices.
On my 1080p phone I set the resolution to 4K, because 4K plus downscaling makes for much better video quality than the butchered blur of pixels that YouTube's 1080p encoder produces.
I would've preferred a higher-bitrate 1080p stream, but YouTube only offers that on a limited number of videos (assuming you have Premium, of course).
The Pinebook and Pinebook Pro have 1080p screens and can have hardware video decoding support. I wrote 'can' because the drivers are still in staging for the mainline Linux kernel.
Yeah, this was likely done as a shortcut, with assumptions made about ARM, but it's also an example of a shortcut that might come back and bite you. ARM is absolutely a moving performance target!
Another Firefox-on-Linux web trouble I've had: Twitch refuses to log me in, claiming I need to use a supported browser. Then they link to a page saying Firefox is supported. I have this happening on two different computers.
The last time I used Twitch it also insisted that my 16 character password was too long for their 40 character limit, so that kind of thing seems to be normal there.
The issue on Twitch only pops up when you enable the `resistFingerprinting` flag. If Twitch can't fingerprint you, they simply refuse your attempt to log in.
> Someone solved it by disabling fingerprinting resistance
A similar issue happens in some Google services too, e.g. Google Docs. It does not work at all (or everything is at the wrong resolution) if you have fingerprint resistance enabled.
It doesn’t specifically target Asahi Linux and/or Firefox but reduces resolution on ARM devices. That behavior and assumptions made through a user agent string may be debated but that’s another topic.
Exactly, the post even mentions that Chromium pretends to be AMD64 to get around the limitation.
In most cases this detection is "good enough", but with Asahi a new edge case has emerged. Ideally the device would be able to tell the server about its capabilities, but that could be seen as an invasion of privacy and could quickly turn into a new metric for profiling internet users across sites.
Yeah, the upshot is just 'Google Chrome gets the good settings and the rest can eat dirt'. It's not specifically Asahi Linux that Google/YouTube is acting antisocially toward here.
Interesting that on Macs with Apple chips (M1 Pro), it seems both Chrome and Firefox report them as Intel:
Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:120.0) Gecko/20100101 Firefox/120.0
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36
Edit: even Safari itself:
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.1.2 Safari/605.1.15
https://w3c.github.io/media-capabilities/
https://developer.mozilla.org/en-US/docs/Web/API/Media_Capab...
> Why does this not affect Chromium? Because chromium on aarch64 pretends to be x86_64
[1]: https://developer.mozilla.org/en-US/docs/Web/API/Navigator/h...
For instance, the RPi 5 cut out the H.264 hardware decoder versus previous gens, and now only does H.265 decoding in hardware.
If spoofing the user agent fixes the issue despite being on ARM, then this does sound like a legitimate issue.
> Chrome is not affected even if it claims to be aarch64.
To be fair, there are a lot of ARM Chromebooks and SBCs that can't really handle 4K video.
Raspberry Pi et al. have been there for many, many years.
https://bbs.archlinux.org/viewtopic.php?id=289645
https://www.reddit.com/r/Twitch/comments/1118xgz/unsupported...
Someone solved it by disabling fingerprinting resistance https://www.reddit.com/r/archlinux/comments/100t2q3/comment/...
YouTube, however, did stop working for me this morning. It's now demanding I sign in, which it didn't before.
It is not technically a privacy issue if the device "tells" about its capabilities, since it is free to tell whatever it wants.
If the information is mandated via remote attestation, and the server can force it and otherwise refuse to work properly, then it becomes a privacy issue.
It's more like that guy plays the victim card to build some PR.
https://www.computerworld.com/article/3389882/former-mozilla...