Valve is practically singlehandedly dragging the Linux ecosystem forward in areas that nobody else wanted to touch.
They needed Windows games to run on Linux, so we got massive Proton/Wine advancements. They needed better display output for the Deck, so we got HDR and VRR support in Wayland. They needed smoother frame pacing, so we got a scheduler that Zuck is now using to run data centers.
It's funny to think that Meta's server efficiency is being improved because Valve paid Igalia to make Elden Ring stutter less on a portable Linux PC. This is the best kind of open source trickle-down.
Game development is STILL a highly underrated field. Plenty of advancements and optimizations (in both software and hardware) can be directly traced back to game development. Hopefully, with RAM prices shooting up the way they are, we go back to keeping optimization front and center and reduce all the bloat that has accumulated industry-wide.
A number of my tricks are stolen from game devs and applied to boring software. Most notably, resource budgets for each task. You can’t make a whole system fast if you’re spending 20% of your reasonable execution time on one moderately useful aspect of the overall operation.
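A minimal sketch of that resource-budget idea in Python. The task names and budget numbers here are made up for illustration; a real system would degrade quality or shed load instead of raising:

```python
import time

class BudgetExceeded(Exception):
    pass

class TaskBudget:
    """Context manager that enforces a wall-clock budget for one task.

    Each stage of a pipeline gets a fixed slice of the total latency
    target, so no single stage can quietly eat the whole frame/request.
    """
    def __init__(self, name, budget_s):
        self.name = name
        self.budget_s = budget_s

    def __enter__(self):
        self.start = time.monotonic()
        return self

    def __exit__(self, exc_type, exc, tb):
        elapsed = time.monotonic() - self.start
        if elapsed > self.budget_s:
            # In a game you'd log the overrun and drop quality; here we raise.
            raise BudgetExceeded(f"{self.name}: {elapsed:.4f}s > {self.budget_s}s")
        return False

# Example: reserve 20 ms of a 100 ms request for a hypothetical "extras" stage.
with TaskBudget("extras", 0.020):
    time.sleep(0.001)  # well within budget, so nothing happens
```

The point is that the budget is declared up front per stage, which makes the "20% on one moderately useful aspect" problem visible the moment it appears rather than in a profiler months later.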
I think one could even say gaming as a sector single-handedly moved most of the personal computing platform forward through the '80s and '90s. Before that it was probably military and corporate. The DOS era, overclocking CPUs to push benchmarks, DOOM, 3D graphics APIs from 3dfx Glide to DirectX, faster HDDs for faster game load times. And for 10-15 years it was gaming that carried CUDA forward.
Over time they're going to touch things that people spent years waiting for Microsoft to do. I don't have an example in mind at the moment, but it's a lot better to make the changes yourself than to wait for the OS or console manufacturer to take action.
I was at Microsoft during the Windows 8 cycle. I remember hearing about a kernel feature I found interesting, then discovering that Linux had already had it for a few years.
I think the reality is that Linux is ahead on a lot of kernel stuff. More experimentation is happening.
Tbh I'm starting to think I don't see Microsoft keeping its position in the OS market. With Steam doing all the hard work and having a great market to play with, the vast range of distributions to choose from, and, most importantly, how easy it has become to create an operating system from scratch, they have not only lost all possible appeal, they seem stuck on a really weird fetishism with their taskbar and just haven't given me any kind of reason to be excited about Windows.
Their research department rocks, however, so this isn't a full bash on Microsoft at all; I just feel like they are focusing on other, way more interesting stuff.
Kernel level anti-cheat with trusted execution / signed kernels is probably a reasonable new frontier for online games, but it requires a certain level of adoption from game makers.
I do: MIDI 2.0. It's not that they're not doing it, just that they're doing it at a glacial pace compared to everyone else. They have reasons for this (a complete rewrite of the Windows media services APIs and internals), but it's taken years and delays to do something that shipped on Linux over two years ago and on Apple more like five (although there were some protocol changes over that time).
I’ve heard from several people who game on Windows that the Gamescope side panel, with OS-wide tweakables for overlays, performance, power, frame limiters and scaling, is something they miss after playing on the Steam Deck. There are separate utilities for each, but nothing as simple and accessible as Gamescope.
A good one is shader pre-caching with Fossilize. Microsoft is only now getting around to it, and it still pales in comparison to Valve's solution for Linux.
One would've expected one of the many desktop-oriented distros (some with considerable funding, even) to have tackled these things already, but somehow desktop Linux has been stuck in the awkward midway of "it technically works, just learn to live with the rough edges" until finally Valve took initiative. Go figure.
Please don't erase all the groundwork they've done over the years to make it possible for these later enhancements to happen. It wasn't like they were twiddling their thumbs this whole time!
It's not just Valve taking the initiative. It's mostly because Windows has become increasingly hostile and just plain horrible over the years. They'll be writing textbooks on how badly Microsoft screwed up their operating system.
That isn't it. Generally, wherever the majority of users are is where the majority of the focus goes.
The vast majority of people that were using Linux on the desktop before 2015 were either hobbyists, developers or people that didn't want to run proprietary software for whatever reason.
These people generally didn't care about a lot of the fancy tech mentioned, so this stuff didn't get fixed.
There's far more of that, starting with the lack of a stable ABI in GNU/Linux distros. Eventually Valve or Google (with Android) is gonna swoop in with a user-friendly OS, targetable by devs, that's actually a single platform.
> Valve is practically singlehandedly dragging the Linux ecosystem forward in areas that nobody else wanted to touch.
I'm loving what Valve has been doing, and their willingness to shove money into projects that have long been underinvested in. BUT: please don't forget all the volunteers that developed these systems for years before Valve decided to step up. All of this is only possible because a ton of different people spent decades slowly building a project that, for most of its lifetime, seemed like a dead-end idea.
Wine as a software package is nothing short of miraculous. It has been monumentally expensive to build, but is provided to everyone to freely use as they wish.
Nobody, and I do mean NOBODY, would have funded a project that spent 20 years struggling to run Office and Photoshop. Valve took it across the finish line into a commercially useful project, but they could not have done that without the decade-plus of work before it.
I do agree. It's also thanks to gaming that the GPU industry was in such a good state to be consumed by AI now. Game development used to always be at the frontier of software optimisation techniques and ingenious approaches to working within constraints.
I low-key hope the current DDR5 prices push them to drag Linux memory and swap management into the 21st century too, because hard-locking on low memory got old a while ago.
It takes a solid 45 seconds for me to enable zram (compressed RAM as swap) on a fresh Arch install. I know that doesn't solve the issue for the 99% of people who don't even know what zram is, have no idea how to set it up, or are trying to do it for the first time, but it would be pretty easy for a distro to enable it by default. I wouldn't be shocked if it already is enabled by default in Ubuntu or Fedora.
I feel like all of the elements are there: zram, zswap, various packages that improve on default oom handling... maybe it's more about creating sane defaults that "just work" at this point?
Linux (and its ecosystem) sucks at having focus and direction.
They might get something right here and there, especially related to servers, but they are awful at not spinning wheels
See how Wayland progress is slow. See how some distros moved to it only after a lot of kicking and screaming.
See how a lot of "newer" peripherals (sometimes a model that's been 2 or 3 years on the market) only barely work in a newer distro, or have weird bugs.
"but the manufacturers..." "but the hw producers..." "but open source..." whine
Because Linux lacks a good hierarchy for isolating responsibility, instead going for "every kernel driver can do all it wants" together with "interfaces that keep flipping and flopping at every new kernel release". Notable (good) exception: USB userspace drivers. And don't even get me started on the whole mess that is Xorg drivers.
And then you have a Rube Goldberg machine in the form of udev, D-Bus and what not, or whatever newer solution solves half the problems and creates a new collection of bugs.
Honestly I can't see it remaining tenable to keep things like drivers in the kernel for too much longer, both due to the sheer speed at which the industry moves and due to the security implications involved.
I wish Valve hadn't abandoned the Mac as a platform, honestly. As nice as these improvements are for Linux and Deck users, they have effectively abandoned their Mac ports: they never updated them to 64-bit like the Linux and Windows builds, so they can't run on new Macs at all. You can coax them into running with Wine on the Mac, but it is a very tricky experience. My Kegworks Wine wrapper for TF2 is currently broken as of last month because the game update download from Wine Steam keeps corrupting, and I'm at a bit of a loss how to work around it. Even when it was working, performance was not great and subject to regular lag spikes whenever too many explosions went off.
I totally get why they did, having had to support Mac for an in-house engine. Apple is by far the most painful platform to support out of the big 3 if you're not using turnkey tools, and they don't make up for it with sales outside of iOS. The extra labor is hard to justify already, and then we get to technical deficiencies like MoltenVK, plus social deficiencies like terrible support. It's just a really hard sell all around.
It was likely about control. Valve saw that Microsoft was becoming more controlling of the Windows platform, and that's what pushed them towards developing SteamOS on Linux, which means Valve can put resources into fixing anything they want. The Apple platform is also under the control of a single entity, so it doesn't make much sense for Valve to care about it (plus Apple isn't known as a gaming platform).
What you should do is just buy a Steam Deck for gaming.
The Elden Ring stutter work was unrelated to this effort, it was work in vkd3d-proton by Hans-Kristian Arntzen as part of our open-source graphics effort.
If I'm not mistaken, this has been greatly facilitated by the recent BPF-based extension mechanism (sched_ext) that allows developers to go crazy creating schedulers and other functionality through a protected virtual-machine mechanism provided by the kernel.
I have a feeling this will also drag Linux mobile forwards.
Currently almost no one is using Linux on mobile because of the lack of apps (banking, for example) and bad hardware support.
When developing for Linux becomes more and more attractive this might change.
> When developing for Linux becomes more and more attractive this might change.
If one (or maybe two) OSes win, then sure. The problem is there is no "develop for Linux" unless you are writing for the kernel.
Each distro is a standalone OS. It can have any variety of userland. You don't develop "for Linux" so much as you develop "for Ubuntu" or "for Fedora" or "for Android" etc.
If anything, it’s crazy that a company as large as Meta is doing such a shitty job that it has to pull in solutions from entirely different industries... but that’s just my opinion.
Man, if only Meta would give back. Oh, and also stop letting scammers use their AI to scam our parents, but hey, that accounted for 10% of their revenue last year: $16 BILLION.
Valve seemingly has no qualms about using the same tactics casinos perfected to hook people (and their demographics skew young). They're not at Meta's level of societal harm, but they are happy to be a gateway into gambling for kids. Not that this is unusual in gaming, unfortunately.
> This is the best kind of open source trickledown.
We shouldn't be depending on trickledown anything. It's nice to see Valve contributing back, but we all need to remember that they can totally evaporate/vanish behind proprietary licensing at any time.
They have to abide by the Wine license, which is the LGPL, so unless they’re going to make their own from scratch, they can’t make the bread and butter of their compat layer proprietary.
> SCX-LAVD has been worked on by Linux consulting firm Igalia under contract for Valve
It seems like every time I read about this kind of stuff, it's being done by contractors. I think Proton is similar. Of course that makes it no less awesome, but it makes me wonder about the contractor to employee ratio at Valve. Do they pretty much stick to Steam/game development and contract out most of the rest?
Igalia is a bit unique in that it serves as a single corporate entity for organizing a lot of sponsored work on the Linux kernel and open source projects. You'll notice in their blog posts that they collaborate with a number of other large companies seeking to sponsor very specific development work; Google works with them a lot, for example. I think it really just simplifies the logistics of paying folks to do this kind of work, plus the Igalia employees get shared efficiencies and savings on things like benefits.
This isn’t explicitly called out in any of the other comments, so I’ll state it: Valve as a company is incredibly focused internally on its business. Its business is games, game hardware, and game delivery. For anything outside that purview, instead of trying to build a huge internal team, they contract out. I’m genuinely curious why other companies don’t use this style more often, because it seems incredibly cost-effective: they hire top-level contractors to do top-tier work on hyper-specific areas, and everyone benefits. I think this kind of work is why Valve gets a free pass on some real heinous shit (all the gambling stuff) and maintains incredible goodwill. They’re a true “take the good with the bad” kind of company. I certainly don’t condone all the bad they’ve put out, and I also have to recognize all the good they’ve done at the same time.
Back to the root point. Small company focused on core business competencies, extremely effective at contracting non-core business functions. I wish more businesses functioned this way.
Yeah, I suppose this workflow is not for everyone. I can only imagine Valve has very specific issues or requirements in mind when they hire contractors like this. When you hire like this, I suspect what you really pay for is a well-known name that will be able to push something important to you into upstream Linux. It's the right way to do it if you want it resolved quickly; if you come in as a fresh contributor, landing features upstream can take years.
Valve is actually extremely small, I've heard estimates at around 350-400 people.
They're also a flat organization, with all the good and bad that brings, so scaling with contractors is easier than bringing on employees that might want to work on something else instead.
Proton is mainly a joint effort between in-house developers at Valve (with support on specific parts from contractors like Igalia), developers at CodeWeavers, and the wider community.
For contextual, super-specific, super-specialized work (e.g. SCX-LAVD, the DirectX-to-Vulkan and OpenGL-to-Vulkan translation layers in Proton, and most of the graphics driver work required to make games run on the upcoming ARM-based Steam Frame) they like to subcontract to orgs like Igalia, but that's about it.
Valve is known to keep their employee count as low as possible. I would guess anything that can reasonably be contracted out is.
That said, something like this, which is a fixed project, highly technical, and requiring a lot of domain expertise, would make sense for _anybody_ to contract out.
They seem to be doing it through Igalia, a company built around specialized consulting for the Linux ecosystem, as opposed to hiring individual contractors. Your point still stands, but from my perspective this arrangement makes a lot of sense, and the Igalia employees have better job security than they would as individual contractors.
It would be a large effort to stand up a department that solely focuses on Linux development, just as it would be to shift game developers to writing Linux code. Much easier to just pay a company to do the hard stuff for you. I'm sure the Steam Deck hardware was the same: Valve did the overall design and requirements, but another company did the actual hardware development.
Speaking for myself, Valve has been great to work with - chill, and they bring real technical focus. It's still engineers running the show there, and they're good at what they do. A real breath of fresh air from much of the tech world.
I don't know what you're trying to suggest or question. If there is a question here, what is it exactly, and why is that question interesting? Do they employ contractors? Yes. Why was that a question?
Valve has a weird obsession with maximizing their profit-per-employee ratio. There are stories from ex-employees out on the web about how this creates a hostile environment, and perverse incentives to sabotage those below you to protect your own job.
I don't remember all the details, but it doesn't seem like a great place to work, at least based on the horror stories I've read.
Valve does a lot of awesome things, but they also do a lot of shitty things, and I think their productivity is abysmal based on what you'd expect from a company with their market share. They have very successful products, but it's obvious that basically all of their income comes from rent-seeking from developers who want to (well, need to) publish on Steam.
It's worth mentioning that sched_ext was developed at Meta. The schedulers are developed collaboratively by several companies, not just Meta or Valve or Igalia, and the development happens in a shared GitHub repo: https://github.com/sched-ext/scx.
I've been using Bazzite Desktop for 4 months now and it has been my everything. Windows is just abandonware now, even with every update they push; it is clunky and hard to manage.
Bazzite is advertised for gamers, but from my understanding it's just Fedora Atomic wrapped up to work well on Steam Deck-adjacent hardware, with gaming as a top priority. You'd still receive the same level of quality you would expect from Fedora/RHEL (I would think).
Gaming or not, stability is important. An OS that focuses on gaming will typically focus on stability: neither bleeding edge nor lagging behind in support. It has to update enough to work with certain games and stay behind enough to not have weird support issues.
So Bazzite in my opinion is probably one of the best user experience flavors of Fedora around.
I think you’ve forgotten, or aren’t aware, that before 3D graphics cards took over, people would buy new video cards ostensibly to make Excel faster, but then use them to play video games. It was an interesting time, with interesting justifications for buying upgrades.
In this case yes, but on the other hand Red Hat won't give you the RHEL code unless you have the binaries. The GPLv2 license requires you to provide the source code only to those to whom you provide the compiled binaries. In theory, Meta can apply its own proprietary patches to Linux and not publish the source, as long as it runs that patched Linux only on its own servers.
RHEL source code is easily available to the public - via CentOS Stream.
For any individual RHEL package, you can find the source code with barely any effort. If you have a list of the exact versions of every package used in RHEL, you could compose it without that much effort by finding those packages in Stream. It's just not served up to you on a silver platter unless you're a paying customer. You have M package versions for N packages - all open source - and you have to figure out the correct construction for yourself.
Can't anyone get a RHEL instance on their favorite cloud, dnf install whatever packages they want the sources of, email Red Hat to demand the sources, and then shut down the instance?
Latency-aware scheduling is important in a lot of domains. Getting video frames or controller input delivered on a deadline is a similar problem to getting voice or video packets delivered on a deadline. Meanwhile, housekeeping processes like log rotation can happen more or less whenever.
I mean, many SteamOS flavors (and Linux distros in general) have switched to Meta's Kyber I/O scheduler to fix microstutter issues. The knife cuts both ways :)
> Meta has found that the scheduler can actually adapt and work very well on the hyperscaler's large servers.
I'm not at all in the know about this, so it would not even occur to me to test it. Is it the case that if you're optimizing Linux performance you'd just try whatever is available?
Almost certainly bottom-up: some engineer somewhere read about it, ran a test, saw positive results, and it bubbled up from there. This is still how lots of cool things happen at big companies like Meta.
How well does Linux handle game streaming? I’m just now getting into it, and now that Windows 10 is dead, I want to move my desktop PC over to Linux and formally end my relationship with Microsoft.
It works well. I've used Sunshine as the streaming server and Moonlight as the client to play games on my Steam Deck; my PC runs openSUSE Tumbleweed with KDE Plasma. There may be some key-binding issues, but they can be solved with a little setup.
I keep being puzzled by the unwillingness of developers to deal with scheduling issues. Many developers avoid optimization, almost all avoid scheduling. There are some pretty interesting algorithms and data structures in that space, and doing it well almost always improves user experience. Often it even decreases total wall-clock time for a given set of tasks.
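As a toy illustration of the kind of algorithm in that space, here is a minimal earliest-deadline-first (EDF) queue in Python. The task names and deadlines are made up for illustration, and a real scheduler obviously also handles preemption, priorities, and admission control:

```python
import heapq

class EDFQueue:
    """Earliest-deadline-first: always run the task whose deadline is soonest.

    A binary heap keyed on deadline gives O(log n) insert and pop, which is
    why deadline-driven schedulers tend to be built on heaps or balanced
    trees rather than plain FIFO queues.
    """
    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker so equal deadlines stay FIFO

    def add(self, deadline_ms, task):
        heapq.heappush(self._heap, (deadline_ms, self._counter, task))
        self._counter += 1

    def pop_next(self):
        _, _, task = heapq.heappop(self._heap)
        return task

q = EDFQueue()
q.add(16.6, "render_frame")    # must land within a 60 Hz frame
q.add(100.0, "rotate_logs")    # housekeeping, can wait
q.add(5.0, "read_controller")  # input has the tightest deadline
order = [q.pop_next() for _ in range(3)]
# order == ["read_controller", "render_frame", "rotate_logs"]
```

The payoff the comment describes shows up here in miniature: the latency-critical work jumps the queue, and the housekeeping still gets done, just not at the cost of a missed frame.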
"Slide left or right" CPU and GPU underclocking.
I'm sure there have been commercial contributors to Wine other than Valve and CodeWeavers.
The guy is Philip Rebohle.
https://www.gamingonlinux.com/2018/09/an-interview-with-the-...
https://steamcommunity.com/games/221410/announcements/detail...
Gaben does something: Wins Harder
There have been demands on HN lately to do more of that. This is what it looks like when it happens: a company paying for OSS development.
I wouldn't make an Excel spreadsheet on the Steam Deck, for instance.
Yes you can do more than gaming on Bazzite.