This is how IT worked 40+ years ago: a central server hosting everything (applications + data), surrounded by dumb terminals that would turn into doorstops if the network connection to the server failed for any reason. This approach made sense back then because of the enormous cost of hardware, but advances in technology and falling prices soon allowed more modern models like the personal computer, which uses the network connection when available but isn't dependent on it, giving the user a lot more freedom.
Then corporate greed discovered it could turn everything into a pay-per-use model to maximize profits, which of course means locking up both the apps and the users' data and then asking for money to unlock them, until the company goes out of business, the server is hacked, or they simply decide to raise prices.
The cloud-based SaaS approach is how corporations are throwing decades of advancement in IT down the toilet, just because of greed. Just say no.
At some point, internet connections will be fast enough to make them indistinguishable from a local machine. Today, that is not true.
In my own job, I straddle cloud-based and local machine-based software. Overwhelmingly, the cloud-based software is slow to the extent that it removes at least 40 points of IQ and offsets a decade of experience. The intellect and mental speed I have spent a lifetime building are simply washed away by latency that shuts my mind down, effectively turning me into a ninth-grader. Sure, the company pays me as if I continue to generate output befitting my generous salary. But I maintain the illusion simply by spending many hours on weekends, with copious amounts of coffee and, at times, gin, doing work on cloud-based software that used to take minutes on the "outdated" local software.
Cloud software is among the biggest frauds ever perpetrated on the American consumer.
Having the database on the same server as the application significantly improves responsiveness of applications.
Many of these cloud applications choose fragmented architectures with data sprinkled across many discrete servers that are milliseconds or tens of milliseconds away from each other, resulting in significant delays compared to doing these blocking operations against a local database.
Cloud native isn't always the right choice; sometimes CRUD apps should have a local database just for the responsiveness benefits.
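To put rough numbers on that point, here is a minimal sketch of how per-query round-trip time multiplies across the sequential, blocking queries needed to render a single screen. The latencies and query count are assumptions for illustration, not measurements of any particular system:

    # Illustrative only: assumed round-trip times, not measurements of any real system.
    SEQUENTIAL_QUERIES = 30  # blocking queries a typical screen might issue one after another

    assumed_rtt_ms = {
        "same-host database (loopback)": 0.1,
        "same-rack database":            0.5,
        "cross-datacenter database":     5.0,
        "cross-region database":        30.0,
    }

    for placement, rtt in assumed_rtt_ms.items():
        total_ms = SEQUENTIAL_QUERIES * rtt
        print(f"{placement:<32} {total_ms:8.1f} ms per screen")

Under those assumptions the co-located database stays well under the threshold of perceptible delay, while tens of milliseconds per hop pushes a single screen toward a full second - the sluggishness described above.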
I play PC and VR games on ShadowPC. I can’t tell the difference with a controller, and the latency with a keyboard and mouse is like a cheap Bluetooth mouse. I think the key for these is relatively local data centers.
Once ISPs cohost, or get in on the game themselves, we’re set!
It is also how it worked 10-20 years ago: it was called Terminal Services.
And then it was neglected because of the next shiny thing: VDI.
The thing is, one big server for a hundred office users doesn't cost much, because they are sharing the RAM. But with VDI you need at least 100 x 4 GB of memory, and that means you need way more than one server. And some SAN, because only a SAN with fast drives can survive the morning boot storm. Oh, by the way, here are two FC SAN switches for your HA storage fabric, because you really want the paths to the SAN to be redundant.
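A rough sketch of that sizing arithmetic (every figure here is an illustrative assumption, not a real capacity plan): with VDI each user drags along a full OS image, while a terminal server shares one OS and much of the application memory across sessions.

    # Back-of-envelope sizing; every number here is an assumption for illustration.
    users = 100

    # VDI: each user gets a dedicated VM with its own OS and application footprint.
    vdi_gb_per_user = 4
    vdi_total_gb = users * vdi_gb_per_user                # 400 GB of RAM

    # Terminal Services: one shared OS image plus a per-session working set.
    shared_os_gb = 16
    per_session_gb = 0.5
    ts_total_gb = shared_os_gb + users * per_session_gb   # 66 GB of RAM

    print(f"VDI:              {vdi_total_gb} GB RAM -> several hosts plus shared storage")
    print(f"Terminal server:  {ts_total_gb:.0f} GB RAM -> fits comfortably in one box")

Under those assumptions the terminal server needs roughly a sixth of the memory, which is why it can live in a single box while the VDI farm needs multiple hosts and the SAN.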
But anyway, local installs don't bring in revenue constantly, while renting the services does. This was perfectly clear with AzureAD and Exchange 365.
Not dead yet; my consulting business is almost entirely built around provisioning terminal servers as small private-cloud managed systems for companies with niche business requirements. Not sure how much longer the model is going to work, because eventually I suspect Microsoft will simply kill off on-prem terminal services altogether or make it completely uncompetitive with their own Azure / Office 365 combination.
But the surfeit of stupendously powerful white-box server platforms, and the remarkably efficient way a terminal server conserves memory across sessions compared to containers or VDI or anything else, absolutely outperform almost every other solution in the SMB space in terms of how each support engineer can scale to 100 or more end users across a half dozen completely unrelated organizations. I'm surprised how underrepresented this business model is these days.
This cycle of alternation didn't start 40 years ago, which would be 1983. We have been cycling between decentralization and centralization since at least the 1970s.
Because they both suck, either side "winning" is worse than staying in the middle. But we only like the pendulum swinging in this industry, so nobody stops at the bottom of the arc.
That's true, and it is one great advantage. But I don't get how to apply this advantage to personal computing. I have a laptop and I have never once wanted to randomly "log into my cloud workstation" from some other place. I can just open the lid of my laptop and continue working. What advantage does the requirement of a persistent network connection offer me? Seems like working on a train is not doable.
I could see a use case where people don't want to / can't spend too much money on hardware. But even then, hardware is still required and I can't see where a subscription would help here.
True, but aren't there far more economical ways of doing it if you're an IT department? Running a full Windows VM with 4 GB to 8 GB of reserved memory seems incredibly wasteful if all you want to do is centralise the provisioning and management of software and data.
Why not Chromebooks + web apps if you need some client-side hardware anyway?
During my summer traineeship, accounting was run on an AS/400 accessed from dumb terminals in the harbour offices.
At the university we had DG/UX, Solaris and eventually Red Hat Linux, with a mix of classic phosphor and green-screen terminals, PCs and classic Macs (LC models).
At Nokia and CERN we used to develop on UNIX servers (HP-UX, Solaris, Scientific Linux); PCs and Macs were only used for Office and as thin clients for the R&D servers.
And then we moved into Citrix, RDP and Terminal Services for Windows development, so it has been with us all along.
> because Microsoft wants your Windows desktop to live in the cloud.
That's just never going to be acceptable for me. If the day ever comes there's no choice, I guess I'll keep using the last version that still works 100% locally. I already resent the extra time I have to spend keeping Win10 and Firefox usable against the constant creep of enshittification (<-- a great term by Cory Doctorow), so at least that'll stop when I jump off the upgrade train.
Ahh, I wish I could keep doing that. Just yesterday Windows decided to update itself in the middle of the night. Now my laptop is acting really slow even though no process is hogging CPU or RAM. Tried removing the update but it's only slightly better. I'm in the middle of resetting it as I type this comment.
It got so bad that a couple of import statements in my Jupyter notebook cell took a minute to finish executing. It used to take barely a couple of seconds just yesterday. If resetting does not work I'll have to try a fresh install. If that doesn't work either, it's Linux for me I guess.
The only reason I use Windows is because I play games and I am pretty used to its UI. Guess it's almost the end of the road with this OS for me.
Depending on the game, there's still that classic Linux hassle (aka the series of hacks you can impress your friends with) that's needed to get some games working, but the number of games supported and working without the hassle is honestly surprisingly good imo. The typical ones you can expect to cause a headache are the always-online live-service games with heavy-handed DRM.
Which games? Have a gander at ProtonDB or WineDB for your regulars. KDE is pretty close to Windows if you want to transfer your knowledge.
If the time comes, please avoid Ubuntu - Nobara (Fedora + gaming tweaks) might be a good choice for you. Debian would be another good choice as a general-purpose desktop, and many Ubuntu tutorials/Q&A would work just fine with it. Garuda is Arch-based if you want street cred, and uses KDE by default.
I've been using Linux for decades, and I'd regularly attempt to play games every few years.
The most recent time, I was shocked how good it was (with Steam & Proton). The only games I found that didn't work were due to DRM and anti-cheating software.
And I have a 2014 desktop PC and did a full reinstall when Windows 10 released in 2015. There have been issues, but nothing a Windows & driver update + reboot couldn't solve.
>Now my laptop is acting really slow even though no process is hogging CPU or ram
but that would fall into the realm of enshittification.
Linux has always been lackluster on the Desktop - fragmented, lacking features, lacking important applications, etc - and MacOS is attached to overpriced mediocrity.
I reckon desktops have been dying for a long time, and I love me a good desktop.
After many years of Enlightenment (E16) on underpowered (scavenged) computers, KDE Plasma became my desktop of choice. It still is.
Nowadays it's a choice between a browser and an editor for all but multimedia editing and gaming, so Emacs and Firefox work really well. I still use a desktop environment as a convenient interface to system settings such as Wi-Fi networks, window placement, power management and the like, but otherwise KDE disappears into the background, as it should.
But as I explore the wonders of NixOS, these settings are more and more becoming declaratively controlled by a simple collection of ASCII files, and it's wonderful. As soon as Emacs becomes multithreaded I'm switching to EXWM. And then my desktop environment will be fully subsumed.
I reckon this combination will eventually become ubiquitous as everyone will become some kind of a programmer, and Richard Stallman will be proven correct.
Apple, Google, and Microsoft, will suddenly realise there is no lunch and everyone else will eat computation more happily and healthily. Though in the long run Guix will be more nutritious than NixOS.
The desktop metaphor will become a kind of ancient poetry we explore for semiotic curiosities as we forge new pathways for collaborative thinking.
And we will always deploy the lambda calculus on the Von Neumann architecture, even when quantum computing is ubiquitous, because we need metaphors, because computing will always be a kind of poetry.
When I was a kid I had a vivid dream of having to watch a car commercial before my computer booted up. I realize now that we are very close to that reality.
Linus Tech Tips covered a Chinese smart TV [1] that played a preloaded car commercial when you booted it up. The TV was not even connected to the internet; the video file was preloaded at the factory.
I had the unfortunate pleasure of working on remote windows at { large financial institution }. It was a nightmare. Yes they got a huge immediate cash savings on hardware, but at the cost of worker productivity.
The response time from keypress to screen was literally like I was back in 1992, working over a dialup connection to a terminal server.
Oh, also the executives got their own laptops, so they didn't dogfood their own solution.
> Oh, also the executives got their own laptops, so they didn't dogfood their own solution.
Things like this happen when the person who signs the check is not the person who actually uses the software. Also, no matter the resulting clusterfuck, the project is always successful.
I could imagine it almost working as a two-tiered universe.
You'll have a lot of thin-client/remote environments used for places where the primary metric is manageability. Think of how schools love Chromebooks, and places like call centres that simply want users in a bolted down kiosk mode. They don't care if the experience is miserable, but they like the idea that nobody's reading Lemmy.
It's going to be a huge hairball for the more advanced user market. Your engineering and graphics and big-data people are going to need special tools with likely complex licensing and setup hassles, mitigating the manageability benefits, and you know there will be a point where hiring a big cloud machine gets pricier than buying an actual workstation.
I wonder if some vendors will instead buy into a reduced version of the concept for distribution, though -- instead of ever getting a copy of the bits for AutoCAD or Photoshop, you just get an RDP login to a big machine running it, almost reinventing the X11 paradigm.
Except the endgame is the beancounters win. Always.
First phase of a tech rollout gives overly generous hardware, because the end user needs to be won over. Then comes right-sizing because it makes economic sense, and we know things are too generous. Then end users get decoupled from bean counters and the provisioning gets done by a third party corporation, so feedback cycles are broken. At that point, the race to the bottom starts.
I've seen people declare: we provision to the satisfaction of the customer. Which means: if you don't make the effort to complain, all is well. The result is that the best people leave, as they won't deal with that. The middling ones complain, bothering their managers, so they get labelled troublemakers and are pushed out. The yes-men stay and praise the situation. The 9-to-5ers stay and get paid no matter what, so who cares if MS Word's text appears multiple seconds after you typed it.
Yeah, I don't know where this guy is coming from, but their setup must have been provisioned in a deliberately suboptimal way or criminally oversubscribed. Unless you're gaming or expecting scrolling at a rock-solid 75 Hz+ refresh, current-gen RDP is almost indistinguishable from running a local desktop over an internal gigabit link.
Even over a paltry 15 Mbit consumer-grade cable internet uplink, it is possible to play full-screen YouTube videos in high def over plain vanilla RDP version 8, without even getting into any of the (often quite flaky and at this point highly redundant) Citrix enhancements.
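Rough arithmetic on why that is plausible (the bitrates below are generic compressed-video assumptions, not RDP measurements): full-screen high-definition video at typical streaming bitrates fits under a 15 Mbit/s link with headroom left for the rest of the desktop.

    # Back-of-envelope check; assumed generic H.264 streaming bitrates, not RDP-specific figures.
    link_mbps = 15.0

    assumed_stream_mbps = {
        "720p full-screen video":   3.0,
        "1080p full-screen video":  6.0,
        "ordinary desktop updates": 1.0,  # bursty, usually far lower
    }

    for name, mbps in assumed_stream_mbps.items():
        print(f"{name:<26} ~{mbps:>4.1f} Mbit/s "
              f"({link_mbps - mbps:.1f} Mbit/s of headroom on a 15 Mbit link)")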
I was apprehensive at first, but actually it's pretty great. I can just turn off my computer at the office, turn it back on at home, and get back the same session. I can move to any station in the building without issues.
I don't need to carry around a laptop so I can go to the office with just my keycard and my wallet. I can buy whatever device I want for myself and use it for work too without actually mixing personal and work related computing at all.
I've heard it costs more than buying everyone a new laptop every year.
Also at a very big BigCo using Citrix Workspaces; experience is decent. Got an 8 core/64 GB RAM VM and it’s pretty snappy for most dev work. The portability is nice compared to lugging a work laptop, and connection doesn’t need a janky VPN client. Using Zoom for VDI with the VDI plugin on my home machine is even smoother than same thing on the thin client in the office.
>Citrix/RDP farm should NEVER be used for cost savings. It is for data control. But bean counters think it saves costs
Except ... it does "save costs"
Allowing BYOD (or even just supplying a 'basic' device) and connecting only through Horizon (or whatever client of choice you prefer) is a huge savings over managing fleets of laptops, desktops, etc. - all with different support cycles, specs, etc.
All your management is focused on just one layer - the virtual desktop (and its supporting infrastructure)
I can't believe that Microsoft is missing the most obvious market here: DRM for business software.
Think about it. High-end CAM software (MasterCAM) can easily cost $10K or more per seat. If you want modules for, say, 5-axis work, it can be as much as $35K per seat. SolidWorks will happily charge $4K per seat, minimum. And of course, people complain when their SolidWorks license doesn't work well on their integrated graphics and Core i3-4100U, or whatever old systems they are trying to run it on. So much work to fix all the bugs and quirks with old hardware...
... or, just announce that SolidWorks or MasterCAM now runs online, with good cloud-based hardware. The subscription makes it more universally affordable, provides the service revenue investors love, there's no more energy spent supporting old hardware and low-end systems, and piracy is dead, which the investors also love.
The same technology debuted in Windows 10, and in Windows 8 before that. The same silly articles said it was the future of Windows and you wouldn't be able to use a real desktop anymore, as though Microsoft had just up and forgotten about the billion Windows machines that don't have good internet access and the billion more that don't have any at all. But if you write an article wailing that Microsoft is going to ruin your computer and steal your cat, you get clicks. Speaking of things that are dying.
Yes, I found personal pictures from years ago suddenly appear on a work OneDrive. The entire Microsoft accounts ecosystem is such a clusterfuck, I don't think they could have designed an easier system to get phished in. It's like they never conceived of the idea that some people don't have the luxury of a different laptop for each of their personal and professional needs. This makes commingling data WAY too easy. iOS is also bad at this.
Has latency really improved that much lately? At least for gaming, latency has been more or less constant for me over the last ~10 years or so.
And if the server is in the same city or quite close to you, I think it's already possible, but possibly not that useful.
OTOH, you could log off from one terminal, go to any other terminal (maybe in a different country) and continue working right where you left off.
If the environment is designed for that, it's not only doable, but practical.
>Now my laptop is acting really slow even though no process is hogging CPU or ram
Have you rebooted?
Is it thermal throttling?
So long, and thanks for all the shells.
[1] https://youtu.be/4eSADWuZskk?t=213
http://edition.cnn.com/TECH/computing/9902/10/freepc.idg/
Then COVID hit, and all of a sudden we all worked from home in exactly the same way. It was a wow moment of how prepared we were compared to everyone else.
Meanwhile the managers (I can only assume) got huge bonuses out of it.
Yes, it is never as good as native, but it's not that bad if done properly.
Citrix/RDP farm should NEVER be used for cost savings. It is for data control. But bean counters think it saves costs... and the user experience suffers.
Everything else is noise