Something seems a bit off about the combination of hype and disclaimers here. First it's "seamless integration". Then it warns me that I should "expect to occasionally run into hiccups and bugs" and "should be comfortable with some level of troubleshooting". But then it's back to saying I can "enjoy the full range of Windows applications" "without any hassle". These just don't seem compatible to me. If it's seamless and hassle-free, that would mean there aren't hiccups and bugs. If there are hiccups and bugs, it's not seamless and hassle-free.
It may be a good project, but I always get kind of annoyed when projects try to overhype how "easy" and "smooth" the experience will be. I guess in one sense this is better than that because it does have disclaimers, but that just makes it harder to know what the truth actually is about its abilities.
Literally at the top of the docs it says it's in Beta. I don't think you have to think too hard to figure out that seamless integration is the goal but they aren't there yet.
That seems fair, but then it makes it all feel somewhat tautological: what sort of integration wouldn't aspire to be seamless, other than a beta integration?
A different selection of words wouldn't have led to this debate, which I think is the point being made.
Then why say that it is that? It makes much more sense to me to say "The project is a beta that does X Y and Z and we hope it will eventually become this amazing seamless hassle-free thing." I don't get why projects don't distinguish future aspirations from present accomplishments.
"Seamless integration" in this case doesn't read as a statement about how well it works to me. It means the Windows applications appear on your Linux desktop without the "seam" of a full Windows desktop around them.
I blame npm and the entire JavaScript ecosystem for promulgating the awful, sleazy-used-car-salesman practice of writing your readme in the form of advertising copy.
I suppose this is it. I just find it irritating when project descriptions don't clearly distinguish what they hope to eventually be from what they currently are.
"Seamless integration" is an intention of the project; that doesn't mean the software is free of bugs and issues. What an absolutely asinine and nonsensical argument.
Do you use an application launcher / configuration manager like Lutris to do this? Or do you mean directly through steam? There's a steam game that I play often that tends to work the most frequently with proton hotfix for reasons unknown to me.
No, just directly in Steam. You can add a non-Steam game to your library and select the .exe file you want, and Steam will create a C: drive environment for each non-Steam game you add.
In some cases you might have to change which .exe file it runs: if you initially run a setup.exe, it creates the real .exe you'd want to launch inside the C: drive environment folder.
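For anyone hunting for that folder afterwards: Steam keeps each shortcut's Proton prefix under `compatdata`, keyed by an AppID it assigns. A minimal sketch of where the "C: drive environment" lives (the AppID below is a made-up example, and the path assumes a default `~/.steam` install; your library may be elsewhere, e.g. `~/.local/share/Steam`):

```python
from pathlib import Path

# Made-up example AppID: Steam assigns its own ID to each non-Steam shortcut.
app_id = 2894350124

# Default Steam library location (assumption; adjust for your install).
prefix = Path.home() / ".steam/steam/steamapps/compatdata" / str(app_id) / "pfx"

# The per-shortcut "C: drive environment" folder:
print(prefix / "drive_c")
```

Replacing the launched .exe then just means pointing the shortcut's target at the installed binary inside that `drive_c` tree.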
Looks like it's just a fancy Docker container running the Windows RemoteApp implementation, wrapped around some VM management skins?
I normally set this up on Windows boxes directly with https://github.com/kimmknight/remoteapptool and https://github.com/kimmknight/raweb to build a basic "remote Windows apps" box on my network overall -- it's nicer to be able to have one central Windows VM running that I can put the apps wherever I need them across whatever device in my house.
I don't understand what your suggested setup is, could you expand on that?
So you have a Linux desktop that runs a Windows VM, and that VM runs several things at once: the app you want to run; an RDP server, configured via one of those two tools, that streams just the app's window instead of the whole Windows desktop; and on top of that, the other tool you linked, which wraps the RDP server's stream in a web server, so that instead of running an RDP client on your Linux host to access the Windows-hosted app you can simply use the browser?
Is that your suggested setup?
If yes, then it won't suit resource-demanding gaming: for such games you'd need to pass your GPU through to the Windows VM, so your host Linux loses the GPU, and you'd need a two-GPU setup for that to work comfortably.
This entire setup (mine and OP) relies on a Windows RDP feature called “RemoteApp” that allows single-application remoting over a transparent RDP session.
RATool lets you configure RemoteApp "apps" on a serving Windows computer; it generates .rdp files, each of which launches a RemoteApp "application" session.
RAWeb on top serves those .rdp files in a way that’s compatible with the “remote resource feed” in Remote Desktop/Windows App, which requires hosting it on IIS.
So basically, one Windows VM is sitting on an ESXi server in my house running RAWeb with a bunch of RemoteApp .rdp files I generated with RATool, and all of my RD clients have just a normal list of apps I can launch, as if it was a proper VDI farm.
I’m not doing GPU intensive tasks so this works out fine for me.
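For reference, the RemoteApp mode described above comes down to a few keys in the generated .rdp file; a minimal sketch (the host name and program alias are placeholders, not values taken from RATool):

```
full address:s:winvm.example.local
remoteapplicationmode:i:1
remoteapplicationprogram:s:||notepad
remoteapplicationname:s:Notepad
```

With `remoteapplicationmode` set to 1, the RDP client renders only that program's window rather than the full desktop, which is the "transparent" single-application session mentioned above.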
I know nothing about this project, but this appears to be using a docker image from here "https://github.com/dockur/windows/pkgs/container/windows".
Which says "Any product keys found in the code are just generic placeholders provided by Microsoft for trial purposes."
This app appears to be using Windows with trial keys, which goes against their intended usage and would have questionable legality. If you do use this, I would think you'd need a licensed version of Windows and would have to alter the Docker settings to use your purchased key instead of the included trial keys.
Ideally this project would have this detail in big bold letters in its README.
So the app you want to run (suppose that's a game) runs in a Docker container? Wouldn't this, just like running the game in a Windows-in-a-VM setup, require an extra GPU? You want your game to be GPU-accelerated, but if you pass the GPU to the Docker container (does that even work?), your host machine loses the GPU.
I don't think I'd try running anything more than a puzzle game over RDP myself. There are better alternatives for that. As far as GPU sharing, there are patches for many consumer GPUs that will unlock enterprise features like gpu resource splitting/sharing.
One comment and the prerequisites hint at this tool spinning up a Docker container which runs a Windows VM and pulls the app windows out using some remote desktop tool.
True enough... that said, most of the security enhancements came over the Vista/Win7 timeframe. Just be careful with what you run and what apps have internet access... this setup seems to have full access to your home directory, which for most users can be every bit as bad as a root exploit.
There are a bunch of great forks / integrations of WinApps project (& similar) on GitHub now - I highly recommend LinOffice (https://github.com/eylenburg/linoffice/) for those who need native Office 365 (I needed to reluctantly edit Excel macros, and dual booting from Linux into Windows many times a day wasn't efficient).
It's great to support the team at CrossOver, but if you need a recent version of Office or a Windows application that Wine/Proton doesn't support properly, then Docker/Podman running QEMU/KVM in the background is surprisingly performant (and LinOffice orchestrates it all for you).
It costs a little bit of money, but it comes with support and directly funds the Wine project (CodeWeavers is a major contributor).
Otherwise, looking at the code, this feels like something that could be a short bash script wrapped in an Electron app.
[0] https://github.com/dockur/windows?tab=readme-ov-file#is-this...