> the evolution of iPadOS has stagnated over the years.. truly native, modern iPad apps by the most popular web services essentially don’t exist anymore.. If not even Apple is putting in the time, care, and resources to make exceptional iPad apps, why should a third party do it.. Web apps may be common everywhere these days, but the Mac has a vibrant ecosystem of indie apps going for it, and the Vision Pro has better multitasking and proper integration with macOS. The iPad has neither.
iPad is the new butterfly keyboard. Perchance rebirth awaits in the convergence of iOS, macOS and VisionOS.
For browsing the web and using web apps the iPad is perfect. It's cheap enough that I don't mind doing slightly dangerous things with it that might kill it once every three years.
Inevitably, for things like Bluesky and Mastodon, the experience of using the web app on the iPad is perfect, while the app experience is "phonish" and just doesn't fit the screen size. But why should I care when the web experience is perfect?
We have already been experiencing this on Windows for years; that is why its current development experience has turned into a mess.
A new generation of employees with no Windows development culture, or even experience (judging from the blank stares during community calls when asked about things that have been on Windows for decades), and WebView2 being pushed all over the place on the OS.
It continues to astound me that Apple seems not only reluctant but actively uninterested when it comes to bringing general-purpose computing to touchscreen devices.
I want whole-ass macOS on the iPad, and not only are they not doing it, they're doing the reverse: massive adoption beyond Apple's early technical/creative userbase is leading to the iOS-ification of macOS. Lower-level controls are increasingly locked behind padded walls. I'm old enough to remember when the radio button for "allow apps from unidentified developers" was just... there. I didn't need to hunt Stack Overflow for wacky CLI workarounds just to install indie software on my own computer.
It's uniquely unfortunate for Apple too, because it's Apple. Surely bringing desktop computing patterns to finger/pencil interfaces has a lot of hard problems in need of novel solutions, but there was a time when Apple was an HCI powerhouse and would have been as good a candidate as anyone to tackle them. Could ~2013 Apple have done Windows 8 better than Windows 8? In their sleep, IMO.
Anyway, do people have favorite takes on the actual motivations behind this? Does it truly come down to a desire to own the platform e2e and keep raking in that sweet sweet app store tax? Or is there some kind of more nuanced incentive structure at work?
Making an open ecosystem will never be in the best interest of their shareholders. They didn't make it to the top of Wall Street by accident.
Why would they make a profit on their hardware alone when they can have almost passive income in the form of an app store?
They'll put up as many scary pop-ups as they can to prevent a regular user from installing third-party software.
They'll block third-party developers from integrating with their ecosystem as much as they can get away with.
It'll be really interesting to see how much the EU can influence them, but I wouldn't hold my breath for them doing more than the absolute bare legal minimum.
They're uninterested because they get to sell more hardware that way. Combine all functions into one device and you'll only buy that one device. Split those functions arbitrarily over several devices, while making sure the devices only really play well within their own family, and the target market has shown it will keep buying them, even as it sputters and fumes about how astounding it is that the functions aren't being combined.
The solution is clear but hard to implement since the congregation will have to be convinced to stop going for the next sacrament.
This claim is frequently repeated, but it does not reflect reality for those who buy Apple devices. When iPads actually meet user needs, the result is iPad proliferation, not replacement of MacBooks. An iPad with a macOS/Linux VM is needed for ad hoc, time-sensitive tasks which require non-iOS software, like a quick doc edit or a Unix toolchain. For scheduled work sessions, a MacBook is better.
As the article noted, there is a $3500 iPad. If that iPad variant ran macOS/Linux VMs, it would sell more units. Anyone willing to spend that much money for portability and convenience would not blink at buying a MacBook for other use cases.
I keep thinking this will be a market opportunity for someone to make phones and tablets into computers that people actually own and don't have to ask anyone's permission to use, but it hasn't gone anywhere.
I have two internal web applications that I developed on the same code base. (1) YOShInOn, a smart RSS reader with a content-based recommender model and (2) Fraxinus, an image sorting and viewing application.
These work great on desktop, great on the iPad, pretty good on Android, and sorta OK on phones, where the screen size is a little too small. I even found out they work great on the Meta Quest 3, where it isn't a real XR app but I can view and sort images on three great big monitors hanging in my room or in a Japanese temple. [1]
For years I had a series of bottom-of-the-line Android phones that left me thinking "app" is a contraction of "crap" rather than "application". When I heard Skype was ending I ported my number to an iPhone, and I can see that with a high-end device and 5G you don't have to wait a minute for the app that gets you into your gym to load. On the iPhone a number of factors come together to make apps worthwhile, but I think that was never true on the iPad, and web applications have long been undervalued, even though the experience of most web apps on the iPad, even ones that weren't designed for it, is the definition of "just works".
It's an astonishing mistake that Apple never caught up with the Microsoft Surface and made the iPad Pro compatible with Mac applications. On hardware grounds there is no reason why not, other than that Apple thinks anyone who buys a Pro is made of money and can afford a MacBook too -- it's like the way Digital struggled in the microcomputer age because it was afraid that cheap microcomputers compatible with the PDP-11 or VAX would cut into its minicomputer business.
It's hard to say what part of the Vision Pro failure was "too expensive" (the Meta Quest consumer thinks the MQ3 is too expensive) versus the moral judgement that using controllers is like putting your hand in a toilet; the MQ3 shows you can make great fully immersive games and experiences if you have controllers... At Vision Pro prices the device has to do it all, and watching 3D movies and using phone apps on a screen in midair just doesn't cut it.
[1] Curious how it works with the Vision Pro: web apps are easy to use with the controllers on the MQ3, but I found it really helps to meet the WCAG AAA target-size requirements to make the targets easy to hit. How is it with the hand tracking on the Vision Pro? I found I can do stuff with tracking and the clicker on the original HoloLens, but it isn't easy.
> Apple thinks anyone who buys a Pro is made of money and can afford a macbook too
Owner of iPhone, iPad Mini, iPad Pro, MBA & Mac Mini hardware: after waiting for VMs on iPad, I moved some workflows to a Google Pixel Tablet + GrapheneOS, which has inferior UX and no hardware keyboard, but at least it can run a Debian Linux VM for local development. Next step is to try a Pixel 8+ with USB-C DisplayPort docking to a monitor/keyboard/mouse.
Sandboxing giveth and sandboxing taketh away. The security value of sandboxing, particularly in a web browser, is obvious. But as a result web apps can't access a meaningful filesystem (not counting React Native, and don't get me started on IndexedDB or OPFS), and this has a significant impact on application development, pushing developers to build much more complicated client-server architectures, which increases the cost of making a great app.
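For anyone who hasn't played with it, OPFS is that trade-off made concrete: you get a real file API, but only inside an origin-private sandbox, never the user's visible filesystem. A minimal sketch in TypeScript (saveNote/loadNote are illustrative names of my own, and createWritable support still varies across browsers):

    // Hypothetical helpers using the Origin Private File System (OPFS):
    // files persist, but only inside this origin's sandbox, never in the
    // user's real filesystem.
    async function saveNote(name: string, text: string): Promise<void> {
      const root = await navigator.storage.getDirectory();         // OPFS root for this origin
      const handle = await root.getFileHandle(name, { create: true });
      const writable = await handle.createWritable();
      await writable.write(text);
      await writable.close();                                       // data is only committed on close
    }

    async function loadNote(name: string): Promise<string> {
      const root = await navigator.storage.getDirectory();
      const handle = await root.getFileHandle(name);
      const file = await handle.getFile();
      return file.text();
    }

Everything above lives and dies with the origin, which is exactly why teams end up shipping a server and a sync layer anyway.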
Swift and SwiftUI are seductive for building cross-platform iOS/macOS/iPadOS applications, but despite Apple's marketing they have significant frictions when building real-world apps. There's a reason large companies are still using Objective-C and UIKit: SwiftUI, and DEFINITELY SwiftData, are arguably not ready for production yet, in direct contradiction to Apple's community messaging.
Look, you can build a great app with any of these stacks; there's a lot of nuance in choosing between them, and the most cost-effective and highest-quality path will be decided by the developer's strengths and weaknesses, not by the latest blog article or what happened to be "successful" for somebody else.
The values and viewpoints of the developer certainly matter.
One of the "eternal September" moments of web app development was in the late 1990s, when Microsoft went "all-in" and Microsoft-oriented developers flooded forums with tearful laments about how "MY BOSS NEEDS ME TO BE ABLE TO ACCESS LOCAL FILES IN A WEB APPLICATION!"
From the viewpoint of a web-native developer though, you need local files about as much as Superman needs a Kryptonite sandwich. (You didn't lose local files, you gained the universe! Look at how multimedia software on CD-ROM was utterly destroyed by the world wide web!)
That image sorter, for instance, has a million images sampled from a practically infinite pool all available instantly to any device anywhere in the world that's attached to my TailNet -- though you'd better believe I keep RAWs from my Sony on local file systems. [1]
I had a boss who ran a web design company circa 2005 who had a great spiel about how, with web applications, small businesses could finally afford custom software. I had my own technical version of it which went something like: "Back in the Windows 95 age, obsolete desktop applications kept their state as a disorganized graph of pointers that inevitably gets corrupted, the way cheese goes bad, and crashes; modern web applications keep their state in a transaction-protected database (if you're smart enough to NOT GET SEDUCED BY THE SESSION VARIABLES PROVIDED BY YOUR RUNTIME), so your application state is refreshed with every page update."
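A minimal sketch of what that spiel means in practice, using better-sqlite3 in TypeScript; the cart_items table and addToCart helper are invented for illustration, not anything from the comment above:

    import Database from "better-sqlite3";

    // Per-user cart state lives in SQLite rows, not in an in-process session
    // object, so any worker (or a restarted one) can serve the next page load.
    const db = new Database("app.db");
    db.exec(`CREATE TABLE IF NOT EXISTS cart_items (
      user_id TEXT NOT NULL,
      sku     TEXT NOT NULL,
      qty     INTEGER NOT NULL,
      PRIMARY KEY (user_id, sku)
    )`);

    // Writes happen inside a transaction, so a crash mid-request can't leave
    // half-updated state behind.
    const addToCart = db.transaction((userId: string, sku: string, qty: number) => {
      db.prepare(
        `INSERT INTO cart_items (user_id, sku, qty) VALUES (?, ?, ?)
         ON CONFLICT(user_id, sku) DO UPDATE SET qty = qty + excluded.qty`
      ).run(userId, sku, qty);
    });

    // Every page render rebuilds its view purely from durable state.
    function renderCart(userId: string) {
      return db.prepare(`SELECT sku, qty FROM cart_items WHERE user_id = ?`).all(userId);
    }

    // Usage: addToCart("user-1", "sku-42", 2); renderCart("user-1");

Because each request reads and writes durable rows inside a transaction, a forced restart loses nothing, which is the whole contrast with session variables held in memory.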
[1] Adobe's relationship to the local filesystem drives me nuts, even though I've talked with people there like Larry Masinter and studied file formats deeply enough to understand their point of view. My first instinct, having worked at a library, is that you want to associate metadata with an object; in fact, I really want to "name" an object after a hash of the contents and have the object be immutable.
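A tiny sketch of that content-addressed idea in Node/TypeScript; storeImmutable and the directory layout are hypothetical, just to show the shape of "name the object after a hash of its bytes":

    import { createHash } from "node:crypto";
    import { copyFile, readFile } from "node:fs/promises";
    import { extname, join } from "node:path";

    // Store an object under a name derived from the SHA-256 of its bytes, so
    // identical content always maps to the same name and the stored copy can
    // be treated as immutable.
    async function storeImmutable(srcPath: string, storeDir: string): Promise<string> {
      const bytes = await readFile(srcPath);
      const digest = createHash("sha256").update(bytes).digest("hex");
      const destPath = join(storeDir, digest + extname(srcPath).toLowerCase());
      await copyFile(srcPath, destPath);   // never edited in place after this
      return destPath;                     // metadata lives elsewhere, keyed by this name
    }

    // Usage: const stored = await storeImmutable("IMG_1234.ARW", "/photos/store");

Since the name is derived from the content, any metadata can live in a separate index keyed by that hash instead of being written back into the file.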
On the other hand, XMP, which stuffs a metadata packet into a file, is a good fit for the way people use desktop apps. Still, my #1 complaint about Photoshop is that it thinks a file has changed (some metadata has changed) when I do some operation that I don't think of as a mutation, such as making a print or exporting a compressed and scaled JPG with the "Save for Web" dialog. Since I do a lot of different things with my machine I am always forcing shutdowns, which means Photoshop is always harassing me to recover files that I "didn't save" even though I didn't really change them. Sometimes I just save files that I don't really want to save, but it feels icky because the source file might be a JPG which might not round-trip perfectly, where I might feel compelled to save at a higher quality than the original file, and all of that.
The amount of hoops you have to jump through to develop for Apple platforms stands in stark contrast to the increase in revenue you can expect from it. The Mac used to be a platform for connoisseurs, but the iOS platforms are just mass-market race-to-the-bottom shovelware dumps. Users do not appreciate the applications, and developers in turn do not appreciate the users. Everyone gets what they deserve.
Why not both? Google is offering an Android app store _and_ Debian Linux packages. Each distribution mechanism meets different needs.
Apple didn't even deign to give their lowly iPad users a file manager until 2017. Post-iOS Apple's aspirations are to be AOL.
In 2025, iOS, macOS, and visionOS are converging.
Android and ChromeOS are also being unified.
The muscle memory of a billion human workflows awaits Magic or Mayhem.