For what it's worth, PCs are subject to the exact same slow decline, even if the author's romantic notions of repurposing an old machine ring true. While they may be suitable for old games on an unnetworked, phone-activated WinXP, or as a power-hungry Linux machine that will perform admirably as long as web browsers aren't involved, both fates are the same: frozen in time, locked out of the dizzying speed of evolution of online services and the portals through which they are viewed. Stallman's right yet again: the offline programs will run just the same, but the online programs are subject to change at a moment's notice, from Apple shedding support for 32-bit hardware, to application makers shedding support for old OS versions, to online backends forcing mandatory updates.
It feels worse with the iPad because its marketing made an impression of a sleek, "just-works" kind of device. Of course, its utility is tied exclusively to applications from a captive marketplace, and often, to remote services frontended by those apps or the browser, making its elegance a leaky abstraction. Today's Chromebooks, another "just-works" device, will be in the same situation in five years.
But with Chromebooks, or Android phones (if you're lucky), or x86 or x64 PCs, you can replace the stock OS and repurpose the device on the hardware's own remaining merits. An iPad is just stuck.
>For what it's worth, PCs are subject to the exact same slow decline
Nope, they aren't. My home PC hails from 2008, runs the latest Windows 10 update, and a range of software from the early 2000s to 2018. It runs modern games, too (I like playing indie games that aren't too demanding).
Secret sauce? Modern SSD and a GPU from a few years back, plus an HDD to serve as a repository of all my data.
But even without the benefit of running Win 10 and hardware upgrades: I also spun up my old 2006 laptop the other day to rip a CD (my other machines don't have a drive), and guess what, FF Quantum runs well on it. Bring on your web apps, that XP machine will run 'em.
My point is that PCs really don't suffer from the level of planned obsolescence that devices like the iPad do.
The PCs, ultimately, have one large window to see the online world with: the browser. As long as that runs, the PCs don't lose any utility: whatever task they have enough computational power for today, they will be able to handle in a decade. The mobile devices, with their shift to the app-for-everything ideology pioneered by Apple, have thousands of small windows instead, each easily broken, prompting you to upgrade.
It's not about planned obsolescence. Mobile has seen the same type of dramatic performance increase you used to see in PCs back in the day. Can you imagine using a computer from 1996 in 2008?
The only real obstacle is the lack of NVMe support. SATA3 is a bit of an upgrade bottleneck on these old systems, but even software development on this box was just fine with a fast SSD.
PC CPUs have not really got that much faster in the last ten years, compared to the previous decade. Each new generation of Intel's i-series was maybe 10% faster than the preceding one.
A fast Core2 or 1st gen i7 is perhaps half the speed of a mid-range system today, and will probably beat some of the budget chips.
Upgrading an oldish system with an SSD can make a big difference. Obviously, GPUs have also got a lot faster too.
Hopefully, the recent competition from AMD will make us see better progress again.
"For what it's worth, PCs are subject to the exact same slow decline"
Maybe eventually, but not within 5 years. I'm currently typing on a 2011 Dell with an i3 proc and 6GB of RAM, running Ubuntu. I have 19 other tabs open in Chrome. It dual boots Win7.
I can use this 7ish-year-old PC for social media, email, HD videos and streaming music with few annoyances due to performance, in either OS. Not so with the 5-year-old iPad he describes. While I can do several of those things at once, it seems he has trouble with even just one.
My i7-2600k, which is 7 years old by now, still does everything I need it to do (among them being gaming newly released titles) and more. I did upgrade the GPU a couple of times, but the basic system is still the same and performing like it always did.
In that regard, I really don't understand what's happening with smartphones/tablets that suddenly makes them incapable of performing the same tasks, at the same speed, as they always did. It's not like I'm suddenly trying to use some performance-hungry app/game I never used before; it's the same apps I always use, but they simply perform worse.
If it wasn't for this simple fact I'd probably still be on my iPhone 4s, but after a while, that became too slow even for simple stuff like using WhatsApp. It can't be that the apps become that much more demanding; WhatsApp still did what it always does: sending and receiving messages. So I really don't understand where that extra demand for performance comes from which suddenly turns whole generations of portable hardware into fancy paperweights.
Not anymore, but it's only recently that that's been spun as a good thing.
Tell someone around 2000 that a 7ish-year-old PC would be adequate and you'd get laughed at. 1993 to 2000, for instance: you'd be comparing a Pentium at 60 MHz (some quick googling says 60 MHz was first hit that year) on the high end to a 1 GHz Athlon. Probably >20x real performance improvement (the clock alone is 16.7x).
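The clock-alone figure checks out; a quick sketch (the >20x "real performance" number is the commenter's estimate and isn't computed here, since IPC gains over that span aren't captured by clock speed):

```python
# Clock-speed ratio between the two machines in the comparison.
pentium_mhz = 60     # Pentium, 1993
athlon_mhz = 1000    # Athlon, 2000

ratio = athlon_mhz / pentium_mhz
print(f"{ratio:.1f}x")  # 16.7x from clock alone, before any IPC improvements
```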
Mobile devices will catch up and stagnate too, sadly.
19 tabs open in Chrome with 6GB of RAM? Are you sure? Last time I used my friend's PC with 4GB of RAM, it froze after I opened 12 tabs. RAM was fully used at 7 tabs.
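As a back-of-envelope sketch of that skepticism, using only the numbers given above (the OS baseline is an assumed figure, not something measured on that machine, and Chrome's tab discarding can change the picture entirely):

```python
# Rough per-tab memory estimate from "4GB machine, RAM full at 7 tabs".
total_mb = 4 * 1024       # friend's machine: 4GB of RAM
os_baseline_mb = 1500     # assumed OS + background usage (not measured)
tabs_at_full = 7

per_tab_mb = (total_mb - os_baseline_mb) / tabs_at_full
print(f"~{per_tab_mb:.0f} MB per tab")  # ~371 MB under these assumptions

# Under the same assumptions, 19 comparable tabs would want roughly:
print(f"~{(os_baseline_mb + 19 * per_tab_mb) / 1024:.1f} GB")  # ~8.3 GB
```

Of course, real browsers unload or compress background tabs, which is how 19 tabs can fit in 6GB in practice.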
People like to shit on MS, but Microsoft's commitment to backwards compatibility lets you keep the same OS and keep using all the software you paid for, for FAR longer than any Apple or Linux PC.
In the Linux world, the problem is mitigated by the availability of specialty distributions and desktop environments. Even older but open software can usually be salvaged with a few patches, too.
And once you have the hardware new enough to run hardware virtualization well, there are truly no limits other than raw RAM and CPU.
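As a concrete sketch of that hardware check on Linux: virtualization support shows up as the `vmx` (Intel VT-x) or `svm` (AMD-V) flag in `/proc/cpuinfo`. The helper below parses cpuinfo-style text; the function name and parsing approach are my own, not from any library:

```python
def has_hw_virtualization(cpuinfo_text: str) -> bool:
    """Return True if the CPU flags include Intel VT-x (vmx) or AMD-V (svm)."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            return "vmx" in flags or "svm" in flags
    return False

# On a real Linux system you would read the file:
# with open("/proc/cpuinfo") as f:
#     print(has_hw_virtualization(f.read()))
```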
The decline is similar, but not nearly the exact same. PCs can generally still do, years later, the tasks they once did, without compromise; iPads get slower, and the hardware is forced onto a newer OS just to keep running existing applications, making tasks like simple note taking slower and more of a process than before. With an iPad you can't preserve your past experience, because the product's ecosystem isn't a vacuum of your own: you can't keep it away from forced updates.
I fail to see how that is the case. Apple neither forces OS updates, nor forces app updates. An iPad will perform exactly the same, given the same software, after 5 years or 10 years, and the user is in control of that software.
If you are speaking of services, well, that's not unique to iPads, or Apple, but is a consequence of the Internet.
> But with [...] x86 or x64 PCs, you can replace the stock OS and repurpose the device on the hardware's own remaining merits.
That's exactly the point. You know all the bazillion things that people usually do with their Raspberry Pis? An old PC can do all of that, and often much more. My old PCs are used for:
- HTPC (Home Theatre PC)
- NAS (Network Attached Storage)
- Retro gaming (this is the same box as the HTPC)
- Network-wide ad-blocker (same box as the NAS)
- Print server (same box as NAS)
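The heart of the network-wide ad-blocker role above is just a DNS blocklist lookup with parent-domain matching; a toy sketch (the blocklist entries and the function name are made up for illustration, and a real deployment would sit behind an actual DNS server):

```python
# Hypothetical blocklist entries, for illustration only.
BLOCKLIST = {"ads.example.com", "tracker.example.net"}

def is_blocked(hostname: str) -> bool:
    """Block a host if it, or any parent domain, is on the blocklist."""
    labels = hostname.lower().rstrip(".").split(".")
    # Check "a.b.c", then "b.c", then "c".
    for i in range(len(labels)):
        if ".".join(labels[i:]) in BLOCKLIST:
            return True
    return False

print(is_blocked("ads.example.com"))      # True: exact match
print(is_blocked("sub.ads.example.com"))  # True: a parent domain is blocked
print(is_blocked("example.com"))          # False
```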
Now back to the iPad. I am thinking of repurposing my old iPad Mini as an external screen using Duet Display (I would use it as a GPS display in a flight sim cockpit), but in general, finding a purpose for an old iPad is much, much harder than for an old desktop PC.
Well, a top-end PC from 5+ years ago can still easily outperform a lower end PC made today. This is true even if you don't have an SSD in either.
A "top end" iPad will lose in performance to the "bottom end" of the very next generation of iPads. Granted, that isn't the same thing, but the point stands.
How long has it been since we saw large performance increases year after year in the PC space?
>Intel's Kaby Lake Core i7-7700K is what happens when a chip company stops trying. The top-end Kaby Lake part is the first desktop chip in a brave new post-"tick-tock" world—which means that instead of major improvements to architecture, process, and instructions per clock (IPC), we get slightly higher clock speeds and a way to decode DRM-laden 4K streaming video. Huzzah.
https://arstechnica.com/gadgets/2017/01/intel-core-i7-7700k-...
I'm still using the PC I built in 2008. I've had to replace a few things, but not much. I can still browse the web. I run the latest packages in my linux distro. I can still work with all my old files. The apps I rely on are mature, well supported, and won't be going away any time soon. And in 10 years I'm betting it will still run. I might have to replace a component or two. I might have to give up web browsing, if sites keep getting fatter and fatter. But it'll still run, and I could find a use for it.
"For what it's worth, PCs are subject to the exact same slow decline"
PCs have a much broader choice of operating systems and software to install whilst maintaining compatibility. Take Voyage Linux as an example (http://linux.voyage.hk); it's a stripped-down Debian system for embedded devices which can fit in as little as 256MB of storage, but nothing prevents you from apt-getting more software and growing it into a full-featured desktop system, depending on the underlying hardware's capabilities. Mobile devices as of today cannot even dream of a fraction of this modularity, but we already know why: they're black boxes intended to be consumed and then thrown away, and their OSes reflect that nature. Mobile manufacturers should be forced to publish all hardware documentation and source code after a reasonable time (say, after 3 years) so that "old" devices can be repurposed instead of being dumped into landfills.
Yes, it could happen. But today is February 8th, 2018. Every iPad that ever existed can view videos on Netflix, Hulu, Crackle, CW, etc. They can all connect to a standard mail server or Exchange server, sync calendars using standard protocols, read ebooks from Apple, play music with Spotify, AirPlay to the latest devices, or use Bluetooth, print to any AirPrint-compatible printer (most consumer wireless printers), etc.
I know because I own a first generation iPad that does all of the above - after a reset last year and downloading older versions of apps.
The only thing I can't do is view the modern web with a tablet with a measly 256MB of RAM.
The capabilities of the first generation iPad in 2010 compared to a modern iPad is like night and day.
I have a 2008 Core 2 Duo 2.66Ghz laptop with 4GB of RAM, Gigabit ethernet, that runs the latest version of Windows. The average consumer laptop only comes with 8GB of RAM and gigabit ethernet is still the fastest consumer networking protocol. It serves as my Plex Server.
A modern iPad has 8 times the amount of RAM, a processor that's orders of magnitude faster, etc. You can't compare the improvements in mobile to the lack of improvements in desktop computers.
Back in 2008, could you imagine using a 10 year old computer and still expecting support from a new OS like you can with a 10 year old computer today?
Yes old PCs can be unusable too, but I can still use desktop PCs released in the same era as the first smartphones, and with upgraded graphics, memory and SSD they last even longer.
As a counterpoint, I recently got a 12-year-old Mac Pro to fix up, and it is almost impossible to make usable even though the specs are better than some old PCs I've given away. So it's a design issue, and Apple is a real crook in this regard.
As a counterpoint to your counterpoint, I've used a 2010 MacBook Pro as my primary laptop for years. I upgraded the RAM and added an SSD and it still runs great at most tasks.
A 2006 Mac Pro though? What issue is it giving you? Admittedly, the processor will no doubt show some age, and it hasn't been qualified for the latest releases of macOS in quite a while. But it has user-upgradable RAM and disk drives. Throw your preferred distro of Linux on there and I imagine it would be a sufficiently capable machine.
My Athlon(tm) II X2 250 processor was released on Jun 2, 2009. It goes at 3 GHz. The 3 GB of memory comes from the same era. It is pretty much the only thing I ever do web stuff on. It is fast and I have no intention of replacing it any time soon.
What has happened is that maximum hardware performance has levelled off. Modern web design is slowly converging to a place where everyone needs maximum hardware performance everywhere.
When I had an iPhone it stopped being usable after 2 years and the last 2 non-Apple phones I had lasted 3 and 4 years and both got replaced on my terms, while still usable. You can't just handwave and declare that Apple doesn't do a worse job than others when the evidence is mounting that it does.
The same would have been true if your first phones were Android and your most recent ones were iPhone.
In other words, your experience has nothing to do with Android vs iPhone, rather that earlier devices simply didn't have the same amount of resource headroom as more recent ones.
Ever since the iPhone 4S, they've been usable for at least 3 years before expectations catch up to them. And now I think since the 6S and SE—which have the same hardware generation—iPhones can claim to have 4+ year lifetimes.
(At 3.5 years old, the iPhone 6 is still a great device today, particularly with a cheap battery replacement.)
It's worse than that. My gaming PC was top-of-the-line six years ago, and it still works great for browsing and work tasks, and runs many modern games effectively on lower settings.
My Samsung Galaxy Nexus phone was top-of-the-line six years ago, and is essentially useless now. Just switching from an app back to the homescreen takes multiple seconds, and launching or switching between apps often takes upwards of 15 seconds. It takes more than half a minute to start Kindle Reader and open a book. These are all things that happened almost instantly when it was new. I've wiped it and reinstalled Android multiple times, and it only helps a little.
Very dodgy. I'm still using my S3 Mini and it's fine. I have however seen some apps that just won't work 100% and will drain your battery because of bugs etc.
However, if the vendors on your alternate OS marketplace chose to go the same route as App Store providers, there's not much to gain. It's not so much about the OS, but about the ecosystem and the culture of this ecosystem, including online services. (As far as I do know, there's no reason why you couldn't support legacy iOS versions and hardware via the App Store. It's just not part of the predominant business model. The same is true for most commercial websites.)
I have a 15 year old Windows 2000 machine running a PCI based piece of test equipment. No problems or performance issues. It has not been on the network for at least 10 years, if not 15. On the rare occasion a data transfer is needed it's either a floppy (yes!), CD or thumb drive. Works fine.
Stallman may have been right about the dangers, but he was also a key impetus behind moving everything behind online APIs — the harder it is to keep stuff closed source (legally and socially) the more services will switch to an online-only model, which then can optimize for frequent updates.
Wait. Not sure if I understand you correctly here. As far as I can tell, keeping stuff closed source is one of the biggest reasons people like SaaS model (you can't even in principle have Stallman-style free software in SaaS model), another big reason being full control over access and user data. I'm not sure how he was a key impetus behind that - in fact, all of his teachings were strongly against this, but people ignored it and now here we are.
"a late-2012 iPad Mini, model number A1432, black, with 16 gigabytes of storage. It retailed, at release, for $329."
So the cost of owning this iPad has been less than 20 cents/day over its 5-year usable life. That doesn't even factor in the slight energy savings vs. using a full laptop, if you want to be nitpicky. Nor does it factor in the current resale value. I just sold an original iPad 1 for $80 on Craigslist. I'm sure this would be worth at least as much, if not more.
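The cost-per-day arithmetic is easy to verify:

```python
price = 329.00    # retail price at release, USD
days = 5 * 365    # five-year usable life, ignoring leap days

print(f"${price / days:.3f} per day")  # $0.180 per day, under 20 cents
```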
"I still use my old iPad for passive consumption: reading, watching videos, checking feeds ...formerly easy tasks have become strained. Social apps have become slow, videos take longer to load and Safari can’t seem to handle the most important and fundamental services of the modern web."
This is all also true of my maxed out 2012 MacBook Air. Final Cut Pro X runs more smoothly editing HD video than browsing Facebook. And, while I agree the uses for an old iPad are limited (photo browser, music player / media controller, eBook reader, kitchen recipe browser), they are more elegant than keeping an old computer around.
The mixed blessing and curse of something like the iPad is that it still looks beautiful and functional long after it isn't. I'm sure it would be easier to part with the device if it was a 5-year-old $300 Celeron netbook.
My Panasonic 'Smart TV' is from 2014 and its app marketplace doesn't have Netflix. It's just not there. My laptop is from 2011, and since it runs the latest Chrome browser, it can play Netflix just fine. Made me think: the fact that laptops & desktops are consumer devices sold with hardware and operating systems that do general-purpose computing is underrated.
In general, accept that the TV is never going to be really smart or keep up to date with the latest and greatest in content and services. Given the fragmentation, it's just not going to happen. (Maybe a day will come when there's a common platform for all TVs, but I don't see it.) On the other hand, any of the major "TV" platforms (Chromecast, Apple TV, Roku, etc.) receive regular enhancements, and no major content provider can choose to ignore them. So, depending on your preference, pick one and make your TV just a dumb display, which is exactly what it should be, to be honest: it lets you upgrade the hardware while keeping the display.
I was in the market for a TV late last year and was astounded to discover that “dumb” TVs are no longer available at your typical retail outlet. Every one at Walmart, Sams, Target, Costco, etc. has some sort of OS running on it.
I just want an HDMI cable port and a decent display, but it’s quite obvious that isn’t what makes the TV manufacturers money these days.
My Panasonic 'Smart TV' is from 2013, and its app marketplace doesn't have Netflix. It did have Hulu and some others that were discontinued a few months ago. So I had a smart TV capable of watching Hulu, and now I don't. The pre-installed app was even forcefully removed, and so were a few others. So the app wall now has holes in it. No notifications, by the way; I had to google what happened to my disappeared icons. I wouldn't even be surprised if people are still being charged for the Hulu service they can't use anymore. (I wasn't subscribed anyway, but found out when I finally decided I wanted to check it out for a trial period.)
Why would you ever connect your TV to the internet? It's just a matter of time before your TV gets hacked. I use either a Chromecast or Apple TV, and see the TV as a monitor with speakers, nothing more.
It's all but impossible to buy a high quality dumb TV these days. I bought a 4k TV this past year with the intention of never connecting it to the internet but I guess I didn't do enough research because I can't even access all the functions of the TV without pairing it with an Android "remote" over a wireless network. Even better it's from a manufacturer that sells users' viewing habits.
Even now they make it opt out. To disable it I had to go into some obscure submenu and disable an option called "smart interactivity," something I never would have found on my own. I had to look up a guide on how to stop my TV from selling data about everything I view on the TV to advertisers!
The way "smart" TVs are going makes me wary of what other "smart" appliances will do in the future.
I made the smart TV mistake. Not only are they bad for the reasons you state, they are also utter junk. All I really want is a good panel with plenty of ports and good speakers.
I've dreamed for years of having a single cloud OS, and then multiple "dumb" monitors and speakers controlled by that OS. Chrome seemed to be aiming for that, but privacy concerns could kill that dream.
Beyond that, nothing on the hardware side stops you from running any software you want to.
Expect the PC to morph into something more akin to a tablet or "smart" TV if the MAFIAA get what they want (and Intel and crew seems all too willing to give in).
And at that point your GPC will be relegated to a developer workstation that may well require a verified employer and regular visits from an auditor to own.
You’re getting downvoted, but while what you describe probably isn’t going to happen, you’re not too far off. We’re already “there” with walled garden app stores, and outside of that app signing where you need to pay to be in a “developer program” to be able to sign your apps [1], DRM baked into the browser (EME) and hardware (HDCP), mandatory logins to cloud services for appliances, etc.
[1] (Thinking of macOS here. I’m sure a non-centralized way of signing would also work.)
Yup. Cory Doctorow warned us about it for years (Google keywords: "war on general-purpose computing").
And the sad thing, I'm not sure we can escape that. MAFIAA wants it. Large businesses want it. And to top it off, computer security specialists want it too. Ideas like sandboxing, or trusted computing, or hardware crypto modules - all provide security while simultaneously taking control away from the user.
I seriously fear that soon, having a general-purpose computer connected to the Internet will be considered a public safety issue ("because botnets!"), and eventually you'll need a professional license to be allowed to work with a Turing-complete language ("because langsec!"). I very much don't like it.
Sadly, this is a direction Microsoft is taking with the rumored upcoming "S Mode". If this works for them (and I hope it doesn't), even PCs will be locked down and no more useful than a tablet. Some decision makers are forgetting hardware is for running software, not blocking everything under the sun.
My mother-in-law is in the same boat. She is not a demanding user. She basically uses her iPad for 3 things: buying things on Amazon, looking at pictures of her grandson on Facebook, and checking her stocks with the Fidelity app. Well, the Fidelity app recently informed her that she could no longer login unless she installed a required upgrade. It turns out the upgrade is not compatible with the version of iOS that runs on her hardware. So just like that her iPad became obsolete.
One of my biggest fears nowadays is that I will someday mistakenly press "OK" on the annoying "Update to iOS 11" popup without thinking, and my iPad will be practically bricked.
My iPad is a relatively recent iPad mini which I bought a couple of years ago, but I know from my iPhone 6 that upgrading to iOS 11 will slow down everything, for example each app taking about 10 seconds to launch, including Apple's own apps like Mail.app and Messages.
Is there a way to completely block that upgrade to iOS 11 popup so I don't make the mistake?
Also, I marvel at how watered down the term "bricked" has become. From 'my thing shows absolutely no signs of life' to 'my thing is a little bit laggy'.
Thanks, but I wouldn't say "a little bit laggy". I am not exaggerating when I say every app takes ten seconds to load.
I actually measured them and that's the average time it took for all apps to load after I updated to iOS 11 (Before iOS11 it loaded immediately).
If you can sympathize with me because you have more recent phones, please try to close your eyes and count to ten and imagine, hopefully that may give you an idea of how frustrating it is for peasants like us.
Maybe it is different in other circles but as a software engineer, bricked still means you unplugged it while flashing the ROM, literally unrecoverable.
It should not take apps 10 seconds to launch on an iPhone 6 running iOS 11. Something probably went wrong with your upgrade, which does happen. Have you tried taking a local backup, doing a factory reset of the phone, and then restoring the backup?
As someone who tested this extensively, it really needs to be treated like a major OS upgrade on a computer. Back up important info (pictures, etc.) without using iCloud, do a full format (DFU restore), and reinstall from scratch.
I have an iPhone 6 and 6s here that are still flying along great on iOS 11. And I had previously had so many problems with the 6s that I thought this upgrade really was awful. It took until 11.1 or so to truly be OK, but the performance increase from doing a true "clean install" like I would with Windows or OSX was really night and day.
Yes, you can block it by installing the tvOS 11 public beta profile on your phone or iPad. I had to do this with a broken iPhone 6 which would literally be bricked by an update due to a baseband fault. Works perfectly; the kids can play games on it without the risk of updating it by mistake.
It is ridiculous that there isn't a setting to disable update nagging though!
I have a 3rd gen iPad that I feel like there should be a 3rd-party logic board / OS update for.
There are approximately a bajillion 3rd gen iPads out there. Imagine walking into the local screen repair place, having them stick in a new Android board that reuses the digitizer, screen, and case. Maybe they pop a new battery in while it's open. Bam! Renewed iPad.
Given how cheap various SBCs have gotten, I have to imagine there is an economic business model for someone in retrofitting all these old, homogeneous, plentiful devices.
The main issue with this is that Apple doesn't allow downgrading of iOS. Thus, one might be able to maintain a device on an older version of iOS that is most suited to it, but if anything happens that requires a restore, then that device is effectively dead in the water. It seems to me that at the very least, Apple should backport relevant security fixes at least one major version, if not two (for example, a bug found in iOS 11 should also be patched in iOS 10, should it be present), and they should sign the last revision of one or two major versions back, so that these things aren't thrown out en masse.
The new version of iOS broke some of my apps, so I went to submit feedback on the apps that Apple broke with its new OS, and it turns out that providing feedback in the App Store is also now broken!
I literally cannot do anything now because I cannot downgrade my OS nor can I tell the developers that Apple broke their app with the new OS version.
I have 2 original iPads that are in excellent shape. I spent a weekend looking for some way to repurpose them into photo albums that sit there and cycle through photos.
I couldn't find a working solution.
All the apps won't work properly, and the browser doesn't work with things like Flickr or Google Photos.
Other things it still does (to varying degrees of success) are playing games (e.g. where’s my water, fruit ninja), taking notes, web dashboard (though sadly not general browsing), music player and alarm clock.
Some sort of jailbreaking/software modification (haven't done this in years, not really sure if it's possible?) could allow you to do what you were trying to do in the first place?
I think most versions of iOS have had a built-in slideshow function in the Photos application. Try loading up some photos onto the iPad, add to an album, and see if there's a slideshow option somewhere.
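Another workaround that sidesteps broken apps entirely: serve the photos from another machine as a plain auto-advancing web page, which even an old Safari can render. A minimal sketch of the page generation (the file names and refresh interval are placeholders; serving the files, e.g. with any static web server, is left out):

```python
def slideshow_page(image_urls, seconds=10):
    """Return one HTML page per image; each page shows its image and
    redirects to the next page after a delay.

    Uses only a meta-refresh redirect, so it works on very old browsers
    with no JavaScript required.
    """
    pages = []
    n = len(image_urls)
    for i, url in enumerate(image_urls):
        nxt = f"slide{(i + 1) % n}.html"  # wrap around to the first slide
        pages.append(
            f'<html><head><meta http-equiv="refresh" content="{seconds};url={nxt}">'
            f'</head><body><img src="{url}" style="width:100%"></body></html>'
        )
    return pages  # write these out as slide0.html, slide1.html, ...

pages = slideshow_page(["p0.jpg", "p1.jpg"], seconds=5)
print(len(pages))  # 2
```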
The death of older mobile hardware is at least somewhat exaggerated.
- Automatically upgrading devices to Win10
- Announce that they will keep adding upgrades to win10 instead of creating new releases
- Office 365
- Require win10 to develop for windows phone 10
And important software companies like Adobe/Autodesk moving to a subscription model isn't going to help.
And once you have the hardware new enough to run hardware virtualization well, there are truly no limits other than raw RAM and CPU.
I fail to see how that is the case. Apple neither forces OS updates nor forces app updates. An iPad will perform exactly the same, given the same software, after 5 years or 10 years, and the user is in control of that software.
If you are speaking of services, well, that's not unique to iPads, or Apple, but is a consequence of the Internet.
That's exactly the point. You know all the bazillion things that people usually do with their Raspberry PIs? An old PC can do all of that, and often much more. My old PCs are used for:
- HTPC (Home Theatre PC)
- NAS (Network Attached Storage)
- Retro gaming (this is the same box as the HTPC)
- Network-wide ad-blocker (same box as the NAS)
- Print server (same box as the NAS)
Now back to the iPad. I am thinking of repurposing my old iPad Mini as an external screen using Duet Display (I would use it as a GPS display in a flight sim cockpit), but in general, finding a purpose for an old iPad is much, much harder than for an old desktop PC.
A "top end" iPad will lose in performance to the "bottom end" of the very next generation of iPads. Granted, that isn't the same thing, but the point stands.
With 16GB of memory, a Samsung 850 Pro and an NVIDIA 980, this PC, which is now my spare / 7-year-old son's, is still going very strong indeed.
>Intel's Kaby Lake Core i7-7700K is what happens when a chip company stops trying. The top-end Kaby Lake part is the first desktop chip in a brave new post-"tick-tock" world—which means that instead of major improvements to architecture, process, and instructions per clock (IPC), we get slightly higher clock speeds and a way to decode DRM-laden 4K streaming video. Huzzah.
https://arstechnica.com/gadgets/2017/01/intel-core-i7-7700k-...
PCs have a much broader choice of operating systems and software to install whilst maintaining compatibility. Take Voyage Linux as an example (http://linux.voyage.hk); it's a stripped-down Debian system for embedded devices which can fit in as little as 256MB of storage, but nothing prevents you from apt-getting more software and making it grow into a full-featured desktop system, depending on the underlying hardware capabilities. Mobile devices as of today cannot even dream of a fraction of this modularity, but we already know why: they're black boxes intended to be consumed and then thrown away, and their OSes reflect that nature. Mobile manufacturers should be forced to publish all hardware source code after a reasonable time (say, after 3 years) so that "old" devices can be repurposed instead of ending up in landfills.
I know because I own a first generation iPad that does all of the above - after a reset last year and downloading older versions of apps.
The only thing I can't do is view the modern web with a tablet with a measly 256MB of RAM.
The capabilities of the first generation iPad in 2010 compared to a modern iPad is like night and day.
I have a 2008 Core 2 Duo 2.66Ghz laptop with 4GB of RAM, Gigabit ethernet, that runs the latest version of Windows. The average consumer laptop only comes with 8GB of RAM and gigabit ethernet is still the fastest consumer networking protocol. It serves as my Plex Server.
A modern iPad has 8 times the amount of RAM, a processor that's orders of magnitude faster, etc. You can't compare the improvements in mobile to the lack of improvements in desktop computers.
Back in 2008, could you imagine using a 10-year-old computer and still expecting support from a new OS, like you can with a 10-year-old computer today?
As a counterpoint, I recently got a 12-year-old Mac Pro to fix up, and it is almost impossible to make usable even though its specs are better than some old PCs I've given away. So it's a design issue, and Apple is a real crook in this regard.
A 2006 Mac Pro though? What issue is it giving you? Admittedly, the processor will no doubt show some age, and it hasn't been qualified for the latest releases of macOS in quite a while. But it had user-upgradable RAM and disk drives. Throw your preferred distro of Linux on there and I imagine it would be a sufficiently capable machine.
What has happened is that maximum hardware performance has levelled off. Modern web design is slowly converging to a place where everyone needs maximum hardware performance everywhere.
In other words, your experience has nothing to do with Android vs iPhone, rather that earlier devices simply didn't have the same amount of resource headroom as more recent ones.
Ever since the iPhone 4S, they've been usable for at least 3 years before expectations catch up to them. And now I think since the 6S and SE—which have the same hardware generation—iPhones can claim to have 4+ year lifetimes.
(At 3.5 years old, the iPhone 6 is still a great device today, particularly with a cheap battery replacement.)
My Samsung Galaxy Nexus phone was top-of-the-line six years ago, and is essentially useless now. Just switching from an app back to the homescreen takes multiple seconds, and launching or switching between apps often takes upwards of 15 seconds. It takes more than half a minute to start Kindle Reader and open a book. These are all things that happened almost instantly when it was new. I've wiped it and reinstalled Android multiple times, and it only helps a little.
I haven't met a Chromebook that can replace the production "experience".
So the cost of owning this iPad has been less than 20 cents/day over its 5-year usable life. That doesn't even factor in the slight energy savings vs using a full laptop, if you want to be nitpicky. Nor does it factor in the current resale value. I just sold an original iPad 1 for $80 on Craigslist. I'm sure this one would be worth at least as much, if not more.
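As a rough sketch of that math (the purchase price isn't stated here, so the ~$350 figure is an assumption, and resale value is ignored):

```python
# Hypothetical numbers: an iPad bought for ~$350 and used for 5 years.
purchase_price = 350.0  # assumed purchase price in dollars
years_of_use = 5

cost_per_day = purchase_price / (years_of_use * 365)
print(f"Cost per day: ${cost_per_day:.2f}")  # under 20 cents/day
# Subtracting resale value (e.g. the $80 mentioned above) would lower this further.
```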
"I still use my old iPad for passive consumption: reading, watching videos, checking feeds ...formerly easy tasks have become strained. Social apps have become slow, videos take longer to load and Safari can’t seem to handle the most important and fundamental services of the modern web."
This is all also true of my maxed out 2012 MacBook Air. Final Cut Pro X runs more smoothly editing HD video than browsing Facebook. And, while I agree the uses for an old iPad are limited (photo browser, music player / media controller, eBook reader, kitchen recipe browser), they are more elegant than keeping an old computer around.
The mixed blessing and curse of something like the iPad is that it still looks beautiful and functional long after it isn't. I'm sure it would be easier to part with the device if it was a 5-year-old $300 Celeron netbook.
Arguing for the low cost of ownership per day is a strawman.
There was a brief blissful window when computing was cheap enough to be a mass market product, but too expensive for manufacturers to add lockdowns.
I just want an HDMI cable port and a decent display, but it’s quite obvious that isn’t what makes the TV manufacturers money these days.
All home computers were consumer devices, with a big part of their OS burned into ROM.
For the most part, upgrading them meant buying a new device.
Now with the commoditization and low sales, it seems OEMs are eager to go back to that model.
EDIT: OS -> device.
Anything without slots or other expansion and interface ports isn't really a computer, it's an appliance, or at best a terminal.
https://www.engadget.com/2017/02/06/vizio-smart-tv-viewing-h...
Even now they make it opt out. To disable it I had to go into some obscure submenu and disable an option called "smart interactivity," something I never would have found on my own. I had to look up a guide on how to stop my TV from selling data about everything I view on the TV to advertisers!
The way "smart" TVs are going makes me wary of what other "smart" appliances will do in the future.
Expect the PC to morph into something more akin to a tablet or "smart" TV if the MAFIAA get what they want (and Intel and crew seem all too willing to give in).
At that point your general-purpose computer will be relegated to a developer workstation that may well require a verified employer and regular visits from an auditor to own.
[1] (Thinking of macOS here. I’m sure a non-centralized way of signing would also work.)
And the sad thing, I'm not sure we can escape that. MAFIAA wants it. Large businesses want it. And to top it off, computer security specialists want it too. Ideas like sandboxing, or trusted computing, or hardware crypto modules - all provide security while simultaneously taking control away from the user.
I seriously fear that soon, having a general-purpose computer connected to the Internet will be considered a public safety issue ("because botnets!"), and eventually you'll need a professional license to be allowed to work with a Turing-complete language ("because langsec!"). I very much don't like it.
My iPad is a relatively recent iPad mini which I bought a couple of years ago, but I know from my iPhone 6 that upgrading to iOS 11 will slow everything down, with each app taking about 10 seconds to launch, including Apple's own apps like Mail and Messages.
Is there a way to completely block that upgrade to iOS 11 popup so I don't make the mistake?
Also, I marvel at how watered down the term "bricked" has become. From 'my thing shows absolutely no signs of life' to 'my thing is a little bit laggy'.
I actually measured them, and that's the average time it took for apps to load after I updated to iOS 11 (before iOS 11 they loaded immediately).
If you can't sympathize with me because you have more recent phones, please close your eyes, count to ten, and imagine; hopefully that will give you an idea of how frustrating it is for peasants like us.
See for example: http://mglenn.com/blog/2017/12/22/apples-bungled-battery-fea...
I have an iPhone 6 and 6s here that are still flying along great on iOS 11. And I had previously had so many problems with the 6s that I thought this upgrade really was awful. It took until 11.1 or so to truly be OK, but the performance increase from doing a true "clean install", like I would with Windows or OS X, was really night and day.
That's good news then; maybe I should reset everything and reinstall to see if that works.
It is ridiculous that there isn't a setting to disable update nagging though!
There are approximately a bajillion 3rd gen iPads out there. Imagine walking into the local screen repair place, having them stick in a new Android board that reuses the digitizer, screen, and case. Maybe they pop a new battery in while it's open. Bam! Renewed iPad.
Given how cheap various SBCs have gotten, I have to imagine there is an economic business model for someone in retrofitting all these old, homogeneous, plentiful devices.
Somebody kickstarter this!
The new version of iOS broke some of my apps, so I went to submit feedback on those apps to tell the developers that Apple's new OS broke them, and it turns out that providing feedback in the App Store is now broken too!
I literally cannot do anything now, because I can neither downgrade my OS nor tell the developers that Apple broke their app with the new OS version.
I couldn't find a working solution.
None of the apps work properly, and the browser doesn't work with things like Flickr or Google Photos.
It is a shame since the display is still awesome.
Any ideas on what to do with them?
Other things it still does (to varying degrees of success) are playing games (e.g. Where's My Water, Fruit Ninja), taking notes, web dashboard (though sadly not general browsing), music player and alarm clock.
Best hope is jailbreaking and running custom firmware, which may open up some possibilities.