The article is mostly about the fact that the first backup is slow. That was my experience too, but I wouldn't call that a "serious bug" in the context of backups. I'd call silently corrupted backups a serious bug. There's this paragraph:
"Several users have reported to me that they too have experienced serious problems with Time Machine in 10.15.3, both in making first full backups and in trying to restore from existing backups. At least one of these has been reported to Apple as a bug in Time Machine, and has apparently joined several previous reports of the same problem." (My emphasis)
Clearly there are people who have what I'd call serious bugs, including people in the comments here. But it doesn't seem to me like anyone has proved that there are replicable serious bugs with Time Machine - although of course just because it's unproved doesn't mean it's not the case, and two backup methods are clearly better than one.
I've never had an issue with Time Machine (touch wood) and have found the ability to easily revert to previous versions a godsend sometimes.
I've been using Time Machine backups for years (company policy) via an external USB drive, and now that it's full and I only tend to remember it once every couple of months, a backup takes a full workday or more to run.
At the same time I've got Arq Backup running to back up my code folder (not everything in there is on accessible git remotes for me), but it's very heavy as well given the number of small files (code + git files). But at least it doesn't end up months out of date I guess.
Does anyone have a good backup solution for one's code folder? A large number of small files (probably tens or hundreds of thousands), and I'm sure it's got a load of node_modules folders in there as well.
I have an hourly cron job that rsyncs my code folder to another one with venv, node_modules, Rust target directory etc. excluded. I only back up that folder. That cuts out most pointless small files and saves a lot of space too. I haven’t had much problem with .git folders because (1) I can only work so fast so the delta is usually fairly small;
(2) periodic rsync cuts out a lot of intermediate changes. But if you still need to optimize, maybe you could repack .git too.
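If it helps anyone copy this, a rough sketch of the crontab entry and the repack step (paths and exclude patterns are placeholders, not necessarily the exact setup described above):

    # hourly mirror of ~/code into ~/code-backup, skipping dependency and build trees
    0 * * * * rsync -a --delete --exclude node_modules/ --exclude .venv/ --exclude target/ ~/code/ ~/code-backup/

    # optional: compact a repository's history so the mirror carries fewer loose objects
    git -C ~/code/some-project repack -a -d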
My code folder is a symlink to a folder in my iCloud Drive. The beauty is that all my code gets synced to the cloud automatically, and the bonus is that I can see this folder on every machine that syncs with the same Apple account. I believe this approach works with any cloud drive like Dropbox, Google Drive, OneDrive, etc.
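Roughly what that looks like on macOS, assuming the default iCloud Drive location (folder names are placeholders):

    # move the real folder into iCloud Drive, then link it back to its old path
    mv ~/code "$HOME/Library/Mobile Documents/com~apple~CloudDocs/code"
    ln -s "$HOME/Library/Mobile Documents/com~apple~CloudDocs/code" ~/code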
> Does anyone have a good backup solution for one's code folder? A large number of small files (probably tens or hundreds of thousands), and I'm sure it's got a load of node_modules folders in there as well.
I've been using Duplicacy (along with Arq and Time Machine) which is amazing at speedy backups and deduplication. However, I've found that restores require quite a bit of CLI-fu [1].
Considering a move to Restic, because they have a killer feature which allows one to mount a snapshot as a FUSE filesystem [2].
[1] https://forum.duplicacy.com/t/restore-command-details/1102
[2] https://restic.readthedocs.io/en/v0.3.2/Manual/#mount-a-repo...
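For reference, the mount workflow is roughly this (repository path is a placeholder; on macOS it needs a FUSE implementation such as macFUSE installed):

    # expose all snapshots as a read-only filesystem and browse them like normal folders
    restic -r /Volumes/Backup/restic-repo mount /tmp/restic-mnt
    ls /tmp/restic-mnt/snapshots/latest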
If the backup disk is full, I'm not surprised a backup takes ages. It's got to purge stuff on the fly alongside backing up new stuff, constantly rebuilding indexes, which grow and trigger even more purging, requiring more re-indexing. Horrible. Your backup disk should always have some free working space for it to run efficiently.
Can't you reduce the retention period so you always have some working space free, or just get an appropriately sized backup disk? Time Machine backups should run in just a few minutes, automatically, every hour on a well-configured system.
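If manual thinning would help in the meantime, tmutil can list and delete old backups from the command line (the backup path below is only illustrative):

    tmutil listbackups
    sudo tmutil delete /Volumes/TimeMachineDisk/Backups.backupdb/MyMac/2018-06-01-103000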
Use git and gitignore the node_modules folder. You shouldn't need to back up that directory; the package.json file should have all the required package info.
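Something like this, assuming node_modules was never meant to be tracked (the second command is only needed if it was committed at some point):

    echo 'node_modules/' >> .gitignore
    git rm -r --cached node_modules   # stop tracking it without deleting it from disk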
Probably best to ignore both .git and node_modules folders with Arq. Arq goes back and validates old backups sometimes (weirdly, this makes my fans spin up much more than the actual backing up does), and both of those folders are going to be scanned and revalidated after the initial backup.
Fun fact, rumor on the webs is that a new Arq version is in the works! (Look at their twitter feed for screenshots they sent somebody recently)
It’s definitely good to add more to the exclusion list for Time Machine (using System Preferences), since it might pull in very large things that you just don’t care about. For example, take a look at some things in ~/Library.
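The same exclusions can also be scripted with tmutil rather than clicked through in System Preferences; the paths here are just examples:

    tmutil addexclusion ~/Library/Caches ~/code/big-project/node_modules
    tmutil isexcluded ~/Library/Caches   # confirm the path is now excluded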
If you use git, even without remotes, then perhaps use rsync or rclone to sync those repos to one or more storage areas? Could be sent to an SSH server, Google Drive, Backblaze B2 or whatever you like.
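A sketch of the rclone variant, assuming a remote named b2 has already been set up with rclone config (repo and bucket names are placeholders):

    # create a bare mirror per repo (once) so only git data gets uploaded;
    # refresh it later with: git -C ~/code-mirror/some-project.git remote update
    git clone --mirror ~/code/some-project ~/code-mirror/some-project.git

    # one-way sync of the mirrors to the remote
    rclone sync ~/code-mirror b2:my-code-backups --progress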
When it comes to backups, the time it takes for the backup to complete is not the property that matters most, in my opinion. The time it takes to restore the backup matters, and being sure that the backup can actually be restored and is not corrupted matters even more.
A backup which takes so long that the user aborts it before it completes (say, if it's from a laptop to an external USB drive and the user wants to take the laptop out with them) IS a problem though.
That's not news. I stopped using Time Machine when it couldn't transfer my backup to a new MacBook. Rsync worked better. If you look at the actual backup, they use hard links to make snapshots. Not great for many small files or small changes to large files.
Since then I wrote a simple frontend for Borg Backup for macOS, called Vorta[1]. Use it for local and remote backups[2]. Fast and works on any file system.
[1] https://vorta.borgbase.com
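Vorta is a GUI over Borg, so for the curious the underlying commands look roughly like this (repository path, archive name pattern, and retention numbers are placeholders):

    borg init --encryption=repokey /Volumes/Backup/borg-repo
    borg create --stats /Volumes/Backup/borg-repo::'{hostname}-{now}' ~/Documents ~/code
    borg prune --keep-daily 7 --keep-weekly 4 /Volumes/Backup/borg-repo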
Having huge numbers of tiny files in the filesystem definitely takes its toll. As I routinely handle multiple JavaScript codebases with complicated node_modules trees, to reduce the overhead I started wrapping the relevant directory trees into sparse bundle images, and that helped quite a bit.
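For anyone who wants to try it, the sparse bundle itself comes from hdiutil; size, volume name, and filesystem below are placeholders:

    # create a growable disk image (stored as a bundle of fixed-size band files) and mount it
    hdiutil create -type SPARSEBUNDLE -fs APFS -size 50g -volname NodeWork node_work.sparsebundle
    hdiutil attach node_work.sparsebundle   # mounts at /Volumes/NodeWork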
I've been looking for something that would let me do what Time Machine does, but to the cloud, for a while. I've been using Backblaze, but it has poor metadata support and mandatory exclusions for a lot of stuff I want backed up, including system configuration. I tried CloudBerry to Backblaze B2, but the initial backup never completed, even after several weeks on a >250Mbps connection.
It strikes me as odd that APFS doesn't support hard links any more despite being the "new" filesystem, yet the entire macOS backup system relies on the old deprecated HFS+ filesystem for Time Machine disk format, since it uses hard links....
Seems a backwards, muddled, confused step (as does Catalina itself!).
The issue in question is hard links for directories, not hard links in general. Hard links aren't going anywhere, whereas hard links for directories were a dangerous feature to begin with (I won't expand on that but you can easily find explanations) and were basically only there for Time Machine.
APFS has native support for COW and snapshots which is way better than the directory hard links hack. They’re just slow to port Time Machine to APFS targets.
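Those local APFS snapshots are already visible through tmutil, even though the backup target format hasn't caught up yet:

    tmutil localsnapshot          # create an APFS snapshot of the local volumes
    tmutil listlocalsnapshots /   # list snapshots of the boot volume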
The author keeps describing things as serious bugs and as infeasible when they actually mean they don't like the first backup taking eight hours when they paid so much for their storage devices.
I have never waited for the first backup in Time Machine because it always happens when I am asleep. They probably take several hours. Nothing to see here, esp. since Time Machine is a totally end-user low-touch, simplistic service that, as far as I'm concerned, is one of the last truly well-engineered bits of user experience to come out of Apple. Who ever said it was fast?
Time Machine is wonderful in my opinion - I've been using it a bit lately to restore files here and there, and there's nothing like it on any other platform afaik.
Pick a file, hit Enter Time Machine, wait a bit (I still have a Time Capsule, which I'll miss dearly if it goes), scroll through the history, and restore your file. It's so easy.
It is the most advanced backup system I've ever used.
Edit: I think the really great thing about Time Machine/Time Capsule is that I don't know how it works. It's like a toaster: I plugged it in, clicked a couple of things on my Mac years ago, and it still works, even after 3 Macs. I remember the days of typing tar cvf > /dev/rmt0 or some such, and it's a miracle in comparison.
Many consumer NAS devices can act as Time Capsules, so you'll be able to continue.
Sadly, there are hundreds of common bugs and annoyances across all of Apple's platforms that bite people every day but will never get fixed. Reporting them is pointless because Apple rarely seems to stop and fix things. In the past, reporting them might've been helpful, but these days it's as useful as shrieking into a hurricane.
Here's a list I started of just what I've found and could think of readily:
https://git.io/JvCNb
> Dropping support for 32-bit apps was a terrible idea.
What? Never?
First of all, I think old CPU instruction sets should be deprecated in finite time. Secondly, I think a decade is a reasonable time frame to do it in. It's not too fast, nor too slow.
I got my first 64-bit Mac in 2006. Fourteen years later, why should the Mac have to support the older, slower way that doesn't unlock the full potential of the hardware?
I welcome a 64-bit-only OS and haven't suffered any loss from it.
But hey, I also think the switch to Python 3 gave everyone enough time to switch too, and there's no shortage of people who disagree :)
I still use a lot of software, like Aperture, for which there is no good replacement and no possibility of me recompiling it for 64-bit. I will probably never upgrade my personal machine to Catalina as a consequence. When it becomes infeasible to continue using Mojave, I will probably abandon macOS entirely.
Even in an operating system that supports 32-bit binaries, there shouldn't be any runtime overhead if you stick to 64-bit apps only, except maybe a few megabytes of wasted disk space (which could even have been made an on-demand component install, if that's too much to spare).
For those of us with legacy 32-bit apps and games, having them run ("slower" is in the eye of the beholder, as they perform as well as they always have) is better than not running at all.
When I run a legacy 32-bit app, it's not like I'm bothered by the fact that it also loads a bunch of 32-bit OS libraries. It's not like I feel my value for money for owning the machine is restored if clicking the .app icon just shows a "Not supported" dialog box.
> I got my first 64-bit Mac in 2006. Fourteen years later, why should the Mac have to support the older, slower way that doesn't unlock the full potential of the hardware?
Aren't 32-bit binaries (when they don't need 64 bits) slightly faster than 64-bit versions due to the smaller pointer size?
One bug that's very annoying and doesn't seem to be on Apple's radar: Finder has a hard time understanding non-US date formats and will sometimes show files as having "no date".
> It should be possible to buy a license for macOS for non-Apple PC's that includes a number of drivers to make it work.
This will literally NEVER happen and it makes me so sad. Apple would rather drop macOS completely than let this happen. In my opinion this is also the only reason for the T1/T2 security chips on the recent Macs. Yes, I understand the other features of the T1/T2 chip (secure boot, Touch ID, etc.), but to me these have got to be bullshit reasons. There is no reason why Apple's special little security chip should be controlling the exposure of your webcam[0].
[0] https://support.apple.com/en-us/HT208862
I've been using a "Hackintosh" desktop for the last five years, and once it was set up, it has worked flawlessly. That includes all the hardware except the Intel Bluetooth, which was replaced by a tiny dongle that came with something or other. Even the yearly major update has never been a problem. It's far less hassle than my last Linux-on-Desktop attempts, even though Apple isn't even trying.
So that would seem to be evidence against your theory that Apple is concerned about people running macOS on generic hardware. While some cryptographic hardware would indeed be needed to reliably prevent such shenanigans, I'm somewhat certain they could sabotage such systems with minimal effort and raise the pain to levels where it's just not worth it.
They don't even bother to, say, check the CPU and refuse to run on AMD. That could probably be done in a single line of source code. Not doing anything like that and instead designing custom silicon just isn't rational behaviour.
> There is no reason why Apple's special little security chip should be controlling the exposure of your webcam...
Er, yes there is. The custom image signal processor on the T2 enables the face detection feature that drives tone and white balance mapping _on_ _faces_. That requires custom silicon to do in real time. The T2 is a custom ARM chip, so I expect the image signal processor is a subsystem taken from one of the recent iPhone chips, which also have hardware enabled dynamic face detection and mapping.
Apple could have trivially blocked hackintoshes but couldn’t be bothered doing so as they constitute a trivial number of devices.
The Tx chips may indeed spell the end of hackintoshes (I hope not, but who knows) but I’m sure hackintoshes played no role in any decision to use them.
That's because you don't pay, and the cascade of attention-deficit teenagers doesn't care about fixing bugs. If you paid, you'd get service and you wouldn't be the product.
Oh wait it's 2020, sorry, that's not playing anymore. What's this year's RDF spin?
But you do pay! Apple hardware is famous for that!
(I expect the problem is that you mainly pay for the hardware, not so much the software. At some point Apple found out that if the hardware is nice enough, the software can be a bit shit, and people won't mind.)
I'm actually really happy that this needled the incessant Apple boosters around here. You love a company that literally refuses to fix bugs. You pay for that. Sometimes trolling has a point - a pin for the reality distortion field bubble.
Tell me, what doesn't have serious bugs in modern macOS? Is it only me who's constantly dealing with various lags and hangs? Is it only me who has to force quit the Music app at least every 24 hours?
Honestly not sure why this is downvoted. I run a multi-user setup and switching between users is a giant pain. Here is one scenario that I find really weird, at least since High Sierra.
Assume user MAIN and user WORK:
1) Open the MacBook
2) Login prompt for MAIN shows, "Switch User" button below
3) Click "Switch User", now prompt with logins for MAIN and WORK shows
4) Click and successfully login to WORK, the desktop for WORK now shows
--- Getting weird now
5) Get flashing image of desktop of user MAIN (?)
6) Get login prompt for user MAIN (??)
7) Click cancel, bounce back to login screen with MAIN and WORK (???)
8) Login to WORK again, good to go from here.
Switching between users on macOS is not just weird sometimes; it feels downright insecure when I'm able to see flashing images of MAIN's desktop while I'm logged into WORK.
Active Directory stuff is broken whenever the Mac is hibernated. Trying to use an AD Admin password to approve something almost always requires a reboot before the password window stops shaking.
I switch users quite frequently and have never seen this or anything like it. (I see some others have mentioned that Firefox might be the culprit, though, and I don't use it.)
I ditched Microsoft about 10 years ago and have been a fervent proponent of Apple ever since, but since WWDC 2019, I cannot honestly recommend Apple to a new user anymore.
There are just too damn many bugs.
In everything. Operating systems, software, services, built-in apps, even their developer tools and even in their frameworks and the Swift language itself.
I run into at least one bug literally every day. Someone could write a daily blog about this. Core features like keyboard input, text selection, AirDrop, the photo picker, iCloud Drive, etc. are erratic and unreliable. It's death by a thousand cuts. Apple is no longer the clear best, just the least worst.
I love the 16" MBP though, and restoring it to the exact state of my previous 15" MBP from a Time Machine backup was smooth and effortless. But when I tried to back up the new system, it seemed to ignore the existing backup I had just restored from, proceeding to write 300+ GB all over again and not showing the older snapshots in the UI.
Sadly, Catalina is the first incarnation of OS X/macOS that I haven't upgraded to within the first couple of weeks of it coming out. I really feel like I'm missing few benefits by not upgrading, and avoiding quite a few problems.
I have found that upgrading to macOS current - 1 with the latest update applied works well. For example, I decided to update to Mojave last month, and enjoyed my High Sierra until 2019.
Having said that, I am probably not going to upgrade to Catalina anytime soon (at least not until the macOS after Catalina gets its first major patch release).
Why? Well, my reasoning is that lots of stuff magically isn't going to recompile itself from 32-bit to 64-bit. I can probably help with that while staying on Mojave.
Same. You lose every cool program that happened to go unmaintained, and iTunes evaporates. If I do upgrade, I'm thinking about a Mojave VM for maintaining 32-bit support. The problem is I feel like Apple is throwing buckets of features into macOS without ever fixing anything and occasionally randomly breaking stuff that worked, so that the net result is always worse than before.
I can understand. Way back when I had an internship at an Apple reseller, I always found it crazy that people were strongly against upgrading to the latest release. A lot of it was because third-party software lagged behind on support. But nowadays I want my machine to just work, and I completely understand.
I thought I was doing a general software update; half an hour of waiting later and I've got Catalina now.
I have been using Time Machine for more than 10 years, I think, and am now on my third laptop.
Every time, at some point it starts clearing out a corrupted backup history due to some issue it's found with its own backup. Frankly, I don't trust it anymore; I just set it up for convenience, and every other week I start a restic backup (https://restic.net/).
I'd say Time Machine seems like one of those programs neglected by its vendor.
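For anyone curious what that periodic restic run involves, a minimal sketch (repository path and excludes are placeholders):

    restic -r /Volumes/Backup/restic-repo init      # once, to create the repository
    restic -r /Volumes/Backup/restic-repo backup ~ --exclude node_modules --exclude .venv
    restic -r /Volumes/Backup/restic-repo check     # verify repository integrity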
"Several users have reported to me that they too have experienced serious problems with Time Machine in 10.15.3, both in making first full backups and in trying to restore from existing backups. At least one of these has been reported to Apple as a bug in Time Machine, and has apparently joined several previous reports of the same problem." (My emphasis)
Desktop computers also back them up to another disk via backintime (an advanced backup tool built on rdiff).
This gives me realtime replication and two co-located backups. It's a fire-and-forget kind of setup.
I have a local external drive on a USB hub. I also have Backblaze for offsite backup. Neither one requires active management.
Relying on a cloud-based system always seems dodgy to me, since they sometimes get "confused" over which is the master.
I also put my code into Fossil for an easy-to-copy and move .db file.