Readit News
wbkang · 3 years ago
I have spent a lot of time trying out backup solutions, and I feel strongly enough about this one to warn others away from it. As other commenters mentioned, Duplicati is pretty unstable. Over many years I was never even able to finish the initial backup (less than 2 TB) on my PC. If you pause an ongoing backup, it never actually works again.

I'd use restic or duplicacy if you need something that works well both on Linux and Windows.

Duplicati's advantage is its nice web UI, but if the core features don't work... that's not very useful.

mosselman · 3 years ago
Also can't recommend Duplicati. I never got it to work despite sinking many hours into it with different storage options. Not even a local disk worked.

Instead, I'd recommend Arq backup.

jsmith99 · 3 years ago
It seems hard to find a universal recommendation. I've heard good things about Arq, although it didn't work well for me personally; ironically, Duplicati did, though I'm currently using Restic.
syntheticnature · 3 years ago
I have had similar experiences. I could not get a non-corrupt backup from one machine; it would repeatedly ask me to regenerate the local database from the remote, which never succeeded. Oddly, another machine never seemed to have an issue, but that's not an argument in favor of using the software. It's possible there are "safe" versions, but I had no way to identify them (all the releases I used were linked from the homepage).
phcreery · 3 years ago
I had a similar experience with Duplicati. I attempted a 2TB backup of my NAS to cloud storage; it got up to ~500GB and would just hang there.

I switched to restic and recommend it over Duplicati.

quaffapint · 3 years ago
Just another data point... I've been using it for about a year and a half against 1TB, storing encrypted backups to Backblaze B2. I've tested restoring, and so far it's been very stable.
malfist · 3 years ago
Just to balance this: I use Duplicati for both my web server, where I host client websites, and my personal home NAS.

I've had to restore from it multiple times and have never had an issue. It's saved my ass multiple times; it's always been set-it-and-forget-it until I remember I need it.

michaelcampbell · 3 years ago
Never tried Duplicati, but restic + B2 has been great as "a different choice", and for my use case of backing up a variety of OSes (Windows, Mac, and various Linux distros, anyway), it's worked great.
PenguinCoder · 3 years ago
Restic and B2 "just work". Works how I expect it to, and restores what I expect it to. Not amazingly fast in backups or restorations, but it works reliable for me. I have restic running on everything from workstations and laptops, (~200G each), to servers (500G-2TB) to a mini 'data hoard' (25TB+) level of backups, and its been doing great on each.

I did not like Duplicati, and could not trust it to finish backups or restore from them.

Shank · 3 years ago
I'll throw a +1 in for Duplicacy too. I think I'm backing up something like 8TB to Wasabi using it and it's excellent in terms of de-duplication.
alyandon · 3 years ago
I had a very similar experience with Duplicati on a backup set that was small disk-space-wise but had a very large number of files, which bloated the sqlite data store.

I use Urbackup to back up Windows and Linux hosts to a server on my home network and then use Borg to back that up for DR. I'm currently in the process of testing Restic now that it has compression and may switch Borg out for that.

magnetic · 3 years ago
What does restic offer that borg doesn't?

I've been using borg for a while (successfully, with Vorta as the UI on Mac) and am curious to learn if there's something I've been missing that restic provides.

rekabis · 3 years ago
How strange. I have been backing up my own computers (4) and those of my family (another 3) using Duplicati for over three years now, and aside from the very rare full-body derp that required a complete dump of the backup profile (once) and a rebuild of the remote backup (twice), it’s been working flawlessly. I do test restores of randomly chosen files at least once a year, and have never had an issue.

Granted, the backup itself errors out on in-use files (and just proceeds to the next file), but show me a backup program that doesn’t. Open file handles make backing up rather hard for anything that needs to obey the underlying operating system.

Ayesh · 3 years ago
I've been using Duplicati 2 for about a month now to try it out, and it has worked flawlessly for me, except for occasional time-outs of the web UI. I only back up local directories, and the destinations I've tried include an external drive over USB, Google Drive, and an SSH connection.

I'm using it to back up a Firefox profile while I'm using Firefox. It backed up active files even as they were being written! I'm also using it to back up a Veracrypt container file (a single 24GB file), and incremental backups worked quite well too.

Thanks for the words of advice, I will keep testing longer before I make the switch.

mekster · 3 years ago
Agreed, Duplicati is quite immature.

I've looked around quite a bit too, but did you actually use restic and duplicacy?

Both ate my RAM quite heavily; on data sets that weren't even that large, they exhausted memory and froze the machine, so I stopped using them a year or so ago.

I've settled on Borg and ZFS as backup solutions (it's better to run multiple reliable, independent implementations). The latter is quite fast because, as a file system, it already knows what changed between incremental backups, unlike utilities that must scan the entire data set to figure out what changed since the last run.

You can run a 1GB-memory instance and attach HDD-based block storage (far cheaper), such as on Vultr or AWS, for a cheap remote ZFS target. Ubuntu gets ZFS running easily by simply installing the zfsutils-linux package.
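
A minimal sketch of that incremental flow (pool, dataset, and host names are illustrative):

  # one-time: full send of the first snapshot
  zfs snapshot tank/data@base
  zfs send tank/data@base | ssh backup-host zfs receive backup/data

  # later runs: send only the blocks that changed between snapshots
  zfs snapshot tank/data@today
  zfs send -i tank/data@base tank/data@today | ssh backup-host zfs receive backup/data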

If you need large space, rsync.net gives you a ZFS target at $0.015/GB, but with a 4TB minimum commitment. It's also a good target for Borg at the same price, with a 100GB minimum yearly commitment. Hetzner Storage Box and BorgBase seem good for that too.

Saris · 3 years ago
If you use restic/kopia, how are you managing scheduling and failure/success reporting together?

That's one thing I can't quite figure out with those solutions. I know there are scripts out there (or I could write my own), but that seems error-prone and could result in failed backups going unnoticed.

dividuum · 3 years ago
You could use one of those services that expects a regular HTTP heartbeat. I'm personally using UptimeRobot for that. Within a .bat or .sh file, add a

  restic [...] && curl <heartbeat-url>
and you'll eventually get notified if backup jobs fail too often.
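
A slightly fuller sketch of the same pattern (the repository path and heartbeat URL are placeholders):

  #!/bin/sh
  # Run the backup; ping the heartbeat URL only if restic exits successfully.
  # If the ping stops arriving, the monitoring service alerts after its grace period.
  restic -r /srv/restic-repo backup /home /etc \
    && curl -fsS "https://heartbeat.example.com/ping/nightly-backup"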

wbkang · 3 years ago
Yeah I had to invent my own.

On Linux I used cron + email. You can set up postfix so it relays through your personal Gmail or whatever, and then `echo message | mail -s "subject" you@example.com` will send an email. They (the big email providers) always allow you to send an email as yourself to yourself.
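
For example, a minimal cron entry along those lines (the wrapper script and address are hypothetical):

  # /etc/cron.d/backup -- run nightly at 02:30 and mail the output as the report
  30 2 * * * root /usr/local/bin/backup.sh 2>&1 | mail -s "nightly backup" you@example.com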

On Windows, I used the native Task Scheduler (with various triggers like time, lock workstation, idle, and so on) and sent an email using PowerShell, which can also send emails via SMTP.

antx · 3 years ago
Yeah, I scripted my backup jobs and use good old email notifications for reporting.

I expect an email every day. If I don't receive one, I know there's a problem, whether with the backup or with email delivery.

npteljes · 3 years ago
I read that Duplicati has also been in beta for years now, and that really seems discouraging. Restic looks great, but it's also only at 0.14 at the moment. Would you consider restic a stable product despite the version number?
proactivesvcs · 3 years ago
Restic's versioning doesn't denote that it's not production-ready: it absolutely is. Stable, reliable and developed thoughtfully, with data integrity and security in mind. I highly recommend it.
mschulkind · 3 years ago
I've used restic for years now without issue. I'd definitely consider it stable.

I started with duplicacy and moved to restic.

ajsnigrutin · 3 years ago
To me, it shows "beta" and "not supported" as the options... so it's hard to choose :)
m3nu · 3 years ago
Yes, it's stable. They even added compression this year. We just added support for Restic on BorgBase.com. We'll have more user feedback in a few months, but the first tests and benchmarks are pretty encouraging.
aborsy · 3 years ago
Restic is rock solid. I have backed up TBs of server data with it. It never failed.

Encryption is properly implemented.

donio · 3 years ago
I've been using it since 2018, no issues so far.
PYTHONDJANGO · 3 years ago
Even if it comes late, this warning has to be issued: restic still has serious problems writing to Samba shares. To the authors' credit, the manual clearly tells you about it:

On Linux, storing the backup repository on a CIFS (SMB) share is not recommended due to compatibility issues.

There seems to be some deeper system-level problem with Go concurrency:

https://github.com/restic/restic/issues/2659
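
If you're stuck with a CIFS mount, the mitigation that comes up in that issue thread is disabling Go's async preemption. This is a reported workaround, not an official fix:

  # run restic with Go async preemption disabled (workaround discussed in the issue above)
  GODEBUG=asyncpreemptoff=1 restic -r /mnt/smb-share/restic-repo backup /home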

RockRobotRock · 3 years ago
I agree. I really liked the interface and gave it a go at least 3 or 4 times, and got burned every single time with errors or random issues.
remram · 3 years ago
Duplicacy seems to upload every chunk as a separate object/file, which is great for deduplication but bad for your cloud bill (S3 providers usually charge for PUT requests). There's a reason everybody else packs up chunks.
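
As a rough back-of-envelope (assuming Duplicacy's default ~4MB average chunk size and S3's ~$0.005 per 1,000 PUT requests; verify both for your setup): 1TB of unique data is ~250,000 chunks, so about $1.25 in PUT fees for the initial upload alone. That sounds small, but the same per-request pricing applies to every incremental run and to list operations, which is where packing chunks into larger objects saves real money.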
actuallyalys · 3 years ago
I had a mixed experience. I've been able to successfully restore backups (the most important thing), but I frequently had to fix database issues, which makes the backup less seamless (perhaps the second most important thing).
emptysands · 3 years ago
Duplicacy has worked well for several years on both my wife's and mother's laptops. Doesn't require much work and just keeps operating.
Gazoche · 3 years ago
Adding to the chorus: I like Duplicati's web UI but found it buggy and unstable, which are definitely not things you want in a backup system.
fomine3 · 3 years ago
In my experience, Duplicacy is the most stable backup software of the Dupli* family. I won't say it's rock solid, but it mostly works.
alexktz · 3 years ago
Totally agree with this. It's a hot mess tbh, and very unreliable. As suggested, restic (with autorestic as a wrapper) is a great replacement.
ciupicri · 3 years ago
It's hard to see restic as a Duplicati replacement when there's no official documentation about backing up data via SFTP on Windows.
aborsy · 3 years ago
What do you mean? It’s just “sftp” in front of the repository name!

And SFTP is SFTP, regardless of the OS.
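
For illustration (the host and path here are placeholders), it works the same from Windows:

  # initialize the repository over SFTP, then back up a Windows directory
  restic -r sftp:user@backup-host:/srv/restic-repo init
  restic -r sftp:user@backup-host:/srv/restic-repo backup C:\Users\me\Documents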

ajvs · 3 years ago
I too had huge problems with Duplicati restoring. Switched to Borg, using Vorta as the GUI and am much happier.
stuckkeys · 3 years ago
I did use it. It worked 90% of the time. I backed up to OneDrive. I just ended up getting Veeam.
KennyBlanken · 3 years ago
I strongly advise people not to rely on Duplicati. Throughout its history, it's had a lot of weird, fatal problems that the dev team has shown little interest in tracking down, while there is endless interest in chasing yet another storage provider or other shiny things.

Duplicati has been in desperate need of an extended feature freeze and someone to comb through the forums and GitHub issues looking for critical archive-destroying or -corrupting bugs.

"If you interrupt the initial backup, your archive is corrupted, but silently, so you'll do months of backups, maybe even rely upon having those backups" was what made me throw up my hands in disgust. I don't know if it's still a thing; I don't care. Any backup software that allows such a glaring bug to persist for months if not years has completely lost my trust.

In general there seemed to be a lot of local database issues where it could become corrupted and you'd have no idea; worse, a lot of situations seemed to be unrecoverable, since even doing a rebuild based off the 'remote' archive would error out or otherwise not work.

The duplicati team has exactly zero appreciation for the fact that backup software should be like filesystems: the most stable, reliable, predictable piece of software your computer runs.

Also, users should be aware that Duplicati assembles each archive object on the local filesystem: that means extra write churn on SSDs, and on spinning rust it significantly impacts performance.

Oh, and the default archive object size is comically small for modern-day usage and will cause significant issues if you're not using object storage (say, a remote directory). After just a few backups of a system with several hundred GB, you could end up with a "cripples standard Linux filesystem tools" number of files in a single directory.

And of course, there's no way to switch or migrate object sizes...

t_sawyer · 3 years ago
I had a terrible experience too. The UI is incredibly slow, and personally I had issues where the "local db" constantly had to be repaired. The tool is just buggy and doesn't work well IMO.

FWIW: I ran it on 3 separate Windows PCs for around 6 months without any real luck getting it to work consistently.

Tepix · 3 years ago
This looks interesting; thanks for all those warnings, I will stay away from it for now.

However, the next question is always which cloud provider to use.

Is OVH Cloud Archive the cheapest cloud storage for backups in Europe? It lets me use scp or rsync, among others.

They charge(§) $0.011/GB for traffic and $0.0024/GB/month for storage.

So if my total backup is 100GB and I upload 5GB per day of incremental backups, I pay around $2 per month (storage: 100GB × $0.0024 ≈ $0.24; traffic: ~150GB × $0.011 ≈ $1.65).

--

§ https://www.ovhcloud.com/en/public-cloud/prices/#473

rsync · 3 years ago
"Is OVH cloud archive the cheapest cloud storage for backups in europe? It lets me use scp or rsync, among others."

OVH may, indeed, be the cheapest.

If you email[1] and ask for the long-standing "HN Reader Discount" you can get $0.01/GB storage and free usage/bandwidth/transfer.

Zurich Equinix ZH4 on init7 pipes.

Depending on your preference either [2] or [3] may be the most compelling aspect of our service.

[1] info@rsync.net

[2] https://news.ycombinator.com/item?id=26960204

[3] https://www.rsync.net/products/universal.html

marceldegraaf · 3 years ago
Does this discount also apply to the raw ZFS plans at rsync.net? Looking for a reliable and cost efficient place to push my ZFS snapshots via “zfs send”.
Tepix · 3 years ago
Great offer, thanks. Is there an open source backup software you recommend to your clients for encrypted backups?
asmor · 3 years ago
It's pretty hard to beat Hetzner Storage Boxes, if you can live with the fixed provisioning (beyond being able to switch between the tiers).

https://www.hetzner.com/storage/storage-box

ur-whale · 3 years ago
> It's pretty hard to beat Hetzner Storage Boxes

They had a recent change in pricing ... did you take that into account?

jacooper · 3 years ago
The data resiliency is pretty weak. It's only a single RAID cluster away from losing data.
aborsy · 3 years ago
Does Hetzner have a service that might work for ZFS receive (beyond a dedicated server)?
jacooper · 3 years ago
Far from it, really.

Backblaze is much cheaper and can have free egress when used with Cloudflare.

There is also Storj, a decentralized storage network; it gives you 150GB for free, plus $4/TB with free egress matching what you have stored.

Another one is IDrive E2, at $4/TB, with the first year costing the same as a single month and free egress up to about three times the size of what's stored.

Hetzner's storage boxes are pretty cheap, but that is for a reason. The upload speed is pretty slow outside Hetzner's network (in my experience), and more importantly, the data is only protected by a single RAID cluster. They do offer free unlimited egress, though.

But I would personally go with Backblaze or maybe IDrive.

Tepix · 3 years ago
Sorry, but I asked for a European offering; Backblaze is a US company, as is IDrive. I should have been less ambiguous when I wrote "in Europe".
pmontra · 3 years ago
Or a small computer with a disk at a friend's home, and back up to that. It's cheaper than cloud after one or two years (though always less reliable), network speed is probably OK, and you get physical access. If the friend is a techie, it could be one among many other little computers in that home. You can reciprocate by hosting his/her backup at your home.
GekkePrutser · 3 years ago
Yeah, this is what I do... one at a friend's house in his rack, the other elsewhere with an external drive on a Raspberry Pi Zero 2 :P

The good thing is you can add more storage. The bad thing is no enterprise-class guarantees, of course. But having multiple copies mitigates that.

Tepix · 3 years ago
That's a charming idea. The question is how far away your friend lives: if it's too far, the upstream bandwidth of residential internet can be a problem during a restore.

thesimon · 3 years ago
Cheap and dirty: an Office 365 Family plan with 6 accounts at 1TB each for around $60/year.
shellfishgene · 3 years ago
Seems to be $100 a year now.
Phelinofist · 3 years ago
With me being an IT person, my landlord asked me for recommendations for doing backups. Some googling revealed Duplicati and we gave it a go. Installation + configuration was easy and the features were sane. That was like 6-7 years ago, and it is still running without issue (AFAIK ^^)
patentatt · 3 years ago
Have you tested restores? The problem I had with Duplicati was that restoring from a backup eventually took exponentially longer, to the point of never finishing. Maybe it would have finished eventually, but I can't wait multiple days to restore one file. It's possible it was an error or problem on my end, and this was a couple of years ago, so YMMV.
bkuhns · 3 years ago
I'm a new user of Duplicati and so far so good, but what you describe sounds like the biggest issue with their original storage mechanism (a full backup plus a huge chain of incrementals). The new mechanism would likely fix your concern completely. There's a brief description of how it now works on their website: https://www.duplicati.com/articles/Storage-Engine/
rekabis · 3 years ago
The one full-backup restore I did on my wife's system - after her MacBook Air decided to fry its storage (it was obsolete anyhow) - went perfectly. 23GB of personal files (she's not the data pack rat I am) came streaming back down inside of 20 hours. And we were on a much slower connection at the time, certainly not the symmetrical gigabit we have now.
Phelinofist · 3 years ago
Yes, we did test that, and it worked reasonably fast (backing up to an external USB SSD).
michaelcampbell · 3 years ago
> running without issue (AFAIK ^^)

If you don't know, then it's not working. At least that should be your stance on backups.

macropin · 3 years ago
Not to be confused with the duplicity or duplicacy backup programs, which have similar features.
ahnick · 3 years ago
Duplicacy has been incredibly stable for me over the years, and I still prefer its lock-free deduplication design. Looks like there was a major release 28 days ago as well. Time to upgrade. :)

https://github.com/gilbertchen/duplicacy

patentatt · 3 years ago
Agreed, duplicacy seems to be more resilient to the inevitable errors and hiccups along the way. The only downside is that it seems to be inefficient with storage for small metadata updates, which happen frequently in my use case.
Normille · 3 years ago
Another happy long-term Duplicacy user here. My only problem with it is; on the rare occasions I need to restore something from backup, I can never remember the correct syntax and always have to look it up again.
jacooper · 3 years ago
Note: it's not open source. Duplicati is.
lepapillon · 3 years ago
They have some major differences. Enough so that I first tried Duplicati and ran into corruption issues so frequently that I sought out an alternative and luckily found Duplicacy.

Duplicacy has been stable for years now, and I gladly pay for the commercial license. It seemed like Duplicacy constructs a giant DB of all the files and manages everything that way, whereas Duplicacy's approach is much simpler and less prone to corruption. The large-DB approach seems to fail when the backup set contains the large number of files that many users have.

rovr138 · 3 years ago
> It seemed like Duplicacy

Duplicati?

----

These names are always a mess. Half the time I give up comparing these tools because I can't keep the names straight.

bobek · 3 years ago
Just use restic and rclone and be done with it.

https://bobek.cz/blog/2020/restic-rclone/
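
For illustration, restic can use any configured rclone remote as a backend (the remote and bucket names here are placeholders):

  # assumes an rclone remote named "gdrive" has already been configured
  restic -r rclone:gdrive:restic-backups init
  restic -r rclone:gdrive:restic-backups backup ~/documents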

bakugo · 3 years ago
I occasionally use restic, but one thing I don't like about it is the sheer number of data files it creates (45k for ~800GB in my case), which makes it a pain to use with certain cloud storage providers that don't always handle tens of thousands of files very well (gdrive being a good example).

Is there some way to get it to not make as many files?

dimatura · 3 years ago
I've used restic with the Backblaze and S3 backends; it works pretty well for me. The newest version also has compression on top of deduplication, like borg, which is nice. (Of course, it will only make a difference for compressible data: most images or videos won't compress, but, say, JSON will.)
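
A minimal sketch of using that (restic >= 0.14 with a version-2 repository; the repo path is a placeholder):

  # version-2 repositories support compression
  restic -r /srv/restic-repo init --repository-version 2
  # back up with maximum compression (the default mode is "auto")
  restic -r /srv/restic-repo backup --compression max /data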
antx · 3 years ago
I dropped Duplicati after its database got corrupted irreversibly. Also, recoveries always took very long.

I now use restic and I'm very happy. I find it to be very resilient. No more database, only indexes and data packs, which can be repaired.

8bitbuddhist · 3 years ago
Same. Database corruption hit me after ~1.5 years, and I could never figure out what the cause was or how to fix it. Which is a shame, because Duplicati looks like a great open source project with a lot of dev time and effort invested in it. But when it comes to backup software, your core functionality had better work reliably, and Duplicati just isn't there. I have since switched to Duplicacy and couldn't be happier.
willriches · 3 years ago
If you plan to use Duplicati, please pay attention to the docs around block size. We used it to back up a couple hundred GB of data to S3, and recovery was going to take over 3 days to reassemble the blocks at the default 100KB block size. For most applications you will want at least 1MB, if not more.

Otherwise a good product and has been reliable enough for us.

* https://duplicati.readthedocs.io/en/latest/appendix-c-choosi...
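
For example, with Duplicati's command-line client (the target URL and paths are placeholders; note the block size cannot be changed after the first backup):

  # larger blocks and remote volumes; credentials omitted
  duplicati-cli backup s3://my-bucket/backup /data --blocksize=1MB --dblock-size=200MB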

jacooper · 3 years ago
Thanks for the note!