this post was submitted on 10 May 2025
175 points (98.9% liked)

Selfhosted

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


I'm planning on setting up a NAS/home server (primarily storage, with some Jellyfin and Nextcloud and such mixed in), and since it's primarily for data storage I'd like to follow the 3-2-1 backup rule: 3 copies, on 2 different media, with 1 offsite. Well, actually I'm more going for a 2-1, with 2 copies and one offsite, but that's beside the point. Now I'm wondering how to do the offsite backup properly.

My main goal would be an automatic system that does full system backups at a reasonable rate (I assume daily would be a bit much considering it's gonna be a few TB worth of HDDs, which aren't exactly fast, but maybe weekly?) and then keeps 2-3 of those backups offsite at once, as a sort of version history, if possible.
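For the "automatic, weekly" part, one common approach on a Linux box is a systemd timer. A minimal sketch, assuming a hypothetical `offsite-backup.service` that wraps whatever backup script you end up with (unit names and paths are placeholders):

```ini
# /etc/systemd/system/offsite-backup.timer
[Unit]
Description=Weekly offsite backup

[Timer]
# "weekly" fires Mondays at 00:00; Persistent=true catches up
# on a missed run if the machine was off at the time.
OnCalendar=weekly
Persistent=true

[Install]
WantedBy=timers.target
```

Enabled with `sudo systemctl enable --now offsite-backup.timer`; a plain cron entry would do the same job.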

This has two components, the local upload system and the offsite storage provider. First the local system:

What is good software to encrypt the data before/while it's uploaded?

While I'd preferably upload the data to a provider I trust, accidents happen, and since they don't need to access the data, I'd prefer that they can't, maliciously or not. So what is a good way to encrypt the data before it leaves my system?
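One widely used option (which some replies below also lean on) is gpg with symmetric, passphrase-based encryption, so only someone holding the passphrase can decrypt the archive. A self-contained demo sketch; in real use `backup.tar` would be your actual archive and the passphrase file would live somewhere safe, so all the names here are placeholders:

```shell
# Create demo stand-ins for the archive and the passphrase file.
printf 'example archive contents' > backup.tar
printf 'correct horse battery staple' > passphrase.txt

# Encrypt symmetrically with AES256; --pinentry-mode loopback lets
# gpg read the passphrase from a file in batch (non-interactive) mode.
gpg --batch --yes --pinentry-mode loopback \
    --passphrase-file passphrase.txt \
    --symmetric --cipher-algo AES256 \
    --output backup.tar.gpg backup.tar

# Restore: decrypt back to a file.
gpg --batch --yes --pinentry-mode loopback \
    --passphrase-file passphrase.txt \
    --decrypt --output restored.tar backup.tar.gpg
```

The upside of the symmetric mode is that restoring only needs the passphrase, not a private-key file; the downside is that losing the passphrase loses the backups.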

What is a good way to upload the data?

After it has been encrypted, it needs to be sent. Is there any good software that can upload backups automatically at regular intervals? Maybe something that also handles the encryption part along the way?

Then there's the offsite storage provider. I'd appreciate as many suggestions as possible, as there is of course no one-size-fits-all, so if you've had good experiences with any, please do share their names. I'm basically just looking for network-attached drives: I send my data to them, leave it there, and trust it stays there, and in case more drives in my system fail than RAID-Z can handle (so two), I'd like to be able to get the data back off them after I've replaced my drives. That's all I really need from them.

For reference, this is gonna be my first NAS/server/anything of this sort. I realize it's mostly a regular computer, and I'm familiar enough with Linux that I can handle the basic stuff, but the things you wouldn't do with a normal computer are quite unfamiliar to me, so if any questions here seem dumb, I apologize. Thank you in advance for any information!

(page 3) 24 comments
[–] [email protected] 1 points 1 week ago

My automated workflow is to package up backup sources into tars (uncompressed), encrypt them with gpg, then ship the tar.gpg off to Backblaze B2 and S3 with rclone. I don't trust cloud providers, so I use two just in case. I've not really needed full system backups going offsite, rather just the things I'd be severely hurting for if my home exploded.

But to your main questions, I like gpg because you have good options for encrypting things safely within bash/ash/sh scripting, and the encryption itself is considered strong.

And I really like rclone because it covers the main cloud providers and wrangles everything down to an rsync-like experience, which is also pretty tidy for shell scripting.
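Sketched as a script, that tar → gpg → rclone workflow might look roughly like this. The remote names (`b2`, `s3`), paths, and passphrase file are all placeholders, and the remotes would be set up beforehand with `rclone config`:

```shell
#!/bin/sh
# Sketch of an automated tar -> gpg -> rclone offsite backup.

backup_offsite() {
    src=$1
    out=/tmp/backup-$(date +%F).tar.gpg

    # Tar (uncompressed) and encrypt in one pipeline, so the
    # plaintext archive never touches disk.
    tar -cf - "$src" | gpg --batch --yes --pinentry-mode loopback \
        --passphrase-file /root/.backup-passphrase \
        --symmetric --cipher-algo AES256 --output "$out"

    # Ship the encrypted archive to two independent providers.
    rclone copy "$out" b2:my-backups/
    rclone copy "$out" s3:my-backups/

    rm "$out"
}

# Example use: backup_offsite /srv/important-data
```

Dropping this in a weekly cron or systemd timer covers the "automatic, regular intervals" part of the question above.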

[–] [email protected] 1 points 1 week ago

I spend my days working on a MacBook, and have several old external USB drives duplicating my important files live from my server (Unraid) to my MacBook via Resilio (yes, I know Syncthing exists, but Resilio is easier). My offsite backups go to a Hetzner Storage Box using Duplicacy, which is amazing and supports encrypted snapshots (a cheap GUI alternative to Borgbackup).

So for me, Resilio and Duplicacy.

[–] [email protected] 1 points 1 week ago

I built a near-identical server for my parents and just sync my Nextcloud folder to theirs using Syncthing.

[–] [email protected] 1 points 1 week ago

My friend has 1G/1G internet. I have an rsync cron job backing up there twice a week.

It has an 8TB NVMe drive that I use for bulk data backup and a 2TB OS drive for VM stuff.
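The cron half of that can be as small as one crontab line; a sketch with a placeholder host and paths, assuming key-based SSH so it runs unattended:

```
# m h dom mon dow  command   (runs Mondays and Thursdays at 02:00)
0 2 * * 1,4  rsync -a --delete /srv/backups/ friend-host:/mnt/nvme/my-backup/
```

Note that plain rsync like this is a mirror, not a version history: a deletion or ransomware event syncs over too, which is why the original post's idea of keeping 2-3 dated copies offsite is worth the extra space.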

[–] [email protected] 1 points 1 week ago

I use LUKS and back up to a USB drive that I keep at home. I rsync those backups to my work once a week. Not everyone can back up to their office, but as others have said, backing up to a friend's or family member's house is doable. The nice thing about rsync is that you can limit the bandwidth, so that even though it takes longer, it doesn't saturate their internet connection.

[–] [email protected] 1 points 1 week ago

I bring one of my backup disks to my in-laws'. I go there regularly, so it's just a matter of swapping them when I'm there.

[–] [email protected] 1 points 1 week ago

LTO-8 tapes in a box elsewhere.

The price per terabyte became viable when a drive went on sale for half off at a local retailer.

Works well and it was a fun learning experience.

[–] [email protected] 1 points 1 week ago

Most of my work is with Macs, and even one server runs macOS, so for those who don't know how it works 'over there': you run Time Machine, a versioning system that keeps hourlies for a day, dailies for a week, then just weeklies after that. It accommodates multiple disks, so I have a networked drive that services all the Mac computers, and each computer also has a USB drive it connects to. Each drive usually services a couple of computers.

Backups happen automatically without interruption or drama.

I just rotate the USB drives out of the building into a storage unit once a month or so and bring the offsite drives back into circulation. Time Machine nags you about missing backup drives if it's been too long, which is great.

It's not perfect, but it's very reliable, and I wish everyone had access to a similar system. It's very easy; Apple got this one thing right.

[–] [email protected] 1 points 1 week ago (2 children)

I tend to just store all my backups off-site in multiple geographically distant locations, seems to work well

[–] [email protected] -3 points 1 week ago* (last edited 1 week ago)

Amazon AWS Glacier

Edit: I was downvoted for this, but it's genuinely a more affordable alternative to Backblaze, whose finances are questionable.
