I’m planning on setting up a NAS/home server (primarily storage, with some Jellyfin, Nextcloud and such mixed in), and since it is primarily for data storage I’d like to follow the 3-2-1 backup rule: 3 copies, on 2 different media, with 1 offsite. Well, actually I’m more going for a 2-1 with 2 copies and one offsite, but that’s beside the point. Now I’m wondering how to do the offsite backup properly.

My main goal would be an automated system that does full backups at a reasonable interval (I assume daily would be a bit much considering it’s going to be a few TB worth of HDDs, which aren’t exactly fast, but maybe weekly?) and then keeps 2-3 of those backups offsite at once as a sort of version control, if possible.

This has two components, the local upload system and the offsite storage provider. First the local system:

What is good software to encrypt the data before/while it’s uploaded?

While I’d preferably upload the data to a provider I trust, accidents happen, and since they don’t need to access the data, I’d prefer that they simply can’t, maliciously or otherwise. So what is a good way to encrypt the data before it leaves my system?
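From what I’ve gathered so far, tools like restic or borg encrypt everything client-side before it ever touches the remote, so something roughly like this is what I’m picturing (purely a sketch on my part; restic, the Backblaze B2 backend, the bucket name and paths are all placeholders, not a settled choice):

```
# Credentials for the (hypothetical) B2 bucket the encrypted backups would live in
export B2_ACCOUNT_ID="..."
export B2_ACCOUNT_KEY="..."

# One-time setup: creates the repository and sets its encryption password.
# Everything restic later writes to the bucket is encrypted locally first.
restic -r b2:my-nas-backups:server init
```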

What is a good way to upload the data?

After it has been encrypted, it needs to be sent. Is there any good software that can upload backups automatically at regular intervals? Maybe something that also handles the encryption part on the way?
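Something like a weekly job driving the same hypothetical restic repository as above is roughly what I have in mind (again just a sketch; the script path, data path and retention numbers are made up):

```
#!/bin/sh
# /usr/local/bin/offsite-backup.sh (hypothetical) - run weekly via cron or a systemd timer
export B2_ACCOUNT_ID="..."
export B2_ACCOUNT_KEY="..."
export RESTIC_REPOSITORY="b2:my-nas-backups:server"
export RESTIC_PASSWORD_FILE="/root/.restic-password"

# Upload a new encrypted snapshot of the data pool (incremental after the first run)
restic backup /srv/data

# Keep only the last 3 weekly snapshots offsite, prune everything older
restic forget --keep-weekly 3 --prune
```

paired with something like `0 3 * * 0 root /usr/local/bin/offsite-backup.sh` in /etc/cron.d/ to run it Sunday nights.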

Then there’s the offsite storage provider. Personally I’d appreciate as many suggestions as possible, as there is of course no one-size-fits-all, so if you’ve got good experiences with any, please do send their names. I’m basically just looking for network-attached drives: I send my data to them, I leave it there and trust it stays there, and in case more drives in my system fail than RAID-Z can handle (so two), I’d like to be able to get the data back off there after I’ve replaced my drives. That’s all I really need from them.

For reference, this is gonna be my first NAS/Server/Anything of this sort. I realize it’s mostly a regular computer and am familiar enough with Linux, so I can handle that basic stuff, but for the things you wouldn’t do with a normal computer I am quite unfamiliar, so if any questions here seem dumb, I apologize. Thank you in advance for any information!

  • Matt The Horwood@lemmy.horwood.cloud · 19 points · 1 day ago

    There are some really good options in this thread. Just remember that whatever you pick, unless you test your backups, they are as good as not existing.

    • Showroom7561@lemmy.ca · 2 points · 16 hours ago

      How does one realistically test their backups, if they are doing the 3-2-1 backup plan?

      I validate (or whatever the term is) my backups once a month, and trust that it means something 😰

      • Matt The Horwood@lemmy.horwood.cloud · 3 points · 14 hours ago

        Until you test a backup it’s not complete; how you test it is up to you.

        If you upload to a remote location, pull it down and unpack it. Check that you can open important files; if you can’t open them, then the backup is not worth the disk space.
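        As a rough sketch of what that pull-down-and-check could look like (using restic purely as an example; swap in whatever tool and paths you actually use):

        ```
        # Pull the latest snapshot into a scratch directory
        restic -r b2:my-nas-backups:server restore latest --target /tmp/restore-test

        # Open a few important files by hand, then optionally have restic
        # re-read and verify a portion of the repository data itself
        restic -r b2:my-nas-backups:server check --read-data-subset=1/10
        ```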

      • Appoxo@lemmy.dbzer0.com · 3 points · 15 hours ago

        Deploy the backup (or some part of it) to a test system. If it can boot or you can get the files back, they work.

        • Showroom7561@lemmy.ca · 1 point · 7 hours ago

          For context, I have a single Synology NAS, so recovering and testing the entire backup set would not be practical in my case.

          I have been able to test single files or entire folders and they work fine, but obviously I’d have no way of testing the entire backup set due to the above consideration. It is my understanding that the verify feature that Synology uses is to ensure that there’s no bit rot and that the file integrity is intact. My hope is that because of how many isolated backups I do keep, the chance of not being able to recover is slim to none.

    • dave@hal9000@lemmy.world · 5 points · 21 hours ago

      Is there some good automated way of doing that? What would it look like, something that compares hashes?

      • sugar_in_your_tea@sh.itjust.works · 2 points · 7 hours ago (edited)

        I don’t trust automation for restoring from backup, so I keep the restoration process extremely simple:

        1. automate recreating services - have my podman files in a repository
        2. manually download and extract data to a standard location
        3. restart everything and verify that each service works properly

        Do that once/year in a VM or something and you should be good. If things are simple enough, it shouldn’t take long (well under an hour).
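        In practice that boils down to something like this (repo URL, paths and port are made up, and I’ve written it with podman-compose here; adjust for however you actually run your containers):

        ```
        # 1. Recreate the services from the repository holding the podman files
        git clone https://example.com/me/homelab-services.git
        cd homelab-services && podman-compose up -d

        # 2. Manually fetch the latest data backup and extract it to the standard location
        tar -xzf /mnt/restore/data-backup.tar.gz -C /srv/data

        # 3. Restart everything and check that each service responds
        podman-compose restart
        curl -fsS http://localhost:8096/   # e.g. Jellyfin's default web port
        ```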

      • Matt The Horwood@lemmy.horwood.cloud · 3 points · 14 hours ago

        That very much depends on your backup of choice; that’s also the point. How do you recover your backup?

        Start with a manual recovery: pull down a backup and unpack it, and check that important files open. Write down all the steps you did, then look at how you can automate them.
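        If you want the hash-comparison idea from above, a rough sketch (paths made up) could be:

        ```
        # Before backing up: record checksums of the live data
        cd /srv/data && find . -type f -exec sha256sum {} + > /tmp/manifest.sha256

        # After restoring somewhere else: verify every file against that manifest
        cd /tmp/restore-test/srv/data && sha256sum -c --quiet /tmp/manifest.sha256
        ```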