I see many posts asking about what other lemmings are hosting, but I’m curious about your backups.

I’m using Duplicity myself, but I’m considering switching to BorgBackup once 2.0 is stable. I’ve had some problems with Duplicity: mainly, the initial sync took incredibly long, and once a few directories got corrupted (they could no longer be decrypted by GPG).

I run a daily incremental backup and send the encrypted diffs to a cloud storage box. I also use Syncthing to share some files between my phone and other devices, so those get picked up by Duplicity on those devices.
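
A daily job like that is roughly this shape (a minimal sketch; the source path, GPG key ID, and storage-box URL are placeholders, not my actual setup):

```python
#!/usr/bin/env python3
"""Sketch of a daily encrypted Duplicity run (placeholder paths, key, and target)."""
import subprocess

SOURCE = "/home/me"                                       # what to back up (placeholder)
TARGET = "sftp://user@storagebox.example//backups/home"   # cloud storage box (placeholder)
GPG_KEY = "0xDEADBEEF"                                    # GPG key used for encryption (placeholder)

# Without an explicit action, Duplicity does an incremental backup when a chain
# already exists, and a full backup otherwise.
subprocess.run(
    [
        "duplicity",
        "--encrypt-key", GPG_KEY,
        "--full-if-older-than", "1M",   # start a fresh full chain monthly
        SOURCE, TARGET,
    ],
    check=True,
)
```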

  • davad@lemmy.world

    Restic, with resticprofile for scheduling and configuration. I do frequent backups to my NAS and have a second schedule that pushes to Backblaze B2.
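
    resticprofile drives this from a profile file, but underneath it is just restic; a rough sketch of the B2 leg (bucket, credentials, paths, and retention are placeholders):

    ```python
    #!/usr/bin/env python3
    """Sketch of the restic calls a B2 schedule drives (placeholders throughout)."""
    import os
    import subprocess

    env = {
        **os.environ,
        "RESTIC_REPOSITORY": "b2:my-backup-bucket:host1",  # placeholder bucket/path
        "RESTIC_PASSWORD": "repo-passphrase",              # better: RESTIC_PASSWORD_FILE
        "B2_ACCOUNT_ID": "...",                            # placeholder credentials
        "B2_ACCOUNT_KEY": "...",
    }

    # Back up the data set, then trim old snapshots.
    subprocess.run(["restic", "backup", "/srv/data"], env=env, check=True)
    subprocess.run(
        ["restic", "forget", "--keep-daily", "7", "--keep-weekly", "4", "--prune"],
        env=env, check=True,
    )
    ```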

  • Bdking158@kbin.social

    Can anyone ELI5 or link a decent reference? I’m pretty new to self-hosting, and now that I’ve finally got most of my services running the way I want, I live in constant fear of my system crashing.

  • linearchaos@lemmy.world

    Irreplaceable media: NAS -> Backblaze, and NAS -> JBOD via Duplicacy for versioning.

    Large ISOs that can be downloaded again: NAS -> JBOD and/or NAS -> offline disks.

    Stuff that’s critical leaves the house; stuff that would just cost me a hell of a lot of personal time to rebuild just gets a copy or two.

  • Elbullazul@lem.elbullazul.com

    I run a restic backup to a local backup server that syncs most of the data (except the movie collection because it’s too big). I also keep compressed config/db backups on the live server.

    I eventually want to add a cloud platform to the mix, but for now this setup works fine

  • ipkpjersi@lemmy.one

    I usually write my own scripts with rsync for backups, since I already have my OS installs pretty much automated with scripts as well.
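
    The core of such a script is just an rsync into a dated directory; a minimal sketch (example paths), hard-linking files that haven’t changed since the previous run so each snapshot stays cheap:

    ```python
    #!/usr/bin/env python3
    """Minimal rsync snapshot script (example paths; unchanged files are hard-linked)."""
    import datetime
    import subprocess

    SRC = "/home/me/"                  # trailing slash: copy contents, not the directory itself
    DEST_ROOT = "/mnt/backup/home"     # backup target (example)
    today = datetime.date.today().isoformat()

    # Copy into a dated directory, hard-linking against the previous snapshot.
    subprocess.run(
        [
            "rsync", "-a", "--delete",
            f"--link-dest={DEST_ROOT}/latest",
            SRC, f"{DEST_ROOT}/{today}/",
        ],
        check=True,
    )
    # Point "latest" at the new snapshot for the next run.
    subprocess.run(["ln", "-sfn", f"{DEST_ROOT}/{today}", f"{DEST_ROOT}/latest"], check=True)
    ```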

  • thatsnothowyoudoit@lemmy.ca

    Large/important volumes on SAN -> B2.

    Desktop Macs -> Time Machine on SAN & Backblaze (for a few).

    BorgBackup is what we used for all our servers back when they were pets. It’s a great tool, very easy to script and use.
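
    For anyone wondering what “easy to script” means in practice, a run is basically two commands; a rough sketch with example repo URL, paths, and retention:

    ```python
    #!/usr/bin/env python3
    """Sketch of a scripted BorgBackup run (example repo URL, paths, and retention)."""
    import os
    import subprocess

    env = {**os.environ, "BORG_PASSPHRASE": "repo-passphrase"}  # placeholder; use a keyfile/prompt in practice
    REPO = "ssh://borg@backup.example/./server1"                # example repository

    # Create an archive named after the host and timestamp, then prune old archives.
    subprocess.run(
        ["borg", "create", "--stats", "--compression", "zstd",
         f"{REPO}::{{hostname}}-{{now}}", "/etc", "/srv", "/var/lib"],
        env=env, check=True,
    )
    subprocess.run(
        ["borg", "prune", "--keep-daily", "7", "--keep-weekly", "4", "--keep-monthly", "6", REPO],
        env=env, check=True,
    )
    ```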

  • Showroom7561@lemmy.ca

    All devices back up to my NAS, either in real time or at short intervals throughout the day. I use recycle bins for easy restores of accidentally deleted files.

    My NAS is set up with RAID for drive redundancy (Synology RAID) and does regular backups of active files to the cloud.

    Once a day I do a Hyper Backup to an external HDD.

    Once a month I backup to an external drive that lives offsite.

    Backups to these external HDDs have versioning, so I can restore files from multiple months ago, if needed.

    The biggest challenge is that as my NAS grows, it costs significantly more to expand my backup space. Cloud storage and new external drives aren’t cheap. If I had an easy way to keep a separate NAS offsite, that would considerably reduce ongoing costs.

    • homelabber@lemmy.one

      Depending on how much storage you need (>30 TB?), it may be cheaper to use a colocation service for a server as an offsite backup instead of cloud storage. It’s not as safe, but it can be considerably cheaper, especially if for some reason you’re forced to rapidly download a lot of your data from the cloud backup (Backblaze B2 charges $0.01/GB downloaded).

      • Showroom7561@lemmy.ca

        Do you have an example or website I could look at for this ‘colocation service’?

        Currently using IDrive as the cloud provider, which is free until the end of the year, but I’m not locked into their service. Cloud backups really only see the more active files (<7 TB), and the unchanging stuff like my movie and music catalogues seems reasonably safe on offsite HDD backups, so I don’t have to pay just to keep those somewhere else.

        • homelabber@lemmy.one

          First, I’d like to apologize: I originally wrote less than 30 TB instead of more than 30 TB; I’ve changed that in the post.

          A colocation is a data center where you pay a monthly price and they’ll house your server (electricity and internet bandwidth are usually included, within certain limits; if you need more you can always pay extra).

          Here’s an example. It’s usually around $99/99€ per 1U server. If you live in or near a big city, there’s probably at least one data center nearby that offers colocation services.

          But as I said, it’s only worth it if you need a lot of storage or you move files around a lot, because bandwidth charges when using object storage tend to be quite high.

          For <7 TB it isn’t worth it, but maybe in the future.

  • KitchenNo2246@lemmy.world

    I use BorgBackup, plus Zabbix for monitoring.

    At home, I have all my files backed up to rsync.net, since their price is lower for Borg repos.

    At work, I have a dedicated backup server running BorgBackup that pulls backups from my servers and stores them locally as well as uploading them to rsync.net. The local copy means restoring is faster, unless of course that server dies.

  • rambos@lemmy.ml

    Am I the only one using Kopia? :)

    I’m quite new to self-hosting and backups. I went with Duplicati and it was perfect, but I heard bad stories about it, so now I use Kopia for daily backups to another drive and also to B2. Duplicati still does daily backups, but only of a few important folders, to Google Drive.

    I’ve heard only good stories about Kopia, yet no one has mentioned it.
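
    For reference, a Kopia run against the second drive is basically just this (example paths; the B2 side is handled the same way through Kopia’s B2 backend):

    ```python
    #!/usr/bin/env python3
    """Sketch of a Kopia snapshot to a local-drive repository (example paths)."""
    import subprocess

    REPO_PATH = "/mnt/backup-drive/kopia-repo"   # repository on the second drive (example)
    SOURCE = "/home/me"                          # what to snapshot (example)

    # Connect to the (already created) repository, take a snapshot, then disconnect.
    # The repository password comes from a prompt or the KOPIA_PASSWORD environment variable.
    subprocess.run(["kopia", "repository", "connect", "filesystem", "--path", REPO_PATH], check=True)
    subprocess.run(["kopia", "snapshot", "create", SOURCE], check=True)
    subprocess.run(["kopia", "repository", "disconnect"], check=True)
    ```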

  • Gerowen@lemmy.world

    I have an external hard drive that I keep in the car. I bring it in once a month and sync it with the server. The data partition is encrypted so that even if it were to get stolen, the data itself is safe.

    • bernard@lemmy.film

      I have a similar 3-2-1 strategy, without using someone else’s server or needing to traverse the internet. I keep my drive in the pool shed, since if my house were to blow up or get robbed, the shed would probably be fine.

      • Gerowen@lemmy.world

        I have a shed I built a year or two ago, but it’s about 100 feet from the house with no electricity run to it. I’ve considered running power and Ethernet out there and connecting those drives to a Raspberry Pi. That way I could rsync my backups over SSH to an “off-site” (i.e., not in the same building) backup on a more regular basis, and also not have to worry about the potential damage from hauling them around in a car all the time.
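
        The sync itself would be nothing fancy; roughly this (made-up hostname and paths):

        ```python
        #!/usr/bin/env python3
        """Sketch: push backups to a Pi in the shed over SSH (made-up host and paths)."""
        import subprocess

        subprocess.run(
            ["rsync", "-az", "--delete", "-e", "ssh",
             "/mnt/backups/", "pi@shed.local:/mnt/usb-drive/backups/"],
            check=True,
        )
        ```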

  • DawnOfRiku@lemmy.world

    Personal files: Syncthing between all devices and a TrueNAS Scale NAS. TrueNAS does snapshots 4 times a day, with a retention policy of 30 days. From there, a nightly sync to Backblaze B2 happens, also with a 30 day retention policy. Occasional manual backups to external drives too.

    Homelab/Servers: Proxmox VM and LXC container exports go to TrueNAS nightly, with a retention policy of 7 days. A separate weekly export goes to another TrueNAS share, which gets synced to B2 weekly with a retention policy of 30 days. Also gets occasional external drive backups.

  • dan@upvote.au

    I use BorgBackup 1.2.x. It works really well and is significantly faster than Duplicity. Borg uses block-level deduplication instead of incremental backups, so the backup won’t grow indefinitely like it does with Duplicity (this is why you have to periodically do a full backup with Duplicity). The Borg server has an “append-only” mode, meaning the client can only add data to the backup and not remove it. This is useful because if an attacker were to gain access to the client, they couldn’t delete all your backups. That’s a common issue with other backup systems: the client has full access to the backup, so there’s nothing stopping an attacker from erasing the client system plus all its backups.

    For storing the backups, I have two storage VPSes: one with HostHatch in Los Angeles ($10/month for 10 TB of space) and one with Servarica in Montreal, Canada (3.5 TB of space for $84/year).

    Each system being backed up performs the backup twice, once to each VPS. BorgBackup recommends this approach over performing one backup and then rsyncing it to a different server. The idea is that if one backup gets corrupted (or deleted by an attacker, etc.), the other should still be OK, as it’s entirely separate.
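
    In practice that just means running borg create against both repos in sequence; a rough sketch with placeholder repo URLs and paths:

    ```python
    #!/usr/bin/env python3
    """Sketch: run the same Borg backup against two independent repositories (placeholder URLs)."""
    import os
    import subprocess

    REPOS = [
        "ssh://borg@la-vps.example/./backups/host1",        # first storage VPS (placeholder URL)
        "ssh://borg@montreal-vps.example/./backups/host1",  # second storage VPS (placeholder URL)
    ]
    PATHS = ["/etc", "/home", "/srv"]
    env = {**os.environ, "BORG_PASSPHRASE": "repo-passphrase"}  # placeholder

    for repo in REPOS:
        # Two fully independent backups, rather than one repo rsynced to a second server.
        subprocess.run(
            ["borg", "create", "--stats", f"{repo}::{{hostname}}-{{now}}", *PATHS],
            env=env, check=True,
        )
    ```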