YonatanAvhar,

At the moment I’m doing primarily hopes and prayers

Llamajockey,

I had to upgrade to Hopes&Prayers+ after I ran out of hope and my prayers kept getting returned to sender.

webjukebox,

I was in the same boat, until my prayers went unanswered and my hopes died.

I lost some important data from my phone a few days ago. My plan was to back up that night, but chaos struck that same morning.

PlutoniumAcid,

Ah yes, the ostrich algorithm.

0110010001100010,

Local backup to my Synology NAS every night, which is then replicated to another NAS at my folks’ house through a secure VPN tunnel. Pretty simple and easy to deploy.

Greidlbeere,

Sounds good. What do you use for replication?

neardeaf,

Most likely Hyper Backup & Hyper Backup Vault, two applications built into Synology’s DSM software that run on their NAS devices.

0110010001100010,

Just plain old rsync. The NAS at the far end is an old QNAP I had lying around.
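For anyone wondering what a NAS-to-NAS rsync replication like this looks like, it can be a single cron line. A sketch, with placeholder paths, user, key, and hostname (not the actual setup):

```shell
#!/bin/sh
# Nightly replication of the local backup share to the far-end NAS over
# SSH (through the VPN tunnel). -a preserves permissions and timestamps,
# -H keeps hard links, --delete mirrors removals so the far end stays an
# exact copy of the source.
rsync -aH --delete \
      -e "ssh -i /root/.ssh/replication_key" \
      /volume1/backups/ \
      backup@far-end-nas:/share/backups/
```

rsync only transfers what changed since the last run, so a nightly job over a home uplink stays cheap.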

Wenny,

I usually use HandBrake, MakeMKV, or DVDFab to copy Blu-ray discs.

sv1sjp,

A custom rsync script. I connect two different hard disks (one natively, one remotely via SSH) to back up the disk.

Once a month, I unplug the microSD from my Raspberry Pi 4 server and make a full backup of the card, so that if it fails I can restore it to a new SD card.
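A full-card image like that is usually a single dd pipeline. A sketch, assuming the card shows up as /dev/sdX on the machine doing the imaging (the device name and filenames are placeholders):

```shell
#!/bin/sh
# Monthly full image of the Pi's microSD card, compressed on the fly.
# dd copies the card block for block, so the image restores bootable.
dd if=/dev/sdX bs=4M status=progress | gzip > pi4-sd-$(date +%Y-%m).img.gz

# To restore onto a fresh card later:
#   gunzip -c pi4-sd-2024-05.img.gz | dd of=/dev/sdX bs=4M
```

Double-check the device name before running dd; writing to the wrong device destroys it.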

dtc,

I’ve been using Restic for a while, and it’s backing up to a Hetzner storage box (1TB).

Restic supports encryption, compression, and deduplication, and can forget old backups on a spread-out timeline (configurable; e.g. keep one yearly, three monthly, and seven daily).
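That retention policy maps directly onto restic’s forget flags. A sketch, with the storage-box URL and password file as placeholders:

```shell
# Keep one yearly, three monthly, and seven daily snapshots, and prune
# the data that the expired snapshots referenced.
restic -r sftp:u123456@u123456.your-storagebox.de:restic \
       --password-file /etc/restic/password \
       forget --prune \
       --keep-yearly 1 --keep-monthly 3 --keep-daily 7
```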

On top of this I also use healthchecks.io to make sure all backups are working.
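The healthchecks.io side is typically just a ping wrapped around the backup command. A sketch, with a placeholder check UUID:

```shell
#!/bin/sh
# Signal start, then success or failure, to a healthchecks.io check.
# A missed ping (e.g. the machine never ran the job) also raises an alert.
HC=https://hc-ping.com/your-check-uuid
curl -fsS -m 10 --retry 3 "$HC/start" > /dev/null
if restic backup /home; then
    curl -fsS -m 10 --retry 3 "$HC" > /dev/null
else
    curl -fsS -m 10 --retry 3 "$HC/fail" > /dev/null
fi
```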

Walker,

All systems back up to Synology, then to AWS Glacier. I’ll check out Backblaze for pricing.

freeman,

I have an old Synology DS1513+.

It runs Active Backup for Business and Active Backup for Google Workspace, as well as an AFP share for Apple machines. This covers about 95% of all backups. Those backup archive files are then ALSO backed up to one of two large 14TB HDDs. I swap them out monthly (or thereabouts) and keep the spare at my office, or in my fire safe when at home.

I have a couple of other things out there too: a small SSH box to handle some scripting of config file backups, etc. My main Synology 1815+ also has a Cloud Sync job up to Backblaze that runs in real time but only keeps one copy of things, as well as a Hyper Backup job for the super-important stuff, also to Backblaze, in addition to the nightly backups to the 1513+. This way, if my house burns down, I still have something (and likely a full copy, thanks to the 14TB HDD).

boblemmy,

My solution is Syncthing.

angrox,

At home I have a Synology NAS for backups of the local desktops. Offsite backups are done with restic to Backblaze B2 and to another location.

bier,

My 20 TB of storage is currently hosted by Hetzner on an SMB share with an accompanying server. The storage is accessible via NFS/SMB. I have a Windows 10 VPS running Backblaze Personal Backup ($7/month with unlimited storage) that mounts the SMB share as a “physical drive” using Dokan, because Backblaze Personal Backup doesn’t allow backing up network shares. If your storage is local, you can use the Windows backup agent in a Docker container.

SeeJayEmm,

Desktop: I was using Duplicati for years, but I’ve recently switched to Restic going directly to B2. I’m using this PowerShell script to run it.

Server: I’m also using restic to b2.

I also have a QNAP NAS. I synchronize my replaceable data to a crappy old Seagate NAS locally. The irreplaceable data goes to B2 via the QNAP backup client.

Chifilly,

I use Kup (the basic pre-installed backup software on KDE neon) to back up my important PC files. It backs up to a separate drive in my PC, which gets synced to my Nextcloud instance on my local server; that, along with all the other data for the containers running on it, gets backed up by Kopia to DigitalOcean Spaces.

I couldn’t recommend Kopia strongly enough. You have fine control over what gets backed up, when it gets backed up, how many snapshots to keep, etc. Backups are versioned, so storage doesn’t grow unbounded, and they are compressed and encrypted. I also have it execute a script before and after each backup that stops and starts the containers, to maintain file integrity, since nothing is writing to the files during the backup. And it’s available as a Docker container, so it can slot into your current compose setup.
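The before/after scripts are wired up through Kopia’s snapshot actions. A sketch with placeholder paths and script names (actions also have to be enabled when connecting the repository, e.g. with --enable-actions):

```shell
# Stop the containers before the snapshot and start them again afterwards,
# so Kopia never reads files that are mid-write.
kopia policy set /srv/appdata \
      --before-snapshot-root-action /usr/local/bin/stop-containers.sh \
      --after-snapshot-root-action  /usr/local/bin/start-containers.sh
```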

LanyrdSkynrd,

Rsync everything besides media to a Storj free account. I also rsync my most important data (Docker compose files, config files, Home Assistant, a few small databases) to Google Drive.

ComptitiveSubset,

For app data, Borg as backup/restore software. Backup data is then stored on Hetzner as an offsite backup; super easy and cheap to set up. Also add healthchecks.io to get notified if a backup fails.
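A Borg run against a Hetzner storage box looks roughly like this. The repo URL, account name, passphrase file, and paths are placeholders; the port follows Hetzner’s storage-box convention for Borg over SSH:

```shell
#!/bin/sh
# Encrypted, deduplicated archive of the app data, one archive per day.
export BORG_PASSPHRASE="$(cat /etc/borg/passphrase)"
borg create --stats --compression zstd \
     "ssh://u123456@u123456.your-storagebox.de:23/./borg::appdata-{now:%Y-%m-%d}" \
     /srv/appdata
```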

Edit: I back up Docker compose files and other scripts (without API keys!!!) with git to GitHub.
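Keeping the API keys out is mostly a matter of a .gitignore that excludes env files before the first commit. A self-contained sketch (all file names are hypothetical):

```shell
#!/bin/sh
# Demonstrate that gitignored secrets never enter the repository.
set -e
cd "$(mktemp -d)"
git init -q .
printf '.env\n*.key\nsecrets/\n' > .gitignore    # exclude secrets up front
printf 'services: {}\n' > docker-compose.yml
printf 'API_KEY=do-not-commit\n' > .env          # stays untracked
git add .
git -c user.email=you@example.com -c user.name=you \
    commit -qm 'track compose files'
git ls-files   # lists .gitignore and docker-compose.yml, but not .env
```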

Carol2852,

I’m running Arq (www.arqbackup.com) backing up to Storj and Synology on my desktops, and a plain NFS copy on my server.
