(Justin)
Tech nerd from Sweden
You only need dedup if your data is duplicated.
Nope, you don’t need a VPS to use it; it comes with an SFTP interface.
https://www.hetzner.com/storage/storage-box/
Offsite backup for $2/TB with no download fees, about a third of the price of B2.
The Hetzner Storage Box is super cheap and works with rclone. They also have a web interface for configuring regular ZFS snapshots, so you don’t have to worry about accidental deletions or ransomware.
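For reference, pointing rclone at a Storage Box is just an SFTP remote in rclone.conf. Something like this (a sketch, not my exact config — the uXXXXXX bits are whatever username Hetzner gives you, and the password has to be run through `rclone obscure` first):

```ini
# ~/.config/rclone/rclone.conf -- SFTP remote for a Hetzner Storage Box
[storagebox]
type = sftp
host = uXXXXXX.your-storagebox.de
user = uXXXXXX
pass = <output of `rclone obscure 'your-password'`>
```

Then something like `rclone sync /your/data storagebox:backups` does the actual copy.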
Hardware-wise:
Software-wise, too many projects to count lol.
Renovate is a very useful tool for automatically updating containers. It just watches a Git repo and bumps the image tags and dependency versions it finds in there.
I have it configured to deploy minor updates automatically, and for bigger updates it opens a pull request and sends me an email.
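For anyone curious, the relevant bit of a renovate.json for that looks roughly like this (a sketch, not my exact config — the automerge rule is what handles minor/patch bumps, and anything bigger falls through to a normal pull request):

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"],
  "packageRules": [
    {
      "matchUpdateTypes": ["minor", "patch"],
      "automerge": true
    }
  ]
}
```

The email part is just the git host's usual pull request notifications.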
Yeah, full VMs are pretty old school; there are a lot more management and automation options available with containers, not to mention the compute overhead.
Red Hat doesn’t even recommend that businesses use VMs anymore, and for legacy apps they offer a virtualization tool that runs the VMs inside containers. It’s called OpenShift Virtualization.
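It’s built on KubeVirt, so a VM just becomes another Kubernetes object you can keep in Git. A rough sketch of what a VM definition looks like (the name and containerdisk image are placeholders, and it assumes KubeVirt/OpenShift Virtualization is installed in the cluster):

```yaml
apiVersion: kubevirt.io/v1
kind: VirtualMachine
metadata:
  name: legacy-app          # placeholder name
spec:
  running: true             # start the VM as soon as it's created
  template:
    spec:
      domain:
        resources:
          requests:
            memory: 2Gi
        devices:
          disks:
            - name: rootdisk
              disk:
                bus: virtio
      volumes:
        - name: rootdisk
          containerDisk:     # boot disk shipped as a container image
            image: quay.io/containerdisks/fedora:latest
```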
Yeah, Unraid is the same; it just adds a GUI to make it easier to learn. The downside is that Unraid is very non-standard and is basically impossible to back up or manage in source control like vanilla Docker or Kubernetes.
You should keep your Docker/Kubernetes configuration saved in Git, and then have something like rclone take daily backups of all your data to something like a Hetzner Storage Box. That’s the setup I have.
My entire Kubernetes configuration: https://codeberg.org/jlh/h5b/src/branch/main/argo/custom_applications
My backup CronJob: https://codeberg.org/jlh/h5b/src/branch/main/argo/custom_applications/backups/rclone-velero.yaml
With something like this, your entire setup could crash and burn, and you would still have everything you need to restore, safely stored offsite.
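If it helps, a stripped-down sketch of that kind of backup job looks something like this (not my exact manifest — the schedule, PVC name and the "storagebox:" remote are placeholders, and the remote is assumed to live in an rclone.conf stored in the rclone-config Secret):

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: rclone-backup
spec:
  schedule: "0 3 * * *"                    # every night at 03:00
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: OnFailure
          containers:
            - name: rclone
              image: rclone/rclone:latest
              args: ["sync", "/data", "storagebox:backups/data"]
              volumeMounts:
                - name: data
                  mountPath: /data
                  readOnly: true
                - name: rclone-config
                  mountPath: /config/rclone  # where the rclone image expects rclone.conf
          volumes:
            - name: data
              persistentVolumeClaim:
                claimName: data-pvc          # the volume you want backed up
            - name: rclone-config
              secret:
                secretName: rclone-config    # Secret with a key "rclone.conf" holding the sftp remote
```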
Ah, ok I see.