

Make your own Dockerfile, and the first line will be FROM <upstream>. Then make your changes.
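A minimal sketch of the idea, assuming the upstream image is nginx (the image tag and file names here are placeholders):

```dockerfile
# Start from the upstream image, then layer your changes on top
FROM nginx:1.27

# Overlay your own config and content
COPY nginx.conf /etc/nginx/nginx.conf
COPY site/ /usr/share/nginx/html/
```

Build it with `docker build -t my-nginx .` and you get the upstream image plus your modifications, rebuilt cleanly whenever upstream updates.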
I used Supermaven (a Copilot competitor) for a while and it was sorta OK sometimes, but I turned it off when I realized I’d forgotten how to write a switch case. Autocomplete doesn’t know your intent, so it introduces a lot of noise that I prefer to do without.
I’ve been trying out Claude Code for a couple months and I think I like it OK for some tasks. If you use it to do your typing rather than your thinking, then it’s pretty decent. Give it small tasks with detailed instructions and you generally get good results. The problem is that it’s most tempting to use when you don’t have the problem figured out and you’re hoping it will, but that’s when it gives you overconvoluted garbage. About half the time this garbage is more useful than starting from scratch.
It’s good at sorting out boilerplate and following explicit patterns that you’ve already created. It’s not good at inventing and implementing those patterns in the first place.
Yeah, syncthing can do all of that except public share links. Run an instance on your NAS so there is always a sync target online.
I strongly recommend ZFS as the filesystem for this, as it can handle your sync, backup, and quota needs very well. It also has data integrity guarantees that should frankly be table stakes in this application. TrueNAS is an easy way to accomplish this, and it can run Docker containers and VMs if you like.
Tailscale is a great way to connect them all, and to connect to your NAS when you aren’t home. You can share devices between tailnets, so you don’t all have to be on the same Tailscale account.
I’ll caution against Nextcloud: it has a zillion features, but in my experience it isn’t actually that good at syncing files. It’s complicated to set up, complicated to maintain, and there are frequent bugs. Consider just using SMB file sharing (built into TrueNAS), or an application that only syncs files without trying to be an entire office suite as well.
For your drive layouts, I’d go with big drives in a mirror. This keeps your power and physical space requirements low. If you want, ZFS can also transparently put metadata and small files on SSDs for better latency and less drive thrashing. (These should also be mirrored.) Do not add an L2ARC drive, it is rarely helpful.
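To sketch that layout concretely (device names are placeholders; the 64K cutoff is just one reasonable choice, tune it to your SSD capacity):

```shell
# Two big disks mirrored, plus a mirrored SSD "special" vdev
# that holds metadata and small blocks
zpool create tank mirror /dev/sda /dev/sdb \
  special mirror /dev/nvme0n1 /dev/nvme1n1

# Store blocks up to 64K on the special vdev instead of the spinners
zfs set special_small_blocks=64K tank
```

Note that the special vdev is not a cache: losing it loses the pool, which is why it must be mirrored too.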
The boxes are kinda up to you. Avoid USB enclosures if at all possible. TrueNAS can be installed on most prebuilt NAS boxes other than Synology, presuming the hardware meets the requirements. You can also build your own. Hot swap is nice, and a must-have if you need normies to work on it. Label the drive serial number on the outside so you can tell drives apart. Don’t go for fewer than 4 bays, and more is better even if you don’t need them yet. You want as much RAM as feasibly possible; ZFS uses it for caching, and it gives you room to run containers and VMs.
I mean if it’s worked without modification for 6 years….
Does what I want and gets out of my way.
Yeah, that’s my experience. The backend is an environment you control completely and has well-defined inputs and outputs specifically designed to be handled by machines. Front end code changes on a whim, runs who the hell knows where, and has to look good doing it.
It’s pretty easy to avoid all of these, mostly by using ===. Null being an object is annoying and is one of the reasons ‘typeof’ is useless, but there are other ways to accomplish the same thing.
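A few quick examples of what strict equality sidesteps, plus one alternative to typeof that actually distinguishes null:

```javascript
// Loose equality coerces operands; strict equality doesn't.
console.log(0 == "");        // true (empty string coerces to 0)
console.log(0 === "");       // false

console.log(null == undefined);  // true
console.log(null === undefined); // false

// The historical bug: null reports as an object.
console.log(typeof null); // "object"

// One way around it that distinguishes null properly:
console.log(Object.prototype.toString.call(null)); // "[object Null]"
```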
JavaScript has a lot of foot guns, but it’s also used by literally everyone so there is a lot of tooling and practice to help you avoid them.
Probably getting hammered by ai scrapers
NAS at the parents’ house. Restic nightly job, with some plumbing scripts to automate it sensibly.
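The core of the plumbing looks something like this (repo address, paths, and retention numbers are made-up examples, not my actual setup):

```shell
# Nightly restic backup to a repo on the remote NAS
export RESTIC_REPOSITORY=sftp:backup@nas.example:/backups/laptop
export RESTIC_PASSWORD_FILE=/etc/restic/password

restic backup /home/me --exclude-caches

# Trim old snapshots and drop data nothing references anymore
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 12 --prune
```

Wrap that in a systemd timer or cron entry and add whatever notification-on-failure hook you trust.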
Have you considered Karakeep (formerly Hoarder)? It does all of this really well: drop it a URL and it saves a copy. It has lists and tagging (which can be done by AI if you want), plus iOS and Android apps as well as browser extensions that make saving stuff super easy.
Broadly similar from a quick glance: https://www.amazon.pl/s?k=m-disc+blu+ray
My options look like this:
https://allegro.pl/kategoria/nosniki-blu-ray-257291?m-disc=tak
Exchange rate is 3.76 PLN to 1 USD, which is actually the best I’ve seen in years
I only looked at how ZFS tracks checksums because of your suggestion! Hashing 2TB will take a while, so it would be nice to avoid.
Nushell is neat; I’m using it as my login shell. It’s good for this kind of data-wrangling, but it’s also a pre-1.0 moving target.
Tailscale deserves it, bitcoin absolutely does not
Where I live (not the US) I’m seeing closer to $240 per TB for M-disc. My whole archive is just a bit over 2TB, though I’m also including exported jpgs in case I can’t get a working copy of darktable that can render my edits. It’s set to save xmp sidecars on edit so I don’t bother with backing up the database.
I mostly wanted a tool to divide up the images into disk-sized chunks, and to automatically track changes to existing files, such as sidecar edits or new photos. I’m now seeing I can do both of those and still get files directly on the disk, so that’s what I’ll be doing.
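The chunking part is basically bin packing. A minimal sketch of the idea, using greedy first-fit-decreasing and assuming single-layer 25 GB BD-R capacity (function and variable names are mine, not from any existing tool):

```python
# Greedy first-fit-decreasing split of files into disc-sized groups.
# DISC_BYTES assumes single-layer BD-R; adjust for your media.
DISC_BYTES = 25_000_000_000

def pack_into_discs(files, capacity=DISC_BYTES):
    """files: list of (path, size_in_bytes) tuples.
    Returns a list of discs, each a list of paths fitting within capacity."""
    discs = []  # each entry: [remaining_bytes, [paths]]
    for path, size in sorted(files, key=lambda f: f[1], reverse=True):
        if size > capacity:
            raise ValueError(f"{path} won't fit on a single disc")
        # Put the file on the first disc with room, else start a new disc
        for disc in discs:
            if disc[0] >= size:
                disc[0] -= size
                disc[1].append(path)
                break
        else:
            discs.append([capacity - size, [path]])
    return [paths for _, paths in discs]
```

First-fit-decreasing isn’t optimal, but it’s simple and wastes little space in practice, and for an archive you re-burn whole discs anyway when their contents change.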
I’d be careful with using SSDs for long-term offline storage; I hear they lose data if left unpowered for a long time. IMO the metadata is small enough to just save a new copy whenever it changes.
I’ve been thinking through how I’d write this. With so many files it’s probably worth using SQLite, and then I can match them up by joining on the hash. Deletions and new files can be found with different join conditions. I found a tool called ‘hashdeep’ that can checksum everything, though for incremental runs I’ll probably skip hashing if the size, times, and filename haven’t changed. I’m thinking Nushell for the plumbing? It runs everywhere, though they ship breaking changes frequently. Maybe Rust?
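To make the skip-unchanged-files idea concrete, here’s a rough sketch in Python (table layout and function names are just illustrative, not a finished tool):

```python
# Index a directory tree in sqlite; re-hash a file only when its
# size or mtime differs from the stored row.
import hashlib
import os
import sqlite3

def sha256(path, bufsize=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return h.hexdigest()

def index_tree(db_path, root):
    db = sqlite3.connect(db_path)
    db.execute("""CREATE TABLE IF NOT EXISTS files
                  (path TEXT PRIMARY KEY, size INT, mtime REAL, hash TEXT)""")
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            row = db.execute(
                "SELECT size, mtime FROM files WHERE path = ?", (path,)
            ).fetchone()
            if row and row[0] == st.st_size and row[1] == st.st_mtime:
                continue  # unchanged since last run; skip the expensive hash
            db.execute("INSERT OR REPLACE INTO files VALUES (?, ?, ?, ?)",
                       (path, st.st_size, st.st_mtime, sha256(path)))
    db.commit()
    return db
```

Renames then fall out of a join between two snapshots on hash where paths differ; new files and deletions are the left/right anti-joins.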
ZFS checksums are computed at the block level, and after compression and encryption, so I don’t think they’re usable for this purpose.
Aww, man, I’m conflicted here. On one hand, I’ve enjoyed their work for years and they seem like good dudes who deserve to eat. On the other, they’re AI enthusiast crypto-bros and that’s just fucking exhausting. I deal with enough of that bullshit at work
Edit: rephrase for clarity
Takes forever to encode though