Production is my testing lab, but only in my homelab! I guess I don't care about perfectly securing my services (really dumb and easy passwords, no 2FA, passwords sitting in plain sight…) because I'm not exposing them directly to the web and only access them externally via WireGuard! That's really bad practice though, and I'll probably clean up that mess some time soon, but right now I can't, I have to cook some eggs…
There are 2 things, though, where I actually have a more complex workflow:
A rather complex incremental automated backup script for my Docker container volumes, databases, config files and compose files.
A self-hosted mini-CA to access all my services via a nice .lab domain and get rid of that pesky warning on my devices.
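Not my actual script, but the incremental idea can be sketched with GNU tar's `--listed-incremental` (all paths here are throwaway placeholders):

```shell
# Minimal incremental backup sketch -- NOT the real script, just the idea.
# GNU tar's --listed-incremental keeps a state file, so later runs only
# archive files that changed since the previous run.
SRC=$(mktemp -d)    # stand-in for e.g. ~/docker (volumes, compose files...)
DEST=$(mktemp -d)   # stand-in for the backup target
echo "services: {}" > "$SRC/docker-compose.yml"

SNAR="$DEST/state.snar"     # tar's incremental state file
tar --listed-incremental="$SNAR" -czf "$DEST/full.tar.gz" -C "$SRC" .

# A new file appears; the next run only picks up the difference.
echo "key=value" > "$SRC/app.conf"
tar --listed-incremental="$SNAR" -czf "$DEST/incr.tar.gz" -C "$SRC" .
tar -tzf "$DEST/incr.tar.gz"
```

In a real setup you'd of course point `SRC` at the actual compose/volume directories and run it from cron or a systemd timer.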
I always test whether my backups actually work in a VM on my personal desktop computer, because no backup would mean all those years of tinkering were for nothing… That would bring on some nasty depression…
Edit: I have a rather small homelab, everything on an old laptop; still quite happy with the result, and it works as expected.
If you change your mind someday, just send me a PM!
Just create a wildcard domain certificate!
I access all my services on my LAN through https://servicename.home.lab/
I just had to add the root CA certificate (actually the intermediate certificate) to the trust store on every device. That's what they actually do too, just in an automated way!
Never had an issue accessing my services with my self-signed certs, whether on Android, iOS, Windows or Linux! Everything is served from my server via my reverse proxy of choice (Traefik).
However, I do remember there was something important needed to make my Android device accept the certificate (something in the certificate itself and its extensions).
If you're interested I can send you the snippet of a book on fully hosting your own CA :). It's a great read and easy to follow!
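Not the book's recipe, but roughly what a mini-CA plus wildcard cert boils down to with openssl (names, lifetimes and the .home.lab domain are examples). The subjectAltName extension is most likely the Android gotcha: modern clients ignore the CN and require a SAN.

```shell
# One-shot mini-CA sketch (names/lifetimes are examples, not a recipe).
openssl req -x509 -newkey rsa:2048 -nodes -days 3650 \
    -subj "/CN=Home Lab Root CA" \
    -keyout ca.key -out ca.crt

# CSR for the wildcard cert covering every service under .home.lab
openssl req -newkey rsa:2048 -nodes \
    -subj "/CN=*.home.lab" \
    -keyout lab.key -out lab.csr

# Sign it; the extfile injects the SAN (x509 -req drops CSR extensions
# by default, so it has to be supplied here).
printf 'subjectAltName=DNS:*.home.lab,DNS:home.lab\n' > san.ext
openssl x509 -req -in lab.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
    -days 825 -extfile san.ext -out lab.crt

openssl verify -CAfile ca.crt lab.crt   # prints: lab.crt: OK
```

ca.crt is then the file you import into every device's trust store, and lab.crt/lab.key go to the reverse proxy.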
Ohhh, thanks for the clarification! As you guessed, I'm not into dev/programming, so I wasn't aware of this kind of detail!
Thank you :)
Edit: Now semver makes sense!
I mean, where else should they show that warning? It's also posted on the forum, and they edited the documentation page too.
Maybe you're more into mailing lists or the like? I'm genuinely curious what, how and where you expected to get this kind of information.
Really cool stuff!! Something I need to try out for sure!
Too bad they didn't add a multiuser setup example :(!
If you are doing any kind of multiuser mail node, you should have a separate SMTP system in front of this one that performs any necessary validation.
Maybe something worth a shot is a direct WireGuard server/client connection. While I don't know how it behaves behind double NAT (a WireGuard client behind double NAT), making your home server act as a direct tunnel endpoint would solve all your issues.
IIRC, Tailscale uses WireGuard under the hood, and you're already hosting things on your home server, so maybe this could be worth a try :)!
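A minimal sketch of what such a client config could look like (keys, addresses and the endpoint are all placeholders, not working values):

```shell
# Hypothetical client config for a direct WireGuard tunnel.
# Every value below is a placeholder to be replaced with real keys/IPs.
cat > wg0.conf <<'EOF'
[Interface]
PrivateKey = <client-private-key>
Address = 10.0.0.2/24

[Peer]
PublicKey = <home-server-public-key>
Endpoint = home.example.org:51820
AllowedIPs = 10.0.0.0/24
PersistentKeepalive = 25
EOF
# sudo wg-quick up ./wg0.conf   # once real keys are filled in
```

PersistentKeepalive is what keeps the NAT mapping open from the client side; with double NAT the forwarded UDP port still has to be reachable on the outermost router, which is the tricky part.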
You're right! And because OP wants to archive Reddit pages, I proposed an alternative that reduces that bloated site to a minimum :).
From my tests, a page can go from 20 MB down to about 700 KB. IMO still big for a chat conversation, but the readability of the alternative front-end is a plus!
For Reddit, SingleFile HTML pages can be 20 MB per file! Which is huge for a simple discussion…
To archive that bloated but still relevant site, redirect to any still-working alternative like https://github.com/redlib-org/redlib or old Reddit and shrink each file to less than 1 MB.
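The redirect can be as dumb as rewriting the hostname before archiving (redlib.example.org is a placeholder; substitute any live redlib instance or old.reddit.com):

```shell
# Toy hostname rewrite before handing the URL to SingleFile.
# redlib.example.org is a made-up placeholder instance.
url="https://www.reddit.com/r/selfhosted/comments/abc123/some_thread/"
lite=$(printf '%s\n' "$url" \
    | sed -E 's#https?://(www\.|old\.)?reddit\.com#https://redlib.example.org#')
echo "$lite"
```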
Back in the day, that's what I did A LOT on Windows, especially because of piracy and my younger self having no idea what he was doing XD!
It still happens on Linux with EndeavourOS, but not for the same reasons! There are a million more ways to break stuff on Linux, but I always learn something new in the process.
Story time:
Learned the other day that some config files are loaded in a specific order, depending on which display manager is installed. That was kind of eye-opening, because my system didn't load .profile when .bash_profile was present, and I didn't understand why! Thanks ArchWiki!
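For those curious, the gotcha is that a bash login shell reads only the first of ~/.bash_profile, ~/.bash_login and ~/.profile that exists, so a new .bash_profile silently shadows .profile. A tiny demo in a throwaway HOME, including the usual one-line fix:

```shell
# Bash login shells read only the FIRST of ~/.bash_profile,
# ~/.bash_login, ~/.profile that exists -- demo in a throwaway HOME.
export HOME=$(mktemp -d)
echo 'export FROM_PROFILE=yes' > "$HOME/.profile"
# The usual fix: make .bash_profile source .profile explicitly.
echo '[ -f "$HOME/.profile" ] && . "$HOME/.profile"' > "$HOME/.bash_profile"
bash -lc 'echo "$FROM_PROFILE"'   # prints: yes
```

Without that line in .bash_profile, the same `bash -lc` would print an empty string.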
For those not aware, nepenthes is an example of the above-mentioned approach!
I was quite impressed by how it looks and by the free option! However, seeing Google Tag Manager and TikTok analytics domains, I'm already out!
XD okay! Maybe I put too much thought into it 😅
For my Docker containers I use What's up Docker, which not only alerts me when there's an update but also gives a link to the changes, so I can have a look at what's happening!
For my system itself… just sudo pacman -Syu. Though that's not great, because some updates can potentially break my EndeavourOS system… I sometimes keep an eye on the forum when I see critical changes like the kernel itself or Nvidia updates, though.
The only thing I don't install that way is Firefox addons.
Any specific reason why? Yesterday I installed LibreWolf and at the same time saw a few addons in the AUR.
Do you know the difference between an AUR addon and one from the official Firefox addon repo?
I guess it would be for security reasons, because you never know if someone has tampered with the addon.
Damn… a DNS issue early in the morning… What a nightmare 😂! Hope you got enough caffeine.
It really depends on your metadata/directory structure. Even though Navidrome doesn't care about your directory structure, it's better to have everything neatly separated!
You can spin up a Docker Compose stack (if you're a bit acquainted with it) and simply point it at your external drive containing your media, just to give it a try and see how it performs with your media files.
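A minimal Compose file for such a trial could look like this (the image name and port are the upstream defaults; the music path is a placeholder for your external drive, and mounting it read-only is enough for a test):

```shell
# Throwaway Navidrome compose file -- /mnt/external/music is a
# placeholder path, swap in wherever your drive is mounted.
cat > docker-compose.yml <<'EOF'
services:
  navidrome:
    image: deluan/navidrome:latest
    ports:
      - "4533:4533"
    volumes:
      - ./data:/data
      - /mnt/external/music:/music:ro
    restart: unless-stopped
EOF
# docker compose up -d   # then browse to http://localhost:4533
```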
Just give it a try and see how it works. However, I would wait for the new scanner update before fully switching to Navidrome, since it will bring some long-awaited functionality like a VA list of all artists.
But I had so many services running it was a pain to maintain.
Are you talking about Docker containers? You should take a look at What's up Docker to maintain and keep track of your containers. I have approximately 20 containers and it was easier to keep track this way. If you're more in the 50-100 range… yeah, that sounds like a lot! :o
Now I just want to host a web page and expose it with nepenthes…
First, because I’m a big fan of carnivorous plants.
Second, because it lets you poison LLMs and AI and fuck with their data.
Lastly, because I can do my part and say F#CK Y0U to those privacy data hungry a$$holes !
I don't even expose anything directly to the web (everything is only accessible through a tunnel like WireGuard), nor do I have any important data to protect from AI or LLMs. But just having the opportunity to fuck with them while they continuously harvest data from everyone is something I was already thinking of, but didn't know how.
Thanks for the link !
Anime and .ass subtitles are a bit funky, but that's not due to Jellyfin but to the player used while streaming in direct play (in my case).
I had an issue where, on my mobile/laptop, some subtitles just disappeared or were strangely formatted. After some digging around I found out VLC was the culprit; changing the default player to mpv, or to alternatives like Findroid (which uses mpv as its default player), made everything butter smooth in direct play!
No idea about transcoding though :/
I thought the recommendation was to use double the RAM as swap up to 16 GB? So at 32 GB of RAM, use 32 GB of swap?