Yeah, the container I used requires your Steam ID as an environment variable.
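Something like this, roughly - the image name, variable name, and ports here are placeholders (they vary per game server container), so check the docs for whichever one you pick:

```yaml
# Hypothetical sketch - image and env var names differ between game server containers
version: "3"
services:
  gameserver:
    image: example/game-server:latest    # placeholder image
    environment:
      - STEAM_ID=7656119xxxxxxxxxx       # your 64-bit Steam ID (name varies by image)
      - SERVER_NAME=MyServer
    ports:
      - "2456-2458:2456-2458/udp"        # Valheim's default range, as an example
    volumes:
      - ./data:/config
    restart: unless-stopped
```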
That’s a really open-ended question. Depends purely upon your interests and appetite for risk, etc.
Might be worth looking at, from a Docker perspective:
I have in the past run a Valheim server and a VRising server, too. FWIW.
Searched “tdr” before replying, and was inexplicably happy. :)
Yeah, it makes for a nice workflow, doesn’t it? It doesn’t give you the “fully automated” achievement, but it’s not much of a chore. :)
Have you considered something like borgbackup? It does good deduplication, so you won’t have umpteen copies of unchanged files.
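The basic loop is only a few commands - this is a sketch from memory (repo path and retention numbers are examples, not a recommendation):

```shell
# Sketch, not tested here: create an encrypted, deduplicated repo on the NAS,
# back up, then prune old archives. Paths and retention values are examples.
borg init --encryption=repokey ssh://user@nas/volume1/backups/laptop

borg create --stats --compression lz4 \
    ssh://user@nas/volume1/backups/laptop::'{hostname}-{now}' \
    ~/Documents ~/Projects

borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
    ssh://user@nas/volume1/backups/laptop
```

Dedup means the second and subsequent runs are fast and small, even over a full home directory.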
I use it mostly for my daily driver laptop to back up to my NAS, and the Gitlab CE container running on the NAS acts as the equivalent for its local Git repos, which are then straightforward to copy elsewhere. Though I haven’t got it scripting anything like bouncing containers or DB dumps.
Do you have a NAS? It can be a good way to get decent functionality without extra hardware, especially if you’re doing proof of concept or temporary stuff.
My self-hosting Docker setup is split between 12 permanent stacks on a Synology DS920+ NAS (with upgraded RAM) and 4 on a Raspberry Pi 4B, using Portainer and its agent on the Pi to manage them. The NAS is also using Synology’s Drive (like Dropbox or GDrive) and Photos (like Google Photos).
I’ve had the NAS running servers for Valheim and VRising in the past, but doing so meant keeping fewer other containers running, as game servers on Linux usually have no optimisation and/or are emulating Windows.
If I decide to host a game server again, I’ll probably look at a NUC. I’ve done the DIY mini-ITX route in the past (for an XBMC-based media centre with HDMI output) and it was great, so that’s another option.
This is what I do. I find keeping 20-odd docker-compose files (almost always static content) backed up to be straightforward.
Each is configured to bring up/down the whole stack in the right order, so any Watchtower-triggered update is seamless. My Gotify container sends me an update every time one changes. I use Portainer to manage them across two devices, but that’s just about convenience.
I disable Watchtower for twitchy containers, and handle them manually. For the rest, the only issue I’ve seen is if there’s a major change in how the container/stack is built (a change in database, etc), but that’s happened twice and I’ve been able to recover.
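For the curious, opting a twitchy container out of Watchtower is just a label. A sketch (image names and the Gotify URL are placeholders, not my actual stack):

```yaml
# Sketch: Watchtower auto-updating a stack, with one container opted out
# and update notifications pushed via Gotify. Names/URLs are examples.
version: "3"
services:
  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      - WATCHTOWER_NOTIFICATION_URL=gotify://gotify.example.lan/AppToken  # placeholder
  stable-app:
    image: example/stable-app:latest       # updated automatically
  twitchy-app:
    image: example/twitchy-app:latest
    labels:
      - com.centurylinklabs.watchtower.enable=false   # handle this one manually
```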
I used Linuxserver’s Docker container of Dokuwiki when I migrated my notes from Evernote a few years ago. It was easy to setup and configure, has a number of plugins that further improve it, and it did the job really well.
I ended up migrating it all to Obsidian this year, as it serves my needs better, but otherwise I’d still be using Dokuwiki.
I’ve been using Linux - off and mostly on - since a year after Linus released his kernel, and so have tried a bunch of flavours. I agree with aperson: you’ll receive lots of recommendations, but only you know what you like.
My daily driver is Ubuntu on an i5-7200U (Lenovo ThinkPad), and before that it was Kubuntu. My main PC is an i7-7900K, so similarly long in the tooth as yours, and both CPUs run the Ubuntu flavours just fine.
My personal preference is currently Kubuntu (faster, lighter, and fewer “this is how it is, and you’ll be glad for it” decisions). But there are so many others to try. Find a bunch that support Proton and gaming, grab their “live CD” versions, and see which ones work for you.
It’s a good question. A vault is only as strong as the credentials required to access it.
Bitwarden does have MFA support, though. If you’re using it without that enabled, you’re asking for trouble.
FWIW, I have an LG LED smart TV (2xHDMI, 1xDVB-S2, WiFi, NIC, etc) and it’s only been connected to my network once, for a post-purchase firmware update through my AdGuard Home. WiFi and Ethernet are disabled, and I use it with my Nvidia ShieldTV (Plex*, Netflix, ChromeCast, etc).
I won’t let it go online as I expect it already phones home if you let it, and don’t imagine LG will be able to resist ad injection into content, like Samsung and others do. So it’s an excellent quality dumb TV, which meets my needs perfectly.
*Plex Media Server runs on my NAS. The Shield and my mobile devices are Plex clients.
Exposed is the right term. Other than my Wireguard VPN port, everything I have exposed is HTTPS behind Authelia MFA and SWAG.
I’m tempted to switch Wireguard for Tailscale, as the level of logging with WG has always bothered me. Maybe one day.
When my old NetGear ReadyNAS Duo (2 bays, SPARC, 100Mb NIC) was reaching its EOL I looked into a purpose-built server, a mini of some kind (NUC, etc), or a standard QNAP or Synology NAS. Eventually settled on a Synology DS920+ (4 bays, x86_64, 1Gb NIC).
It’s been rock solid and amazing value for the 2.5 years I’ve had it. It’s running the majority of my Docker containers, Plex Media Server, a Linux VM, and a few other things. It also has its own shell/CLI, which is useful. I don’t use Synology’s “phone home”/remote access stuff, but Synology Drive and Synology Photos are great - they provide the equivalents of Dropbox and Google Photos respectively, and they work across Windows, Linux, Mac, iOS, and Android (via VPN when outside the house). No regrets at all.
I’ve had gitlab/gitlab-ce running on my NAS for 6+ months and it’s been reliable, mostly as a central repository and off-device backup. It has CI/CD and other capabilities (gitlab/gitlab-runner, etc), but I’ve not implemented them.
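If it helps, a minimal gitlab/gitlab-ce service looks roughly like this - hostname, ports, and paths here are examples rather than my actual config:

```yaml
# Sketch of a minimal gitlab-ce container; adjust hostname/ports/paths to taste
version: "3"
services:
  gitlab:
    image: gitlab/gitlab-ce:latest
    hostname: gitlab.example.lan          # placeholder
    ports:
      - "8080:80"
      - "2222:22"    # remap Git-over-SSH so it doesn't clash with the NAS's own sshd
    volumes:
      - ./config:/etc/gitlab
      - ./logs:/var/log/gitlab
      - ./data:/var/opt/gitlab
    restart: unless-stopped
    shm_size: "256m"
```

It’s heavy for a NAS, so keep an eye on RAM if you try it.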
TT-RSS is fantastic, providing you hold your nose and wear an asbestos suit if you ever dare ask a question or raise a valid issue. The dev is… well, I’m not a fan. I won’t use it out of principle.
FreshRSS is a good-looking, skinnable alternative with a good Docker image and a decent web UI, but I had issues with its inability to flush old items.
These days I’m using Sismics and the web UI.
This is what I did, too. Used Pi-Hole for a year or so, and it required regular tinkering and repairing. I planned to test AGH for a short time in a Docker container on a Pi 4B, and it’s been running that way for 2 years without any issues.
Easier to administer, more functionality, and rock solid. I’ve never looked back.
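The container setup is tiny, which is part of the appeal. A sketch using the official image (host networking and local paths are my assumptions, not a requirement):

```yaml
# Sketch of AdGuard Home on a Pi using the official adguard/adguardhome image.
# Host networking keeps DNS on port 53 simple; paths are examples.
version: "3"
services:
  adguardhome:
    image: adguard/adguardhome
    network_mode: host                        # serves DNS directly on the Pi
    volumes:
      - ./work:/opt/adguardhome/work          # runtime data
      - ./conf:/opt/adguardhome/conf          # AdGuardHome.yaml lives here
    restart: unless-stopped
```

First run, you finish setup through its web wizard, then point your router’s DNS at the Pi.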
DNS-O-Matic (recommended by CloudFlare, among others) combined with SWAG and Authelia will handle dynamic DNS, reverse proxying, SSL certificates, and MFA. SWAG (nginx, Let’s Encrypt, and Certbot) and Authelia (MFA) run nicely in a two-container Docker stack.
Mine have been running for ~18 months on my NAS, though I have a fixed IP so no longer use a DDNS provider.
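A minimal sketch of that two-container stack, if it’s useful - the domain and paths are placeholders, and Authelia also needs its own configuration.yml beyond what’s shown here:

```yaml
# Sketch: SWAG (reverse proxy + certs) alongside Authelia (MFA).
# Domain, PUID/PGID, and paths are examples, not my real config.
version: "3"
services:
  swag:
    image: lscr.io/linuxserver/swag
    cap_add:
      - NET_ADMIN
    environment:
      - PUID=1000
      - PGID=1000
      - URL=example.duckdns.org   # placeholder domain
      - VALIDATION=http           # or dns, depending on your provider
    ports:
      - "443:443"
      - "80:80"
    volumes:
      - ./swag:/config
    restart: unless-stopped
  authelia:
    image: authelia/authelia
    volumes:
      - ./authelia:/config        # configuration.yml and users go here
    restart: unless-stopped
```

SWAG ships sample nginx confs for Authelia, so wiring the proxy to the MFA portal is mostly uncommenting includes.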
The Honeynet Project, related to the SANS Institute when I last checked, has a lot of resources on honeypots that are worth a look, if you haven’t already.