Back in the day it was nice: `apt-get update && apt-get upgrade` and you were done.
But today every tool/service has its own way of being installed and updated:
- docker:latest
- docker:v1.2.3
- custom script
- git checkout v1.2.3
- same but with custom migration commands afterwards
- custom commands change from release to release
- expects updates to be run as a specific user
- update nginx config
- requires updating your own config, and the service depends on those config changes
- expects newer versions of its tools
- etc.
I self-host around 20 services (PieFed, Mastodon, PeerTube, Paperless-ngx, Immich, open-webui, Grafana, etc.), and all of them have dependencies that need to be updated too.
And nowadays you can't really keep running an older version, especially when it's internet-facing.
So anyway, what are your strategies for staying sane while keeping all your self-hosted services up to date?


A dedicated Forgejo instance at `f.example.com`.

For a small set of trusted "base" images (e.g. `docker.io/alpine` and `docker.io/debian`): a Forgejo Action on a separate small runner, scheduled on cron, syncs the images to `f.example.com/dockerio/` using `skopeo copy`. Then all other runners have their docker/podman configuration changed to use that internal Forgejo container registry instead of `docker.io`.

Other images are built from source in Forgejo Actions CI. Not everything needs to be (or even should be) fully automated right off. You can keep some workflows manual while starting out and then increase automation as you tighten up your setup and get more confident in it. Follow the usual best practices around security and keep permissions scoped, giving them out only as needed.
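The scheduled sync job can be sketched roughly like this. The image list, tag pins, and the `f.example.com/dockerio` path are illustrative assumptions; by default the script only prints the commands it would run:

```shell
#!/bin/sh
# Sync a short list of trusted base images from docker.io into the
# internal Forgejo registry (hypothetical path f.example.com/dockerio/).
set -eu

REGISTRY="f.example.com/dockerio"
SKOPEO="${SKOPEO:-skopeo}"
# Default to a dry run that only prints the commands; set DO_SYNC=1 on
# the actual runner to perform the copies.
[ "${DO_SYNC:-0}" = 1 ] || SKOPEO="echo skopeo"

sync_image() {
  # --all copies every architecture in the manifest list
  $SKOPEO copy --all \
    "docker://docker.io/library/$1" \
    "docker://$REGISTRY/$1"
}

for img in alpine:3.20 debian:bookworm; do  # assumed pins
  sync_image "$img"
done
```

On the runner side, podman can then be pointed at the mirror via `/etc/containers/registries.conf` (a `[[registry]]` entry with `prefix = "docker.io"` and a mirror location under `f.example.com/dockerio`).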
Git repos are mirrored as Forgejo repo mirrors, forked if relevant, then built with Forgejo Actions and published to `f.example.com/whatever/`. Rarely, but sometimes, it is worth spending time on reusing existing GitHub Workflows from upstreams; more often I find it easier to just reuse my own workflows. This way, runners can be kept fully offline and builds only access internal resources.
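Registering the pull mirrors can itself be scripted against the Forgejo API (the same `/repos/migrate` surface as Gitea). A sketch, where the host, token handling, and the example upstream are assumptions; nothing is sent unless you set `DO_MIRROR=1`:

```shell
#!/bin/sh
# Register an upstream repo as a pull mirror through the Forgejo API
# (POST /repos/migrate, same API surface as Gitea). Host and token are
# placeholders; by default the request is only printed, not sent.
set -eu

API="https://f.example.com/api/v1"
CURL="${CURL:-curl}"
[ "${DO_MIRROR:-0}" = 1 ] || CURL="echo curl"

mirror_repo() {
  # $1 = upstream clone URL, $2 = repo name on the Forgejo side
  $CURL -sS -X POST "$API/repos/migrate" \
    -H "Authorization: token ${FORGEJO_TOKEN:-TOKEN}" \
    -H "Content-Type: application/json" \
    -d "{\"clone_addr\":\"$1\",\"repo_name\":\"$2\",\"mirror\":true}"
}

# hypothetical example upstream
mirror_repo https://github.com/immich-app/immich immich
```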
Same idea for npm or PyPI packages, etc.
Set up renovate1 and iterate on its configuration to reduce insanity. Look in the Forgejo and Codeberg infra repos for examples of how to automate rebasing forked repos onto their mirrors.
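One way to run Renovate against the instance is via its container image, pulled through the internal mirror; Renovate maps any config option onto a matching `RENOVATE_*` environment variable. The endpoint and image path are assumptions (Forgejo answers the Gitea API, hence `platform=gitea`), and by default this only prints the command:

```shell
#!/bin/sh
# Run a self-hosted Renovate pass against the internal Forgejo.
# Endpoint and image path are assumptions; set DO_RUN=1 to really run it,
# otherwise the command is only printed.
set -eu

RUNNER="${RUNNER:-podman}"
[ "${DO_RUN:-0}" = 1 ] || RUNNER="echo podman"

run_renovate() {
  $RUNNER run --rm \
    -e RENOVATE_PLATFORM=gitea \
    -e RENOVATE_ENDPOINT=https://f.example.com/api/v1/ \
    -e RENOVATE_AUTODISCOVER=true \
    -e RENOVATE_TOKEN \
    f.example.com/dockerio/renovate/renovate
}

run_renovate
```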
I previously achieved the same thing by wiring together more targeted services, and that's still viable, but Forgejo makes it easy if you want it all in one box. Just add TLS.
1: Or does anyone have anything better that's straightforward to integrate? I'm not a huge fan of all the npm modules it pulls in or of its GitHub-centric perspective. Giving renovate itself the same treatment here took a bit more effort and digging than I think should really be necessary.