I've been running self-hosted services for a while now. What started as "hey, I wonder if I can host my own stuff" has turned into a setup that works well enough for my needs.
The real motivation here isn't saving money (though that's a bonus). It's more about making things work with whatever hardware I can get my hands on. I like understanding how the pieces fit together and seeing what I can squeeze out of limited resources.
The Hardware Foundation
My self-hosted setup is spread across four machines, and each one has its own role:
Raspberry Pi 5
Specs: 8GB RAM, 128GB MicroSD
Role: Running way more stuff than it probably should
This little guy is my workhorse for all the lightweight services I actually use daily:
- Mealie - Cooking recipe management
- Linkding - Bookmark management with tagging
- Miniflux - RSS feed aggregation
- Wakapi - Coding time tracking (WakaTime alternative)
- Gogs - Lightweight Git hosting
- Ghost - Blog publishing platform
- Ghostfolio - Portfolio tracking
- Szurubooru - Imageboard
The Pi handles all of this smoothly. The power consumption is minimal, and it's useful for testing new services before I decide if they're worth keeping around.
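Since everything here is deployed through docker-compose, adding or dropping a service is a one-file change. Here's a minimal sketch for one of these services — the image name and port follow Linkding's published defaults, but the volume path and the rest are illustrative, not my exact config:

```yaml
# Illustrative docker-compose.yml for one lightweight service on the Pi.
services:
  linkding:
    image: sissbruecker/linkding:latest
    ports:
      - "9090:9090"          # Linkding listens on 9090 inside the container
    volumes:
      - ./linkding-data:/etc/linkding/data
    restart: unless-stopped  # come back up after reboots and crashes
```

`restart: unless-stopped` is what makes "test it for a week, then decide" painless: the service survives power cycles without me touching it.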
Bmax B2 - The Media Server
Specs: 4GB RAM, 512GB SSD (purchased second hand)
Role: All the media stuff
This budget mini PC handles everything media-related quite well for its specs:
- Servarr Stack (Radarr, Sonarr, Prowlarr) - Automated media acquisition
- Jellyfin - Media streaming server
- Navidrome - Music streaming server
- Transmission - BitTorrent client
The SSD makes such a huge difference here. No more waiting forever for file operations or dealing with SD card corruption. Plus, when Jellyfin needs to transcode something, this box actually has the muscle for it.
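The transcoding muscle mostly comes down to passing the iGPU into the Jellyfin container. A sketch of the relevant compose fragment, assuming the Bmax's Intel iGPU exposes a `/dev/dri` render node (paths are illustrative, and hardware acceleration still has to be enabled in Jellyfin's playback settings):

```yaml
# Fragment of a Jellyfin compose service, hardware-transcoding bits only.
services:
  jellyfin:
    image: jellyfin/jellyfin:latest
    devices:
      - /dev/dri:/dev/dri   # pass the GPU render node into the container
    volumes:
      - ./config:/config
      - /mnt/media:/media:ro  # media library, read-only for the server
```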
STB TV for Network Services
Specs: 2GB RAM, 8GB SD card, running Armbian (came pre-flashed)
Role: Network monitoring (for now)
This second-hand TV set top box cost me about $20 and came already flashed with Armbian by the seller. Currently it handles:
- AdGuard Home - Network-wide DNS filtering and ad blocking
- Prometheus Exporter - Collecting metrics from my router
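On the Prometheus side (which lives on the droplet), pulling these router metrics in is a single scrape job. A sketch of the config fragment — the job name, Tailscale address, and port are made up:

```yaml
# prometheus.yml fragment: scrape the exporter running on the STB box.
scrape_configs:
  - job_name: "router"
    scrape_interval: 30s
    static_configs:
      - targets: ["100.64.0.3:9100"]  # STB's Tailscale IP : exporter port (placeholders)
```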
I'm planning to either retire this device or move it to my office for remote network access, since the small storage and outdated OS are becoming a maintenance headache.
DigitalOcean Droplet for Public Services
Specs: 2GB RAM, SGP1 region
Role: The stuff that needs to be accessible from the scary internet
The cloud piece handles anything that needs to be reachable from outside my network:
- Traefik - Load balancer and reverse proxy
- Beszel - System monitoring (way simpler than Prometheus/Grafana)
- Prometheus + Grafana - Still around for experiments and custom metrics
- Certbot + Let's Encrypt - SSL certificates because security
- VaultWarden - Password manager (Bitwarden but self-hosted)
How It All Talks To Each Other
Tailscale makes this whole thing work. I'm using their free tier to create a mesh network where all my machines can talk to each other securely, without me having to mess around with port forwarding or VPN configs.
Here's how the traffic flows:
- Local stuff runs on my home hardware (Pi, Bmax, and STB TV)
- Traefik on the droplet handles all the public internet requests
- Tailscale creates secure tunnels so everything can talk
- Cloudflare handles DNS and adds an extra TLS layer in front of the droplet
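In practice this means Traefik's upstreams are just Tailscale addresses. A sketch of what a route can look like with the file provider — the hostname and tailnet IP are made up, and this isn't my literal config:

```yaml
# Illustrative Traefik dynamic configuration (file provider).
http:
  routers:
    blog:
      rule: "Host(`blog.example.com`)"
      service: blog
      tls: {}                # certificates handled separately (Certbot/Let's Encrypt)
  services:
    blog:
      loadBalancer:
        servers:
          - url: "http://100.64.0.2:2368"  # Ghost on the Pi, reached over Tailscale
```

No port forwarding anywhere: the droplet is the only machine with a public IP, and everything behind it is only reachable over the tailnet.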
This setup gives me control over my services while letting me learn how these pieces actually connect and work together.
Infrastructure as Code
I don't have everything fully automated, but I do use some IaC tools to keep things manageable:
Terraform
One repo that handles the cloud infrastructure:
- DigitalOcean droplet provisioning
- Cloudflare DNS records
- DigitalOcean Spaces storage
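A trimmed-down sketch of what those resources look like — names, sizes, and variables are illustrative, and the Cloudflare provider's attribute names vary between versions:

```hcl
# Illustrative Terraform: one droplet plus a DNS record pointing at it.
resource "digitalocean_droplet" "edge" {
  name   = "edge"
  region = "sgp1"
  size   = "s-1vcpu-2gb"
  image  = "ubuntu-24-04-x64"
}

resource "cloudflare_record" "blog" {
  zone_id = var.cloudflare_zone_id
  name    = "blog"
  type    = "A"
  value   = digitalocean_droplet.edge.ipv4_address
}
```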
Ansible
Separate repo for service deployment:
- Installing Docker on machines
- Deploying services via docker-compose files
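The playbooks stay boring on purpose: install Docker, ship a compose file, bring the stack up. A minimal sketch, with a made-up host group and paths:

```yaml
# Illustrative Ansible playbook for one host group.
- hosts: pi
  become: true
  tasks:
    - name: Install Docker from the distro repos
      ansible.builtin.apt:
        name: docker.io
        state: present
        update_cache: true

    - name: Ship the compose file for this host
      ansible.builtin.copy:
        src: compose/pi/docker-compose.yml
        dest: /opt/services/docker-compose.yml

    - name: Start (or update) the stack
      ansible.builtin.command:
        cmd: docker compose -f /opt/services/docker-compose.yml up -d
```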
I used to have Atlantis set up for automated Terraform runs, but nowadays I just experiment and apply changes locally. I make sure to commit and push code frequently to keep everything in version control, even if the deployment process is more manual.
Storage Strategy
I use a hybrid storage approach:
- Local storage for frequently accessed data and databases
- DigitalOcean Spaces for backups and large files
- Kopia for automated backups, using DigitalOcean Spaces as the storage backend via its S3-compatible API
This gives me both performance and durability while keeping costs reasonable: Kopia handles the deduplication and encryption, and Spaces provides durable, S3-compatible cloud storage.
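Wiring Kopia to Spaces is a one-time repository setup plus a snapshot command you can put on a timer. Roughly like this — the bucket, endpoint, and paths are placeholders, so check `kopia repository create s3 --help` for the current flags:

```
# Point a Kopia repository at DigitalOcean Spaces via its S3-compatible API.
kopia repository create s3 \
  --bucket my-backups \
  --endpoint sgp1.digitaloceanspaces.com \
  --access-key "$SPACES_KEY" \
  --secret-access-key "$SPACES_SECRET"

# Take a deduplicated, encrypted snapshot of a data directory.
kopia snapshot create /opt/services/data
```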
Lessons Learned
Here's what I've learned along the way:
Start small and grow: I started with just the Pi running one service. Good thing I didn't try to build everything at once.
Tailscale works well: Before this I was doing the whole port forwarding dance and it was painful. Tailscale makes things simpler and more secure.
Monitor everything, but keep it simple: I used to run the whole Prometheus/Grafana stack for monitoring, but it was overkill for my setup. Switched to Beszel and it's simpler. I still keep Prometheus around for custom metrics and experiments, but for basic system monitoring, Beszel does the job.
Git helps, even when deployment is manual: All my configs live in git repos. Having the history there means I won't completely forget how things were configured, even when I'm applying changes by hand.
Why I Actually Do This
This isn't about saving money. Sure, the whole setup costs me about $17/month versus what would probably be $100+ for equivalent managed services, but that's not the point.
The value is in the tinkering and learning. When you're working with limited hardware, you figure out how things actually work. You learn about networking, storage, deployment, and system administration by doing it yourself.
It's not production-grade or mission-critical. It's a playground. If something breaks, I fix it when I get around to it. The point is experimenting with what's possible on minimal resources.
Wrapping Up
Building this self-hosted setup has been enjoyable. It's about having control over my digital services and understanding how they work. My bookmarks, my RSS feeds, even my password manager run on hardware I control.
That said, privacy and de-googling aren't my main concerns here, especially for data I can't afford to lose. I still use cloud services when it makes sense. Google Drive for important documents, Gmail for email, GitHub for code repositories. The point isn't to replace everything, but to experiment and learn with the stuff that's fun to tinker with.
The hybrid setup (local + cloud) works well. I get control over my services and can still access them when I'm not home. Plus, I've learned things that make me better at my day job.
If you're thinking about self-hosting, just start. Grab a Raspberry Pi, pick one service, and try it. You'll make mistakes, but that's how you learn.
What's Coming Next
I'm always tinkering with new stuff. Here's what I'm thinking about:
Hardware Upgrades:
- Add SSD to the Raspberry Pi - The MicroSD card is the bottleneck now, and I want to see how much more I can squeeze out of this little computer
- More ARM devices - Maybe another Pi or some other SBC to justify running a proper k3s cluster
- Dedicated storage solution - Thinking about a small NAS setup, maybe with RAID for the media server
- UPS for power backup - Living in Indonesia means power outages happen, and I'm tired of everything going down randomly
Software Experiments:
- Kubernetes cluster (k3s) - I've run production Kubernetes clusters at work, but I'm curious about running it on a budget with ARM devices
- Running my own LLM - If I get a device with a decent GPU, experimenting with local language models could be interesting
None of this needs to work perfectly. It's about figuring out how to make it work and learning from what doesn't.