Hello, fellow Linux users!
My question is in the title: What is a good approach to deploy Docker images on a Raspberry Pi and run them?
To give you more context: The Raspberry Pi already runs an Apache server for letsencrypt and as a reverse proxy, and my home-grown server should be deployed as a Docker image.
To my understanding, one way to achieve this would be to push all sources over to the Raspberry Pi, build the Docker image there, tag it ‘latest’, and use systemd with Docker or Podman to run the image.
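Roughly, I picture the systemd side looking something like this (unit name, container name, image tag, and port are just placeholders I made up):

```
[Unit]
Description=Home-grown server in a Docker container
After=network-online.target docker.service
Requires=docker.service

[Service]
# Clean up any leftover container, then run the freshly built image in the foreground
ExecStartPre=-/usr/bin/docker rm -f myserver
ExecStart=/usr/bin/docker run --rm --name myserver -p 8080:8080 myserver:latest
ExecStop=/usr/bin/docker stop myserver
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

If I remember correctly, Podman can also generate such units from a running container (`podman generate systemd`), which might save some of this hand-writing.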
My questions:
- Has anyone here had a similar problem but used a different approach to achieve this?
- Has anyone here automated this whole pipeline, so that in a perfect world I just push updated sources to the Raspberry Pi, the new Docker image gets built, and Docker/Podman automatically picks up the new image?
- I would also be happy to be pointed at any available resources (websites/books) which explain how to do this.
At the moment I am using Raspbian 12 with a Raspberry Pi Zero 2 W, and the whole setup works with home-grown servers that are simply deployed as binaries and executed via systemd. My Docker knowledge is mostly from a developer perspective, so I know nearly nothing about deploying Docker on a production machine. (Which means, if there is a super obvious way to do this, I might not even be aware it exists.)
Thanks for the idea! I try to keep as few ‘moving’ parts as possible, so hosting GitLab is something I would want to avoid if possible. The Raspberry Pi is supposed to be the sole hardware for the whole deployment of the project.
It’s definitely not a lightweight solution. Is the Pi dedicated to the application? If so, is it even worth involving Docker?
You are asking exactly the right questions!
I have an Ansible playbook to provision the Pi (or any other Debian/Ubuntu machine) with everything needed to run a web application, as long as the web application is a binary or uses one of the interpreters already on the machine. (Well, I also have playbooks to compile Python/Ruby from source, add an Adoptium JDK repository, etc.)
Right now I am flirting with the idea of using Elixir for my next web application, and it just seems unsustainable for me to now add Erlang/OTP and Elixir to my list of playbooks to compile from source.
The Debian repositories have quite old versions of Erlang/OTP/Elixir and I doubt there are enough users to keep security fixes/patches up to date.
Combined with the list of technologies I already use, it seems to reduce complexity if I use Docker containers as deployment units, and it should be future-proof for at least the next decade.
Writing about it, another solution might simply be to put something like Distrobox on the Pi and use something like the latest Alpine.
Up-to-date runtimes definitely make sense; that is where Docker shines.
GitLab is obviously a bit overkill, but maybe you could just create some systemd timers and some scripts to auto-pull, build, and deploy?
The script would boil down to something like this (a rough sketch; the repo path, image tag, and unit name are just placeholders):
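```
#!/usr/bin/env bash
set -euo pipefail

# Placeholders: adjust the repo path, image tag, and unit name to your setup
REPO=/home/pi/myserver
IMAGE=myserver:latest
UNIT=myserver.service

cd "$REPO"

# Do nothing if there is nothing new on the remote
git fetch origin
if [ "$(git rev-parse HEAD)" = "$(git rev-parse '@{u}')" ]; then
    exit 0
fi
git merge --ff-only '@{u}'

# Rebuild the image and restart the unit so the container picks up the new image
docker build -t "$IMAGE" .
systemctl restart "$UNIT"
```

A systemd timer (e.g. OnCalendar=*:0/15 for every 15 minutes) pointing at a oneshot service that runs this script would cover the ‘auto’ part.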
You’re welcome to steal whatever you can from the repo I linked before.
Thanks a lot!
Yeah, if I go down that road, I’ll probably just add a git hook (post-receive) on the repo on the Raspberry Pi, so that I’ll have a ‘push to deploy’ workflow!
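Something like this post-receive hook in a bare repo on the Pi is what I have in mind (all paths, the branch name, and the unit name are placeholders):

```
#!/usr/bin/env bash
# post-receive hook in the bare repo on the Pi
set -euo pipefail

WORK_TREE=/home/pi/myserver
GIT_DIR=/home/pi/repos/myserver.git
IMAGE=myserver:latest
UNIT=myserver.service

# Check out the freshly pushed sources into the work tree
git --work-tree="$WORK_TREE" --git-dir="$GIT_DIR" checkout -f main

# Rebuild the image and restart the unit that runs the container
cd "$WORK_TREE"
docker build -t "$IMAGE" .
systemctl restart "$UNIT"
```

Pushing to that bare repo would then check out the sources, rebuild the image, and restart the container in one go.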