Using Gitlab CI/CD and Ansible to deploy Docker stacks in your Homelab
Context
I have been enjoying my homelab for 2 years. I went from deploying apps manually to using docker compose for orchestration. I now have a way to deploy my stacks using CI/CD through Gitlab. Everything is backed by the Proxmox hypervisor.
All my docker compose stacks were deployed by hand on one single Debian 12 Virtual Machine. The folder hierarchy looked like this:
```
/opt/stacks
├── nextcloud
│   └── docker-compose.yaml
├── paperless
│   └── docker-compose.yaml
└── nginx
    └── docker-compose.yaml
```
And so on…
It worked well but was not always easy to manage, with a lot of manual steps when adding a new stack.
Shutting down all the stacks required running multiple `docker compose down` commands in multiple directories, or combining the `docker-compose.yaml` files with the `-f` option. In addition, all my `docker-compose.yaml` files were scattered across multiple Git repositories. Not really easy to maintain.
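For illustration, shutting everything down either meant repeating the command once per stack directory, or merging all the files with `-f` into one long invocation (the paths below match the example hierarchy above):

```shell
# One "docker compose down" per stack directory...
for stack in nextcloud paperless nginx; do
  (cd /opt/stacks/"$stack" && docker compose down)
done

# ...or combining every file with -f into a single invocation:
docker compose \
  -f /opt/stacks/nextcloud/docker-compose.yaml \
  -f /opt/stacks/paperless/docker-compose.yaml \
  -f /opt/stacks/nginx/docker-compose.yaml \
  down
```

Neither scales well as the number of stacks grows, which is what motivated automating it.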
The Solution
My goal was to manage multiple docker compose stacks from one single repository: basically, putting the `/opt/stacks` hierarchy mentioned before into a single Git repository. I also wanted to add Continuous Deployment to automate the boring manual tasks when adding a new stack.
Gitlab was a great solution for Continuous Deployment. The `.gitlab-ci.yml` format is intuitive and quite easy to write.
You can install a Gitlab runner locally in your homelab without setting up a whole Gitlab instance.
This works as follows:
- On each push to the stacks Git repository, Gitlab triggers a new pipeline with two jobs. The first one runs automatically and deploys the stacks. The other one can be triggered manually after the deployment to destroy all the stacks.
- The local Gitlab runner (VM 2) retrieves the new stacks codebase, with the `docker-compose.yaml` and other related files. It then runs Ansible to push the new stacks to the Docker stacks virtual machine (VM 1) and start them with the Ansible Docker Compose module.
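The steps above can be sketched as an Ansible playbook. This is a minimal illustration, not the exact code from my repository: the host group, paths, and variable names are assumptions. It syncs the stacks directory to VM 1 and brings each stack up with the `community.docker.docker_compose_v2` module:

```yaml
# deploy.yml -- illustrative sketch; names and paths are assumptions
- name: Deploy Docker stacks
  hosts: docker_stacks        # VM 1 in the Ansible inventory
  become: true
  vars:
    stacks_dir: /opt/stacks
    stacks:
      - nextcloud
      - paperless
      - nginx
  tasks:
    - name: Push the stacks codebase to the Docker stacks VM
      ansible.posix.synchronize:
        src: "{{ playbook_dir }}/stacks/"
        dest: "{{ stacks_dir }}"
        delete: true

    - name: Start (or update) each stack
      community.docker.docker_compose_v2:
        project_src: "{{ stacks_dir }}/{{ item }}"
        state: present
      loop: "{{ stacks }}"
```

A matching destroy playbook would use `state: absent` on the same module.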
A diagram of the underlying architecture:
It basically looks like this on Gitlab:
On each new push, a new pipeline is created with a `deploy` job.
In the `deploy` job logs we can see which stacks have been modified by Ansible.
It's also possible to destroy the stacks by triggering the `destroy` job.
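The pipeline definition behind this could look roughly like the sketch below. The container image, playbook file names, and inventory path are assumptions, not the exact file from my repository; any image with Ansible installed works:

```yaml
# .gitlab-ci.yml -- illustrative sketch
stages:
  - deploy

deploy:
  stage: deploy
  tags:
    - lab                                # run on the homelab runner, not shared runners
  image: willhallonline/ansible:latest   # any image with Ansible works; this one is an assumption
  script:
    - ansible-playbook -i inventory.ini deploy.yml

destroy:
  stage: deploy
  tags:
    - lab
  when: manual                           # only runs when triggered by hand
  image: willhallonline/ansible:latest
  script:
    - ansible-playbook -i inventory.ini destroy.yml
```

The `tags: [lab]` entries are what route the jobs to the homelab runner registered in Step 1 below.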
How-To
Prerequisites
- Two virtual machines, one for the Gitlab runner and one for the Docker stacks.
- An SSH key pair to allow the Gitlab runner to connect to the Docker stacks VM.
- Docker installed on both the stacks VM and the Gitlab runner VM.
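Generating the key pair for the runner is a one-liner with `ssh-keygen`; the file name is arbitrary:

```shell
# Generate a dedicated ed25519 key pair for the runner.
# No passphrase here, since the private key will be handed to Gitlab in step 2.
ssh-keygen -t ed25519 -N "" -C "gitlab-runner" -f ./homelab_deploy_key

# The public half goes into ~/.ssh/authorized_keys on the Docker stacks VM
cat ./homelab_deploy_key.pub
```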
Step 1: registering a new Gitlab runner
- Create a Gitlab group
- Create your Git project in this group
- In your Gitlab group page, click on “Build” and “Runner”
- You can then register a new group runner. Don’t forget to assign a tag such as `lab`. This tag will be used later in the `.gitlab-ci.yml` file to specify that the job must run inside your homelab and not on Gitlab’s public runners.
- Gitlab then gives you a token that can be used to register your runner.
- When you install the Gitlab runner, use the Docker executor.
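On the runner VM, the registration itself is a single command; the URL and token below are placeholders to replace with your own values:

```shell
# Register the runner with the token from the Gitlab UI (placeholder values)
gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.com" \
  --token "<your-runner-token>" \
  --executor "docker" \
  --docker-image "alpine:latest"
```

With the newer registration flow, the `lab` tag is assigned in the Gitlab UI when you create the runner, not on the command line.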
More information on this process can be found in Gitlab’s documentation:
Step 2: registering the SSH private key on Gitlab
There are multiple ways to add an SSH key to your job in order to connect to the Docker stacks VM.
- Passing the SSH private key in Gitlab’s CI/CD variables
- Using Gitlab’s secure files
- Using an external secret manager such as Bitwarden Secrets
For me, the end goal is to use Bitwarden Secrets for all of the stacks’ secrets, including the SSH private key. The easiest way, though, is to use a CI/CD variable containing the base64-encoded content of the private key file.
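Encoding the key into a variable, and restoring it inside the job, is a simple base64 round trip. The variable name `SSH_PRIVATE_KEY_B64` is my own choice, not a Gitlab convention, and the first line below creates a stand-in file so the example is self-contained — in practice you would encode your real private key:

```shell
# Stand-in for the real private key file, so this example runs on its own
printf 'dummy private key material\n' > ./homelab_deploy_key

# Locally: encode the key as a single line (-w0 disables wrapping, GNU base64)
SSH_PRIVATE_KEY_B64=$(base64 -w0 ./homelab_deploy_key)
# In Gitlab, store that value as a masked CI/CD variable, e.g. SSH_PRIVATE_KEY_B64

# Inside the job: restore the key file and lock down its permissions
mkdir -p ./.ssh-demo
echo "$SSH_PRIVATE_KEY_B64" | base64 -d > ./.ssh-demo/id_ed25519
chmod 600 ./.ssh-demo/id_ed25519
```

In the real job the decoded key would land in `~/.ssh/` so Ansible can use it for its SSH connection to VM 1.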
Step 3: creating the pipeline
The pipeline is declared inside your stacks Gitlab repository, the one we created earlier in our Gitlab group.
You can find an example with all the necessary files for Ansible on my Gitlab: https://gitlab.com/magkeep/gitlab-cicd-deploy-docker-with-ansible-in-your-homelab
Improvements
- Integration with Bitwarden Secrets will be covered in a future post