I am using Forgejo to host my Git repos, and recently I extended this setup with a Forgejo Runner.
Now I have a fairly simple task for the runner that I would like to discuss.
The task
My idea is: As soon as I push a tag to a repo, the following should happen:
- Build a container image, based on the Dockerfile provided
- Push this image to my local container registry. The newly created image should carry the same version tag as the original one.
- Keep secrets secure!
To get into more detail: I wanted to extend my Apache Airflow image by adding some Python virtual environments. According to the Airflow documentation, this should be done by extending their image, so it makes a quite nice example for learning.
Note that this article will not go into detail about extending the Airflow Image - I may do that at a later time. Here, I will focus on the usage of Forgejo Actions.
The repo
The repo hosts more than just the Airflow image, so I have a subfolder `./airflow-image`. It contains two important files: the `requirements.txt` and the `Dockerfile`. The requirements file is not too interesting, so I won't go into detail here. The Dockerfile needs a bit of preparation, because I want to keep the same version number between the original image and my adapted one. My change is as follows:
```dockerfile
FROM apache/airflow:AIRFLOW_VERSION_PLACEHOLDER
# ... Do something
```
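Without going into detail about the image itself, the rest of the file follows the general pattern from the Airflow docs for extending the official image. A minimal, hypothetical sketch (my actual Dockerfile differs):

```dockerfile
# The placeholder is substituted with the real version by the CI workflow.
FROM apache/airflow:AIRFLOW_VERSION_PLACEHOLDER

# Install additional Python dependencies on top of the official image.
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt
```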
Forgejo Actions
To use Forgejo Actions, we set up a directory `.forgejo/workflows` at the root of the repository. It may also be necessary to enable Actions for the repo: in the Forgejo UI, go to the repo, and under Settings - Units, you can enable `Actions`.

Now, in the `workflows` folder, create a new file called `build-airflow-image.yaml`. Mine looks as follows:
```yaml
name: build-airflow-image
on:
  push:
    tags:
      - airflow-*
jobs:
  build:
    runs-on: ubuntu-22.04
    steps:
      - name: Checkout the repo
        uses: actions/checkout@v4
      - name: Setup Docker Buildx
        uses: docker/setup-buildx-action@v1
      - name: Extract the correct tag
        id: extract_tag
        run: echo "::set-output name=tag::$(echo ${{ env.GITHUB_REF_NAME }} | grep -oP '(?<=airflow-v)[^/]+')"
      - name: Set the correct tag in Dockerfile
        run: sed -i 's/AIRFLOW_VERSION_PLACEHOLDER/${{ steps.extract_tag.outputs.tag }}/' airflow-image/Dockerfile
      - name: Login to Container Registry
        uses: docker/login-action@v3
        with:
          registry: harbor.tech-tales.blog
          username: robot$chris+airflow-pusher
          password: ${{ secrets.HARBOR_PASSWORD }}
      - name: Build and Push
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: harbor.tech-tales.blog/chris/airflow:${{ steps.extract_tag.outputs.tag }}
          context: airflow-image
          file: airflow-image/Dockerfile
```
What happens here:

- `name`: We just define a name here. This is mainly cosmetic; I have not yet found out where it is actually used.
- `on`: Here we define when this action should run. In my case, it runs every time I push a tag that starts with `airflow-`, for example `airflow-v2.10.4`.
- `jobs.build.steps`: Here the actual work is done.
  - `Checkout the repo` and `Setup Docker Buildx`: These are predefined actions that set up our working environment.
  - `Extract the correct tag`: This is a fancy way to extract the correct tag. First, consider the end of the line: `echo ${{ env.GITHUB_REF_NAME }} | grep -oP '(?<=airflow-v)[^/]+'`. This takes the Git ref name as input (for example `airflow-v2.10.4`) and greps it, which essentially drops the `airflow-v` prefix and keeps only the rest. Then, outside of that, we do an `echo` again: this outer one sets an output (`::set-output`) with the name `tag` and the value `2.10.4`. This is mainly just shell and regex magic. Note that I use this repo for a bit more than the image (for example, all my DAG files also reside here); that is why I need a compound tag that only partly consists of the actual Airflow version.
  - `Set the correct tag in Dockerfile`: Now we have to put the correct version into the Dockerfile. Remember, we had `FROM apache/airflow:AIRFLOW_VERSION_PLACEHOLDER` in there, and we want to change that to `FROM apache/airflow:2.10.4`. This is where the previous step comes in: we can use the variable `steps.extract_tag.outputs.tag`, which holds the value `2.10.4`. The rest is a typical `sed` command: again, a bit of command-line magic, but nothing too fancy.
  - `Login to Container Registry` and `Build and Push`: The last two steps build and push the image. I already have a Harbor instance running, which I would like to use. These steps are not too complicated: first, log in to the registry, defining a username (which I created in Harbor) and passing in its password (discussed below). Then run the `build-push-action`: define a working directory (the `context`) and a Dockerfile (the `file`), and the build is done automagically by the runner. Note that, again, I use the tag extracted earlier for tagging and pushing the image.
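The extract-and-substitute dance can be tried locally. Here is a small sketch that hard-codes the ref name and works on a throwaway copy of the Dockerfile; like the workflow, it relies on GNU `grep -P` and GNU `sed -i`:

```shell
#!/bin/sh
# Simulate the workflow's tag handling locally (ref name hard-coded for the demo).
GITHUB_REF_NAME="airflow-v2.10.4"

# Step "Extract the correct tag": drop the "airflow-v" prefix via a lookbehind.
TAG=$(echo "$GITHUB_REF_NAME" | grep -oP '(?<=airflow-v)[^/]+')
echo "$TAG"   # 2.10.4

# Step "Set the correct tag in Dockerfile": replace the placeholder in place.
printf 'FROM apache/airflow:AIRFLOW_VERSION_PLACEHOLDER\n' > Dockerfile.demo
sed -i "s/AIRFLOW_VERSION_PLACEHOLDER/$TAG/" Dockerfile.demo
cat Dockerfile.demo   # FROM apache/airflow:2.10.4
```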
Secrets
We need credentials for the Container Registry, as I do not allow anonymous pushes to my registry. Setting them up is done in two steps:
- In the Harbor user interface, go to the project you are going to push to (in my case this is `harbor.tech-tales.blog/chris`, so go to the project `chris`). There, find the `Robot Accounts` tab and create a new robot. Give it any name you like; I defined no expiration time, but set that to your liking.
  In the second part of the setup, Harbor asks what the account may do. I added the Repository Pull and Repository Push permissions: Push is obviously required, and Harbor does not allow pushing without also being able to pull.
  Now confirm, and you will receive a name and a password for the robot. In my case, the username is `robot$chris+airflow-pusher`, as you can see in the action file. The secret is just a password.
- Now go to your Forgejo interface. Open the repo we are working on, then go to Settings - Actions - Secrets. Add a new secret here, call it `HARBOR_PASSWORD`, and enter the password string as its value. Note that you will not be able to see this secret again, as Forgejo stores it encrypted.
Done!
Now, how I am using this setup: first, I track the Airflow releases. Every time there is a new release, I create a new tag on the repo called `airflow-vx.y.z` and push it to Forgejo, and the action automatically builds my new image, as expected. Finally, in a second step (which is manual at the moment), I update my Airflow installation.
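For illustration, publishing a hypothetical release of Airflow 2.10.4 boils down to two git commands (the version number here is just an example):

```shell
# A new upstream Airflow release has appeared (2.10.4 is a hypothetical example).
AIRFLOW_VERSION="2.10.4"
TAG="airflow-v${AIRFLOW_VERSION}"   # matches the workflow's "airflow-*" trigger
echo "$TAG"

# Creating and pushing the tag is what triggers the build-airflow-image workflow:
#   git tag "$TAG"
#   git push origin "$TAG"
```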