How to run Airflow in Docker (with a persistent database)

In this blog post, I am going to show you how to prepare a minimalist setup of the puckel/docker-airflow Docker image that runs a single DAG and stores the Airflow database persistently (so we do not lose the logs and run history when the Docker container restarts).

First, we have to pull the Docker image and prepare a minimal DAG configuration. The simplest possible DAG consists of a single DummyOperator. The content of the file is listed below:
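Pulling the image is a single command (docker run would also download it automatically on the first use):

docker pull puckel/docker-airflow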

from datetime import datetime
from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

# the DAG runs once a day at 12:00, starting from July 10th, 2019
dag = DAG('minimalist_dag', description='The simplest DAG',
          schedule_interval='0 12 * * *',
          start_date=datetime(2019, 7, 10))

# a single task that does nothing
dummy_operator = DummyOperator(task_id='dummy_task', retries=1, dag=dag)

I am going to save the code as minimalist.py in the /home/user/airflow/dags directory (you will need the full path to the directory where you saved the file).
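If the directory does not exist yet, create it first and copy the file there (the commands below assume that minimalist.py is in the current working directory):

mkdir -p /home/user/airflow/dags
cp minimalist.py /home/user/airflow/dags/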

To pass the DAG configuration to the Airflow instance, we need to map the local directory to a directory inside the Docker container using a volume, so we have to add this parameter to the docker run command:

-v /home/user/airflow/dags:/usr/local/airflow/dags

By default, the Airflow database is stored in the /usr/local/airflow/airflow.db file. I cannot map the whole /usr/local/airflow directory to a local directory, because that would break the setup (the Airflow configuration is also stored there, and I don’t want to override it).

To deal with that problem, I have to change the location of the database file. That can be done using the AIRFLOW__CORE__SQL_ALCHEMY_CONN environment variable.

I am going to change it to sqlite:////usr/local/airflow/db/airflow.db (note that it has to be a full SQLAlchemy connection string, and the four slashes denote an absolute path), using this parameter:

-e AIRFLOW__CORE__SQL_ALCHEMY_CONN=sqlite:////usr/local/airflow/db/airflow.db
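This works because environment variables in the AIRFLOW__SECTION__KEY format override the corresponding options in airflow.cfg, so I do not have to edit the configuration file inside the container at all. For reference, the variable above corresponds roughly to this entry in airflow.cfg:

[core]
sql_alchemy_conn = sqlite:////usr/local/airflow/db/airflow.db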

After that change, I map a local directory to the new database directory inside the container:

-v /home/user/airflow/db:/usr/local/airflow/db

Here is the full command that runs Airflow in Docker with custom DAGs and a persistent database:

docker run -d -p 8080:8080 \
    -e AIRFLOW__CORE__SQL_ALCHEMY_CONN=sqlite:////usr/local/airflow/db/airflow.db \
    -v /home/user/airflow/dags:/usr/local/airflow/dags \
    -v /home/user/airflow/db:/usr/local/airflow/db \
    puckel/docker-airflow webserver

Remember to change the /home/user/airflow/ part to the full path of your local directory (in both volume parameters).
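To verify that everything works, I can check that the DAG is visible inside the container and that the database file ends up in the local directory. The container id below is a placeholder, and the commands are only a sketch of how I check the setup:

docker ps                                     # find the container id
docker exec <container_id> airflow list_dags  # minimalist_dag should be on the list
ls /home/user/airflow/db                      # airflow.db now lives on the host

# after removing the container and running the same docker run command again,
# the DAG run history is still there
docker rm -f <container_id>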

