Adding a custom function to Airflow is quite simple. First, we have to define a function in Python, for example, this one:


def do_something_with_execution_date(execution_date):
    # Imagine that there is some useful code ;)
    ...
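As a concrete sketch (the body below is purely hypothetical; the original leaves it open), such a function could format the execution date for use in a SQL query:

```python
from datetime import datetime

def do_something_with_execution_date(execution_date):
    # Hypothetical body: format the execution date as a SQL-friendly string.
    return execution_date.strftime("%Y-%m-%d")

print(do_something_with_execution_date(datetime(2021, 3, 15)))  # 2021-03-15
```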

When the function is ready, we use the user_defined_macros parameter of the DAG object to pass a dictionary of custom functions:

dag = DAG(
    ...,
    user_defined_macros={
        'custom_function': do_something_with_execution_date,
    }
)

Now we can use the custom function in any place that supports Airflow templates. Of course, it works only in the DAGs that have access to the function.

{{ custom_function(execution_date) }};

Note that I can pass parameters to the function and rename it by using a different name as the dictionary key.
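Since Airflow templates are rendered with Jinja, we can imitate what happens at render time by merging the `user_defined_macros` dictionary into a Jinja context ourselves. The function body and the SQL query below are hypothetical, used only to make the rendering step visible:

```python
from datetime import datetime
from jinja2 import Template  # Airflow's templating is built on Jinja

def do_something_with_execution_date(execution_date):
    # Hypothetical body: format the execution date as a SQL-friendly string.
    return execution_date.strftime("%Y-%m-%d")

# Airflow merges user_defined_macros into the template context before
# rendering a templated field; the two lines below imitate that step.
user_defined_macros = {"custom_function": do_something_with_execution_date}
context = {**user_defined_macros, "execution_date": datetime(2021, 3, 15)}

rendered = Template(
    "SELECT * FROM events WHERE day = '{{ custom_function(execution_date) }}'"
).render(context)
print(rendered)  # SELECT * FROM events WHERE day = '2021-03-15'
```

Inside a real DAG, the same rendering happens automatically for every templated field of an operator, with `execution_date` supplied by the scheduler.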

