Adding a custom function to Airflow is quite simple. First, we have to define a function in Python, for example, this one:

def do_something_with_execution_date(execution_date):
    # Imagine that there is some useful code ;)
    ...

When the function is ready, we use the user_defined_macros parameter of the DAG object to pass a dictionary of custom functions:

dag = DAG(
    ...,
    user_defined_macros={
        'custom_function': do_something_with_execution_date,
    }
)

Now, we can use the custom function in any place that supports Airflow templates. Of course, only in the DAGs that define the macros in their user_defined_macros dictionary.

{{ custom_function(execution_date) }};
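Airflow renders template fields with Jinja, so conceptually the macro just becomes another name in the Jinja context. Here is a minimal sketch of that rendering using jinja2 directly (the execution_date string below is a stand-in for the real datetime object Airflow passes at runtime):

```python
from jinja2 import Template

def do_something_with_execution_date(execution_date):
    # Stand-in for the real logic; just echoes the date it received.
    return f"processed {execution_date}"

# Airflow merges user_defined_macros into the Jinja rendering context,
# so each dictionary key becomes a callable name inside the template.
user_defined_macros = {"custom_function": do_something_with_execution_date}

template = Template("{{ custom_function(execution_date) }}")
rendered = template.render(execution_date="2021-01-01", **user_defined_macros)
print(rendered)  # processed 2021-01-01
```

This is only an illustration of the mechanism; in a real DAG, Airflow performs the rendering for you when it resolves the operator's templated fields.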

Note that I can pass parameters to the function, and that I can expose it under a different name in templates by choosing a different dictionary key.
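The dictionary key, not the Python function name, is what templates see, and the macro accepts arguments like any ordinary function. A small plain-Python illustration of that mapping (the ds_info key and label parameter are hypothetical names chosen for this example):

```python
def do_something_with_execution_date(execution_date, label="run"):
    # Extra parameters work exactly like in any Python function call.
    return f"{label}: {execution_date}"

# The dictionary key ("ds_info") is the name available in templates,
# regardless of what the underlying Python function is called.
user_defined_macros = {"ds_info": do_something_with_execution_date}

# Equivalent to writing {{ ds_info(execution_date, label='backfill') }}
# in a templated field of the DAG that registered these macros.
result = user_defined_macros["ds_info"]("2021-01-01", label="backfill")
print(result)  # backfill: 2021-01-01
```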
