---
title: "Pass parameters to SQL query when using PostgresOperator in Airflow"
description: "How to pass parameters to SQL template when using PostgresOperator in Airflow"
author: "Bartosz Mikulski"
author_bio: "Principal AI Engineer & MLOps Architect. I bridge the gap between \"it works in a notebook\" and \"it works for 200 million users.\""
author_url: https://mikulskibartosz.name
author_linkedin: https://www.linkedin.com/in/mikulskibartosz/
author_github: https://github.com/mikulskibartosz
canonical_url: https://mikulskibartosz.name/pass-parameters-to-sql-query-when-using-postgresoperator-in-airflow
---

PostgresOperator allows us to use a SQL file as the query. However, when we do that, the standard way of passing template parameters no longer works.

For example, if I have the following SQL query:

```sql
{% raw %} SELECT column_a, column_b FROM table_name WHERE column_a = {{ some_value }} {% endraw %}
```

Airflow will not automatically pass the `some_value` variable as the parameter.
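To see why, here is a minimal sketch of the rendering step using plain Jinja2 (the templating engine Airflow uses under the hood). Because `some_value` is not in the render context, default Jinja2 substitutes an empty string; depending on configuration, Airflow may instead raise an error on undefined variables.

```python
from jinja2 import Template

# The query from the SQL file. `some_value` is not defined in the
# render context, so default Jinja2 replaces it with an empty string.
sql = "SELECT column_a, column_b FROM table_name WHERE column_a = {{ some_value }}"

rendered = Template(sql).render()
print(rendered)
# SELECT column_a, column_b FROM table_name WHERE column_a =
```

Either way, the value never reaches the query.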

In this situation, I have to use the `params` attribute of the PostgresOperator:

```python
from airflow.operators.postgres_operator import PostgresOperator

PostgresOperator(
    task_id='postgres_task',
    postgres_conn_id='postgress_sophi',
    params={"some_value": "'the value I want'"},
    sql='query.sql',
    dag=dag)
```

Note that I had to wrap the value in **single quotes**! In Postgres, double quotes are used to reference columns, tables, or functions. String literals, on the other hand, go inside single quotes.

In the `query.sql` file, I have to change the template variable to `params.some_value`:

```sql
{% raw %} SELECT column_a, column_b FROM table_name WHERE column_a = {{ params.some_value }} {% endraw %}
```
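The steps above can be checked with the same plain-Jinja2 sketch: the operator's `params` dict is exposed to the template under the name `params`, and the single quotes around the value end up in the final SQL.

```python
from jinja2 import Template

# The query from query.sql, rendered the way Airflow would render it:
# the `params` dict passed to the operator becomes the `params` variable
# in the template context.
sql = "SELECT column_a, column_b FROM table_name WHERE column_a = {{ params.some_value }}"

rendered = Template(sql).render(params={"some_value": "'the value I want'"})
print(rendered)
# SELECT column_a, column_b FROM table_name WHERE column_a = 'the value I want'
```

Because Jinja2 performs plain text substitution, the quotes must be part of the value itself; nothing escapes the string for us.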

