---
title: "How to perform a batch write to DynamoDB using boto3"
description: "How to write multiple DynamoDB objects at once using boto3"
author: "Bartosz Mikulski"
author_bio: "Principal AI Engineer & MLOps Architect. I bridge the gap between \"it works in a notebook\" and \"it works for 200 million users.\""
author_url: https://mikulskibartosz.name
author_linkedin: https://www.linkedin.com/in/mikulskibartosz/
author_github: https://github.com/mikulskibartosz
canonical_url: https://mikulskibartosz.name/batch-write-dynamodb-boto3
---

This article will show you how to store rows of a Pandas DataFrame in DynamoDB using the batch write operations.

First, we have to create a DynamoDB table resource:

```python
import boto3

# Uses credentials from the default chain (environment variables,
# ~/.aws/credentials, or an attached IAM role)
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('table_name')
```

When the connection handler is ready, we must create a batch writer using the `with` statement:

```python
with table.batch_writer() as batch:
    pass  # we will change that
```

Now, we can iterate over the rows of the Pandas DataFrame inside the `with` block:

```python
with table.batch_writer() as batch:
    for index, row in df.iterrows():
        pass  # to be changed
```

Inside the loop, we extract the fields we want to store in DynamoDB and put them in a dictionary:

```python
with table.batch_writer() as batch:
    for index, row in df.iterrows():
        content = {
            'field_A': row['A'],
            'field_B': row['B']
        }
        # there is still something missing
```

Finally, we call the `put_item` method to add the item to the batch:

```python
with table.batch_writer() as batch:
    for index, row in df.iterrows():
        content = {
            'field_A': row['A'],
            'field_B': row['B']
        }
        batch.put_item(Item=content)
```

The batch writer buffers the items and sends them to DynamoDB in batches of up to 25 (the `BatchWriteItem` limit), automatically resubmitting any unprocessed items. When our code exits the `with` block, it flushes whatever is still left in the buffer.
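One caveat worth knowing: DynamoDB does not accept Python `float` values, and boto3 expects numbers as `Decimal` instead. If your DataFrame contains floats, a small helper can convert each row before it is passed to `put_item`. This is a minimal sketch under that assumption; the function name is my own, not part of boto3:

```python
from decimal import Decimal


def to_dynamodb_item(row):
    """Convert a row (a dict or a Pandas Series) into a DynamoDB-safe dict.

    Floats are converted to Decimal via their string representation,
    which avoids the binary floating-point noise that
    Decimal(0.1) would otherwise carry over.
    """
    item = {}
    for key, value in row.items():
        if isinstance(value, float):
            item[key] = Decimal(str(value))
        else:
            item[key] = value
    return item


# Inside the loop above, we would then call:
#     batch.put_item(Item=to_dynamodb_item(row))
print(to_dynamodb_item({'field_A': 0.1, 'field_B': 'text'}))
```

Because `iterrows` yields each row as a Pandas Series, which supports `.items()` just like a dict, the same helper works unchanged in the loop above.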