
Export DynamoDB table to S3 with PITR #28830

@hyangminj

Description


Airflow already provides an Amazon DynamoDB to Amazon S3 transfer operator, documented here:
https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/operators/transfer/dynamodb_to_s3.html
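For context, a minimal sketch of how that operator is typically used today (table and bucket names are placeholders; exact parameter names should be checked against the linked provider docs). It scans the live table rather than using a PITR export:

```python
from airflow.providers.amazon.aws.transfers.dynamodb_to_s3 import DynamoDBToS3Operator

# Existing transfer: scans the DynamoDB table and writes the items to S3 as JSON files.
backup_table = DynamoDBToS3Operator(
    task_id="backup_ddb_table_to_s3",
    dynamodb_table_name="my_table",        # placeholder table name
    s3_bucket_name="my-export-bucket",     # placeholder bucket
    file_size=20 * 1024 * 1024,            # flush to S3 once the buffered file reaches ~20 MB
)
```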

Most data engineers build their "export DynamoDB data to S3" pipelines on DynamoDB's export "within the point in time recovery window" capability instead:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#DynamoDB.Client.export_table_to_point_in_time
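For illustration, a minimal boto3 sketch of that PITR export (the table ARN, bucket, and prefix are placeholders):

```python
from datetime import datetime, timedelta, timezone

import boto3

dynamodb = boto3.client("dynamodb")

# Export the table state as of one hour ago; any timestamp inside the PITR window works.
response = dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/my_table",  # placeholder ARN
    ExportTime=datetime.now(timezone.utc) - timedelta(hours=1),
    S3Bucket="my-export-bucket",   # placeholder bucket
    S3Prefix="exports/my_table/",
    ExportFormat="DYNAMODB_JSON",
)
export_arn = response["ExportDescription"]["ExportArn"]
```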

I would appreciate it if Airflow supported this natively in the Amazon provider.

Use case/motivation

My daily batch job exports its data with the PITR option. All of its tasks are built with apache-airflow-providers-amazon except the "export_table_to_point_in_time" task, which currently has to use a plain PythonOperator. I would like to unify the whole pipeline on the apache-airflow-providers-amazon library.

Related issues

No response

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

Code of Conduct
