Labels
area:providers, good first issue, kind:feature (Feature Requests), provider:amazon (AWS/Amazon - related issues)
Description
I am using Airflow to move data periodically to our data lake and noticed that the MySQLToS3Operator has templated fields while the DynamoDBToS3Operator doesn't. I found a semi-awkward workaround, but thought templated fields would be nice.
I suppose an implementation could be as simple as adding
template_fields = (
's3_bucket',
's3_key',
)
to the operator, similar to the MySQLToS3Operator.
Potentially one could also add a log call in execute(), as well as wrapping the NamedTemporaryFile in a with statement; a rough sketch of both is below.
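A minimal sketch of the proposed changes, assuming the operator stores the bucket and key on attributes named s3_bucket and s3_key as in the snippet above; the attribute names and execute() body in the actual provider code may differ.

from tempfile import NamedTemporaryFile

from airflow.models import BaseOperator


class DynamoDBToS3Operator(BaseOperator):
    # Render these constructor arguments with Jinja, as MySQLToS3Operator does.
    template_fields = (
        's3_bucket',
        's3_key',
    )

    def execute(self, context):
        # Log the destination before the transfer starts.
        self.log.info('Copying DynamoDB table to s3://%s/%s', self.s3_bucket, self.s3_key)

        # Using `with` ensures the temporary file is cleaned up even if the scan fails.
        with NamedTemporaryFile() as temp_file:
            ...  # scan the table, write to temp_file, then upload to S3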
Use case/motivation
At the moment MySQLToS3Operator and DynamoDBToS3Operator behave differently in terms of templating, and some of the code in MySQLToS3Operator seems more refined.
Related issues
No response
Are you willing to submit a PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct