Reputation: 416
I would like to unload data from a Redshift database to an S3 bucket, which will later be copied into another database. I have written my DAG as follows:
from airflow.operators import RedshiftToS3Transfer
from datetime import datetime, timedelta
from airflow import DAG

default_args = {
    'owner': 'me',
    'start_date': datetime.today(),
    'max_active_runs': 1,
}

dag = DAG(
    dag_id='redshift_S3',
    default_args=default_args,
    schedule_interval="@once",
    catchup=False,
)

unload_to_S3 = RedshiftToS3Transfer(
    task_id='unload_to_S3',
    schema='schema_name',
    table='table_name',
    s3_bucket='bucket_name',
    s3_key='s3_key',
    redshift_conn_id='redshift',
    aws_conn_id='my_s3_conn',
    dag=dag,
)
But I get the error "Broken DAG: cannot import name 'RedshiftToS3Transfer' from 'airflow.operators' (unknown location)". Any idea how to correctly import RedshiftToS3Transfer would be appreciated.
Upvotes: 2
Views: 2216
Reputation: 416
The correct way to import it is:
from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
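For completeness, here is the DAG from the question with only the import line changed. This is a sketch assuming Airflow 1.10.x, where the operator module is airflow.operators.redshift_to_s3_operator; the schema, table, bucket, key, and connection IDs are the placeholders from the question.

from datetime import datetime

from airflow import DAG
from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer

default_args = {
    'owner': 'me',
    'start_date': datetime.today(),
    'max_active_runs': 1,
}

dag = DAG(
    dag_id='redshift_S3',
    default_args=default_args,
    schedule_interval="@once",
    catchup=False,
)

# UNLOADs schema_name.table_name from Redshift to s3://bucket_name/s3_key,
# using the 'redshift' and 'my_s3_conn' connections from the question.
unload_to_S3 = RedshiftToS3Transfer(
    task_id='unload_to_S3',
    schema='schema_name',
    table='table_name',
    s3_bucket='bucket_name',
    s3_key='s3_key',
    redshift_conn_id='redshift',
    aws_conn_id='my_s3_conn',
    dag=dag,
)

Note that on Airflow 2.x the operator was moved into the Amazon provider package and renamed, so the import there should be from airflow.providers.amazon.aws.transfers.redshift_to_s3 import RedshiftToS3Operator.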
Upvotes: 7