Reputation: 237
What are the steps to migrate historical data from Teradata to Snowflake? Assume there is 200TB+ of historical data combined across all tables.
I am thinking of two approaches, but I don't have enough expertise or experience to execute them, so I am looking for someone to fill in the gaps and offer some suggestions.
Approach 1 - Using TPT/FEXP scripts
Approach 2
Which of these two is the recommended approach? In general, what challenges are faced with each approach?
Upvotes: 1
Views: 988
Reputation: 2069
Using ADF (Azure Data Factory) will be a much better solution, and it also lets you design a Data Lake as part of your solution. You can build a generic pipeline that imports all of the tables listed in a configuration. For this you can choose the recommended file format (Parquet), the size of those files, and the degree of parallel loading.
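On the Snowflake side, loading the Parquet files that ADF lands in the Data Lake usually comes down to an external stage plus a COPY INTO per table. A minimal sketch, assuming hypothetical names (mystorageaccount, the migration container, the customer table) and SAS-token authentication; swap in your own objects and credentials:

```sql
-- Minimal sketch; all object names, paths, and the SAS token are placeholders.
CREATE OR REPLACE FILE FORMAT td_parquet_ff TYPE = PARQUET;

-- External stage pointing at the folder where ADF writes the Parquet files
CREATE OR REPLACE STAGE td_migration_stage
  URL = 'azure://mystorageaccount.blob.core.windows.net/migration/teradata/'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
  FILE_FORMAT = td_parquet_ff;

-- One COPY per table; columns are matched against the Parquet schema by name
COPY INTO my_db.my_schema.customer
  FROM @td_migration_stage/customer/
  FILE_FORMAT = (FORMAT_NAME = td_parquet_ff)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```

Using MATCH_BY_COLUMN_NAME avoids hand-writing a column list for every table, which matters when the migration covers hundreds of Teradata tables driven from a single configuration.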
The main challenge you will encounter is probably the poorly working ADF connector for Snowflake; here you will find my recommendations on how to work around the connector problem and how to use Data Lake Gen2: Trouble loading data into Snowflake using Azure Data Factory
More recommendations on how to structure Azure Data Lake Storage Gen2 can be found here: Best practices for using Azure Data Lake Storage Gen2
Upvotes: 0