Ganesh Ik

Reputation: 1

How to load more than 5K records from a table to a data lake using ADF

I have a table with more than 5,000 records. I am migrating these directly to Cosmos DB using a data flow; we call this the initial/full load. As a next step, I have set up a pipeline that migrates the incremental data from the source table to Cosmos DB using APIs. The problem I am facing is that this pipeline fails whenever the incremental data has more than 5,000 records, because the Lookup activity in ADF is limited to 5,000 rows. Could you please help me load more than 5,000 records?

Upvotes: 0

Views: 247

Answers (1)

Naveen Sharma

Reputation: 1298

The Lookup activity has a limit of 5,000 rows. Instead of a Lookup, you can use the Copy activity directly to load the incremental records: its Cosmos DB sink provides an upsert option, which inserts a record if it does not exist in the destination and updates it otherwise, as per this document.
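As a rough sketch of this approach, a Copy activity with a Cosmos DB (SQL API) sink set to upsert might look like the JSON below. The dataset names, source query, and watermark parameter are placeholders for illustration; the key setting is `"writeBehavior": "upsert"` on the sink:

```json
{
  "name": "CopyIncrementalToCosmos",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": "SELECT * FROM dbo.SourceTable WHERE ModifiedDate > '@{pipeline().parameters.LastWatermark}'"
    },
    "sink": {
      "type": "CosmosDbSqlApiSink",
      "writeBehavior": "upsert"
    }
  },
  "inputs": [
    { "referenceName": "SourceTableDataset", "type": "DatasetReference" }
  ],
  "outputs": [
    { "referenceName": "CosmosDbDataset", "type": "DatasetReference" }
  ]
}
```

Because the Copy activity streams rows rather than materializing them in the pipeline like a Lookup does, it is not subject to the 5,000-row limit.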


Upvotes: 0
