Reputation: 905
1. Is using the Azure Cosmos DB change feed the right approach to migrate data from old containers to new partitioned containers?
2. Will the Azure Cosmos DB change feed include documents that have never been modified?
3. If both of the above are true, what are the steps involved in migrating data using the Azure Cosmos DB change feed?
Upvotes: 0
Views: 727
Reputation: 15603
You can certainly use the Change Feed to achieve all this. The Change Feed contains every insert and the latest version of every updated document, so documents that were never modified after creation do appear in it. Cosmos DB has several options to consume the Change Feed; one of them is Azure Functions.
If you go with the Functions option, you can quickly do a simple migration by combining the Cosmos DB Trigger (with StartFromBeginning set to true) and the Cosmos DB output binding, like so:
[FunctionName("Migration")]
public static async Task Run(
    [CosmosDBTrigger(
        databaseName: "your-source-database",
        collectionName: "your-source-collection",
        ConnectionStringSetting = "Connection-String-Setting-Name-For-Source-Account",
        StartFromBeginning = true,
        CreateLeaseCollectionIfNotExists = true
    )] IReadOnlyList<Document> source,
    [CosmosDB(
        databaseName: "your-destination-database",
        collectionName: "your-destination-collection",
        // In case your destination is on a different account; otherwise, it can be the same value as the Trigger's
        ConnectionStringSetting = "Connection-String-Setting-Name-For-Destination-Account"
    )] IAsyncCollector<Document> destination,
    ILogger log)
{
    // Write each batch of changed documents to the destination container
    foreach (var doc in source)
    {
        await destination.AddAsync(doc);
    }
}
There are some optimizations that can be applied for multi-master or geo-distributed scenarios (such as setting PreferredLocations if your Function is running in a different region than the account).
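As a rough sketch, the trigger attribute from the Function above could specify PreferredLocations like this (the region names are placeholders; the property takes a comma-separated list of regions, ordered by preference):

```csharp
[CosmosDBTrigger(
    databaseName: "your-source-database",
    collectionName: "your-source-collection",
    ConnectionStringSetting = "Connection-String-Setting-Name-For-Source-Account",
    StartFromBeginning = true,
    CreateLeaseCollectionIfNotExists = true,
    // Placeholder regions: list the region your Function runs in first,
    // so reads against the source account go to the closest replica
    PreferredLocations = "East US,West US"
)] IReadOnlyList<Document> source
```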
Keep in mind that the Function acts as a live migration: it will start by migrating the documents that existed before it was deployed, and it will then continue to migrate new documents for as long as it keeps running, if your source collection keeps receiving inserts/updates.
If what you need is a cold (one-time) migration, you could also use the Data Migration Tool.
Upvotes: 3