Reputation: 11
I'm currently trying to send data to an Azure DocumentDB collection from Python (using the pydocumentdb library). I have to send about 100,000 documents to this collection and it takes a very long time (about 2 hours).
I send each document one by one using:
for document in documents:
    client.CreateDocument(collection_link, document)
Am I doing something wrong? Is there a faster way to do it, or is it just normal that it takes so long?
Thanks!
Upvotes: 0
Views: 2034
Reputation: 24138
On Azure, there are several ways to import data into Cosmos DB that are faster than the PyDocumentDB API, which wraps the underlying REST APIs over HTTP.
First, prepare a JSON file containing your 100,000 documents, then follow one of the documents below to import the data.
1. How to import data into Azure Cosmos DB for the DocumentDB API? - to import a JSON data file via the DocumentDB Data Migration Tool.
2. Azure Cosmos DB: How to import MongoDB data? - to import a JSON data file via MongoDB's mongoimport tool.
3. Example: Copy data from Azure Blob to Azure Cosmos DB - see this for more details.

If you just want to import the data programmatically, you can use the Python MongoDB driver to connect to Azure Cosmos DB and import the data via the MongoDB wire protocol; please refer to the document Introduction to Azure Cosmos DB: API for MongoDB. A minimal sketch is shown below.
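If you go this route, a bulk insert with pymongo is usually much faster than one request per document. This is only a sketch and assumes your Cosmos DB account was created with the MongoDB API; the connection string, database name and collection name are placeholders to replace with your own values:

from pymongo import MongoClient

# Cosmos DB's MongoDB endpoint requires SSL and uses the account key as the password.
# <account-name> and <account-key> below are placeholders for your own account.
client = MongoClient(
    "mongodb://<account-name>:<account-key>@<account-name>.documents.azure.com:10255/?ssl=true"
)

db = client["mydb"]              # placeholder database name
collection = db["mycollection"]  # placeholder collection name

# insert_many sends the documents in batches rather than one request per document,
# which is typically much faster than calling CreateDocument in a loop.
collection.insert_many(documents)

The general idea is the same for any of the bulk options above: the fewer round trips per document, the shorter the total import time.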
Hope it helps.
Upvotes: 1