MAK

Reputation: 1298

Serializing an array inside Cosmos DB when using ADF

I have the following structure in my Cosmos DB document:

    {
        "Tag": "SPEEDSTER",
        "data": [
            {
                "timestamp": "2018-09-05T13:55:09.297Z",
                "jsonVersion": 1,
                "speed": 404
            }
        ]
    }

When I import the schema in the ADF copy pipeline, the array is not supported. Is there any way I can achieve this?

Upvotes: 1

Views: 533

Answers (2)

Jay Gong

Reputation: 23792

You need to use a SQL query in the copy activity to shape the schema of your Cosmos DB source data.


I tested the SQL below against your sample data, and the output was copied to a txt file in Blob Storage successfully. Note the `JOIN ... IN` clause, which is how Cosmos DB SQL unnests an array so that each element becomes its own row:

    select c.id, c.Tag, data.timestamp, data.jsonVersion, data.speed from c join data in c.data
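The flattening that such a query performs can be sketched in plain Python (the document below mirrors the sample in the question; the `id` value is hypothetical, since Cosmos DB assigns one per document):

```python
# A document shaped like the sample in the question ("id" is illustrative).
doc = {
    "id": "1",
    "Tag": "SPEEDSTER",
    "data": [
        {"timestamp": "2018-09-05T13:55:09.297Z", "jsonVersion": 1, "speed": 404}
    ],
}

def flatten(document):
    """Emit one flat row per element of the 'data' array,
    analogous to `JOIN data IN c.data` in Cosmos DB SQL."""
    for d in document["data"]:
        yield {
            "id": document["id"],
            "Tag": document["Tag"],
            "timestamp": d["timestamp"],
            "jsonVersion": d["jsonVersion"],
            "speed": d["speed"],
        }

rows = list(flatten(doc))
```

Each element of `data` yields one row that repeats the parent document's scalar fields, which is exactly the tabular shape the copy activity can map to a flat sink file.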

Output: each array element is flattened into its own row in the target file.

Upvotes: 1

Fang Liu

Reputation: 2363

What is your sink data store?

If you want to export the document as-is, for example to a JSON file, you can use the export-JSON-as-is feature. To achieve that, remove the structure from your dataset and the translator from your copy activity. See https://learn.microsoft.com/en-us/azure/data-factory/connector-azure-cosmos-db#importexport-json-documents
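As a rough illustration, a Cosmos DB source dataset for as-is export simply omits the `structure` array. This is a minimal sketch only; the `name` and `linkedServiceName` values are placeholders, and the exact `type` string depends on your ADF version, so check the linked connector documentation:

```json
{
    "name": "CosmosDbSourceDataset",
    "properties": {
        "type": "DocumentDbCollection",
        "linkedServiceName": {
            "referenceName": "CosmosDbLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "collectionName": "myCollection"
        }
    }
}
```

With no `structure` defined here and no `translator` in the copy activity, each document is written to the sink unchanged, arrays included.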

If you want to extract data from the array, you can write your own query to do the transformation.

Upvotes: 0
