Reputation: 322
I'm new to the Data Factory ecosystem. I'm trying to copy data from a source MySQL database to a sink Azure Cosmos DB for MongoDB. An example source schema that I have looks like this:
*inventory_warehouse_table*
--------------------------------------------------------
| id | warehouse_name | warehouse_address | lat | long |
--------------------------------------------------------
The sink schema is of the form:
*inventory_warehouse_collection*
{
  id: <int>,
  warehouse_name: <string>,
  warehouse_address: <string>,
  geo_coordinates: {
    type: "Point",
    coordinates: [lat, long]
  }
}
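The row-to-document mapping being asked about can be sketched in Python. This is illustrative only; the sample values are made up, and note that the GeoJSON spec expects coordinates in [longitude, latitude] order, while the schema above lists [lat, long]:

```python
def to_mongo_doc(row):
    """Map one flat MySQL row (a dict) to the nested sink document shape."""
    return {
        "id": row["id"],
        "warehouse_name": row["warehouse_name"],
        "warehouse_address": row["warehouse_address"],
        "geo_coordinates": {
            "type": "Point",
            # Order follows the schema in the question; GeoJSON convention
            # is [longitude, latitude], so verify before querying geospatially.
            "coordinates": [row["lat"], row["long"]],
        },
    }

# Hypothetical sample row
row = {"id": 1, "warehouse_name": "Central", "warehouse_address": "1 Main St",
       "lat": 12.34, "long": 56.78}
doc = to_mongo_doc(row)
```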
I don't see any schema mapping option in the Copy data activity for achieving this. How can I do this in Azure Data Factory? Do I need to create a separate pipeline for it?
Upvotes: 0
Views: 268
Reputation: 7136
[Image 2: Sink Dataset]
All columns except the last one map directly. Because the last column is built by concatenating the lat and long columns, a Data Flow with a derived column is used to combine them. Below is the expression used in the derived column:
concat("[",toString(lat),",",toString(long),"]")
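To see what that expression produces, here is a Python sketch of the same string concatenation (sample values are made up). Note the result is a single string such as "[12.34,56.78]", not a native array; if the sink needs a true GeoJSON sub-document, the derived column would need to build a nested structure instead:

```python
def coordinates_string(lat, long):
    """Mimic concat("[", toString(lat), ",", toString(long), "]")."""
    return "[" + str(lat) + "," + str(long) + "]"

result = coordinates_string(12.34, 56.78)  # a string, not a list
```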
Upvotes: 1