Reputation: 575
I'm trying to extract data with Azure Data Factory from Oracle to Parquet in ADLS. The problem: whatever I try, an Oracle datatype decimal(p,s) always ends up as decimal(38,18) in my Parquet file.
I've tried dynamically mapping the column in the Mapping tab in Azure Data Factory, but the datatype in the Parquet file is still decimal(38,18).
I've tried with an MSSQL Server as source and that works fine.
This is what my mapping looks like in ADF (but I've tried different options/values):
"translator": {
"type": "TabularTranslator",
"mappings": [
{
"source": {
"name": "COLUMN",
"type": "Decimal",
"physicalType": "decimal",
"scale": 1,
"precision": 4
},
"sink": {
"name": "COLUMN",
"type": "Decimal",
"physicalType": "DECIMAL",
"scale": 1,
"precision": 4
}
}
],
"typeConversion": true,
"typeConversionSettings": {
"allowDataTruncation": false,
"treatBooleanAsNumber": false
}
}
Does anyone have a fix for this?
Upvotes: 1
Views: 1363
Reputation: 5074
You can convert the decimal precision and scale with a Derived Column transformation in an Azure Data Factory Data Flow activity.
In my example, I add an expression that casts the salary column to precision 7 and scale 1.
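A minimal sketch of that Derived Column expression, using the salary column from the example (adjust the precision and scale to match your data):

```
toDecimal(salary, 7, 1)
```

`toDecimal` is the Data Flow expression function that casts a value to a decimal of the given precision and scale, so the sink no longer falls back to the default decimal(38,18).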
Sink preview:
Upvotes: 0