Reputation: 31
I'm looking for a way to update fields in Salesforce from an Azure Data Factory pipeline. I have a standard copy pipeline from Salesforce to SQL Server, then I execute a stored procedure on the downloaded data, and after that I need to push the fields the procedure changed back to Salesforce. Right now I have no idea how to approach this.
Upvotes: 0
Views: 2655
Reputation: 99
Azure Data Factory supports Salesforce as a source as well as a sink. To copy data to Salesforce, you have to use Salesforce as the sink type.
I don't have much knowledge of Azure, but this link will help you set up Salesforce as a sink.
The tricky part is setting externalIdFieldName. If you already have a field in Salesforce containing unique values (other than the Id field), edit that field and check External ID on it. If no unique field is available, you need to create one in Salesforce. (This step is not needed if you map the Salesforce Id field to a database column that already holds the Salesforce Id values; you can first try without externalIdFieldName and only add it if the update fails.)
The sink part of your copy activity would then be configured like this:
"sink": {
"type": "SalesforceSink",
"writeBehavior": "Update",
"externalIdFieldName": "pipelinecol__c",
"writeBatchSize": 10000,
"ignoreNullValues": true
}
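For context, here is a minimal sketch of how that sink could sit inside a full copy activity. The dataset names (SqlResultsDataset, SalesforceDataset), the SQL query, and the column names are placeholders for whatever you have defined; also, depending on your linked service the source type may be SqlServerSource or AzureSqlSource instead of SqlSource:

{
    "name": "CopyChangesToSalesforce",
    "type": "Copy",
    "inputs": [ { "referenceName": "SqlResultsDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SalesforceDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "SqlSource",
            "sqlReaderQuery": "SELECT pipelinecol__c, FieldToUpdate__c FROM dbo.ChangedRows"
        },
        "sink": {
            "type": "SalesforceSink",
            "writeBehavior": "Update",
            "externalIdFieldName": "pipelinecol__c",
            "writeBatchSize": 10000,
            "ignoreNullValues": true
        }
    }
}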
Upvotes: 1
Reputation: 2363
You could chain your copy activity with a stored procedure activity, and then add a second copy activity that writes the changed rows back to Salesforce. The ADF V2 UI can help you build the pipeline; a sketch of the resulting JSON is below.
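As a rough sketch of what that chaining could look like in pipeline JSON, assuming hypothetical dataset, linked service, and stored procedure names:

{
    "name": "SalesforceRoundTrip",
    "properties": {
        "activities": [
            {
                "name": "CopySalesforceToSql",
                "type": "Copy",
                "inputs": [ { "referenceName": "SalesforceDataset", "type": "DatasetReference" } ],
                "outputs": [ { "referenceName": "SqlStagingDataset", "type": "DatasetReference" } ],
                "typeProperties": {
                    "source": { "type": "SalesforceSource" },
                    "sink": { "type": "SqlSink" }
                }
            },
            {
                "name": "RunTransformProc",
                "type": "SqlServerStoredProcedure",
                "dependsOn": [
                    { "activity": "CopySalesforceToSql", "dependencyConditions": [ "Succeeded" ] }
                ],
                "linkedServiceName": { "referenceName": "SqlLinkedService", "type": "LinkedServiceReference" },
                "typeProperties": { "storedProcedureName": "dbo.TransformDownloadedData" }
            },
            {
                "name": "CopyChangesToSalesforce",
                "type": "Copy",
                "dependsOn": [
                    { "activity": "RunTransformProc", "dependencyConditions": [ "Succeeded" ] }
                ],
                "inputs": [ { "referenceName": "SqlStagingDataset", "type": "DatasetReference" } ],
                "outputs": [ { "referenceName": "SalesforceDataset", "type": "DatasetReference" } ],
                "typeProperties": {
                    "source": { "type": "SqlSource" },
                    "sink": { "type": "SalesforceSink", "writeBehavior": "Update" }
                }
            }
        ]
    }
}

Each activity's dependsOn with the "Succeeded" condition is what makes the three steps run in order.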
Upvotes: 0