Error while copying data from Salesforce using Azure Data Factory with OAuth 2.0 Client Credentials authentication
We are implementing new ADF-to-Salesforce connectivity changes that use OAuth 2.0 Client Credentials authentication. Here is the current linked service definition:
{
    "name": "LinkedServiceSalesforce",
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "annotations": [],
        "type": "SalesforceV2",
        "typeProperties": {
            "environmentUrl": "https://mydomain.sandbox.my.salesforce.com",
            "clientId": "3MJadb",
            "apiVersion": "58.0",
            "encryptedCredential": "xxxxxxxxxxxx"
        }
    }
}
For normal Salesforce objects such as Project and Account it works fine, but when we try to fetch objects that contain blob fields, such as ContentVersion or ProjectAttachments, we get the following error:
Failure happened on 'Source' side. ErrorCode=SalesforceHttpResponseNotSuccessCodeException,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The API request to Salesforce failed. Request Url: https://mydomain.my.salesforce.com/services/data/v58.0/jobs/query, Status Code: BadRequest, Error message: [{"errorCode":"API_ERROR","message":"Blob field not supported in Bulk V2 Query with CSV content type"}],Source=Microsoft.Connectors.Salesforce,'
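For context, the same failure can apparently be reproduced outside ADF by creating a Bulk API 2.0 query job directly against the endpoint shown in the error. The request below is only a minimal sketch (the domain, API version, and field list mirror our setup, and the access token is a placeholder); as far as we understand, the job is rejected whenever the SOQL includes a base64/blob field such as VersionData:

POST https://mydomain.my.salesforce.com/services/data/v58.0/jobs/query
Content-Type: application/json
Authorization: Bearer <access_token>

{
    "operation": "query",
    "query": "SELECT Id, Title, VersionData FROM ContentVersion",
    "contentType": "CSV",
    "columnDelimiter": "COMMA",
    "lineEnding": "LF"
}

Removing VersionData from the SELECT list appears to be the only way to get such a job accepted, which matches the API_ERROR text above.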
We are trying to copy ContentVersion object data to a CSV file; the pipeline is as follows:
{
    "name": "pipe_17_ContentVersion",
    "properties": {
        "activities": [
            {
                "name": "Create Content version Intermadiate Copy",
                "type": "Copy",
                "dependsOn": [],
                "policy": {
                    "timeout": "0.12:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "userProperties": [],
                "typeProperties": {
                    "source": {
                        "type": "SalesforceV2Source",
                        "SOQLQuery": "SELECT Id\n,ContentDocumentId\n,IsLatest\n,ContentUrl\n,ContentBodyId\n,VersionNumber\n,Title\n,Description\n,VersionData\n,ContentSize\n,FileExtension\nFROM\nContentVersion\nWHERE\nFile_Tag_Name__c IN ('Inspection QAQC','Inspection Safety')"
                    },
                    "sink": {
                        "type": "DelimitedTextSink",
                        "storeSettings": {
                            "type": "AzureBlobStorageWriteSettings",
                            "copyBehavior": "FlattenHierarchy"
                        },
                        "formatSettings": {
                            "type": "DelimitedTextWriteSettings",
                            "quoteAllText": true,
                            "fileExtension": ".txt"
                        }
                    },
                    "enableStaging": false,
                    "enableSkipIncompatibleRow": true,
                    "logSettings": {
                        "enableCopyActivityLog": true,
                        "copyActivityLogSettings": {
                            "logLevel": "Warning",
                            "enableReliableLogging": false
                        },
                        "logLocationSettings": {
                            "linkedServiceName": {
                                "referenceName": "BlobStorageLink",
                                "type": "LinkedServiceReference"
                            },
                            "path": "log-activity/attachment/"
                        }
                    },
                    "translator": {
                        "type": "TabularTranslator",
                        "typeConversion": true,
                        "typeConversionSettings": {
                            "allowDataTruncation": true,
                            "treatBooleanAsNumber": false
                        }
                    }
                },
                "inputs": [
                    {
                        "referenceName": "ds_SF_ContentVersion",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "ds_ContentVersion",
                        "type": "DatasetReference"
                    }
                ]
            }
        ],
        "folder": {
            "name": "Attachments"
        },
        "annotations": [],
        "lastPublishTime": "2024-02-08T13:06:47Z"
    },
    "type": "Microsoft.DataFactory/factories/pipelines"
}
We have already granted all possible permissions to the Salesforce connected app, but it seems that the Bulk API 2.0 jobs/query endpoint simply does not support returning blob fields in a CSV response.
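For reference, the source block below is only a sketch of the same query with the blob field (VersionData) removed; based on the error text, this appears to be the only shape of ContentVersion query that Bulk API 2.0 will accept, though it would leave the file content itself to be downloaded separately per record (for example via the REST resource /services/data/v58.0/sobjects/ContentVersion/{Id}/VersionData, which is outside the SalesforceV2 copy source):

"source": {
    "type": "SalesforceV2Source",
    "SOQLQuery": "SELECT Id, ContentDocumentId, IsLatest, ContentUrl, ContentBodyId, VersionNumber, Title, Description, ContentSize, FileExtension FROM ContentVersion WHERE File_Tag_Name__c IN ('Inspection QAQC','Inspection Safety')"
}

Is there a supported way to copy the VersionData blob itself through the SalesforceV2 connector, or is a separate REST-based download the expected approach?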