We have a Data Factory pipeline that reads a CSV file from a source location, transforms the data through three Data Flows, and writes it to an Azure SQL database.
The process was initially built against a static copy of the CSV in Blob Storage, since its long-term source was not yet available.
We now have access to the same file, but live via SFTP. We've created a new Dataset specifying all the SFTP connection details, including the passphrase and key file. The file can be previewed successfully through the Preview Data option.
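For reference, the Dataset is a delimited-text dataset sitting on top of an SFTP linked service, roughly along the lines sketched below. The linked service name, host, user, key content and dataset name shown here are placeholders; the folder and file names are the ones that appear in the error path further down.

{
  "name": "SftpLinkedService",
  "properties": {
    "type": "Sftp",
    "typeProperties": {
      "host": "<sftp host>",
      "port": 22,
      "authenticationType": "SshPublicKey",
      "userName": "<username>",
      "privateKeyContent": { "type": "SecureString", "value": "<base64 key file content>" },
      "passPhrase": { "type": "SecureString", "value": "<passphrase>" }
    }
  }
}

{
  "name": "TI_ALRM_Export_SFTP",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "SftpLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "location": {
        "type": "SftpLocation",
        "folderPath": "saas-out/CES/Assets/ALRM",
        "fileName": "TI_ALRM_Export.csv"
      },
      "columnDelimiter": ",",
      "firstRowAsHeader": true
    }
  }
}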
SFTP access is controlled by an IP whitelist, and all of the Data Factory - AustraliaEast IPs have been added.
We've switched the Data Flow source from the old Blob dataset to the new SFTP dataset and run the pipeline.
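The only change in each Data Flow is the source's dataset reference, which now points at the SFTP dataset instead of the Blob one. A trimmed sketch of the relevant part of the Data Flow definition is below; the Data Flow name is a placeholder, the source name is the one reported in the error, and the sinks and transformations are omitted as they are unchanged.

{
  "name": "AddNewAssetsDataFlow",
  "properties": {
    "type": "MappingDataFlow",
    "typeProperties": {
      "sources": [
        {
          "name": "TIALRMExport",
          "dataset": {
            "referenceName": "TI_ALRM_Export_SFTP",
            "type": "DatasetReference"
          }
        }
      ]
    }
  }
}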
We get the following error message:
Operation on target AddNewAssets failed: {"StatusCode":"DF-Executor-InvalidPath","Message":"Job failed due to reason: at Source 'TIALRMExport': Path /saas-out/CES/Assets/ALRM/TI_ALRM_Export.csv does not resolve to any file(s). Please make sure the file/folder exists and is not hidden. At the same time, please ensure special character is not included in file/folder name, for example, name starting with _","Details":""}
Drilling in further, the pipeline fails at the very first step: reading the source Dataset in the first Data Flow.
The Dataset can be previewed, so we don't understand why the file apparently doesn't exist when the pipeline actually runs.
Is it down to a special character in the file or folder name, as the error message suggests? Would that explain why the data can be previewed but the pipeline run fails?
Thanks for any guidance given
Paul