Hi,
I'm trying to move a ~3 GB .bak file from an FTP server to Azure Blob Storage using SFTP. The transfer should happen automatically once a day, since a new .bak file with the same name is placed on the FTP server every night.
So far I have looked into Logic Apps, Azure Functions, and Azure Data Factory.
I have already managed to do the transfer with Azure Data Factory! BUT the file is stored as a Block Blob in the container, and it HAS TO BE a Page Blob, otherwise the SQL Server on my virtual machine cannot restore a database from it.
So: how can I tell Azure Data Factory to store the file as a Page Blob? Here is my current pipeline script (note the pageBlob setting I tried in the sink's storeSettings):
{
    "name": "pipeline1copy",
    "properties": {
        "activities": [
            {
                "name": "Copy data1",
                "type": "Copy",
                "dependsOn": [],
                "policy": {
                    "timeout": "0.12:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "userProperties": [],
                "typeProperties": {
                    "source": {
                        "type": "BinarySource",
                        "storeSettings": {
                            "type": "SftpReadSettings",
                            "recursive": true,
                            "disableChunking": false
                        },
                        "formatSettings": {
                            "type": "BinaryReadSettings"
                        }
                    },
                    "sink": {
                        "type": "BinarySink",
                        "storeSettings": {
                            "type": "AzureBlobStorageWriteSettings",
                            "pageBlob": true
                        }
                    },
                    "enableStaging": false
                },
                "inputs": [
                    {
                        "referenceName": "Binary1",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "Binary2Sink",
                        "type": "DatasetReference"
                    }
                ]
            }
        ],
        "annotations": []
    }
}
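For reference, the blob type can be double-checked outside the portal with a few lines of Python; a minimal sketch assuming the azure-storage-blob v12 SDK, with a placeholder connection string:

from azure.storage.blob import BlobServiceClient

# Placeholder connection string; the real one comes from the storage account keys.
service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="backup", blob="XXXX.bak")

# blob_type comes back as BlockBlob, PageBlob or AppendBlob.
print(blob.get_blob_properties().blob_type)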
Also, here is the JSON for the blob sink dataset:
{
    "name": "Binary2Sink",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureBlobStorage1backupexcelinegeneralp",
            "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "Binary",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "fileName": "XXXX.bak",
                "container": "backup"
            },
            "blobType": "PageBlob"
        }
    }
}
Neither the pageBlob setting in the sink's storeSettings nor the blobType property on the dataset has any effect; the copied file always ends up as a Block Blob.
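If ADF simply cannot write Page Blobs, my fallback idea is a small Azure Function (or script) that re-uploads the copied file as a Page Blob after each run. This is an untested sketch, again assuming the azure-storage-blob v12 SDK; the helper name and the zero-padding at the end are my own assumptions. Page Blobs must be sized and written in 512-byte units, and a single Put Page call is capped at 4 MB:

from azure.storage.blob import BlobServiceClient

PAGE = 512                 # page blob sizes and writes must be 512-byte aligned
CHUNK = 4 * 1024 * 1024    # Put Page accepts at most 4 MB per call

def copy_as_page_blob(conn_str, container, src_name, dst_name):
    # Cap the download chunk size at 4 MB so every chunk fits in one Put Page call.
    service = BlobServiceClient.from_connection_string(
        conn_str, max_single_get_size=CHUNK, max_chunk_get_size=CHUNK
    )
    src = service.get_blob_client(container, src_name)
    dst = service.get_blob_client(container, dst_name)

    # Round the total size up to the next 512-byte boundary.
    size = src.get_blob_properties().size
    dst.create_page_blob(size=(size + PAGE - 1) // PAGE * PAGE)

    offset = 0
    for chunk in src.download_blob().chunks():
        if len(chunk) % PAGE:                          # zero-pad the last chunk
            chunk += b"\x00" * (PAGE - len(chunk) % PAGE)
        dst.upload_pages(chunk, offset=offset, length=len(chunk))
        offset += len(chunk)

I'm assuming RESTORE doesn't mind the trailing zero padding (backups written by SQL Server's own backup-to-URL are 512-aligned Page Blobs anyway), but I haven't verified that. azcopy can also upload with --blob-type PageBlob, but I'd prefer to keep everything in one ADF pipeline, hence the question.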
Thanks in advance!