I've been trying to load data from an Azure Blob (blob type: append blob) using a SAS token into SQL Server (Developer Edition, version 14, on-premises), working in Azure Data Studio.
I followed the example given here: https://channel9.msdn.com/Shows/Data-Exposed/Load-data-from-Azure-Blob-storage-into-Azure-SQL
setting up the schema, a master key, a database scoped credential ("AccessBlobTlsLog"), and the external data source ("dataset").
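For reference, my setup statements looked roughly like this (the password, storage account, container, and SAS token below are placeholders, not the real values):

```sql
-- Master key (required before creating a database scoped credential)
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

-- Credential holding the SAS token (the SECRET should not include the leading '?')
CREATE DATABASE SCOPED CREDENTIAL AccessBlobTlsLog
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token>';

-- External data source pointing at the container (no trailing '/')
CREATE EXTERNAL DATA SOURCE dataset
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',
    CREDENTIAL = AccessBlobTlsLog
);
```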
Finally, I attempted to import the data with the script below:
SET NOCOUNT ON; -- suppress the "rows affected" messages to reduce network traffic
BULK INSERT TlsViolations.csp_reports -- the destination table
FROM 'csp.report.json' -- path to the blob file, relative to the data source container
WITH (
    DATA_SOURCE = 'dataset' -- the external data source from step 4
    , BATCHSIZE = 2048 -- the default size of a block of data in the blob
    , TABLOCK -- minimize the number of log records for the insert operation
);
After a little over 4 minutes I get the following error:
Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "csp.report.json" could not be opened. Operating system error code 12150(failed to retrieve text for this error. Reason: 15105).
Total execution time: 00:04:21.561
I also tried this:
SELECT VALUE
FROM OPENROWSET (BULK 'csp.report.json', DATA_SOURCE = 'dataset', SINGLE_CLOB) as json
CROSS APPLY OPENJSON(BulkColumn)
and get the same error.
I can download the file from Azure using the SAS URL for the blob, so I know the SAS itself is valid. Does anyone have an idea of what I'm doing wrong?
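This is how I verified the SAS outside of SQL Server (the URL below is a placeholder for my actual blob URL with the SAS query string appended):

```shell
# Fetch the blob directly over HTTPS using the SAS URL; -f makes curl fail on an HTTP error
curl -f -o csp.report.json "https://<storageaccount>.blob.core.windows.net/<container>/csp.report.json?<SAS token>"
```

This download succeeds, which is why I believe the problem is on the SQL Server side rather than with the token itself.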