I am using Azure SQL Edge (mcr.microsoft.com/azure-sql-edge:latest) together with Azurite (mcr.microsoft.com/azure-storage/azurite:latest) for local development via Docker Compose.
# docker-compose.yml
---
services:
  # Azure SQL Edge
  sqlserver:
    image: mcr.microsoft.com/azure-sql-edge:latest
    hostname: sqlserver
    cap_add:
      - SYS_PTRACE
    init: true
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "${MSSQL_SA_PASSWORD}"
      MSSQL_PID: Developer
    ports:
      - 1433:1433   # Azure SQL Edge
      - 10000:10000 # Azurite blob service (shared network namespace)
    volumes:
      - db_data:/var/opt/mssql
  # Azurite
  blobstore:
    image: mcr.microsoft.com/azure-storage/azurite:latest
    command: >
      azurite-blob
      --blobHost 0.0.0.0
      --blobPort 10000
      --location /data
      --disableProductStyleUrl
      --debug /tmp/debug.log
    # NOTE: joins the same network namespace as the sqlserver container, so it is reachable via localhost
    network_mode: service:sqlserver
    volumes:
      - blob_data:/data  # matches the --location above

volumes:
  db_data: {}
  blob_data: {}
Note: The Azurite container is configured to share the network namespace of the Azure SQL Edge container (network_mode: service:sqlserver), so it is reachable via localhost from the SQL server. I've also explicitly set the Azure SQL Edge container's hostname to sqlserver (to resolve connection issues when connecting from the host machine).
I have created a public blob container called foo in Azurite Blob Storage and defined external data sources in Azure SQL Edge as follows.
create external data source test_with_ip_address
with (
type = blob_storage,
location = 'http://127.0.0.1:10000/devstoreaccount1/foo'
);
go
create external data source test_with_localhost
with (
type = blob_storage,
location = 'http://localhost:10000/devstoreaccount1/foo'
);
go
create external data source test_with_container_hostname
with (
type = blob_storage,
location = 'http://sqlserver:10000/devstoreaccount1/foo'
);
go
create external data source test_with_invalid_location
with (
type = blob_storage,
location = 'this is not a valid location!'
);
go
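For reference, the registered data sources and their locations can be inspected with the standard sys.external_data_sources catalog view:
select name, location from sys.external_data_sources;
go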
When I issue a BULK INSERT using any of these external data sources, I receive the same Bad or inaccessible location specified in external data source "<external_data_source_name>" error.
create table test ( id int );
go
bulk insert test
from 'blob-of-id-values'
with ( data_source = 'test_with_ip_address' );
go
bulk insert test
from 'blob-of-id-values'
with ( data_source = 'test_with_localhost' );
go
bulk insert test
from 'blob-of-id-values'
with ( data_source = 'test_with_container_hostname' );
go
bulk insert test
from 'blob-of-id-values'
with ( data_source = 'test_with_invalid_location' );
go
I've also tried using a private blob container together with a SAS credential, but I get the same error.
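For completeness, the SAS-based attempt looked roughly like the following sketch (the credential and data source names here are illustrative, and the master key password and SAS token are placeholders; the secret is the SAS token without the leading '?'):
-- a database master key is required before creating a database scoped credential
create master key encryption by password = '<placeholder-strong-password>';
go
-- credential holding the SAS token for the private container
create database scoped credential azurite_sas_credential
with
    identity = 'SHARED ACCESS SIGNATURE',
    secret = '<sas-token-without-leading-question-mark>';
go
-- external data source pointing at the private container, using the credential
create external data source test_with_sas
with (
    type = blob_storage,
    location = 'http://127.0.0.1:10000/devstoreaccount1/foo',
    credential = azurite_sas_credential
);
go
bulk insert test
from 'blob-of-id-values'
with ( data_source = 'test_with_sas' );
go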
Any help is appreciated!