Machine Learning datastores do not create the underlying storage account resources. Instead, they link an existing storage account for Machine Learning use. Machine Learning datastores aren't required. If you have access to the underlying data, you can use storage URIs directly.
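For example, a job input can reference a storage URI directly instead of a datastore path. The following is a minimal sketch; the account, container, path, and datastore names are placeholders:

from azure.ai.ml import Input

# Direct storage URI; no datastore is needed (placeholder account/container/path).
data_direct = Input(
    type="uri_file",
    path="https://<account>.blob.core.windows.net/<container>/path/file.csv",
)

# Equivalent reference through a registered datastore named "my_blob_ds".
data_via_datastore = Input(
    type="uri_file",
    path="azureml://datastores/my_blob_ds/paths/path/file.csv",
)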
Create an Azure Blob datastore
The following example uses the Python SDK to create a credential-less Azure Blob datastore. Fill in your own values before you run it:

from azure.ai.ml.entities import AzureBlobDatastore
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureBlobDatastore(
    name="",  # the datastore name
    description="",  # a description of the datastore
    account_name="",  # the storage account name
    container_name="",  # the blob container name
)

ml_client.create_or_update(store)
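After creation, you can retrieve the datastore by name to confirm that it registered. This is a minimal sketch; "<datastore-name>" stands for whatever name you supplied above:

from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

# Fetch the registered datastore and print its basic properties.
retrieved = ml_client.datastores.get("<datastore-name>")
print(retrieved.name, retrieved.type)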
To authenticate with an account key instead, supply AccountKeyConfiguration credentials:

from azure.ai.ml.entities import AzureBlobDatastore
from azure.ai.ml.entities import AccountKeyConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureBlobDatastore(
    name="blob_protocol_example",
    description="Datastore pointing to a blob container using https protocol.",
    account_name="mytestblobstore",
    container_name="data-container",
    protocol="https",
    credentials=AccountKeyConfiguration(
        account_key="XXXxxxXXXxXXXXxxXXXXXxXXXXXxXxxXxXXXxXXXxXXxxxXXxxXXXxXxXXXxxXxxXXXXxxxxxXXxxxxxxXXXxXXX"
    ),
)

ml_client.create_or_update(store)
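Rather than hardcoding the account key as shown above, you can read it from the environment at run time. This is a sketch; the STORAGE_ACCOUNT_KEY variable name is an assumption, so use whatever your secret management provides:

import os
from azure.ai.ml.entities import AccountKeyConfiguration

# Hypothetical environment variable; populate it from your secret store.
credentials = AccountKeyConfiguration(account_key=os.environ["STORAGE_ACCOUNT_KEY"])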
To authenticate with a SAS token:

from azure.ai.ml.entities import AzureBlobDatastore
from azure.ai.ml.entities import SasTokenConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureBlobDatastore(
    name="blob_sas_example",
    description="Datastore pointing to a blob container using SAS token.",
    account_name="mytestblobstore",
    container_name="data-container",
    credentials=SasTokenConfiguration(
        sas_token="?xx=XXXX-XX-XX&xx=xxxx&xxx=xxx&xx=xxxxxxxxxxx&xx=XXXX-XX-XXXXX:XX:XXX&xx=XXXX-XX-XXXXX:XX:XXX&xxx=xxxxx&xxx=XXxXXXxxxxxXXXXXXXxXxxxXXXXXxxXXXXXxXXXXxXXXxXXxXX"
    ),
)

ml_client.create_or_update(store)
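If you need to mint a container-scoped SAS token, the azure-storage-blob package can generate one. This is a sketch that assumes you hold the account key; note that the generated token has no leading "?":

from datetime import datetime, timedelta, timezone
from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Generate a read/list SAS token for the container, valid for 30 days.
sas_token = generate_container_sas(
    account_name="mytestblobstore",
    container_name="data-container",
    account_key="<account-key>",  # placeholder for the real account key
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=30),
)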
To create the same datastore with the Azure CLI, create the following YAML file (update the appropriate values):
# my_blob_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureBlob.schema.json
name: my_blob_ds # add your datastore name here
type: azure_blob
description: here is a description # add a datastore description here
account_name: my_account_name # add the storage account name here
container_name: my_container_name # add the storage container name here
Create the Machine Learning datastore in the Azure CLI:
az ml datastore create --file my_blob_datastore.yml
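To confirm that the datastore registered, show it by name:

az ml datastore show --name my_blob_ds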
Create an Azure Data Lake Storage Gen2 datastore
The following example uses the Python SDK to create a credential-less Azure Data Lake Storage Gen2 datastore. Fill in your own values before you run it:

from azure.ai.ml.entities import AzureDataLakeGen2Datastore
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureDataLakeGen2Datastore(
    name="",  # the datastore name
    description="",  # a description of the datastore
    account_name="",  # the storage account name
    filesystem="",  # the Data Lake Gen2 filesystem (container) name
)

ml_client.create_or_update(store)
To authenticate with a service principal:

from azure.ai.ml.entities import AzureDataLakeGen2Datastore
from azure.ai.ml.entities import ServicePrincipalConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureDataLakeGen2Datastore(
    name="adls_gen2_example",
    description="Datastore pointing to an Azure Data Lake Storage Gen2.",
    account_name="mytestdatalakegen2",
    filesystem="my-gen2-container",
    credentials=ServicePrincipalConfiguration(
        tenant_id="XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX",
        client_id="XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX",
        client_secret="XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    ),
)

ml_client.create_or_update(store)
To create the credential-less Gen2 datastore with the Azure CLI, create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen2.schema.json
name: adls_gen2_credless_example
type: azure_data_lake_gen2
description: Credential-less datastore pointing to an Azure Data Lake Storage Gen2 instance.
account_name: mytestdatalakegen2
filesystem: my-gen2-container
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
For the service principal version, create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen2.schema.json
name: adls_gen2_example
type: azure_data_lake_gen2
description: Datastore pointing to an Azure Data Lake Storage Gen2 instance.
account_name: mytestdatalakegen2
filesystem: my-gen2-container
credentials:
  tenant_id: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX
  client_id: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX
  client_secret: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
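To list every datastore in the workspace, including the ones you just created:

az ml datastore list --output table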
Create an Azure file share datastore
The following example uses the Python SDK to create an Azure file share datastore authenticated with an account key:

from azure.ai.ml.entities import AzureFileDatastore
from azure.ai.ml.entities import AccountKeyConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureFileDatastore(
    name="file_example",
    description="Datastore pointing to an Azure File Share.",
    account_name="mytestfilestore",
    file_share_name="my-share",
    credentials=AccountKeyConfiguration(
        account_key="XXXxxxXXXxXXXXxxXXXXXxXXXXXxXxxXxXXXxXXXxXXxxxXXxxXXXxXxXXXxxXxxXXXXxxxxxXXxxxxxxXXXxXXX"
    ),
)

ml_client.create_or_update(store)
To authenticate with a SAS token instead:

from azure.ai.ml.entities import AzureFileDatastore
from azure.ai.ml.entities import SasTokenConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureFileDatastore(
    name="file_sas_example",
    description="Datastore pointing to an Azure File Share using SAS token.",
    account_name="mytestfilestore",
    file_share_name="my-share",
    credentials=SasTokenConfiguration(
        sas_token="?xx=XXXX-XX-XX&xx=xxxx&xxx=xxx&xx=xxxxxxxxxxx&xx=XXXX-XX-XXXXX:XX:XXX&xx=XXXX-XX-XXXXX:XX:XXX&xxx=xxxxx&xxx=XXxXXXxxxxxXXXXXXXxXxxxXXXXXxxXXXXXxXXXXxXXXxXXxXX"
    ),
)

ml_client.create_or_update(store)
Create an Azure Data Lake Storage Gen1 datastore
The following example uses the Python SDK to create a credential-less Azure Data Lake Storage Gen1 datastore. Fill in your own values before you run it:

from azure.ai.ml.entities import AzureDataLakeGen1Datastore
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureDataLakeGen1Datastore(
    name="",  # the datastore name
    store_name="",  # the Data Lake Gen1 store name
    description="",  # a description of the datastore
)

ml_client.create_or_update(store)
To authenticate with a service principal:

from azure.ai.ml.entities import AzureDataLakeGen1Datastore
from azure.ai.ml.entities import ServicePrincipalConfiguration
from azure.ai.ml import MLClient

ml_client = MLClient.from_config()

store = AzureDataLakeGen1Datastore(
    name="adls_gen1_example",
    description="Datastore pointing to an Azure Data Lake Storage Gen1.",
    store_name="mytestdatalakegen1",
    credentials=ServicePrincipalConfiguration(
        tenant_id="XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX",
        client_id="XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX",
        client_secret="XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    ),
)

ml_client.create_or_update(store)
To create the credential-less Gen1 datastore with the Azure CLI, create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen1.schema.json
name: adls_gen1_credless_example
type: azure_data_lake_gen1
description: Credential-less datastore pointing to an Azure Data Lake Storage Gen1 instance.
store_name: mytestdatalakegen1
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
For the service principal version, create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen1.schema.json
name: adls_gen1_example
type: azure_data_lake_gen1
description: Datastore pointing to an Azure Data Lake Storage Gen1 instance.
store_name: mytestdatalakegen1
credentials:
  tenant_id: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX
  client_id: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX
  client_secret: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
Create a OneLake (Microsoft Fabric) datastore (preview)
This section describes various options to create a OneLake datastore. OneLake is part of Microsoft Fabric. At this time, Machine Learning supports connections to Microsoft Fabric lakehouse artifacts in the "Files" folder, including folders, files, and Amazon S3 shortcuts. For more information about lakehouses, see What is a lakehouse in Microsoft Fabric?.
OneLake datastore creation requires the following information from your Microsoft Fabric instance:
Endpoint
Workspace GUID
Artifact GUID
The following screenshots show how to retrieve this required information from your Microsoft Fabric instance.
You can then find the "Endpoint", "Workspace GUID", and "Artifact GUID" values in the "URL" and "ABFS path" fields on the "Properties" page:
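With those three values in hand, the Python SDK can create the OneLake datastore. The following is a sketch that uses the preview OneLakeDatastore and OneLakeArtifact entities; the endpoint, GUIDs, and datastore name shown are placeholders:

from azure.ai.ml import MLClient
from azure.ai.ml.entities import OneLakeDatastore, OneLakeArtifact

ml_client = MLClient.from_config()

store = OneLakeDatastore(
    name="onelake_example",
    description="Datastore pointing to a Microsoft Fabric lakehouse artifact.",
    one_lake_workspace_name="XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX",  # workspace GUID
    endpoint="onelake.dfs.fabric.microsoft.com",  # endpoint from the ABFS path
    artifact=OneLakeArtifact(
        name="XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX",  # artifact (lakehouse) GUID
        type="lake_house",
    ),
)

ml_client.create_or_update(store)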