Hello @Benny Lau ,Shui Hong - Group Office,
Thanks for the question and welcome to Microsoft Q&A.
As I understand it, the ask here is where the model is saved and how you can save it to Blob Storage.
As per the document here: https://learn.microsoft.com/en-us/azure/databricks/mlflow/models#api-commands
You have three options, and I assume that your model file is currently stored on DBFS on the Azure Databricks cluster.
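If the model was logged with MLflow (the scenario the linked document covers), a minimal sketch of saving it to DBFS looks like the following. The sklearn flavor and the /dbfs/tmp/my-model path are assumptions for illustration only:

import mlflow.sklearn
from sklearn.linear_model import LogisticRegression

# Toy model standing in for your real one
model = LogisticRegression().fit([[0], [1]], [0, 1])

# /dbfs/... is the local-file-API view of DBFS, so this writes to
# dbfs:/tmp/my-model (the target directory must not already exist)
mlflow.sklearn.save_model(model, "/dbfs/tmp/my-model")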
Databricks can save a machine learning model to an Azure Storage container using the `dbutils.fs` module. This module provides a set of functions for interacting with the Databricks File System (DBFS) and Azure Blob Storage. Here is an example of how to save a model to an Azure Storage container:
- First, you will need to mount the Azure Storage container to DBFS. This can be done using the `dbutils.fs.mount` function:
# The OAuth settings below belong to the ABFS driver, so the source must use
# abfss:// and the .dfs.core.windows.net endpoint (the wasbs:// blob endpoint
# does not accept these configs)
dbutils.fs.mount(
    source="abfss://<your-container-name>@<your-storage-account-name>.dfs.core.windows.net/",
    mount_point="/mnt/<your-mount-point>",
    extra_configs={
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<your-client-id>",
        "fs.azure.account.oauth2.client.secret": "<your-client-secret>",
        "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<your-tenant-id>/oauth2/token"
    }
)
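Two side notes on the mount: rather than hardcoding the client secret, you can pull it from a Databricks secret scope with `dbutils.secrets.get` (the scope and key names below are placeholders), and you can confirm the mount took effect before writing to it:

# Fetch the service principal secret from a secret scope instead of inlining it
client_secret = dbutils.secrets.get(scope="<your-scope>", key="<your-secret-key>")

# Confirm the new mount point shows up among the active mounts
display(dbutils.fs.mounts())

# List the contents of the mounted container (empty for a fresh container)
display(dbutils.fs.ls("/mnt/<your-mount-point>"))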
- Once the container is mounted, you can use the `dbutils.fs.cp` function to copy the model from its current DBFS location to the mount point. Saved models are usually directories rather than single files, so pass `recurse=True`:

dbutils.fs.cp("dbfs:/path/to/your/model", "/mnt/<your-mount-point>/model", recurse=True)
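If you want to verify the copy round-trips cleanly, you can load the model back from the mounted path. This sketch assumes the model is in MLflow's sklearn flavor; note the /dbfs prefix, which addresses the mount through the local file API:

import mlflow.sklearn

# Load the copied model back through the FUSE view of the mount
restored = mlflow.sklearn.load_model("/dbfs/mnt/<your-mount-point>/model")
print(type(restored))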
- You can also use the `model.save()` method to save the model directly to the mounted container path. Methods like this (Keras's, for example) write through the local file system, so the mounted path must be addressed with the `/dbfs` prefix:

model.save("/dbfs/mnt/<your-mount-point>/model")
Note: Be sure to replace the placeholders in the above code with the appropriate values for your use case.
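Finally, if the mount was only needed for a one-off export, you can detach it afterwards:

# Unmount the container once you are done writing to it
dbutils.fs.unmount("/mnt/<your-mount-point>")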