Export to Azure SQL Database

Important

Support for Machine Learning Studio (classic) will end on 31 August 2024. We recommend you transition to Azure Machine Learning by that date.

Beginning 1 December 2021, you will not be able to create new Machine Learning Studio (classic) resources. Through 31 August 2024, you can continue to use the existing Machine Learning Studio (classic) resources.

ML Studio (classic) documentation is being retired and may not be updated in the future.

This article describes how to use the Export to Azure SQL Database option in the Export Data module in Machine Learning Studio (classic). This option is useful when you want to export data from your machine learning experiment to an Azure SQL Database or Azure SQL Data Warehouse.

Note

Applies to: Machine Learning Studio (classic) only

Similar drag-and-drop modules are available in Azure Machine Learning designer.

Exporting to a SQL database is useful in many machine learning scenarios: for example, you might want to store intermediate results, save scores, or persist tables of engineered features. Although storing data in an Azure SQL Database or Azure SQL Data Warehouse can be more expensive than using tables or blobs in Azure, there are no transaction fees against SQL databases. Moreover, database storage is ideal for quickly writing smaller amounts of frequently used information, for sharing data between experiments, or for reporting results, predictions, and metrics.

On the other hand, there might be limits on the amount of data that you can store in a database, depending on your subscription type. You should also consider using a database and account that are in the same region as your machine learning workspace.

To export data, you provide the server name and database name where the data will be stored, and run the module using an account that has write permissions. You must also specify the table name and map the columns from your experiment to columns in the table.
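
If you want to confirm connectivity and write permissions before running the experiment, you can check from a separate client. The following is a minimal sketch, assuming a pyodbc client; the server, database, user, and table names are placeholders and are not part of the Export Data module itself.

```python
# Hypothetical pre-check (outside Studio (classic)): confirm the account can reach the
# database and list its permissions on the destination table. All names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<generated_identifier>.database.windows.net;"
    "DATABASE=mydatabase;"
    "UID=exportuser;PWD=<password>"
)
cursor = conn.cursor()

# List the permissions the current login holds on the destination table;
# INSERT must be present for Export Data to write rows into it.
cursor.execute(
    "SELECT permission_name FROM sys.fn_my_permissions('dbo.ScoredResults', 'OBJECT')"
)
print([row.permission_name for row in cursor.fetchall()])
conn.close()
```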

How to export data to an Azure SQL Database

  1. Add the Export Data module to your experiment in Studio (classic). You can find this module in the Data Input and Output category.

  2. Connect Export Data to the module that produces the data that you want to export.

  3. For Data destination, select Azure SQL Database. This option supports Azure SQL Data Warehouse as well.

  4. Indicate the name of the server and database in Azure SQL Database or Azure SQL Data Warehouse.

    Database server name: Type the server name as generated by Azure. Typically it has the form <generated_identifier>.database.windows.net.

    Database name: Type the name of an existing database on the server you just specified. The Export Data module cannot create a database.

    Server user account name: Type the user name for an account that has access permissions for the database.

    Server user account password: Provide the password for the specified user account.

  5. Specify the columns to export and, optionally, new names for those columns.

    Comma-separated list of columns to be saved: Type the names of the columns from the experiment that you want to write to the database.

    Data table name: Type the name of the table to store the data in.

    For Azure SQL Database, if the table does not exist, a new table is created.

    For Azure SQL Data Warehouse, the table must already exist and have the correct schema, so be sure to create it in advance (a table-creation sketch follows these steps).

    Comma-separated list of datatable columns: Type the names of the columns as you wish them to appear in the destination table.

    For Azure SQL Database, you can change the column names, but you must keep the columns in the same order as you listed them in Comma-separated list of columns to be saved.

    For Azure SQL Data Warehouse, the column names must match those already in the destination table schema.

  6. Number of rows written per SQL Azure operation: This option specifies how many rows should be written to the destination table in each batch.

    By default, the value is set to 50, which is the default batch size for Azure SQL Database. However, you should increase this value if you have a large number of rows to write.

    For Azure SQL Data Warehouse, we recommend that you set this value to 1. If you use a larger batch size, the size of the command string that is sent to Azure SQL Data Warehouse can exceed the allowed string length, causing an error.

  7. Use cached results: Select this option to avoid writing new results each time the experiment is run. If there are no other changes to module parameters, the experiment writes the data only the first time the module is run. However, a new write is always performed if any parameters have been changed in Export Data that would change the results.

  8. Run the experiment.
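
For the Azure SQL Data Warehouse case in step 5, the destination table must exist before the module runs. The following is a minimal sketch for pre-creating it, assuming pyodbc; the connection details, table name, and column definitions are placeholders and must be adjusted to match the columns and order you enter in Comma-separated list of datatable columns.

```python
# Hypothetical sketch: pre-create the destination table for Azure SQL Data Warehouse.
# Export Data will not create or alter the table for SQL Data Warehouse, so its schema
# must already match the exported columns. All names below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<generated_identifier>.database.windows.net;"
    "DATABASE=mydatawarehouse;"
    "UID=exportuser;PWD=<password>",
    autocommit=True,
)
cursor = conn.cursor()

# Column names, order, and types must match what Export Data will write.
cursor.execute("""
    CREATE TABLE dbo.ScoredResults (
        CustomerId INT,
        ScoredLabel NVARCHAR(50),
        ScoredProbability FLOAT
    )
""")
conn.close()
```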

Examples

For examples of how to use the Export Data module, see the Azure AI Gallery.

Technical notes

This section contains implementation details, tips, and answers to frequently asked questions.

Common questions

Using a database in a different geographical region

If the Azure SQL Database or SQL Data Warehouse is in a different region from the machine learning account, writes might be slower.

Also, you are charged for data ingress and egress on the subscription if the compute node is in a different region than the storage account.

Why are some characters in the output data not displayed correctly?

Machine Learning supports the UTF-8 encoding. If string columns in your database use a different encoding, the characters might not be saved correctly.

Also, Machine Learning cannot output data types such as money.
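
If you run into either issue, inspecting the destination table's column types and collation can help diagnose it. Below is a minimal sketch, assuming pyodbc; the connection details and table name are placeholders.

```python
# Hypothetical sketch: list the destination table's column types and collation to spot
# columns whose encoding differs from UTF-8 or that use unsupported types such as money.
# Connection details and the table name are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<generated_identifier>.database.windows.net;"
    "DATABASE=mydatabase;"
    "UID=exportuser;PWD=<password>"
)
cursor = conn.cursor()
cursor.execute("""
    SELECT COLUMN_NAME, DATA_TYPE, COLLATION_NAME
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'ScoredResults'
    ORDER BY ORDINAL_POSITION
""")
for column_name, data_type, collation in cursor.fetchall():
    print(column_name, data_type, collation)
conn.close()
```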

Module parameters

| Name | Range | Type | Default | Description |
| --- | --- | --- | --- | --- |
| Data source | List | Data Source Or Sink | Azure Blob Storage | The data source can be HTTP, FTP, anonymous HTTPS or FTPS, a file in Azure Blob storage, an Azure table, an Azure SQL Database or Azure SQL Data Warehouse, a Hive table, or an OData endpoint. |
| Database server name | any | String | none | |
| Database name | any | String | none | |
| Server user account name | any | String | none | |
| Server user account password | any | SecureString | none | |
| Comma separated list of columns to be saved | any | String | none | |
| Data table name | any | String | none | |
| Comma separated list of datatable columns | any | String | none | |
| Number of rows written per SQL Azure operation | any | Integer | 50 | |
| Use cached results | TRUE/FALSE | Boolean | FALSE | The module executes only if a valid cache does not exist; otherwise, cached data from a prior run is used. |

Exceptions

| Exception | Description |
| --- | --- |
| Error 0027 | An exception occurs when two objects have to be the same size, but they are not. |
| Error 0003 | An exception occurs if one or more inputs are null or empty. |
| Error 0029 | An exception occurs when an invalid URI is passed. |
| Error 0030 | An exception occurs when it is not possible to download a file. |
| Error 0002 | An exception occurs if one or more parameters could not be parsed or converted from the specified type to the type required by the target method. |
| Error 0009 | An exception occurs if the Azure storage account name or the container name is specified incorrectly. |
| Error 0048 | An exception occurs when it is not possible to open a file. |
| Error 0015 | An exception occurs if the database connection has failed. |
| Error 0046 | An exception occurs when it is not possible to create a directory on the specified path. |
| Error 0049 | An exception occurs when it is not possible to parse a file. |

For a list of errors specific to Studio (classic) modules, see Machine Learning Error codes.

For a list of API exceptions, see Machine Learning REST API Error Codes.

See also

Import Data
Export Data
Export to Azure Blob Storage
Export to Hive Query
Export to Azure Table