Hi folks!
The official documentation is quite poor on this topic.
Here is the relevant section, "Onboarding into serverless compute tier":
https://learn.microsoft.com/en-us/azure/azure-sql/database/serverless-tier-overview#onboarding-into-serverless-compute-tier
and it contains just one sentence on the subject:
A serverless database can be moved into a provisioned compute tier in the same way as moving a provisioned compute database into a serverless compute tier.
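For a regular Azure SQL Database I assume that means something along the lines of the following (the resource names below are just placeholders):
# placeholder names; flags as documented for 'az sql db update'
az sql db update -g SomeResourceGroupName -s SomeServerName -n SomeDBName --edition GeneralPurpose --compute-model Provisioned --family Gen5 --capacity 4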
OK. Great. Now when you try to run a similar command (az sql db update ...), you get an error:
Azure SQL Data Warehouse can be updated with the command az sql dw update.
OK. Let's try the same command using az sql dw update instead of az sql db update.
In my case it's:
az sql dw update -g SomeResourceGroupName -s SomeServerName -n SomeDBName --edition GeneralPurpose --min-capacity 1 --capacity 4 --family Gen5 --compute-model Provisioned
And..... here you get another error:
az: error: unrecognized arguments: --edition GeneralPurpose --min-capacity 1 --capacity 4 --family Gen5 --compute-model Provisioned
OK. Nice. Let's check the reference documentation:
Azure CLI Commands - az sql dw update
https://learn.microsoft.com/en-us/cli/azure/sql/dw?view=azure-cli-latest#az_sql_dw_update
And..... voilà! There are no such arguments at all!
This way you can only set --service-objective (e.g. DW100, DW1000c), which is not what I need; besides, that kind of scaling up/down can be done with one click in the Azure portal settings anyway.
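For completeness, the only kind of change that command does seem to accept is something like this (placeholder names again; it only changes the DWU service objective):
# only the DWU service objective can be changed this way
az sql dw update -g SomeResourceGroupName -s SomeServerName -n SomeDBName --service-objective DW200c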
Please add a remark to the above-mentioned paragraph that DW/SQL Pool doesn't support such a migration.
Now let's get back to the main topic. If there is no automatic/fast way to move from the serverless to the provisioned tier, then what other options are available?
I can manually create a provisioned DB and copy all the data there. But here more bad news awaits.
AFAIK Azure DWH doesn't support direct cross-DB queries. The only workaround (according to stackoverflow.com) is to set up the Elastic Query feature (just to run it once) OR to create an external table via "a few" steps, which then runs into:
There is already an object named 'SomeTableName' in the database.
And that's another migration headache: it requires intermediate tables and renaming. I didn't find an option to assign a separate schema to external tables in order to keep the same table names. But anyway, is this really the easiest/fastest way?
It looks like duct tape. I'm disappointed :(
Best regards,
Vadim.
Hi,
Are you familiar with Data Factory? You could create a for-each pipeline to copy data from the source to the destination and create the destination tables.
Hello @Vadim K and welcome to Microsoft Q&A. I'm sorry you are having such a frustrating experience.
The link you shared, https://learn.microsoft.com/en-us/azure/azure-sql/database/serverless-tier-overview#onboarding-into-serverless-compute-tier,
refers to Azure SQL Database, which is quite different from Azure Synapse. Both Synapse and Azure SQL Database have dedicated and serverless modes.
While migrating the tier of an Azure SQL Database between serverless and dedicated makes sense, it works very differently for Synapse.
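Since your error message mentions Azure SQL Data Warehouse, one quick way to check which kind of database you actually have is something like this (placeholder names; as far as I know a dedicated SQL pool / former SQL DW reports a DataWarehouse SKU here, while a serverless Azure SQL Database shows a name like GP_S_Gen5):
# placeholder names; inspect the database's current SKU
az sql db show -g SomeResourceGroupName -s SomeServerName -n SomeDBName --query currentSku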
In Synapse you would provision a dedicated SQL pool, then either ingest the data or create external tables over the files you were using in serverless.
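If you go that route, provisioning the dedicated SQL pool itself can also be scripted, roughly like this (just a sketch: the workspace/pool names and the DW100c size are placeholders, and it assumes a Synapse workspace rather than a standalone server):
# placeholder names; create a dedicated SQL pool in an existing Synapse workspace
az synapse sql pool create --resource-group SomeResourceGroupName --workspace-name SomeWorkspaceName --name SomeDedicatedPool --performance-level DW100c
The actual ingest (for example COPY INTO or CREATE TABLE AS SELECT over the external files) then happens in T-SQL inside that pool.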