Restoring an Azure SQL database with New-AzureRmSqlDatabaseImport results in error "40619: The edition 'Standard' does not support the database data max size '52428800'"

Urs Enzler 41 Reputation points
2020-10-15T12:58:02.777+00:00

In an Azure DevOps pipeline, we restore a database (it is a small one) with the following PowerShell Script:

$importRequest = New-AzureRmSqlDatabaseImport `
  -ResourceGroupName "Testing" `
  -ServerName "$(sqltestservername)" `
  -DatabaseName "$(sqltestdatabase)" `
  -DatabaseMaxSizeBytes 52428800 `
  -StorageKeyType "StorageAccessKey" `
  -StorageKey $(Get-AzureRmStorageAccountKey -ResourceGroupName "XYZ" -StorageAccountName "XYZ").Value[0] `
  -StorageUri "https://XYZ.bacpac" `
  -Edition Standard `
  -ServiceObjectiveName S0 `
  -AdministratorLogin "XYZ" `
  -AdministratorLoginPassword $(ConvertTo-SecureString -String "XYZ" -AsPlainText -Force)

This results in the following error:
[error]40619: The edition 'Standard' does not support the database data max size '52428800'.

As we understand it:

  • the number is in bytes, and 52,428,800 bytes is exactly 50 MB (50 × 1024 × 1024), which is far below the 250 GB maximum database size of the Standard tier.

The script ran successfully for many months, until around 14 October 2020 the error started appearing. Neither the database nor the script changed between the successful and failing runs.

So we assume that something changed on the service side?

Azure SQL Database

Accepted answer
  1. Simon Redman 76 Reputation points Microsoft Employee
    2020-10-16T20:42:38.597+00:00

    Hi @Urs Enzler ,

    50 MB is not a supported value for a Standard database. Azure SQL databases cannot be created at an arbitrary size within the supported range; the data max size must be one of a specific set of values. The available values for DTU-based databases are listed here: https://learn.microsoft.com/en-us/sql/t-sql/statements/create-database-transact-sql?view=azuresqldb-current&tabs=sqlpool#arguments-1

    If 50MB was working before, that was some kind of service anomaly. Rejecting 50MB is the correct behavior.

    However, I was testing earlier today and I noticed that 250MB is also not working even though it's supposed to be supported. I will check whether this is a documentation issue (and it is really not supposed to be supported) or whether it's a service code issue.
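    For anyone landing here, a minimal sketch of the import call using a documented supported size (100 MB = 104,857,600 bytes) is below. Note that the AzureRm module is deprecated; the Az-module equivalents are `New-AzSqlDatabaseImport` and `Get-AzStorageAccountKey`. All resource names, credentials, and URIs here are placeholders, not values from this thread:

    ```powershell
    # Sketch only: placeholder names and keys; assumes the Az module (AzureRm is deprecated).
    # 104857600 bytes = 100 MB, one of the documented max-size values for the Standard edition.
    $storageKey = (Get-AzStorageAccountKey `
      -ResourceGroupName "MyStorageRg" `
      -StorageAccountName "mystorageaccount")[0].Value

    $importRequest = New-AzSqlDatabaseImport `
      -ResourceGroupName "Testing" `
      -ServerName "myserver" `
      -DatabaseName "mydatabase" `
      -DatabaseMaxSizeBytes 104857600 `
      -StorageKeyType "StorageAccessKey" `
      -StorageKey $storageKey `
      -StorageUri "https://mystorageaccount.blob.core.windows.net/bacpacs/mydb.bacpac" `
      -Edition "Standard" `
      -ServiceObjectiveName "S0" `
      -AdministratorLogin "myadmin" `
      -AdministratorLoginPassword (ConvertTo-SecureString -String "..." -AsPlainText -Force)

    # The import runs asynchronously; poll its status via the returned operation link:
    Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    ```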


1 additional answer

Sort by: Most helpful
  1. Jakub Procházka 6 Reputation points
    2020-12-15T20:29:00.797+00:00

    I just ran into the same problem, and found another doc page whose example script uses an unsupported value: https://learn.microsoft.com/en-us/azure/azure-sql/database/scripts/import-from-bacpac-powershell

    1 person found this answer helpful.
