Woody Chan

Reputation: 69

Imported Azure SQL Database always sizes at 250GB

When I import a new SQL DB from a bacpac (S3 Standard), the resulting database always ends up with a max size of 250 GB, just as if the 'DatabaseMaxSizeBytes' parameter were omitted. Here is the code:

$sdvobjDbImport = New-AzureRmSqlDatabaseImport `
                    -ResourceGroupName $sdvstrSqlsResourceGroupName `
                    -ServerName $sdvstrSqlsName `
                    -AdministratorLogin $sdvstrAdminLI `
                    -AdministratorLoginPassword $sdvsstrAdminPW `
                    -DatabaseName $sdvstrDatabaseName `
                    -Edition $sdvstrDbEditionAtImport `
                    -ServiceObjectiveName $sdvstrServiceObjectiveAtImport `
                    -DatabaseMaxSizeBytes 262144000 `
                    -StorageKey $sdvstrStorageKey `
                    -StorageKeyType 'StorageAccessKey' `
                    -StorageUri $sdvstrStorageUri `
                    -EA Stop

It should be 250 MB, not GB. I don't need such a monster, and scaling down afterwards (from 250 GB to 250 MB) causes very long operation times on the DB. Any idea what is wrong in my code? Google doesn't give an answer either.

Upvotes: 0

Views: 1111

Answers (2)

Alberto Morillo

Reputation: 15618

Az is replacing AzureRM, so this bug probably won't be fixed in AzureRM. The solution is to use New-AzSqlDatabaseImport instead of New-AzureRmSqlDatabaseImport.

Here is an example of how to use it.

$importRequest = New-AzSqlDatabaseImport `
   -ResourceGroupName "<your_resource_group>" `
   -ServerName "<your_server>" `
   -DatabaseName "<your_database>" `
   -DatabaseMaxSizeBytes "<database_size_in_bytes>" `
   -StorageKeyType "StorageAccessKey" `
   -StorageKey $(Get-AzStorageAccountKey -ResourceGroupName "<your_resource_group>" -StorageAccountName "<your_storage_account>").Value[0] `
   -StorageUri "https://myStorageAccount.blob.core.windows.net/importsample/sample.bacpac" `
   -Edition "Standard" `
   -ServiceObjectiveName "S3" `
   -AdministratorLogin "<your_server_admin_account_user_id>" `
   -AdministratorLoginPassword $(ConvertTo-SecureString -String "<your_server_admin_account_password>" -AsPlainText -Force)

As you can see, Microsoft is favoring the use of New-AzSqlDatabaseImport.
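Since the import runs asynchronously, a common follow-up is to poll until it finishes before doing anything else with the database. A minimal sketch, assuming the $importRequest object returned above:

```powershell
# Poll the import operation status until it leaves "InProgress".
# Get-AzSqlDatabaseImportExportStatus takes the operation status link
# returned by New-AzSqlDatabaseImport.
$status = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
while ($status.Status -eq "InProgress") {
    Start-Sleep -Seconds 10
    $status = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
}
$status
```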

Upvotes: 0

Leon Yue

Reputation: 16401

I tested this and have the same problem: the -DatabaseMaxSizeBytes parameter doesn't work. No matter which value we use, the database is always created with a max storage of 250 GB (DTU Standard S2).

The Azure documentation uses -DatabaseMaxSizeBytes 5000000; I tested that and it doesn't work either.

Solution:

After the import completes, we must set the database size manually. Here's a sample command:

Set-AzSqlDatabase -DatabaseName "TestDB" -ServerName "sqlservername" -ResourceGroupName "resourcegroup" -MaxSizeBytes "104857600"

Note:

The value of -MaxSizeBytes must correspond to one of: 100 MB, 500 MB, 1 GB, 2 GB, 5 GB, 10 GB, 20 GB, 30 GB, 40 GB, 50 GB, 100 GB, 150 GB, 200 GB, 250 GB.
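As a side note, PowerShell's built-in numeric suffixes expand to the exact byte counts these sizes require, which avoids hand-computed values like the 262144000 in the question:

```powershell
# PowerShell numeric suffixes are binary multipliers (MB = 1048576, GB = 1073741824)
100MB    # 104857600  -> valid for -MaxSizeBytes
250MB    # 262144000  -> the size the question is after
1GB      # 1073741824
250GB    # 268435456000

Set-AzSqlDatabase -DatabaseName "TestDB" -ServerName "sqlservername" `
    -ResourceGroupName "resourcegroup" -MaxSizeBytes 250MB
```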

You can also look up these values in the Portal:

(screenshot: the database max size options in the Azure Portal)

I used the new Azure PowerShell Az module for this.

Hope this helps.

Upvotes: 1
