Golden configuration promotion
Database movement operations are a suite of self-service actions that can be used as part of data application lifecycle management (DataALM). "Golden configuration" refers to a common practice among customers and partners in the Microsoft Dynamics ecosystem, where a developer environment is used as a configuration store. In this way, implementation projects can store finalized global and company-specific settings in a database that can later become a baseline for Conference Room Pilots, mock go-lives, and go-lives. This tutorial shows how to prepare a golden configuration database and hydrate a target user acceptance testing (UAT) environment.
In this tutorial, you will learn how to:
- Prepare the golden configuration database for Microsoft Azure SQL Database.
- Run the import to the target UAT environment.
- Copy the UAT environment into a production environment.
An example of this scenario is a customer who hasn't yet gone live but is preparing for a Conference Room Pilot, mock go-live, or go-live. This scenario supports promotion of a baseline golden database from a developer environment to a UAT environment, and eventually to production.
To complete this tutorial, you must have a developer environment that is deployed with a database that is curated as a golden configuration database. You must also have at least one standard UAT environment deployed and, optionally, a production environment.
The developer environment must run the same application version as the target UAT environment. In addition, the platform version of the developer environment must be earlier than or the same as the platform version in the target UAT environment.
Before you begin
Supported SQL Server collation
The only supported database collation in the cloud is SQL_Latin1_General_CP1_CI_AS. Make sure that your Microsoft SQL Server collation and the database collations in your development environments are set to this value. Also make sure that any configuration databases that are published to a sandbox use this collation.
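You can verify both collations with a quick query before you start. A minimal sketch, assuming the database in your development environment is named AxDB (adjust the name if yours differs):

```sql
-- Check the server-level collation.
-- Should return SQL_Latin1_General_CP1_CI_AS.
SELECT SERVERPROPERTY('Collation') AS ServerCollation;

-- Check the database-level collation for AxDB.
-- Should also return SQL_Latin1_General_CP1_CI_AS.
SELECT DATABASEPROPERTYEX('AxDB', 'Collation') AS DatabaseCollation;
```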
Document the values of encrypted fields
Encrypted and environment-specific values can't be imported into a new environment. After you've completed the import, you must reenter some data from your source environment in your target environment.
Because of a technical limitation that is related to the certificate that is used for data encryption, values that are stored in encrypted fields in a database will be unreadable after that database is imported into a new environment. Therefore, after an import, you must manually delete and reenter values that are stored in encrypted fields. New values that are entered in encrypted fields after an import will be readable. The following fields are affected. The field names are given in Table.Field format.
|Field name|Where to set the value|
|---|---|
|CreditCardAccountSetup.SecureMerchantProperties|Select Accounts receivable > Payments setup > Payment services.|
|ExchangeRateProviderConfigurationDetails.Value|Select General ledger > Currencies > Configure exchange rate providers.|
|FiscalEstablishment_BR.ConsumerEFDocCsc|Select Organization administration > Fiscal establishments > Fiscal establishments.|
|FiscalEstablishmentStaging.CSC|This field is used by the Data Import/Export Framework (DIXF).|
|HcmPersonIdentificationNumber.PersonIdentificationNumber|Select Human resources > Workers > Workers. On the Worker tab, in the Personal information group, select Identification numbers.|
|HcmWorkerActionHire.PersonIdentificationNumber|This field has been obsolete since Microsoft Dynamics AX 7.0 (February 2016). It previously appeared on the All worker actions page (Human resources > Workers > Actions > All worker actions).|
|SysEmailSMTPPassword.Password|Select System administration > Email > Email parameters.|
|SysOAuthUserTokens.EncryptedAccessToken|This field is used internally by Application Object Server (AOS). It can be ignored.|
|SysOAuthUserTokens.EncryptedRefreshToken|This field is used internally by AOS. It can be ignored.|
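Before you export, you can check which of these fields actually hold values in your source database, so you know exactly what to document and re-enter. A minimal sketch that covers two of the fields from the table above; the table and column names are taken from that list, and you can extend the UNION with the other fields that exist in your version:

```sql
-- Count rows that hold encrypted values you'll need to re-enter after import.
SELECT 'CreditCardAccountSetup' AS TableName, COUNT(*) AS RowsWithValues
FROM dbo.CREDITCARDACCOUNTSETUP
WHERE SECUREMERCHANTPROPERTIES IS NOT NULL
UNION ALL
SELECT 'SysEmailSMTPPassword', COUNT(*)
FROM dbo.SYSEMAILSMTPPASSWORD
WHERE PASSWORD IS NOT NULL;
```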
If you're running Commerce components, document encrypted and environment-specific values
The values on the following pages are either environment-specific or encrypted in the database. Therefore, all the imported values will be incorrect.
- Payment services (Accounts receivable > Payments setup > Payment services)
- Hardware profiles (Retail and commerce > Channel setup > POS setup > POS profiles > Hardware profiles)
Create a copy of the source database
Back up the source database by using SQL Server Management Studio (SSMS). Right-click the source database, and then select Tasks > Back Up. After the backup is completed, right-click the Databases folder in SSMS Object Explorer, and select Restore Database. Choose the backup that you just made, but give the target database a new name, such as AxDB_CopyForExport.
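If you prefer to script the copy instead of using the SSMS dialog boxes, the same backup and restore can be done in T-SQL. A sketch, assuming the source database is named AxDB, the paths shown exist on the server, and the logical file names are AxDB_Data and AxDB_Log (these names are assumptions; confirm them first with RESTORE FILELISTONLY):

```sql
-- Back up the source database to a file without breaking the backup chain.
BACKUP DATABASE [AxDB]
TO DISK = N'D:\Backup\AxDB_Golden.bak'
WITH COPY_ONLY, INIT;

-- Restore it under a new name, relocating the data and log files.
RESTORE DATABASE [AxDB_CopyForExport]
FROM DISK = N'D:\Backup\AxDB_Golden.bak'
WITH MOVE N'AxDB_Data' TO N'D:\Data\AxDB_CopyForExport.mdf',
     MOVE N'AxDB_Log'  TO N'D:\Data\AxDB_CopyForExport.ldf';
```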
Prepare the database
Run the following script against the AxDB_CopyForExport database that you created in the previous section. This script makes the following changes:
- Set the SysGlobalConfiguration flag to inform the application that the database is Azure-based.
- Remove a reference to tempDB in the XU_DisableEnableNonClusteredIndexes procedure. References to tempDB aren't allowed in an Azure SQL database. The database synchronization process will re-create the reference later.
- Drop users, because Microsoft Windows users are forbidden in Azure SQL databases. Other users must be re-created later, so that they're correctly linked to the appropriate sign-in on the target server.
- Clear encrypted hardware profile merchant properties.
A successful export and import of the database requires all these changes.
```sql
UPDATE sysglobalconfiguration
SET value = 'SQLAZURE'
WHERE name = 'BACKENDDB'

UPDATE sysglobalconfiguration
SET value = 1
WHERE name = 'TEMPTABLEINAXDB'

DROP PROCEDURE IF EXISTS XU_DisableEnableNonClusteredIndexes
DROP PROCEDURE IF EXISTS SP_ConfigureTablesForChangeTracking
DROP PROCEDURE IF EXISTS SP_ConfigureTablesForChangeTracking_V2

DROP SCHEMA [NT AUTHORITY\NETWORK SERVICE]
DROP USER [NT AUTHORITY\NETWORK SERVICE]
DROP USER axdbadmin
DROP USER axdeployuser
DROP USER axmrruntimeuser
DROP USER axretaildatasyncuser
DROP USER axretailruntimeuser
DROP USER axdeployextuser

--Tidy up the batch server config from the previous environment
DELETE FROM SYSSERVERCONFIG

--Tidy up server sessions from the previous environment
DELETE FROM SYSSERVERSESSIONS

--Tidy up printers from the previous environment
DELETE FROM SYSCORPNETPRINTERS

--Tidy up client sessions from the previous environment
DELETE FROM SYSCLIENTSESSIONS

--Tidy up batch sessions from the previous environment
DELETE FROM BATCHSERVERCONFIG

--Tidy up batch server to batch group relation table
DELETE FROM BATCHSERVERGROUP

--Clear encrypted hardware profile merchant properties
UPDATE dbo.RETAILHARDWAREPROFILE
SET SECUREMERCHANTPROPERTIES = NULL
WHERE SECUREMERCHANTPROPERTIES IS NOT NULL
```
Export the database from SQL Server
Open a Command Prompt window, and run the following commands.
The 140 folder reflects the current version. You must use the version that is available in your sandbox environment. Therefore, you might have to install the latest version of Microsoft SQL Server Management Studio in your development environment.
```
cd C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\
SqlPackage.exe /a:export /ssn:localhost /sdn:<database to export> /tf:D:\Exportedbacpac\my.bacpac /p:CommandTimeout=1200 /p:VerifyFullTextDocumentTypesSupported=false
```
Here is an explanation of the parameters:
- ssn (source server name) – The name of the SQL Server to export from. For the purposes of this article, the name should always be localhost.
- sdn (source database name) – The name of the database to export.
- tf (target file) – The path and name of the file to export to. The folder should already exist, but the export process will create the file.
Import the database
Upload the .bacpac file that was created in the previous step to the Database backup section in your LCS project's Asset Library. Then begin the import. The target UAT environment's databases will be overwritten by the golden configuration database.
Certain elements aren't copied as part of the database import step. In the golden configuration scenario, this affects settings such as email addresses and print management setup. Ideally, these settings should be populated as part of the master data migration in the steps that follow, and they shouldn't be part of the golden configuration database.
To import a database that was prepared in a developer environment into a standard user acceptance testing (UAT) environment, or to import a database that was previously exported from a UAT environment, follow these steps:
- Go to your target sandbox Environment Details page, and select the Maintain > Move database menu option.
- Select Import database and choose your source database backup (.bacpac format) file from the Asset Library.
- Note the warnings. Review the list of data elements that are cleaned up from the backup file.
- The import operation will begin immediately.
All users except the Admin user and other internal service user accounts will be unavailable after import. Therefore, the Admin user can delete or obfuscate data before other users are allowed back into the system.
To import a database to a developer environment after you've downloaded a database backup (.bacpac) file, you can begin the manual import operation on your Tier 1 environment. When you import the database, we recommend that you follow these guidelines:
- Keep a copy of the existing AxDB database, so that you can revert to it later if needed.
- Import the new database under a new name, such as AxDB_fromProd.
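The two guidelines above can be scripted in T-SQL after the import completes: keep the existing AxDB under a new name, and promote the imported database to the name that AOS expects. A sketch, assuming the imported database is named AxDB_fromProd and no sessions are connected to either database:

```sql
-- Keep the existing database so that you can revert to it later if needed.
ALTER DATABASE [AxDB] MODIFY NAME = [AxDB_original];

-- Promote the newly imported database to the name that AOS expects.
ALTER DATABASE [AxDB_fromProd] MODIFY NAME = [AxDB];
```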
To ensure the best performance, copy the *.bacpac file to the local computer where you'll run the import. Download sqlpackage .NET Core for Windows from the Get sqlpackage .NET Core for Windows page. Open a Command Prompt window, and run the following command from the sqlpackage .NET Core folder.
```
SqlPackage.exe /a:import /sf:D:\Exportedbacpac\my.bacpac /tsn:localhost /tdn:<target database name> /p:CommandTimeout=1200
```
Here is an explanation of the parameters:
- tsn (target server name) – The name of the Microsoft SQL Server instance to import into.
- tdn (target database name) – The name of the database to import into. The database should not already exist.
- sf (source file) – The path and name of the file to import from.
During import, the user name and password aren't required. By default, SQL Server uses Microsoft Windows authentication for the user who is currently signed in.
For information about how to complete the manual import operations into a Tier 1 environment, see Import the database.
Perform master data migration
Now that the UAT environment is hydrated with the golden configuration, you can begin to migrate master data. You can do this data migration by using data entities. We recommend that you complete your data migration activities before you copy the UAT environment to production, because you will have access to the database in the UAT environment for troubleshooting.
Files stored in Azure blob storage are not copied from UAT to Production in the next step. This includes document attachments and custom Microsoft Office templates. If your go-live requires attachments or custom templates, you will want to import those in the Production environment directly.
Copy the sandbox database to production
When you're ready to do a mock go-live or actual go-live, you can copy the UAT environment to production. This process is often referred to as cutover. We recommend that you do a cutover more than one time before your actual go-live. In this way, you can get detailed time estimates for each step of the process.
Determine the Environment type of your production environment and follow the relevant steps accordingly.
For self-service production environments, follow these steps:
- In LCS, open the Full details view for the production environment to load the Environment page.
- On the Maintain menu, select Move database.
- For the operation options, select Refresh database.
- For the source environment, select the sandbox where your golden configuration is. Note the important instructions on the Refresh database page for this operation.
- Select the check box to confirm that you understand that this operation will overwrite the production database. The operation starts immediately after you submit the request.
For Microsoft-managed production environments, submit a service request instead. In LCS, on the project home page, select Service requests.
On the Service requests page, select Add, and then select Sandbox to Production.
In the Sandbox to Production dialog box, follow these steps:
- In the Source environment name field, select the sandbox environment to copy the database from.
- Set the Preferred downtime start date and Preferred downtime end date fields. The end date must be at least four hours after the start date. To help ensure that resources are available to run the request, we recommend that you submit your request at least 24 hours before your preferred downtime window.
- Select the check boxes at the bottom to agree to the terms.
Every database refresh creates a new database and resets the chain of point-in-time restore points.
Reconfigure environment-specific settings
After the refresh is completed, use the Sign off button in LCS to close out the operation. You can then start to configure the environment-specific settings.
First, sign in to the environment by using the admin account that can be found on the Environment details page in LCS. Here are some typical areas of reconfiguration. You might require additional reconfiguration, based on your setup and the independent software vendor (ISV) solutions that are installed:
- System administration > Setup > Batch groups: Add the various AOS instances to the batch server groups that you require.
- System administration > Setup > Entity Store: Update the various entities that you require for Microsoft Power BI reporting.
- System administration > Setup > System parameters: Reconnect the environment to the LCS Help configuration for task guides.
- System administration > Setup > Email > Email parameters: Enter the Simple Mail Transfer Protocol (SMTP) settings if you use email in your UAT environment.
- System administration > Setup > Integration configuration > Azure storage account connection string: Enter the storage account string.
- System administration > Setup > System parameters: On the Document connections tab, enter the Azure key and application secret.
- System administration > Inquiries > Batch jobs: Select the jobs that you want to run in your UAT environment, and update the status to Waiting.
As a best practice, all mission-critical batch jobs that run on a recurrence should be created and run by the admin account. The admin account should be a generic user, such as email@example.com. It should not be linked to a specific employee's Azure Active Directory (Azure AD) account, because that account might be disabled if the employee later leaves the company.
Open the environment to users
When the system is configured as you require, you can enable selected users to access the environment. By default, all users except the admin and Microsoft service accounts are disabled.
Go to System administration > Users > Users, and enable the users that should have access to the Production environment. If many users must be enabled, you can complete this task more quickly by using the Microsoft Excel Add-In.
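On a sandbox environment where you have direct database access, the same task can also be done in SQL; the USERINFO table holds the enabled flag for each user. A hedged sketch for bulk-enabling users (the user IDs shown are examples, and this approach applies only to environments whose database you can reach, not to production):

```sql
-- Enable a specific set of users; the admin account is already enabled.
UPDATE dbo.USERINFO
SET ENABLE = 1
WHERE ID IN ('Alice', 'Bob');  -- example user IDs; replace with your own
```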
Are you looking for more tools to help you prepare backup files from your developer environments? Here are some other sources of information:
- D365fo.Tools provides many valuable tools that are created by the community.
- Community-provided open source projects on GitHub.