16 June 2022
By: Simon van Dijk

DataALM in a Dynamics 365 Finance & SCM environment – Part 2 of 2


Save time by automating

In the first part of this short blog series, we looked at an example of DataALM: how to run an automated database restore, which preparations you can make, and which post-refresh cleanup and reconfiguration activities are needed. Some of the described post-refresh tasks really do have to be performed manually, such as restoring backed-up financial reports, security configurations, data management templates and/or specific setup data that were newly created in the sandbox environment, are not yet available in the production database, and therefore need to be saved beforehand.

Most other post-refresh tasks, however, can be automated with scripting, completing the required activities much more quickly and efficiently. In this second and final part, we will discuss several options to accomplish this.


Automatic reconfiguration via SQL script

One scripting option is to use direct SQL to automatically create and update the needed environment-specific settings for the sandbox. You can create such a script in Microsoft SQL Server Management Studio running on, for example, your Tier-1 development sandbox: log on to the target sandbox database server, select the Azure database, and then execute “Generate Scripts…” from the Tasks menu.

Next, choose “Select specific database objects” and select all tables whose data you want to save. You will probably include at least the PrintMgmtSettings and PrintMgmtDocInstance tables, because it is always a pain that all print management settings are gone after a database refresh.

Then, in the “Scripting Options” step, press the Advanced button and set “Types of data to script” to “Data only”.

In the final steps, select “Save Scripts” and the data creation script is generated automatically. Afterwards, you can of course extend and update this script to meet your additional requirements.
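To give an idea of the result, below is a simplified sketch of what such a generated data-only script, extended with a manual addition, might look like. The column names and values are illustrative placeholders, not the actual table schemas:

```sql
-- Illustrative sketch: column names and values are simplified placeholders.
-- The generated part restores the saved print management data...
DELETE FROM [dbo].[PRINTMGMTSETTINGS];

INSERT INTO [dbo].[PRINTMGMTSETTINGS] ([PARENTID], [PRINTERNAME], [NUMBEROFCOPIES])
VALUES (5637144576, N'SANDBOX-PRINTER-01', 1);

-- ...and a manually added extension points an integration at its test
-- endpoint (MYINTEGRATIONPARAMETERS is a hypothetical table).
UPDATE [dbo].[MYINTEGRATIONPARAMETERS]
SET [SERVICEURL] = N'https://test.example.com/api';
```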


Run the SQL script in an Azure DevOps pipeline

The created SQL script can, for example, be executed via a PowerShell task in an Azure DevOps release pipeline. The downside is that before running this pipeline, you first need to request database access in LCS, which is only valid for 8 hours. So, although this automates some of the post-refresh tasks, this option still requires a few manual activities and is therefore only suited for ad hoc situations.

To get a Just-In-Time (JIT) username and password, log on to LCS, go to the “Full details” page of the environment whose database you have refreshed, and open the “Manage environment” section. There you select “Performance tuning for AX (write to AX)”, enter a short description in the “Details” field, and press the “Request Access” button.

Then copy the provided JIT username and password.

Next, you also have to enable access to the target Azure database by adding an Inbound Allow rule for the public IP address of the machine from which you will run the SQL script, for example your Tier-1 development sandbox. Select “Enable access” from the Maintain menu, then add the Inbound Allow rule by pressing the plus sign and entering the correct information.

After these extra steps, you can run the Azure DevOps pipeline. Log on to your customer’s Azure DevOps project, select Pipelines > Releases > Set environment-specific values, and press the “Create release” button.


In the dialog that appears, specify the JIT username and password retrieved from LCS and press Create.

Open the newly created release and select Deploy on the “Start VM” stage.

The “Run SQL script” stage is triggered automatically as soon as the “Start VM” stage has finished. The “Start VM” stage could, for example, be defined as follows:
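As a sketch, assuming an Azure PowerShell task with placeholder resource names:

```powershell
# Azure PowerShell task in the "Start VM" stage: start the Tier-1 VM
# that will run the SQL script. By default Start-AzVM waits until the
# VM is running before the stage completes.
# The resource group and VM name below are placeholders.
Start-AzVM -ResourceGroupName 'rg-d365-sandboxes' -Name 'vm-d365-dev01'
```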

To execute the SQL statements in PowerShell, the “Invoke-Sqlcmd” cmdlet is used. You can execute separate SQL statements with this cmdlet, or execute a SQL script file as in our example:
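A minimal sketch of that script task, assuming the JIT credentials are passed in as pipeline variables; the server name, database name, and script path are placeholders for your own values:

```powershell
# $(JitUser) and $(JitPassword) are assumed to be pipeline variables
# holding the JIT credentials requested in LCS; the server, database,
# and script path below are placeholders.
$params = @{
    ServerInstance = 'myserver.database.windows.net'
    Database       = 'mysandboxdb'
    Username       = '$(JitUser)'
    Password       = '$(JitPassword)'
    InputFile      = '$(System.DefaultWorkingDirectory)\SetEnvironmentSpecificValues.sql'
    QueryTimeout   = 300
}
Invoke-Sqlcmd @params
```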

Upon successful completion, this pipeline will have set all desired environment-specific values in your sandbox environment, such as the print management settings.


What about a Runnable class or a Custom service?

Another automatic reconfiguration scripting option is to create a runnable class (the good old job from AX) that sets the environment-specific values. Such a class can be run via the SysClassRunner functionality in Dynamics 365 and is especially useful for restoring the values of encrypted fields, which are cleared during the database refresh, because this cannot be done in a SQL script.
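A minimal sketch of such a runnable class; the parameter table and fields are hypothetical stand-ins for your own environment-specific settings:

```x++
// Hypothetical example: MyIntegrationParameters and its fields are
// stand-ins; substitute your own tables and values.
internal final class SetEnvironmentSpecificValues
{
    public static void main(Args _args)
    {
        MyIntegrationParameters parameters;

        ttsbegin;
        select forupdate firstonly parameters;
        // Re-point the integration at its test endpoint.
        parameters.ServiceUrl = 'https://test.example.com/api';
        // Encrypted fields cleared by the refresh can be restored here,
        // because X++ writes them through the application's encryption
        // layer - something a direct SQL script cannot do.
        parameters.ServicePassword = 'test-password'; // placeholder value
        parameters.update();
        ttscommit;

        info('Environment-specific values have been set.');
    }
}
```

Once deployed, the class can be triggered with a URL of the form https://&lt;environment-url&gt;/?cmp=USMF&mi=SysClassRunner&cls=SetEnvironmentSpecificValues.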

Instead of a runnable class, you can also build a custom web service endpoint that can be called on demand after the database refresh has completed.
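Sketched along the same lines, such a service class could simply wrap the runnable class; note that it still needs to be exposed through a Service and Service Group in the AOT before it can be called, and all names here are hypothetical:

```x++
// Hypothetical service class; expose it via a Service and Service Group
// in the AOT so it can be called on demand over the services endpoint.
internal final class EnvReconfigService
{
    public str setEnvironmentSpecificValues()
    {
        SetEnvironmentSpecificValues::main(new Args());
        return 'Environment-specific values have been set.';
    }
}
```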

In these code-based scripting options, you might also want to include the automatic reconfiguration that was done by the previously described SQL script. Of course, when you then want to change something, a code change is needed, whereas a SQL script can be changed at any time without the build and release processes that come with a code change. However, if you want to refresh the environment database frequently, for example every day, then these code-based scripting options really are the way to go!


Conclusion

This brings us to the end of this two-part blog series. In this final part, we have looked at several options for automating the post-refresh reconfiguration, handling the required activities much more quickly and efficiently.

I hope you enjoyed reading this blog series and learned a thing or two. Till next time!