Automating moving NAV to Azure SQL and multitenancy

In the near future I have to migrate 40-50 single-tenant databases running on an on-premises SQL Server to Azure SQL and multitenancy. I'm new to multitenancy, Azure SQL, and PowerShell for that matter, but to save a lot of hours down the road I decided to create one PowerShell script that could do everything for me.

There are many different ways to achieve this goal, and the path I have chosen might not be right for you, but maybe you can use bits of this code to create your own script.

There are some prerequisites.

Here’s what the script does:

  • Sets a lot of variables
  • Logs in with your Azure account
  • Imports all the needed modules
  • Exports all companies from your single-tenant database into a .navdata file
  • Exports the NAV application data from the single-tenant database into a new database (in this case on an on-premises SQL Server 2014)
  • Gets all company names from the .navdata file and puts them into a variable
  • Deletes the [NT AUTHORITY\NETWORK SERVICE] user from the new application database (you can't import local Windows users or domain users into Azure SQL)
  • Connects to the SQL server and creates a .bacpac file from the new application database
  • Copies the .bacpac file from \\yourSQLserver\c$\pshtemp to c:\temp or whatever path you have set in the $NAVBackupFilePath variable
  • Copies the .bacpac file to Azure blob storage
  • Restores the .bacpac into a new database on the Azure SQL server
  • Connects to your NAV service tier server and configures the specified instance for multitenancy
  • Creates and imports a NAV encryption key
  • Asks the user to select which companies should be imported
  • For each company selected it will:
    • Create a database on Azure SQL
    • Import the NAV data into the database
    • Mount the tenant
    • Create a NAV user with the SUPER permission set
    • Set the Azure SQL database pricing tier to whatever you prefer (set in the variables)
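The core NAV steps in that list can be sketched with the standard NAV administration cmdlets. This is a simplified sketch, not the full script: server, database, and instance names are placeholders, error handling is omitted, and you should verify the cmdlet output properties (like CompanyList) against your NAV version.

```powershell
# Placeholder names - replace with your own environment values
$SourceServer   = "yourSQLserver"
$SourceDatabase = "NAV-SingleTenant"
$AppDatabase    = "NAV-App"
$ServerInstance = "DynamicsNAV90"
$NavDataFile    = "C:\temp\AllCompanies.navdata"

# Export all companies from the single-tenant database to a .navdata file
Export-NAVData -DatabaseServer $SourceServer -DatabaseName $SourceDatabase `
    -AllCompanies -FilePath $NavDataFile -Force

# Export the NAV application tables into a new application database
Export-NAVApplication -DatabaseServer $SourceServer -DatabaseName $SourceDatabase `
    -DestinationDatabaseName $AppDatabase

# Read the company names back out of the .navdata file
$Companies = (Get-NAVDataFile -FilePath $NavDataFile).CompanyList

# Drop the local Windows login - it cannot be imported into Azure SQL
Invoke-Sqlcmd -ServerInstance $SourceServer -Database $AppDatabase `
    -Query "DROP USER [NT AUTHORITY\NETWORK SERVICE]"

# Switch the service tier to multitenant mode and restart it
Set-NAVServerConfiguration -ServerInstance $ServerInstance `
    -KeyName "Multitenant" -KeyValue "true"
Set-NAVServerInstance -ServerInstance $ServerInstance -Restart
```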

This script is adapted specifically for our current server environment. Most people won't be able to use it out of the box, but if you're planning to create something similar for your environment, there might be something here that saves you some work.

Here's a simple drawing of the server environment used for testing this script.

NAV on Azure SQL environment

The script is run on the administration server. From there it connects to the SQL server, NAV server and Azure SQL server.

Since I'm new to PowerShell, the script is quite a mess. In this first version there is only one function. I have tried to comment most things, and I hope that helps a little. If I get the time, I will break it into a lot more functions instead of one very large script; that would make it much easier to adapt to other scenarios.

You can download the script here: or

To run the script, start PowerShell ISE as administrator and set all the needed variables. If you are uncomfortable putting in all your different credentials, you can change the lines that require credentials to prompt you for a user name and password instead – for example like this.

Change the line

$AzureAccount = Add-AzureAccount -Credential $AzureRmCredential

to

$AzureAccount = Add-AzureAccount -Credential (Get-Credential)

When running the script you will get some warnings while the modules load. Those warnings are harmless; I just haven't been able to suppress them, so please ignore them. If you know a solution, please tell me 🙂

You will be asked which companies you want to import, and you can of course choose as many as you like. Please note that the order in which you select the companies matters: the first company you select will get the tenant id 'default'.
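The selection and tenant-naming logic looks roughly like this (an illustrative sketch, not the script's exact prompt; the sample company names are just for demonstration):

```powershell
# $Companies holds the company names read from the .navdata file
$Companies = @("CRONUS Danmark", "CRONUS Danmark A/S")

# Show a numbered list and let the user type numbers in the order they want
for ($i = 0; $i -lt $Companies.Count; $i++) {
    Write-Host "$($i + 1): $($Companies[$i])"
}
$Numbers  = (Read-Host "Enter company numbers, comma-separated, in order") -split ','
$Selected = $Numbers | ForEach-Object { $Companies[[int]$_.Trim() - 1] }

# The first company selected becomes the 'default' tenant;
# the rest get their name, stripped of special characters, as tenant id
$TenantIds = foreach ($Company in $Selected) {
    if ($Company -eq $Selected[0]) { 'default' }
    else { $Company -replace '[^A-Za-z0-9]', '' }
}
```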

I have tested this with a Cronus database which contains two companies; in the Danish version they are "CRONUS Danmark" and "CRONUS Danmark A/S". The slash (/) caused me some trouble, so it will be removed from the database name and tenant id. I don't know if other languages have other special characters; if that's the case, you will have to replace those characters as well.
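The character stripping itself is a one-liner with the -replace operator; extend the character class if your localization uses other characters that are invalid in database names or tenant ids:

```powershell
$CompanyName = 'CRONUS Danmark A/S'

# Remove characters that cause trouble in database names and tenant ids;
# add more characters inside the brackets for other language versions
$SafeName = $CompanyName -replace '[/\\]', ''   # 'CRONUS Danmark AS'
```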

Running the script to import both CRONUS companies takes just around one hour for me on a 100/100 Mbit connection, so I expect it will take a lot longer with databases that contain a lot of data.

The script contains a few Start-Sleep commands; the longest is set to 300 seconds (5 minutes), after creating each Azure SQL database. It can probably be lowered if necessary, but before I added it I got a lot of errors like "Metadata for table 2000000151 not found". I have tried lower settings – I can't remember exactly, but I think it was 60 and 120 seconds – and that just wasn't enough.
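Per company, the create-wait-mount sequence looks roughly like this (a sketch with AzureRM-era cmdlets; the resource group, server, pricing tier, and credential variables are placeholders you must set yourself):

```powershell
# Create the tenant database on Azure SQL (placeholder names and tier)
New-AzureRmSqlDatabase -ResourceGroupName 'MyResourceGroup' -ServerName 'mynavsqlserver' `
    -DatabaseName $TenantDatabase -RequestedServiceObjectiveName 'S1'

# Without this pause, mounting often failed with
# "Metadata for table 2000000151 not found"; 60-120 seconds was not enough
Start-Sleep -Seconds 300

# Mount the new database as a tenant; $SqlCredential holds the Azure SQL login
Mount-NAVTenant -ServerInstance $ServerInstance -Id $TenantId `
    -DatabaseServer 'mynavsqlserver.database.windows.net' -DatabaseName $TenantDatabase `
    -DatabaseCredentials $SqlCredential
```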

I have a ton of ideas on how to improve this – the possibilities are endless – but for now this is what I have to offer. Moving all the different parts into functions will probably be my next move, when I get the time.

I hope this will be useful to other people as well; I know it's going to save me hours of work in the future. Please let me know if you're using it – that might motivate me to make more improvements 😉



