Import-NAVData into an empty database

I have struggled with this a few times, and I was sure I had succeeded before when importing a .navdata file into an empty SQL database, so this time I'm writing a short post about it.

Looks like it is really quite simple.

If you have a .navdata file that contains application, application data, global data and some company data as well, then you can import this with PowerShell directly into an empty SQL Server database.

First, create the empty database. You can do this from Microsoft SQL Server Management Studio, or from the Azure Portal if you are on Azure SQL.

Then open PowerShell and import the NAV modules. I use the Import-NAVModules cmdlet from Cloud.Ready.Software.NAV, but you can also use…

Import-Module "C:\Program Files (x86)\Microsoft Dynamics NAV\11.0.20783\Service\Microsoft.Dynamics.Nav.Management.dll"

Then run Import-NAVData…

Import-NAVData -DatabaseServer <SQL server name> -DatabaseName <Database name> `
    -ApplicationDatabaseServer <SQL server name> -ApplicationDatabaseName <Database name> `
    -FilePath "C:\temp\backup.navdata" `
    -IncludeApplication -IncludeApplicationData -IncludeGlobalData `
    -AllCompanies -CommitPerTable -Force

Note that this example uses a single-tenant database. If you are using named SQL Server instances, you can specify mysqlserver\myinstance for -DatabaseServer and -ApplicationDatabaseServer.

Does it work on Azure SQL? Yes it does…

Import-NAVData -DatabaseServer $azuresql -DatabaseName $azuresqldb -DatabaseCredentials $azuresqlcred `
    -ApplicationDatabaseServer $azuresql -ApplicationDatabaseName $azuresqldb -ApplicationDatabaseCredentials $azuresqlcred `
    -FilePath "C:\temp\backup.navdata" `
    -IncludeApplication -IncludeApplicationData -IncludeGlobalData `
    -AllCompanies -CommitPerTable -Force
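The credential variables above can be built like this. This is only a sketch; the login name "azuresqladmin" and the password are placeholders for your own Azure SQL login:

```powershell
# Build a PSCredential for the Azure SQL login non-interactively.
# Login name and password are placeholders - use your own.
$secpw = ConvertTo-SecureString "YourStrongPassword" -AsPlainText -Force
$azuresqlcred = New-Object System.Management.Automation.PSCredential ("azuresqladmin", $secpw)

# Or simply prompt for them:
$azuresqlcred = Get-Credential -Message "Azure SQL login"
```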

If you omit the -Force parameter you will get this…

Import-NAVData : System table $ndo$dbproperty is missing or database MyDatabase is not a Dynamics NAV application database
At line:1 char:1
+ Import-NAVData -DatabaseServer $azuresql -DatabaseName $azuresqldb -D …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (0:Int32) [Import-NAVData], NavSqlException
+ FullyQualifiedErrorId : nhtest2,Microsoft.Dynamics.Nav.Management.Cmdlets.Import

EDIT: I noticed that the actual company data was not imported with the above commands; I'm not sure why. If you experience the same, you can run the command again without the -IncludeApplication, -IncludeApplicationData and -IncludeGlobalData parameters.
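That second run, with only the company data, would then look something like this (same placeholders as in the first example):

```powershell
# Re-run without the application/global switches to bring in the company data
Import-NAVData -DatabaseServer <SQL server name> -DatabaseName <Database name> `
    -ApplicationDatabaseServer <SQL server name> -ApplicationDatabaseName <Database name> `
    -FilePath "C:\temp\backup.navdata" `
    -AllCompanies -CommitPerTable -Force
```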

Also I got this message on Azure, when the Import-NAVData command finished…

Import-NAVData : Error while trying to set the collation for database nhtest2 containing tenant default. The error was:
The following SQL error was unexpected:
The database could not be exclusively locked to perform the operation.
ALTER DATABASE failed. The default collation of database 'nhtest2' cannot be set to Danish_Greenlandic_100_CS_AS.

I tried deleting the database and recreating it from the Azure Portal, specifying 'Danish_Greenlandic_100_CS_AS' as the collation. After that I was able to restore everything to the Azure SQL database.
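If you prefer scripting the recreation over the portal, the AzureRM cmdlets can create the database with the right collation up front. A sketch, where the resource group, server and database names are placeholders for your own:

```powershell
# Create the Azure SQL database with the NAV database's collation from the start,
# so Import-NAVData does not need to change it afterwards.
New-AzureRmSqlDatabase -ResourceGroupName "MyResourceGroup" `
    -ServerName "myazuresqlserver" -DatabaseName "nhtest2" `
    -CollationName "Danish_Greenlandic_100_CS_AS"
```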



Automating moving NAV to Azure SQL and multitenancy

In the near future I have to migrate 40-50 single-tenant databases running on an on-premises SQL Server to Azure SQL and multitenancy. I'm new to multitenancy, Azure SQL, and PowerShell for that matter, but to save a lot of hours down the road I decided to create one PowerShell script that could do everything for me.

There are many different ways to achieve this goal. The path I have chosen might not be right for you, but maybe you can use bits of this code to create your own script.

There are some prerequisites.

Here’s what the script does:

  • Sets a lot of variables
  • Logs in with your Azure account
  • Imports all the needed modules
  • Exports all companies from your single-tenant database into a .navdata file
  • Exports the NAV application data from the single-tenant database into a new database (in this case on an on-premises SQL Server 2014)
  • Gets all company names from the .navdata file and puts them into a variable
  • Deletes the [NT AUTHORITY\NETWORK SERVICE] user from the new application database (you can't import local Windows users or domain users to Azure SQL)
  • Connects to the SQL server and creates a .bacpac file from the new application database
  • Copies the .bacpac file from \\yourSQLserver\c$\pshtemp to c:\temp, or whatever path you have set in the $NAVBackupFilePath variable
  • Copies the .bacpac file to Azure blob storage
  • Restores the .bacpac into a new database on the Azure SQL server
  • Connects to your NAV service tier server and configures the specified instance for multitenancy
  • Creates and imports a NAV encryption key
  • Asks the user to select which companies should be imported
  • For each company selected, it will:
    • Create a database on Azure SQL
    • Import NAV data into the database
    • Mount the tenant
    • Create a NAV user with permission set SUPER
    • Set the Azure SQL database pricing tier to whatever you prefer (set in variables)
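The per-company loop above can be sketched roughly like this. This is not the actual script: every variable name is a placeholder, and the exact cmdlet parameters may differ between NAV versions, so treat it as an outline of the steps rather than working code:

```powershell
foreach ($company in $selectedCompanies) {
    # Derive a tenant id / database name from the company name
    # (stripping characters that cause trouble, like the slash)
    $tenantId = $company -replace '[/ ]', ''

    # Create a database on Azure SQL
    New-AzureRmSqlDatabase -ResourceGroupName $resourceGroup -ServerName $azureSqlServer `
        -DatabaseName $tenantId -CollationName $collation

    # Import this company's data from the .navdata export
    Import-NAVData -DatabaseServer $azureSqlServerFqdn -DatabaseName $tenantId `
        -DatabaseCredentials $azuresqlcred -FilePath $navDataFile `
        -CompanyName $company -Force

    # Mount the tenant on the multitenant service tier
    Mount-NAVTenant -ServerInstance $navInstance -Tenant $tenantId `
        -DatabaseServer $azureSqlServerFqdn -DatabaseName $tenantId `
        -DatabaseCredentials $azuresqlcred

    # Create a NAV user and assign the SUPER permission set
    New-NAVServerUser -ServerInstance $navInstance -Tenant $tenantId -UserName $navUser
    New-NAVServerUserPermissionSet -ServerInstance $navInstance -Tenant $tenantId `
        -UserName $navUser -PermissionSetId 'SUPER'

    # Set the pricing tier
    Set-AzureRmSqlDatabase -ResourceGroupName $resourceGroup -ServerName $azureSqlServer `
        -DatabaseName $tenantId -RequestedServiceObjectiveName $pricingTier
}
```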

This script is adapted specifically to our current server environment. Most people won't be able to use it out of the box, but if you're planning to create something similar for your environment, there might be something here that saves you some work.

Here's a simple drawing of the server environment used for testing this script.

NAV on Azure SQL environment

The script is run on the administration server. From there it connects to the SQL server, NAV server and Azure SQL server.

Since I'm new to PowerShell, the script is quite a mess. In this first version there is only one function. I have tried to comment most things; I hope that helps a little. If I get the time, I will break it into more functions instead of one very large script, which would make it much easier to adapt to other scenarios.

You can download the script here: or

To run the script, you must start PowerShell ISE as administrator. Set all the needed variables. If you are uncomfortable putting all your different credentials into the script, you can change the lines that require credentials to prompt for a user name and password instead, for example like this:

Change the line

$AzureAccount = Add-AzureAccount -Credential $AzureRmCredential

to

$AzureAccount = Add-AzureAccount -Credential (Get-Credential)

When running the script you will get some warnings when loading the modules. These warnings are harmless; I just haven't been able to suppress them, so please ignore them. If you know a solution, please tell me 🙂

You will be asked which companies you want to import. You can of course choose all the companies you like. Please note that the order in which you select the companies matters: the first company you select will get the tenant id 'default'.

I have tested this with a Cronus database which contains two companies; in the Danish version they are "CRONUS Danmark" and "CRONUS Danmark A/S". The slash (/) caused me some trouble, so it is removed from the database name and tenant id. I don't know if other languages have other special characters; if so, you will have to replace those characters as well.
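The replacement itself is a one-liner with PowerShell's -replace operator; you can extend the character class with whatever other characters your language's company names contain:

```powershell
# Strip characters that are not wanted in database/tenant names.
# Extend the [...] character class as needed for your language.
$companyName = "CRONUS Danmark A/S"
$tenantId = $companyName -replace '[/]', ''   # -> "CRONUS Danmark AS"
```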

Running the script to import both CRONUS companies takes just around one hour for me on a 100/100 Mbit connection, so I expect it will take a lot longer with databases that contain a lot of data.

The script contains a few Start-Sleep commands; the longest is set to 300 seconds (5 minutes), after creating each Azure SQL database. This one can probably be lowered if necessary. Before I added it I got a lot of errors like "Metadata for table 2000000151 not found". I have tried lower settings (I can't remember exactly, but I think it was 60 and 120 seconds), and that just wasn't enough.

I have a ton of ideas on how to improve this; the possibilities are endless, but for now this is what I have to offer. Moving the different parts into functions will probably be my next move, when I get the time.

I hope this will be useful to other people as well; I know it's going to save me hours of work in the future. Please let me know if you're using it, that might motivate me to make more improvements 😉