Import-NAVData into an empty database

I have been struggling with this a few times, and I was sure that I succeeded earlier when trying to import a .navdata file into an empty SQL database, so this time I’m going to write a short post about it.

Looks like it is really quite simple.

If you have a .navdata file that contains application, application data, global data and some company data as well, then you can import this with PowerShell directly into an empty SQL Server database.

First you need to create the empty database and you can do this from Microsoft SQL Server Management Studio or the Azure Portal if you are on Azure SQL.
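If you prefer to script that step as well, a minimal sketch using Invoke-Sqlcmd (the server and database names are just placeholders) could look like this…

# Create the empty target database (requires the SqlServer or SQLPS module for Invoke-Sqlcmd)
Invoke-Sqlcmd -ServerInstance "MySqlServer" -Database "master" -Query "CREATE DATABASE [MyDatabase]"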

Then open up PowerShell and import your NAV-modules. I use the Import-NAVModules cmdlet from Cloud.Ready.Software.NAV, but you can also use…

Import-Module "C:\Program Files (x86)\Microsoft Dynamics NAV\11.0.20783\Service\Microsoft.Dynamics.Nav.Management.dll"

Then run Import-NAVData…

Import-NAVData -DatabaseServer <SQL server name> -DatabaseName <Database name> -ApplicationDatabaseServer <SQL server name> -ApplicationDatabaseName <Database name> -FilePath "C:\temp\backup.navdata" -IncludeApplication -IncludeApplicationData -IncludeGlobalData -AllCompanies -CommitPerTable -Force

Note that this example is for a single-tenant database. If you are using named SQL Server instances, you can specify mysqlserver\myinstance for -DatabaseServer and -ApplicationDatabaseServer.

Does it work on Azure SQL? Yes it does…

Import-NAVData -DatabaseServer $azuresql -DatabaseName $azuresqldb -DatabaseCredentials $azuresqlcred -ApplicationDatabaseServer $azuresql -ApplicationDatabaseName $azuresqldb -ApplicationDatabaseCredentials $azuresqlcred -FilePath "C:\temp\backup.navdata" -IncludeApplication -IncludeApplicationData -IncludeGlobalData -AllCompanies -CommitPerTable -Force
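For reference, the variables used above could be set up something like this (the server and database names are placeholders for your own Azure SQL server)…

$azuresql = "myazuresqlserver.database.windows.net"
$azuresqldb = "MyDatabase"
$azuresqlcred = Get-Credential -Message "Azure SQL login"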

If you omit the -Force parameter you will get this…

Import-NAVData : System table $ndo$dbproperty is missing or database MyDatabase is not a Dynamics NAV application database.
At line:1 char:1
+ Import-NAVData -DatabaseServer $azuresql -DatabaseName $azuresqldb -D …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (0:Int32) [Import-NAVData], NavSqlException
    + FullyQualifiedErrorId : agidonsql2.database.windows.net\nhtest2,Microsoft.Dynamics.Nav.Management.Cmdlets.ImportNavData

EDIT: I noticed that the actual company data was not imported with the above commands, and I am not sure why. If you experience the same, you can run the commands again without the -IncludeApplication, -IncludeApplicationData and -IncludeGlobalData parameters.
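In other words, run something like this (same placeholders as above)…

Import-NAVData -DatabaseServer <SQL server name> -DatabaseName <Database name> -ApplicationDatabaseServer <SQL server name> -ApplicationDatabaseName <Database name> -FilePath "C:\temp\backup.navdata" -AllCompanies -CommitPerTable -Force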

Also, I got this message on Azure when the Import-NAVData command finished…

Import-NAVData : Error while trying to set the collation for database nhtest2 containing tenant default. The error was:
The following SQL error was unexpected:
The database could not be exclusively locked to perform the operation.
ALTER DATABASE failed. The default collation of database 'nhtest2' cannot be set to Danish_Greenlandic_100_CS_AS.

I tried deleting the database and recreating it from the Azure Portal, this time specifying 'Danish_Greenlandic_100_CS_AS' as the collation. After that I was able to restore everything to the Azure SQL database.
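If you would rather script that step than use the portal, a rough sketch with Invoke-Sqlcmd against the master database on the Azure SQL server (the server name, login and database name are placeholders) could be…

# Create the database with the correct collation up front, so Import-NAVData does not have to change it afterwards
Invoke-Sqlcmd -ServerInstance "myazuresqlserver.database.windows.net" -Database "master" -Username "sqladmin" -Password "MyPassword" -Query "CREATE DATABASE [nhtest2] COLLATE Danish_Greenlandic_100_CS_AS"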

 


RDP – An internal error has occurred

I’m using a Let’s Encrypt SSL certificate with Remote Desktop Services and have set up a script that automatically renews the certificate and updates it for the Remote Desktop Services deployment.

The script renews the certificate just fine, but after the server has been rebooted I can no longer connect via RDP and get the error ‘An internal error has occurred’. Not very informative. The server is an Azure VM, and I tried every suggestion I could find, including restarting, redeploying and resetting the password. The boot diagnostics wouldn’t even update the screenshot.

I bet there are more ways to solve this, but here’s what works for me.

On another server in the same domain, start the Server Manager. Click on ‘Add other servers to manage’, select the troubled server and add it. Wait while the server’s roles are being fetched. The ‘Remote Desktop Services’ role should appear in the list on the left. Click on it, and in the ‘DEPLOYMENT OVERVIEW’, under ‘TASKS’, select ‘Edit Deployment Properties’.

Now select Certificates, select ‘RD Connection Broker – Enable Single Sign On’ and click the ‘Select existing certificate…’ button.


Enter the password for the certificate (or choose a different certificate) and mark ‘Allow the certificate to be added to the Trusted Root…’. Click OK.

That’s what I had to do to fix my problem, hope this will save someone else some time. Now I just have to figure out what I have to do to prevent it from happening every time the certificate is renewed.
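One idea I want to try is to let the renewal script push the new certificate into the deployment itself, using the RemoteDesktop module. A rough sketch (the certificate path, password and connection broker name are placeholders, and I have not tested this as part of the renewal yet) could look like this…

# Apply the renewed PFX to all four RDS certificate roles (run against the connection broker)
Import-Module RemoteDesktop
$pfxPath = "C:\Certificates\rds.pfx"             # placeholder: path to the renewed certificate
$pfxPassword = Read-Host -Prompt "PFX password" -AsSecureString
$broker = "broker.mydomain.local"                # placeholder: connection broker FQDN
foreach ($role in 'RDGateway','RDWebAccess','RDRedirector','RDPublishing') {
    Set-RDCertificate -Role $role -ImportPath $pfxPath -Password $pfxPassword -ConnectionBroker $broker -Force
}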

How to restore Microsoft SQL database from Azure blob (with special characters in logical name)

I was struggling with this, and now that I have found a solution I would like to share it 🙂

I have set up Managed Backup on a SQL Server, and now I have a lot of backup files located in a storage account on Azure. I needed to figure out how to use these files for a database restore. First I tried simply using SQL Server Management Studio, but I didn’t get very far: I can select the blob storage, but for some reason it does not show any files.
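To find the exact file names to restore from, you can instead list the blobs in the container with PowerShell. A small sketch (the storage account name, key and container name are placeholders) could be…

# List the backup files in the Azure blob container (Azure.Storage module)
$ctx = New-AzureStorageContext -StorageAccountName "mybackupstorage" -StorageAccountKey "<storage account key>"
Get-AzureStorageBlob -Container "mybackupcontainer" -Context $ctx | Select-Object Name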

The next option was restoring using a query.

I wanted to restore the database to a new server and have the database files placed in a different folder. To do that I needed to know the logical names of all the files in the database backup, which you can get with this query…

RESTORE FILELISTONLY
FROM URL = 'https://mybackupstorage.blob.core.windows.net/mybackupcontainer/MyDatabase 2015_726ef093fa6d4165b0492221922832bb_20170614141530+02.bak'

 

SQLrestore1

The result of this query lists the logical names of all the database files, and the physical location of the files on the server where the backup was taken.

Notice that the above logical names contain parentheses. I could not find a way to escape these, so instead I used variables.

DECLARE @fn1 VARCHAR(50)
DECLARE @fn2 VARCHAR(50)
SET @fn1 = 'Demo Database NAV (7-0)_data'
SET @fn2 = 'Demo Database NAV (7-0)_log'

RESTORE DATABASE [My database 2015]
FROM URL = 'https://mybackupstorage.blob.core.windows.net/mybackupcontainer/Mydatabase 2015_726ef093fa6d4165b0492221922832bb_20170614141530+02.bak'
WITH
MOVE @fn1 TO 'C:\SQL data\Mydatabase2015_Data.mdf'
,MOVE @fn2 TO 'C:\SQL data\Mydatabase2015_Log.ldf'

 

The result should hopefully be something like this…

SQLrestore2

Azure SQL Long-term backup retention

Azure SQL Long-term retention just went into public preview. I was looking for a guide on how to set it up, but I couldn’t find anything, so here is my own little guide 🙂

  • Log into your Azure portal (https://portal.azure.com)
  • Click on azure_sql_ltr_ss3 and search for ‘backup’. Click the result ‘Backup and Site Recovery (OMS)’ to create a new Recovery Services vault.

azure_sql_ltr_ss5

  • Fill out the form. Select the subscription that also contains your SQL server
  • Select the Resource group that your SQL server is also a member of
  • Select your preferred Location
  • Click Create
  • Wait a few seconds while your Recovery Services vault is being created
  • Go into SQL Servers
  • Click on your SQL server
  • Click on the new setting called “Long-term backup retention”

azure_sql_ltr_ss1

  • If you have not yet accepted the preview terms you will see this message:

azure_sql_ltr_ss2

  • Click the message, click the checkbox that shows up, and click the OK button
  • Select the databases you would like to back up
  • Click the Configure button on top of the pane
  • Select your Recovery service vault
  • Create a new retention policy by filling out the name and choosing the retention period
  • Click OK

azure_sql_ltr_ss6

  • Click the Save button on top of the pane

azure_sql_ltr_ss7

  • That should be it 🙂

Loading all NAV cmdlets in your Powershell profile

If you often need to run both the NAV Development Shell and the NAV Administration Shell, and you’re tired of opening two windows or loading both modules manually, here is a solution.

First of all, check whether you already have a profile; if not, you have to create one. Please check out this post to see how it’s done: http://www.howtogeek.com/50236/customizing-your-powershell-profile/
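In short, you can check for the profile and create it like this…

# Check whether a PowerShell profile already exists for the current user, and create it if not
if (!(Test-Path $PROFILE)) { New-Item -ItemType File -Path $PROFILE -Force }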

I copied most of the examples from the above site, but had to change a few things because I got some errors and warnings. I also added the two Import-Module lines to load the NAV cmdlets. Here is what I ended up with…

Set-Location C:
$Shell = $Host.UI.RawUI
$size = $Shell.WindowSize
$size.width=200
$size.height=60
$Shell.WindowSize = $size
$size = $Shell.BufferSize
$size.width=200
$size.height=2000
$Shell.BufferSize = $size
$Shell.WindowTitle="NAV console"
Import-Module 'C:\Program Files\Microsoft Dynamics NAV\90\Service\NavAdminTool.ps1'
Import-Module 'C:\Program Files (x86)\Microsoft Dynamics NAV\90\RoleTailored Client\Microsoft.Dynamics.Nav.Model.Tools.psd1'
Clear-Host
Write-Host -ForegroundColor Green "All NAV 2016 cmdlets have been loaded."

It’s easier to copy the code from here.

NAV Multitenant sync

I haven’t worked with multitenant databases before, so I had to figure out how to get the database objects updated. It’s not a big deal really, but here is how I did it, using PowerShell of course 😀

First I tried to run this command (you need to run it from the Dynamics NAV 20xx Development Shell):

Import-NAVApplicationObject -Path "\\tsclient\U\Nicolai\300516.fob" -NavServerInstance MyInstance -DatabaseName "My 2016 APP" -DatabaseServer mydatabaseserver.database.windows.net -LogPath 'C:\Temp\logfile.txt' -ImportAction Overwrite -SynchronizeSchemaChanges Yes -Username <your_user_name> -Password <your_password>

And I ran into this error message:

Check that:
. The Microsoft Dynamics NAV Server is running
. There is only one tenant mounted on the server instance.
At line:1 char:1
+ Import-NAVApplicationObject -Path "\\tsclient\U\Nicolai\300516.fob" - ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Import-NAVApplicationObject

Then I tried changing SynchronizeSchemaChanges to No, like this:

Import-NAVApplicationObject -Path "\\tsclient\U\Nicolai\300516.fob" -NavServerInstance MyInstance -DatabaseName "My 2016 APP" -DatabaseServer mydatabaseserver.database.windows.net -LogPath 'C:\Temp\logfile.txt' -ImportAction Overwrite -SynchronizeSchemaChanges No -Username <your_user_name> -Password <your_password>

And voila, it worked! But we have not synchronized the schema changes yet. Take a look at the state of your tenants, either in the NAV Administration GUI or with the Get-NAVTenant -ServerInstance myinstance cmdlet (from the NAV Administration Shell).
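For example, you can list just the id and state of each tenant like this…

Get-NAVTenant -ServerInstance myinstance | Select-Object Id, State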

nav2016_tenant_sync_status

Notice the state of your tenants. The default tenant might say Mounted, but the other tenants say OperationalWithSyncPending. If you are ready to sync one or more of your tenants, you have to run the Sync-NAVTenant cmdlet from your NAV Administration Shell.

Sync-NAVTenant -ServerInstance myNAVInstance -Tenant myTenant -Mode Sync -Force

That should be it. You might need to set the mode to ForceSync instead of Sync; this is required when the table changes cause loss of data. Please note that you need to sync the default tenant as well; its state should also end up as Operational.

If you run the Get-NAVTenant cmdlet again you will see that the state has changed to Operational.

nav2016_tenant_sync_status2

If you want to sync all your tenants at once you could do something like this:

Get-NAVTenant -ServerInstance myNAVInstance | Sync-NAVTenant -Mode ForceSync -Force

You can add the -Verbose parameter if you want more output, and you can leave out the -Force parameter if you want to confirm every tenant and maybe skip some of them.

Or take a look at Waldo’s blog – http://www.waldo.be/2014/07/17/nav-2013-r2-multi-tenancy-force-full-sync-on-all-tenants/ – if you want to control the output you get.

 

 

Automating moving NAV to Azure SQL and multitenancy

In the near future I have to migrate 40-50 single-tenant databases running on an on-premises SQL Server to Azure SQL and multitenancy. I’m new to multitenancy, Azure SQL and PowerShell for that matter, but in order to save a lot of time in the future, I decided to create one PowerShell script that could do everything for me.

There are many, many different ways to achieve this goal; the path I have chosen might not be right for you, but maybe you can use bits of this code to create your own script.

There are some prerequisites.

Here’s what the script does:

  • Sets a lot of variables
  • Logs in with your Azure account
  • Imports all the needed modules
  • Exports all companies from your single-tenant database into a .navdata file
  • Exports the NAV application data from the single-tenant database into a new database (in this case on an on-premises SQL 2014 server)
  • Gets all company names from the .navdata file and puts them into a variable
  • Deletes the [NT AUTHORITY\NETWORK SERVICE] user from the new application database (you can’t import local Windows users or domain users to Azure SQL)
  • Connects to the SQL server and creates a .bacpac file from the new application database
  • Copies the .bacpac file from \\yourSQLserver\c$\pshtemp to c:\temp or whatever path you have set in the $NAVBackupFilePath variable
  • Copies the .bacpac file to Azure blob storage
  • Restores the .bacpac into a new database on the Azure SQL server
  • Connects to your NAV service tier server and configures the specified instance for multitenancy
  • Creates and imports a NAV encryption key
  • Asks the user to select which companies should be imported
  • For each company selected it will do this (see the sketch after this list):
    • Create a database on Azure SQL
    • Import NAV data into the database
    • Mount the tenant
    • Create a NAV user with permission set SUPER
    • Set the Azure SQL database pricing tier to whatever you prefer (set in variables)
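To give an idea of the last part, here is a heavily simplified sketch of the per-company loop. All the variable names, the admin user name and the parameter values are placeholders from my own setup, so treat it as an outline rather than working code…

# Simplified sketch of the per-company part of the script
# $companies, $azuresql, $azuresqlcred, $NAVBackupFilePath and $navInstance are set earlier in the script
$navAdminPassword = Read-Host -Prompt "Password for the NAV admin user" -AsSecureString
foreach ($company in $companies) {
    # Tenant id and database name = the company name with the slash removed (see the note about special characters below)
    $tenantId = $company -replace '/', ''

    # Create an empty tenant database on the Azure SQL server
    Invoke-Sqlcmd -ServerInstance $azuresql -Database "master" -Username $azuresqlcred.UserName -Password $azuresqlcred.GetNetworkCredential().Password -Query "CREATE DATABASE [$tenantId]"
    Start-Sleep -Seconds 300   # give Azure SQL time to finish creating the database

    # Import the company from the .navdata file into the new tenant database
    Import-NAVData -DatabaseServer $azuresql -DatabaseName $tenantId -DatabaseCredentials $azuresqlcred -FilePath "$NAVBackupFilePath\backup.navdata" -CompanyName $company -Force

    # Mount the database as a tenant on the multitenant service tier
    Mount-NAVTenant -ServerInstance $navInstance -Id $tenantId -DatabaseServer $azuresql -DatabaseName $tenantId -DatabaseCredentials $azuresqlcred -OverwriteTenantIdInDatabase

    # Create a NAV user with the SUPER permission set in the new tenant
    New-NAVServerUser -ServerInstance $navInstance -Tenant $tenantId -UserName "admin" -Password $navAdminPassword
    New-NAVServerUserPermissionSet -ServerInstance $navInstance -Tenant $tenantId -UserName "admin" -PermissionSetId SUPER
}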

This script is adapted specifically to our current server environment. Most people won’t be able to use it out of the box, but if you’re planning on creating something similar for your environment, there might be something here that saves you some work.

Here’s a simple drawing of the server environment used for testing this script.

NAV on Azure SQL environment

The script is run on the administration server. From there it connects to the SQL server, NAV server and Azure SQL server.

Since I’m new to PowerShell, the script is quite a mess. In this first version there is only one function. I tried to comment most things, and I hope that helps a little bit. If I get the time I will split it into a lot more functions instead of just one very large script; that would make it much easier to adapt to other scenarios.

You can download the script here: https://gist.github.com/anonymous/24e6bbb159120ce7763d4f3ca522d5cc or http://www.filedropper.com/movenavtoazuresql

To run the script, start PowerShell ISE and run it as administrator. Set all the needed variables. If you are uncomfortable putting in all your different credentials, you can change the lines that require credentials to prompt you for a user name and password instead – for example like this:

Change the line
$AzureAccount = Add-AzureAccount -Credential $AzureRmCredential
to
$AzureAccount = Add-AzureAccount -Credential (Get-Credential)

When running the script you will get some warnings when loading the modules. Those warnings are harmless; I just haven’t been able to suppress them, so please ignore them. If you know a solution, please tell me 🙂

You will be asked which companies you want to import. You can of course choose all the companies you like. Please note that the order in which you select the companies matters: the first company you select will get the tenant id ‘default’.

I have tested this with a Cronus database which contains two companies; in the Danish version they are “CRONUS Danmark” and “CRONUS Danmark A/S”. The slash (/) caused me some trouble, so it is removed from the database name and tenant id. I don’t know if other languages have other special characters; if that’s the case, you will have to replace those characters as well.

Running the script to import both CRONUS companies takes around one hour for me on a 100/100 Mbit connection, so I expect it will take a lot longer if your databases contain a lot of data.

The script contains a few Start-Sleep commands; the longest is set to 300 seconds (5 minutes), after creating each Azure SQL database. This can probably be lowered if necessary, but before I added it I got a lot of errors like “Metadata for table 2000000151 not found”. I have tried lower settings, I can’t remember exactly but I think it was 60 and 120 seconds, and that just wasn’t enough.

I have a ton of ideas on how to improve this and the possibilities are endless, but for now this is what I have to offer. Moving all the different parts into functions will probably be my next move, when I get the time.

I hope this will be useful to other people as well; I know it’s going to save me hours of work in the future. Please let me know if you’re using it, that might motivate me to make more improvements 😉

/Nicolai