
Deploy an AD Domain in IaaS to Azure with ARM and DSC

Quick Update

I’ve been working with Ravello to build out a vSphere environment, but have had to put that on hold for the moment to take a closer look at Azure.  I’ve been working with Azure for a while in a hands-off capacity, which has been rewarding in its complexity but lacking in hands-on opportunities.  A chance recently presented itself to get hands on with Azure again, and I’ve jumped at it.

I’ve registered for the 70-533 exam with a retake and practice test bundle.  To give my learning some impetus, I like to set myself a challenge, as those familiar with my TOGAF posts will have read.

With a more than solid grasp of the basics, I wanted to develop something a bit more meaningful that might be of continuing practical use.  I set myself the challenge of scripting the build of an AD domain controller in Azure using ARM and DSC.  The aim here is to have a re-usable building block for a lab environment: something that can be used as a base for more exciting templates and configurations.

This blog post will as ever help me solidify my learning.


The basic tools you need to follow this guide are as follows:

  1. An Azure Subscription, available as a trial account (with some limitations on core usage etc.) or as part of an MSDN subscription.  I used a trial account for everything that follows.
  2. Visual Studio, free community versions available from Microsoft.
  3. PowerShell ISE, available alongside a host of other goodies from Microsoft.
  4. GitHub account and a repository to host extended DSC configuration.
  5. BitBucket account and a repository for software version control.  GitHub charges for private repositories and BitBucket doesn’t, so it was a better fit for me.

PowerShell is going to need some modules imported if they are not already available.  You can see which modules are available by running ‘Get-Module‘.  Run that command looking for ‘Microsoft.PowerShell.Management‘ and ‘xActiveDirectory‘ in the output.  If they are not listed, the modules need to be imported.  At this point you can either download them to a known location and add them via ‘Import-Module -Name Microsoft.PowerShell.Management‘ or ‘Import-Module -Name xActiveDirectory‘ accordingly.

Or you can get them from the PowerShell Gallery.  In order to bring them in from that source you need to set the source to trusted.  This can be done via the following command: ‘Set-PSRepository -Name PSGallery -InstallationPolicy Trusted‘.  Working with a trusted repository, the modules can now be installed using ‘Install-Module -Name xActiveDirectory‘ etc.
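Putting those commands together, a minimal check-and-install session looks something like this (the gallery trust step only needs doing once per machine):

```powershell
# Check whether the required modules are already available
Get-Module -ListAvailable -Name Microsoft.PowerShell.Management, xActiveDirectory

# Trust the PowerShell Gallery so modules install without prompting
Set-PSRepository -Name PSGallery -InstallationPolicy Trusted

# Pull down the DSC resource module used later in this guide
Install-Module -Name xActiveDirectory
```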

Getting started in Visual Studio

To get started we want to create a new Azure Resource Group project from within Visual Studio.  To do that:

  1. Open Visual Studio (run as administrator).
  2. File > New > Project.
  3. Underneath Installed > Templates > Visual C# > Cloud, select ‘Azure Resource Group’.
  4. Give the project a name and a solution name.

The new project wizard is then going to allow you to select the type of resource that you want to deploy into Azure.  Selecting a template will give you all of the files that need to be configured to script the creation of resources in Azure.

The base elements needed to deploy the server to host our new forest are contained within the ‘Windows Virtual Machine’ template.

This template creates the following files within the solution: ‘WindowsVirtualMachine.json‘, ‘WindowsVirtualMachine.parameters.json‘ and ‘Deploy-AzureResourceGroup.ps1‘.

‘Deploy-AzureResourceGroup.ps1‘ – This PowerShell script references the JSON template and configuration data to drive the deployment of the solution to Azure.  When you deploy the solution, you are in effect running this script, which calls information from the JSON templates and configuration data.

‘WindowsVirtualMachine.parameters.json’ – This contains the per-deployment configuration data that is fed into the ARM template.

At the moment this file contains no information and will prompt the user for an administrative username and DNS name for the public interface.

‘WindowsVirtualMachine.json’ – This is the template.  Through manipulation of this file the VM that will be deployed can be configured and managed.  Through the addition and use of ‘copy‘ elements multiple resources can be deployed, and through extension of the template VMs can be added to existing directories.  Both of those topics deserve more in-depth coverage so I will leave them for a later date.

The JSON outline contains the resources that the template is configured to deploy.  You can see with a sample template that the deployment includes a Storage Account, Public IP Address, Virtual Network, Network Interface and a Virtual Machine configured with an extension for Azure Diagnostics.

In other words it provides all of the resources that are needed to progress with the deployment.
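As a rough sketch (resource types from the Azure provider namespaces, with all properties omitted), the resources array of the template covers:

```json
{
  "resources": [
    { "type": "Microsoft.Storage/storageAccounts" },
    { "type": "Microsoft.Network/publicIPAddresses" },
    { "type": "Microsoft.Network/virtualNetworks" },
    { "type": "Microsoft.Network/networkInterfaces" },
    { "type": "Microsoft.Compute/virtualMachines" }
  ]
}
```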

Desired State Configuration

Desired State Configuration (DSC) is an essential part of the configuration, management and maintenance of Windows-based servers. It allows a PowerShell script to specify the configuration of the machine using a declarative model in a simple standard way that is easy to maintain and understand.

DSC can be used to specify the configuration of the resources that are delivered in the template.  More specifically it can be used to specify the configuration of an AD forest on the ‘VirtualMachine‘ resource.

We add the DSC extension by right-clicking the ‘VirtualMachine‘ resource and selecting ‘Add New Resource‘.  This loads an ‘Add Resource‘ wizard that requests a name and the virtual machine that the extension applies to.

After adding this, the DSC extension will appear in the JSON outline, adding a section to the template and a PowerShell script to the solution.

The default DSC PowerShell script will contain the following; it’s worth having a quick look to see the sort of features that can be deployed this way.
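I won’t swear to it being byte-for-byte identical across Visual Studio versions, but the generated script looks broadly like this: a ‘Main’ configuration with a sample ‘WindowsFeature’ resource that installs IIS.

```powershell
Configuration Main
{
    Param ( [string] $nodeName )

    # Built-in DSC resources
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node $nodeName
    {
        # Sample resource: installs the IIS web server role
        WindowsFeature WebServerRole
        {
            Name   = "Web-Server"
            Ensure = "Present"
        }
    }
}
```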

All of the elements are now in place that will enable the creation of the new VM and forest from script, they just need to be configured.

Editing the DSC configuration

The first edits are to add parameters for the domain name and the domain administrator credentials, so that we can pass those from the ARM template;
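A sketch of those parameters follows; the names ‘DomainName‘ and ‘AdminCreds‘ are my own choices and must match whatever names the ARM template passes in.

```powershell
Param (
    # Fully qualified name of the new forest root domain, e.g. "contoso.local"
    [Parameter(Mandatory)]
    [string]$DomainName,

    # Domain administrator credentials, supplied by the ARM template
    [Parameter(Mandatory)]
    [System.Management.Automation.PSCredential]$AdminCreds
)
```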

Remember earlier in the piece we installed the ‘xActiveDirectory‘ module.  This is the script that’s going to make use of it.  So let’s import the modules that the script needs;
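Assuming the module names from earlier, the imports look like this:

```powershell
# Built-in DSC resources plus the AD resources installed earlier
Import-DscResource -ModuleName PSDesiredStateConfiguration
Import-DscResource -ModuleName xActiveDirectory
```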

Next is to list the Windows features that need to be installed for this server to function as a domain controller.
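A minimal sketch covering the AD DS role and the management tools (feature names as reported by ‘Get-WindowsFeature‘):

```powershell
# The AD DS role itself
WindowsFeature ADDSInstall
{
    Name   = "AD-Domain-Services"
    Ensure = "Present"
}

# RSAT tooling, installed once the role is present
WindowsFeature ADDSTools
{
    Name      = "RSAT-ADDS"
    Ensure    = "Present"
    DependsOn = "[WindowsFeature]ADDSInstall"
}
```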

The ‘Node $AllNodes.Where{$_.Role -eq “DC”}.Nodename‘ filter is something I hope to expand upon in another blog post.  It enables a DSC configuration to contain specifications for multiple node objects.

Lastly using the imported ‘xActiveDirectory‘ module, the script can create the AD forest.
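A sketch using the ‘xADDomain‘ resource from that module; note I re-use the admin credential for the safe mode password, which is fine for a lab but not something to carry into production.

```powershell
# Create the new forest once the AD DS role is installed
xADDomain FirstDC
{
    DomainName                    = $DomainName
    DomainAdministratorCredential = $AdminCreds
    SafemodeAdministratorPassword = $AdminCreds
    DependsOn                     = "[WindowsFeature]ADDSInstall"
}
```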

If that is all put together, it should look something like this:
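Assembled from the pieces above, my working sketch of the whole configuration is:

```powershell
Configuration Main
{
    Param (
        # Names here must match the arguments passed from the ARM template
        [Parameter(Mandatory)]
        [string]$DomainName,

        [Parameter(Mandatory)]
        [System.Management.Automation.PSCredential]$AdminCreds
    )

    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName xActiveDirectory

    # Only nodes tagged with the "DC" role in the configuration data get this spec
    Node $AllNodes.Where{$_.Role -eq "DC"}.Nodename
    {
        WindowsFeature ADDSInstall
        {
            Name   = "AD-Domain-Services"
            Ensure = "Present"
        }

        WindowsFeature ADDSTools
        {
            Name      = "RSAT-ADDS"
            Ensure    = "Present"
            DependsOn = "[WindowsFeature]ADDSInstall"
        }

        # Re-using the admin credential for safe mode is a lab-only shortcut
        xADDomain FirstDC
        {
            DomainName                    = $DomainName
            DomainAdministratorCredential = $AdminCreds
            SafemodeAdministratorPassword = $AdminCreds
            DependsOn                     = "[WindowsFeature]ADDSInstall"
        }
    }
}
```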

DSC Configuration Data

Best practice when working with DSC is to separate out the configuration data from the configuration itself.

That might sound like overkill for what we’re doing here.  However, if you scale the solution up to using DSC to configure hundreds or thousands of servers, separating out the configuration data simply enables the solution to scale better.

To add configuration data;

  1. Right-click the DSC folder in Solution Explorer.
  2. Select Add > New Item.
  3. Select PowerShell > PowerShell Script Data File.
  4. Give the Script Data File a name.

We need this data file to be included in the build, so we need to configure that:

  1. Right-click the PowerShell Script Data File.
  2. Select Properties.
  3. Select ‘All Configurations’ from the configuration drop down.
  4. Then change the value of ‘Copy to output directory’ to ‘Copy Always’.
  5. Apply changes!
  6. Then change the value of ‘Build Action’ to ‘Content’.
  7. Apply changes and ‘OK’ to exit.

If the sequence above is not followed, the build may throw an error.  If that happens, revisit the sequence above and reapply the changes.

Edit the PowerShell Script Data File (*.psd1) to include the following code.
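A minimal configuration data sketch that satisfies the ‘Role -eq “DC”‘ filter used earlier; the plain text password setting is a lab-only convenience and should never survive into production.

```powershell
@{
    AllNodes = @(
        @{
            # Settings under "*" apply to every node
            NodeName                    = "*"
            # Lab only: allows credentials to be compiled into the MOF in plain text
            PSDscAllowPlainTextPassword = $true
        },
        @{
            # The DSC extension runs the configuration locally on the VM
            NodeName = "localhost"
            Role     = "DC"
        }
    )
}
```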

ARM Template changes

It’s worth noting here that when we added the DSC script and script config elements to the solution, the variables ‘[YourDSCName]ArchiveFolder‘ and ‘[YourDSCName]ArchiveFileName‘ were added to the ‘WindowsVirtualMachine.json‘ variables section and appropriately filled in.

In addition, manual editing of the variables section of the ‘WindowsVirtualMachine.json‘ file is required to point toward the DSC config script and DSC config data file.

To support the DSC configuration the following changes and additions need to be made to the ARM template.
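Assuming the DSC resource was named ‘Main’, the variables section ends up looking something like this.  The first two entries are generated by Visual Studio; the last two are my hypothetical manual additions pointing at the configuration function and data file, so adjust the names and paths to suit your own solution.

```json
"variables": {
    "MainArchiveFolder": "DSC",
    "MainArchiveFileName": "Main.ps1.zip",
    "MainConfigurationFunction": "Main.ps1\\Main",
    "MainConfigurationDataFile": "DSC/ConfigurationData.psd1"
}
```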

As you will recall from above the ‘WindowsVirtualMachine.parameters.json‘ file is pretty empty.

Into this file we’re going to add parameters and edit the values for ‘adminUsername‘, ‘adminPassword‘, ‘domainName‘, ‘dnsNameforPublicIP‘ and ‘windowsOSVersion‘.  These parameters should be self-explanatory;
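With placeholder values (swap in your own; keeping a real password in this file is only acceptable for a throwaway lab), the parameters file ends up something like:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "adminUsername": { "value": "labadmin" },
    "adminPassword": { "value": "<your lab password>" },
    "domainName": { "value": "contoso.local" },
    "dnsNameforPublicIP": { "value": "myadlabdc" },
    "windowsOSVersion": { "value": "2012-R2-Datacenter" }
  }
}
```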

We already have parameters defined for ‘adminPassword‘ and ‘windowsOSVersion‘ within ‘WindowsVirtualMachine.json‘.  However, we don’t have ‘domainName‘, so we need to define that within the parameters section of the file.
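A sketch of that parameter definition for the parameters section of ‘WindowsVirtualMachine.json‘:

```json
"domainName": {
  "type": "string",
  "metadata": {
    "description": "Fully qualified name of the new AD forest root domain"
  }
}
```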

The observant among you will have noticed the introduction of the parameters ‘_artifactsLocation‘ and ‘_artifactsLocationSasToken‘.  These have been added to support changes needed in the DSC extension part of the ‘WindowsVirtualMachine.json‘ file.

Lastly we need to change the DSC Extension part of ‘WindowsVirtualMachine.json‘.  We’re going to edit this section to point toward the location of the DSC files and *.psd1.

Now in theory, you can hold the *.psd1 on BLOB storage and call it from there with a secure SAS token.  I’ve not got this working.  I think the consensus in the community is that, certainly for test environments, it is far simpler to pull this file from GitHub.

This section of the file should now read;
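The exact schema varies with the DSC extension version, so treat this as a sketch of the shape rather than something to paste verbatim; the variable names follow my hypothetical ‘Main‘ naming from earlier, ‘vmName‘ is assumed to exist in the template, and the GitHub path is a placeholder for your own repository.

```json
{
  "name": "Main",
  "type": "extensions",
  "location": "[resourceGroup().location]",
  "apiVersion": "2015-06-15",
  "dependsOn": [
    "[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]"
  ],
  "properties": {
    "publisher": "Microsoft.Powershell",
    "type": "DSC",
    "typeHandlerVersion": "2.9",
    "autoUpgradeMinorVersion": true,
    "settings": {
      "modulesUrl": "[concat(parameters('_artifactsLocation'), '/', variables('MainArchiveFolder'), '/', variables('MainArchiveFileName'))]",
      "sasToken": "[parameters('_artifactsLocationSasToken')]",
      "configurationFunction": "[variables('MainConfigurationFunction')]",
      "properties": {
        "DomainName": "[parameters('domainName')]",
        "AdminCreds": {
          "userName": "[parameters('adminUsername')]",
          "password": "PrivateSettingsRef:adminPassword"
        }
      }
    },
    "protectedSettings": {
      "items": { "adminPassword": "[parameters('adminPassword')]" },
      "dataBlobUri": "https://raw.githubusercontent.com/<user>/<repo>/master/DSC/ConfigurationData.psd1"
    }
  }
}
```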

When linking to the *.psd1 file on GitHub, be sure to link to the raw file and not the HTML page presented to you initially in the repository.


Before you deploy into live, you can validate the project by right-clicking it within Visual Studio.

Further to that I would strongly recommend validating your json using the tools at https://jsonlint.com/. These should certainly protect you from too much pain caused by fixable code errors.

Combined, these two tools should enable you to highlight and fix any errors that may have cropped up in the process.


To complete the process, simply deploy the project.

Have Fun!

Next up…

Deploy a VM and add it to an existing AD Domain in IaaS with Azure ARM and DSC.



