Azure Automation DSC Config example

There are a couple of ways to do DSC on Azure. You can deploy a template and use the DSC extension resource to push a DSC configuration to your VM (simple for quick deployments), or you can leverage Azure Automation as a DSC pull server (the subject of this blog), where you store all your DSC configuration scripts and MOF files and manage all your DSC nodes to see drift, compliance and so on.

This blog post discusses my GitHub repo, which:

  • Deploys an Azure VM
  • Deploys a vNet into a separate Resource Group (Cross Resource Group Deployment), a resource group used for shared resources
  • Leverages the Custom Script extension, which runs a script as the local computer account at deployment time. This script copies another script from the artifacts location to the local C:\ drive to be used as a user logon script. The DSC sets up a scheduled task to call that script whenever any user logs on.
  • This blog post:
    • leverages the DSC extension only to register the VM with the Azure Automation pull server, so that DSC runs the configuration on the VM
  • My other blog post:
    • leverages the DSC extension to run the configuration on the VM. The JSON template also feeds parameter values into a DSC configuration script via the DSC extension
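In this repo the template's DSC extension handles the registration, but the same registration can also be done from PowerShell with the AzureRM Automation cmdlets. A hedged sketch; the resource group, Automation account and VM names below are placeholders, not values from the repo:

```powershell
# Hypothetical names - substitute your own resource groups, Automation account and VM
Register-AzureRmAutomationDscNode `
    -ResourceGroupName 'MyAutomationRG' `
    -AutomationAccountName 'MyAutomationAccount' `
    -AzureVMName 'MyDemoVM' `
    -AzureVMResourceGroup 'MyVMRG' `
    -NodeConfigurationName 'Main.MyDemoVM' `
    -ConfigurationMode 'ApplyAndAutocorrect'
```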

Note: this blog post focuses on my GitHub repo https://github.com/marckean/Azure-DSC-Automation, a full working demo of deploying a VM to Azure using Infrastructure as Code, along with further configuration of the Windows OS itself using Configuration as Code; my favourite is PowerShell DSC (Desired State Configuration).

Before moving on, you should be somewhat familiar with the Git, VS Code, fork, branch, push, commit and clone terms, and have all the tools installed. To get started setting up all the tooling for VS Code and Git, my other blog post walks you through everything you need. Do this and have a play; it's seriously addictive.

Back to this post. As for my GitHub repo, you should:

  • Fork my repo to your own GitHub account from GitHub's website.
  • Using GitHub Desktop, clone your newly forked repository to your local computer.
  • Then open the repository folder in VS Code (File > Open Folder).

For this demo, as this is focused on Azure Automation DSC, you first need to add an Azure Automation account if you don't already have one.


From the Azure Automation account, add a configuration. This is a .ps1 file which you need to upload; it's available at the very bottom of this page.


The .ps1 file you want to add is the script further below. Make sure you copy and paste the contents into a new file on your desktop and save it specifically as main.ps1 before uploading it in the Azure portal, otherwise the upload will fail.


Once this is uploaded, you need to compile the DSC configuration, which creates a DSC node configuration (a MOF file).

This is the same as running:

Main -nodeName $env:COMPUTERNAME -VNCKey 'XXXXX-XXXXX-XXXXX-XXXXX-XXXXX' -OutputPath "$env:USERPROFILE\Desktop"

It takes a few minutes to compile.
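Compiling can also be kicked off from PowerShell instead of clicking through the portal. A sketch using the AzureRM Automation cmdlets; the account names are placeholders, and the configuration parameters mirror the Main command above:

```powershell
# Hypothetical resource group / Automation account names
$params = @{
    nodeName = 'MyDemoVM'
    VNCKey   = 'XXXXX-XXXXX-XXXXX-XXXXX-XXXXX'
}
Start-AzureRmAutomationDscCompilationJob `
    -ResourceGroupName 'MyAutomationRG' `
    -AutomationAccountName 'MyAutomationAccount' `
    -ConfigurationName 'Main' `
    -Parameters $params
```

Note there's no -OutputPath here; when Azure Automation compiles, the resulting MOF is stored in the Automation account rather than on disk.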

One thing to keep in mind: most of this DSC configuration requires source files to be accessed from a blob storage account.

Source Files – artifacts

Source files / build files / artifacts used in the DSC configuration process have always been a challenge: where do you place them centrally so they're accessible for all deployments? Source files can be other scripts, files, or software packages to install on your machines.

The best place I have found to store source files is in a good old Azure Storage account container e.g.

https://msmarcsg.blob.core.windows.net/deployment

You give the storage account container anonymous read access.
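If you'd rather script it than click through the portal, anonymous read access can be set with the Azure.Storage cmdlets. The account and container names below come from my example URL; swap in your own, and the account key is a placeholder:

```powershell
# Storage account / container names are examples - use your own
$ctx = New-AzureStorageContext -StorageAccountName 'msmarcsg' `
    -StorageAccountKey '<your-storage-account-key>'

# 'Blob' grants anonymous read access to blobs only (no container listing)
Set-AzureStorageContainerAcl -Name 'deployment' -Permission Blob -Context $ctx
```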


You're probably thinking: no way! This is way too insecure! Not really. Other than the ordinary .exe and .msi files, you encrypt all your sensitive files using Rijndael encryption (pronounced rain-dahl), the algorithm selected by the U.S. National Institute of Standards and Technology (NIST) as the basis for the Advanced Encryption Standard (AES).

Encryption is easy and done locally on your computer by following my other blog post Storing Files safely & securely in Publicly Accessible Storage.

As per this other blog, you normally would:

  1. Create a certificate used to encrypt the files, this generates a Private Key on your computer
  2. Export the Private Key as a .PFX file to your local computer – ready to add to Azure’s Key Vault
  3. Encrypt the files in a local (Source Files) designated folder on your computer as per my blog
  4. Upload these encrypted files to an Azure Blob storage container using Azure Storage Explorer.
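The full certificate-based encryption script lives in the other blog post. Purely to illustrate the idea, here is a minimal Rijndael/AES sketch; the file paths are hypothetical, and it generates a throwaway key rather than one protected by the certificate:

```powershell
# Sketch only - in the real process the key material is protected by the certificate
$aes = [System.Security.Cryptography.Aes]::Create()
$aes.KeySize = 256
$aes.GenerateKey()
$aes.GenerateIV()

$plain     = [System.IO.File]::ReadAllBytes("$env:USERPROFILE\SourceFiles\secrets.xml")
$encryptor = $aes.CreateEncryptor()
$cipher    = $encryptor.TransformFinalBlock($plain, 0, $plain.Length)

# Prepend the IV so a decryptor can recover it, and use the .encrypted extension
[System.IO.File]::WriteAllBytes("$env:USERPROFILE\SourceFiles\secrets.xml.encrypted", $aes.IV + $cipher)
```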

However, for this blog post's DSC demonstration, I will provide you with the certificate you need, and as for the source files, I will provide demo ones… Below is what my folder structure looks like in order for this demo DSC configuration to work. As you can see, some of the files have a .encrypted extension to indicate they are encrypted.


In the real world you would want to add your own source files if you wanted to do this properly, so simply change this line in the script below to suit your own storage account container:

$PublicStorageSourceContainer = 'https://msmarcsg.blob.core.windows.net/deployment'

Encryption Certificate & Azure Key Vault

In a real world, you would create your own certificate to protect your source files – however for this demo, use this certificate. Password is Passw0rd.

You can run DSC configurations locally on test machines to make sure DSC runs correctly, and you can watch them apply in real time. This speeds up testing for errors considerably, instead of waiting each time for a full deployment. If you want to experiment and run it locally, you need to add this same certificate to the local machine certificate store on your test computer: double-click on the certificate and follow your nose.
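A rough sketch of that local test loop; the .pfx filename is hypothetical, and the password is the demo one from this post:

```powershell
# Hypothetical certificate filename; 'Passw0rd' is the demo certificate password
Import-PfxCertificate -FilePath .\DSCencryption.pfx `
    -CertStoreLocation Cert:\LocalMachine\My `
    -Password (ConvertTo-SecureString 'Passw0rd' -AsPlainText -Force)

# Dot-source the configuration, compile a local MOF, then apply it and watch it run
. .\main.ps1
Main -nodeName $env:COMPUTERNAME -VNCKey 'XXXXX-XXXXX-XXXXX-XXXXX-XXXXX' -OutputPath "$env:USERPROFILE\Desktop"
Start-DscConfiguration -Path "$env:USERPROFILE\Desktop" -Wait -Verbose -Force
```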


For this demonstration, you upload this certificate (the one you just downloaded) to Azure's Key Vault.


You then need to click on the newly imported certificate in Azure Key Vault and copy the Secret Identifier to your clipboard. You will use this immediately below in the JSON template.
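The same import, plus grabbing the Secret Identifier, can be scripted with the AzureRM.KeyVault cmdlets. The vault and certificate names here are placeholders, and the password is the demo one from this post:

```powershell
# Hypothetical vault / certificate names; 'Passw0rd' is the demo certificate password
$pfxPwd = ConvertTo-SecureString 'Passw0rd' -AsPlainText -Force
Import-AzureKeyVaultCertificate -VaultName 'MyKeyVault' -Name 'DSCencryption' `
    -FilePath .\DSCencryption.pfx -Password $pfxPwd

# The Secret Identifier to paste into the JSON template
(Get-AzureKeyVaultCertificate -VaultName 'MyKeyVault' -Name 'DSCencryption').SecretId
```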


To allow Azure services to access Azure Key Vault, you'll need to open it up for deployment access.

Log on to https://resources.azure.com with the same credentials as your Azure logon. Navigate down through the levels to where your Key Vault is located: Subscriptions > {Your Subscription} > resourceGroups > {Your Resource Group} > providers > vaults.

Select your Key Vault, then on the right select both ReadWrite & Edit.


At the very bottom, change the 3 items to say 'true', then press the PUT button at the top to apply the settings.
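Those same three flags can be flipped with a single cmdlet instead of editing through resources.azure.com; the vault name is a placeholder:

```powershell
# Enables the three deployment-related flags shown as 'true' in resources.azure.com
Set-AzureRmKeyVaultAccessPolicy -VaultName 'MyKeyVault' `
    -EnabledForDeployment `
    -EnabledForTemplateDeployment `
    -EnabledForDiskEncryption
```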


Changes to the JSON Template

Make sure you have the local copy of the GitHub repo folder open in VS Code.


Select the JSON template in VS Code. You need to run through all the parameters at the top, and in the parameters file, and change the settings to suit your environment; for instance, the Azure Automation parameters.

You also specifically need to change the Secret Identifier as per the step above.


Once you're happy everything looks good, save and commit the file locally, then sync to your GitHub repo.

Set up a Build Definition in VSTS

You need to use VSTS to deploy your GitHub repo's JSON template to Azure. For the build definition in VSTS, use the GitHub repo as the source; this will be the same GitHub repo you forked from me, in your own GitHub account.


You want to start with an empty pipeline:


Add both the Azure Resource Group Deployment and Azure PowerShell tasks, and configure the obvious stuff along with the not-so-obvious stuff as per further below.


Configure Azure Resource Group Deployment as per:

Template:

$(Build.SourcesDirectory)/BlankResourceGroup.json

Configure Azure PowerShell as per:

Script Path:

$(Build.SourcesDirectory)/Deploy-AzureResourceGroup.ps1

Script Arguments:

-ResourceGroupName 'RG-Name-ChangeThis' -ResourceGroupLocation 'australiaeast' -TemplateFile '$(Build.SourcesDirectory)\WindowsVirtualMachine.json' -TemplateParametersFile '$(Build.SourcesDirectory)\WindowsVirtualMachine.parameters.json' -UploadArtifacts -ArtifactStagingDirectory '$(Build.SourcesDirectory)'

That’s it, go and hit build.

