Securing credentials properly in DSC using Azure Automation - PowerShell

How would you go on to secure your DSC configuration that's using credentials properly in Azure Automation?
E.g.:
configuration MyServer {
    param(
        [Parameter(Mandatory=$true)][PSCredential]$MyCredential
    )
    # Some configuration using credentials
}
Normally I'd set up a public key and a proper certificate installed on each node, and pass CertificateFile and Thumbprint along in ConfigurationData when compiling the configuration documents.
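For reference, outside Azure Automation that approach looks something like this (a minimal sketch; the node name, file path, and thumbprint are placeholders):
$ConfigurationData = @{
    AllNodes = @(
        @{
            NodeName        = "server01"
            # Public key the compiler uses to encrypt credentials in the MOF
            CertificateFile = "C:\publicKeys\server01.cer"
            # Thumbprint of the matching cert; the node's LCM decrypts with its private key
            Thumbprint      = "0123456789ABCDEF0123456789ABCDEF01234567"
        }
    )
}
MyServer -ConfigurationData $ConfigurationData -MyCredential (Get-Credential) -OutputPath .\MyServer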
In Azure Automation I can't find any good solution.
The documentation says Azure Automation encrypts the entire MOF on its own: https://azure.microsoft.com/en-us/documentation/articles/automation-certificates/ and the article specifies to use PSDscAllowPlainTextPassword.
When you then register a node against its pull server and pull its configuration, anyone with local admin (or read access to the temp location) can read out the password in plain text after the MOF has been pulled down/updated. That is not good from a security perspective.
Ideally, what I'd like is to upload such a public key/certificate to the Azure Automation assets and use it as part of ConfigurationData when starting the compilation job.
However, today CertificateFile expects a file path and not an AutomationCertificate, so I cannot see a way to start the compilation job with a public key held in Azure Automation, and no way of referring to my certificate asset when running the job.
Any ideas whether this is possible in the current state of Azure Automation and the way it works with DSC/pull, to secure it properly using either the asset store in Azure Automation or Azure Key Vault?

You should create an Azure Automation Credential and reference it in the configuration like so:
# Compile the MOF
$ConfigurationData = @{
    AllNodes = @(
        @{
            NodeName                    = $nodeName
            PSDscAllowPlainTextPassword = $true
        }
    )
}
$Parameters = @{
    "nodeName"   = $nodeName
    "credential" = $credName # Note: this is only the NAME of the Azure Automation credential asset; Azure Automation securely pulls the credential and uses it during compilation
}
Start-AzureRmAutomationDscCompilationJob -ResourceGroupName $ResourceGroupName -AutomationAccountName $AutomationAccountName `
    -ConfigurationName $configurationName -Parameters $Parameters -ConfigurationData $ConfigurationData
You should not worry about PSDscAllowPlainTextPassword, since Azure Automation encrypts everything at rest for you; DSC just doesn't know that (so you have to tell the DSC engine to allow it).
And in DSC you should have something like:
Configuration name
{
    Param (
        [Parameter(Mandatory)][ValidateNotNullOrEmpty()][String]$nodeName,
        [Parameter(Mandatory)][ValidateNotNullOrEmpty()][pscredential]$credential
    )
    Import-DscResource -Module modules
    Node $nodeName { DOSTUFF }
}

The correct way to pass a credential to a DSC file from Azure Automation is to use an Azure Automation Credential.
Then inside your DSC file you use the command Get-AutomationPSCredential
Example:
Configuration BaseDSC
{
    Import-DscResource -ModuleName xActiveDirectory
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName xNetworking
    $Credential = Get-AutomationPSCredential -Name "CredentialName"
    Node $AllNodes.Nodename
    { ...
The credential is stored encrypted in Azure Automation, and is put into the encrypted MOF file in Azure Automation when you run the compilation job.
Additionally, the password can be updated in Azure Automation and then updated in the MOFs by just recompiling.
The password cannot be retrieved in clear text from Azure.
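For instance, rotating a password can be as simple as updating the asset and recompiling (a sketch using the Az module; the resource group, account, and node names are placeholders):
# Update the credential asset, then recompile so new MOFs carry the new password
Set-AzAutomationCredential -ResourceGroupName "my-rg" -AutomationAccountName "my-aa" `
    -Name "CredentialName" -Value (Get-Credential)
$cd = @{ AllNodes = @( @{ NodeName = "server01"; PSDscAllowPlainTextPassword = $true } ) }
Start-AzAutomationDscCompilationJob -ResourceGroupName "my-rg" -AutomationAccountName "my-aa" `
    -ConfigurationName "BaseDSC" -ConfigurationData $cd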

Use a secure credential to create a user on a Windows server and add it to the Administrators group using DSC:
Solution (PowerShell DSC)
First, create a credential in the Automation account from the Azure portal (or using any Azure module):
Home > Resource group > ... > Automation account > Credentials
Configuration user_windows_user
{
    param
    (
        [Parameter()][string]$username,
        [Parameter()][string]$azurePasswordCred # name (string) of the credential asset
    )
    $passwordCred = Get-AutomationPSCredential -Name $azurePasswordCred
    Node "localhost"
    {
        User UserAdd
        {
            Ensure                 = "Present" # To ensure the user account does not exist, set Ensure to "Absent"
            UserName               = $username
            FullName               = "$username-fullname"
            PasswordChangeRequired = $false
            PasswordNeverExpires   = $false
            Password               = $passwordCred # This needs to be a credential object
        }
        Group AddtoAdministrators
        {
            GroupName        = "Administrators"
            Ensure           = "Present"
            MembersToInclude = @($username)
        }
    }
} # end of Configuration
$cd = @{
    AllNodes = @(
        @{
            NodeName                    = 'localhost'
            PSDscAllowPlainTextPassword = $true
        }
    )
}
Upload the DSC file to Azure Automation > Configurations.
Compile the configuration, providing the inputs: -username and the credential asset name (a string); a scripted alternative is shown below.
Assign the configuration to a node and wait for the configuration deployment.
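A minimal sketch of that compilation step scripted with the Az module (resource group and account names are placeholders), reusing the $cd ConfigurationData defined above:
$params = @{
    username          = "myuser"
    azurePasswordCred = "CredentialName" # the NAME of the credential asset, not the credential itself
}
Start-AzAutomationDscCompilationJob -ResourceGroupName "my-rg" -AutomationAccountName "my-aa" `
    -ConfigurationName "user_windows_user" -Parameters $params -ConfigurationData $cd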

Related

Execute an App registration without AzureAD

For a professional project, a chunk of the pipeline must be able to create an application (the first app registration, so I only have a Global Admin) automatically within Azure AD. So far I have used AzureAD, which works well with Windows PowerShell 5.x on Windows.
I now must be able to run the code on Ubuntu 20.04 with its PowerShell 7.2. Unfortunately for me, the AzureAD module is only supported on non-core Windows PowerShell, so it does not work on core PS6 or PS7. A very simplified piece of code is the following:
# Connection info
$tenantId = "abcdef12345-1234-1234-124-abcdef12346789"
$account = "my_admin@domain.com" # is cloud admin by default
$password = ConvertTo-SecureString "MyPassword" -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential -ArgumentList ($account, $password)
Connect-AzureAD -Credential $psCred -TenantId $tenantId
# Create app
$appName = "MyApp"
New-App -appName $appName -tenant_id $tenantId
I am stuck, and my question is the following: how could I run such an operation with PowerShell 7.2, considering AzureAD is not usable? I did check Connect-MgGraph for the connection part only (https://github.com/microsoftgraph/msgraph-sdk-powershell), but the clientId is information that I don't have (and want to create).
Thanks in advance
You can use DeviceLogin, as explained in this article, to obtain an OAuth access token for your Global Administrator account in PowerShell (independent of the version), but this first step needs human interaction.
After obtaining the token, you can use it to make Graph API calls with your Global Administrator permissions to create an application.
Once you have created your first application, you can grant it the required permissions and use it to automate the process (obtaining tokens programmatically using API calls) for application creation in PowerShell. A sketch of the whole flow follows.
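A minimal sketch of that interactive first step, assuming the standard Microsoft identity platform v2.0 endpoints and the well-known Azure PowerShell client id mentioned in the next answer (the tenant id and display name are placeholders):
$tenantId = "your-tenant-id"
$clientId = "1950a258-227b-4e31-a9cf-717495945fc2" # well-known Azure PowerShell client id

# Step 1: request a device code and show the user where to enter it
$deviceCode = Invoke-RestMethod -Method POST `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/devicecode" `
    -Body @{ client_id = $clientId; scope = "Application.ReadWrite.All" }
Write-Host $deviceCode.message # the human interaction happens here

# Step 2: poll the token endpoint until sign-in completes
# (a real script would loop and swallow "authorization_pending" errors)
$token = Invoke-RestMethod -Method POST `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body @{
        grant_type  = "urn:ietf:params:oauth:grant-type:device_code"
        client_id   = $clientId
        device_code = $deviceCode.device_code
    }

# Step 3: create the application through Microsoft Graph
Invoke-RestMethod -Method POST -Uri "https://graph.microsoft.com/v1.0/applications" `
    -Headers @{ Authorization = "Bearer $($token.access_token)" } `
    -ContentType "application/json" `
    -Body (@{ displayName = "MyApp" } | ConvertTo-Json)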
You could use Resource Owner Password Credentials (ROPC) to authenticate, however Microsoft actively discourages it in their documentation due to the security implications of sending a password over the wire.
If the security issues present with this method of authentication are still tolerated within your acceptance criteria, you would still need a ClientID. Luckily, AzureAD has a well-known ClientID that you can use to authenticate. This ID is 1950a258-227b-4e31-a9cf-717495945fc2
The PowerShell code below should get you started. I've basically translated the HTTP request in Microsoft's documentation into a splatted Invoke-RestMethod command.
$LoginWithROPCParameters = @{
    URI    = "https://login.microsoftonline.com/contoso.onmicrosoft.com/oauth2/v2.0/token"
    Method = "POST"
    Body   = @{
        client_id  = "1950a258-227b-4e31-a9cf-717495945fc2"
        scope      = "user.read openid profile offline_access"
        username   = "username@contoso.onmicrosoft.com"
        password   = "hunter2"
        grant_type = "password"
    }
}
Invoke-RestMethod @LoginWithROPCParameters
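If the call succeeds, the response carries an access_token you can present to Microsoft Graph as a Bearer token. A hypothetical follow-up, assuming the account is allowed to create applications and the scope above is extended to include Application.ReadWrite.All:
$token = Invoke-RestMethod @LoginWithROPCParameters
Invoke-RestMethod -Method POST -Uri "https://graph.microsoft.com/v1.0/applications" `
    -Headers @{ Authorization = "Bearer $($token.access_token)" } `
    -ContentType "application/json" `
    -Body (@{ displayName = "MyApp" } | ConvertTo-Json) # creates the app registration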

Missing cluster cert causes Add-AzServiceFabricClusterCertificate to fail: Object reference not set to an instance of an object

I'm fairly new to Service Fabric, so I'm not sure if this is an issue with the cmdlet or if this is a miss on my part. I am using Az.ServiceFabric module version 2.0.2 and the Az module version 3.8.0.
I am trying to use the Add-AzServiceFabricClusterCertificate cmdlet to add a secondary certificate that I've already created in my Azure KeyVault to my cluster. When I run the cmdlet, it fails with this error (running with Debug gave me more stack detail):
DEBUG: AzureQoSEvent: CommandName - Add-AzServiceFabricClusterCertificate; IsSuccess - False; Duration -
00:00:07.3059582;; Exception - System.NullReferenceException: Object reference not set to an instance of an object.
at Microsoft.Azure.Commands.ServiceFabric.Commands.ServiceFabricClusterCmdlet.GetClusterType(Cluster
clusterResource)
at Microsoft.Azure.Commands.ServiceFabric.Commands.AddAzureRmServiceFabricClusterCertificate.ExecuteCmdlet()
at Microsoft.WindowsAzure.Commands.Utilities.Common.AzurePSCmdlet.ProcessRecord();
Looking at the code for this cmdlet, I noticed that it's probably failing because the cluster resource that gets passed into GetClusterType does not have its Certificate member, so it fails when it tries to check the Certificate.Thumbprint and Certificate.ThumbprintSecondary:
internal ClusterType GetClusterType(Cluster clusterResource)
{
    if (string.IsNullOrWhiteSpace(clusterResource.Certificate.Thumbprint) &&
        string.IsNullOrWhiteSpace(clusterResource.Certificate.ThumbprintSecondary))
    {
        return ClusterType.Unsecure;
    }
    else
    {
        return ClusterType.Secure;
    }
}
The cluster that gets passed into GetClusterType is retrieved in the same manner as in the Get-AzServiceFabricCluster cmdlet, so when I run that cmdlet for the cluster that I'm trying to add the certificate to, I noticed that my Certificate field is empty in the response. I'm guessing that's what's causing the NullRef exception. Here's that relevant snippet:
AzureActiveDirectory :
TenantId : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
ClusterApplication : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
ClientApplication : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Certificate :
CertificateCommonNames : Microsoft.Azure.Management.ServiceFabric.Models.ServerCertificateCommonNames
ClientCertificateCommonNames :
ClientCertificateThumbprints :
I'm wondering if it's expected that the Certificate field would be empty in the Get-AzServiceFabricCluster output, and if that is indeed the cause of my Add-AzServiceFabricClusterCertificate failure. When I look at the cluster's Security blade in the Azure portal, I do see the Primary Cluster Certificate with which I originally created the cluster, and this is the cert I use when deploying and doing other cluster operations. However, I did notice that the cert thumbprint field is empty when viewing the certificate from the portal. I would expect to see this certificate when using Get-AzServiceFabricCluster, but it comes up empty. Can this missing certificate be fixed through the portal or with another cmdlet?
It looks like your cluster is configured to find certificates by common name rather than by thumbprint. I'm guessing this based on the fact that your portal doesn't show a thumbprint against the certificate, in addition to the snippet you posted.
If this is the case, there's no need to update your cluster configuration with a new certificate when the old certificate has expired. Instead, you need to install the new certificate only into your VMSS vault; once you add it to the VMSS, Service Fabric will automatically use the later-expiring certificate.
You must always ensure you have at least one valid certificate installed on your VMSS with the common name configured in your cluster.
PowerShell to upload the certificate to Key Vault and install it onto the VMSS:
$subscriptionId = "sub-id"
$vmssResourceGroupName = "vmss-rg-name"
$vmssName = "vmss-name"
$vaultName = "kv-name"
$primaryCertName = "kv-cert-name"
$certFilePath = "...\.pfx"
$certPassword = ConvertTo-SecureString -String "password" -AsPlainText -Force
# Sign in to your Azure account and select your subscription
Login-AzAccount -SubscriptionId $subscriptionId
# Update primary certificate within the Key Vault
$primary = Import-AzKeyVaultCertificate `
-VaultName $vaultName `
-Name $primaryCertName `
-FilePath $certFilePath `
-Password $certPassword
$certConfig = New-AzVmssVaultCertificateConfig -CertificateUrl $primary.SecretId -CertificateStore "My"
# Get VM scale set
$vmss = Get-AzVmss -ResourceGroupName $vmssResourceGroupName -VMScaleSetName $vmssName
# Add new certificate version
$vmss.VirtualMachineProfile.OsProfile.Secrets[0].VaultCertificates.Add($certConfig)
# Update the VM scale set
Update-AzVmss -ResourceGroupName $vmssResourceGroupName -Verbose `
-Name $vmssName -VirtualMachineScaleSet $vmss
For more info, I wrote a blog post on switching from thumbprint to common name.
The official docs are also a good reference.

How do I authenticate Azure Powershell on Azure VM

I want to execute a PowerShell script from an Azure VM to get its current public IP address (and to write this address to an environment variable for an application to use).
My question is: what is the best way to authenticate the Azure PowerShell environment? On AWS, credentials get 'baked' into an instance when it gets created. Does the equivalent happen with Azure virtual machines?
You can use a Management Certificate contained in your Publish Settings file and 'bake' it yourself:
Import-AzurePublishSettingsFile -PublishSettingsFile C:\Store\my.publishsettings
If you already have a certificate for management, you can store it in your VM and use it in PowerShell:
# Get management certificate from personal store
$certificate = Get-Item Cert:\CurrentUser\My\$CertificateThumbprint
if ($certificate -eq $null) {
    throw "Management certificate for $SubscriptionName was not found in the user's personal certificate store. Check the thumbprint or install the certificate."
}
# Set subscription profile
Set-AzureSubscription -SubscriptionName $SubscriptionName -SubscriptionId $SubscriptionId -Certificate $certificate
# Select subscription as the current context
Select-AzureSubscription -SubscriptionName $SubscriptionName

Create an Azure Website with PowerShell and FTP

I need to write a PowerShell script that automatically creates an Azure Website and deploys the content of a local file system. It seems that the Azure PowerShell SDK doesn't provide a way to copy files to a website, so we want to use FTP as the deploy method.
To get the correct FTP endpoints and credentials, the only way I have found is to call an Azure Management REST API: Get a Site's Publish Profile.
But this API, like other Azure Management APIs, requires a certificate. I have found the following tutorial, Building Real-World Cloud Apps with Windows Azure, which explains how to get the certificate:
$s = Get-AzureSubscription -Current
$thumbprint = $s.Certificate.Thumbprint
Unfortunately, it seems that with the current SDK $s.Certificate is always null; the property doesn't exist. If I manually set the certificate thumbprint, everything works as expected.
Do you have an idea on how to get the correct subscription certificate? Or do you have an easy alternative to deploy local files to an Azure website?
It seems that now you can access the certificate thumbprint using
$thumbprint = $s.DefaultAccount
instead of
#$thumbprint = $s.Certificate.Thumbprint
It seems that DefaultAccount has exactly the same value as the certificate thumbprint.
Just for reference here is my complete script to obtain a publishing profile for a given website:
Function get-AzureWebSitePublishXml
{
    Param(
        [Parameter(Mandatory = $true)]
        [String]$WebsiteName
    )
    # Get the current subscription
    $s = Get-AzureSubscription -Current
    if (!$s) { throw "Cannot get Windows Azure subscription." }
    #$thumbprint = $s.Certificate.Thumbprint # this code doesn't work anymore
    $thumbprint = $s.DefaultAccount
    if (!$thumbprint) { throw "Cannot get subscription cert thumbprint." }
    # Get the certificate of the current subscription from your local cert store
    $cert = Get-ChildItem Cert:\CurrentUser\My\$thumbprint
    if (!$cert) { throw "Cannot find subscription cert in Cert: drive." }
    $website = Get-AzureWebsite -Name $WebsiteName
    if (!$website) { throw "Cannot get Windows Azure website: $WebsiteName." }
    # Compose the REST API URI from which you will get the publish settings info
    $uri = "https://management.core.windows.net:8443/{0}/services/WebSpaces/{1}/sites/{2}/publishxml" -f `
        $s.SubscriptionId, $website.WebSpace, $Website.Name
    # Get the publish settings info from the REST API
    $publishSettings = Invoke-RestMethod -Uri $uri -Certificate $cert -Headers @{"x-ms-version" = "2013-06-01"}
    if (!$publishSettings) { throw "Cannot get Windows Azure website publishSettings." }
    return $publishSettings
}
NOTE: this only works when you have connected to Azure using Import-AzurePublishSettingsFile
Can anyone confirm that is safe to use DefaultAccount property?
UPDATE
If you use the Kudu API to upload your site, like this, you don't need any certificate or publishing profile. You should read the user name and password using Get-AzureWebsite, and the hostname is just yourwebsitename.scm.azurewebsites.net (note the scm segment). I suggest using Kudu because it is far more reliable and fast; see the sketch below.
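For example, pushing a single file through Kudu's VFS API with the site's publishing credentials (a sketch, assuming the classic Azure module; the site name and file paths are placeholders):
$website = Get-AzureWebsite -Name "yourwebsitename"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(
    "$($website.PublishingUsername):$($website.PublishingPassword)"))
Invoke-RestMethod -Method PUT `
    -Uri "https://yourwebsitename.scm.azurewebsites.net/api/vfs/site/wwwroot/index.html" `
    -InFile "C:\local\index.html" `
    -Headers @{ Authorization = "Basic $auth"; "If-Match" = "*" } # If-Match * permits overwriting an existing file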

Teamcity and msdeploy using powershell

I developed a PowerShell script that accepts a bunch of parameters, creates an MSDeploy string, and executes it. I've tested this PowerShell command:
It works on my local box (installs the web app from my local box to a remote IIS server)
It works on the TeamCity box (installs the web app from TeamCity's folder structure to the remote IIS server)
Problem:
It doesn't work when I run the command from TeamCity's browser version.
The error is: ERROR_USER_NOT_ADMIN
Please note, my TeamCity build agent is a local admin on both my TeamCity server and the remote IIS server.
PowerShell source code:
$msDeploy = 'C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe'
$sourcePackage = $args[0]
$paramFile = $args[1]
$iisAppPath = $args[2]
$serverName = $args[3]
$userName = $args[4]
$password = $args[5]
$includeAcls = $args[6]
function UpdateParamFile($paramXMLFile, $applicationPath)
{
    $doc = New-Object System.Xml.XmlDocument
    $doc.Load($paramXMLFile)
    # IIS application path (this is where the code will be deployed - it has to exist in the target IIS):
    $appPath = $doc.SelectSingleNode("//parameters//setParameter[@name = 'IIS Web Application Name']")
    $appPath.value = $applicationPath
    # Connection strings:
    # KDB connection string:
    # Save
    $doc.Save($paramXMLFile)
    #[xml] $xmlPars = Get-Content $paramXMLFile
    #$xmlPars.parameters.setParameter | Where-Object { $_.name -eq 'IIS Web Application Name' } | select value
}
UpdateParamFile $paramFile $iisAppPath
$arguments = "-source:package=$sourcePackage", "-dest:auto,computerName=`"$serverName`",userName=`"$userName`",password=`"$password`",includeAcls=`"$includeAcls`"", "-setParamFile:$paramFile", '-verb:sync', '-disableLink:AppPoolExtension', '-disableLink:CertificateExtension', '-disableLink:ContentExtension'
& $msDeploy $arguments
TeamCity call to the above script file:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -NonInteractive -ExecutionPolicy ByPass -File C:\TeamCity\buildAgent\work\253e6183c0596123\Debug\PMRSWebsite\DeployWeb.ps1 Debug\PMRSWebsite\Web.csproj.zip "Debug\PMRSWebsite\Web.csproj.SetParameters.xml" ^^^IIS_APP_NAME^^^ ^^^ServerName^^^ ^^^userName^^^ ^^^Password^^^ false
ERROR_USER_NOT_ADMIN
Diagnosis - This happens if you try to connect to the Remote Agent Service but have not provided appropriate administrator credentials.
Resolution - The Remote Agent Service accepts either built-in Administrator or Domain Administrator credentials. If you have a non-domain setup and want to use an account other than the built-in administrator, do the following:
1. Create a separate user group MSDepSvcUsers on the remote computer.
2. Create a local account A on both the local and remote computer.
3. Add A to MSDepSvcUsers on the remote computer.
4. Use account A to publish; this will allow you to publish without needing to use the built-in admin account.
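Scripted, those resolution steps on the remote computer would look roughly like this (the account name and password are placeholders):
net localgroup MSDepSvcUsers /add            # 1. create the group on the remote computer
net user DeployUser "P@ssw0rd!" /add         # 2. create account A (repeat on the local computer with the same password)
net localgroup MSDepSvcUsers DeployUser /add # 3. add A to MSDepSvcUsers on the remote computer
# 4. then pass userName=DeployUser (and its password) in the msdeploy -dest argument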