Missing cluster cert causes Add-AzServiceFabricClusterCertificate to fail: Object reference not set to an instance of an object - azure-service-fabric

I'm fairly new to Service Fabric, so I'm not sure if this is an issue with the cmdlet or if this is a miss on my part. I am using Az.ServiceFabric module version 2.0.2 and the Az module version 3.8.0.
I am trying to use the Add-AzServiceFabricClusterCertificate cmdlet to add a secondary certificate that I've already created in my Azure KeyVault to my cluster. When I run the cmdlet, it fails with this error (running with Debug gave me more stack detail):
DEBUG: AzureQoSEvent: CommandName - Add-AzServiceFabricClusterCertificate; IsSuccess - False; Duration -
00:00:07.3059582;; Exception - System.NullReferenceException: Object reference not set to an instance of an object.
at Microsoft.Azure.Commands.ServiceFabric.Commands.ServiceFabricClusterCmdlet.GetClusterType(Cluster
clusterResource)
at Microsoft.Azure.Commands.ServiceFabric.Commands.AddAzureRmServiceFabricClusterCertificate.ExecuteCmdlet()
at Microsoft.WindowsAzure.Commands.Utilities.Common.AzurePSCmdlet.ProcessRecord();
Looking at the code for this cmdlet, I noticed that it's probably failing because the cluster resource that gets passed into GetClusterType has a null Certificate member, so it throws when it tries to check Certificate.Thumbprint and Certificate.ThumbprintSecondary:
internal ClusterType GetClusterType(Cluster clusterResource)
{
    if (string.IsNullOrWhiteSpace(clusterResource.Certificate.Thumbprint) &&
        string.IsNullOrWhiteSpace(clusterResource.Certificate.ThumbprintSecondary))
    {
        return ClusterType.Unsecure;
    }
    else
    {
        return ClusterType.Secure;
    }
}
The cluster that gets passed into GetClusterType is retrieved in the same manner as in the Get-AzServiceFabricCluster cmdlet. When I run that cmdlet against the cluster I'm trying to add the certificate to, the Certificate field is empty in the response; I'm guessing that's what's causing the NullReferenceException. Here's the relevant snippet:
AzureActiveDirectory :
TenantId : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
ClusterApplication : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
ClientApplication : xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Certificate :
CertificateCommonNames : Microsoft.Azure.Management.ServiceFabric.Models.ServerCertificateCommonNames
ClientCertificateCommonNames :
ClientCertificateThumbprints :
I'm wondering whether it's expected for the Certificate field to be empty in the Get-AzServiceFabricCluster output, and whether that is indeed what's causing my Add-AzServiceFabricClusterCertificate cmdlet to fail. When I look at the cluster's Security blade in the Azure Portal, I do see the Primary Cluster Certificate with which I originally created the cluster, and this is the cert I use when deploying and doing other cluster operations. However, I noticed that the cert thumbprint field is empty when viewing the certificate in the portal. I would expect this certificate to show up in Get-AzServiceFabricCluster, but it comes up empty. Is there a way to fix this missing certificate through the portal or with another cmdlet?

It looks like your cluster is configured to find certificates by common name rather than by thumbprint. I'm guessing this based on the fact that your portal doesn't show a thumbprint against the certificate, in addition to the snippet you posted.
If that's the case, there's no need to update your cluster configuration with a new certificate when the old certificate expires. Instead, you only need to install the new certificate into your VMSS. Once you add it to the VMSS, Service Fabric will automatically use the certificate with the later expiry date.
You must always ensure you have at least one valid certificate installed on your VMSS with the common name configured in your cluster.
PowerShell to upload the certificate to Key Vault and install it onto the VMSS:
$subscriptionId = "sub-id"
$vmssResourceGroupName = "vmss-rg-name"
$vmssName = "vmss-name"
$vaultName = "kv-name"
$primaryCertName = "kv-cert-name"
$certFilePath = "...\.pfx"
$certPassword = ConvertTo-SecureString -String "password" -AsPlainText -Force
# Sign in to your Azure account and select your subscription
Login-AzAccount -SubscriptionId $subscriptionId
# Update the primary certificate within the Key Vault
$primary = Import-AzKeyVaultCertificate `
    -VaultName $vaultName `
    -Name $primaryCertName `
    -FilePath $certFilePath `
    -Password $certPassword
$certConfig = New-AzVmssVaultCertificateConfig -CertificateUrl $primary.SecretId -CertificateStore "My"
# Get the VM scale set
$vmss = Get-AzVmss -ResourceGroupName $vmssResourceGroupName -VMScaleSetName $vmssName
# Add the new certificate version
$vmss.VirtualMachineProfile.OsProfile.Secrets[0].VaultCertificates.Add($certConfig)
# Update the VM scale set
Update-AzVmss -ResourceGroupName $vmssResourceGroupName -Verbose `
    -Name $vmssName -VirtualMachineScaleSet $vmss
For more info, I wrote a blog post on switching from thumbprint to common name.
The official docs are also a good reference.

Related

backing up certificate from keyvault

After a bit of research, I created a script based on Microsoft's documentation on backing up keys, secrets, and certificates.
For testing purposes I created a self-signed certificate (so that I have a certificate to work with), 3-4 secrets, and 10 keys, all scattered across two key vaults in Azure.
While iterating through all of these with a PowerShell script, I encountered the following issue:
"Operation "backup" is not allowed on this Key, since it is associated with a certificate. Perform the operation on the corresponding certificate. For more information refer to https://learn.microsoft.com/en-us/azure/key-vault/certificates/about-certificates#composition-of-a-certificate"
Although I went to that documentation and read through the composition of a certificate, I fail to understand what exactly the issue is. Has anyone encountered something similar, or have experience in this field?
Below is the part of the script responsible for backing up the certificates; the variables are correct, as all other items (secrets/keys) are backed up properly.
$certificates = az keyvault certificate list `
    --vault-name $keyVaultName `
    | ConvertFrom-Json
foreach ($certificate in $certificates) {
    # back up each certificate
    $keyVaultId = $certificate.id
    $keyVaultOrder = $keyVaultId.Split("/")
    $keyVaultType = $keyVaultOrder[3]
    $certfilename = "$($certificate.name)-$keyVaultName-$keyVaultType.txt"
    az keyvault key backup `
        --vault-name $keyVaultName `
        --name $certificate.name `
        --file $certfilename
}
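Based on the error's own suggestion to "perform the operation on the corresponding certificate", one possible direction is to call the backup operation on the certificate object itself rather than on its backing key, using az keyvault certificate backup instead of az keyvault key backup. A minimal, untested sketch of the loop body:

```powershell
foreach ($certificate in $certificates) {
    # back up the certificate object itself, as the error message suggests
    $certfilename = "$($certificate.name)-$keyVaultName-certificate.txt"
    az keyvault certificate backup `
        --vault-name $keyVaultName `
        --name $certificate.name `
        --file $certfilename
}
```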

New-AzWebAppSSLBinding - The specified network password is not correct

I am trying to assign an already-created Let's Encrypt SSL certificate to an Azure App Service.
The custom domain (1.test.webapp1) is already configured in the app service's DNS zone.
$Site = Get-AzWebApp -ResourceGroupName "MyRG" -Name "webapp1"
New-AzWebAppSSLBinding `
    -Name "1.test.webapp1" `
    -WebApp $Site `
    -SslState SniEnabled `
    -CertificateFilePath "C:\Users\someuser\Downloads\*.test.webapp1.pfx" `
    -CertificatePassword 'password'
Error below:
I don't understand how I can put the password correctly
UPD
Certificate generation output
Subject : CN=*.test.webapp1
NotBefore :
NotAfter :
KeyLength : 2048
Thumbprint :
AllSANs : {*.test.webapp1}
CertFile : C:\website\cert.cer
KeyFile : C:\website\cert.key
ChainFile : C:\website\chain.cer
FullChainFile : C:\website\fullchain.cer
PfxFile : C:\website\cert.pfx
PfxFullChain : C:\website\fullchain.pfx
PfxPass : System.Security.SecureString
One workaround you can follow to resolve the above issue:
Put the password and the file path into variables, provide the FQDN (custom domain) for which you want the SSL binding, and then use the modified script below.
NOTE: We were not able to test this due to some restrictions in our subscription.
Make sure that your custom domain is verified as per the requirements, and that you have sufficient permissions to do this operation, as suggested in this MS Q&A answer by @SnehaAgrawal-MSFT.
$fqdn="<Replace with your custom domain name>"
$pfxPath="<Replace with path to your .PFX file>"
$pfxPassword="<Replace with your .PFX password>"
$webappname="<Replace with your web app name>"
# This assumes the resource group has the same name as the web app; adjust if yours differs
New-AzWebAppSSLBinding -WebAppName $webappname -ResourceGroupName $webappname -Name $fqdn `
    -CertificateFilePath $pfxPath -CertificatePassword $pfxPassword -SslState SniEnabled
For more information, please refer to the links below:
Microsoft documentation: Bind a custom TLS/SSL certificate to a web app using PowerShell.
Blog: Create a .pfx certificate with a password and config.

Imported by PS script certificate has broken Private Key

I'm running CI integration tests in Azure DevOps on a dedicated Azure VM with an installed build agent. Those tests require a client SSL certificate to be installed on that VM. As a build step in CI, I have a PS script that consumes the Azure Key Vault certificate and imports it into the VM's LocalMachine\My store. While the cert is imported and I can see it on the VM, the tests from CI fail when using the cert. Note that when I try to manually export the cert on the VM, the Export with Private Key option is grayed out.
When I run the same PS script manually within the VM and then run the CI tests (with the PS step disabled), the tests successfully consume the certificate and pass.
What should I change in my PS script below so that, when run remotely, it imports the certificate with the Export with Private Key option enabled?
$vaultName = "MyKeyVault-stest"
$secretName = "MyCertificate"
$kvSecret = Get-AzureKeyVaultSecret -VaultName $vaultName -Name $secretName
$kvSecretBytes = [System.Convert]::FromBase64String($kvSecret.SecretValueText)
$kvSecretPass = 'myPass'
#-----------------------------------------------------------------------------
$pfxCertObject = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 `
    -ArgumentList @($kvSecretBytes, "", [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
$newcertbytes = $pfxCertObject.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Pkcs12, $kvSecretPass)
$newCert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$newCert.Import($newcertbytes, $kvSecretPass, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
#-------------------------------------------------------------------------------
$certStore = Get-Item "Cert:\LocalMachine\My"
$openFlags = [System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite
$certStore.Open($openFlags)
$certStore.Add($newCert)
Write-Host $env:USERNAME
Write-Host $(whoami)
If you are importing a PFX to add it to a persisted store you want to specify the X509KeyStorageFlags.PersistKeySet flag. If you don't, at some undetermined point later the garbage collector notices no one cares about the key and then asks Windows to delete it... and then the version added to the X509Store can no longer find its key.
Other reading:
What is the impact of the `PersistKeySet`-StorageFlag when importing a Certificate in C#
What is the rationale for all the different X509KeyStorageFlags?
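Applied to the script in the question, that would mean OR-ing PersistKeySet (and, since the target is the LocalMachine\My store, likely also MachineKeySet) into the storage flags when constructing the certificate. A minimal sketch of just the changed construction, assuming the rest of the script stays as posted:

```powershell
# Combine storage flags so the private key is persisted (and machine-scoped for LocalMachine\My)
$flags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable -bor `
    [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::PersistKeySet -bor `
    [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::MachineKeySet
$newCert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 `
    -ArgumentList @($newcertbytes, $kvSecretPass, $flags)
```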

Create an Azure Website with PowerShell and FTP

I need to write a PowerShell script that automatically creates an Azure Website and deploys the content of a local file system. It seems that the Azure PowerShell SDK doesn't provide a way to copy files to a website, so we want to use FTP as the deployment method.
The only way I have found to get the correct FTP endpoints and credentials is to call an Azure Management REST API: Get a Site's Publish Profile.
But this API, like other Azure Management APIs, requires a certificate. I found the following tutorial,
Building Real-World Cloud Apps with Windows Azure, which explains how to get the certificate:
$s = Get-AzureSubscription -Current
$thumbprint = $s.Certificate.Thumbprint
Unfortunately, it seems that with the current SDK $s.Certificate is always null; this property doesn't exist. If I manually set the certificate thumbprint, everything works as expected.
Do you have an idea on how to get the correct subscription certificate? Or do you have an easy alternative to deploy local files to an Azure website?
It seems that now you can access the certificate thumbprint using
$thumbprint = $s.DefaultAccount
instead of
#$thumbprint = $s.Certificate.Thumbprint
It seems that DefaultAccount has exactly the same value as the certificate thumbprint.
Just for reference, here is my complete script to obtain a publishing profile for a given website:
Function get-AzureWebSitePublishXml
{
    Param(
        [Parameter(Mandatory = $true)]
        [String]$WebsiteName
    )
    # Get the current subscription
    $s = Get-AzureSubscription -Current
    if (!$s) {throw "Cannot get Windows Azure subscription."}
    #$thumbprint = $s.Certificate.Thumbprint #this code doesn't work anymore
    $thumbprint = $s.DefaultAccount
    if (!$thumbprint) { throw "Cannot get subscription cert thumbprint."}
    # Get the certificate of the current subscription from your local cert store
    $cert = Get-ChildItem Cert:\CurrentUser\My\$thumbprint
    if (!$cert) {throw "Cannot find subscription cert in Cert: drive."}
    $website = Get-AzureWebsite -Name $WebsiteName
    if (!$website) {throw "Cannot get Windows Azure website: $WebsiteName."}
    # Compose the REST API URI from which you will get the publish settings info
    $uri = "https://management.core.windows.net:8443/{0}/services/WebSpaces/{1}/sites/{2}/publishxml" -f `
        $s.SubscriptionId, $website.WebSpace, $Website.Name
    # Get the publish settings info from the REST API
    $publishSettings = Invoke-RestMethod -Uri $uri -Certificate $cert -Headers @{"x-ms-version" = "2013-06-01"}
    if (!$publishSettings) {throw "Cannot get Windows Azure website publishSettings."}
    return $publishSettings
}
NOTE: this only works when you have connected to Azure using Import-AzurePublishSettingsFile.
Can anyone confirm that it is safe to use the DefaultAccount property?
UPDATE
If you use the Kudu API to upload your site, like this, you don't need any certificate or publishing profile. You can read the user name and password using Get-AzureWebsite, and the host name is just yourwebsitename.scm.azurewebsites.net (note the scm segment). I suggest using Kudu because it is far more reliable and faster.
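For reference, a minimal sketch of the Kudu approach. The PublishingUsername/PublishingPassword properties on the Get-AzureWebsite result and the /api/zip endpoint are assumptions based on the Kudu REST API documentation; yourwebsitename and site.zip are placeholders:

```powershell
# Get the site's publishing credentials
$website = Get-AzureWebsite -Name "yourwebsitename"
$pair = "$($website.PublishingUsername):$($website.PublishingPassword)"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
# PUT a zip of the local content into wwwroot via Kudu's zip API
Invoke-RestMethod -Uri "https://yourwebsitename.scm.azurewebsites.net/api/zip/site/wwwroot/" `
    -Method Put -InFile ".\site.zip" `
    -Headers @{ Authorization = "Basic $auth" }
```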

Set-AzureSubscription : Cannot bind parameter 'Certificate'

While changing the CurrentStorageAccountName in my Azure account using PowerShell, I am getting the following error:
Set-AzureSubscription : Cannot bind parameter 'Certificate'. Cannot convert value
"0EA9BE03CD1C2E5B93DB176F89C2CC2EF147B96C" to type "System.Security.Cryptography.X509Certificates.X509Certificate2".
Error: "The system cannot find the file specified.
The code is:
Set-AzureSubscription -SubscriptionName Enterprise -Certificate 0EA9BE03CD1C2E5B93DB176F89C2CC2EF147B96C -CurrentStorageAccountName btestps -SubscriptionId XXXXXXXXXXXXXXXXXXXXXXXXXXX
I recently needed to do this; you need to actually get or build an X509Certificate2 object that you then pass to the -Certificate parameter.
To select a certificate from the local user certificate store by thumbprint, you can:
# open the local user certificate store as read-only
$store = new-object System.Security.Cryptography.X509Certificates.X509Store([System.Security.Cryptography.X509Certificates.StoreLocation]::CurrentUser)
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadOnly)
# get all the certificates in the store
$certs = $store.Certificates;
# restrict the selection to only currently valid certificates
$currentcerts = $certs.Find([System.Security.Cryptography.X509Certificates.X509FindType]::FindByTimeValid, [System.DateTime]::Now, 0)
# get the first certificate by thumbprint, you could also find by distinguished name, subject name, etc
$signingCert = $currentcerts.Find([System.Security.Cryptography.X509Certificates.X509FindType]::FindByThumbprint,"XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",0)[0]
# set the certificate
set-azuresubscription -certificate $signingCert
A somewhat simpler solution is to use this:
$Certificate = Get-Item Cert:\LocalMachine\My\$CertificateThumbprint
Set-AzureSubscription -SubscriptionName Enterprise -Certificate $Certificate
This assumes that the certificate is in LocalMachine\My.