No certificate error on Login-AzureRmAccount -ServicePrincipal - powershell

I have automated a few steps of a Java/Tomcat deployment using an ARM template, but I am not having any success with automated login via certificate.
I created a self-signed certificate using OpenSSL for a fictitious domain “project.company.com”. I then followed this article to set up an application in Azure AD and a service principal with the Contributor role:
https://azure.microsoft.com/en-us/documentation/articles/resource-group-authenticate-service-principal/#provide-certificate-through-automated-powershell-script
I get the following error when running the script below:
“Login-AzureRmAccount : No certificate was found in the certificate store with thumbprint xxxxxxxxxxxxxxxxxxx”
It looks like I am missing something at the Azure subscription level. These exact steps work fine with the Azure CLI from a Linux box, but they don't work with Azure PowerShell from a Windows box.
$cert = New-Object -TypeName System.Security.Cryptography.X509Certificates.X509Certificate2 -ArgumentList @("<my-path>/project.company.pfx", "<my-password>")
#$applicationId="xxxxxxxxxxxxxxxxxxxxx"
#$tenantId="dddddddddddddddddddddd"
#$subscriptionId="yyyyyyyyyyyyyyyyyyy"
#Login-AzureRmAccount -CertificateThumbprint $cert.Thumbprint -ApplicationId $applicationId -ServicePrincipal -TenantId $tenantId
$azureAdApplication = Get-AzureRmADApplication -IdentifierUri "https://project.company.com"
$subscription = Get-AzureRmSubscription
Login-AzureRmAccount -CertificateThumbprint $cert.Thumbprint -ApplicationId $azureAdApplication.ApplicationId -ServicePrincipal -TenantId $subscription.TenantId

In order for the Azure cmdlets to detect the correct certificate, you need to install the public cert (the .cer file) into the Trusted Root Certification Authorities store.

While installing the cert into Trusted Root Certification Authorities will work, it is not advisable, as it affects what other applications on your machine will trust. Instead, install the certificate in the LocalMachine\My store or the CurrentUser\My store.
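For example, a minimal sketch of that approach, assuming the same .pfx path and password from the question and the application/tenant IDs that are commented out in the script above:
# Import the .pfx (with its private key) into the CurrentUser\My store so the AzureRM cmdlets can find it by thumbprint
$pfxPassword = ConvertTo-SecureString -String "<my-password>" -AsPlainText -Force
$cert = Import-PfxCertificate -FilePath "<my-path>/project.company.pfx" -CertStoreLocation Cert:\CurrentUser\My -Password $pfxPassword
# Log in as the service principal using the thumbprint of the certificate that is now in the store
Login-AzureRmAccount -ServicePrincipal -CertificateThumbprint $cert.Thumbprint -ApplicationId $applicationId -TenantId $tenantId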

Related

Connect-ExchangeOnline UnAuthorized

I'm working on updating our PowerShell scripts to use more secure connection methods. When I try, I get an error that says "UnAuthorized"
PS X:> Connect-ExchangeOnline -AppId $clientId -CertificateThumbprint $thumbPrint -Organization $organization

UnAuthorized
At C:\Program Files\WindowsPowerShell\Modules\ExchangeOnlineManagement\3.0.0\netFramework\ExchangeOnlineManagement.psm1:730 char:21
    throw $_.Exception;
    + CategoryInfo          : OperationStopped: (:) [], UnauthorizedAccessException
    + FullyQualifiedErrorId : UnAuthorized
Is what I highlighted in the following screenshot what I'm supposed to use for the organization parameter?
[snip]
How do I fix the UnAuthorized error?
Thanks
I agree with @scottwtang; you will get the UnAuthorized error if your application doesn't have the required roles and permissions.
I tried to reproduce the same in my environment and got the results below.
I used the script below from your previous question to generate the certificate:
$CN = "GraphApp"
$cert=New-SelfSignedCertificate -Subject "CN=$CN" -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -KeySpec Signature -NotAfter (Get-Date).AddYears(5)
$Thumbprint = $Cert.Thumbprint
Get-ChildItem Cert:\CurrentUser\my\$Thumbprint | Export-Certificate -FilePath $env:USERPROFILE\Downloads\GraphApp.cer
Write-Output "$Thumbprint <- Copy/paste this (save it)"
Output:
I then uploaded this certificate to the Azure AD application, as below:
For the $organization parameter, you need to pass your domain name. You can find it here:
Go to Azure Portal -> Azure Active Directory -> Overview -> Primary domain
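If you prefer PowerShell over the portal, a hedged one-liner with the AzureAD module (assuming it is installed and you are already connected with Connect-AzureAD) lists the default domain:
# The domain flagged IsDefault is the "Primary domain" shown on the tenant Overview blade
Get-AzureADDomain | Where-Object { $_.IsDefault } | Select-Object -ExpandProperty Name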
When I ran the script below to connect to Exchange Online, I got an Access denied error like this:
$clientId="47xxxd8-8x2x-4xxx-bxx7-30cxxxxx8"
$thumbPrint="E4A0F6C6B85EBFxxxxxCD91B5803F88E5"
$organization="xxxxxxxx.onmicrosoft.com"
Connect-ExchangeOnline -AppId $clientId -CertificateThumbprint $thumbPrint -Organization $organization
Output:
To resolve the error, you need to add an API permission and a directory role to your application:
Make sure to grant admin consent for the added permission, as below:
Then I added the Exchange Administrator role to my application, as below:
Go to Azure Portal -> Azure Active Directory -> Roles and administrators -> Exchange administrator -> Add assignment
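If you would rather script the role assignment, a rough sketch with the AzureAD module (assuming the Exchange Administrator role has already been activated in the tenant) could look like this:
# Find the service principal for the app registration by its application (client) ID
$sp = Get-AzureADServicePrincipal -Filter "AppId eq '$clientId'"
# Look up the activated Exchange Administrator directory role
$role = Get-AzureADDirectoryRole | Where-Object { $_.DisplayName -eq "Exchange Administrator" }
# Add the service principal as a member of that role
Add-AzureADDirectoryRoleMember -ObjectId $role.ObjectId -RefObjectId $sp.ObjectId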
It may take a few minutes for the role assignment to complete, as below:
I then connected to Exchange Online by running the script again, ran the sample command Get-EXOMailbox -PropertySets Archive to verify it, and got a successful response, as below:
$clientId="47xxxd8-8x2x-4xxx-bxx7-30cxxxxx8"
$thumbPrint="E4A0F6C6B85EBFxxxxxCD91B5803F88E5"
$organization="xxxxxxxx.onmicrosoft.com"
Connect-ExchangeOnline -AppId $clientId -CertificateThumbprint $thumbPrint -Organization $organization
Output:
So, make sure to assign the required roles and permissions to your application to fix the error.
Unfortunately, Exchange.ManageAsApp no longer appears in the Graph API permissions list, so it cannot be selected directly in the portal. But you can add the permission by adding the following entry to the requiredResourceAccess array in the JSON manifest:
{
    "resourceAppId": "00000002-0000-0ff1-ce00-000000000000",
    "resourceAccess": [
        {
            "id": "dc50a0fb-09a3-484d-be87-e023b12c6440",
            "type": "Role"
        }
    ]
}
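Alternatively (an untested sketch, assuming a recent Az.Resources module), the same app role can be added without hand-editing the manifest; both GUIDs below are the ones from the manifest snippet above, and the placeholder client ID is your own app registration:
# Office 365 Exchange Online resource app ID plus the Exchange.ManageAsApp app-role ID
Add-AzADAppPermission -ApplicationId "<your-app-client-id>" -ApiId "00000002-0000-0ff1-ce00-000000000000" -PermissionId "dc50a0fb-09a3-484d-be87-e023b12c6440" -Type Role
Admin consent still has to be granted afterwards.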

I am getting a "Sequence contains no elements" error while running a build pipeline in Azure DevOps Services to deploy a cloud service (classic)

I have created an Azure Classic type service connection. Is there anything I am missing?
I am then using this Azure Classic Service connection to deploy the cloud service to Azure.
Azure Deployment: D:\a\1\a\*.cspkg
View raw log
Starting: Azure Deployment: D:\a\1\a\*.cspkg
----------------------------------------------------------------
Task : Azure Cloud Service deployment
Description : Deploy an Azure Cloud Service
Version : 1.175.2
Author : Microsoft Corporation
Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/deploy/azure-cloud-powershell-deployment
----------------------------------------------------------------
Import-Module -Name C:\Modules\azure_2.1.0\Azure\2.1.0\Azure.psd1 -Global
Import-Module -Name C:\Modules\azurerm_2.1.0\AzureRM\2.1.0\AzureRM.psd1 -Global
##[warning]The names of some imported commands from the module 'AzureRM.Websites' include unapproved verbs that might make them less discoverable. To find the commands with unapproved verbs, run the
Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.
Import-Module -Name C:\Modules\azurerm_2.1.0\AzureRM.Profile\2.1.0\AzureRM.Profile.psm1 -Global
Add-AzureAccount -Credential System.Management.Automation.PSCredential
##[error]Sequence contains no elements
-
##[error]There was an error with the Azure credentials used for the deployment.
-
Finishing: Azure Deployment: D:\a\1\a\*.cspkg
PS: I am using the classic editor to create the pipelines and not YAML builds.
##[error]There was an error with the Azure credentials used for the deployment.
You could switch to certificate-based authentication as the verification method.
To use the certificate-based method, you could use the following script to create the .cer file.
$cert = New-SelfSignedCertificate -DnsName yourdomain.cloudapp.net -CertStoreLocation "cert:\LocalMachine\My" -KeyLength 2048 -KeySpec "KeyExchange"
$password = ConvertTo-SecureString -String "your-password" -Force -AsPlainText
Export-PfxCertificate -Cert $cert -FilePath ".\my-cert-file.pfx" -Password $password
Export-Certificate -Type CERT -Cert $cert -FilePath .\my-cert-file.cer
Then you can upload the certificate through the management portal.
At the same time, you can get the public key from the .cer file.
This key is used in the service connection.
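To turn the exported .cer into base64 text for that field, one option (a sketch, reusing my-cert-file.cer from the script above) is:
# Load the exported public certificate and emit its raw bytes as a base64 string
$cer = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2(".\my-cert-file.cer")
[System.Convert]::ToBase64String($cer.RawData)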
You can refer to this ticket for the possible cause of the problem.
The following error messages make the problem clear, as in this picture:
##[error]Sequence contains no elements
##[error]There was an error with the Azure credentials used for the deployment.
You need to get $cred with the script below. (There are some restrictions; for more details, please read the related posts.)
$username = "**.com"
$password = "***"
$secstr = New-Object -TypeName System.Security.SecureString
$password.ToCharArray() | ForEach-Object {$secstr.AppendChar($_)}
$cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $username, $secstr
Add-AzureRmAccount -Credential $cred
Related Posts
1. Visual Studio Team Services: Sequence contains no elements
2. Add-AzureRmAccount : Sequence contains no elements

Use the AZ module in a non-interactive environment?

I'm hoping to be able to use the Az module to retrieve a secret from an Azure key vault, for use with a PowerShell script that has been deployed to a server and is run daily by Windows Task Scheduler.
Initially, I needed to follow the OAuth (I'm guessing) process:
Connect-AzAccount -Tenant '69a29f45-...'
Which redirects to https://login.microsoftonline.com/..., asking you to choose an account:
eventually, it indicates success:
Authentication complete. You can return to the application. Feel free to close this browser tab.
After this has been completed, the script that retrieves the secret works as expected:
...
$AccessToken = Get-AzKeyVaultSecret -VaultName 'MyVault' -Name 'MySecret' | Select-Object -ExpandProperty SecretValue | ConvertFrom-SecureString -AsPlainText
...
I'm concerned that the token will expire, causing my script to fail.
The SharePoint module (Pnp.PowerShell) can make use of a credential stored in Windows Credential Manager. Can the Az module do so as well?
If not, is there another way to handle this authentication process without interaction?
You can log on using a certificate tied to a service principal (SP) in your Azure AD tenant. Then you just have to make sure that the SP has at least reader access to your key vault.
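A minimal sketch of that certificate-based login (assuming the certificate is already installed in the CurrentUser\My store and the placeholders are replaced with your own IDs):
# Sign in as the service principal with a certificate instead of a client secret
Connect-AzAccount -ServicePrincipal -CertificateThumbprint "<certificate-thumbprint>" -ApplicationId "<application-id>" -TenantId "<tenant-id>"
# Then the secret retrieval from the question works unchanged
$AccessToken = Get-AzKeyVaultSecret -VaultName 'MyVault' -Name 'MySecret' | Select-Object -ExpandProperty SecretValue | ConvertFrom-SecureString -AsPlainText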
It looks like the Az module cannot use the Windows Credential Manager. To use Az PowerShell in a non-interactive way, we usually use a service principal; please follow the steps below.
1. Register an application with Azure AD and create a service principal.
2. Get the values for signing in and create a new application secret.
3. Then use the commands below.
Note: Don't forget to first add the service principal to the key vault's access policies with secret permissions in the portal.
$azureAplicationId ="<application-id>"
$azureTenantId= "<tenant-id>"
$azurePassword = ConvertTo-SecureString "<client-secret>" -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential($azureAplicationId , $azurePassword)
Connect-AzAccount -Credential $psCred -TenantId $azureTenantId -ServicePrincipal
#get the secret
$AccessToken = Get-AzKeyVaultSecret -VaultName 'MyVault' -Name 'MySecret' | Select-Object -ExpandProperty SecretValue | ConvertFrom-SecureString -AsPlainText
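For the access-policy step mentioned in the note above, a scripted alternative to the portal (a sketch, assuming the vault uses access policies rather than Azure RBAC):
# Grant the service principal permission to read secrets from the vault
Set-AzKeyVaultAccessPolicy -VaultName 'MyVault' -ServicePrincipalName $azureAplicationId -PermissionsToSecrets get,list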

Certificate Authentication to a Point to Site Vpn with Azure key vault

I want to create a Point-to-Site VPN from a virtual network in Azure.
For the authentication I want to use certificates; the root certificate is generated in Azure Key Vault. I don't want to authenticate with the rootCertificate.pfx but with the clientCertificate.pfx.
The requirements are:
- I don't use an external certificate provider.
- I use as much PowerShell as possible.
We followed this documentation to create a self-signed certificate in Key Vault:
https://blogs.technet.microsoft.com/kv/2016/09/26/get-started-with-azure-key-vault-certificates/
After I created a root certificate, it is placed in the key vault (public part in Secrets, private part in Keys).
Next, we copy/paste the public part (Base64) into the Virtual Network Gateway -> Point-to-site configuration -> Root certificates.
Then I used this PowerShell script to generate a client certificate from the root certificate that we made earlier, and export it to our local hard drive.
Login-AzureRmAccount
$kvSecret = Get-AzureKeyVaultSecret -VaultName "akv-contoso-test" -Name "ContosoFirstCertificate"
$kvSecretBytes = [System.Convert]::FromBase64String($kvSecret.SecretValueText)
$rootCertificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 -ArgumentList @($kvSecretBytes, $null, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
$certStore = New-Object System.Security.Cryptography.X509Certificates.X509Store
$certStore.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite)
$certStore.Add($rootCertificate)
$clientCertificate = New-SelfSignedCertificate -Type Custom -KeySpec Signature -Subject "CN=Point To Site VPN Client" -KeyExportPolicy Exportable -HashAlgorithm sha256 -KeyLength 2048 -Signer $rootCertificate -TextExtension @("2.5.29.37={text}1.3.6.1.5.5.7.3.2")
$securePassword = ConvertTo-SecureString -String "mysecret" -AsPlainText -Force
Export-PfxCertificate -Cert $clientCertificate -Password $securePassword -FilePath "C:\Users\User\Desktop\Certificates\Point To Site VPN Client.pfx" -ChainOption BuildChain
$certStore.Close()
After that we install the "Point To Site VPN Client.pfx"
The public root certificate is installed in the Trusted Root Certification Authorities store. In the chain of trust of my client certificate, the root certificate says the following: "This certificate is not valid for the selected purpose".
Now I want to connect to the vpn with the Client Certificate.
The VPN client reports that the message received was unexpected or badly formatted.
My guess is that something goes wrong when we generate the client certificate from the key vault certificate (script above).
Chain of trust and Connection Error

Azure functions & Powershell - unable to Login-AzureRmAccount using certificate

I'm building an Azure Function (hosted in app service plan) which will enumerate the assets in my subscription and do something with them.
I have the site set up in such a way that I expect it to work, but Login-AzureRmAccount errors out every time with the notice:
Login-AzureRmAccount : No certificate was found in the certificate store with thumbprint xxxxxxxx
Here are some relevant pieces:
First I create the cert:
$cert = New-SelfSignedCertificate -CertStoreLocation cert:\currentuser\my -Subject "cn=$appCommonName" ...etc...
$keyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
$aNewApp = New-AzureRmADApplication -DisplayName $AzureADApplicationName -HomePage $AppHomePage -IdentifierUris $appIDUri -CertValue $keyValue -EndDate $cert.NotAfter -StartDate $cert.NotBefore
# Export the cert for future upload to Azure
$password = ConvertTo-SecureString -String "supersecretpassword" -Force -AsPlainText
Export-PfxCertificate -Cert $cert -FilePath "Export-cert.pfx" -Password $password
Export-Certificate -Type CERT -Cert $cert -FilePath "Export-cert.cer"
Later I provision my Service Principal and give it read access
$theSvcPrincipal = New-AzureRmADServicePrincipal -ApplicationId $ApplicationId
$testRole = Get-AzureRmRoleAssignment -RoleDefinitionName Reader -ServicePrincipalName $ApplicationId
Through the magic of ARM deployments, I end up with two app settings on the web site that hosts the Azure Function: WEBSITE_LOAD_CERTIFICATES with a value of *, and CertThumbprint with the thumbprint of the certificate that I've uploaded to the SSL certificates area.
Finally, after the ARM template is deployed, I upload the certificate following the instructions from this Stack Overflow post.
Given all that prep work, I would expect this to work in my function:
Login-AzureRmAccount -ServicePrincipal -CertificateThumbprint $env:CertThumbprint -ApplicationId $env:ApplicationId -TenantId $env:TenantId
but when that line executes, despite my having a certificate in the web site with a matching thumbprint, I get the "no cert found with matching thumbprint" error.
Ok, this was a bit embarrassing. It turns out that the ARM template I was using to provision the App Service was using the Dynamic sku:
"properties": {
"name": "[parameters('hostingPlanName')]",
"computeMode": "Dynamic",
"sku": "Dynamic"
}
rather than a standard sku:
"sku": {
"name": "B1",
"tier": "Basic",
"size": "B1",
"family": "B",
"capacity": 1
},
Once I switched the two, I was able to access the cert store.
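For anyone debugging the same thing, a quick hedged check from inside the function (assuming the CertThumbprint app setting described in the question) shows whether WEBSITE_LOAD_CERTIFICATES actually surfaced the certificate:
# List what the function app can see in its CurrentUser\My store; the uploaded certificate should appear with the expected thumbprint
Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Thumbprint -eq $env:CertThumbprint } | Format-List Subject, Thumbprint, NotAfter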
I had seen this issue discussed on github, but dismissed it when a coworker was able to access a client certificate from a function app hosted on a dynamic plan after using the Azure web interface to provision it. I still can't explain how he was able to do it, but I've seen it with my own eyes and poured over the ARM templates from his deployment to verify that he's using the consumption-based model.