Visual Studio Team Services deployment/build certificate error - azure-devops

I am trying to build a ClickOnce application using the continuous integration and deployment feature in VSTS (Visual Studio Team Services Online). We are building it with the Hosted Agent (Visual Studio 2015). We had difficulty signing with the strong name key file, which failed with the error:
MSB3326: Cannot import the following key file: xxxx.snk. The key file may be password protected. To correct this, try to import the certificate again or import the certificate manually into the current user's personal certificate store.
And after that
MSB3321: Importing key file "xxxx.pfx" was canceled.
I have tried both selecting from the store and from a file, changed the location, and made sure to commit, but with no success.
Any ideas how I can overcome these errors, or what I am doing wrong?
Clarification on the answer selected
Just to clarify, in case anyone else has the same issue: in addition to the answer, I had to place my certificate in my source control repository and commit it. Then, to select its location, I added a global variable to the VSTS build:
$cert.Import("$(CertPath)", $password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"PersistKeySet")
Where $(CertPath) would be something like $(Build.SourcesDirectory)\SharedSolutionFiles\CertificateName.pfx

You can create a PowerShell script and add a PowerShell Script step in your build definition to import the certificate file before the VSBuild step.
Build failed without PowerShell Import Certificate Step:
Build passed with PowerShell Import Certificate Step:
The PowerShell script I used:
# Path to the PFX file and its password.
$pfxpath = 'pathtoees.pfx'
$password = 'password'

Add-Type -AssemblyName System.Security
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
# PersistKeySet keeps the private key available after the import.
$cert.Import($pfxpath, $password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"PersistKeySet")
# Add the certificate to the current user's Personal ("My") store.
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store -ArgumentList "My", "CurrentUser"
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]"ReadWrite")
$store.Add($cert)
$store.Close()

A better way is to set up an on-premises build agent, import the certificate into the certificate store there, and then change the build agent service account to the same account.
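As a minimal sketch of that one-time import on the agent machine (the path and password below are placeholders for your own values):

```powershell
# Run once on the build agent machine, logged in as the agent's service account.
# 'C:\certs\signing.pfx' and 'YourPfxPassword' are placeholders.
$securePwd = ConvertTo-SecureString 'YourPfxPassword' -AsPlainText -Force
Import-PfxCertificate -FilePath 'C:\certs\signing.pfx' `
    -CertStoreLocation Cert:\CurrentUser\My -Password $securePwd
```

Because the agent service then runs as that same account, the build can find the certificate in the CurrentUser\My store without any extra pipeline steps.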

Instead of using either an on-premises build or loading the certificates into the certificate stores on the build agent (which could be considered insecure), it is possible to override the FileSign build task and construct one that uses a certificate file and password.
I have outlined the steps here:
https://stackoverflow.com/a/55313239/2068626

After failing with the methods in the other answers, I found another way to use a PowerShell script to import a PFX certificate. My script was written for GitHub Actions, but you can easily adapt the syntax to VSTS or Azure Pipelines (for example, using the PowerShell@2 task in Azure Pipelines). You may also need to change the file path from the GitHub workspace to your DevOps path. Likewise, the password can be replaced with a secured variable.
- name: Import certificate from the command-line
  shell: pwsh
  run: |
    $Secure_String_Pwd = ConvertTo-SecureString "<password>" -AsPlainText -Force
    Import-PfxCertificate -FilePath '${{github.workspace}}\<path>\project1_TemporaryKey.pfx' -CertStoreLocation Cert:\CurrentUser\My -Password $Secure_String_Pwd
Then build your Visual Studio ClickOnce project. The error should be gone.
After the build, you may want to remove the certificate from the machine. Here is a PowerShell example provided by Microsoft:
https://learn.microsoft.com/en-us/answers/questions/360772/powershell-commands-to-delete-personal-certificate.html
$users = "user1","user2","user3","user4","user5"
# Remove every certificate in the current user's Personal store whose
# subject does not match one of the listed users.
Get-ChildItem Cert:\CurrentUser\My | ForEach-Object {
    $ifkeep = $false
    foreach ($user in $users) {
        if ($_.Subject -match $user) {
            $ifkeep = $true
            break
        }
    }
    if ($ifkeep -eq $false) {
        Remove-Item $_
    }
}

Related

Imported pfx certificate is not saved on disk

I wrote a script in PowerShell to import a certificate on Windows Server 2016/2019. The script is part of an Azure DevOps pipeline, and the agent is an environment agent running as NT AUTHORITY\SYSTEM. It first imports the certificate into the LocalMachine\My store and sets read permissions on the certificate right after that. It exhibits quite weird behavior, whether run from the pipeline or executed manually. These are the different kinds of behavior I see:
Scenario 1:
Import certificate on a server (successful)
Check the existence of the file on the physical disk (successful -> found in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys)
Set read permissions for account x (successful)
Scenario 2:
Import certificate on a server (successful)
Check the existence of the file on the physical disk (failed -> the file is not stored in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys, but in C:\ProgramData\Microsoft\Crypto\Keys)
Set read permissions for account x (failed -> can't find the file in the expected location)
Scenario 3:
Import certificate on a server (successful)
Check the existence of the file on the physical disk (successful)
Set read permissions for account x (failed -> can't find the file in the expected location; it does not exist on the server)
There is no way to predict which of these scenarios will occur when running the script.
I monitored server and file behavior with ProcMon (Sysinternals Suite) during the import and saw the file being created and saved in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys during both successful and failed attempts.
Import PFX code:
$FilePath = 'e:\folder\certificate.pfx'
$password = 'iLoveChocolateyCookies'
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($FilePath, $password, "PersistKeySet,MachineKeySet")
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store -ArgumentList "My", "LocalMachine"
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::"ReadWrite")
$store.Add($cert)
$store.Close()
Physical disk location check code:
$checkCert = Get-ChildItem "Cert:\LocalMachine\My" | Where-Object { $_.thumbprint -eq '<insertThumbhere>' }
$rsaCertCheck = [System.Security.Cryptography.X509Certificates.RSACertificateExtensions]::GetRSAPrivateKey($checkCert)
$checkCertDirectory = (Get-ChildItem -Path 'C:\Programdata\Microsoft\Crypto' -Include $rsaCertCheck.Key.UniqueName -File -Recurse).DirectoryName
if ($checkCertDirectory -eq "C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys") {
    Write-Host "Certificate found on physical disk."
} else {
    throw 'nope'
}
Import-PfxCertificate showed the same behavior. That is why I decided to use the .NET approach after consulting Stack Overflow and Google, but I ended up with the same issues.
A helpful hand will be very much appreciated :)

Add certificates the same way the Certificate Import Wizard does [PowerShell]

I have configured a PowerShell script which creates a VPN connection profile.
To make it work I need to add the proper certificate.
Everything works fine when I add the certificate manually to the local machine:
More detail on importing the certificate manually:
I'm trying to perform this task via PowerShell, but it doesn't work (the script seems to run, but I am not sure which stores I should copy the certificate to). In contrast to the manual method, the certificate added by my PowerShell script is invisible to the VPN connection.
# Add the certificate
$cert_name = $env:USERNAME + "#vpn.contoso.com.p12"
$cert_loc = "\\ad\deploy\other\certs\" + $cert_name
$secure_pwd = ConvertTo-SecureString "contoso987%#" -AsPlainText -Force
Import-PfxCertificate -FilePath $cert_loc -CertStoreLocation Cert:\LocalMachine\My -Password $secure_pwd
# Add the VPN connection
Add-VpnConnection -Name "Example VPNX" -ServerAddress "vpn.example.com" -AuthenticationMethod "MachineCertificate" -TunnelType "IKEv2" -EncryptionLevel "Maximum" -SplitTunneling $True
I would like to do it the same way the Certificate Import Wizard does. Does anyone have experience with that?
PS
I've changed addresses in codes etc.
Kind Regards,
Tamara
I've decided to post the solution. Although it is not developed in PowerShell, it solves the problem completely. It is possible to import this kind of certificate from the command prompt:
certutil -f -p Some_password -importpfx "\\ad\somepath\certificate.p12"

Set-AuthenticodeSignature appears to incorrectly sign the specified assembly

I'm having trouble signing a .NET Standard 2.0 assembly using the Set-AuthenticodeSignature PowerShell cmdlet as part of an Azure DevOps pipeline. I have written a little bit of PowerShell to go through the assemblies in a directory and apply a signature to each DLL in the folder:
$project = "${{ parameters.projects }}"; # todo: what about multi-line values here
$folderPath = [System.IO.Directory]::GetParent($project)
$files = Get-ChildItem -Path $folderPath -Filter "*.dll" -Recurse
$securePassword = ConvertTo-SecureString $(CertificatePassword) -AsPlainText -Force
$certificate = Get-PfxCertificate -FilePath $(CodeSignCertificate.secureFilePath) -NoPromptForPassword -Password $securePassword
foreach ($file in $files) {
    Write-Host "Setting Authenticode Signature for $file"
    $result = Set-AuthenticodeSignature -FilePath $file -Certificate $certificate -Force -HashAlgorithm SHA256 -IncludeChain All -TimestampServer "http://tsa.starfieldtech.com"
    if ($result.Status.ToString().Contains("Error")) {
        Write-Error $result.StatusMessage
    }
    else {
        Write-Host $result.Status.ToString()
        Write-Host $result.StatusMessage.ToString()
    }
}
The process appears to complete successfully; each DLL that is signed outputs the following message, as per the three Write-Host lines in my script:
Setting Authenticode Signature for D:\a\1\s\...\SomeAssembly.dll
Valid
Signature verified.
Now, the problem becomes apparent when inspecting the DLLs produced at the end of the build using NuGet Package Explorer. I see the following error: "The file is signed, however the signed hash does not match the computed hash". This can be seen in the screenshot (the error appears as a tooltip when hovering over the red error icon).
I have also tried:
Running this locally with a self-signed certificate, which appears to work fine.
Running the build on an older build agent; in this case the signing process fails during the build with the error "Get-PfxCertificate : The specified network password is not correct".
Using signtool.exe, which produces the same error.
I've certainly run out of ideas now. What could I be missing?
I've actually figured this out on my own.
My code-signing pipeline step was followed by a strong-naming step. The strong-naming step was changing the hash of the assembly, so it no longer matched the hash recorded in the signature.
The solution was to move the strong-naming step before the code-signing one. Now I can strong-name the assembly, sign it, and then, after packaging, sign the NuGet package, and all the signatures in the output are valid.
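The corrected ordering can be sketched as follows, reusing the $certificate variable from the script above (the sn.exe invocation, key file, and DLL name are placeholders, and sn.exe must be on the PATH or referenced by full path):

```powershell
# Hypothetical ordering sketch: strong-name first, then Authenticode-sign.
# Strong-naming rewrites part of the assembly, so it must happen before signing.
& sn.exe -R 'MyAssembly.dll' 'MyKey.snk'
# Authenticode signing comes last; nothing may modify the file afterwards,
# or the signed hash will no longer match the computed hash.
Set-AuthenticodeSignature -FilePath 'MyAssembly.dll' -Certificate $certificate -HashAlgorithm SHA256
```

The general rule is that any step that alters the file bytes (strong-naming, obfuscation, IL rewriting) has to run before Authenticode signing.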

Certificate imported by PS script has a broken private key

I'm running CI integration tests in Azure DevOps; they run on a dedicated Azure VM with the build agent installed. Those tests require a client SSL certificate to be installed on that VM. As a build step in CI, I have a PS script that consumes the Azure Key Vault certificate and imports it into the LocalMachine\My store of the VM. While the cert is imported and I can see it on the VM, the CI tests fail to use it. Note that when I try to manually export the cert on the VM, the "Export with Private Key" option is grayed out.
When I run the same PS script manually within the VM and then run the CI tests (with the PS step disabled), the tests successfully consume the certificate and pass.
What should I change in my PS script below so that, when run remotely, it imports the certificate with the "Export with Private Key" option enabled?
$vaultName = "MyKeyVault-stest"
$secretName = "MyCertificate"
$kvSecret = Get-AzureKeyVaultSecret -VaultName $vaultName -Name $secretName
$kvSecretBytes = [System.Convert]::FromBase64String($kvSecret.SecretValueText)
$kvSecretPass = 'myPass'
#-----------------------------------------------------------------------------
$pfxCertObject = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 -ArgumentList @($kvSecretBytes, "", [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
$newcertbytes = $pfxCertObject.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Pkcs12, $kvSecretPass)
$newCert=New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$newCert.Import($newcertbytes,$kvSecretPass,[System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
#-------------------------------------------------------------------------------
$certStore = Get-Item "Cert:\LocalMachine\My"
$openFlags = [System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite
$certStore.Open($openFlags)
$certStore.Add($newCert)
Write-host $env:USERNAME
Write-host $(whoami)
If you are importing a PFX to add it to a persisted store, you want to specify the X509KeyStorageFlags.PersistKeySet flag. If you don't, then at some undetermined point later the garbage collector notices no one cares about the key and asks Windows to delete it... and then the version added to the X509Store can no longer find its key.
Other reading:
What is the impact of the `PersistKeySet`-StorageFlag when importing a Certificate in C#
What is the rationale for all the different X509KeyStorageFlags?
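A minimal sketch of the suggested fix, reusing the $kvSecretBytes variable from the question's script, is to add PersistKeySet to the storage flags when constructing the certificate:

```powershell
# PersistKeySet keeps the private key on disk even after the X509Certificate2
# object is garbage-collected; combine with Exportable and MachineKeySet as needed.
$flags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"Exportable,PersistKeySet,MachineKeySet"
$newCert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 `
    -ArgumentList @($kvSecretBytes, "", $flags)
```

With these flags, the certificate added to Cert:\LocalMachine\My should keep a working, exportable private key across script invocations.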

Automatically sign powershell script using Get-PfxCertificate

I have to sign remote scripts with a certificate from the remote machine, for which I have a .pfx file.
I would like to automate the signing by supplying the password to Get-PfxCertificate programmatically.
So the question is:
Is it possible to somehow supply programmatically the required password to
Get-PfxCertificate?
$CertPath = "my.pfx"
$CertPass = "mypw"
$Cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($CertPath, $CertPass)
Set-AuthenticodeSignature -Certificate $Cert -TimeStampServer http://timestamp.verisign.com/scripts/timstamp.dll -FilePath $OutputFilename
Make sure you have the proper permissions; otherwise you won't be able to create an instance of the X509Certificate2 object.
I did a bit of checking around on this and couldn't find a clean way to provide the password programmatically. I suspect it is meant to be this way for security reasons. Either that, or the PowerShell development team just blew it by not including a Credential parameter for this cmdlet. The only other option I can think of is to use something like SendKeys to send the individual password character key presses to the PowerShell console at the right time via a background job (blech - just threw up in my mouth a little). :-)
Another way of doing this is by loading your certificate directly from your certificate store using PS providers. Use Get-PSProvider to determine the providers available on your machine.
Once you have the certificate provider loaded, you can get the certificate using Get-ChildItem.
Launch certmgr.msc from Run to view the certificate store.
Assuming that your certificate is stored under the Personal folder in your cert store, has "Company Name" set in its Subject property, and is the only certificate in that folder with Company Name in the subject, you can get it like so:
$my_cert = Get-ChildItem cert:\CurrentUser\My | ? {$_.Subject -match "Company Name"}
$my_cert will be your certificate object that you can pass directly to Set-AuthenticodeSignature cmdlet
Set-AuthenticodeSignature -Certificate $my_cert -FilePath fqn_to_dll.dll -Timestampserver "http://timestampurl"
After signing, you can check whether the signature is valid by querying the Status property:
$result = Set-AuthenticodeSignature -Certificate $my_cert -FilePath fqn_to_dll.dll -TimestampServer "http://timestampurl" | Select-Object Status
if (-Not ($result.Status -eq "Valid")) {
    Write-Output "Error signing file: Status: $($result.Status)"
}