Set-AuthenticodeSignature appears to incorrectly sign the specified assembly - powershell

I'm having trouble signing a .NET Standard 2.0 assembly using the Set-AuthenticodeSignature PowerShell cmdlet as part of an Azure DevOps pipeline. I have written a little bit of PowerShell to go through the assemblies in a directory and apply a signature to each DLL in the folder:
$project = "${{ parameters.projects }}"; # todo: what about multi-line values here
$folderPath = [System.IO.Directory]::GetParent($project)
$files = Get-ChildItem -Path $folderPath -Filter "*.dll" -Recurse
$securePassword = ConvertTo-SecureString $(CertificatePassword) -AsPlainText -Force
$certificate = Get-PfxCertificate -FilePath $(CodeSignCertificate.secureFilePath) -NoPromptForPassword -Password $securePassword
foreach ($file in $files) {
    Write-Host "Setting Authenticode Signature for $file"
    $result = Set-AuthenticodeSignature -FilePath $file -Certificate $certificate -Force -HashAlgorithm SHA256 -IncludeChain All -TimestampServer "http://tsa.starfieldtech.com"
    if ($result.Status.ToString().Contains("Error")) { Write-Error $result.StatusMessage }
    else {
        Write-Host $result.Status.ToString()
        Write-Host $result.StatusMessage.ToString()
    }
}
The process appears to complete successfully; each DLL that is signed produces the following output, as per the three Write-Host lines in my script:
Setting Authenticode Signature for D:\a\1\s\...\SomeAssembly.dll
Valid
Signature verified.
Now, the problem becomes apparent when inspecting the DLL that is produced at the end of the build using NuGet Package Explorer. I see the following error (shown as a tooltip when hovering over the red error icon): "The file is signed, however the signed hash does not match the computed hash".
I have also tried:
Running this locally with a self-signed certificate, which appears to work fine.
Running the build on an older build agent; in this case the signing process fails during the build with the error "Get-PfxCertificate : The specified network password is not correct".
Using signtool.exe. This produces the same error.
I've certainly run out of ideas now. What could I be missing?

I've actually figured this out on my own.
My Code signing pipeline step was followed by a strong-naming step. The strong-naming step was changing the hash of the assembly so it no longer matched the hash specified in the signature.
The solution was to move the strong-naming step to happen before the code signing one. Now I can successfully strong-name the assembly, sign it, then after packaging, I can sign the nuget package and all the signatures are valid in the output.
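For anyone else hitting this, a cheap way to catch the problem earlier is to verify every signature as the very last step of the job, after strong-naming, signing and any other post-processing. A minimal sketch, reusing $folderPath from the script above:
$files = Get-ChildItem -Path $folderPath -Filter "*.dll" -Recurse
foreach ($file in $files) {
    # Re-read the signature from disk; any modification made to the file after
    # signing (such as strong-naming) will flip the status away from Valid.
    $signature = Get-AuthenticodeSignature -FilePath $file.FullName
    if ($signature.Status -ne 'Valid') {
        Write-Error "Signature check failed for $($file.FullName): $($signature.Status)"
    }
}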

Related

Imported pfx certificate is not saved on disk

I wrote a script in PowerShell to import a certificate on Windows Server 2016/2019. The script is added to an Azure DevOps pipeline and the agent is an Environment Agent running as NT AUTHORITY\SYSTEM. It first imports the certificate into the LocalMachine\My store and sets read permissions on the certificate right after that. It exhibits quite inconsistent behavior, whether run as part of the pipeline or executed manually. I'll write out the different kinds of behavior:
Scenario 1:
Import certificate on a server (successful)
Check the existence of the file on the physical disk (successful -> found in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys)
Set read permissions for account x (successful)
Scenario 2:
Import certificate on a server (successful)
Check the existence of the file on the physical disk (failed -> the file is not stored in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys, but in C:\ProgramData\Microsoft\Crypto\Keys)
Set read permissions for account x (failed -> can't find the file in the expected location)
Scenario 3:
Import certificate on a server (successful)
Check the existence of the file on the physical disk (successful)
Set read permissions for account x (failed -> can't find the file in the expected location; it does not exist on the server)
There is no way to predict which one of the scenarios will occur while running the script.
I monitored server and file behavior with ProcMon (Sysinternals Suite) during the import and did see the file being created and saved in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys during both successful and failed attempts.
Import PFX code:
$FilePath = 'e:\folder\certificate.pfx'
$password = 'iLoveChocolateyCookies'
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($FilePath, $password, "PersistKeySet,MachineKeySet")
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store -ArgumentList "My", "LocalMachine"
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite)
$store.Add($cert)
$store.Close()
Physical disk location check code:
$checkCert = Get-ChildItem "Cert:\LocalMachine\My" | Where-Object { $_.thumbprint -eq '<insertThumbhere>' }
$rsaCertCheck = [System.Security.Cryptography.X509Certificates.RSACertificateExtensions]::GetRSAPrivateKey($checkCert)
$checkCertDirectory = (Get-ChildItem -Path 'C:\Programdata\Microsoft\Crypto' -Include $rsaCertCheck.Key.UniqueName -File -Recurse).DirectoryName
if ($checkCertDirectory -eq "C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys") {
Write-Host "Certificate found on physical disk."
} else {
throw 'nope'
}
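As a diagnostic, here is a variant of the check that accepts either key directory, since both locations appear in the scenarios above (legacy CSP keys land under Crypto\RSA\MachineKeys, CNG keys under Crypto\Keys). This is only a sketch and reuses $rsaCertCheck from the snippet above:
$validKeyDirectories = @(
    'C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys',  # CSP-persisted machine keys
    'C:\ProgramData\Microsoft\Crypto\Keys'              # CNG-persisted machine keys
)
$keyFile = Get-ChildItem -Path 'C:\ProgramData\Microsoft\Crypto' -Recurse -File |
    Where-Object { $_.Name -eq $rsaCertCheck.Key.UniqueName }
if ($keyFile -and ($validKeyDirectories -contains $keyFile.DirectoryName)) {
    Write-Host "Private key file found in $($keyFile.DirectoryName)."
} else {
    throw 'Private key file not found in either expected location.'
}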
Import-PfxCertificate showed the same behaviour. That is why I decided to use the .Net approach after consulting StackOverflow and Google, but ended up with the same issues.
A helpful hand will be very much appreciated :)

How to run Powershell as an Admin yet execute functions as a Standard user?

I have a PowerShell script that has a function that deletes a certificate based on certain values in the Subject. This function works but only if I run the commands in a normal PowerShell window. If I run the PowerShell as an admin with separate admin credentials it fails.
The reason it fails is because when my script gets to the function where PowerShell parses the Personal Store of the Current User, it checks under the admin account running the script and not the current user itself. Is it possible to fix this within the Function itself?
Can I tell PowerShell to check the Currently Logged in Users Personal certificate Store and not the certificate store of the Admin user that is executing the script?
Unfortunately I need admin credentials to execute PowerShell scripts. I can't run them locally with my normal account.
function deleteCert() {
    try {
        Write-Host "Deleting Some Certificate..."
        $cert = Get-ChildItem -Path Cert:\CurrentUser\My -Recurse | Where-Object { $_.Subject -Like "*Some Certificate*" }
        if ($cert -eq $null) {
            Write-Host "Unable to locate Certificate"
        } else {
            # Pipe the certificate item so Remove-Item binds to its provider path.
            $cert | Remove-Item
            Write-Host "Deleting Some Certificate Successful..."
        }
    } catch {
        Write-Error "Exception caught in removing certificate"
        Exit
    }
}
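For what it's worth, the Cert:\CurrentUser\My drive is always bound to the identity the PowerShell process runs as, which is why an elevated session started with separate admin credentials sees the admin's store. A quick diagnostic (not a fix) is to log that identity before touching the store:
# Shows which account's personal store Cert:\CurrentUser\My will resolve to.
Write-Host ([System.Security.Principal.WindowsIdentity]::GetCurrent().Name)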

Visual Studio Team Services deployment/build certificate error

I am trying to build a ClickOnce application using the continuous integration and deployment feature in VSTS (Visual Studio Team Services Online). We are trying to build this using the Hosted Agent (Visual Studio 2015). We had difficulties signing with the strong-name key file, with an error of:
MSB3326: Cannot import the following key file: xxxx.snk. The key file may be password protected. To correct this, try to import the certificate again or import the certificate manually into the current user's personal certificate store.
And after that
MSB3321: Importing key file "xxxx.pfx" was canceled.
I have tried both selecting the certificate from the store and from a file, changed the location, and made sure to commit, but with no success.
Any ideas how I can overcome these errors, or what I am doing wrong?
Clarification on answer selected
Just wanted to make a clarification in case anyone else has the same issue: in addition to the answer, I had to place my certificate in my source control and commit it. Then, to select its location, I added a global variable on the VSTS build:
$cert.Import("$(CertPath)", $password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"PersistKeySet")
Where $(CertPath) would be something like $(Build.SourcesDirectory)\SharedSolutionFiles\CertificateName.pfx
You can create a PowerShell script and add a PowerShell Script step in your build definition to import the certificate file before the VSBuild step.
Without the PowerShell import-certificate step the build failed; with the step added, the build passed.
The PowerShell Script I used:
$pfxpath = 'pathtoees.pfx'
$password = 'password'
Add-Type -AssemblyName System.Security
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$cert.Import($pfxpath, $password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"PersistKeySet")
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store -ArgumentList "MY", "CurrentUser"
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]"ReadWrite")
$store.Add($cert)
$store.Close()
A better way is to set up an on-premises build agent, import the certificate into that machine's certificate store, and then change the build agent service account to the same account.
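If you go that route, here is a minimal sketch of the one-time import on the agent machine (hypothetical path; run it while logged in as the account the agent service uses, using the Import-PfxCertificate cmdlet from the built-in PKI module):
# Run once on the self-hosted agent, in the context of the agent service account.
$pfxPassword = Read-Host -Prompt 'PFX password' -AsSecureString
Import-PfxCertificate -FilePath 'C:\certs\CodeSign.pfx' -CertStoreLocation Cert:\CurrentUser\My -Password $pfxPassword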
Instead of using either an on-premises build agent or loading the certificates into the certificate stores on the build agent (which could be considered insecure), it is possible to override the FileSign build task and construct one that uses a certificate file and password.
I have outlined the steps here:
https://stackoverflow.com/a/55313239/2068626
After the methods in the other answers failed for me, I found another way to use a PowerShell script to import a pfx certificate. My script was written for GitHub Actions, but you can easily change the syntax for VSTS or an Azure pipeline (using 'task: PowerShell@2' for an Azure pipeline, for example). You may also want to update the file path from the GitHub workspace to your DevOps path, and the password can be replaced with a secured variable.
- name: Import certificate from the command-line
  shell: pwsh
  run: |
    $Secure_String_Pwd = ConvertTo-SecureString "<password>" -AsPlainText -Force
    Import-PfxCertificate -FilePath '${{github.workspace}}\<path>\project1_TemporaryKey.pfx' -CertStoreLocation Cert:\CurrentUser\My -Password $Secure_String_Pwd
Then build your Visual Studio ClickOnce project. The error should be gone.
After the build, you may want to remove the certificate from the machine. Here is a PowerShell example provided by Microsoft:
https://learn.microsoft.com/en-us/answers/questions/360772/powershell-commands-to-delete-personal-certificate.html
$users = "user1","user2","user3","user4","user5"
Get-ChildItem Cert:\CurrentUser\My | ForEach-Object {
$ifkeep = $false
foreach($user in $users){
if($_.Subject -match $user){
$ifkeep = $true
break
}
}
if($ifkeep -eq $false){
Remove-Item $_
}
}

Powershell Script to Install Certificate Into Active Directory Store

I'm trying to write a PowerShell script to install a certificate into the Active Directory certificate store. Here are the steps to do this manually; any help would be greatly appreciated.
On a Windows 2008R2 domain controller,
Click Start -> Run
type MMC
click ok
Click File -> Add/Remove Snap-In
Select "Certificates" -> Add
Select "Service Account"
Click Next
Select "Local Computer"
Click Next
Select "Active Directory Domain Services"
Click Finish
Click Ok
I want the script to install the certificate into:
NTDS\Personal
I would post an image but I don't have enough "reputation" apparently, so I can only provide text instructions.
So basically, what I've tried is the PowerShell function below, which imports a certificate into the LocalMachine -> Personal store (where most certificates go), and that code works.
But I need to install the certificate into the "NTDS\Personal" store on a domain controller, and $certRootStore only accepts LocalMachine or CurrentUser, so I'm stuck : /
function Import-PfxCertificate
{
    param
    (
        [String]$certPath,
        [String]$certRootStore = "localmachine",
        [String]$certStore = "My",
        $pfxPass = $null
    )
    $pfx = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
    if ($pfxPass -eq $null)
    {
        $pfxPass = Read-Host "Password" -AsSecureString
    }
    $pfx.Import($certPath, $pfxPass, "Exportable,PersistKeySet")
    $store = New-Object System.Security.Cryptography.X509Certificates.X509Store($certStore, $certRootStore)
    $store.Open("MaxAllowed")
    $store.Add($pfx)
    $store.Close()
}
Import-PfxCertificate -certPath "d:\Certificate.pfx"
Regards Alex
Using a combination of what you already had above and the registry keys for the two certificate stores, this works.
The only other thing is that I don't know how NTDS determines which certificate to use when there are multiple in the certificate store.
function Import-NTDSCertificate {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [string]$PFXFile,

        [Parameter(Mandatory)]
        [string]$PFXPassword,

        # Remove certificate from LocalMachine\Personal certificate store
        [switch]$Cleanup
    )
    begin {
        Write-Verbose -Message "Importing PFX file."
        $PFXObject = New-Object -TypeName System.Security.Cryptography.X509Certificates.X509Certificate2
        $PFXObject.Import($PFXFile, $PFXPassword, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
        $thumbprint = $PFXObject.Thumbprint
    }
    process {
        Write-Verbose -Message "Importing certificate into LocalMachine\Personal"
        $certificateStore = New-Object -TypeName System.Security.Cryptography.X509Certificates.X509Store('My','LocalMachine')
        $certificateStore.Open('MaxAllowed')
        $certificateStore.Add($PFXObject)
        $certificateStore.Close()

        Write-Verbose -Message "Copying certificate from LocalMachine\Personal to NTDS\Personal"
        $copyParameters = @{
            'Path'        = "HKLM:\Software\Microsoft\SystemCertificates\MY\Certificates\$thumbprint"
            'Destination' = "HKLM:\SOFTWARE\Microsoft\Cryptography\Services\NTDS\SystemCertificates\My\Certificates\$thumbprint"
            'Recurse'     = $true
        }
        Copy-Item @copyParameters
    }
    end {
        if ($Cleanup) {
            Write-Verbose -Message "Removing certificate from LocalMachine\Personal"
            $removalParameters = @{
                'Path'    = "HKLM:\SOFTWARE\Microsoft\SystemCertificates\MY\Certificates\$thumbprint"
                'Recurse' = $true
            }
            Remove-Item @removalParameters
        }
    }
}
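An example invocation might look like this (the file name and password are placeholders):
Import-NTDSCertificate -PFXFile 'D:\Certificate.pfx' -PFXPassword 'P@ssw0rd' -Cleanup -Verbose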
Alright, first the bad news. The only managed certificate stores are LocalMachine and CurrentUser, as we have all seen in powershell.
Now, the not so bad news. We know that the 'physical' location store (physical is MS' word, not mine) exists in the registry on the ADDS server, under HKLM\Software\Microsoft\Cryptography\Services\NTDS\SystemCertificates. This was verified both by:
Using procmon while importing a certificate into the store using the mmc snap-in
Scavenging msdn for this nugget
The link in #2 shows that all physical stores for services are stored in the path mentioned above, substituting the relevant service name for NTDS. The real service name, not the display name.
However, because of the bad news, trying to map it in PowerShell with that reg key as the root and -PSProvider Certificate will prove disappointing; it was the first thing I tried.
What one can try is using the X509Store constructor that takes an IntPtr to a SystemStore, as described here. Yes, that involves some unmanaged code, and mixing the two is something I do rarely, but this and googling for HCERTSTORE C# should get you there.
Even though this post is years old, it is still helpful and turns up in searches, so to address the question of "I don't know how NTDS determines which certificate to use when there are multiple in the certificate store": you will get unreliable results when two or more valid certificates meeting the requested criteria are installed, so it is recommended to remove the old/unneeded certificate(s) and leave just the newest/best one for the server auth.

Automatically sign powershell script using Get-PfxCertificate

I have to sign remote scripts with a certificate from the remote machine, for which I have a .pfx file.
I would like to automate the scripting by supplying the password to Get-PfxCertificate programmatically.
So the question is:
Is it possible to somehow supply programmatically the required password to
Get-PfxCertificate?
$CertPath = "my.pfx"
$CertPass = "mypw"
$Cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($CertPath, $CertPass)
Set-AuthenticodeSignature -Certificate $Cert -TimeStampServer http://timestamp.verisign.com/scripts/timstamp.dll -FilePath $OutputFilename
Make sure you have the proper permissions; otherwise you won't be able to create an instance of the X509Certificate2 object.
I did a bit of checking around on this and couldn't find a clean way to provide the password programmatically. I suspect it is meant to be this way for security reasons. Either that or the PowerShell development team just blew it by not including a Credential parameter for this cmdlet. The only other option I can think of is to use something like SendKeys to send the individual password character key presses to the PowerShell console at the right time via a background job (blech - just threw up in my mouth a little). :-)
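As a side note, later cross-platform PowerShell releases did add -Password and -NoPromptForPassword parameters to Get-PfxCertificate (the pipeline script in the first question on this page uses them), so on a current version something like the following should work; the file name and password are the same placeholders used above:
$securePass = ConvertTo-SecureString 'mypw' -AsPlainText -Force
$cert = Get-PfxCertificate -FilePath 'my.pfx' -Password $securePass -NoPromptForPassword
Set-AuthenticodeSignature -Certificate $cert -TimeStampServer http://timestamp.verisign.com/scripts/timstamp.dll -FilePath $OutputFilename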
Another way of doing this is by loading your certificate directly from your certificate store using PS providers. Use Get-PSProvider to determine the available PSProviders on your machine.
Once you have the cert provider loaded, you can get the certificate using Get-ChildItem.
Launch certmgr.msc from Run to view the certificate store.
Assuming that your certificate is stored under the Personal folder in your cert store, has "Company Name" set in the subject property, and is the only certificate in that folder with Company Name in the subject, you can get the certificate like so:
$my_cert = Get-ChildItem cert:\CurrentUser\My | ? {$_.Subject -match "Company Name"}
$my_cert will be your certificate object that you can pass directly to Set-AuthenticodeSignature cmdlet
Set-AuthenticodeSignature -Certificate $my_cert -FilePath fqn_to_dll.dll -Timestampserver "http://timestampurl"
After signing, you can check whether the signing succeeded by querying the Status property for "Valid", like so:
$result = Set-AuthenticodeSignature -Certificate $my_cert -FilePath fqn_to_dll.dll -Timestampserver "http://timestampurl" | Select Status
if (-Not ($result.Status -eq "Valid")) {
    Write-Output "Error Signing file: Status: $($result.Status)"
}