Set-AuthenticodeSignature UnknownError with new certificate, how to troubleshoot - powershell

I have been trying to get a new Sectigo code signing certificate working, with no luck, and Sectigo support has been utterly useless. I am testing with this code, with $executable of course pointed at an actual PS1 file.
$executable = 'PATH TO.ps1'
$cert = Get-ChildItem cert:\CurrentUser\My -codesign
$timeStampServer = "http://timestamp.sectigo.com"
The time server seems to be working, since $timeStampServer echoes http://timestamp.sectigo.com to the console. And the certificate SEEMS to be working, because $cert echoes a Thumbprint and Subject to the console.
But
Set-AuthenticodeSignature -filePath:$executable -certificate:$cert -timeStampServer:$timeStampServer -force
produces a blank SignerCertificate and UnknownError for the Status. For what it is worth, the Path in the output is just the file name, not the full path.
Unlike this thread, $cert.PrivateKey produces
PublicOnly : False
CspKeyContainerInfo : System.Security.Cryptography.CspKeyContainerInfo
KeySize : 4096
KeyExchangeAlgorithm : RSA-PKCS1-KeyEx
SignatureAlgorithm : http://www.w3.org/2000/09/xmldsig#rsa-sha1
PersistKeyInCsp : True
LegalKeySizes : {System.Security.Cryptography.KeySizes}
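One more thing I can check (an assumption on my part, not a confirmed diagnosis): the SignatureAlgorithm line above ends in rsa-sha1, which makes me wonder whether the private key sits in a legacy CSP that only supports SHA-1. The provider name can be read like this:

```powershell
# Grab the code signing cert and read which CSP holds its private key.
# A legacy provider such as "Microsoft Base Cryptographic Provider v1.0"
# only supports SHA-1 and could explain an UnknownError with SHA-256 signing.
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
$cert.PrivateKey.CspKeyContainerInfo.ProviderName
```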
I wonder, is there anything else I can do to test the situation? I am waiting (about 110 minutes to go) on Sectigo support before I try downloading and installing a reissued certificate, but given how poor their support has been, I don't expect the new cert to work any better than the old one, nor do I expect any insight from them as to the problem. They have my money; I expect them to say "PowerShell is your problem". So I am hoping for some suggestions here as to what the issue could be, and what steps to take to isolate the problem.
One thing that does perk my ears up is that this link suggests I should also see an EnhancedKeyUsageList for $cert, and I do not. And when I look at the cert with certlm.msc I don't see an Intended Purposes column at all. But I think that's an OS issue, as when I actually open the cert there, under the General tab I have "Enable all purposes for this certificate" selected, and Code Signing is checked in the greyed-out list.
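Even when the EnhancedKeyUsageList property doesn't show up, the EKU can be read directly off the certificate's extensions. A quick sketch (code signing is OID 1.3.6.1.5.5.7.3.3):

```powershell
# Read the Enhanced Key Usage extension directly from the certificate;
# the list should include "Code Signing" (OID 1.3.6.1.5.5.7.3.3)
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
$eku = $cert.Extensions |
    Where-Object { $_ -is [System.Security.Cryptography.X509Certificates.X509EnhancedKeyUsageExtension] }
$eku.EnhancedKeyUsages | Select-Object FriendlyName, Value
```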
Now, oddly, when I run Set-AuthenticodeSignature without dumping a variable to the console, I get a single line with only UnknownError. But if I dump $cert to the console right before, I get
SignerCertificate :
TimeStamperCertificate :
Status : UnknownError
StatusMessage : The data is invalid
Path : PATH TO.ps1
SignatureType : None
IsOSBinary : False
Again with the correct local path. The StatusMessage doesn't exactly add much, but the fact that the TimeStamperCertificate is also blank makes me wonder if that's the issue. Given how much it seems Sectigo sucks, is there some other generic timestamp server I can use, or am I limited to the timestamp server of the certificate issuer? I tried using the timestamp server I had been using with my old GlobalSign EV cert, "http://timestamp.globalsign.com/scripts/timestamp.dll", and that produces the same results.
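For what it's worth, my understanding (unverified) is that -TimestampServer is not tied to the certificate issuer at all; any public Authenticode/RFC 3161 timestamp server should do. A sketch using DigiCert's public server:

```powershell
# The timestamp server does not have to belong to the cert's issuer;
# here DigiCert's public timestamp server is used with a Sectigo cert
Set-AuthenticodeSignature -FilePath $executable -Certificate $cert `
    -TimestampServer 'http://timestamp.digicert.com'
```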
Also, for what it is worth, the PS1 I am trying to sign for testing is one line:
$scriptPath = Split-Path $script:myInvocation.myCommand.path -parent
I have never had such problems before. I had a Sectigo certificate last year and everything worked fine, but that was a different reseller, and in the meantime the Sectigo process seems to have changed. Last year my signed PDF from the KVK (the Dutch Chamber of Commerce) was fine for validation, but this year they demanded I provide a plain-text translation of that document. And for years before that I never had issues, but then I was using an EV cert on a thumb drive, which I gave up when GlobalSign took 4 months to get a thumb drive from London to Rotterdam.
But back on topic, suggestions?
EDIT: Further searching led to this, so I tried
$Cert = Get-PfxCertificate -FilePath "PATH TO.pfx"
And I put both the PFX and target PS1 in the root of C. Same results.
EDIT #2: After days of really horrible support from Comodo/Sectigo I demanded a refund and bought a new certificate from SSL.com. MUCH better experience with the validation process, but exactly the same issues with signing code, now verified on both a Windows 10 and an old Windows 7 VM. So the code signing problem is definitely on my end. Meaning, more than ever, I hope someone here can provide some insight.

We're sorry you're experiencing an issue. Here is some information to help resolve the issue. If you have any further questions, please feel free to contact Sectigo Support at https://sectigo.com/support and a member of our team will reach out to you.
PowerShell ISE can save files with 'Unicode Big Endian' encoding, and that could be the problem. Please try recreating the file using UTF-8 and then set the Authenticode signature.
#creating the script into a new file
type \path\scriptfile.ps1 | out-file \path\scriptfile_utf.ps1 -encoding utf8
#get the certificate
$cert = Get-ChildItem cert:\CurrentUser\My -codesigning
#add Authenticode Signature to the script
Set-AuthenticodeSignature \path\scriptfile_utf.ps1 -Certificate $cert
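To confirm what encoding the file currently has, you can inspect its first bytes (the path is a placeholder):

```powershell
# Read the first three bytes (the BOM) of the script to identify its encoding:
# EF BB BF = UTF-8 with BOM, FF FE = UTF-16 LE ("Unicode"), FE FF = UTF-16 BE
$bytes = Get-Content '\path\scriptfile.ps1' -Encoding Byte -TotalCount 3
($bytes | ForEach-Object { '{0:X2}' -f $_ }) -join ' '
```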

I had the same error. I tried to sign a .cmd file with a CodeSigningCert and received Unknown Error with a blank SignerCertificate as well.
When I tried signing a PowerShell script it worked fine. That is because Set-AuthenticodeSignature only supports certain file types (PowerShell scripts, executables, DLLs, and the like), and .cmd batch files are not among them. Might not be your issue, but that was what was wrong for me.
I did not use an official certificate though, I created one with the New-SelfSignedCertificate Cmdlet. Maybe you can try with a self-signed one and check if the error occurs as well?
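A minimal sketch of that test, assuming Windows 10 or later for New-SelfSignedCertificate's -Type parameter (the script path is a placeholder):

```powershell
# Create a throwaway self-signed code-signing certificate in CurrentUser\My
$testCert = New-SelfSignedCertificate -Subject 'CN=Signing Test' `
    -Type CodeSigningCert -CertStoreLocation Cert:\CurrentUser\My

# If signing succeeds with this cert, the problem is the purchased
# certificate rather than the script file or the machine
Set-AuthenticodeSignature -FilePath C:\scriptfile.ps1 -Certificate $testCert
```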

Related

How do I relate the signature algorithms of a Windows certificate to the parameters for signing with SignTool?

We have a PowerShell script written years ago by someone who has moved on, to sign our ClickOnce deployments. This has worked well, until recently when the certificate expired. I'm tasked with updating the PowerShell script to incorporate the new signing certificate we recently purchased.
I put the new certificate into the Certificate store, per the instructions from the previous maintainer. I've made modifications to the PowerShell script, fixing other bugs as they've come along, but I'm at a place where I don't know how to relate the details of the certificate with the parameters for SignTool.exe. Looking at the properties of the certificate, from the Details view, I see the following relevant values:
Signature algorithm: sha384RSA
Signature hash algorithm: sha384
Thumbprint algorithm: sha1
Looking at Microsoft's SignTool documentation page for the sign command, I see values used by the PowerShell script, such as /td and /fd, but I don't know which relates to which of the values displayed in the certificate's properties. Also, I'm not certain that I need the thumbprint algorithm; according to the Microsoft page I referenced, it should be used when working with multiple certificates, which the PowerShell script is not.
Here is what is currently in the PowerShell script for signing the ClickOnce app:
Get-AuthenticodeSignature *.exe,*.dll | ? Status -eq NotSigned | % Path | %{&$signtool sign /tr $timestamp /td sha384 /fd sha384 $hash $_ }
That now gives me an error of, "##[error]SignTool Error: No certificates were found that met all the given criteria."
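For what it's worth, my current understanding of the flags (not yet verified against the docs): /fd is the digest algorithm for the file signature, /tr and /td are the RFC 3161 timestamp server and its digest, and /sha1 selects a certificate by its SHA-1 thumbprint, which might sidestep the "no certificates found" error if the old selection criterion no longer matches. A sketch, with $signtool, $timestamp, and the target file as placeholders from the script above:

```powershell
# /sha1 selects the cert by its Thumbprint field (always SHA-1, regardless of
# the cert's own sha384RSA signature algorithm); /fd and /td set the digests
$thumbprint = (Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert |
    Select-Object -First 1).Thumbprint
& $signtool sign /tr $timestamp /td sha384 /fd sha384 /sha1 $thumbprint 'MyApp.exe'
```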

Import-PFXCertificate password issue

I have a client-side cert PFX from some idiot, to allow some users to access his website, and I need to script it so I can auto-import this cert into the local store for multiple users during logon to our RDS environment.
This cert also came with a long complicated password that I need to pass to said function.
So I had the bright idea of using PS function Import-PFXCertificate to do this.
$PlainTextPass = "f4#)]\as1"
$pfxpass = $PlainTextPass | ConvertTo-SecureString -AsPlainText -Force
Import-PfxCertificate -FilePath C:\important.pfx cert:\CurrentUser\my -Password $pfxpass
It fails with this error, and I can't find any direct reference to it on the web.
Import-PfxCertificate : The PFX file you are trying to import requires either a different password or membership in an Active Directory principal to which it is protected.
The test user I am running as is a domain admin. Not that it should matter, since it's installing the cert into CurrentUser.
Try surrounding the plain text password with single quotes instead of double quotes. I had a password with $ in it that gave me the same error until I swapped the quotes.
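A sketch of that change, using the same cmdlets as the question (the store location is spelled out for clarity):

```powershell
# Single quotes make the password a literal string; inside double quotes
# PowerShell would expand $variables and interpret escape sequences
$pfxpass = ConvertTo-SecureString 'f4#)]\as1' -AsPlainText -Force
Import-PfxCertificate -FilePath C:\important.pfx `
    -CertStoreLocation Cert:\CurrentUser\My -Password $pfxpass
```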
I was able to resolve the same issue after I stumbled upon a similar post on Server Fault: Wrong password during pfx certificate import Windows (10, 2016).
My certificate had been encrypted during export to PFX using AES256-SHA256; switching to TripleDES-SHA1 resolved the problem.
Try changing the password. Just keep alphabetic letters. This solved the issue for me.

Error when using CmdLet New-AzureRmADAppCredential to create new credential with certificates

I am trying to use the New-AzureRmADAppCredential cmdlet. I am trying to create a new credential using the following syntax:
New-SelfSignedCertificateEx -Subject "CN=$ClientId" -KeySpec "Exchange" -FriendlyName "$ClientId"
$start = (Get-Date).ToUniversalTime()
$end = $start.AddDays(4)
New-AzureRmADAppCredential -ApplicationId $application.ObjectId -CertValue $keyValue -StartDate $start -EndDate $end
I've noticed that if I try to create a new credential with a certificate, it fails on specific apps with the following error:
New-AzureRmADAppCredential : Update to existing credential with KeyId 'keyid' is not allowed.
This app has 2 credentials, one a password and the other a certificate; the keyid belongs to the certificate credential. The weird part is that on another app it worked fine, even though that app has multiple certificate credentials. I've tried to look at the documentation, but couldn't find anything useful.
So, my question is: why does this error happen, and how can I solve it?
Update: By looking at the code of the cmdlet, it seems like it always updates the whole list, so I think it might be something related to permissions, but I am not sure.
Thanks,
Omer
Yeah, New-AzureRmADAppCredential isn't robust enough. If the AAD app already has keys, it will update the latest key in the key list, which of course won't work (Update to existing credential with KeyId is not allowed); I think it's a bug. If no key exists, it will create a new one. You can use New-AzureADApplicationPasswordCredential instead.
Ok, so we found out this happens because the first certificate was uploaded to Azure AD by modifying the application manifest. After deleting it and adding it again using PowerShell, everything worked.
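A hedged sketch of that delete-and-re-add approach using the AzureRM cmdlets ('keyid' stands in for the real credential Guid, and the variables are taken from the question):

```powershell
# Remove the manifest-created certificate credential by its KeyId,
# then re-add the certificate through PowerShell
Remove-AzureRmADAppCredential -ApplicationId $application.ApplicationId -KeyId 'keyid'
New-AzureRmADAppCredential -ApplicationId $application.ApplicationId `
    -CertValue $keyValue -StartDate $start -EndDate $end
```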

Desired State Configuration credential private key not acquired

I'm trying to use PowerShell DSC for a few things. I wanted to have the passed credentials encrypted per the instructions at http://technet.microsoft.com/en-us/library/dn781430.aspx. It all seems to work fine until I run Start-DscConfiguration on the target node, when I get the error:
The private key could not be acquired.
+ CategoryInfo : NotSpecified: (root/Microsoft/...gurationManager:String) [], CimException
+ FullyQualifiedErrorId : MI RESULT 1
+ PSComputerName : DmitriyDev
Going back, I checked that the mof contains the credentials encrypted and the meta.mof contains the matching thumbprint, etc.
Returning to the original article, I see the example code:
# Get the certificate that works for encryption
function Get-LocalEncryptionCertificateThumbprint
{
    (dir Cert:\LocalMachine\my) | %{
        # Verify the certificate is for Encryption and valid
        if ($_.PrivateKey.KeyExchangeAlgorithm -and $_.Verify())
        {
            return $_.Thumbprint
        }
    }
}
When I test my certificate using this code (on the target node), I see that the PrivateKey of the certificate is null. I'm not sure why it is null: trying a few things with certutil, and the technique mentioned at http://blogs.technet.com/b/vishalagarwal/archive/2010/03/30/verifying-the-private-key-property-for-a-certificate-in-the-store.aspx, it seems that I do indeed have a private key; PowerShell just sees it as null.
On the target node, I even exported the public/private key pair manually and reimported it, with no luck, as outlined in another DSC tutorial.
I also tried using procmon to see what the problem was on the target node. I see the wmiprvse process running as System (as expected), and I checked that the permissions on the private key allow access for SYSTEM (all on the target node).
So my question is: how do I get my private key to be used by DSC, specifically the LCM, on the target node? Or how do I diagnose the problem further?
I had a similar error when using New-SelfSignedCertificate to create my certificates. For anyone with similar issues, I suspect the problem is related to the storage provider used by New-SelfSignedCertificate (see http://blogs.technet.com/b/vishalagarwal/archive/2010/03/30/verifying-the-private-key-property-for-a-certificate-in-the-store.aspx, which talks about a problem with the Microsoft Software Key Storage Provider and .NET classes). There's a PowerShell script available on TechNet that creates self-signed certificates and defaults to using a different storage provider, which solved the problem for me.
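As a sketch of the same idea without the TechNet script, assuming Windows 10 / Server 2016 or later, New-SelfSignedCertificate can be pointed at a legacy CSP directly (the provider choice and subject name here are assumptions, not a verified recipe):

```powershell
# Create the DSC encryption cert with a classic CSP whose keys the .NET
# X509Certificate2.PrivateKey property can read (the KSP default cannot be
# read through that property, which leaves PrivateKey null)
New-SelfSignedCertificate -Subject 'CN=DscEncryptionCert' `
    -CertStoreLocation Cert:\LocalMachine\My `
    -KeySpec KeyExchange `
    -Provider 'Microsoft Enhanced Cryptographic Provider v1.0'
```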
Okay, I'm not sure exactly why this works, but it does: using the Computer certificate template. With that template, PowerShell on the target node can see its private key via
dir cert:\LocalMachine\My | ? PrivateKey -ne $null
Once that happens, it all works as expected. So, long story short: don't use the Workstation Authentication template; use the Computer template.

How to trust a certificate in Windows Powershell

I am using Windows 7 and want to run signed scripts from PowerShell. The PowerShell execution policy is set to AllSigned, and my scripts are signed with a valid certificate from my company. I have also added the .pfx file to my local certificate store (right-clicked the .pfx file and installed it).
However, when I start a signed script, I get a message that says:
"Do you want to run software from this untrusted publisher?
File Z:\Powershell Signed Scripts\signed.ps1 is published by CN=[MyCompanyName] and is not trusted on your system. Only run scripts from
trusted publishers.
[V] Never run [D] Do not run [R] Run once [A] Always run [?] Help
(default is "D"):"
Since I want to automatically call these scripts on my systems, I would like to add my imported certificate to the trusted list on my system, so that I do not get a message anymore when I run a signed script for the first time. How can I make my certificate a trusted one?
Indeed, you can do this without any mmc :)
First, check the location of your personal certificate, named for example "Power":
Get-ChildItem -Recurse cert:\CurrentUser\ |where {$_ -Match "Power"} | Select PSParentPath,Subject,Issuer,HasPrivateKey |ft -AutoSize
(This one should be empty:)
gci cert:\CurrentUser\TrustedPublisher
Build the command with the path to your certificate:
$cert = Get-ChildItem Certificate::CurrentUser\My\ABLALAH
Next, work on the certificate store (the same approach works for both the user and computer stores):
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store "TrustedPublisher","LocalMachine"
$store.Open("ReadWrite")
$store.Add($cert)
$store.Close()
Check; you should find your certificate:
ls cert:\CurrentUser\TrustedPublisher
Sounds like you need to verify that the script is signed properly and that you have the correct certificate installed in the correct certificate store.
Use the Get-AuthenticodeSignature cmdlet to get information about the signed script.
Also review Scott's guide for signing certificates.
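A sketch of that first check, using the path from the question:

```powershell
# Inspect the script's signature; Status should read Valid and
# SignerCertificate should show the company certificate
Get-AuthenticodeSignature 'Z:\Powershell Signed Scripts\signed.ps1' |
    Format-List SignerCertificate, Status, StatusMessage
```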