I'm trying to use PowerShell DSC for a few things. I wanted to have the passed credentials encrypted per the instructions at http://technet.microsoft.com/en-us/library/dn781430.aspx. It all seems to work fine until I run Start-DscConfiguration on the target node, at which point I get the error:
The private key could not be acquired.
+ CategoryInfo : NotSpecified: (root/Microsoft/...gurationManager:String) [], CimException
+ FullyQualifiedErrorId : MI RESULT 1
+ PSComputerName : DmitriyDev
Going back, I checked that the MOF contains the encrypted credentials and that the meta.mof contains the matching thumbprint, etc.
Going back to the original article, I see the example code:
# Get the certificate that works for encryption
function Get-LocalEncryptionCertificateThumbprint
{
    (dir Cert:\LocalMachine\my) | %{
        # Verify the certificate is for Encryption and valid
        if ($_.PrivateKey.KeyExchangeAlgorithm -and $_.Verify())
        {
            return $_.Thumbprint
        }
    }
}
When I test my certificate using this code (on the target node) I see that the PrivateKey property of the certificate is null. I'm not sure why it is null. Trying a few things with certutil and the technique mentioned at http://blogs.technet.com/b/vishalagarwal/archive/2010/03/30/verifying-the-private-key-property-for-a-certificate-in-the-store.aspx, it seems that I do indeed have a private key; however, PowerShell sees it only as null.
On the target node, I even exported the public/private key pair manually and reimported it, as outlined in another DSC tutorial, with no luck.
I also tried using Procmon to see what the problem was on the target node. I see the WmiPrvSE process and that it runs as SYSTEM (as expected), and I checked to make sure that the permissions on the private key allow access for SYSTEM (all on the target node).
So my question is: how do I get my private key to be usable by DSC, specifically by the LCM on the target node? Or how do I diagnose the problem further?
I had a similar error when using New-SelfSignedCertificate to create my certificates. For anyone with similar issues, I suspect the problem is related to the storage provider used by New-SelfSignedCertificate (see http://blogs.technet.com/b/vishalagarwal/archive/2010/03/30/verifying-the-private-key-property-for-a-certificate-in-the-store.aspx, which talks about a problem with the Microsoft Software Key Storage Provider and .NET classes). There's a PowerShell script available on TechNet that creates self-signed certificates and defaults to a different storage provider, which solved the problem for me.
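If you are on a newer OS, you may be able to get the same effect directly from New-SelfSignedCertificate by forcing the older CSP instead of the key storage provider. This is only a sketch, and it assumes your version of the PKI module exposes the -Provider and -KeySpec parameters (Windows 10 / Server 2016 and later):
# Create a DSC encryption cert whose private key lives in the legacy CSP, so .NET/DSC can read it
New-SelfSignedCertificate -Subject 'CN=DscEncryptionCert' `
    -CertStoreLocation 'Cert:\LocalMachine\My' `
    -KeySpec KeyExchange `
    -Provider 'Microsoft Enhanced RSA and AES Cryptographic Provider' `
    -KeyUsage KeyEncipherment, DataEncipherment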
Okay, I'm not sure exactly why this works, but it does: using the Computer template seems to work. With that template, PowerShell on the target node can see the certificate's private key via
dir cert:\LocalMachine\My | ? PrivateKey -ne $null
Once that happens, it all works as expected. So, long story short: don't use the Workstation Authentication template, use the Computer template.
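For what it's worth, if your CA publishes the default templates, you may also be able to request such a cert straight from PowerShell rather than through the MMC. A rough sketch, assuming the template's internal name is "Machine" (the usual internal name of the built-in Computer template):
# Request a cert from the built-in Computer template and drop it in the machine store
Get-Certificate -Template Machine -CertStoreLocation Cert:\LocalMachine\My
# Then confirm the private key is visible, as above
dir Cert:\LocalMachine\My | ? PrivateKey -ne $null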
I have been trying to get a new Sectigo code signing certificate working, with no luck, and Sectigo support is utterly useless. I am testing with this code, with the executable of course pathed to an actual PS1 file.
$executable = 'PATH TO.ps1'
$cert = Get-ChildItem cert:\CurrentUser\My -codesign
$timeStampServer = "http://timestamp.sectigo.com"
The timestamp server seems to be working, since $timeStampServer echoes http://timestamp.sectigo.com to the console. And the certificate SEEMS to be working, because $cert echoes a Thumbprint and Subject to the console.
But
Set-AuthenticodeSignature -filePath:$executable -certificate:$cert -timeStampServer:$timeStampServer -force
produces a blank SignerCertificate and UnknownError for the Status. For what it is worth the Path is just the file name, not the full path.
Unlike this thread, $cert.PrivateKey produces
PublicOnly : False
CspKeyContainerInfo : System.Security.Cryptography.CspKeyContainerInfo
KeySize : 4096
KeyExchangeAlgorithm : RSA-PKCS1-KeyEx
SignatureAlgorithm : http://www.w3.org/2000/09/xmldsig#rsa-sha1
PersistKeyInCsp : True
LegalKeySizes : {System.Security.Cryptography.KeySizes}
I wonder, is there anything else I can do to test the situation? I am waiting (about 110 minutes to go) on Sectigo support before I try downloading and installing a reissued certificate, but as crap as their support has been, I don't expect the new cert to work any better than the old, nor do I expect any insight from them as to the problem. They have my money, I expect them to say "PowerShell is your problem". So, hoping for some suggestions here as to what could be the issue, and what steps to take to isolate the problem.
One thing that does perk my ears up is that this link suggests I should also see an EnhancedKeyUsageList for $cert, and I do not. And when I look at the cert with certlm.msc I don't see an Intended Purposes column at all. But I think that's an OS issue, as when I actually look at the cert there, under the General tab I have "Enable all purposes for this certificate" selected, and Code Signing is checked in the greyed-out list.
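For reference, this is roughly how I am checking the EKUs; nothing here is specific to Sectigo, and the second read is just a fallback way to look at the raw extension:
# Dump the Enhanced Key Usage info for the code-signing cert
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert
$cert.EnhancedKeyUsageList
# Read the raw EKU extension directly, if the cert has one
($cert.Extensions | Where-Object { $_.Oid.FriendlyName -eq 'Enhanced Key Usage' }).Format($true)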
Now, oddly, I get a single line with only UnknownError when I run Set-AuthenticodeSignature without dumping a variable to the console. But if I dump $cert to the console right before, I get
SignerCertificate :
TimeStamperCertificate :
Status : UnknownError
StatusMessage : The data is invalid
Path : PATH TO.ps1
SignatureType : None
IsOSBinary : False
Again with the correct local path. The StatusMessage doesn't exactly add much, but the fact that the TimeStamperCertificate is also blank makes me wonder if that's the issue. Given how much it seems Sectigo sucks, is there some other generic timestamp server I can use, or am I limited to the timestamp server of the certificate issuer? I tried the timestamp server I had been using with my old GlobalSign EV cert, "http://timestamp.globalsign.com/scripts/timestamp.dll", and that produces the same results.
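In case it matters, this is the exact call I would use to test a third-party timestamp server; timestamp.digicert.com is just an example of a public Authenticode timestamp service and an assumption on my part, not something tied to my cert:
# Try a generic public timestamp server instead of the issuer's
Set-AuthenticodeSignature -FilePath $executable -Certificate $cert -TimestampServer 'http://timestamp.digicert.com'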
Also for what it is worth, the PS1 I am trying to sign for testing is one line
$scriptPath = Split-Path $script:myInvocation.myCommand.path -parent
I have never had such problems before. I had a Sectigo certificate last year and everything worked fine, but that was a different reseller, and in the meantime the Sectigo process seems to have changed. Last year my signed PDF from the KVK (the Dutch Chamber of Commerce) was fine for validation, but this year they demanded I provide a plain-text translation of that document. And for years before that I never had issues, but then I was using an EV cert on a thumb drive, which I gave up when GlobalSign took 4 months to get a thumb drive from London to Rotterdam.
But back on topic, suggestions?
EDIT: Further searching led to this, so I tried
$Cert = Get-PfxCertificate -FilePath "PATH TO.pfx"
And I put both the PFX and target PS1 in the root of C. Same results.
EDIT #2: After days of really horrible support from Comodo/Sectigo I demanded a refund, and bought a new certificate from SSL.Com. MUCH better experience with the validation process, but exactly the same issues with signing code. Now verified on both a Windows 10 and an old Windows 7 VM. So the code signing problem is definitely on my end. Meaning, more than ever I hope someone here can provide some insight.
We're sorry you're experiencing an issue. Here is some information to help resolve the issue. If you have any further questions, please feel free to contact Sectigo Support at https://sectigo.com/support and a member of our team will reach out to you.
PowerShell ISE uses 'Unicode Big Endian' encoding, and that could be the problem. Please try recreating the file as UTF-8 and then set the Authenticode signature.
# Re-create the script into a new UTF-8 encoded file
type \path\scriptfile.ps1 | Out-File \path\scriptfile_utf.ps1 -Encoding utf8
# Get the code-signing certificate
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert
# Add the Authenticode signature to the script
Set-AuthenticodeSignature \path\scriptfile_utf.ps1 -Certificate $cert
I had the same error. I tried to sign a .cmd file with a code-signing cert and received UnknownError with a blank SignerCertificate as well.
When I tried signing a PowerShell script it worked fine. That is because Set-AuthenticodeSignature cannot sign a .cmd file; that file type does not support an embedded signature. Might not be your issue, but that was what was wrong for me.
I did not use an official certificate though, I created one with the New-SelfSignedCertificate Cmdlet. Maybe you can try with a self-signed one and check if the error occurs as well?
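If you want to go that route, something like this should produce a throwaway signing cert (a sketch; the subject name is arbitrary, and because the cert is untrusted you may still see a chain/trust warning rather than a Valid status):
# Create a throwaway self-signed code-signing cert in the current user's store
$testCert = New-SelfSignedCertificate -Type CodeSigningCert -Subject 'CN=Test Code Signing' -CertStoreLocation Cert:\CurrentUser\My
# Sign the script with it and compare the resulting Status to what the real cert gives you
Set-AuthenticodeSignature -FilePath 'PATH TO.ps1' -Certificate $testCert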
I am trying to import a pfx certificate using this command:
Import-PfxCertificate -FilePath deleteme\App1\App1\Windows_TemporaryKey.pfx -CertStoreLocation Cert:\CurrentUser\My
This runs on an Azure DevOps agent and it terminates with the following error:
2021-08-17T08:12:41.8589900Z Import-PfxCertificate : The PFX file you are trying to import requires either a different password or membership in an
2021-08-17T08:12:41.8843817Z Active Directory principal to which it is protected.
2021-08-17T08:12:41.9009498Z At D:\Agents\02-V2\_work\_temp\ea432c03-2d7e-41d0-b921-60675873b966.ps1:5 char:1
2021-08-17T08:12:41.9091156Z + & Import-PfxCertificate -FilePath deleteme\App1\App1\Windows_Temporar ...
2021-08-17T08:12:41.9275703Z + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2021-08-17T08:12:41.9332015Z + CategoryInfo : NotSpecified: (:) [Import-PfxCertificate], Win32Exception
2021-08-17T08:12:41.9657873Z + FullyQualifiedErrorId : System.ComponentModel.Win32Exception,Microsoft.CertificateServices.Commands.ImportPfxCer
2021-08-17T08:12:41.9940803Z tificate
Now, I have seen plenty of Stack Overflow posts dealing with this error, but all of them were for certificates that actually had password-protected private keys. This one seems to have no such protection (at least I have not been prompted for a password when importing it on my local machine), and it is actually the template CMake PFX file, which I believe comes with every Windows CMake distribution (in the cmake-x.xx.x-windows-x86_64\share\cmake-x.xx\Templates\Windows directory). So I am interested in the second part of the error message: the membership in the "Active Directory principal to which it is protected". This is already pretty cryptic to me. The PFX has some principal associated with it? I haven't seen any such properties when inspecting the PFX using the Open command in the file's context menu. Or does the current user have to have some AD principal associated? If so, which one?
To put this whole Import-PfxCertificate effort into context: it is part of me trying to automatically build a UWP app in CI, but CMake refuses to pass the ARM64 compiler checks precisely because it cannot import this particular certificate (it is possible to import it without problems on all of my local machines). I know that I am able to force CMake to skip those compiler checks, but I would need to be able to import certificates anyway to sign the app, so I think I would hit the problem again regardless.
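One thing I still plan to try on the agent is passing the (presumably blank) password explicitly, just to rule out the first half of the error message. A sketch, assuming the template PFX really does have an empty password:
# Pass an explicitly empty SecureString as the password (assumption: the template PFX has a blank password)
$emptyPassword = New-Object System.Security.SecureString
Import-PfxCertificate -FilePath deleteme\App1\App1\Windows_TemporaryKey.pfx -CertStoreLocation Cert:\CurrentUser\My -Password $emptyPassword -Exportable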
I'm using "http://gallery.technet.microsoft.com/scriptcenter/Configures-Secure-Remote-b137f2fe" to configure secure remote PowerShell access to my Azure VM. It works well.
I deleted my machine while keeping the attached disks. I then recreated the machine with the same parameters as before, but from the "My Disk" option.
After that, my secure remote PowerShell access stopped working. Every time I try to use "http://gallery.technet.microsoft.com/scriptcenter/Configures-Secure-Remote-b137f2fe" to download the certificate, I receive the following error:
Get-AzureCertificate : Cannot validate argument on parameter 'Thumbprint'. The argument is null or empty. Supply an argument that is not null or empty and then try the command again.
At C:\Users\username\Desktop\InstallWinRMCertAzureVM.ps1:54 char:83
+ ... me -Thumbprint $WinRMCert -ThumbprintAlgorithm sha1
+ ~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Get-AzureCertificate], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationError,Microsoft.WindowsAzure.Commands.ServiceManagement.Certi
ficates.GetAzureCertificate
Actually, the value of
(Get-AzureVM -ServiceName $CloudServiceName -Name $Name | select -ExpandProperty vm).DefaultWinRMCertificateThumbprint
is empty.
But on the original machine it was a valid thumbprint.
Can someone point me in the right direction, please?
The problem was fixed. Partially :)
So, I connected to my virtual machine via RDP and manually exported the certificate from the LocalMachine store. After that, I imported the certificate on my local machine into the "Trusted Root Certification Authorities" (!) section of the LocalMachine store. The DefaultWinRMCertificateThumbprint field in the Azure VM settings is still empty, but now I can connect to the machine via PowerShell without any problems.
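For anyone who wants to script the same workaround instead of clicking through the certificate MMC, this is roughly what it amounts to. It's only a sketch: the subject filter is an assumption about how the WinRM certificate is named on your VM, and it assumes the PKI module cmdlets (Export-Certificate / Import-Certificate) are available:
# On the VM (via RDP): export the public part of the WinRM listener certificate
$winRmCert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -like "*$env:COMPUTERNAME*" }
Export-Certificate -Cert $winRmCert -FilePath C:\winrm.cer
# On the local machine: import it into Trusted Root Certification Authorities
Import-Certificate -FilePath C:\winrm.cer -CertStoreLocation Cert:\LocalMachine\Root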
I am writing a PowerShell script to automate setting up a Windows 2008 R2 server, and one of the required steps is importing several certificates into different stores. After doing some research on how best to achieve this, I found that Importpfx.exe was the best choice for what I am aiming to do, which is to import one .pfx file into the Trusted People store and another .pfx file into the Personal store, both for the Computer account. I then also need to Manage Private Keys on the certificate imported into the Personal store once it has been imported.
At first, I thought that Importpfx.exe was doing this correctly, but after researching how to manage the private keys via PowerShell, I learned that this can be done by editing the ACL of the file that corresponds to the imported certificate, which should be found in "C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys". This is where I started to notice that something wasn't quite right with the imported certificate: after searching this folder for new files after importing the certificates, I noticed that no new files had been added to it.
I searched the entire C drive for all files sorted by date modified and found that new files had been added to "C:\Users\'user'\AppData\Roaming\Microsoft\Crypto\RSA\S-1-5-21-2545654756-3424728124-1046164030-4917" instead of the expected folder. Whilst I was able to manually manage private keys for the certificate via the certificate store (as I was the user who imported it), no other users were able to log onto the machine and manage the private keys, getting the error message "Cannot find the certificate and private key for decryption" (which would make sense given the folder that the corresponding key file exists in).
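For what it is worth, this is how I am confirming which key store the container actually ended up in (the thumbprint comes from the function shown below):
# Check where the private key container for the imported cert actually lives
$cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Thumbprint -eq $certThumbprint }
$cert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
$cert.PrivateKey.CspKeyContainerInfo.MachineKeyStore   # False means the key went to the user profile store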
I use a function to get the thumbprint of the certificates before trying to import the .pfx file. The code I have been running is:
function GetCertificateThumbprint ([string]$certPreFix, [string]$certPassword, [string]$certFolder, [string]$domain, [bool]$addIfNotFound, [hashtable]$return)
{
    $storePath = "cert:\LocalMachine"
    $storeDir = "My"
    $storeName = [System.Security.Cryptography.X509Certificates.StoreName]::My
    if ($certPreFix -eq "XXX")
    {
        $storeDir = "TrustedPeople"
        $storeName = [System.Security.Cryptography.X509Certificates.StoreName]::TrustedPeople
    }
    $storePath = [System.IO.Path]::Combine($storePath, $storeDir)

    # Build the certificate file name and get the file
    $certFileName = $certPreFix + "." + $domain + ".*"
    $certFile = Get-ChildItem -Path $certFolder -Include $certFileName -Recurse
    if ($certFile)
    {
        # The certificate file exists so get the thumbprint
        $Certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($certFile, $certPassword)
        $certThumbprint = $Certificate.Thumbprint
        if ($addIfNotFound)
        {
            # Check for the certificate's thumbprint in the store and add it if it does not already exist
            if (-not (Get-ChildItem $storePath | Where-Object {$_.Thumbprint -eq $certThumbprint}))
            {
                Set-Location "$Env:windir\Tools"
                .\importpfx.exe -f $certFile -p $certPassword -t MACHINE -s $storeDir
            }
        }
    }
}
Can anyone see if I have done anything wrong? Has anyone come across this issue and got around it somehow? This is causing me issues as I cannot automate the Manage Private keys task properly!
I just ran into the same problem. You must specify the MachineKeySet X509KeyStorageFlag when creating the certificate object:
New-Object system.Security.Cryptography.X509Certificates.X509Certificate2($certFile, $certPassword, "PersistKeySet,MachineKeySet")
Hopefully that helps someone.
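Once the certificate is created with those flags and added to the store, the key container file should appear under ProgramData, which is what you need for the Manage Private Keys / ACL step. A rough way to locate it, reusing the $certFile and $certPassword variables from the question and assuming a CSP key rather than a CNG key:
# Build the cert with machine-level, persisted key storage
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($certFile, $certPassword, "PersistKeySet,MachineKeySet")
# Locate the key container file under ProgramData so you can edit its ACL (CSP keys only)
$keyName = $cert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
$keyPath = Join-Path "$env:ProgramData\Microsoft\Crypto\RSA\MachineKeys" $keyName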
I am facing a strange problem developing an installer that should, in one of its steps, install a certificate.
The problem has to do with granting an account (e.g. IIS_IUSRS) access to a certificate's private key on Windows Server 2008 R2. The private keys are stored in C:\Users\All Users\Microsoft\Crypto\RSA\MachineKeys.
A custom C# setup project imports a certificate and grants an account access to the certificate's private key during the installation process. After some time (2-3 seconds) the private key file is automatically deleted from the MachineKeys folder. Thus the installed web application cannot access the specific certificate and displays the following error message:
"System.Security.Cryptography.CryptographicException: Keyset does not exist". This error occurs only on Windows Server 2008 R2; on Windows Server 2003 everything works correctly.
My question is: why does the private key get deleted, and which process does this?
Thx
UPDATE 17/05/2012
I have not yet found a solution to the described problem, and no response has been posted on the other forums where I asked (forums.asp.net, social.msdn.microsoft.com). So, can anyone suggest any other resources or advice for further troubleshooting this issue?
Thanks again
This was happening to me too - my setup script would add the cert and grant access to the private key file fine, and the app would work. Then later, after I had closed the PowerShell editor, I re-launched the app and it failed with a "keyset does not exist" error.
Adding the PersistKeySet flag when importing the cert fixed the problem. Here's the PowerShell code for adding the cert and private key with persistence:
param(
    [string]$certStore = "LocalMachine\TrustedPeople",
    [string]$filename = "sp.pfx",
    [string]$password = "password",
    [string]$username = "$Env:COMPUTERNAME\WebSiteUser"
)

function getKeyUniqueName($cert) {
    return $cert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
}

function getKeyFilePath($cert) {
    return "$ENV:ProgramData\Microsoft\Crypto\RSA\MachineKeys\$(getKeyUniqueName $cert)"
}

$certFromFile = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($filename, $password)
$certFromStore = Get-ChildItem "Cert:\$certStore" | Where-Object {$_.Thumbprint -eq $certFromFile.Thumbprint}
$certExistsInStore = $certFromStore.Count -gt 0
$keyExists = $certExistsInStore -and ($certFromStore.PrivateKey -ne $null) -and ((getKeyUniqueName $certFromStore) -ne $null) -and (Test-Path (getKeyFilePath $certFromStore))

if ((!$certExistsInStore) -or (!$keyExists)) {
    $keyFlags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::MachineKeySet
    $keyFlags = $keyFlags -bor [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::PersistKeySet
    $certFromFile.Import($filename, $password, $keyFlags)
    $store = Get-Item "Cert:\$certStore"
    $store.Open("ReadWrite")
    if ($certExistsInStore) {
        # Cert is in the store, but we have no persisted private key.
        # Remove it so we can add the one we just imported with the key file.
        $store.Remove($certFromStore)
    }
    $store.Add($certFromFile)
    $store.Close()
    $certFromStore = $certFromFile
    "Installed x509 certificate"
}

$pkFile = Get-Item (getKeyFilePath $certFromStore)
$pkAcl = $pkFile.GetAccessControl("Access")
$readPermission = $username, "Read", "Allow"
$readAccessRule = New-Object System.Security.AccessControl.FileSystemAccessRule $readPermission
$pkAcl.AddAccessRule($readAccessRule)
Set-Acl $pkFile.FullName $pkAcl
"Granted read permission on private key to web user"
It is very clear that this is a security issue ("System.Security.…"): the account does not have the necessary permissions, so you need to set the permissions on the private key to allow that service account to access it.
Edit later: Go to Start -> Run -> cmd -> type mmc -> select File -> Add/Remove Snap-in -> select Certificates -> Add -> Computer Account -> Local. I attached a screenshot; it is in Spanish, but I have indicated the fields:
Open -> Certificates -> Personal -> Certificates -> right-click the certificate -> All Tasks -> Manage Private Keys -> add Network Service.
Also check this entry to see how this feature works in Windows Server 2008. Then, after you try it, please come back and say whether you could solve the issue with what I have told you.
http://referencesource.microsoft.com/#System/security/system/security/cryptography/x509/x509certificate2collection.cs,256 shows where the PersistKeySet flag is tested. The PersistKeySet flag is documented at https://msdn.microsoft.com/en-us/library/system.security.cryptography.x509certificates.x509keystorageflags%28v=vs.110%29.aspx with the phrase "The key associated with a PFX file is persisted when importing a certificate." My techno-babble to English translator tells me this means "You must include the PersistKeySet flag if you call the X509Certificate2 constructor and the certificate might already be installed on the machine." This probably applies to the .Import calls too. It's likely the PowerShell Import-PfxCertificate cmdlet already does this. But if you are doing what the accepted answer shows, or what the OP asked, you need to include that flag. We used a variation of ejegg's script in our solution. We have a process that runs every 3 minutes to check that all configured certs are installed, and this seems to work fine now.
The symptom we saw in PowerShell is that the HasPrivateKey property is true but the PrivateKey value itself is null, and the key file for the cert in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys was deleted. The FindPrivateKey utility at https://msdn.microsoft.com/en-us/library/aa717039(v=vs.110).aspx helped us watch the file get deleted.
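If it helps anyone else spot the same condition, a quick way to look for affected certs is to compare the two properties; this is just a diagnostic sketch, not part of the fix:
# Find certs that claim to have a private key but can't actually load it
Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.HasPrivateKey -and ($_.PrivateKey -eq $null) } | Select-Object Subject, Thumbprint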
So happy 4th birthday to the question with this very late response.