Imported PFX certificate is not saved on disk - PowerShell

I wrote a PowerShell script to import a certificate on Windows Server 2016/2019. The script runs in an Azure DevOps pipeline on an Environment Agent running as NT AUTHORITY\SYSTEM. It first imports the certificate into the LocalMachine\My store and then sets read permissions on the certificate's private key. The behavior is quite erratic, whether the script runs in the pipeline or is executed manually. These are the scenarios I have observed:
Scenario 1:
Import certificate on the server (successful)
Check the existence of the key file on the physical disk (successful -> found in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys)
Set read permissions for account x (successful)
Scenario 2:
Import certificate on the server (successful)
Check the existence of the key file on the physical disk (failed -> the file is not stored in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys but in C:\ProgramData\Microsoft\Crypto\Keys)
Set read permissions for account x (failed -> can't find the file in the expected location)
Scenario 3:
Import certificate on the server (successful)
Check the existence of the key file on the physical disk (successful)
Set read permissions for account x (failed -> can't find the file in the expected location; it does not exist on the server)
There is no way to predict which of these scenarios will occur on a given run.
I monitored server and file activity with ProcMon (Sysinternals Suite) during the import and saw the key file being created and saved in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys during both successful and failed attempts.
Import PFX code:
$FilePath = 'e:\folder\certificate.pfx'
$password = 'iLoveChocolateyCookies'
# Import with a persisted, machine-scoped private key
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($FilePath, $password, 'PersistKeySet,MachineKeySet')
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store -ArgumentList 'My', 'LocalMachine'
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite)
$store.Add($cert)
$store.Close()
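A quick sanity check right after the import can confirm that the store copy actually kept its private key before any ACL work is attempted (a minimal sketch, reusing the $cert variable from above):
$stored = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Thumbprint -eq $cert.Thumbprint }
if (-not $stored.HasPrivateKey) { throw 'Private key was not persisted with the certificate' }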
Physical disk location check code:
$checkCert = Get-ChildItem 'Cert:\LocalMachine\My' | Where-Object { $_.Thumbprint -eq '<insertThumbhere>' }
$rsaCertCheck = [System.Security.Cryptography.X509Certificates.RSACertificateExtensions]::GetRSAPrivateKey($checkCert)
$checkCertDirectory = (Get-ChildItem -Path 'C:\ProgramData\Microsoft\Crypto' -Include $rsaCertCheck.Key.UniqueName -File -Recurse).DirectoryName
if ($checkCertDirectory -eq 'C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys') {
    Write-Host 'Certificate found on physical disk.'
} else {
    throw 'nope'
}
Import-PfxCertificate showed the same behaviour, which is why I decided to use the .NET approach after consulting Stack Overflow and Google, but I ended up with the same issues.
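For what it's worth, the disk check can be made storage-provider-agnostic, since a machine key may legitimately land in C:\ProgramData\Microsoft\Crypto\Keys (CNG) rather than C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys (legacy CAPI). A minimal sketch, assuming .NET 4.6+ so GetRSAPrivateKey is available, and reusing the <insertThumbhere> placeholder:
$checkCert = Get-ChildItem 'Cert:\LocalMachine\My' | Where-Object { $_.Thumbprint -eq '<insertThumbhere>' }
$privateKey = [System.Security.Cryptography.X509Certificates.RSACertificateExtensions]::GetRSAPrivateKey($checkCert)
# CNG keys expose the container name via .Key.UniqueName;
# legacy CAPI keys expose it via CspKeyContainerInfo.UniqueKeyContainerName.
$keyName = if ($privateKey -is [System.Security.Cryptography.RSACng]) {
    $privateKey.Key.UniqueName
} else {
    $privateKey.CspKeyContainerInfo.UniqueKeyContainerName
}
$keyFile = Get-ChildItem -Path 'C:\ProgramData\Microsoft\Crypto' -Recurse -File |
    Where-Object { $_.Name -eq $keyName }
if ($keyFile) {
    Write-Host "Private key file: $($keyFile.FullName)"
} else {
    throw 'Private key file not found under C:\ProgramData\Microsoft\Crypto'
}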
A helpful hand will be very much appreciated :)

Related

Visual Studio Team Services deployment/build certificate error

I am trying to build a ClickOnce application using the continuous integration and deployment feature in VSTS (Visual Studio Team Services Online). We are trying to build this using the Hosted Agent (Visual Studio 2015). We had difficulties signing with the strong name key file, with an error of
MSB3326: Cannot import the following key file: xxxx.snk. The key file may be password protected. To correct this, try to import the certificate again or import the certificate manually into the current user's personal certificate store.
And after that
MSB3321: Importing key file "xxxx.pfx" was canceled.
I have tried both selecting the certificate from the store and from a file, changed the location, and made sure to commit, but with no success.
Any ideas how I can overcome these errors, or what I am doing wrong?
Clarification on the answer selected
Just wanted to make a clarification in case anyone else has the same issue: in addition to the answer, I had to place my certificate in my source control repository and commit it, then add a global variable on the VSTS build to select its location:
$cert.Import("$(CertPath)", $password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"PersistKeySet")
Where $(CertPath) would be something like $(Build.SourcesDirectory)\SharedSolutionFiles\CertificateName.pfx
You can create a PowerShell script and add a PowerShell Script step in your build definition to import the certificate file before the VSBuild step.
The build failed without the PowerShell import-certificate step and passed once the step was added. (Screenshots omitted.)
The PowerShell Script I used:
$pfxpath = 'pathtoees.pfx'
$password = 'password'
Add-Type -AssemblyName System.Security
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$cert.Import($pfxpath, $password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]'PersistKeySet')
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store -ArgumentList 'My', 'CurrentUser'
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]'ReadWrite')
$store.Add($cert)
$store.Close()
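To confirm the import before the VSBuild step runs, a quick hedged check can be appended (reusing the $cert variable from the script above):
if (-not (Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Thumbprint -eq $cert.Thumbprint })) {
    throw "Certificate $($cert.Thumbprint) was not imported"
}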
A better way is to set up an on-premises build agent, import the certificate into its certificate store, and change the build agent service account to the same account.
Instead of using an on-premises build agent or loading the certificates into the certificate stores on the build agent (which could be considered insecure), it is possible to override the FileSign build task and construct one that uses a certificate file and password.
I have outlined the steps here:
https://stackoverflow.com/a/55313239/2068626
After failing with the methods in the other answers, I found another way to use a PowerShell script to import a PFX certificate. My script was written for GitHub Actions, but you can easily adapt the syntax to VSTS or Azure Pipelines (for example, using the PowerShell@2 task in an Azure pipeline). You may also need to change the file path from the GitHub workspace to your DevOps path, and the password can be replaced with a secured variable.
- name: Import certificate from the command-line
  shell: pwsh
  run: |
    $Secure_String_Pwd = ConvertTo-SecureString "<password>" -AsPlainText -Force
    Import-PfxCertificate -FilePath '${{github.workspace}}\<path>\project1_TemporaryKey.pfx' -CertStoreLocation Cert:\CurrentUser\My -Password $Secure_String_Pwd
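For Azure Pipelines, an equivalent PowerShell@2 step might look like the following (a sketch, untested; it assumes the PFX sits under the sources directory and that CertPassword is defined as a secret pipeline variable):
- task: PowerShell@2
  inputs:
    targetType: inline
    script: |
      $securePwd = ConvertTo-SecureString "$(CertPassword)" -AsPlainText -Force
      Import-PfxCertificate -FilePath "$(Build.SourcesDirectory)\<path>\project1_TemporaryKey.pfx" -CertStoreLocation Cert:\CurrentUser\My -Password $securePwd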
Then build your Visual Studio ClickOnce project. The error should be gone.
After the build, you may want to remove the certificate from the machine. Here is a PowerShell example provided by Microsoft:
https://learn.microsoft.com/en-us/answers/questions/360772/powershell-commands-to-delete-personal-certificate.html
$users = "user1","user2","user3","user4","user5"
Get-ChildItem Cert:\CurrentUser\My | ForEach-Object {
    $ifkeep = $false
    foreach ($user in $users) {
        if ($_.Subject -match $user) {
            $ifkeep = $true
            break
        }
    }
    if ($ifkeep -eq $false) {
        # Remove by provider path; passing the certificate object itself
        # would stringify to the subject dump and fail
        Remove-Item $_.PSPath
    }
}
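If the build imported only one known certificate, removing it by thumbprint is more targeted than subject matching (a sketch; assumes you recorded the thumbprint at import time):
$thumbprint = '<thumbprint-recorded-at-import>'
Get-ChildItem Cert:\CurrentUser\My |
    Where-Object { $_.Thumbprint -eq $thumbprint } |
    Remove-Item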

Drive Mapping with Azure Scale Sets using Desired State Configuration

I am running into an interesting issue. Maybe you fine folks can help me understand what's happening here. If there's a better method, I'm all ears.
I am running a DSC configuration on Azure and would like to map a drive. I've read this really isn't what DSC is for, but I am not aware of any other way of doing this with Azure scale sets outside of DSC. Here's the portion of the script I am running into issues with:
Script MappedDrive
{
    SetScript =
    {
        $pass = "passwordhere" | ConvertTo-SecureString -AsPlainText -Force
        $user = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "username", $pass
        New-PSDrive -Name W -PSProvider FileSystem -Root \\azurestorage.file.core.windows.net\storage -Credential $user -Persist
    }
    TestScript =
    {
        Test-Path -Path "W:"
    }
    GetScript =
    {
        # GetScript must return a hashtable (conventionally with a 'Result' key)
        $hashresults = @{ Result = (Test-Path W:) }
        return $hashresults
    }
}
I've also attempted this code in the SetScript section:
(New-Object -ComObject WScript.Network).MapNetworkDrive('W:','\\azurestorage.file.core.windows.net\storage',$true,'username','passwordhere')
I've also tried a simple net use command to map the drive instead of the New-Object or New-PSDrive cmdlets, with the same behavior.
If I run these commands (New-Object/net use/New-PSDrive) manually with a separate drive letter, the machine maps the drive fine. Somehow the drive mapping is attempted by DSC but never completes.
Troubleshooting I've done:
There is no domain in my environment. I am simply attempting to create a scale set and run DSC to configure the machine using the storage account credentials granted upon creation of the storage account.
I am using the username and password given by the storage account: the user ID and access key (a randomly generated key, with the name of the storage account usually serving as the user).
Azure throws no errors on running the DSC module (no errors in the Event Log, information only; the resource execution sequence properly lists all of my sequences in the DSC file).
When I log into the machine and check whether the drive is mapped, I find a disconnected network drive on the drive letter I want (W:).
If I open PowerShell, I receive an error: "Attempting to perform the InitializeDefaultDrives operation on the 'FileSystem' provider failed."
If I run "Get-PSDrive", the W: drive does not appear.
If I run the SetScript code manually inside a PowerShell console, the mapped drive works fine under a different drive letter.
If I try to disconnect the W: drive, I receive "This network connection does not exist."
I thought maybe DSC needed some time before mapping and added a sleep timer, but that didn't work; same behavior.
I had a similar problem before. While it didn't involve DSC, mounting an Azure File share would be fine until the server was restarted; then it would appear as a disconnected drive. This happened if I used New-Object/net use/New-PSDrive with the persist option.
I found the answer to that issue in the updated docs:
Persist your storage account credentials for the virtual machine
Before mounting to the file share, first persist your storage account
credentials on the virtual machine. This step allows Windows to
automatically reconnect to the file share when the virtual machine
reboots. To persist your account credentials, run the cmdkey command
from the PowerShell window on the virtual machine. Replace
<storage-account-name> with the name of your storage account, and
<storage-account-key> with your storage account key.
cmdkey /add:<storage-account-name>.file.core.windows.net /user:<storage-account-name> /pass:<storage-account-key>
Windows will now reconnect to your file share when the virtual machine
reboots. You can verify that the share has been reconnected by running
the net use command from a PowerShell window.
Note that credentials are persisted only in the context in which
cmdkey runs. If you are developing an application that runs as a
service, you will need to persist your credentials in that context as
well.
Mount the file share using the persisted credentials
Once you have a remote connection to the virtual machine, you can run
the net use command to mount the file share, using the following
syntax. Replace <storage-account-name> with the name of your storage
account, and <share-name> with the name of your File storage share.
net use <drive-letter>: \\<storage-account-name>.file.core.windows.net\<share-name>
example:
net use z: \\samples.file.core.windows.net\logs
Since you persisted your storage account credentials in the previous
step, you do not need to provide them with the net use command. If you
have not already persisted your credentials, then include them as a
parameter passed to the net use command, as shown in the following
example.
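The example referenced above was lost in the quote; reconstructed from the same Azure Files doc (placeholders as before), it passes the credentials inline:
net use <drive-letter>: \\<storage-account-name>.file.core.windows.net\<share-name> /u:<storage-account-name> <storage-account-key>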
Edit:
I don't have a free Azure VM to test it on, but this works fine on a Server 2016 Hyper-V VM:
Script MapAzureShare
{
    GetScript =
    {
        # Nothing useful to report, but GetScript must still return a hashtable
        @{ Result = '' }
    }
    TestScript =
    {
        Test-Path W:
    }
    SetScript =
    {
        Invoke-Expression -Command "cmdkey /add:somestorage.file.core.windows.net /user:somestorage /pass:somekey"
        Invoke-Expression -Command "net use W: \\somestorage.file.core.windows.net\someshare"
    }
    PsDscRunAsCredential = $credential
}
In my brief testing the drive would only appear after the server was rebooted.
What I imagine is happening here:
DSC runs under the NT AUTHORITY\SYSTEM account, and unless the Credential attribute has been set, the computer account is used when pulling files from a network share. Looking at how Azure Files operates, permissions shouldn't be an issue, but running this whole process under NT AUTHORITY\SYSTEM could be. I suggest you try to run DSC as a user of your VM and see if that works; see the sketch below.
P.S. You could also try to perform the same operation against a VM with a network share where you are confident that the share/NTFS permissions are correct. You might need to allow anonymous access to your share for that to work.
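For the run-as-user suggestion, a minimal sketch of wiring up the $credential consumed by PsDscRunAsCredential (untested; the configuration name MapShareConfig and the local account vmuser are hypothetical, and allowing plain-text passwords is for testing only):
$pass = ConvertTo-SecureString 'passwordhere' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential('vmuser', $pass)

# Allow the credential to be embedded in the compiled MOF (testing only;
# production configurations should encrypt credentials with a certificate)
$configData = @{
    AllNodes = @(
        @{
            NodeName                    = 'localhost'
            PsDscAllowPlainTextPassword = $true
        }
    )
}

MapShareConfig -ConfigurationData $configData -OutputPath C:\Dsc
Start-DscConfiguration -Path C:\Dsc -Wait -Verbose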

Powershell Script to Install Certificate Into Active Directory Store

I'm trying to write a PowerShell script to install a certificate into the Active Directory certificate store.
Here are the steps to do this manually; any help would be greatly appreciated.
On a Windows 2008R2 domain controller,
Click Start -> Run
Type MMC
Click OK
Click File -> Add/Remove Snap-In
Select "Certificates" -> Add
Select "Service Account"
Click Next
Select "Local Computer"
Click Next
Select "Active Directory Domain Services"
Click Finish
Click OK
I want the script to install the certificate into:
NTDS\Personal
I would post an image but I don't have enough "reputation" apparently, so I can only provide text instructions.
So basically what I've tried is: I've used the PowerShell function below to import a certificate into the Local Machine -> Personal store, which is where most certificates go, and the code works.
But I need to install the certificate into the "NTDS\Personal" store on a domain controller, and $certRootStore only accepts LocalMachine or CurrentUser, so I'm stuck : /
function Import-PfxCertificate
{
    param
    (
        [String]$certPath,
        [String]$certRootStore = "localmachine",
        [String]$certStore = "My",
        $pfxPass = $null
    )
    $pfx = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
    if ($pfxPass -eq $null)
    {
        $pfxPass = Read-Host "Password" -AsSecureString
    }
    $pfx.Import($certPath, $pfxPass, "Exportable,PersistKeySet")
    $store = New-Object System.Security.Cryptography.X509Certificates.X509Store($certStore, $certRootStore)
    $store.Open("MaxAllowed")
    $store.Add($pfx)
    $store.Close()
}
Import-PfxCertificate -certPath "d:\Certificate.pfx"
Regards, Alex
Using a combination of what you already had above and the registry keys for the two certificate stores, this works.
The only other thing is that I don't know how NTDS determines which certificate to use when there are multiple in the certificate store.
function Import-NTDSCertificate {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [string]$PFXFile,

        [Parameter(Mandatory)]
        [string]$PFXPassword,

        # Remove certificate from LocalMachine\Personal certificate store
        [switch]$Cleanup
    )
    begin {
        Write-Verbose -Message "Importing PFX file."
        $PFXObject = New-Object -TypeName System.Security.Cryptography.X509Certificates.X509Certificate2
        $PFXObject.Import($PFXFile, $PFXPassword, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
        $thumbprint = $PFXObject.Thumbprint
    }
    process {
        Write-Verbose -Message "Importing certificate into LocalMachine\Personal"
        $certificateStore = New-Object -TypeName System.Security.Cryptography.X509Certificates.X509Store('My','LocalMachine')
        $certificateStore.Open('MaxAllowed')
        $certificateStore.Add($PFXObject)
        $certificateStore.Close()

        Write-Verbose -Message "Copying certificate from LocalMachine\Personal to NTDS\Personal"
        $copyParameters = @{
            'Path'        = "HKLM:\SOFTWARE\Microsoft\SystemCertificates\MY\Certificates\$thumbprint"
            'Destination' = "HKLM:\SOFTWARE\Microsoft\Cryptography\Services\NTDS\SystemCertificates\My\Certificates\$thumbprint"
            'Recurse'     = $true
        }
        Copy-Item @copyParameters
    }
    end {
        if ($Cleanup) {
            Write-Verbose -Message "Removing certificate from LocalMachine\Personal"
            $removalParameters = @{
                'Path'    = "HKLM:\SOFTWARE\Microsoft\SystemCertificates\MY\Certificates\$thumbprint"
                'Recurse' = $true
            }
            Remove-Item @removalParameters
        }
    }
}
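A usage example (hypothetical path and password; run elevated on the domain controller):
Import-NTDSCertificate -PFXFile 'D:\Certs\dc01.pfx' -PFXPassword 'P@ssw0rd' -Cleanup -Verbose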
Alright, first the bad news. The only managed certificate stores are LocalMachine and CurrentUser, as we have all seen in PowerShell.
Now, the not-so-bad news. We know that the 'physical' location store (physical is MS's word, not mine) exists in the registry on the ADDS server, under HKLM\Software\Microsoft\Cryptography\Services\NTDS\SystemCertificates. This was verified two ways:
Using procmon while importing a certificate into the store using the MMC snap-in
Scavenging MSDN for this nugget
The link in #2 shows that all physical stores for services live in the path mentioned above, substituting the service's real name (not its display name) for NTDS.
However, because of the bad news, trying to map it in PowerShell with that registry key as the root and -PSProvider Certificate will prove disappointing; it was the first thing I tried.
What one can try is using the X509Store constructor that takes an IntPtr to a system store, as described here. Yes, that involves some unmanaged code, and mixing the two is something I do rarely, but this and googling for HCERTSTORE C# should get you there.
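For the adventurous, a hedged sketch of that unmanaged route (untested; the constants follow wincrypt.h, where CERT_STORE_PROV_SYSTEM_W is provider number 10 and CERT_SYSTEM_STORE_SERVICES is location ID 5 shifted left 16 bits, and 'NTDS\MY' names the NTDS service's personal store):
# Open the NTDS service's MY store via crypt32 and wrap it in X509Store
$signature = @'
[DllImport("crypt32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
public static extern IntPtr CertOpenStore(
    IntPtr lpszStoreProvider,
    uint dwMsgAndCertEncodingType,
    IntPtr hCryptProv,
    uint dwFlags,
    string pvPara);
'@
Add-Type -MemberDefinition $signature -Name NativeMethods -Namespace Crypt32

$CERT_STORE_PROV_SYSTEM_W   = [IntPtr]10    # wincrypt.h provider number
$CERT_SYSTEM_STORE_SERVICES = 5 -shl 16     # service-store location ID

$handle = [Crypt32.NativeMethods]::CertOpenStore(
    $CERT_STORE_PROV_SYSTEM_W, 0, [IntPtr]::Zero,
    $CERT_SYSTEM_STORE_SERVICES, 'NTDS\MY')
if ($handle -eq [IntPtr]::Zero) { throw 'CertOpenStore failed' }

# Wrap the native handle in the managed store and add the certificate
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store($handle)
$store.Add($pfxObject)    # $pfxObject: an X509Certificate2 imported elsewhere
$store.Close()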
Even though this post is years old, it is still helpful and turns up in searches. To address the question of "I don't know how NTDS determines which certificate to use when there are multiple in the certificate store": you will get unreliable results when two or more valid certificates meet the requested criteria, so it is recommended to remove the old/unneeded certificate(s) and leave only the newest/best one for server auth.

Strange outcome from running Importpfx.exe to import certificates

I am writing a PowerShell script to automate setting up a Windows 2008 R2 server, and one required step is importing several certificates into different stores. After doing some research on how best to achieve this, I found that Importpfx.exe was the best choice for what I am aiming to do: import one .pfx file into the Trusted People store and another .pfx file into the Personal store, both for the computer account. I then also need to manage private keys on the certificate imported into the Personal store once it has been imported.
At first, I thought Importpfx.exe was doing this correctly, but after researching how to manage the private keys via PowerShell, I learned that this can be done by editing the ACL for the file that corresponds to the imported certificate, which should be found in "C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys". This is where I started to notice that something wasn't quite right with the imported certificate: after searching this folder for a new file after importing the certificates, I noticed that no new files had been added.
I searched the entire C: drive for all files sorted by date modified and found that new files had been added to "C:\Users\'user'\AppData\Roaming\Microsoft\Crypto\RSA\S-1-5-21-2545654756-3424728124-1046164030-4917" instead of the expected folder. While I was able to manually manage private keys for the certificate via the certificate store (as the user who imported it), no other user could log onto the machine and manage the private keys; they got the error message "Cannot find the certificate and private key for decryption" (which would make sense given the folder the corresponding file lives in).
I use a function to get the thumbprint of the certificates before trying to import the .pfx file. The code I have used is:
function GetCertificateThumbprint ([string]$certPreFix, [string]$certPassword, [string]$certFolder, [string]$domain, [bool]$addIfNotFound, [hashtable]$return)
{
    $storePath = "cert:\LocalMachine"
    $storeDir = "My"
    $storeName = [System.Security.Cryptography.X509Certificates.StoreName]::My
    if ($certPreFix -eq "XXX")
    {
        $storeDir = "TrustedPeople"
        $storeName = [System.Security.Cryptography.X509Certificates.StoreName]::TrustedPeople
    }
    $storePath = [System.IO.Path]::Combine($storePath, $storeDir)

    # Build the certificate file name and get the file
    $certFileName = $certPreFix + "." + $domain + ".*"
    $certFile = Get-ChildItem -Path $certFolder -Include $certFileName -Recurse
    if ($certFile)
    {
        # The certificate file exists, so get the thumbprint
        $Certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($certFile.FullName, $certPassword)
        $certThumbprint = $Certificate.Thumbprint
        if ($addIfNotFound)
        {
            # Check for the certificate's thumbprint in the store and add it if it does not already exist
            if (-not (Get-ChildItem $storePath | Where-Object { $_.Thumbprint -eq $certThumbprint }))
            {
                Set-Location "$Env:windir\Tools"
                .\importpfx.exe -f $certFile -p $certPassword -t MACHINE -s $storeDir
            }
        }
    }
}
Can anyone see if I have done anything wrong? Has anyone come across this issue and gotten around it somehow? This is causing me problems, as I cannot automate the manage-private-keys task properly!
I just ran into the same problem. You must specify the MachineKeySet X509KeyStorageFlag when creating the certificate object:
New-Object system.Security.Cryptography.X509Certificates.X509Certificate2($certFile, $certPassword, "PersistKeySet,MachineKeySet")
Hopefully that helps someone.
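A quick hedged check of the fix (assumes a legacy CSP key, which Importpfx.exe-era PFX files typically contain): with MachineKeySet, the key container file should now land under ProgramData rather than the importing user's roaming profile:
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($certFile, $certPassword, "PersistKeySet,MachineKeySet")
$containerName = $cert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
Test-Path "$env:ProgramData\Microsoft\Crypto\RSA\MachineKeys\$containerName"   # expect True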

Private keys get deleted unexpectedly in Windows Server 2008 R2

I am facing a strange problem developing an installer that should, in one of its steps, install a certificate.
The problem has to do with granting an account (e.g. IIS_IUSRS) access to a certificate's private key on Windows Server 2008 R2. The private keys are stored in C:\Users\All Users\Microsoft\Crypto\RSA\MachineKeys.
A custom C# setup project imports a certificate and grants an account access to the certificate's private key during the installation process. After some time (2-3 seconds) the private key file is automatically deleted from the MachineKeys folder, so the installed web application cannot access the certificate and displays the following error message:
"System.Security.Cryptography.CryptographicException: Keyset does not exist". This error occurs only on Windows Server 2008 R2; on Windows Server 2003 everything works correctly.
My question is: why does the private key get deleted, and which process does this?
Thx
UPDATE 17/05/2012
I have not yet found a solution to the described problem, and no response has been posted on the other forums where I asked (forums.asp.net, social.msdn.microsoft.com). So, can anyone suggest any other resources or advice for further troubleshooting this issue?
Thanks again
This was happening to me too: my setup script would add the cert and grant access to the PK file fine, and the app would work. Then later, after I had closed the PowerShell editor, I re-launched the app and it failed with "keyset does not exist".
Adding the PersistKeySet flag when importing the cert fixed the problem. Here's the PowerShell code for adding the cert and private key with persistence:
param(
    [string]$certStore = "LocalMachine\TrustedPeople",
    [string]$filename = "sp.pfx",
    [string]$password = "password",
    [string]$username = "$Env:COMPUTERNAME\WebSiteUser"
)

function getKeyUniqueName($cert) {
    return $cert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
}

function getKeyFilePath($cert) {
    return "$ENV:ProgramData\Microsoft\Crypto\RSA\MachineKeys\$(getKeyUniqueName($cert))"
}

$certFromFile = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($filename, $password)
$certFromStore = Get-ChildItem "Cert:\$certStore" | Where-Object { $_.Thumbprint -eq $certFromFile.Thumbprint }
$certExistsInStore = $certFromStore.Count -gt 0
$keyExists = $certExistsInStore -and
    ($certFromStore.PrivateKey -ne $null) -and
    (getKeyUniqueName($certFromStore) -ne $null) -and
    (Test-Path (getKeyFilePath($certFromStore)))

if ((!$certExistsInStore) -or (!$keyExists)) {
    $keyFlags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::MachineKeySet
    $keyFlags = $keyFlags -bor [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::PersistKeySet
    $certFromFile.Import($filename, $password, $keyFlags)
    $store = Get-Item "Cert:\$certStore"
    $store.Open("ReadWrite")
    if ($certExistsInStore) {
        # Cert is in the store, but we have no persisted private key.
        # Remove it so we can add the one we just imported with the key file.
        $store.Remove($certFromStore)
    }
    $store.Add($certFromFile)
    $store.Close()
    $certFromStore = $certFromFile
    "Installed x509 certificate"
}

$pkFile = Get-Item (getKeyFilePath($certFromStore))
$pkAcl = $pkFile.GetAccessControl("Access")
$readPermission = $username, "Read", "Allow"
$readAccessRule = New-Object System.Security.AccessControl.FileSystemAccessRule $readPermission
$pkAcl.AddAccessRule($readAccessRule)
Set-Acl $pkFile.FullName $pkAcl
"Granted read permission on private key to web user"
It is clearly a security issue ("System.Security..."): you do not have permission to do the installation. You need to set the permissions on the private key so that the service account can access it.
Edit later: Go to Start -> Run -> type mmc -> select File -> Add/Remove -> select Certificates -> Add -> Computer Account -> Local. (I attached a screenshot; it is in Spanish, but I indicated the fields.)
Open -> Certificates -> Personal -> Certificates -> right-click the certificate -> All Tasks -> Manage Private Keys -> add Network Service.
Also check this entry to see how this feature works in Windows Server 2008. After you try it, please come back and say whether you could solve the issue with what I have told you.
http://referencesource.microsoft.com/#System/security/system/security/cryptography/x509/x509certificate2collection.cs,256 shows where the PersistKeySet flag is tested. The PersistKeySet flag is documented at https://msdn.microsoft.com/en-us/library/system.security.cryptography.x509certificates.x509keystorageflags%28v=vs.110%29.aspx with the phrase "The key associated with a PFX file is persisted when importing a certificate." My techno-babble-to-English translator tells me this means: "You must include the PersistKeySet flag if you call the X509Certificate2 constructor and the certificate might already be installed on the machine." This probably applies to the .Import calls too. It's likely the PowerShell Import-PfxCertificate cmdlet already does this. But if you are doing what the accepted answer shows or what the OP asked, you need to include this special flag. We used a variation of ejegg's script in our solution. We have a process that runs every 3 minutes to check that all configured certs are installed, and this seems to work fine now.
The symptom we saw in PowerShell was that the HasPrivateKey property is true but the PrivateKey value itself is null, and the key file for the cert in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys was deleted. The FindPrivateKey utility at https://msdn.microsoft.com/en-us/library/aa717039(v=vs.110).aspx helped us watch the file get deleted.
So happy 4th birthday to the question with this very late response.