Is there a way to combine Microsoft Serialized Certificate Store (SST) Files? - powershell

I am trying to import multiple certificates to Outlook365 so that the Outlook web client will trust those S/MIME certificates. For reference, the command I'm using to import the SST file to Outlook365 is as follows:
Set-SmimeConfig -SMIMECertificateIssuingCA (Get-Content C:\Users\chaos\OneDrive\Documents\Personal\Folders\keys\AllImportantKeys.sst -Encoding Byte)
However, this command appears to replace the key store with the contents of the SST file, rather than appending the keys in the SST file.
The problem is that you can only export multiple certificates to an SST file if they are in the same folder in the Windows Certificate Manager. So, if you want to create an SST file that contains multiple certificates in multiple folders (for example, Current User\Personal\Certificates and Current User\Trusted Root Certification Authorities\Certificates), you are out of luck.
I have exported the certificates I need from four folders to four separate SST files. Is there a way to combine or merge them? I tried a simple file concatenation with Get-Content piped into Set-Content, but I get a bad format error when trying to import the SST file to Outlook365.
If SST files can't be easily combined, is there another workaround? For example, a way to force Outlook365 to append rather than replace? Or, a way to export all the keys I need at once (from multiple folders) to the SST file?
Thanks!

I figured out that Get-Content can accept a comma-delimited list of files, so I was able to load multiple SST files into the Outlook365 store by simply doing this:
Set-SmimeConfig -SMIMECertificateIssuingCA (Get-Content C:\Keys1.sst, C:\Keys2.sst, C:\Keys3.sst -Encoding Byte)
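If the goal is to avoid juggling separate SST files entirely, another option may be the PKI module's Export-Certificate cmdlet, which can write certificates from several stores into a single SST file. A rough sketch (the store paths and output path below are just examples, not the ones from the question):
# Gather certificates from several stores and export them into one SST file.
# Cert:\CurrentUser\My and Cert:\CurrentUser\Root are placeholder store paths.
$certs = Get-ChildItem -Path Cert:\CurrentUser\My, Cert:\CurrentUser\Root
$certs | Export-Certificate -FilePath C:\Keys\Combined.sst -Type SST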

Related

How to generate one hash key for a directory in PowerShell?

This is what I am doing on Linux:
cat a-directory/* | md5
What would be the alternative in PowerShell, maybe something with CertUtil?
The reason I am doing this is that I want to verify that a large directory has been copied correctly.
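Not an exact equivalent of cat a-directory/* | md5, but one possible PowerShell sketch is to hash each file and then hash the list of per-file hashes. The resulting digest will differ from the Linux one, yet it is stable enough to check that a copied directory matches the original (the directory path is a placeholder):
# Hash each file, then hash the combined per-file hashes.
$fileHashes = Get-ChildItem -Path .\a-directory -File |
    Sort-Object Name |
    Get-FileHash -Algorithm MD5 |
    Select-Object -ExpandProperty Hash

$md5 = [System.Security.Cryptography.MD5]::Create()
$bytes = [System.Text.Encoding]::UTF8.GetBytes($fileHashes -join '')
[System.BitConverter]::ToString($md5.ComputeHash($bytes)) -replace '-', ''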

How can we use "ls" command for appending file list from FTP to local file?

I'm using the command ls *Pattern*Date* Files.txt to get a list of files from FTP into my local text file.
I'm now required to get multiple patterned files (the Date and Pattern may come in a different order). So when I try to add another line, ls *Date*Pattern* Files.txt, this clears Files.txt and I'm not able to get the first set of files.
Is there any command that can append the list of files rather than creating new files list?
You cannot append the listing to a file with ftp.
But you can merge multiple listings in PowerShell. I assume that you run ftp from PowerShell, based on your use of the powershell tag.
In the ftp script, do:
ls *Pattern*Date* files1.txt
ls *Date*Pattern* files2.txt
And then in PowerShell do:
Get-Content files1.txt,files2.txt | Set-Content files.txt
(based on How do I concatenate two text files in PowerShell?)
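If you would rather end up with just one file, an alternative sketch is to append the second listing onto the first with Add-Content (assuming both listing files already exist locally):
# Append the second listing to the first instead of writing a third file.
Get-Content files2.txt | Add-Content files1.txt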

adding SSL automation task to pipeline

I've created a PowerShell script that sets up SSL based on a provided PFX file.
Using the VSTS pipeline, what is the recommended way of passing the PFX file to the script?
1. Including the PFX file in the solution
2. Getting the PFX file path on a target environment (this introduces a dependency: it assumes the PFX file is already placed on the target environment)
3. Any other solution...?
The common way to pass the certificate to the script is option 1 (including the PFX file in the solution), as you listed.
After adding the .pfx file to your solution, you can import the certificate and private key with Import-PfxCertificate.
For detailed usage and examples of Import-PfxCertificate, refer to its documentation.
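A minimal sketch of what that import could look like; the path, store location, and password variable are placeholders for whatever your pipeline actually provides:
# Hypothetical example: import a PFX shipped with the solution into the
# machine's personal store. Path and password source are placeholders.
$pfxPath  = Join-Path $env:BUILD_SOURCESDIRECTORY 'certs\site.pfx'
$password = ConvertTo-SecureString -String $env:PFX_PASSWORD -AsPlainText -Force
Import-PfxCertificate -FilePath $pfxPath -CertStoreLocation Cert:\LocalMachine\My -Password $password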

Signing a folder and/or contents with OpenSSL error

I have written a PowerShell script that reads the contents of a USB drive, reads the files one by one and signs them in turn, generating a corresponding .sha256 file for each one.
I am having trouble getting this to either read files within subfolders or just to sign the folders on the root.
The command I am using to sign files is as follows:
C:\openssl\openssl.exe dgst -sha256 -sign $PriKey -out $path"\"$file".sha256" -passin pass:<password> $path"\"$file
When it gets to a folder, it gives a 'Permission Denied' error (The user is a full administrator with full access to everything).
Does anyone know how I can get OpenSSL to either sign a folder (I realise folders are not conventional files) or to have Powershell read the contents of subfolders and sign the files contained there as per above?
I am relatively new to both.
Regards,
Jose
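One way to handle the subfolder part might be to let PowerShell recurse and hand only files (never folders) to OpenSSL. A rough sketch, assuming the $PriKey variable from the question; the drive letter and passphrase placeholder are assumptions to adjust for your setup:
# Recurse through the drive, skip directories, and sign each regular file.
# 'E:\' and 'pass:<password>' are placeholders.
Get-ChildItem -Path 'E:\' -Recurse -File | ForEach-Object {
    & 'C:\openssl\openssl.exe' dgst -sha256 -sign $PriKey -out "$($_.FullName).sha256" -passin 'pass:<password>' $_.FullName
}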

Powershell: Copy-Item -Recurse -Force is not copying all sub files

I have a one-liner that is baked into a larger script for some high-level forensics. It is just a simple Copy-Item command that writes the destination folder and its contents back to my server. The code works great, BUT even with the switches:
-Recurse -Force
it is not returning the file with an extension of .dat. As you can guess, I need the .dat file for analysis. I am running this from a privileged account. My only thought was that it is a read/write conflict and the host file is currently in use (or some other system file is). What switch am I missing? The "mode" for the file that will not copy over is -a---. Not hidden, just not copying. Suggestions elsewhere have said to use xcopy/robocopy; if possible I do not want to call another dependency, since I'm already using PowerShell for the majority of the script and I'd prefer to stick with it. Any thoughts? Thanks in advance, this one has been tickling my brain for a little while...
The only way to copy a file that is in use is to find the locking handle, close it, and then retry the copy operation (for example, with handle.exe).
From your question it looks like you are trying to remotely copy user profiles, which include ntuser.dat and other files that are needed to keep the profile working properly. Even if you did manage to find a way to unload the .dat file(s), you would have to consider the impact that would have on the remote system.
Shadow copies are typically used by backup programs to copy files that are in use, so your best bet would be to find the latest backup of each remote computer and try to extract the needed files from the backed-up copies, or wait for the users to log off and then try again.
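If simply skipping locked files and logging them is acceptable for this kind of collection, a sketch along these lines might help; the source and destination paths are placeholders:
# Copy everything that can be read; warn about files that are locked or
# otherwise inaccessible instead of failing. Paths are placeholders.
$source = '\\RemotePC\c$\Users\SomeUser'
$dest   = '\\Server\Forensics\SomeUser'
Get-ChildItem -Path $source -Recurse -Force -File | ForEach-Object {
    $src    = $_.FullName
    $target = $src.Replace($source, $dest)
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    try {
        Copy-Item -LiteralPath $src -Destination $target -Force -ErrorAction Stop
    } catch {
        Write-Warning "Skipped locked or inaccessible file: $src"
    }
}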