How to create a CSR using Google Cloud KMS?

I want to use Google Cloud KMS for asymmetric signing. To complete setup with the destination provider I need to send them a CSR signed with the private key stored in Google. I've found some examples of doing this in Java or Go, but I don't need to do it programmatically and I don't know those languages anyway. Ideally I'm looking for something command-line based using the SDK.

I gave up doing it this way; instead I generated the private key and CSR on a trusted local machine and then imported the key into Google Cloud KMS.
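For anyone taking the same fallback route, the local generation step can be sketched with openssl; the gcloud import commands are shown only as hedged comments, since the exact import-job flags depend on your key ring setup (all resource names below are placeholders):

```shell
# Generate a 2048-bit RSA key and a CSR for it on a trusted machine.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out key.pem
openssl req -new -key key.pem -subj '/CN=example.com' -sha256 -out request.csr

# Sanity-check the CSR's self-signature before sending it off.
openssl req -in request.csr -verify -noout

# Importing the key into Cloud KMS then goes through an import job
# (hedged sketch; check the gcloud docs for the exact flags):
#   gcloud kms import-jobs create my-import-job --location us-east1 \
#       --keyring my-keyring --import-method rsa-oaep-3072-sha1-aes-256 \
#       --protection-level hsm
#   gcloud kms keys versions import --import-job my-import-job ...
```

If signing must stay inside KMS, Google also publishes a PKCS#11 library (libkmsp11) that can plug into openssl req so the CSR is signed by the KMS-held key directly; that may be worth a look before falling back to local generation.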

Related

Can I safely save my private key on public build servers?

This is a technical question on comprehension:
When I want a cloud server (e.g. GitHub Actions, Azure DevOps, or GitLab CI/CD) to build and publish an app to any of the app stores ... isn't it necessary then that I upload my private key to these servers' key vault, so they can sign my app on my behalf?
Isn't that concept a bit risky?
I mean, I was taught to never let my private keys leave my machines.
What if I accidentally misconfigure the security settings on the uploaded key? What if some black hat gets hold of the key and abuses it? I mean, with each build process, the private key is getting copied from the vault to the build runner, usually residing somewhere else.
What are the techniques used to ensure that private keys are kept safe on a public server? Is there an official audit performed on these providers?
Should I rather use different Authenticode certificates for each of the above providers? Or will a single certificate be resilient enough?
I couldn't find a technical discussion on this question, only marketing docs. Has this security concern been scientifically scrutinized?
It is safe to use certificates in Azure DevOps; Azure DevOps encrypts the certificate and then uses it in the build pipeline.
You can use Azure Key Vault to protect encryption keys and secrets like certificates, connection strings, and passwords in the cloud. You can refer to this doc for more details.
You can also save the certificate file under Library -> Secure Files; anyone who wants to access or use it needs sufficient permission.
I found some samples of using a certificate in Azure DevOps Services, which you could check:
If you are using an SSL certificate thumbprint, you can save it in a variable and mark the variable as secret via the "lock" icon at the end of the row.
If you are using a PFX certificate, you can refer to this doc.
Update 1
GitHub Actions
You can save the certificate as a secret in GitHub; GitHub also encrypts it. The value cannot be read back after it is saved in Secrets, and it is masked as *** when printed in the log.
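A common pattern behind all of these secret stores, sketched here under the assumption that the certificate is a binary PFX file: base64-encode it so it can be stored as a text secret, then decode it on the runner at build time (file and variable names are illustrative; the first line just manufactures a stand-in file):

```shell
# Stand-in for a real certificate file.
head -c 64 /dev/urandom > signing-cert.pfx

# One-time: encode the PFX as text so it fits in a secret variable
# (e.g. a GitHub Actions secret or an Azure DevOps secret variable).
base64 < signing-cert.pfx > signing-cert.pfx.b64

# On the build runner: reconstruct the PFX from the injected secret
# (here read from a file; in CI it would arrive as an environment variable).
SIGNING_CERT_B64=$(cat signing-cert.pfx.b64)
printf '%s' "$SIGNING_CERT_B64" | base64 -d > restored-cert.pfx
```

The decoded file should be written to a temporary path on the runner and deleted after signing, so the key material never persists on the build machine.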

How to bootstrap certificates for the LCM to reference for signature verification settings?

WMF 5.1 includes new functionality to allow signing of MOF documents and DSC Resource modules (reference). However, this seems very difficult to implement in reality -- or I'm making it more complicated than it is...
My scenario is VMs in Azure and I'd like to leverage Azure Automation as the DSC pull server; however, I see this applying on-premises too. The problem is that the certificate used to sign the MOF configurations and/or modules needs to be placed on the machine before it fetches and applies the configuration; otherwise the configuration will fail because the certificate isn't trusted or present on the machine.
I tried using Azure KeyVault to bootstrap the certificate (just the public key because that's my understanding of how signing works) and that fails using Add-AzureRmVMSecret because the CertificateUrl parameter expects a full certificate with the public/private key pair to install. In an ideal world, this would be the solution but that's not the case...
Other ideas, again in this context, would be to upload the cert to blob storage, use a CustomScriptExtension to pull down the cert and install into the LocalMachine store but that feels nasty as well because, ideally, that script should be signed as well and that puts us back in the same spot.
I suppose another idea would be to first PUSH a configuration that downloaded and installed certificates only but that doesn't sound great either.
Last option would be to rely on an AD GPO or something similar to potentially push the certificate first...but, honestly, trying to move away from much of that if/when possible...
Am I off-base on this? It seems like this should be a solvable problem -- just looking for at least one "good" way of doing it.
Thanks
David Jones has quite a bit of experience dealing with this issue in an on-premises environment, but as you stated the same concepts should apply to Azure. Here is a link to his blog. This is a link to his GitHub site with a PKITools module that he created. If all else fails you can reach out to him on Twitter.
While it's quite easy to populate a pre-booted image with public certificates, it's not possible (that I have found) to populate the private key.
DSC would require the private key to decrypt the passwords.
The most common tactic people blog about is to use the unattend file to script the import of a PFX. The issue there is that you have to leave the password for the PFX in plain text. Perhaps that is OK in your environment.
The other option requires a more complicated setup. Use a simple DSC configuration or GPO to auto-enroll a unique certificate. Then have the system, via a first-boot script or a DSC custom resource, call an API (like Polaris) that triggers a DSC script which uses PKITools or another script to fetch the machine's public certificate. Then have that API push a new DSC configuration (or pull settings) to the machine.

Azure Service Fabric, KeyVault, SSL Certificates

I want to secure my own HTTPS end point (node.js express.js server) with a certificate which I have deployed to the cluster (that is, it exists in Cert:\LocalMachine\My).
I of course want to avoid having my certificate in source control. I can't use an EndpointBindingPolicy in the ServiceManifest because as far as I'm aware that is just for http.sys(?) based systems, which this isn't.
What I thought is perhaps run a SetupEntryPoint script to:
grab the certificate from the store
export it as a pfx with a random passphrase (or some appropriate format)
copy it to {pkgroot}/certs/ssl_cert.pfx
replace some sort of token in serverinit.js with the random passphrase
This way the code base doesn't need to have the certificate present; it just needs to trust that it will be there when the service is run.
However I don't think I can do this, if it even is a sensible idea, as the certificates in the store are marked such that the private key is non-exportable! Or, at least, they are with my RDP account!
Is there a way to export the certificate with its private key?
What are my options here?
I ended up writing a PowerShell script which runs in my release pipeline; its arguments are clientId, clientSecret and certificateName. clientSecret is stored as a protected environment variable for my agent.
Create new application registration under same subscription as KeyVault (which should be same as SF Cluster) (e.g. in portal.azure.com)
Note down app ID
Create app secret
Modify KeyVault ACL with App as principal, set get only on secrets
Use the REST API with the client ID and secret: https://learn.microsoft.com/en-us/rest/api/keyvault/getsecret
I chose this over grabbing the certificate in the SetupEntryPoint, for example, as it hides the client secret better from the open world (e.g. developers who shouldn't/don't need access to it).
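The token-then-secret flow from the steps above can be sketched with curl; everything here (tenant, vault, and secret names, API version) is a placeholder, and the crude sed parse exists only to avoid a jq dependency:

```shell
# Crude JSON field extraction so the sketch has no jq dependency.
extract_json() { sed -n "s/.*\"$1\":\"\([^\"]*\)\".*/\1/p"; }

# Get an AAD access token using the app registration's id and secret.
get_token() {
  curl -s -X POST "https://login.microsoftonline.com/$TENANT_ID/oauth2/token" \
    --data-urlencode "grant_type=client_credentials" \
    --data-urlencode "client_id=$CLIENT_ID" \
    --data-urlencode "client_secret=$CLIENT_SECRET" \
    --data-urlencode "resource=https://vault.azure.net" |
    extract_json access_token
}

# Read a secret (e.g. the certificate uploaded as a secret) from the vault.
get_secret() {  # usage: get_secret <vault-name> <secret-name>
  curl -s -H "Authorization: Bearer $(get_token)" \
    "https://$1.vault.azure.net/secrets/$2?api-version=7.4" |
    extract_json value
}
```

A certificate uploaded to Key Vault comes back from the secrets endpoint as a base64-encoded PFX in the "value" field, so the caller still has to decode it before use.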

save window.crypto generated private key in the browser keystore?

We are trying to implement the following workflow:
generate private key in browser, using window.crypto
create a PKCS10 certificate signing request in the browser
send the PKCS10 to a server
the server signs the request and returns an x509 certificate in PEM format
the browser stores the certificate for itself
The same thing already works using the keygen tag in the browser and using SPKAC instead of PKCS10. Now, however, the browser does not store the returned certificate; it just offers to save it as a file. When we try to import the certificate into the browser by hand, we get "the private key for the certificate is missing or invalid".
We suspect that the private key generated by window.crypto.generateKey() does not get stored in the browser's keystore. How to get the private key stored in the keystore?
The implementation of the first two steps is based on http://blog.engelke.com/2014/08/23/public-key-cryptography-in-the-browser/
Update: As some browsers use the OS keystore, I am also looking into the possibility to save the key into the OS keystore through some other way.
What I have figured out so far:
Java cannot be used according to this question: Tell Java to use Windows keystore
In Windows one can use ActiveX controls.
Summary: I found no standard cross-browser and cross-OS way to generate and meaningfully use X509 certificates. There are combinations (new Chrome versions, which dropped keygen support, on a non-Windows OS) where there is no way to do this.

Loading a server-side certificate *and* a private key from Windows Server cert store?

I'm trying to call an external REST web service that requires both a server-side certificate and a private key (both of which I got from the publisher of that service as *.pem files).
For my testing, I googled and found a way to combine these two pieces into a *.pfx file - and loading a X509Certificate2 instance from that binary file on disk works just fine.
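For reference, the PEM-to-PFX combination can be sketched with openssl; the first command just manufactures a throwaway self-signed pair to stand in for the publisher's two *.pem files:

```shell
# Stand-ins for the publisher's files: a throwaway self-signed pair.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj '/CN=test' \
    -keyout private-key.pem -out certificate.pem

# Combine the certificate and private key into one password-protected
# PFX (PKCS#12) bundle, suitable for import into the Windows cert store.
openssl pkcs12 -export -in certificate.pem -inkey private-key.pem \
    -out combined.pfx -passout pass:changeit

# Sanity check: the bundle opens with the chosen password.
openssl pkcs12 -in combined.pfx -passin pass:changeit -noout
```

When importing combined.pfx, make sure it goes into the same store (CurrentUser vs LocalMachine) that the code later opens, or the Find call will come back empty.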
Now I was trying to put this into the Cert Store on my production Windows Server 2008.
I can get the X509Certificate2 from the cert store in my C# code - no problem:
X509Store store = new X509Store(StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);

// Find the certificate by its serial number.
X509Certificate2Collection certs =
    store.Certificates.Find(X509FindType.FindBySerialNumber, "serial-number-here", false);

if (certs.Count > 0)
{
    X509Certificate2 cert = certs[0];

    // Set the certificate on the RestClient to call my REST service.
    _restClient.ClientCertificates.Add(cert);
}

store.Close();
But when I do this, then the web service barfs at me, claiming it needs a "SSL certificate"...
Also: when I was loading the X509Certificate2 from disk, from that *.pfx file, I had to provide a password - nothing needs to be provided here, when loading from the cert store.... odd....
It seems that even though I imported the *.pfx which contains both the server-side certificate and our private key, somehow I cannot get both back from the cert store...
Any idea how I can get this to work? Do I need to load the private key from the cert store in a second step? How?
These certificates still remain a big voodoo-like mystery to me... can anyone enlighten me?
The first thing to check is whether the certificate store actually has the private key.
Open up the certificate management snap-in, find your certificate, double-click it, and make sure it indicates that you have a private key corresponding to the certificate.
Next, if the private key is in the store, then maybe the account accessing the certificate does not have permissions on the private key. There are two ways to check this:
In the certificate management snap-in, right-click the certificate > All Tasks > Manage Private Keys. (You should be able to check and edit permissions here.)
In your code you could access the PrivateKey property (i.e. do var privateKey = cert.PrivateKey and see whether you get it back).
You did not write how the web service is implemented:
whether it is deployed on IIS
whether it is self-hosted
Your code to get the certificate from the store is correct. The question is where you imported the PFX - the CurrentUser or LocalMachine store. You are using the CurrentUser store in the code example; if you imported the certificate into the LocalMachine store it will not be found. Also, please specify the store name - StoreName.My (in MMC or certmgr.msc it appears as Personal) - in the constructor of X509Store (it might be the default, but who knows all the defaults anyway :) )
But when I do this, then the web service barfs at me, claiming it needs a "SSL certificate"...
You need to ensure you have Client Authentication in the extended key usage of the certificate.
Also: when I was loading the X509Certificate2 from disk, from that *.pfx file, I had to provide a password - nothing needs to be provided here, when loading from the cert store... odd...
That's how it works. When you have a PFX, the private key in it is secured/encrypted with a password (the password can be an empty string). When you import the PFX into the certificate store, the private key is protected with another key instead (via Windows DPAPI). However, you can add another level of protection to the private key by enabling strong protection when importing the PFX into the store (I do not recommend it for ASP.NET, web services, or anything else that does not have a desktop). But when it is your personal certificate for signing emails, it might be good to enable it; Windows will then pop up a window whenever an application tries to use the private key.
#DanL might be right about the rights to the private key, and his
1) - set rights on the private key, and
2) - accessing the private key via X509Certificate2
are written OK. I would just add to 1) that if you are trying to connect to the REST service from an ASP.NET application or another web service on IIS, then the name of the account that you need to add permission for is IIS APPPOOL\name_of_the_apppool_your_app_runs_under.