curl: Failed to import cert file client.crt in Command Prompt and PowerShell, works fine in Git Bash

I am using Windows...
When I run the following curl command through Git Bash, it works fine:
curl --cacert ca.crt --key client.key --cert client.crt "https://myurl"
However, if I try to run the same command in Command Prompt or PowerShell, I get this error:
curl: (58) schannel: Failed to import cert file client.crt, last error is 0x80092002
What do I need to do to get the command working in Command Prompt or Powershell?

The Windows build of curl.exe is not configured to use OpenSSL, but Git's build is.
So, to make sure that typing 'curl' at a prompt uses Git's version, I added the path to Git's curl (C:\Program Files\Git\mingw64\bin) to the system PATH environment variable and moved it right to the top, so Git's curl is found before the Windows one.
After restarting the command prompt, the issue was resolved.
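A quick way to confirm which curl a given shell resolves is to check its version banner, since the two builds report different TLS backends (a sketch; the exact output depends on your builds):

```shell
# Print curl's version line; Git's bundled curl reports OpenSSL as its TLS
# backend, while the stock Windows curl reports Schannel.
curl -V
# On Windows, `where curl` (cmd) or `Get-Command curl` (PowerShell) shows
# which binary is first on PATH.
```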

You are providing your client certificate in the wrong format. curl requires the certificate in the PEM format (source):
-E/--cert <certificate[:password]>
(SSL) Tells curl to use the specified certificate file when getting a file with
HTTPS or FTPS. The certificate must be in PEM format. If the optional password
isn't specified, it will be queried for on the terminal. Note that this option
assumes a "certificate" file that is the private key and the private
certificate concatenated! See --cert and --key to specify them independently.
If curl is built against the NSS SSL library then this option can tell curl the
nickname of the certificate to use within the NSS database defined by the
environment variable SSL_DIR (or by default /etc/pki/nssdb). If the NSS PEM
PKCS#11 module (libnsspem.so) is available then PEM files may be loaded. If you
want to use a file from the current directory, please precede it with "./"
prefix, in order to avoid confusion with a nickname.
If this option is used several times, the last one will be used.
Your certificate might be in the DER format, or it might contain a whole certificate chain instead of just your single client certificate.
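If the certificate turns out to be DER-encoded, OpenSSL can convert it to PEM (a sketch assuming OpenSSL is installed; the demo cert stands in for your real client.crt):

```shell
# For demonstration only, create a DER-encoded cert (in the real case,
# client.crt already exists):
openssl req -x509 -newkey rsa:2048 -nodes -keyout client.key \
  -subj "/CN=demo" -days 1 -outform DER -out client.crt
# A PEM certificate is text starting with "-----BEGIN CERTIFICATE-----";
# DER is binary. Convert DER to PEM:
openssl x509 -inform DER -in client.crt -out client.pem
head -n 1 client.pem
# Then point curl at the PEM copy:
# curl --cacert ca.crt --key client.key --cert client.pem "https://myurl"
```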

The curl manpage describes that on Windows, curl uses the Schannel provider by default (which itself uses the Windows certificate store). I am on the same errand now :-) trying to find a way to pass the certs from the command line and from local files.
Perhaps try importing the certs into the Windows store.

On our Windows 2019 server we have two curl.exe binaries.
By default, version 7.83.1 was invoked.
The issue was solved by using version 7.54.1, calling it by its full path.

Related

Where does "ipsec import" store certificate file?

I'm now setting up a libreswan server and client.
Basically, I'm trying to follow a procedure described here.
https://kifarunix.com/setup-ipsec-vpn-server-with-libreswan-on-centos-8/
I created a client certificate, aaa.bbb.p12, on the server machine using the pk12util command.
I copied it to the client machine and imported it with ipsec import aaa.bbb.p12.
The ipsec import aaa.bbb.p12 command was successful.
But I don't know where the file is stored when the ipsec import command is executed.
Is there a way I can browse this certificate file with some command?
I found a partial solution, but it's not perfect.
I copied aaa.bbb.p12 and used the ipsec command like below:
# ipsec import ./aaa.bbb.p12 --nssdir /etc/ipsec.d/certsdb
Then, I can see the certificate using the command below.
# certutil -L -d sql:/etc/ipsec.d/certsdb
But I still have one more problem.
If I import one more certificate file, such as aaa.ccc.p12, it is imported, but the listing does not show that certificate's name.
Even though I imported both aaa.bbb.p12 and aaa.ccc.p12, the command below shows only aaa.bbb, twice:
# certutil -L -d sql:/etc/ipsec.d/certsdb
Certificate Nickname                          Trust Attributes
                                              SSL,S/MIME,JAR/XPI
aaa.bbb                                       u,u,u
aaa.bbb                                       u,u,u
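The duplicate nicknames suggest both .p12 files carry the same embedded friendlyName, which NSS uses as the nickname on import. One workaround sketch, using openssl instead of pk12util (file names, subject, and password here are placeholders): set a distinct friendlyName with -name when building each .p12.

```shell
# Build the second client's .p12 with an explicit friendlyName (-name);
# a throwaway key and cert stand in for the real ones:
openssl req -x509 -newkey rsa:2048 -nodes -keyout aaa.ccc.key \
  -out aaa.ccc.crt -subj "/CN=aaa.ccc" -days 365
openssl pkcs12 -export -in aaa.ccc.crt -inkey aaa.ccc.key \
  -name aaa.ccc -out aaa.ccc.p12 -passout pass:changeit
# Verify the embedded friendlyName before importing:
openssl pkcs12 -in aaa.ccc.p12 -passin pass:changeit -info -nokeys -nodes \
  | grep friendlyName
```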

Self-signed certificate for BigBlueButton

I have a local server without a domain or public IP. I want to set up a self-signed SSL certificate for BigBlueButton. How can I do that on my local server?
Without a host and domain name, a self-signed certificate is the only option, which means it will not be a valid SSL certificate. I don't know BigBlueButton well, but its documentation doesn't recommend this setup for production environments, and not every browser will accept it either.
However, if you want to give it a try, you can generate a self-signed SSL cert on Linux using this command:
sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout selfsigned.key -out selfsigned.crt
This command creates both a key file and a certificate. You will be asked a few questions about the server so that the information can be embedded correctly in the certificate.
And then you can try to adapt the instructions here.
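A non-interactive variant of the same command, plus a quick sanity check of the result (the subject name is a placeholder; sudo is only needed if you write to a protected directory):

```shell
# Generate key + self-signed cert without prompts (-subj skips the questions):
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -keyout selfsigned.key -out selfsigned.crt -subj "/CN=bbb.local"
# Confirm the subject and validity dates:
openssl x509 -in selfsigned.crt -noout -subject -dates
```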
I was setting up a BBB environment recently.
A self-signed certificate is no good. To get it working I had to:
use a real server setup (with Let's Encrypt) and a real domain to get real certificates,
copy the certificates to my local development setup (and update the nginx config, of course),
and set up /etc/hosts locally.
Use a real SSL certificate. I had to:
Install BBB, using the IP address instead of a hostname. See
https://docs.bigbluebutton.org/2.2/install.html#configure-nginx-to-use-https
Example:
wget -qO- https://ubuntu.bigbluebutton.org/bbb-install.sh | bash -s -- -v bionic-230 -s 10.211.55.9 -e me@example.com -a -w
Configure nginx to use HTTPS for your real domain (the order of certificates is very important). See
https://docs.bigbluebutton.org/2.2/install.html#configure-nginx-to-use-https
Add your IP and domain to the hosts file. Example:
10.211.55.9 example.com
Use this command to change the domain:
bbb-conf --setip example.com
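Regarding the certificate order noted above: in the file nginx serves, the server certificate must come first, followed by any intermediates. A sketch with stand-in files (real file names will differ):

```shell
# Demo stand-ins for the real certificate files:
printf -- '-----BEGIN CERTIFICATE----- (server)\n' > example.com.crt
printf -- '-----BEGIN CERTIFICATE----- (intermediate)\n' > intermediate.crt
# Server certificate first, then intermediate(s); point nginx's
# ssl_certificate directive at the combined file:
cat example.com.crt intermediate.crt > fullchain.pem
```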

Get VS Code Python extension to connect to Jupyter running on remote AWS EMR master node

I have a working Jupyter server running on an EMR master node where I can run python and pyspark code with no issue. When trying to get the VS Code Python extension to connect to the very same Jupyter server, I get the following error:
Failed to connect to remote Jupyter notebook.
Check that the Jupyter Server URI setting has a valid running server specified.
http://***.***.***.***:8888/lab
Error: Invalid response: 405 Method Not Allowed
I created my own self-signed certificate on the EMR cluster by following these instructions from IBM. Then I added the certificate to Chrome, following Stack Overflow instructions another user linked to on GitHub.
From the bash terminal of the EMR master node:
# create key and cert
openssl req -newkey rsa:2048 -nodes -keyout key.pem -x509 -days 365 -out certificate.pem
# combine key and cert
openssl pkcs12 -inkey key.pem -in certificate.pem -export -out certificate.p12
I downloaded certificate.p12 to my local computer, then added it to Chrome (chrome://settings/privacy > Manage certificates > Import > select certificate.p12) and restarted VS Code.
I still get the same error.
Should I create the key.pem and certificate.pem on my local machine and then combine them into a certificate?
Do I need to use the original .pem key issued when creating the EMR cluster?
Newer versions of Jupyter launch what appears to be a terminal-based browser, something like lynx.
No matter which terminal shell I choose, the output is extremely chaotic after I launch Jupyter: the 'document' the terminal browser is viewing is intermixed with the output of the Jupyter server.
Through all of that noise, I can use some combination of the arrow keys and Enter to navigate to a point where the following is displayed somewhere in the terminal, intermixed with the Jupyter output (usually highlighted, but it depends on the terminal program):
cookie: username-***-***-***-***-****=2|1:0|10:***********|27:username-***-***-***-***-****|44:***********************************k1ZmE=|****************************1bef31e Allow? (Y/N/Always/neVer)
I type A and press enter.
Sometimes, but not always, I will see the following in the terminal for a short time:
Data transfer complete
Then I can press q to exit whatever terminal browser Jupyter launched and see the normal Jupyter server output. I copy the full URL of the Jupyter server and paste it into the VS Code Python extension setting python.dataScience.jupyterServerURI.
Everything works as expected after that.
No certificates or keys needed.

Copy files over SSH failed "Error: Cannot parse privateKey: Unsupported key format."

I'm trying to copy files over SSH. I'm using the same SSH service connection, and it works fine with other SSH tasks, but copying files runs into trouble. Here's what it looks like when I monitor user logins:
sshd[32240]: Accepted publickey for azurePPL1 from 13.69.175.211 port 1984 ssh2: ECDSA SHA256:0...
This seems to be fine, but apparently it's not.
Here's the error Azure Pipelines is throwing:
Error: Failed to connect to remote machine. Verify the SSH service connection details. Error: Cannot parse privateKey: Unsupported key format.
I would have suspected my SSH service connection configuration, but since the other SSH tasks work, I'm not sure what the problem could be.
Any help is appreciated.
Using the same SSH Service Connection and it's just fine with other
SSH tasks but copying files seems to run into trouble
Since everything works for other SSH tasks using the same SSH service connection and only Copy Files over SSH fails, there is no error in your SSH key pair or connection. The issue lies in the key parser used by the Copy Files over SSH task.
See the task's open-source code on GitHub: the run function in CopyFileOverSSH.ts and the definition of the SshHelper class in sshhelper.ts. The Copy Files over SSH task uses the ssh2 npm package for the SSH connection and verification, and the error message you are seeing comes from there; the copy file task itself does not do any key parsing.
About the key parsing, see this source function: keyParser.js. At line 1447 you will see the error message you received in the Azure DevOps task.
As far as I know, task v0.148 uses ssh2 library v0.8, while the ssh2 library has since been updated to v0.8.5.
So, to solve this issue, regenerate the key pair with the command ssh-keygen -t rsa -m PEM, which forces ssh-keygen to export in PEM format. That key will work with the copy file task.
It's now clear that the Azure task is using an old version of ssh2 in which Ed25519 keys are not supported, which causes this issue, so I'll just have to use RSA for now.
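The fix above can be sketched as follows (the file name and empty passphrase are placeholders for this demonstration; use your real paths and a passphrase in practice):

```shell
# Generate a fresh RSA key pair in PEM format, as the answer suggests;
# the first line of the private key should read "BEGIN RSA PRIVATE KEY":
ssh-keygen -t rsa -b 2048 -m PEM -f ./id_rsa_pipeline -N "" -q
head -n 1 ./id_rsa_pipeline
# To convert an existing RSA key in place instead of regenerating:
# ssh-keygen -p -m PEM -f ~/.ssh/id_rsa
```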

Get entire certificate using Perl's Net::SSLeay

I want to use the Perl library Net::SSLeay to download a server's SSL certificate but am having trouble figuring out how to do this. I want the entire certificate, not just the common name and a few of the fields.
For example, I would give the script the argument google.com and it would connect to https://google.com and get the entire certificate string for the certificate with CN "*.google.com".
This script needs to run on a Debian Wheezy server, so it must use version 1.48 of Net::SSLeay.
I believe this might be what you want:
use Net::SSLeay qw(sslcat);

my ($reply, $err, $cert) = sslcat('www.google.com', 443, '/');
my $pem = Net::SSLeay::PEM_get_string_X509($cert);
This gives you the PEM-encoded SSL certificate in $pem. You can decode the base64 payload with MIME::Base64 if you need the raw DER bytes.