I use the command openssl dgst -sha256 -binary _your_file_path_ | openssl enc -base64 in the terminal to get an output for a jar file that matches the hash AWS Lambda uses.
I want to program that in Java, but I am having trouble understanding exactly what is going on in that line, so that I can reproduce each step in my code. There is obviously more going on than just SHA256 hashing, because when I do only that the output does not match.
Could someone help explain the steps that line is completing in a simple way for me?
You need to break the command down to understand what is going on.
The first part of the command:
openssl dgst -sha256 -binary <file> gives you a SHA256 binary checksum for the file.
The second part of the command:
openssl enc -base64 encodes the SHA256 binary checksum to Base64.
So to replicate this in Java, you just need to carry out those same steps (see the sketch after this list):
Calculate a SHA256 binary checksum.
Base64 encode the SHA256 binary checksum.
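Here is a minimal Java sketch of those two steps, using java.security.MessageDigest and java.util.Base64 (the file path is just a placeholder):

import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;
import java.util.Base64;

public class JarChecksum {
    public static void main(String[] args) throws Exception {
        // Read the file bytes (replace the path with your jar file)
        byte[] fileBytes = Files.readAllBytes(Paths.get("your-file.jar"));

        // Step 1: SHA256 binary checksum (raw digest bytes, not a hex string)
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(fileBytes);

        // Step 2: Base64 encode the raw digest
        System.out.println(Base64.getEncoder().encodeToString(digest));
    }
}

The important detail is that the raw digest bytes are fed to the Base64 encoder directly, without converting them to a hex string first.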
You haven't posted the command you used when you tried to take the SHA256 checksum on its own, but I'm guessing the reason you were getting a different hash is that, by default, the checksum is output in hexadecimal rather than binary.
See my example below; the two results are completely different.
# Hexadecimal
$ openssl dgst -sha256 data.csv
SHA256(data.csv)= 114811b0b8998cb9853a5379598021410feddf69bb2ee7b7145d052a7e9b5d45
# Binary (note the usage of the -binary flag)
$ openssl dgst -sha256 -binary data.csv
H:SyY!Ai.]*~]E
If you then Base64 encode the hexadecimal checksum above, and the binary one, you'll also get two completely different results, as you can see below.
# Hexadecimal
$ printf 114811b0b8998cb9853a5379598021410feddf69bb2ee7b7145d052a7e9b5d45 | openssl enc -base64
MTE0ODExYjBiODk5OGNiOTg1M2E1Mzc5NTk4MDIxNDEwZmVkZGY2OWJiMmVlN2I3
MTQ1ZDA1MmE3ZTliNWQ0NQ==
# Binary
$ printf 'H:SyY!Ai.]*~]E' | openssl enc -base64
SDpTeVkhQWkuXSp+XUU=
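To see the same pitfall from Java, here is a small sketch (my own illustration, reusing the data.csv file from the example above) that Base64-encodes both the raw digest and its hexadecimal string; only the first matches the openssl pipeline:

import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;
import java.util.Base64;

public class HexVsBinary {
    public static void main(String[] args) throws Exception {
        byte[] data = Files.readAllBytes(Paths.get("data.csv"));
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(data);

        // Base64 of the raw digest: matches openssl dgst -sha256 -binary | openssl enc -base64
        System.out.println(Base64.getEncoder().encodeToString(digest));

        // Base64 of the hexadecimal string: a longer, completely different value
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        System.out.println(Base64.getEncoder().encodeToString(hex.toString().getBytes()));
    }
}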
For those who want the TL;DR: to get the same result as the cat FILENAME.js | openssl dgst -sha256 -binary | openssl base64 -A command, you should do the following conversions:
1) your content -> SHA256 (this gives you a hexadecimal digest string, not plain text)
2) hexadecimal -> binary (decode the hex string into raw bytes)
3) binary -> Base64
I use the following command in PowerShell to sign a Base64-encoded string. It currently reads from a file; can I also have it take the input directly from a variable?
openssl dgst -sha256 -sign jwtRS256.key -binary $payload | openssl enc -base64 -
It works if I use the following:
openssl dgst -sha256 -sign jwtRS256.key -binary payload.b64 | openssl enc -base64 -A
Maybe it is very simple, or maybe what I am trying to achieve is not possible.
This line is part of some steps I am following to sign the concatenated header.payload for JWT generation using openssl.
The following command works for SHA1: csum -h SHA1 (FileName).txt > (FileName_chksum).txt. How can I create a similar file using the SHA256 algorithm on AIX?
You can use the openssl command from the openssl.base package; it has a dgst sub-command that will generate a SHA256 hash of the file:
openssl dgst -sha256 filename.txt > filename_sha256.txt
By default, it will print in the following format:
SHA256(filename.txt)= hash-string-here
The csum command prints in a slightly different format:
hash-string-here filename.txt
... so you may want to rearrange the output of openssl based on your specific needs for the filename_sha256.txt file.
If you only want the hashed string itself in the new file, you could use awk:
openssl dgst -sha256 filename.txt | awk '{print $2}' > filename_sha256.txt
I'm trying to calculate the hash over a public key on both iOS (Swift) and macOS (terminal & OpenSSL), but the two platforms export the key in slightly different formats.
My Swift code extracts the sequence containing both the modulus and exponent (according to Apple this is the PKCS#1 container).
let export = SecKeyCopyExternalRepresentation(publicKey, nil)! as Data
let hash = SHA256.hash(data: export)
// SHA256 digest: 57fc8238c609045b7c0b546f58d5f797ebec4e39eff481459edfb67bd850834d
print(hash)
Now when I do something similar in the terminal, I get a different output.
openssl rsa -pubin -outform DER | openssl dgst -sha256
# writing RSA key
# 0ee9c99ef4ca3316e90dde23925bc9a670fa309d6f4663bb5d42050b5089b086
The latter is caused by OpenSSL wrapping the key in a fuller ASN.1 structure.
SEQUENCE (ASN.1 container)
    SEQUENCE
        OID (RSA algorithm)
        NULL
    BITSTRING
        SEQUENCE (iOS container)
            INTEGER (Modulus)
            INTEGER (Exponent)
How can I use OpenSSL to export the key as only the inner sequence iOS expects, so the hash will be the same in both cases?
It turns out OpenSSL has an undocumented parameter -RSAPublicKey_out that outputs the same data that SecKeyCopyExternalRepresentation does.
openssl rsa -pubin -RSAPublicKey_out -outform DER | openssl dgst -sha256
This produces the same digest on both iOS and macOS.
I'm trying to hash a fairly small value using SHA1 for a university exercise.
I'm running OpenSSL 1.1.1 11 Sep 2018. Operating System is Ubuntu 18.04.1, running through Windows Subsystem for Linux 1.
Running any of the following:
echo "361448504617" | openssl dgst -SHA1
echo 361448504617 | openssl dgst -SHA1
openssl dgst -sha1 hash.txt
openssl SHA1 hash.txt
Returns:
(stdin)= f98a0e600cd960f6c414343748a8dabc5ae9ec0a
(stdin)= f98a0e600cd960f6c414343748a8dabc5ae9ec0a
SHA1(hash.txt)= f98a0e600cd960f6c414343748a8dabc5ae9ec0a
SHA1(hash.txt)= f98a0e600cd960f6c414343748a8dabc5ae9ec0a
If I go to an online SHA1 hash generator, such as https://passwordsgenerator.net/sha1-hash-generator/, it returns:
A599EBBA6735313C848118F6EDB63012163D7581
This is also the answer on the worksheet, and what the laboratory instructor's terminal returns.
Can anyone give me a hand in troubleshooting this?
Annnd, I figured it out.
OpenSSL was also hashing the newline character; this is easy to solve with the -n argument for echo, which suppresses the trailing newline.
echo -n 361448504617 | openssl SHA1
Also, when OpenSSL was reading from the file, I got the same problem because vim was saving the file with a trailing end-of-line character. I fixed it by running the following commands inside vim:
:set binary
:set noeol
:wq
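As a cross-check (my own sketch, not part of the original answer), the same newline difference can be reproduced in Java: hashing the string with and without the trailing newline gives the two digests shown above.

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class NewlineDemo {
    static String sha1Hex(String s) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-1").digest(s.getBytes(StandardCharsets.US_ASCII));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Without a newline (like echo -n): a599ebba6735313c848118f6edb63012163d7581
        System.out.println(sha1Hex("361448504617"));
        // With a trailing newline (like plain echo): f98a0e600cd960f6c414343748a8dabc5ae9ec0a
        System.out.println(sha1Hex("361448504617\n"));
    }
}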
Does anyone know how the dgst function of the OpenSSL library handles the input value? Does it treat the input as ASCII characters, or in some other character encoding?
I'm trying to input hexadecimal values but can't find how to do this:
$ echo -n "FFFF" | openssl dgst -sha256
The result is different from the one obtained by other means (e.g. Java's MessageDigest) when the hexadecimal number 0xFFFF is used as input.
dgst hashes the raw bytes it is given, so the command above hashes the four ASCII characters "FFFF"; to get the hash of the two bytes 0xFF 0xFF, try:
printf "\xFF\xFF" | openssl dgst -sha256
The result should be: ca2fd00fa001190744c15c317643ab092e7048ce086a243e2be9437c898de1bb
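For comparison, a minimal Java sketch (my own, using java.security.MessageDigest) that hashes the two raw bytes 0xFF 0xFF rather than the ASCII string "FFFF" should produce the same digest:

import java.security.MessageDigest;

public class RawBytesDigest {
    public static void main(String[] args) throws Exception {
        // The raw bytes 0xFF 0xFF, not the four ASCII characters "FFFF"
        byte[] input = {(byte) 0xFF, (byte) 0xFF};
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(input);

        // Print as lowercase hex to compare with the openssl output
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        System.out.println(sb); // ca2fd00fa001190744c15c317643ab092e7048ce086a243e2be9437c898de1bb
    }
}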