What is the format of this private key? - ecdsa

I got a PEM key like this:
-----BEGIN PRIVATE KEY-----
MIGHAgEAMBMGByqGSM49AgEGCCqGSM49AwEHBG0wawIBAQQgDZUgDvKixfLi8cK8
/TFLY97TDmQV3J2ygPpvuI8jSdihRANCAARRN3xgbPIR83dr27UuDaf2OJezpEJx
UC3v06+FD8MUNcRAboqt4akehaNNSh7MMZI+HdnsM4RXN2y8NePUQsPL
-----END PRIVATE KEY-----
I only know that this key is used for ECDSA signatures.
Now I want to know the real value of the key, because I need it for a calculation, and I believe there is some kind of header in front of the real value.
I tried to use python-ecdsa to parse the key, but it obviously didn't work.
How can I get the real value?

It looks like you have a PKCS#8 ASN.1 structure containing an ECDSA private key for the NIST P-256 curve, encoded as a PEM file.
I was able to get your private and public key and the curve information using the OpenSSL command-line utility:
$ openssl ec -in test.pem -text -noout
read EC key
Private-Key: (256 bit)
priv:
0d:95:20:0e:f2:a2:c5:f2:e2:f1:c2:bc:fd:31:4b:
63:de:d3:0e:64:15:dc:9d:b2:80:fa:6f:b8:8f:23:
49:d8
pub:
04:51:37:7c:60:6c:f2:11:f3:77:6b:db:b5:2e:0d:
a7:f6:38:97:b3:a4:42:71:50:2d:ef:d3:af:85:0f:
c3:14:35:c4:40:6e:8a:ad:e1:a9:1e:85:a3:4d:4a:
1e:cc:31:92:3e:1d:d9:ec:33:84:57:37:6c:bc:35:
e3:d4:42:c3:cb
ASN1 OID: prime256v1
NIST CURVE: P-256
You can also look at the ASN1 structure itself:
$ openssl asn1parse -in test.pem
0:d=0 hl=3 l= 135 cons: SEQUENCE
3:d=1 hl=2 l= 1 prim: INTEGER :00
6:d=1 hl=2 l= 19 cons: SEQUENCE
8:d=2 hl=2 l= 7 prim: OBJECT :id-ecPublicKey
17:d=2 hl=2 l= 8 prim: OBJECT :prime256v1
27:d=1 hl=2 l= 109 prim: OCTET STRING [HEX DUMP]:306B02010104200D95200EF2A2C5F2E2F1C2BCFD314B63DED30E6415DC9DB280FA6FB88F2349D8A1440342000451377C606CF211F3776BDBB52E0DA7F63897B3A44271502DEFD3AF850FC31435C4406E8AADE1A91E85A34D4A1ECC31923E1DD9EC338457376CBC35E3D442C3CB
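If you'd rather get the value in Python than via OpenSSL, here is a minimal sketch using the pyca/cryptography package (assuming the PEM above is saved as test.pem):
from cryptography.hazmat.primitives import serialization

# Load the PKCS#8 PEM (no passphrase)
with open("test.pem", "rb") as f:
    key = serialization.load_pem_private_key(f.read(), password=None)

numbers = key.private_numbers()
print(hex(numbers.private_value))        # the raw private scalar ("the real value")
print(hex(numbers.public_numbers.x))     # public point X coordinate
print(hex(numbers.public_numbers.y))     # public point Y coordinate
print(key.curve.name)                    # secp256r1, i.e. prime256v1 / NIST P-256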

Related

How to convert a 20-byte Ethereum wallet address to the hex format found in Metamask?

Using some of the code from Geth's source code (https://github.com/ethereum/go-ethereum), I'm trying to find a way to generate valid Ethereum wallets. For starters, I'm using the hex value of the secp256k1 prime minus 1, which is "FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364140". Plugging that into Metamask, I get an address of "0x80C0dbf239224071c59dD8970ab9d542E3414aB2". I would like to use Geth's functions to get the equivalent address.
So far I've put in the hex form of the private key "FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364140" and I get a 20-byte array: [164 121 62 11 173 85 89 159 11 68 30 45 4 221 247 191 191 44 73 181]
To be specific, I defined the hex value of the secp256k1 prime minus 1 and used the newKey() function from Geth's crypto package:
var r io.Reader
r = strings.NewReader("FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364140")
tmp, err := newKey(r)
if err != nil {
    fmt.Println(err)
}
fmt.Println(tmp.Address)
I have two questions. First, did I input the private key correctly as hex? Second, how do I convert this 20-byte array to a "normal" hex address?
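As for the second question, the "normal" form is just the 20 bytes hex-encoded with a 0x prefix (the mixed-case letters Metamask shows come from the EIP-55 checksum). A rough Python sketch of that formatting step, using the byte values above (whether those bytes are the right address depends on the first question):
addr_bytes = bytes([164, 121, 62, 11, 173, 85, 89, 159, 11, 68, 30, 45, 4, 221, 247, 191, 191, 44, 73, 181])
print("0x" + addr_bytes.hex())   # 0xa4793e0bad55599f0b441e2d04ddf7bfbf2c49b5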

Hex conversion of GUID removes Zeros?

I have a script that takes the hex value of a GUID and converts it to a GUID. However, it is removing zeros. For example, this is my output, together with the registry reads for the hex values.
$GUIDLocal1 = (Get-ItemProperty "$Reg\Common Api")."UniqueID"
$GUIDLocal2 = (Get-ItemProperty "$Reg\Test\Common Api")."UniqueID"
# This is not code below just info
$GUIDLocal1 is 54 171 225 63 61 204 14 79 168 61 49 246 193 140 121 152
$GUIDLocal2 is 54 171 225 63 61 204 14 79 168 61 49 246 193 140 121 152
ID in Database is 36ABE13F3DCC0E4FA83D31F6C18C7998
$guidinhex 36ABE13F3DCCE4FA83D31F6C18C7998
$guidinhex2 36ABE13F3DCCE4FA83D31F6C18C7998
# This is not code above just info
I am using this code for the conversion
$guidinHex = [string]::Empty
$guidinHex2 = [string]::Empty
$GUIDLocal1 | % { $guidInHEX += '{0:X}' -f [int]$_ }
$GUIDLocal2 | % { $guidInHEX2 += '{0:X}' -f [int]$_ }
ID is the GUID with all {, }, and - removed for ease of viewing.
$GUIDLocal1 and $GUIDLocal2 are the hex values in the registry.
I then use the code above to convert them ($GUIDLocal1 and $GUIDLocal2 become $guidinHex and $guidinHex2).
The conversion works, but if there is a zero it strips it out, as you can see above. On this machine the GUID actually matches the registry values, but my conversion is skewing the result. I just need to know why, and how to stop the conversion from removing the zero(s).
I thought adding [int] would help, but to no avail.
The -f (format) operator lets you format a numeric value as a hexadecimal string with leading zeros. The format specification is {0:Xn} or {0:xn}, where n is the number of digits desired in the string output (padded with zeros if needed). Uppercase X or lowercase x specifies whether you want the hex values A through F to be uppercase or lowercase. Examples:
"{0:X2}" -f 15 # outputs string 0F
"{0:X3}" -f 27 # outputs string 01B
"{0:x4}" -f 254 # outputs string 00fe
...and so forth. Documentation here:
https://learn.microsoft.com/en-us/dotnet/standard/base-types/standard-numeric-format-strings#XFormatString
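The same issue is easy to see outside PowerShell too; as a rough Python illustration using the byte values from the question, an unpadded hex format drops the leading zero of 0x0E, while a two-digit width keeps it:
raw = [54, 171, 225, 63, 61, 204, 14, 79, 168, 61, 49, 246, 193, 140, 121, 152]
print("".join("{:X}".format(b) for b in raw))    # 36ABE13F3DCCE4FA83D31F6C18C7998  (zero lost)
print("".join("{:02X}".format(b) for b in raw))  # 36ABE13F3DCC0E4FA83D31F6C18C7998 (matches the database ID)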

Safenet HSM doesn't respond to the message

I'm new to HSMs. I'm using a TCP connection to communicate with a SafeNet ProtectHost EFT HSM. As a start, I tried to call the 'HSM_STATUS' method by sending the following message.
full message (with header) :
0000 0001 0000 0001 0001 1001 1011 1111 0000 0000 0000 0001 0000 0001
this message can be broken down as follows (in reverse order):
message:
0000 0001 is the 1-byte command: '01' (the function code of the HSM_STATUS method is '01')
safenet header:
0000 0000 0000 0001 is the 2-byte length of the message: '1' (the length of the function call '0000 0001' is '1')
0001 1001 1011 1111 is the 2-byte sequence number (an arbitrary value in the request message which is returned with the response message and is not interpreted by the ProtectHost EFT).
0000 0001 is the 1-byte version number (binary 1, as in the manual)
0000 0001 is the 1-byte ASCII Start of Header character (hex 01)
But the HSM does not give any output for this message.
Could anyone please tell me what the reason for this might be? Am I doing something wrong in forming this message?
I also faced the same issue and resolved it. The root cause is that the HSM expects each field of the input to have exactly the right length.
That is, the SOH value needs to be exactly 1 byte long, whereas your input is 4 bytes long. So the following input to the HSM will give you the correct output:
String command = "01" + "01"   // SOH, version
        + "00" + "00"          // arbitrary sequence number
        + "00" + "01"          // length
        + "01";                // function call
Hope this helps :)
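Put differently, the frame the HSM expects is SOH, version, a 2-byte sequence number, a 2-byte length, and then the function code. A rough Python sketch of sending exactly those bytes over TCP (the host name and port below are placeholders, not values from the question):
import socket

# SOH, version, 2-byte sequence number, 2-byte length, function code (HSM_STATUS = 0x01)
frame = bytes([0x01, 0x01, 0x00, 0x00, 0x00, 0x01, 0x01])

with socket.create_connection(("hsm.example.local", 1234)) as s:  # placeholder host/port
    s.sendall(frame)
    print(s.recv(4096).hex())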

What is the maximum length of scrypt output?

I'd like to store an scrypt-hashed password in a database. What is the maximum length I can expect?
According to https://github.com/wg/scrypt the output format is $s0$params$salt$key where:
s0 denotes version 0 of the format, with 128-bit salt and 256-bit derived key.
params is a 32-bit hex integer containing log2(N) (16 bits), r (8 bits), and p (8 bits).
salt is the base64-encoded salt.
key is the base64-encoded derived key.
According to https://stackoverflow.com/a/13378842/14731, the length of a base64-encoded string is 4 * ceil(n / 3), where n denotes the number of bytes being encoded.
Let's break this down:
The dollar signs make up 4 characters.
The version number makes up 2 characters.
Each hex character represents 4 bits (log2(16) = 4), so the params field makes up (32 bits / 4 bits) = 8 characters.
The 128-bit salt is equivalent to 16 bytes. The base64-encoded format makes up (4 * ceil(16 / 3)) = 24 characters.
The 256-bit derived key is equivalent to 32 bytes. The base64-encoded format makes up (4 * ceil(32 / 3)) = 44 characters.
Putting that all together, we get: 4 + 2 + 8 + 24 + 44 = 82 characters.
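The arithmetic above is easy to sanity-check in Python with dummy salt and key bytes of the right sizes:
import base64, math

def b64_len(n_bytes):
    # 4 * ceil(n / 3), the base64 length formula used above
    return 4 * math.ceil(n_bytes / 3)

assert b64_len(16) == len(base64.b64encode(bytes(16))) == 24   # 128-bit salt
assert b64_len(32) == len(base64.b64encode(bytes(32))) == 44   # 256-bit derived key
print(4 + 2 + 8 + 24 + 44)                                     # 82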
In Colin Percival's own implementation, the tarsnap scrypt header is 96 bytes. This comprises:
6 bytes 'scrypt'
10 bytes N, r, p parameters
32 bytes salt
16 bytes SHA256 checksum of bytes 0-47
32 bytes HMAC hash of bytes 0-63 (using scrypt hash as key)
This is also the format used by node-scrypt. There is an explanation of the rationale behind the checksum and the HMAC hash on stackexchange.
As a base64-encoded string, this makes 128 characters.
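That figure is easy to verify in Python as well:
import base64
print(len(base64.b64encode(bytes(96))))   # 128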

Agilent VEE or MATLAB: 4-byte ASCII to floating point

I currently have an instrument that sends 4 bytes representing a 32-bit floating-point number in little-endian format. The data looks like:
Gz*=
<«�=
N×e=
or this
à|ƒ=
Is there a conversion for this in MATLAB or Agilent VEE, or a way to do it manually?
To convert an array of char to single, you can use typecast:
c = 'Gz*=';
f = typecast(c, 'single')
f = 0.041621
Just implicitly!
>> data = ['Gz*=';'<«�=';'N×e=']
data =
Gz*=
<«�=
N×e=
>> data+0
ans =
71 122 42 61
60 171 65533 61
78 215 101 61
data+0 forces it to be interpreted as numbers, which is fine.
If it's interpreted backwards (I'm not sure whether MATLAB is big- or little-endian), just use the swapbytes function.
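For comparison, the same conversion in Python with the struct module, which unpacks the 4 bytes as a little-endian 32-bit float (the '<' handles the byte order, so no swapping is needed):
import struct

raw = b'Gz*='                        # the bytes 0x47 0x7A 0x2A 0x3D from the question
value = struct.unpack('<f', raw)[0]  # little-endian single-precision float
print(value)                         # ~0.041621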