Kraken API MATLAB client invalid signature error

I'm trying to make authenticated calls to Kraken's private endpoints, but without success: I keep getting the error EAPI:Invalid signature.
Does anybody know what's wrong?
Here's the code:
function [response,status]=kraken_authenticated(uri,postdata)
% test uri='0/private/AddOrder'
% test postdata='&pair=XBTEUR&type=buy&ordertype=limit&price=345.214&volume=0.65412&leverage=1.5&oflags=post'
url=['https://api.kraken.com/',uri];
% nonce
nonce = num2str(floor((now-datenum('1970', 'yyyy'))*8640000000));
[key,secret]=key_secret('kraken');
% 1st hash
Opt.Method = 'SHA-256';
Opt.Input = 'ascii';
sha256string = DataHash(['nonce=',nonce,postdata],Opt);
% 2nd hash
sign = crypto([uri,sha256string], secret, 'HmacSHA512');
header_1=http_createHeader('API-Key',key);
header_2=http_createHeader('API-Sign',char(sign));
header=[header_1 header_2];
[response,status] = urlread2(url,'POST',['nonce=',nonce,postdata],header);
end
The crypto function is in another file:
function signStr = crypto(str, key, algorithm)
import java.net.*;
import javax.crypto.*;
import javax.crypto.spec.*;
import org.apache.commons.codec.binary.*
keyStr = java.lang.String(key);
key = SecretKeySpec(keyStr.getBytes('UTF-8'), algorithm);
mac = Mac.getInstance(algorithm);
mac.init(key);
toSignStr = java.lang.String(str);
signStr = java.lang.String(Hex.encodeHex( mac.doFinal( toSignStr.getBytes('UTF-8'))));
end
I've also tried
sign = crypto([uri,sha256string], base64decode(secret), 'HmacSHA512');
but without success.
This is the guide for the authenticated call HTTPS headers:
API-Key = API key
API-Sign = Message signature using HMAC-SHA512 of (URI path + SHA256(nonce + POST data)) and base64 decoded secret API key
This is the guide for the authenticated call POST data:
nonce = always increasing unsigned 64 bit integer
otp = two-factor password (if two-factor enabled, otherwise not required)
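Spelled out, the documented scheme means the bytes being signed are the URI path followed by SHA-256(nonce + POST data). A minimal sketch of that assembly in Java (placeholder values only; the MATLAB code above does the equivalent with DataHash):

import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class KrakenMessage {
    public static void main(String[] args) throws Exception {
        // Placeholder values for illustration only.
        String uriPath  = "/0/private/Balance";              // path part of the request URL
        String nonce    = String.valueOf(System.currentTimeMillis());
        String postdata = "nonce=" + nonce;                   // the exact POST body that is sent

        // Inner hash: SHA-256 over nonce + POST data.
        byte[] inner = MessageDigest.getInstance("SHA-256")
                .digest((nonce + postdata).getBytes(StandardCharsets.UTF_8));

        // Bytes to sign: URI path bytes followed by the SHA-256 digest bytes.
        ByteArrayOutputStream toSign = new ByteArrayOutputStream();
        toSign.write(uriPath.getBytes(StandardCharsets.UTF_8));
        toSign.write(inner);

        // toSign.toByteArray() is what gets fed into HMAC-SHA512 with the
        // base64-decoded secret (see the answer below).
        System.out.println("bytes to sign: " + toSign.size());
    }
}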
I've tried passing only the "nonce" parameter as the POST data, and also passing all parameters in "postdata", but without success.
Thanks for the help.

The problem is in the crypto function, here:
keyStr = java.lang.String(key);
key = SecretKeySpec(keyStr.getBytes('UTF-8'), algorithm);
The base64-decoded private key from Kraken is arbitrary binary data, not necessarily valid UTF-8, so you cannot push it through a Java String and getBytes('UTF-8') before handing it to SecretKeySpec; that round trip corrupts the key bytes. Pass the decoded secret to SecretKeySpec as a raw byte array instead.
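In Java terms (which is what the MATLAB crypto helper ultimately calls), the fix looks roughly like this; a minimal sketch with placeholder inputs, using the JDK's Base64 class:

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class HmacKeyFix {
    // Decode the base64 API secret to raw bytes and hand those bytes straight
    // to SecretKeySpec; no String/UTF-8 round trip in between.
    static byte[] hmacSha512(String base64Secret, byte[] toSign) throws Exception {
        byte[] keyBytes = Base64.getDecoder().decode(base64Secret);
        Mac mac = Mac.getInstance("HmacSHA512");
        mac.init(new SecretKeySpec(keyBytes, "HmacSHA512"));
        return mac.doFinal(toSign);
    }

    public static void main(String[] args) throws Exception {
        // "c2VjcmV0" and "example" are placeholders, not real credentials or message bytes.
        byte[] sig = hmacSha512("c2VjcmV0", "example".getBytes(StandardCharsets.UTF_8));
        // Kraken's reference clients send API-Sign as the base64-encoded HMAC output.
        System.out.println(Base64.getEncoder().encodeToString(sig));
    }
}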
Similar issues:
https://code.google.com/p/google-apps-script-issues/issues/detail?id=5113
https://code.google.com/p/google-apps-script-issues/issues/detail?id=3121
A solution for JavaScript: github.com/Caligatio/jsSHA

Related

HSM RSA OAEP Encryption with Asymmetric hashes fails

I am using the PKCS11Interop library to encrypt and decrypt data with the parameters below.
CK_RSA_PKCS_OAEP_PARAMS p = new CK_RSA_PKCS_OAEP_PARAMS();
p.HashAlg = (uint)CKM.CKM_SHA256;
p.Mgf = (uint)CKG.CKG_MGF1_SHA1;
p.Source = (uint)CKZ.CKZ_DATA_SPECIFIED;
p.SourceData = IntPtr.Zero;
p.SourceDataLen = 0;
CK_MECHANISM mech = CkmUtils.CreateMechanism(CKM.CKM_RSA_PKCS_OAEP, p);
The error I get is CKR_MECHANISM_PARAM_INVALID when attempting to encrypt or decrypt.
But when I use CKG_MGF1_SHA256 for the MGF, both encryption and decryption work.
Am I missing something, or are mismatched hashes simply not supported by HSM boxes?
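For what it's worth, support for OAEP parameter sets where the main digest and the MGF1 digest differ varies between tokens, and many only accept the matching combination. A minimal illustrative sketch in Java's JCE (not Pkcs11Interop, and not tied to any particular HSM) using SHA-256 for both:

import javax.crypto.Cipher;
import javax.crypto.spec.OAEPParameterSpec;
import javax.crypto.spec.PSource;
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.spec.MGF1ParameterSpec;

public class OaepMatchingDigests {
    public static void main(String[] args) throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair kp = kpg.generateKeyPair();

        // SHA-256 for both the OAEP digest and the MGF1 digest, the analogue of
        // CKM_SHA256 + CKG_MGF1_SHA256 in PKCS#11 terms.
        OAEPParameterSpec oaep = new OAEPParameterSpec(
                "SHA-256", "MGF1", MGF1ParameterSpec.SHA256, PSource.PSpecified.DEFAULT);

        Cipher enc = Cipher.getInstance("RSA/ECB/OAEPPadding");
        enc.init(Cipher.ENCRYPT_MODE, kp.getPublic(), oaep);
        byte[] ct = enc.doFinal("hello".getBytes(StandardCharsets.UTF_8));

        Cipher dec = Cipher.getInstance("RSA/ECB/OAEPPadding");
        dec.init(Cipher.DECRYPT_MODE, kp.getPrivate(), oaep);
        System.out.println(new String(dec.doFinal(ct), StandardCharsets.UTF_8));
    }
}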

crypto-js decrypt from Hex

I am trying to write a JavaScript function with the crypto-js package to decrypt AES (CBC mode).
I put the data into an online decoding tool and it decrypted correctly, so I am sure the following data is correct, but I just can't reproduce the result in JavaScript.
Here is the online decrypting (so I'm sure the data, key, iv are correct): http://aes.online-domain-tools.com/link/deb718giF4dUxZylq/
My code with crypto-js#3.1.8:
// data, key, iv are all Hex
var data = "bd6e0a73147a2c224c7c20346d0e9a138b744a5d94463cdff6dbb965055f974f097104399d2c40af2f0ac667f3857e70e9703bf27f6411f7e97c3449e8921f3c98e665914689b4b77b5bbcc8d8bc319e680eb89eedb1c25178923ae57fb3fb476755d6009f1aed88fffcb9b2ed3b4cf6f23d9c4c56da1dde6619e45a8d6f06412853ae1941cf554b6824112a913750a7485ed67fb38b950411310410de998f2597c2fcc81a305b0df369f54b75426176";
var key = 'befce5c6da98837ea421811c832817ae';
var iv = "a884a7edd5d06a48d6da9ad11fd36a75";
// transfer Hex to WordArray
var _data = CryptoJS.enc.Hex.parse(data);
var base64_data = _data.toString(CryptoJS.enc.Base64);
var _key = CryptoJS.enc.Hex.parse(key);
var _iv = CryptoJS.enc.Hex.parse(iv);
decrypted = CryptoJS.AES.decrypt(
    base64_data,   // pass Base64 string
    _key,          // pass WordArray
    {
        iv: _iv,   // pass WordArray
        mode: CryptoJS.mode.CBC,
        padding: CryptoJS.pad.ZeroPadding
    }
);
console.log(decrypted.toString(CryptoJS.enc.Utf8));
// output fails to match UTF-8
It outputs Error: Malformed UTF-8 data.
The decrypted string should be (the link itself is not important):
https://emogo-media-testing.s3.amazonaws.com/1503342403787_blob?AWSAccessKeyId=AKIAI5MUDCK6XYWKGAKA&Expires=1534882403&Signature=t1PFesQuOpOlIMKoOqje%2Bs7I%2Fhg
Any hint is appreciated. Thank you!
I know it has been a while since you asked the question, but I will respond so the next person does not stumble upon an unanswered question.
Your code works fine; it decrypts AES-CBC encrypted data correctly. The problem lies with your input data.
Your encrypted data string should have looked like this:
80b7c4881334675693ef9c95259e70b24d0736e98f8424233d5e37f353261c2a589287bc3f675449f7d8ed4e2289a4c06b22d7f83efc09cfb72abe3a76e193a8efbdc968232d29b9b58135bfa24d51e60e34791f652a0aa806d0be7734dd61a930a30c99f31f08740cdb182af07b19d5b4274deb958d984b3ccb9d6e2be0cfa3a026dd6b734dbf1dd3635bc7bcceface9c55dfb9455ca834a6dbd1aa0f3c23923ce6aeba59acbc80d681fee73487b9004496540830d44102b94e35eac291c4e3b8c9ac168ae799e46cde45ee652415ae69992d0f7527045fd42b82e9e6946cfb2dbcc3b93f19ff0e5035ab12250f7a917975b2f7c069cbd8a0ba0d94b318634a
for this example to work correctly.
The key you used is not a hex string but a text string. Your online example is no longer valid, but I figured it out after a couple of tries.
If you change the following line:
var _key = CryptoJS.enc.Hex.parse(key);
to:
var _key = CryptoJS.enc.Utf8.parse(key);
then your code example will work fine with your original data string.
When you decrypted the text on http://aes.online-domain-tools.com/, you probably had the plaintext textbox selected instead of hex for your key input.
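The difference is easy to see by comparing the key bytes each interpretation produces. A minimal Java sketch (purely illustrative, using the key string from the question):

import java.nio.charset.StandardCharsets;

public class KeyInterpretation {
    // Decode a hex string to raw bytes.
    static byte[] hexToBytes(String s) {
        byte[] out = new byte[s.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(s.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    public static void main(String[] args) {
        String key = "befce5c6da98837ea421811c832817ae";        // key string from the question

        byte[] asHex  = hexToBytes(key);                          // 16 bytes of key material (AES-128)
        byte[] asText = key.getBytes(StandardCharsets.UTF_8);     // 32 bytes of key material (AES-256)

        System.out.println("hex-decoded length: " + asHex.length);   // 16
        System.out.println("UTF-8 bytes length: " + asText.length);  // 32
        // Two entirely different keys, which is why the Hex.parse variant cannot
        // decrypt data that was encrypted with the key treated as plain text.
    }
}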

Kraken API "Invalid signature" error using Matlab

I am trying to implement the API of the bitcoin exchange Kraken in MATLAB. Unfortunately, I got stuck trying to perform authentication in order to retrieve private user data.
In particular, I was playing with the following implementation: Kraken API MATLAB client invalid signature error. The documentation of Kraken's API is here: https://www.kraken.com/help/api
When connecting to the private user data endpoints, I continuously run into the error: {"error":["EAPI:Invalid signature"]}. Could someone have a quick look at the implementation below and check for flaws in the code? Or has someone successfully implemented the Kraken API in MATLAB?
Many thanks!
% Private
uri = '0/private/Balance';
postdata='';
[response,status] = kraken_authenticated(uri,postdata)
% test uri='0/private/AddOrder'
% test postdata='&pair=XBTEUR&type=buy&ordertype=limit&price=345.214&volume=0.65412&leverage=1.5&oflags=post'
function [response,status]=kraken_authenticated(uri,postdata)
% Generate URL
url=['https://api.kraken.com/',uri];
% nonce
nonce = num2str(floor((now-datenum('1970', 'yyyy'))*8640000000));
key = ' '
secret = ' '
% 1st hash
Opt.Method = 'SHA-256';
Opt.Input = 'ascii';
sha256string = DataHash(['nonce=',nonce,postdata],Opt);
% 2nd hash
%sign = crypto([uri,sha256string], secret, 'HmacSHA512');
sign = crypto([uri,sha256string], base64decode(secret), 'HmacSHA512')
%sign = HMAC([uri,sha256string], base64decode(secret), 'SHA-512');
%header_0=http_createHeader('Content-Type','application/x-www-form-urlencoded');
header_1=http_createHeader('API-Key',key);
header_2=http_createHeader('API-Sign',char(sign));
header=[header_1 header_2];
[response,status] = urlread2(url,'POST',['nonce=',nonce,postdata],header);
end
function signStr = crypto(str, key, algorithm)
import java.net.*;
import javax.crypto.*;
import javax.crypto.spec.*;
import org.apache.commons.codec.binary.*
keyStr = java.lang.String(key);
key = SecretKeySpec(keyStr.getBytes('UTF-8'), algorithm);
%key = SecretKeySpec(keyStr.getBytes(), algorithm)
mac = Mac.getInstance(algorithm);
mac.init(key);
toSignStr = java.lang.String(str);
signStr = java.lang.String(Hex.encodeHex( mac.doFinal( toSignStr.getBytes('UTF-8'))))
%signStr = java.lang.String(Hex.encodeHex( mac.doFinal( toSignStr.getBytes())))
end
function header = http_createHeader(name,value)
header = struct('name',name,'value',value);
end
I'm actually working on my own implementation in C++, and I came here for a different error I'm getting, but here is a possible cause I noticed in your code:
The first SHA-256 hash should be over the concatenation of the nonce and the POST data. Since the POST data also contains the nonce, if the nonce value is 123456789, then you should do (pseudo-code):
sha256("123456789nonce=123456789")
or in MATLAB (appending any further postdata as well):
sha256string = DataHash([nonce,'nonce=',nonce,postdata],Opt);
I hope it helps.

Create WS security headers for REST web service in SoapUI Pro

We are developing a REST web service with WS-Security headers passed as header parameters in the REST request.
I am testing this in SoapUI Pro and want to create a Groovy script to generate these values and then use them in the REST request.
These parameters include the encoded nonce, the created dateTime, and the password digest, which is created from the nonce, the hashed password and the created date and time; i.e. the code should produce the same values as the Outgoing WS-Security configuration in SoapUI Pro.
I have created a Groovy test script in SoapUI Pro (below). However, when I supply the generated values to the headers, I get authorisation errors.
I am able to hash the password correctly and get the same result as my Python script.
The Groovy code for this is:
MessageDigest cript = MessageDigest.getInstance("SHA-1");
cript.reset();
cript.update(userPass.getBytes("UTF-8"));
hashedpw = new String(cript.digest());
This correctly hashes the text 'Password2451!' to í¦è~µ”t5Sl•Vž³t;$.
The next step is to create a password digest from the nonce, the created timestamp and the hashed password. I have the following code for this:
MessageDigest cript2 = MessageDigest.getInstance("SHA-1");
cript2.reset();
cript2.update((nonce+created+hashedpw).getBytes("UTF-8"));
PasswordDigest = new String(cript2.digest());
PasswordDigest = PasswordDigest.getBytes("UTF-8").encodeBase64()
This converts '69999998992017-03-06T16:19:28Zí¦è~µ”t5Sl•Vž³t;$' into w6YA4oCUw6nDicucw6RqxZMIbcKze+KAmsOvBA4oYu+/vQ==.
However the correct value should be 01hCcFQRjDKMT6daqncqhN2Vd2Y=.
The following Python code correctly achieves this conversion:
hashedpassword = sha.new(password).digest()
digest = sha.new(nonce + CREATIONDATE + hashedpassword).digest()
Can anyone tell me where I am going wrong with the groovy code?
Thanks.
Changing my answer slightly: in the original I was converting the password digest to a string value, which caused the request to fail validation some of the time, because certain bytes did not get converted into the correct string value.
import java.security.MessageDigest;
int a = 9
nonce = ""
for(i = 0; i < 10; i++)
{
random = new Random()
randomInteger= random.nextInt(a)
nonce = nonce + randomInteger
}
Byte[] nonceBytes = nonce.getBytes()
def XRMGDateTime = new Date().format("yyyy-MM-dd'T'HH:mm:ss", TimeZone.getTimeZone( 'UTC' ));
Byte[] creationBytes = XRMGDateTime.getBytes()
def password = testRunner.testCase.testSuite.getPropertyValue( "XRMGPassword" )
EncodedNonce = nonce.getBytes("UTF-8").encodeBase64()
MessageDigest cript = MessageDigest.getInstance("SHA-1");
cript.reset();
cript.update(password.getBytes());
hashedpw = cript.digest();
MessageDigest cript2 = MessageDigest.getInstance("SHA-1");
cript2.update(nonce.getBytes());
cript2.update(XRMGDateTime.getBytes());
cript2.update(hashedpw);
PasswordDigest = cript2.digest()
EncodedPasswordDigest = PasswordDigest.encodeBase64();
def StringPasswordDigest = EncodedPasswordDigest.toString()
def encodedNonceString = EncodedNonce.toString()
testRunner.testCase.setPropertyValue( "passwordDigest", StringPasswordDigest )
testRunner.testCase.setPropertyValue( "XRMGDateTime", XRMGDateTime )
testRunner.testCase.setPropertyValue( "XRMGNonce", encodedNonceString )
testRunner.testCase.setPropertyValue( "Nonce", nonce )

KRL: Signing requests with HMAC_SHA1

I made a test suite for the math:hmac_* KRL functions. I compared the KRL results with Python results, and KRL gives me different results.
Code: https://gist.github.com/980788 Results: http://ktest.heroku.com/a421x68
How can I get valid signatures from KRL? I'm assuming that the Python results are correct.
UPDATE: It works fine unless you want newline characters in the message. How do I sign a string that includes newline characters?
I suspect that your Python SHA library returns a different encoding than is expected by the b64encode library. My library does both the SHA and the base64 in one call, so I had to do some extra work to check the results.
As you show in your KRL, the correct syntax is:
math:hmac_sha1_base64(raw_string,key);
math:hmac_sha256_base64(raw_string,key);
These use the same libraries that I use for the Amazon module, which is testing fine right now.
To test those routines specifically, I used the test vectors from the RFC (sha1, sha256). We don't support hexadecimal natively, so I wasn't able to use all of the test vectors, but I was able to use a simple one:
HMAC SHA1
test_case = 2
key = "Jefe"
key_len = 4
data = "what do ya want for nothing?"
data_len = 28
digest = 0xeffcdf6ae5eb2fa2d27416d5f184df9c259a7c79
HMAC SHA256
Key = 4a656665 ("Jefe")
Data = 7768617420646f2079612077616e7420666f72206e6f7468696e673f ("what do ya want for nothing?")
HMAC-SHA-256 = 5bdcc146bf60754e6a042426089575c75a003f089d2739839dec58b964ec3843
Here is my code:
global {
raw_string = "what do ya want for nothing?";
mkey = "Jefe";
}
rule first_rule {
select when pageview ".*" setting ()
pre {
hmac_sha1 = math:hmac_sha1_hex(raw_string,mkey);
hmac_sha1_64 = math:hmac_sha1_base64(raw_string,mkey);
bhs256c = math:hmac_sha256_hex(raw_string,mkey);
bhs256c64 = math:hmac_sha256_base64(raw_string,mkey);
}
{
notify("HMAC sha1", "#{hmac_sha1}") with sticky = true;
notify("hmac sha1 base 64", "#{hmac_sha1_64}") with sticky = true;
notify("hmac sha256", "#{bhs256c}") with sticky = true;
notify("hmac sha256 base 64", "#{bhs256c64}") with sticky = true;
}
}
var hmac_sha1 = 'effcdf6ae5eb2fa2d27416d5f184df9c259a7c79';
var hmac_sha1_64 = '7/zfauXrL6LSdBbV8YTfnCWafHk';
var bhs256c = '5bdcc146bf60754e6a042426089575c75a003f089d2739839dec58b964ec3843';
var bhs256c64 = 'W9zBRr9gdU5qBCQmCJV1x1oAPwidJzmDnexYuWTsOEM';
The HEX results for SHA1 and SHA256 match the test vectors of the simple case.
I tested the base64 results by decoding the hex results and putting them through an online base64 encoder.
My results were:
7/zfauXrL6LSdBbV8YTfnCWafHk=
W9zBRr9gdU5qBCQmCJV1x1oAPwidJzmDnexYuWTsOEM=
These match my calculations for HMAC-SHA1 base64 and HMAC-SHA256 base64, respectively.
If you are still having problems, could you provide the base64 and SHA results from Python separately so I can identify the disconnect?
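If it helps with the cross-check, the same two test vectors can also be reproduced with a few lines of Java using the standard javax.crypto.Mac API (a minimal sketch; the expected values are the ones quoted above):

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class HmacVectors {
    static String hex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        byte[] key  = "Jefe".getBytes(StandardCharsets.UTF_8);
        byte[] data = "what do ya want for nothing?".getBytes(StandardCharsets.UTF_8);

        for (String alg : new String[] {"HmacSHA1", "HmacSHA256"}) {
            Mac mac = Mac.getInstance(alg);
            mac.init(new SecretKeySpec(key, alg));
            byte[] digest = mac.doFinal(data);
            // Expected (RFC test vectors quoted above):
            //   HmacSHA1   hex: effcdf6ae5eb2fa2d27416d5f184df9c259a7c79
            //   HmacSHA256 hex: 5bdcc146bf60754e6a042426089575c75a003f089d2739839dec58b964ec3843
            System.out.println(alg + " hex:    " + hex(digest));
            System.out.println(alg + " base64: " + Base64.getEncoder().encodeToString(digest));
        }
    }
}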