Get keypair from seed - wavesplatform

I have a Waves full node on my server.
Using the REST API I've generated a few addresses (via POST /addresses).
With the REST API I can get an encoded seed for each of these addresses, for example:
GET /addresses/seed/<address>
{
  "address" : "address",
  "seed" : "seed_value"
}
But to send money from those addresses I need the corresponding private key.
So how can I get it?

OK, as it turned out in this thread (https://forum.wavesplatform.com/t/question-about-addresses-endpoint/7752), each address is generated from the same wallet seed with a prepended byte array, or nonce (for the first address: [0x00, 0x00, 0x00, 0x00]).
The byte array is incremented after each address creation.
So, instead of using the seed from /addresses/seed/<address>, I had to use the seed from /wallet/seed.
This one works in Python (using pywaves):
import axolotl_curve25519 as curve
import base58
import hashlib
import sha3
import pyblake2
import struct
import pywaves

def hashChain(noncedSecret):
    # Waves' "secure hash": Blake2b-256 followed by Keccak-256
    b = pyblake2.blake2b(noncedSecret, digest_size=32).digest()
    return sha3.keccak_256(b).digest()

seed = "value from /wallet/seed"
nonce = struct.pack(">L", 40)  # 4-byte big-endian nonce (the address index)
seedHash = hashChain(nonce + base58.b58decode(seed))
accountSeedHash = hashlib.sha256(seedHash).digest()
private_key = base58.b58encode(curve.generatePrivateKey(accountSeedHash))
address = pywaves.Address(privateKey=private_key)
P.S. I'm pretty sure I don't understand how an address' seed must be interpreted/used "in the right way", but I didn't find any documentation about this. If someone knows how to use it, I would really appreciate it.
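Wrapping the derivation above in a small helper makes the incrementing-nonce scheme explicit. This is a minimal sketch that reuses the imports and hashChain from the snippet above; the assumption, per the forum thread, is that the node assigns nonce 0 to its first address, 1 to the second, and so on:
def private_key_for_index(wallet_seed, index):
    # Pack the address index as the 4-byte big-endian nonce
    nonce = struct.pack(">L", index)
    seedHash = hashChain(nonce + base58.b58decode(wallet_seed))
    accountSeedHash = hashlib.sha256(seedHash).digest()
    return base58.b58encode(curve.generatePrivateKey(accountSeedHash))

# e.g. the private keys for the node's first three addresses:
keys = [private_key_for_index(seed, i) for i in range(3)]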

Related

Phantom wallet addresses not making sense

If I import a seed phrase with @solana/web3.js, I seem to get a different public address than the ones generated by Phantom wallet when I import the same seed phrase. Anyone have any idea why?
const seed = Bip39.mnemonicToSeedSync("<12 word phrase>").slice(0, 32);
const mint_authority = web3.Keypair.fromSeed(seed);
Do I need to do anything with derivation paths so that the generated addresses match those of Phantom wallet?
You can use this code; it runs correctly on my side (WalletData and cfg.ETH_TEMP_MNEMONIC are my own container class and config value):
from bip_utils import Base58Encoder, Bip39SeedGenerator, Bip44, Bip44Changes, Bip44Coins

seed_bytes = Bip39SeedGenerator(cfg.ETH_TEMP_MNEMONIC).Generate()
bip44_mst_ctx = Bip44.FromSeed(seed_bytes, Bip44Coins.SOLANA)
for i in range(100):
    # Account i corresponds to the derivation path m/44'/501'/i'/0'
    bip44_acc_ctx = bip44_mst_ctx.Purpose().Coin().Account(i)
    bip44_chg_ctx = bip44_acc_ctx.Change(Bip44Changes.CHAIN_EXT)
    new_wallet = WalletData()
    new_wallet.public_key = bip44_chg_ctx.PublicKey().ToAddress()
    # Solana keypair encoding: 32-byte secret followed by the 32-byte public key
    new_wallet.private_key = Base58Encoder.Encode(
        bip44_chg_ctx.PrivateKey().Raw().ToBytes()
        + bip44_chg_ctx.PublicKey().RawCompressed().ToBytes()[1:]
    )
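To answer the derivation-path question directly: Phantom derives account i at m/44'/501'/i'/0', while Keypair.fromSeed(seed) uses the raw BIP39 seed with no derivation at all, which is why the addresses differ. A minimal check with bip_utils (the mnemonic is a placeholder, never a real wallet phrase):
from bip_utils import Bip39SeedGenerator, Bip44, Bip44Changes, Bip44Coins

mnemonic = "<12 word phrase>"  # placeholder
seed_bytes = Bip39SeedGenerator(mnemonic).Generate()

# First Phantom account: m/44'/501'/0'/0'
first = (Bip44.FromSeed(seed_bytes, Bip44Coins.SOLANA)
         .Purpose().Coin().Account(0)
         .Change(Bip44Changes.CHAIN_EXT))
print(first.PublicKey().ToAddress())  # should match Phantom's first address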

How to get system IP address using in scala code?

I want to store the system IP address in a variable using Scala. I tried the snippet below, but it did not give me exactly what I was looking for.
val sysip = System.InetAddress.getLocalHost();
Can you please help?
import java.net._
val localhost: InetAddress = InetAddress.getLocalHost
val localIpAddress: String = localhost.getHostAddress
println(s"localIpAddress = $localIpAddress")
Note that getLocalHost may resolve to a loopback address (127.0.0.1) depending on your hosts configuration; in that case, enumerate NetworkInterface.getNetworkInterfaces to find a non-loopback address. You can find more details in the java.net.InetAddress documentation.

Deploying Keras model to Google Cloud ML for serving predictions

I need to understand how to deploy models on Google Cloud ML. My first task is to deploy a very simple text classifier on the service. I do it in the following steps (which could perhaps be shortened to fewer; if so, feel free to let me know):
Define the model using Keras and export to YAML
Load up YAML and export as a Tensorflow SavedModel
Upload model to Google Cloud Storage
Deploy model from storage to Google Cloud ML
Set the uploaded model version as default on the models website.
Run model with a sample input
I've finally made steps 1-5 work, but now I get the strange error seen below when running the model. Can anyone help? Details on the steps are below. Hopefully it can also help others who are stuck on one of the previous steps. My model works fine locally.
I've seen Deploying Keras Models via Google Cloud ML and Export a basic Tensorflow model to Google Cloud ML, but they seem to be stuck on other steps of the process.
Error
Prediction failed: Exception during model execution: AbortionError(code=StatusCode.INVALID_ARGUMENT, details="In[0] is not a matrix
[[Node: MatMul = MatMul[T=DT_FLOAT, _output_shapes=[[-1,64]], transpose_a=false, transpose_b=false, _device="/job:localhost/replica:0/task:0/cpu:0"](Mean, softmax_W/read)]]")
Step 1
# import necessary classes from Keras..
model_input = Input(shape=(maxlen,), dtype='int32')
embed = Embedding(input_dim=nb_tokens,
                  output_dim=256,
                  mask_zero=False,
                  input_length=maxlen,
                  name='embedding')
x = embed(model_input)
x = GlobalAveragePooling1D()(x)
outputs = [Dense(nb_classes, activation='softmax', name='softmax')(x)]
model = Model(input=[model_input], output=outputs, name="fasttext")
# export to YAML..
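The export itself is elided above; presumably it uses Keras' model.to_yaml(), roughly like this sketch (the file name is illustrative):
yaml_string = model.to_yaml()
with open('fasttext.yaml', 'w') as f:
    f.write(yaml_string)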
Step 2
from __future__ import print_function
import sys
import os
import tensorflow as tf
from tensorflow.contrib.session_bundle import exporter
import keras
from keras import backend as K
from keras.models import model_from_config, model_from_yaml
from optparse import OptionParser

EXPORT_VERSION = 1  # for us to keep track of different model versions (integer)

def export_model(model_def, model_weights, export_path):
    with tf.Session() as sess:
        init_op = tf.global_variables_initializer()
        sess.run(init_op)
        K.set_learning_phase(0)  # all new operations will be in test mode from now on

        yaml_file = open(model_def, 'r')
        yaml_string = yaml_file.read()
        yaml_file.close()
        model = model_from_yaml(yaml_string)

        # force initialization
        model.compile(loss='categorical_crossentropy',
                      optimizer='adam')
        Wsave = model.get_weights()
        model.set_weights(Wsave)
        # weights are not loaded as I'm just testing, not really deploying
        # model.load_weights(model_weights)

        print(model.input)
        print(model.output)

        pred_node_names = output_node_names = 'Softmax:0'
        num_output = 1

        export_path_base = export_path
        export_path = os.path.join(
            tf.compat.as_bytes(export_path_base),
            tf.compat.as_bytes('initial'))
        builder = tf.saved_model.builder.SavedModelBuilder(export_path)

        # Build the signature_def_map.
        x = model.input
        y = model.output

        values, indices = tf.nn.top_k(y, 5)
        table = tf.contrib.lookup.index_to_string_table_from_tensor(
            tf.constant([str(i) for i in xrange(5)]))
        prediction_classes = table.lookup(tf.to_int64(indices))

        classification_inputs = tf.saved_model.utils.build_tensor_info(model.input)
        classification_outputs_classes = tf.saved_model.utils.build_tensor_info(prediction_classes)
        classification_outputs_scores = tf.saved_model.utils.build_tensor_info(values)

        classification_signature = (
            tf.saved_model.signature_def_utils.build_signature_def(
                inputs={tf.saved_model.signature_constants.CLASSIFY_INPUTS: classification_inputs},
                outputs={tf.saved_model.signature_constants.CLASSIFY_OUTPUT_CLASSES: classification_outputs_classes,
                         tf.saved_model.signature_constants.CLASSIFY_OUTPUT_SCORES: classification_outputs_scores},
                method_name=tf.saved_model.signature_constants.CLASSIFY_METHOD_NAME))

        tensor_info_x = tf.saved_model.utils.build_tensor_info(x)
        tensor_info_y = tf.saved_model.utils.build_tensor_info(y)

        prediction_signature = (tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'images': tensor_info_x},
            outputs={'scores': tensor_info_y},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))

        legacy_init_op = tf.group(tf.tables_initializer(), name='legacy_init_op')
        builder.add_meta_graph_and_variables(
            sess, [tf.saved_model.tag_constants.SERVING],
            signature_def_map={'predict_images': prediction_signature,
                               tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: classification_signature},
            legacy_init_op=legacy_init_op)

        builder.save()
        print('Done exporting!')
        raise SystemExit

if __name__ == '__main__':
    usage = "usage: %prog [options] arg"
    parser = OptionParser(usage)
    (options, args) = parser.parse_args()

    if len(args) < 3:
        raise ValueError("Too few arguments!")

    model_def = args[0]
    model_weights = args[1]
    export_path = args[2]
    export_model(model_def, model_weights, export_path)
Step 3
gsutil cp -r fasttext_cloud/ gs://quiet-notch-xyz.appspot.com
Step 4
from __future__ import print_function
from oauth2client.client import GoogleCredentials
from googleapiclient import discovery
from googleapiclient import errors
import time

projectID = 'projects/{}'.format('quiet-notch-xyz')
modelName = 'fasttext'
modelID = '{}/models/{}'.format(projectID, modelName)
versionName = 'Initial'
versionDescription = 'Initial release.'
trainedModelLocation = 'gs://quiet-notch-xyz.appspot.com/fasttext/'

credentials = GoogleCredentials.get_application_default()
ml = discovery.build('ml', 'v1', credentials=credentials)

# Create a dictionary with the fields from the request body.
requestDict = {'name': modelName, 'description': 'Online predictions.'}

# Create a request to call projects.models.create.
request = ml.projects().models().create(parent=projectID, body=requestDict)

# Make the call.
try:
    response = request.execute()
except errors.HttpError as err:
    # Something went wrong, print out some information.
    print('There was an error creating the model. Check the details:')
    print(err._get_reason())
    # Clear the response for next time.
    response = None
    raise

time.sleep(10)

requestDict = {'name': versionName,
               'description': versionDescription,
               'deploymentUri': trainedModelLocation}

# Create a request to call projects.models.versions.create.
request = ml.projects().models().versions().create(parent=modelID,
                                                   body=requestDict)

# Make the call.
try:
    print("Creating model setup..", end=' ')
    response = request.execute()
    # Get the operation name.
    operationID = response['name']
    print('Done.')
except errors.HttpError as err:
    # Something went wrong, print out some information.
    print('There was an error creating the version. Check the details:')
    print(err._get_reason())
    raise

done = False
request = ml.projects().operations().get(name=operationID)
print("Adding model from storage..", end=' ')
while not done:
    response = None
    # Wait for 10 seconds.
    time.sleep(10)
    # Make the next call.
    try:
        response = request.execute()
        # Check for finish.
        done = True  # response.get('done', False)
    except errors.HttpError as err:
        # Something went wrong, print out some information.
        print('There was an error getting the operation. Check the details:')
        print(err._get_reason())
        done = True
        raise
print("Done.")
Step 5
Use the models website.
Step 6
import googleapiclient.discovery

def predict_json(instances, project='quiet-notch-xyz', model='fasttext', version=None):
    """Send json data to a deployed model for prediction.

    Args:
        project (str): project where the Cloud ML Engine Model is deployed.
        model (str): model name.
        instances ([Mapping[str: Any]]): Keys should be the names of Tensors
            your deployed model expects as inputs. Values should be datatypes
            convertible to Tensors, or (potentially nested) lists of datatypes
            convertible to tensors.
        version: str, version of the model to target.

    Returns:
        Mapping[str: any]: dictionary of prediction results defined by the
            model.
    """
    # Create the ML Engine service object.
    # To authenticate set the environment variable
    # GOOGLE_APPLICATION_CREDENTIALS=<path_to_service_account_file>
    service = googleapiclient.discovery.build('ml', 'v1')
    name = 'projects/{}/models/{}'.format(project, model)

    if version is not None:
        name += '/versions/{}'.format(version)

    response = service.projects().predict(
        name=name,
        body={'instances': instances}
    ).execute()

    if 'error' in response:
        raise RuntimeError(response['error'])

    return response['predictions']
Then run the function with a test input: predict_json({'inputs': [[18, 87, 13, 589, 0]]})
There is now a sample demonstrating the use of Keras on the CloudML Engine, including prediction. You can find the sample here:
https://github.com/GoogleCloudPlatform/cloudml-samples/tree/master/census/keras
I would suggest comparing your code to that code.
Some additional suggestions that are still relevant:
CloudML Engine currently only supports using a single signature (the default signature). Looking at your code, I think prediction_signature is more likely to lead to success, but you haven't made it the default signature. I suggest the following:
builder.add_meta_graph_and_variables(
    sess, [tf.saved_model.tag_constants.SERVING],
    signature_def_map={tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: prediction_signature},
    legacy_init_op=legacy_init_op)
If you are deploying to the service, then you would invoke prediction like so:
predict_json({'images':[[18, 87, 13, 589, 0]]})
If you are testing locally using gcloud ml-engine local predict --json-instances, the input data is slightly different (it matches that of the batch prediction service). Each newline-separated line looks like this (showing a file with two lines):
{"images": [[18, 87, 13, 589, 0]]}
{"images": [[21, 85, 13, 100, 1]]}
I don't actually know enough about the shape of model.x to ensure the data being sent is correct for your model.
By way of explanation, it may be insightful to consider the difference between the Classification and Prediction methods in SavedModel. One difference is relevant when using tensorflow_serving, which is based on gRPC and therefore strongly typed: Classification provides a strongly-typed signature that most classifiers can use, so you can reuse the same client on any classifier.
That's not overly useful when using JSON, since JSON isn't strongly typed.
One other difference is that, with tensorflow_serving, Prediction accepts column-based inputs (a map from feature name to every value for that feature in the whole batch), whereas Classification accepts row-based inputs (each input instance/example is a row).
CloudML abstracts that away a bit and always requires row-based inputs (a list of instances). Even though we only officially support Prediction, Classification should work as well.

Kraken API MATLAB client invalid signature error

I'm trying to make some authenticated calls to Kraken private endpoints, but without success: I keep getting the error EAPI:Invalid signature.
Does anybody know what's wrong?
Here's the code:
function [response,status]=kraken_authenticated(uri,postdata)
    % test uri='0/private/AddOrder'
    % test postdata='&pair=XBTEUR&type=buy&ordertype=limit&price=345.214&volume=0.65412&leverage=1.5&oflags=post'
    url=['https://api.kraken.com/',uri];
    % nonce
    nonce = num2str(floor((now-datenum('1970', 'yyyy'))*8640000000));
    [key,secret]=key_secret('kraken');
    % 1st hash
    Opt.Method = 'SHA-256';
    Opt.Input = 'ascii';
    sha256string = DataHash(['nonce=',nonce,postdata],Opt);
    % 2nd hash
    sign = crypto([uri,sha256string], secret, 'HmacSHA512');
    header_1=http_createHeader('API-Key',key);
    header_2=http_createHeader('API-Sign',char(sign));
    header=[header_1 header_2];
    [response,status] = urlread2(url,'POST',['nonce=',nonce,postdata],header);
end
The crypto function is in another file:
function signStr = crypto(str, key, algorithm)
    import java.net.*;
    import javax.crypto.*;
    import javax.crypto.spec.*;
    import org.apache.commons.codec.binary.*
    keyStr = java.lang.String(key);
    key = SecretKeySpec(keyStr.getBytes('UTF-8'), algorithm);
    mac = Mac.getInstance(algorithm);
    mac.init(key);
    toSignStr = java.lang.String(str);
    signStr = java.lang.String(Hex.encodeHex( mac.doFinal( toSignStr.getBytes('UTF-8'))));
end
I've also tried
sign = crypto([uri,sha256string], base64decode(secret), 'HmacSHA512');
but without success.
This is the guide for the authenticated call's HTTPS headers:
API-Key = API key
API-Sign = Message signature using HMAC-SHA512 of (URI path + SHA256(nonce + POST data)) and base64 decoded secret API key
This is the guide for the authenticated call's POST data:
nonce = always increasing unsigned 64 bit integer
otp = two-factor password (if two-factor enabled, otherwise not required)
I've tried passing the "nonce" parameter, or all of the parameters in "postdata", as POST data, but without success.
Thanks for your help.
The problem is in the function crypto, here:
keyStr = java.lang.String(key);
key = SecretKeySpec(keyStr.getBytes('UTF-8'), algorithm);
Since the base64-encoded private key from Kraken does not necessarily decode to valid UTF-8, you cannot round-trip it through a UTF-8 string and pass that string's bytes to the SecretKeySpec function. You need to pass the decoded byte array directly instead.
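For reference, here is a minimal sketch of Kraken's documented signing scheme (the one quoted above) in Python, which makes the byte handling explicit; the endpoint, parameters, and secret below are placeholders:
import base64, hashlib, hmac, time, urllib.parse

def kraken_sign(uri_path, post, secret_b64):
    postdata = urllib.parse.urlencode(post)
    # SHA-256 over (nonce + POST data), kept as raw digest bytes
    sha = hashlib.sha256((str(post['nonce']) + postdata).encode()).digest()
    # HMAC-SHA512 keyed with the *base64-decoded* secret (raw bytes, never a UTF-8 string)
    mac = hmac.new(base64.b64decode(secret_b64), uri_path.encode() + sha, hashlib.sha512)
    # API-Sign is the base64-encoded MAC
    return base64.b64encode(mac.digest()).decode()

signature = kraken_sign('/0/private/AddOrder',
                        {'nonce': int(time.time() * 1000), 'pair': 'XBTEUR', 'type': 'buy'},
                        'c2VjcmV0LWtleQ==')  # placeholder secret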
Similar issues:
https://code.google.com/p/google-apps-script-issues/issues/detail?id=5113
https://code.google.com/p/google-apps-script-issues/issues/detail?id=3121
A solution for JavaScript:
github.com/Caligatio/jsSHA

My web app on Yesod, Auth-HashDB, and PostgreSQL refuses to compile -- Couldn't match type ‘AuthEntity App’ with ‘User’

I've been trying to give myself a crash course in Yesod, but I can't seem to figure out what I'm doing wrong here. It's likely a conceptual failing, but I've more or less copy-pasted the code available in various short introductions to HashDB in an attempt to make a hashed-DB authentication system, but no dice.
Foundation.hs:136:23:
Couldn't match type ‘AuthEntity App’ with ‘User’
In the expression: getAuthIdHashDB AuthR (Just . UniqueUser) creds
In an equation for ‘getAuthId’:
getAuthId creds = getAuthIdHashDB AuthR (Just . UniqueUser) creds
In the instance declaration for ‘YesodAuth App’
Here is each relevant segment of code.
config/models:
User
    name Text
    password Text Maybe
    UniqueUser name
Model.hs:
import Yesod.Auth.HashDB (HashDBUser, userPasswordHash, setPasswordHash)
import Database.Persist.Quasi (lowerCaseSettings)
...
share [mkPersist sqlSettings, mkMigrate "migrateAll"]
    $(persistFileWith lowerCaseSettings "config/models")

instance HashDBUser User where
    userPasswordHash = userPassword
    setPasswordHash h u = u { userPassword = Just h }
Foundation.hs:
...
import Yesod.Auth
import Yesod.Auth.HashDB (authHashDBWithForm, getAuthIdHashDB, authHashDB)
import Yesod.Auth.Message (AuthMessage (InvalidLogin))
...
instance YesodAuth App where
    type AuthId App = UserId
    loginDest _ = HomeR
    logoutDest _ = HomeR
    redirectToReferer _ = True
    authPlugins _ = [ authHashDB (Just . UniqueUser) ]
    getAuthId creds = getAuthIdHashDB AuthR (Just . UniqueUser) creds
    authHttpManager = getHttpManager
Any help would be appreciated. I still kind of suck at Haskell, so this is my attempt at a crash course in it as well.
This typically means that you don't have an AuthEntity associated type declared, which in turn means that you don't have a YesodAuthPersist instance. In your case, this is probably just:
instance YesodAuthPersist App where
    type AuthEntity App = User
This is provided by the Yesod scaffolding.