I am trying to calculate the private data object's hash using the SHA-256 hashing algorithm.
The requirement is to calculate the hash of the private data object and match it against the hash stored on the ledger, but the two hashes do not match.
I am using Hyperledger Fabric v2.3.3.
For example, the private data object is:
const data = {
"assetType": "PART",
"partNumber": "3415",
"serialNumber": "312351234123"
};
Below is the code executed to save private data:
await stub.putPrivateData('collection_org1', `PVT-${id}`, Buffer.from(JSON.stringify(data)))
The hash calculated using crypto.createHash('sha256').update(JSON.stringify(data)).digest('hex')
prints the following hash:
ae2fda8277f53ccd0b9de91ccc2c289142e0030f423966f706686bc1b5fe2e6e
Using shasum:
❯ echo -n "{\"assetType\":\"PART\",\"partNumber\":\"3415\",\"serialNumber\":\"312351234123\"}" | shasum -a 256
ae2fda8277f53ccd0b9de91ccc2c289142e0030f423966f706686bc1b5fe2e6e -
But the hash on the ledger, fetched using stub.getPrivateDataHash(), is:
559f3ed03c81003bdb67efb83c3a90b6aac4d0169c40aa74a68fbf43f7eeb699
If I query the private data for the key, it returns exactly the object shown above.
Am I doing something wrong here?
I tested this on my system and I'm getting the expected result: the private data hash and the calculated hash match. Attaching the code snippet below.
const crypto = require('crypto')

// Data JSON
const data = {
  "assetType": "PART",
  "partNumber": "3415",
  "serialNumber": "312351234123"
}

// Store private data
async putData(ctx) {
  await ctx.stub.putPrivateData('_implicit_org_Org1MSP', "KEY123", Buffer.from(JSON.stringify(data)))
}

// Get Hash
async getDataHash(ctx) {
  let dataHash = crypto.createHash('sha256').update(Buffer.from(JSON.stringify(data))).digest('hex')
  let privateDataHashUint = await ctx.stub.getPrivateDataHash('_implicit_org_Org1MSP', "KEY123")
  let resp = { dataHash, privateDataHash: Buffer.from(privateDataHashUint).toString('hex') }
  return resp
}
The following response is received from getDataHash:
{"dataHash":"ae2fda8277f53ccd0b9de91ccc2c289142e0030f423966f706686bc1b5fe2e6e","privateDataHash":"ae2fda8277f53ccd0b9de91ccc2c289142e0030f423966f706686bc1b5fe2e6e"}
I want to update a field of a particular document in Firestore while listening for any change in its value. Basically, when the field 'call' is found to be 1, we print a message and then update it back to 0. However, the update takes too much time; updating the same field using Python is much quicker. Please advise me on what should be done.
FirebaseFirestore.instance
    .collection('joystick')
    .snapshots()
    .listen((QuerySnapshot querySnapshot) {
  var firestoreList = querySnapshot.docs;
  var data = firestoreList.first.data();
  var callval = firestoreList.first.get('call');
  print("call value while listening ${callval}");
  if (callval == 1) {
    print("call value is 1");
    //makecall();
    FirebaseFirestore.instance.collection('joystick').doc('data').update({'call': 0});
    call = firestoreList.first.get('call');
    print("Next call value is ${call}");
    if (call == 0) {
      print("updated!");
      callval = 0;
    }
  }
}).onError((e) => print(e));
Document writes take as long as they take and depend on network connection speed, but you can speed up uploads by compressing the data. Dart has a useful tool for this: GZip. GZip can compress String data by up to 99%. You can encode/decode the data in many ways; even HttpClient can decompress it automatically (autoUncompress) :).
import 'dart:convert';
import 'dart:io';

import 'package:http/http.dart' as http;

// Send compressed data.
Future<void> sendCompressed() async {
  final gzip = GZipCodec();
  // You can convert any data to a JSON string with `dart:convert`.
  const String jsonToSend = '{"field": "Some JSON to send to the server."}';
  // Original data.
  final List<int> original = utf8.encode(jsonToSend);
  // Compress the data.
  final List<int> compressed = gzip.encode(original);
  // Send `compressed` to the database.
  print(compressed);
}

// Get compressed data.
Future<void> getCompressed() async {
  final gzip = GZipCodec();
  // Get compressed data from the database.
  final response = await http.get(Uri.parse('https://data.com'));
  final List<int> compressed = response.bodyBytes;
  // Decompress.
  final List<int> decompressed = gzip.decode(compressed);
  // Decode back to a String (JSON).
  final String decoded = utf8.decode(decompressed);
  // Do what you want with the decoded data.
  print(decoded);
}
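If the compressed bytes are meant to be stored in Firestore itself rather than sent to a server, one possible approach is to wrap them in a Blob field. This is only a sketch: it assumes the cloud_firestore package and reuses the 'joystick'/'data' document from the question, with a hypothetical 'payload' field.
import 'dart:typed_data';

import 'package:cloud_firestore/cloud_firestore.dart';

// Hypothetical: store gzip-compressed bytes in a Blob field of the document.
Future<void> storeCompressed(List<int> compressed) async {
  await FirebaseFirestore.instance
      .collection('joystick')
      .doc('data')
      .update({'payload': Blob(Uint8List.fromList(compressed))});
}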
Repeat of https://github.com/pauldemarco/flutter_blue/issues/868
On Windows, I have some code that looks like:
wclGattClient.ReadCharacteristicValue(errorchar, wclGattOperationFlag.goNone, out var Value);
foreach (uint16 handle in Value)
{
    // Find the characteristic discovered earlier that has this handle.
    foreach (chars in service)
    {
        if (chars.handle == handle)
        {
            // Read the flagged characteristic itself.
            wclGattClient.ReadCharacteristicValue(chars, wclGattOperationFlag.goNone, out var Val2);
            print("UUID %s is flagged : %s", chars.uuid, Val2.toString());
        }
    }
}
i.e., the device returns a list of handles that are in an alert status (so I can read them and present the condition to the user), and I need to match these up with the handles of the characteristics from discoverServices so I know which ones to read the data from.
How do I do this with flutter_blue?
The flutter_blue documentation on GitHub includes a section about reading and writing characteristics:
// Reads all characteristics
var characteristics = service.characteristics;
for(BluetoothCharacteristic c in characteristics) {
List<int> value = await c.read();
print(value);
}
// Writes to a characteristic
await c.write([0x12, 0x34])
In this example, BluetoothCharacteristic c would be the handle you can use to read and write values.
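flutter_blue identifies characteristics by UUID rather than by ATT handle, so a rough equivalent of the Windows loop could look like the sketch below. It assumes your application knows the UUID of the "alert" characteristic and which UUID corresponds to each handle value the device reports; the handleToUuid map is a hypothetical example, not part of the flutter_blue API.
import 'package:flutter_blue/flutter_blue.dart';

// Sketch: read the "alert" characteristic, then read every characteristic
// whose UUID the app maps to one of the returned handle values.
Future<void> readFlaggedCharacteristics(
  BluetoothDevice device,
  Guid alertCharUuid,          // UUID of the characteristic listing alerts (assumed known)
  Map<int, Guid> handleToUuid, // app-level mapping from handle value to characteristic UUID
) async {
  final services = await device.discoverServices();
  final characteristics = services.expand((s) => s.characteristics).toList();

  // Read the list of flagged handles from the alert characteristic.
  final alertChar = characteristics.firstWhere((c) => c.uuid == alertCharUuid);
  final List<int> flaggedHandles = await alertChar.read();

  for (final handle in flaggedHandles) {
    final targetUuid = handleToUuid[handle];
    if (targetUuid == null) continue;
    final target = characteristics.firstWhere((c) => c.uuid == targetUuid);
    final value = await target.read();
    print('UUID ${target.uuid} is flagged : $value');
  }
}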
I am using the Parse Server SDK in my app for the database.
I have three classes in my Back4App Dashboard: "_User", "Office", and "Office_Members".
The Office_Members class has the following columns:
user_id (Pointer to _User)
office_id (Pointer to Office)
count
To fetch the data from Office_Members, including the _User pointer, I am using the following code:
QueryBuilder<ParseObject> parseQuery = QueryBuilder<ParseObject>(ParseObject("Office_Members"))
..whereEqualTo("office_id", ParseResponse_OfficeObject)
..includeObject(["user_id "]);
ParseResponse apiResponse = await parseQuery.query();
Output :
Payload : [{"className":"Office_Members","objectId":"twpDY51PUK","createdAt":"2020-08-14T09:58:59.775Z","updatedAt":"2020-08-14T09:58:59.775Z","office_id":{"__type":"Pointer","className":"Office","objectId":"4dkfSMrwBI"},"user_id":{"__type":"Pointer","className":"_User","objectId":"Hx5xJ5ABxG"},"count":1}]
In my payload response I am not getting the whole user_id pointer data.
Can anybody help me with what I might be doing wrong?
Thanks.
The data should be included. The print function simply does not print the data of pointers.
You can print it out directly for testing purposes, e.g. by evaluating the expression:
response.results[0].get('user_id').get('name')
You can access it the same way in your model, e.g.:
Call Model
if(response.success){
return response.results.map((p) => Example.fromParse(p)).toList();
} else {
throw ParseErrors.getDescription(response.error.code);
}
Model
import 'package:parse_server_sdk/parse_server_sdk.dart';

class Example {
  Example({this.id, this.name});

  Example.fromParse(ParseObject parseObject)
      : id = parseObject.objectId,
        name = parseObject.get('user_id').get('name');

  final String id;
  final String name;

  @override
  String toString() {
    return 'Example{id: $id, name: $name}';
  }
}
Why not simply use Cloud Code? I'm not too familiar with Flutter, but I can suggest this alternative solution.
Write a function like this:
Parse.Cloud.define("fetchMemberAndUser", async (request) => {
  // Pass in the ParseResponse_OfficeObject ID as a parameter
  var objectId = request.params.id;
  // Now do a simple get query
  var query = new Parse.Query(Parse.Object.extend("Office_Members"));
  // Use .include to get the user profile object
  query.include("user_id");
  // This will return the Office_Members object along with the user profile
  return query.get(objectId, { useMasterKey: true });
});
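On the Flutter side the function can then be called through the SDK's cloud function wrapper. A minimal sketch, where the function name and the 'id' parameter follow the Cloud Code above; depending on the SDK version the result may come back as a ParseObject or as a decoded map:
import 'package:parse_server_sdk/parse_server_sdk.dart';

// Sketch: call the Cloud Code function defined above from Flutter.
Future<dynamic> fetchMemberAndUser(String officeObjectId) async {
  final function = ParseCloudFunction('fetchMemberAndUser');
  final ParseResponse response =
      await function.execute(parameters: {'id': officeObjectId});
  if (response.success) {
    // The Office_Members object with user_id included.
    return response.result;
  }
  throw ParseErrors.getDescription(response.error.code);
}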
I'm attempting to implement cursor-based pagination for a reliable dictionary. I know that IReliableDictionary keys must implement IComparable, and that IReliableDictionary provides this method for enumerating dictionary entries:
Task<IAsyncEnumerable<KeyValuePair<TKey, TValue>>> CreateEnumerableAsync(
    ITransaction txn,
    Func<TKey, bool> filter,
    EnumerationMode enumerationMode);
When EnumerationMode.Ordered is used, I assume we enumerate the key-value pairs according to the key's IComparable implementation.
Can we also assume the filter parameter is applied to each key in order of the key's IComparable implementation? Put another way: are keys laid out in memory and/or enumerated in order of their IComparable implementation? If so, is this behavior documented, or should it be considered an implementation detail that may change?
I performed an experiment using the Voting Web sample, and the keys do appear to be filtered in order of their IComparable implementation, with some caveats:
The filter can be run multiple times over the same key.
The filter can be applied to items which have been recently deleted.
VotingData.Controllers.VoteDataController has the following get and put operations to list and add voting categories:
[HttpPut("{name}")]
public async Task<IActionResult> Put(string name)
{
IReliableDictionary<string, int> votesDictionary = await this.stateManager.GetOrAddAsync<IReliableDictionary<string, int>>("counts");
using (ITransaction tx = this.stateManager.CreateTransaction())
{
await votesDictionary.AddOrUpdateAsync(tx, name, 1, (key, oldvalue) => oldvalue + 1);
await tx.CommitAsync();
}
return new OkResult();
}
I modified Get to enumerate votesDictionary in order, and applied a filter which builds a list of keys seen by the filter:
[HttpGet]
public async Task<IActionResult> Get()
{
CancellationToken ct = new CancellationToken();
IReliableDictionary<string, int> votesDictionary = await this.stateManager.GetOrAddAsync<IReliableDictionary<string, int>>("counts");
var filteredKeys = new List<string>();
using (ITransaction tx = this.stateManager.CreateTransaction())
{
IAsyncEnumerable<KeyValuePair<string, int>> list = await votesDictionary.CreateEnumerableAsync(tx, key =>
{
lock (this.locker)
{
filteredKeys.Add(key);
return true;
}
},
EnumerationMode.Ordered);
IAsyncEnumerator<KeyValuePair<string, int>> enumerator = list.GetAsyncEnumerator();
List<KeyValuePair<string, int>> result = new List<KeyValuePair<string, int>>();
while (await enumerator.MoveNextAsync(ct))
{
result.Add(enumerator.Current);
}
return this.Json(result);
}
}
I added keys to the dictionary in random alphabetical order and refreshed the page to run the Get query. On each refresh, the filteredKeys collection contained my entries in sorted alphabetical order. As mentioned above, the collection sometimes contained duplicate entries for certain strings. When I removed items from the collection and refreshed the page, I found deleted keys were still added to filteredKeys, although these elements were not returned in the result enumeration.
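For reference, the kind of cursor-based pagination I have in mind would look roughly like the sketch below. It leans on the ordered enumeration behavior observed above; the pageSize and cursor parameters are my own additions, and the filter would have to tolerate being called more than once per key and seeing recently deleted keys.
public async Task<List<KeyValuePair<string, int>>> GetPageAsync(string cursor, int pageSize)
{
    CancellationToken ct = new CancellationToken();
    IReliableDictionary<string, int> votesDictionary =
        await this.stateManager.GetOrAddAsync<IReliableDictionary<string, int>>("counts");

    var page = new List<KeyValuePair<string, int>>();
    using (ITransaction tx = this.stateManager.CreateTransaction())
    {
        // Skip keys at or before the cursor; with Ordered enumeration the
        // remaining keys come back in the key's IComparable order.
        IAsyncEnumerable<KeyValuePair<string, int>> list = await votesDictionary.CreateEnumerableAsync(
            tx,
            key => cursor == null || key.CompareTo(cursor) > 0,
            EnumerationMode.Ordered);

        IAsyncEnumerator<KeyValuePair<string, int>> enumerator = list.GetAsyncEnumerator();
        while (page.Count < pageSize && await enumerator.MoveNextAsync(ct))
        {
            page.Add(enumerator.Current);
        }
    }

    // The last key on the page becomes the cursor for the next request.
    return page;
}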
I've set up a simple "product" model (i.e. {id: string, name: string, etc.}) and set up a datasource using the REST connector to a remote URL that returns a JSON blob containing dozens of fields. How do I go about mapping the fields from the remote response to my local model? Whenever I execute my method I get back the raw response from the remote; I was expecting, at a minimum, to get back an empty version of my model.
I'm pretty sure you will have to override the find() method on your model and perform this mapping work manually.
Something like this:
module.exports = function(app) {
  var Product = app.models.Product;
  var find = Product.find;

  Product.find = function(filter, cb) {
    // invoke the default method
    find.call(Product, filter, function(err, original_results) {
      if (err) return cb(err);
      var results = {}; // a placeholder for your expected results
      results.id = original_results.id;
      results.name = original_results.name;
      results.description = original_results.long_description;
      // and so on
      cb(null, results);
    });
  };
};
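A hypothetical usage once the override is in place, assuming the file runs as a boot script so app.models.Product is available:
// Hypothetical usage: the mapped result now follows the local Product model.
Product.find({}, function(err, product) {
  if (err) return console.error(err);
  console.log(product.name, product.description);
});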