Using @ionic-native/ble I'm able to scan and discover a BLE device that has manufacturer-specific data.
According to the library (https://github.com/don/cordova-plugin-ble-central#ios-1), this is the way to get that data:
const mfgData = new Uint8Array(device.advertising.kCBAdvDataManufacturerData);
console.log('Manufacturer Data: ', mfgData);
const hex = Buffer.from(mfgData).toString('hex');
console.log(hex);
The hex-encoded result is: 2604 0504 386 55c0b
What I don't understand is the proper way to use this result to decode the manufacturer (company) ID, which is supposed to be "0x0426".
You can try the following:
const toHexString = bytes =>
bytes.reduce((str, byte) => str + byte.toString(16).padStart(2, '0'), '');
console.log(toHexString(new Uint8Array([0, 1, 2, 42, 100, 101, 102, 255])))
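To answer the decoding part: in a BLE advertisement, the manufacturer-specific data begins with the 16-bit Bluetooth company identifier in little-endian byte order, so the leading bytes 0x26 0x04 read as 0x0426. A minimal sketch building on the mfgData array from the question:
// The company ID is the first two bytes, least-significant byte first.
const companyId = mfgData[0] | (mfgData[1] << 8);
console.log('Company ID: 0x' + companyId.toString(16).padStart(4, '0')); // 0x0426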
I am using the library https://pub.dev/packages/flutter_reactive_ble for scanning a beacon. For broadcasting the beacon I am using an Android app called Beacon Simulator, which has a beacon UUID, major ID and minor ID for each hosted beacon.
Here the UUID is '0546b93b-f113-438d-8bab-b9e766dd'.
While scanning, however, I am unable to find that UUID; instead I get a response with this structure:
DiscoveredDevice(id: 14:49:2F:BB:D5:97, name: , serviceData: {}, serviceUuids: [], manufacturerData: [76, 0, 2, 21, 5, 70, 185, 59, 241, 19, 67, 141, 139, 171, 185, 231, 102, 208, 17, 119, 0, 0, 0, 0, 191], rssi: -39)
Is there any way I can get the beacon's UUID in the format '0546b93b-f113-438d-8ba...'?
Or is there any better way to uniquely identify a device? I know we can search for the device's Bluetooth address, but that keeps changing.
Thanks in advance.
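The manufacturerData shown above follows the usual iBeacon layout: two bytes of Apple's company ID (0x004C, little-endian), a type byte 0x02, a length byte 0x15, then the 16-byte proximity UUID, a 2-byte major, a 2-byte minor and a 1-byte measured TX power. The app itself is Dart, but here is an illustrative JavaScript sketch of decoding that layout (offsets assumed from the structure just described):
const manufacturerData = [76, 0, 2, 21, 5, 70, 185, 59, 241, 19, 67, 141, 139, 171, 185, 231, 102, 208, 17, 119, 0, 0, 0, 0, 191];
const hex = b => b.toString(16).padStart(2, '0');
// Bytes 4..19 hold the 16-byte proximity UUID.
const u = manufacturerData.slice(4, 20).map(hex).join('');
const uuid = [u.slice(0, 8), u.slice(8, 12), u.slice(12, 16), u.slice(16, 20), u.slice(20)].join('-');
// Major and minor are big-endian 16-bit integers.
const major = (manufacturerData[20] << 8) | manufacturerData[21];
const minor = (manufacturerData[22] << 8) | manufacturerData[23];
console.log(uuid, major, minor); // 0546b93b-f113-438d-8bab-b9e766d01177 0 0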
I've encountered a problem deploying Django Channels on Heroku, whilst using a RedisChannelLayer.
I get a UnicodeError: encoding with 'idna' codec failed (UnicodeError: label empty or too long) during connection (full traceback below).
That seems to be a problem with one of the labels in the host address being too long, as described in this Python issue.
I printed some information out of my consumer, and also wrapped Python's socket.getaddrinfo function to display host and connection information.
This related post has the same problem connecting to Shopify rather than a Redis instance; they got around it by placing the credentials into the request header. But I don't have control over channels_redis or asyncio.
Any clues?
Properties of the Django Channels Consumer:
.groups []
.channel_layer RedisChannelLayer(hosts=[{'address': ('h:alongkeycomprisingof65charsintotalxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx#ec2-18-202-152-61.eu-west-1.compute.amazonaws.com', '9759')}])
.channel_name specific.OSfTzyqY!pdvgHnaCxWiv
.room_name 8e3d3083-8bb1-4d85-89d3-4496d9b9e946
.room_group_name twined_8e3d3083-8bb1-4d85-89d3-4496d9b9e946
Full Traceback:
Traceback (most recent call last):
File "/app/.heroku/python/lib/python3.6/site-packages/channels/sessions.py", line 183, in __call__
return await self.inner(receive, self.send)
File "/app/.heroku/python/lib/python3.6/site-packages/channels/middleware.py", line 41, in coroutine_call
await inner_instance(receive, send)
File "/app/.heroku/python/lib/python3.6/site-packages/channels/consumer.py", line 59, in __call__
[receive, self.channel_receive], self.dispatch
File "/app/.heroku/python/lib/python3.6/site-packages/channels/utils.py", line 51, in await_many_dispatch
await dispatch(result)
File "/app/.heroku/python/lib/python3.6/site-packages/channels/consumer.py", line 73, in dispatch
await handler(message)
File "/app/.heroku/python/lib/python3.6/site-packages/channels/generic/websocket.py", line 175, in websocket_connect
await self.connect()
File "/app/backend/pink/consumers.py", line 24, in connect
await self.channel_layer.group_add(self.room_group_name, self.channel_name)
File "/app/.heroku/python/lib/python3.6/site-packages/channels_redis/core.py", line 589, in group_add
async with self.connection(self.consistent_hash(group)) as connection:
File "/app/.heroku/python/lib/python3.6/site-packages/channels_redis/core.py", line 835, in __aenter__
self.conn = await self.pool.pop()
File "/app/.heroku/python/lib/python3.6/site-packages/channels_redis/core.py", line 73, in pop
conns.append(await aioredis.create_redis(**self.host, loop=loop))
File "/app/.heroku/python/lib/python3.6/site-packages/aioredis/commands/__init__.py", line 175, in create_redis
loop=loop)
File "/app/.heroku/python/lib/python3.6/site-packages/aioredis/connection.py", line 113, in create_connection
timeout)
File "/app/.heroku/python/lib/python3.6/asyncio/tasks.py", line 339, in wait_for
return (yield from fut)
File "/app/.heroku/python/lib/python3.6/site-packages/aioredis/stream.py", line 24, in open_connection
lambda: protocol, host, port, **kwds)
File "/app/.heroku/python/lib/python3.6/asyncio/base_events.py", line 750, in create_connection
infos = f1.result()
File "/app/.heroku/python/lib/python3.6/concurrent/futures/thread.py", line 56, in run
result = self.fn(*self.args, **self.kwargs)
File "./backend/amy/asgi.py", line 69, in mygetaddrinfo
for res in socket._socket.getaddrinfo(host, port, family, type, proto, flags):
UnicodeError: encoding with 'idna' codec failed (UnicodeError: label empty or too long)
I've been able to work around it by setting up my RedisChannelLayer using a dict of arguments to create_connection (as mentioned here), instead of providing a really long host name.
By manually parsing the password out of the REDIS_URL variable that Heroku provides and rebuilding a host URI without it, I can pass that password as a separate field in the create_connection dict, keeping the host string's labels below the 64-character limit.
I do this in my settings.py file which now looks like:
from urllib.parse import urlparse  # needed by parse_redis_url below

def parse_redis_url(url):
    """ parses a redis url into component parts, stripping password from the host.

    Long keys in the url result in parsing errors, since labels within a hostname
    cannot exceed 64 characters under idna rules.
    In that event, we remove the key/password so that it can be passed separately
    to the RedisChannelLayer.

    Heroku REDIS_URL does not include the DB number, so we allow for a default value of '0'
    """
    parsed = urlparse(url)
    parts = parsed.netloc.split(':')
    host = ':'.join(parts[0:-1])
    port = parts[-1]
    path = parsed.path.split('/')[1:]
    db = int(path[0]) if len(path) >= 1 else 0
    user, password = (None, None)
    if '#' in host:
        creds, host = host.split('#')
        user, password = creds.split(':')
        host = f'{user}#{host}'
    return host, port, user, password, db
REDIS_URL = env('REDIS_URL', default='redis://localhost:6379')
REDIS_HOST, REDIS_PORT, REDIS_USER, REDIS_PASSWORD, REDIS_DB = parse_redis_url(REDIS_URL)
# DJANGO CHANNELS
CHANNEL_LAYERS = {
    'default': {
        'BACKEND': 'channels_redis.core.RedisChannelLayer',
        'CONFIG': {
            "hosts": [{
                'address': f'redis://{REDIS_HOST}:{REDIS_PORT}',
                'db': REDIS_DB,
                'password': REDIS_PASSWORD,
            }],
        },
    },
}
I'm trying to generate a hybrid .apk with PhoneGap, and I'm using PHP on the server side to generate the Paytm checksum.
On the server-side PHP page:
// Create an array having all required parameters for creating checksum.
$paramList["ENVIRONMENT"] = "staging";
$paramList["MID"] = PAYTM_MERCHANT_MID;
$paramList["ORDER_ID"] = $ORDER_ID;
$paramList["CUST_ID"] = $CUST_ID;
$paramList["INDUSTRY_TYPE_ID"] = $INDUSTRY_TYPE_ID;
$paramList["CHANNEL_ID"] = $CHANNEL_ID;
$paramList["TXN_AMOUNT"] = $TXN_AMOUNT;
$paramList["WEBSITE"] = "APPSTAGING";
// The checksum string is returned by the getChecksumFromArray() function.
$checkSum = getChecksumFromArray($paramList,PAYTM_MERCHANT_KEY);
After getting the checksum value, checkSum is used for the transaction:
var options = {
    ENVIRONMENT: "staging",
    MID: "XXXXXXXXXXXXX",
    ORDER_ID: "ORDER0000000001",
    CUST_ID: "10000988111",
    INDUSTRY_TYPE_ID: "Retail",
    CHANNEL_ID: "WAP",
    TXN_AMOUNT: "1.50",
    WEBSITE: "APPSTAGING",
    CALLBACK_URL: "https://securegw-stage.paytm.in/theia/paytmCallback?ORDER_ID=ORDER0000000001",
    CHECKSUMHASH: checkSum // php code generated checksum
};
paytm.startPayment(options, app.successCallPayTm, app.failureCallPayTm);
When I build and run the app, it always shows the error "Error Code: 330; PayTm checksum mismatch".
Can anyone please guide me?
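For what it's worth, a checksum mismatch generally means the gateway recomputed the hash over the parameters it actually received and got a different value than CHECKSUMHASH, which happens when the values hashed by the PHP page drift apart from the values posted by the app (a different ORDER_ID, WEBSITE, and so on). One way to keep the two sides in sync is to have the server return both the checksum and the exact parameter set it hashed, and build the options object from that response. This is only a sketch; the /generateChecksum.php endpoint and its response shape are hypothetical:
// Hypothetical endpoint, assumed to respond with
// { params: { ...the exact fields passed to getChecksumFromArray()... }, checksum: "..." }
fetch('https://example.com/generateChecksum.php', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ORDER_ID: 'ORDER0000000001', CUST_ID: '10000988111', TXN_AMOUNT: '1.50' })
})
    .then(function (res) { return res.json(); })
    .then(function (data) {
        // Reuse the server's parameters verbatim so the posted payload matches
        // what was hashed, then attach the checksum.
        var options = Object.assign({}, data.params, { CHECKSUMHASH: data.checksum });
        paytm.startPayment(options, app.successCallPayTm, app.failureCallPayTm);
    });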
I was trying to stream data into BigQuery using tabledata.insert_all:
job_data = {
  kind: 'bigquery#tableDataInsertAllRequest',
  rows: [
    { json: { column_name: value } }
  ]
}

response = execute(
  api_method: bigquery.tabledata.insert_all,
  parameters: {
    projectId: config['project_id'],
    datasetId: DATASET_ID,
    tableId: table_id
  },
  body_object: job_data
)
But I always get the following error message:
Google::APIClient::Request Sending API request post https://www.googleapis.com/bigquery/v2/projects/propane-tribute-90023/datasets/development/tables/api_requests_20150414/insertAll {"User-Agent"=>"My Test App/1.0 google-api-ruby-client/0.8.5 Mac OS X/10.9.5\n (gzip)", "Content-Type"=>"application/json", "Accept-Encoding"=>"gzip", "Authorization"=>"Bearer ya29.VgFYvU2nxGDhWiCdS47XRw0J-7GLenRry0Cd3AA2D1RDzMh5gnf-m85I5GeSr9oNW51OuUb9mdwObg", "Cache-Control"=>"no-store"}
Decompressing gzip encoded response (155 bytes)
Decompressed (261 bytes)
Google::APIClient::Request Result: 400 {"Vary"=>"X-Origin", "Content-Type"=>"application/json; charset=UTF-8", "Date"=>"Wed, 15 Apr 2015 03:14:17 GMT", "Expires"=>"Wed, 15 Apr 2015 03:14:17 GMT", "Cache-Control"=>"private, max-age=0", "X-Content-Type-Options"=>"nosniff", "X-Frame-Options"=>"SAMEORIGIN", "X-XSS-Protection"=>"1; mode=block", "Server"=>"GSE", "Alternate-Protocol"=>"443:quic,p=0.5", "Transfer-Encoding"=>"chunked"} => {"error"=>{"errors"=>[{"domain"=>"global", "reason"=>"invalid", "message"=>"No rows present in the request.", "locationType"=>"other", "location"=>"rows"}], "code"=>400, "message"=>"No rows present in the request."}}
Does anyone have the same issue and know how to fix it?
Thanks.
Make sure you are providing appropriate values for all of the column headers present in your table’s schema. Providing separate “json” entries will populate individual rows with the column data you provide. Unless you have already assigned values to the variables named column_name and value, you need to provide those values in the statement following the json declaration.
A sample Ruby syntax for a tabledata.insert_all “rows” operation would look as follows:
body = {
  "rows" => [
    { "json" => { "person_id" => 10, "person_name" => "test" } },
    { "json" => { "person_id" => 11, "person_name" => "test2" } }
  ]
}
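Whatever hash is passed as body_object has to serialize into the request body that the tabledata.insertAll REST API expects; if the rows do not end up under a top-level "rows" key, the API responds with "No rows present in the request". For illustration only, the wire payload has this shape (written here as a JavaScript object literal rather than Ruby):
// Request body for tabledata.insertAll; each "json" entry becomes one row.
var body = {
    kind: 'bigquery#tableDataInsertAllRequest',
    rows: [
        { json: { person_id: 10, person_name: 'test' } },
        { json: { person_id: 11, person_name: 'test2' } }
    ]
};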
I am new to D3.js. I love it but I am having real trouble figuring out the best approach to structuring data.
Ideally I would like to create a simple multi-line graph that shows hover points over the selected points. I have the multiple lines created, but trying to add the points has stumped me, and I think it has to do with the structure of my data.
Here is my working fiddle. I'm not sure if I should be trying to use d3.nest to rearrange the data.
I have a JSON object that I am retrieving from a Google form, which is all nice and smooth. This is what it looks like:
var data = [{
    "description": "Global warming is a serious and pressing problem. We should begin taking steps now even if this involves significant costs",
    "year2013": 40,
    "year2012": 36,
    "year2011": 41,
    "year2010": 46,
    "year2009": 48,
    "year2008": 60,
    "year2006": 68,
}, {
    "description": "The problem of global warming should be addressed, but its effects will be gradual, so we can deal with the problem gradually by taking steps that are low in cost",
    "year2013": 44,
    "year2012": 45,
    "year2011": 40,
    "year2010": 40,
    "year2009": 39,
    "year2008": 32,
    "year2006": 24,
}, {
    "description": "Until we are sure that global warming is really a problem, we should not take any steps that would have economic costs",
    "year2013": 16,
    "year2012": 18,
    "year2011": 19,
    "year2010": 13,
    "year2009": 13,
    "year2008": 8,
    "year2006": 7,
}, {
    "description": "Don't know / refused",
    "year2013": 1,
    "year2012": 1,
    "year2011": 1,
    "year2010": 1,
    "year2009": 1,
    "year2008": 0,
    "year2006": 1,
}]
Any help would be appreciated, I have been at it for days.
Cheers!
First, I would flatten your data:
data = [
{date:"2011",type: "line0", amount:20}
...
]
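Since the original data is wide (one property per year), the flattening step could look something like the following. The answer's snippets are CoffeeScript, but this is a plain JavaScript sketch that maps each yearNNNN property onto a {type, date, amount} row, with names chosen to match the nest key used below:
// Turn each {description, year2013: ..., year2012: ...} object into one row per year.
var flat = [];
data.forEach(function (d) {
    Object.keys(d).forEach(function (k) {
        if (k.indexOf('year') === 0) {
            flat.push({
                type: d.description,           // becomes the nest key
                date: k.replace('year', ''),   // e.g. "2013"
                amount: d[k]
            });
        }
    });
});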
Then nest your data by type
nested = d3.nest()
  .key( (d) -> return d.type )
  .entries(data)
Then append your line groups
# Line Groups
groups = container.selectAll('g.full-line')
  .data(nested, (d) -> return d.key )

# ENTER
# Keep the shared 'full-line' class so the selection above still matches on update.
groups.enter().append('svg:g')
  .attr( 'class', (d,i) -> "full-line full-line#{i}" )

# EXIT
d3.transition(groups.exit()).remove()

# TRANSITION
d3.transition(groups)
Then append your chart lines
# Individual Lines
lines = groups.selectAll('.line').data (d) -> [d.values]

# ENTER
lines.enter().append("svg:path")
  .attr("class", "line")
  .attr("d", d3.svg.line()
    .interpolate(interpolate)
    .defined(defined)
    .x( (d,i) -> return xScale(d,i) )
    .y( (d,i) -> return yScale(d,i) ) )

# EXIT
d3.transition( groups.exit().selectAll('.line') )
  .attr("d",
    d3.svg.line()
      .interpolate(interpolate)
      .defined(defined)
      .x( (d,i) -> return xScale(d,i) )
      .y( (d,i) -> return yScale(d,i) ) )

# TRANSITION
d3.transition(lines)
  .attr("d",
    d3.svg.line()
      .interpolate(interpolate)
      .defined(defined)
      .x( (d,i) -> return xScale(d,i) )
      .y( (d,i) -> return yScale(d,i) ) )
Thanks
I ended up using something similar.
/* Transform Data */
data = data.map(function (d) {
    return {
        country: d.country,
        date: new Date(d.year.toString()),
        value: d.value
    };
});
/* Nest Data */
data = d3.nest().key(function (d) {
    return d.country;
}).entries(data);
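For the hover points the same nested structure works: bind the nested series to groups, then bind each series' values to circles. A sketch in D3 v3 syntax (matching the d3.nest and d3.svg.line used above); container, xScale and yScale are assumed to be the selection and scales already used for the lines:
// One <g> per series, one circle per data point.
var series = container.selectAll('g.series')
    .data(data, function (d) { return d.key; });

series.enter().append('g')
    .attr('class', 'series');

var points = series.selectAll('circle.point')
    .data(function (d) { return d.values; });

points.enter().append('circle')
    .attr('class', 'point')
    .attr('r', 3)
    .attr('cx', function (d) { return xScale(d.date); })
    .attr('cy', function (d) { return yScale(d.value); });

points.exit().remove();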