Return height from Google Fit REST API

Trying to get the latest height a user has captured on Google Fit, for a web app. Using the REST API (https://developers.google.com/fit/rest/) I have the code below.
const scopes = [
  'https://www.googleapis.com/auth/plus.me',
  'https://www.googleapis.com/auth/fitness.body.read'
];

const fitness = google.fitness('v1');
const gfHeight = await fitness.users.dataSources.get({
  userId: 'me',
  dataSourceId: '',
  datasetId: '',
});
console.log(gfHeight.data);
Returns the log below, which is just details of the data source:
...
{
  dataStreamId: 'raw:com.google.height:com.google.android.apps.fitness:user_input',
  dataStreamName: 'user_input',
  type: 'raw',
  dataType: { name: 'com.google.height', field: [ [Object] ] },
  application: { packageName: 'com.google.android.apps.fitness' },
  dataQualityStandard: []
}]}
Adding datasets to the data source returns a 404, so I'm not sure how to structure the request to get an object containing the height:
const gfHeight = await fitness.users.dataSources.datasets.get...

Try the approach below, which uses the Fit API data types as reference.
get body height
Endpoint:
https://www.googleapis.com/fitness/v1/users/me/dataSources/derived:com.google.height:com.google.android.gms:merge_height/datasets/-
Alternative: /users/me/dataSources/raw:com.google.height:com.google.android.apps.fitness:user_input/datasets/-
Reference: https://developers.google.com/fit/rest/v1/data-types
description
This description relates to the primary endpoint above, which returns merged data points. It returns all of the body height data points that were synced to the Google Fit platform from devices connected to Google Fit. Body height values are floating point numbers with a unit of meters. Each data point has a start time (startTimeNanos) and an end time (endTimeNanos); although they are likely the same, we will need to check that before creating the data point. The nanos values are unix epoch nanoseconds, aligned to UTC.
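Once the dataset call succeeds, the response still has to be picked apart. A sketch of that post-processing, assuming the field names from the REST docs (point, endTimeNanos, value[0].fpVal for com.google.height); the helpers makeDatasetId and latestHeight are my own names, not part of the API:

```javascript
// Dataset ids are "<startNanos>-<endNanos>". Use BigInt because epoch
// nanoseconds overflow Number's 53-bit integer range.
function makeDatasetId(startMs, endMs) {
  return `${BigInt(startMs) * 1000000n}-${BigInt(endMs) * 1000000n}`;
}

// Pick the most recent height (meters) out of a Fit dataset response.
function latestHeight(dataset) {
  const points = (dataset.point || []).slice().sort(
    (a, b) => Number(BigInt(a.endTimeNanos) - BigInt(b.endTimeNanos))
  );
  if (points.length === 0) return null;
  const last = points[points.length - 1];
  return {
    meters: last.value[0].fpVal,
    // nanos are unix epoch, UTC: divide by 1e6 for milliseconds
    capturedAt: new Date(Number(BigInt(last.endTimeNanos) / 1000000n))
  };
}
```

With the Node client, the request would then look roughly like fitness.users.dataSources.datasets.get({ userId: 'me', dataSourceId: 'derived:com.google.height:com.google.android.gms:merge_height', datasetId: makeDatasetId(0, Date.now()) }), and latestHeight(res.data) picks out the newest point.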

Related

Algolia search for array that contains value

I am using Algolia search, and right now I use this to find a specific item by id:
algolia.getObject(id)
However, I need to search by barcode rather than by ID, and I need a pointer in the right direction here.
The barcodes field is an array that can contain one or more barcode numbers.
You can trigger a search with a filter on the barcodes attribute. The filters parameter supports multiple formats, numeric values included. It does not matter whether the attribute holds a single value or multiple values (an array). Here is an example with the JavaScript client:
const algoliasearch = require('algoliasearch');
const client = algoliasearch('YOUR_APP_ID', 'YOUR_API_KEY');
const index = client.initIndex('YOUR_INDEX_NAME');
index
  .search({
    filters: 'barcodes = YOUR_BARCODE_VALUE',
  })
  .then(response => {
    console.log(response.hits);
  });
The above example assumes that your records have a structure like this one:
{
  "barcodes": [10, 20, 30]
}
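The example above uses numeric barcodes. If your barcodes are strings, the filters syntax uses the quoted facet form instead of the numeric comparison; a small helper illustrating the difference (my own sketch, not part of the Algolia client; double-check quoting edge cases against the filters documentation):

```javascript
// Build a filters expression for a single barcode: numbers use the
// numeric comparison form, strings the quoted facet form.
function barcodeFilter(barcode) {
  return typeof barcode === 'number'
    ? `barcodes = ${barcode}`
    : `barcodes:"${barcode}"`;
}

console.log(barcodeFilter(20));       // barcodes = 20
console.log(barcodeFilter('A-10-X')); // barcodes:"A-10-X"
```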

Performance issues about fetching results via calling firebase cloud functions for autocomplete in Swift

In my Elasticsearch server, I have an index called tags, like this:
tags
  - tag_id_1
    - name: "Stack Overflow"
    - popularity: 100
  - tag_id_2
    - name: "Stack Exchange"
    - popularity: 80
  - tag_id_3
    - name: "Something else"
    - popularity: 20
Where each tag has a name and a popularity. When a user searches with the query "Stack", for example, an Elasticsearch query is performed in the backend, and all tags containing the word "Stack" are returned, sorted by popularity in decreasing order, then displayed in a UITableView.
However, I learned from here that I shouldn't open my Elasticsearch cluster to everybody; instead, I should create my own REST API that fetches results from my cluster and gives them back. So I did. The picture below shows exactly what all this is doing:
However, Firebase Cloud Functions seem to be very slow. I want to show the results almost immediately after the search text changes, but usually they come back after 3-4 seconds. A good example is Google Maps: when I search for an address, the autocomplete suggestions are returned and displayed very fast, as if all the addresses in the world were stored on my phone.
How do I make this happen? Should I not use Cloud Functions for this feature?
---------------------------UPDATE 1--------------------------------
Look at the GIF below, I'm fetching the search results by calling Firebase Cloud Function, notice that after I entered "S", then several seconds later, the results come in:
The code for the Cloud Function is:
const functions = require('firebase-functions');
// "request" is a node module called "request-promise"
const request = require('request-promise');

exports.searchTags = functions.https.onRequest((req, res) => {
  const { query } = req.query;
  const ESConfig = {
    uri: `https://<some_url>/tags/tag/_search`,
    method: 'GET',
    json: true,
    auth: ES_AUTH, // My ES authentication (credentials elided)
    body: {
      query: {
        // Use a regexp to match any tag name that contains "query"
        regexp: { name: `.*${query.toLowerCase()}.*` }
      },
      sort: [
        // Sort by popularity, descending
        { popularity: { order: 'desc' } }
      ]
    }
  };
  request(ESConfig).then((results) => {
    const tags = results.hits.hits.map((hit) => hit._source.name);
    return res.status(200).send(tags);
  }).catch((error) => res.status(400).send(error));
});
However, when I perform the same Elasticsearch query by connecting the device directly to my Elasticsearch cluster instead of calling the Cloud Function, the search results come in within half a second, which is what I want:
---------------------------UPDATE 2--------------------------------
I did what @DougStevenson suggested:
At the beginning of my app's startToSearch method, I create a timestamp: startTime = new Date()
At the beginning of my Cloud Function, I create a timestamp: requestReceivedTime = new Date()
In my Cloud Function, after the Elasticsearch query completes and before returning the result, I create a timestamp: elasticSearchCompleteTime = new Date()
In my app, when I receive the result, I create another timestamp: getResultTime = new Date()
After doing a little simple math, I discovered that:
Connecting to the Cloud Function (before it has even executed) takes about 2.5 seconds, though sometimes only 0.5 seconds; it varies: requestReceivedTime - startTime
The Elasticsearch query takes only 0.1-0.2 seconds: elasticSearchCompleteTime - requestReceivedTime
The total time taken is usually 3 seconds: getResultTime - startTime
Therefore, every other step takes very little time, except "connecting to Cloud Functions", which takes about 2.5 seconds. I don't know why this happens; I have very fast Internet.
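The subtraction above can be written down as a small helper (the variable names mirror the four timestamps in the update; the helper itself is mine):

```javascript
// Split the round trip into stages. All inputs are Date objects;
// subtracting Dates yields milliseconds.
function stageDurations(startTime, requestReceivedTime, elasticSearchCompleteTime, getResultTime) {
  return {
    connectMs: requestReceivedTime - startTime,                // reaching the Cloud Function
    searchMs: elasticSearchCompleteTime - requestReceivedTime, // the Elasticsearch query itself
    returnMs: getResultTime - elasticSearchCompleteTime,       // sending the result back
    totalMs: getResultTime - startTime
  };
}

const d = stageDurations(new Date(0), new Date(2500), new Date(2650), new Date(3000));
console.log(d); // { connectMs: 2500, searchMs: 150, returnMs: 350, totalMs: 3000 }
```

With numbers like these, the delay sits almost entirely before the function body runs, which is consistent with cold-start latency on infrequently invoked Cloud Functions rather than with slow Internet.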

Number of items limited to around 100

I've created the following Input field.
var oCityInput = new Input({ // sap/m/Input
  showSuggestion: true,
  showTableSuggestionValueHelp: true,
  suggestionItems: {
    path: "/cities",
    template: new ListItem({ // sap/ui/core/ListItem
      text: "{cname}",
      additionalText: "{provi}"
    }),
  },
});
The "cities" array contains around 8,400 records, but when I type some characters, the suggestion function seems to search only the first 100 items of the array.
I've created an example in jsbin. Searching for the first elements works, but if you type the last city, the suggestion will not come out.
In newer versions of SAP UI5 the JSONModel also supports the setSizeLimit() method:
model.setSizeLimit(iNumOfYourJsonEntries);
API description: "Set the maximum number of entries which are used for list bindings."
Be careful because it can lead to performance issues.
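The effect can be pictured without UI5 at all: a list binding only ever sees the first sizeLimit entries of the model (100 by default), so the suggestion filter never reaches later cities. A plain-JavaScript sketch under that assumption (not UI5's actual implementation):

```javascript
// The binding exposes only the first `sizeLimit` entries, so the
// suggestion filter cannot match anything beyond that window.
function suggest(cities, typed, sizeLimit) {
  return cities
    .slice(0, sizeLimit) // what the binding exposes
    .filter((c) => c.cname.toLowerCase().startsWith(typed.toLowerCase()));
}

const cities = Array.from({ length: 8400 }, (_, i) => ({ cname: 'City' + i }));
console.log(suggest(cities, 'City8399', 100).length);  // 0 - beyond the default limit
console.log(suggest(cities, 'City8399', 8400).length); // 1 - after raising the limit
```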

Store contents with rest proxy giving incorrect count

ExtJS 5.1.x, with several stores using rest proxy.
Here is an example:
Ext.define('cardioCatalogQT.store.TestResults', {
  extend: 'Ext.data.Store',
  alias: 'store.TestResults',
  config: {
    fields: [
      {name: 'attribute', type: 'string'},
      {name: 'sid', type: 'string'},
      {name: 'value_s', type: 'string'},
      {name: 'value_d', type: 'string'}
    ],
    model: 'cardioCatalogQT.model.TestResult',
    storeId: 'TestResults',
    autoLoad: true,
    pageSize: undefined,
    proxy: {
      url: 'http://127.0.0.1:5000/remote_results_get',
      type: 'rest',
      reader: {
        type: 'json',
        rootProperty: 'results'
      }
    }
  }
});
This store gets populated when certain things happen in the API. After the store is populated, I need to do some basic things, like count the number of distinct instances of an attribute, say sid, which I do as follows:
test_store = Ext.getStore('TestResults');
n = test_store.collect('sid').length;
The problem is that I have to refresh the browser to get the correct value of n; otherwise, the count is not right. I am doing a test_store.load(), and indeed, the request is sent to the server after the .load() is issued.
I am directly querying the backend database to see what data are in the table and to get a count to compare with the value given by test_store.collect('sid').length. The strange thing is that when I print the store object in the debugger, the expected records (compared with the content of the database table) are displayed under the data.items array, but the value given by test_store.collect('sid').length is not right.
This is all done sequentially in a success callback. I am wondering if there is some sort of asynchronous behavior giving me the inconsistent results between what is in the store and the count of the store's contents?
I tested this with another store that uses the rest proxy and it has the same behavior. On the other hand, using the localStorage proxy gives the correct count, consistent with the store's records/model instances.
Here is the relevant code in question. An Ajax request fires off, does its thing correctly, and hits this success callback. There really isn't very much interesting going on. The problem section is after the console.log('TEST STORE HERE');, where I get the store, print its contents, load/sync, then print the store again (which works just fine), and finally print the number of items uniquely grouped by the sid attribute (which is what is not working):
success: function(response) {
  json = Ext.decode(response.responseText);
  if (json !== null && typeof (json) !== 'undefined') {
    for (i = 0, max = json.items.length; i < max; i += 1) {
      if (print_all) {
        records.push({
          sid: json.items[i].sid,
          attribute: json.items[i].attribute,
          string: json.items[i].value_s,
          number: json.items[i].value_d
        });
      }
      else {
        records.push({
          sid: json.items[i].sid
        });
      }
    }
    // update store with data
    store.add(records);
    store.sync();
    // only add to store if adding to search grid
    if (!print_all) {
      source.add({
        key: payload.key,
        type: payload.type,
        description: payload.description,
        criteria: payload.criteria,
        atom: payload.atom,
        n: store.collect('sid').length // get length of array for unique sids
      });
      source.sync();
    }
    console.log('TEST STORE HERE');
    test_store = Ext.getStore('TestResults');
    test_store.load();
    test_store.sync();
    console.log(test_store);
    console.log(test_store.collect('sid').length);
  }
  // update grid store content
  Ext.StoreMgr.get('Payload').load();
  Ext.ComponentQuery.query('#searchGrid')[0].getStore().load();
}
For completeness, here is the data.items array output: items: Array[2886], which matches the count of unique items grouped by the attribute sid. And finally, the output of console.log(test_store.collect('sid').length), which gives the value from the PREVIOUS run: 3114...
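The asynchronous behavior suspected above is a plausible culprit: in ExtJS, store.load() returns before the server responds, so the collect('sid') on the next line still counts the previous contents; moving the count into load's callback config would avoid that. A plain-JavaScript mock of the race (my own sketch, not ExtJS itself):

```javascript
// Mock of an asynchronous store load: load() only schedules the
// refresh, so reading the store right after load() sees stale data.
function makeStore() {
  let items = [];
  let pending = null;
  return {
    load(newItems, callback) {
      // simulate the async server round trip
      pending = () => { items = newItems; if (callback) callback(); };
    },
    flush() { if (pending) { pending(); pending = null; } }, // "response arrives"
    countUnique(field) { return new Set(items.map((r) => r[field])).size; }
  };
}

const mockStore = makeStore();
mockStore.load([{ sid: 'a' }, { sid: 'b' }, { sid: 'a' }], () => {
  console.log('in callback:', mockStore.countUnique('sid')); // 2 - the correct count
});
console.log('right after load():', mockStore.countUnique('sid')); // 0 - stale
mockStore.flush();
```

In real ExtJS code the equivalent fix is test_store.load({ callback: function () { console.log(test_store.collect('sid').length); } }).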

JQGrid Dynamic Select Data

I have utilised the example code at this link, and I have got my grid to show a dynamically constructed select dropdown on add and edit. However, when it is just showing the data in the grid, it shows the dropdown index instead of its associated data. Is there a way to get the grid to show the data associated with the index instead of the index itself?
For example, the data for my select could be "0:Hello;1:World". The dropdown on the edit/add window shows Hello and World and has the correct indexes for them. If a cell has a value of 1, I would expect it to show World in the grid itself, but it shows 1 instead.
Here is the column itself from my grid:
{ name: 'picklist', index: 'picklist', width: 80, sortable: true, editable: true,
  edittype: "select", formatter: "select", editrules: { required: true } },
I am filling the dynamic data content in the loadComplete event as follows:
$('#mygrid').setColProp('picklist', { editoptions: { value: picklistdata} });
picklistdata is a string of "0:Hello;1:World" type value pairs.
Can anyone offer any help? I am fairly new to jqGrid, so please include examples.
I know you have already solved the problem but I faced the same problem in my project and would like to offer my solution.
First, I declare a custom formatter for my select column (in this case, the 'username' column).
$.extend($.fn.fmatter, {
  selectuser: function(cellvalue, options, rowdata) {
    var userdata;
    $.ajax({
      url: 'dropdowns/json/user',
      async: false,
      dataType: 'json',
      cache: true,
      success: function(data) {
        userdata = data;
      }
    });
    return typeof cellvalue != 'undefined' ? userdata[cellvalue] : cellvalue;
  }
});
This formatter loads the mapping of id to user, and returns the username for the given cellvalue. Then I set the formatter: 'selectuser' option on the column's colModel, and it works.
Of course, this does one JSON request per row displayed in the grid. I worked around that by setting 10 seconds of caching in the headers of my JSON responses, like so:
private function set_caching($seconds_to_cache = 10) {
  $ts = gmdate("D, d M Y H:i:s", time() + $seconds_to_cache) . " GMT";
  header("Expires: $ts");
  header("Pragma: cache");
  header("Cache-Control: max-age=$seconds_to_cache");
}
I know this solution is not perfect, but it was adequate for my application. Cache hits are served by the browser instantly and the grid flows smoothly. Ultimately, I hope the built-in select formatter will be fixed to work with json data.
If you save the ids of the select elements in jqGrid and want to show the corresponding texts, you should use formatter: 'select' in the colModel (see http://www.trirand.com/jqgridwiki/doku.php?id=wiki:predefined_formatter#formatter_type_select) together with edittype: "select".
Using stype: 'select' could also be interesting for you if you plan to support data searching.
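For reference, what formatter: 'select' does with the editoptions value string can be sketched standalone (plain JavaScript, not jqGrid's actual implementation):

```javascript
// Parse a "0:Hello;1:World" editoptions value string into a map,
// then look up a cell value to get its display text.
function parseSelectOptions(value) {
  const map = {};
  value.split(';').forEach((pair) => {
    const idx = pair.indexOf(':');
    map[pair.slice(0, idx)] = pair.slice(idx + 1);
  });
  return map;
}

const options = parseSelectOptions('0:Hello;1:World');
console.log(options[1]); // World - what the grid cell should display
```

This is why setting the editoptions value before the grid renders matters: with formatter: 'select', the grid performs exactly this lookup when painting each cell.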