How to get height data given during registration from the Google Fit REST APIs

I am new to the Google Fit REST APIs, and I have been trying to calculate the BMI of a user for my project.
When the user authorizes my application, I fetch data for the last 30 days (historical data for some calculations). From then on I extract data periodically, so I always have the latest weight data.
But the issue arises when the user has been using Google Fit long before authorizing my app, because then I don't know the exact date to pass as a parameter to extract the height data.
The aggregate endpoint accepts a date range of at most 90 days; otherwise it returns the error "aggregate duration too large". But the user's height data could be 2-3 years old.
I also looked at the option of custom data types, but there again I don't know the exact date range for the API call.
Is there any way I can either get the user's joining date (the date he/she registered with Google Fit), since a user has to provide a height during registration, or somehow get the data under "About you" (Gender, DoB, Weight, Height) in the Profile tab?
Or could I get a calculated BMI data point directly from the API?
Any help would be highly appreciated.
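For reference, here is a rough sketch (Python, using the requests library) of one way to work within the 90-day limit described above: walk backwards from today in 90-day windows and aggregate com.google.height in each window until a datapoint turns up. The access token, the 3-year look-back cap and the aggregate request shape are illustrative assumptions, not a confirmed solution.

import time
import requests

TOKEN = "ya29...."                        # assumed: OAuth 2.0 token with the fitness.body.read scope
URL = "https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate"
WINDOW_MS = 90 * 24 * 60 * 60 * 1000      # 90 days in milliseconds, the aggregate limit mentioned above

end_ms = int(time.time() * 1000)
height_m = None
for _ in range(12):                       # look back up to ~3 years (12 windows of 90 days)
    start_ms = end_ms - WINDOW_MS
    body = {
        "aggregateBy": [{"dataTypeName": "com.google.height"}],
        "bucketByTime": {"durationMillis": WINDOW_MS},   # one bucket per 90-day window
        "startTimeMillis": start_ms,
        "endTimeMillis": end_ms,
    }
    resp = requests.post(URL, json=body,
                         headers={"Authorization": f"Bearer {TOKEN}"}).json()
    points = [p for b in resp.get("bucket", [])
                for d in b.get("dataset", []) for p in d.get("point", [])]
    if points:
        height_m = points[-1]["value"][0]["fpVal"]       # height in metres (aggregates return avg/max/min fpVals)
        break
    end_ms = start_ms                     # step back another 90 days

print(height_m)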

Related

Simple way to call a REST API from an Excel cell, hopefully without VBA

I have a column of hundreds of parcel tracking numbers, and I want to use the parcel carrier's REST API to pull some of the tracking information (promised delivery date and actual delivery date) and then use conventional spreadsheet techniques on it.
Is there a simple way to put the REST call in a formula in an Excel cell? In pseudocode it would be: call this REST URL, using the tracking number in the cell to the left as an argument, here's the authentication user/password, and return the value in the response field 'deliveryDate'.
I am intrigued by Power Query, and I figured out how to use it for a static REST URL with the tracking number manually filled in, but I don't know how to make PQ do it for hundreds of items.
Or maybe there is an online tool for building this function with Lego blocks, for a caveman like myself?
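For what it's worth, a minimal sketch of that pseudocode outside Excel (Python with the requests library); the carrier URL, the credentials and the deliveryDate field are placeholders standing in for whatever the real carrier API exposes.

import requests

tracking_numbers = ["1Z999AA10123456784", "1Z999AA10123456785"]   # e.g. exported from the spreadsheet column
USER, PASSWORD = "me@example.com", "secret"                        # assumed basic-auth credentials

for number in tracking_numbers:
    # hypothetical endpoint: one GET per tracking number, authenticated with user/password
    resp = requests.get(f"https://api.example-carrier.com/track/{number}",
                        auth=(USER, PASSWORD))
    print(number, resp.json().get("deliveryDate"))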

What time does the Graph API's Insights endpoint refresh the attributes with day as the aggregation period?

For example,
https://developers.facebook.com/docs/graph-api/reference/v2.11/insights
For the aggregation period 'day', does anyone know at what time Facebook refreshes that value?
I remember reading it to be around 8am, but I can't remember if it was accurate or where I read it.
When you check out the end_time you get inside the values (Graph API Explorer request for me?fields=insights.metric(page_stories); supply your own page access token), for all three periods (day/week/days_28) it is of the form
2018-02-19T08:00:00+0000
Same time portion in each case.
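A small sketch (Python, requests) of the check described above, assuming you have a valid page access token; it just prints the end_time of every returned value so you can see the refresh boundary yourself.

import requests

PAGE_TOKEN = "EAAG...."                   # assumed: a valid page access token
resp = requests.get(
    "https://graph.facebook.com/v2.11/me",
    params={"fields": "insights.metric(page_stories)", "access_token": PAGE_TOKEN},
).json()

for metric in resp["insights"]["data"]:
    for value in metric["values"]:
        print(metric["period"], value["end_time"])   # e.g. day 2018-02-19T08:00:00+0000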

How to efficiently check database object based on location/proximity to user's location?

I am constructing an app (in Xcode) which, in a general sense, displays information to users. The information is stored as individual objects in a database (a Parse server hosted by Heroku). The user can elect to "see" information that was created within a set distance of their current location. (The information, when saved to the DB, is saved along with its lat and long, based on the location of the user who initiated the save.) I know I can filter the pieces of information by comparing their lat and long to the viewing user's current lat and long and only display those which are close enough. Roughly/generally:
let currentUserLat = latitude          // latitude of the user's current location
var infoSet = [Objects]()              // set of all info objects pulled from the DB
for info in infoSet {
    // 3 degrees is an arbitrary threshold; abs() so objects north or south of the user both match
    if abs(info.lat - currentUserLat) < 3 {
        // display the info
    } else {
        // don't display
    }
}
This is set up decently enough, and it works fine. The reason it works fine, though, is the small number of entries in the DB at the current time (the app is in development). Under practical usage (i.e. many users) the DB may be full of information objects (let's say a thousand). In my opinion, to individually pull the latitude of each and every DB entry and compare it to the current user's latitude would take too long. I know there must be a way to do it in a timely manner (think Tinder... they only display profiles of people in the near vicinity, and it doesn't take long despite millions of profiles), but I do not know what is most efficient. I thought of creating separate sections for different geographical regions in the DB and then only searching the particular section matching the user's current location, but this seems unsophisticated and would still lead to large amounts of info being pulled. What is the best way to do this?
Per Deploying a Parse Server to Heroku, you can install a MongoDB add-on (or another of the data stores in the add-on category), which gives you Geospatial Indexes and Queries that are specifically intended for this sort of application.
Is there a reason you need to do that sort of checking on the client side? I would suggest sending your coordinates to your server and having the server query your database with those coordinates to figure out which items to pull. Then the server can return to the client side whichever items were "close" to that user.
EDIT: reworded
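To make the server-side suggestion concrete, here is a hedged sketch against the Parse REST API, which can use MongoDB's geospatial index via a $nearSphere constraint; the class name "Info", the GeoPoint field name "location", the server URL and the keys are assumptions for illustration only.

import json
import requests

# assumed: the viewing user's current coordinates, sent up from the client
where = {
    "location": {
        "$nearSphere": {"__type": "GeoPoint", "latitude": 40.0, "longitude": -73.9},
        "$maxDistanceInMiles": 10,        # only objects within ~10 miles of the user
    }
}
resp = requests.get(
    "https://your-app.herokuapp.com/parse/classes/Info",          # placeholder Parse server URL and class
    params={"where": json.dumps(where)},
    headers={"X-Parse-Application-Id": "APP_ID",                   # placeholder credentials
             "X-Parse-REST-API-Key": "REST_KEY"},
)
print(resp.json()["results"])             # only the nearby objects come back to the client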

Facebook Graph API: pull data from a time frame

I am new to this stuff. I am trying to use the Facebook Graph API to get the data for the last 30 days and link it to Klipfolio.
I am able to pull all the data, but no matter what I try I cannot seem to get FB to return only the last 30 days or 7 days instead of everything.
Am I able to pass time frame parameters?
Any help would be appreciated. This is what I have to pull all the data:
/adaccounts?fields=amount_spent
Thanks
I don't use Facebook ads, so I can't exactly test with data, but the Facebook Graph API supports time-based pagination of the data. Read the time-based pagination section on the Graph API v2.2 docs page.
It provides until and since parameters.
A time-paginated edge supports the following parameters:
until: A Unix timestamp or strtotime data value that points to the end of the range of time-based data.
since: A Unix timestamp or strtotime data value that points to the start of the range of time-based data.
I checked it with endpoints like
me/feed?until=10/11/2011
and it works as expected, so test the parameter for your required data.
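For example, a quick sketch (Python, requests) of the since/until parameters on the same me/feed edge tested above; whether the ad-account spend data honours them the same way would still need to be verified, and the access token is assumed.

import requests

TOKEN = "EAAG...."                        # assumed: a valid user access token
resp = requests.get(
    "https://graph.facebook.com/v2.2/me/feed",
    params={"since": "2018-01-01",        # strtotime-style start of the range (a Unix timestamp also works)
            "until": "2018-01-31",        # end of the range
            "access_token": TOKEN},
)
for post in resp.json().get("data", []):
    print(post.get("created_time"), post.get("id"))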

Can response data from the Core Reporting API be grouped?

Explanation:
I am able to query the Google Core Reporting API v3 using the client library to get data on pageviews for specific URLs of a website I am working on. I want to get data (pageviews) for each day within a specified range. So far I am simply looping through the range, sending an individual request to the API for each day; in each request I set the same value for the start date and the end date.
Problem:
Obviously this gets the job done, BUT it is certainly not the best way to go about it, because, assuming I want data for the past 3 months for each of about 2000 URIs, I would need 360000 requests, and that value is well over the quota limit defined by Google.
Potential solution: One way I thought of solving this is to send a request with the start-date and end-date a week apart, but then the API returns a sum of the values rather than the individual daily values.
Main question: So is there a way to insist that these values not be added up and returned as a sum, but rather returned separately for each day (as an associative array or something like that)?
I hope the question is clear and that there is a solution! Thank you!
Very straightforward:
Metric: ga:pageviews, Dimension: ga:date. Set a filter for your pagePath, and set a start-date and end-date.
Example:
https://www.googleapis.com/analytics/v3/data/ga?ids=ga%3Axxyyzz&dimensions=ga%3Adate&metrics=ga%3Apageviews&filters=ga%3Apagepath%3D%3D%2Ffaq.html&start-date=2013-06-27&end-date=2013-07-11&max-results=50
This will return the pageviews for the faq.html page for each day in the time frame.
You should check out the Query Explorer. It's a great tool for finding out how to structure queries.
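For completeness, a minimal sketch of that same query made directly against the v3 Core Reporting endpoint (Python, requests); the view ID and OAuth token are placeholders. The point is that adding ga:date as a dimension makes the API return one row per day instead of a single summed value.

import requests

TOKEN = "ya29...."                        # assumed: OAuth token with the analytics.readonly scope
params = {
    "ids": "ga:XXYYZZ",                   # placeholder view (profile) ID
    "dimensions": "ga:date",              # one row per day
    "metrics": "ga:pageviews",
    "filters": "ga:pagePath==/faq.html",
    "start-date": "2013-06-27",
    "end-date": "2013-07-11",
    "max-results": 50,
}
resp = requests.get("https://www.googleapis.com/analytics/v3/data/ga",
                    params=params,
                    headers={"Authorization": f"Bearer {TOKEN}"}).json()
for date, pageviews in resp.get("rows", []):
    print(date, pageviews)                # e.g. 20130627 42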