I am struggling with matching the "active time" returned by Fit REST API with reality.
As an example, on 12/14 I had two walks of about 45 minutes each. The API returns one of them as type 7 ("walking" - right!) and the other as type 0 ("in vehicle" - wrong!). However, the Fit app shows both as "walking", so it apparently uses a different data source.
I checked some other days, and on those days the session with type 0 is indeed a valid "in vehicle" session.
I tried all aggregated data sources that return com.google.activity.segment. Most of them are empty; I've found data only in merge_activity_segments and platform_activity_segments (and they seem to be identical).
Google's docs have a caveat about a delay in data sync, but they never specify how long this delay is. The data I am looking at is about 24 hours old - if their sync is that slow, then this API is more or less unusable.
I am using the following POST to https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate
{
  "aggregateBy": [
    {
      "dataSourceId": "derived:com.google.activity.segment:com.google.android.gms:merge_activity_segments"
    }
  ],
  "endTimeMillis": "1481788800000",
  "startTimeMillis": "1481702400000",
  "bucketByTime": {
    "period": {
      "timeZoneId": "America/Los_Angeles",
      "type": "day",
      "value": 1
    }
  }
}
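For completeness, here is a minimal sketch (in Python) of sending that request, assuming an OAuth 2.0 access token with a Fitness activity read scope has already been obtained; the response parsing assumes the usual bucket/dataset/point layout of aggregate responses:

import requests

# `access_token` is assumed to come from a separate OAuth 2.0 flow
# with a fitness activity read scope.
body = {
    "aggregateBy": [
        {"dataSourceId": "derived:com.google.activity.segment:"
                         "com.google.android.gms:merge_activity_segments"}
    ],
    "startTimeMillis": "1481702400000",
    "endTimeMillis": "1481788800000",
    "bucketByTime": {
        "period": {"type": "day", "value": 1, "timeZoneId": "America/Los_Angeles"}
    },
}
resp = requests.post(
    "https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate",
    headers={"Authorization": "Bearer " + access_token},
    json=body,
)
resp.raise_for_status()
for bucket in resp.json().get("bucket", []):
    for dataset in bucket.get("dataset", []):
        for point in dataset.get("point", []):
            # For aggregated activity segments, the first value should be the
            # activity type code (e.g. 7 = walking); see the reference below.
            print(point["value"][0].get("intVal"),
                  point.get("startTimeNanos"), point.get("endTimeNanos"))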
For reference - activity types: https://developers.google.com/fit/rest/v1/reference/activity-types
Has anyone been able to retrieve correct activity time from Fit's REST API? Any suggestions?
By the way, steps and calories seem to work fine - just aggregate the following data sources: derived:com.google.calories.expended:com.google.android.gms:merge_calories_expended and derived:com.google.step_count.delta:com.google.android.gms:estimated_steps
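For reference, that just means swapping the aggregateBy list in the request body shown above (aggregateBy is an array, so both data sources can even be requested in one call), e.g.:
"aggregateBy": [
  {
    "dataSourceId": "derived:com.google.step_count.delta:com.google.android.gms:estimated_steps"
  },
  {
    "dataSourceId": "derived:com.google.calories.expended:com.google.android.gms:merge_calories_expended"
  }
]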
A side note - it is probably the worst-documented API from a major company that I have seen.
We've recently started creating API endpoints. One of these endpoints is hardcoded to change 2 of our reference type codes (e.g. code "P" for mobile is being changed to "M") from their system value to a custom value, out of a configurable list that currently has approximately 12 records. I'm trying to convince them it's bad practice and a terrible idea to change this reference data because of all of the issues it can cause for systems that use the API; however, they believe it increases the "independence" of the API from the system of truth. We work in an enterprise environment and currently only our systems hit the API.
Is there any other data or information that suggests this is a bad idea (copious amounts of Google searching hasn't revealed anyone discussing this sort of issue specifically)? Or am I wrong in thinking so?
Edit:
For reference, here are some examples:
What the data would look like in the source system the API pulls from:
{
  "phone_type": "P",
  "phone_number": "1234567890",
  "user_id": "username"
}
What that same data would look like coming from our API now:
{
  "phone_type": "M",
  "phone_number": "1234567890",
  "user_id": "username"
}
What the reference data would look like coming from our reference codes endpoint:
[
  {
    "code": "P",
    "description": "Mobile Number",
    "active": "true"
  }
]
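To make the concern concrete, here is a small hypothetical sketch (the field names and the P-to-M mapping come from the examples above; the rest is purely illustrative) of what happens to a consumer that joins API records against the published reference codes once the data endpoint remaps them:

# Hypothetical illustration of the mismatch described above.
API_REMAP = {"P": "M"}  # hardcoded translation inside the endpoint

source_record = {"phone_type": "P", "phone_number": "1234567890", "user_id": "username"}
api_record = dict(source_record,
                  phone_type=API_REMAP.get(source_record["phone_type"],
                                           source_record["phone_type"]))

# ...but the reference-codes endpoint still publishes the system value:
reference_codes = [{"code": "P", "description": "Mobile Number", "active": "true"}]

# A consumer that joins records against the published reference data now fails:
lookup = {r["code"]: r["description"] for r in reference_codes}
print(lookup.get(api_record["phone_type"]))  # -> None: "M" has no published definition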
I have a little experience dealing with the Facebook Graph API. I need to get daily views for a video, as opposed to lifetime views.
The FB API docs don't show this as an option (lifetime is the only documented value for the period param):
https://developers.facebook.com/docs/graph-api/reference/video/video_insights/
However, I've seen another post on SO answering this question (Getting a video views with Facebook API), yet for some reason it doesn't work for me.
This is my successful API call which returns lifetime stats as expected:
/{video_id}/video_insights/total_video_views/lifetime
This is what I thought I should be doing:
/{video_id}/video_insights/total_video_views/day
... but got this error:
{
  "error": {
    "message": "(#100) Invalid parameter",
    "type": "OAuthException",
    "code": 100,
    "error_data": "Daily query is not supported for metric (total_video_views)"
  }
}
Then, as the SO post suggested, I tried a different period param:
/{video_id}/video_insights/total_video_views/month
... and got this:
{
  "error": {
    "message": "(#100) Invalid parameter",
    "type": "OAuthException",
    "code": 100,
    "error_data": "Period should be either lifetime or day."
  }
}
... which suggests that day should be an acceptable param, just like lifetime.
Eventually, just for fun, I thought I'd pass a "wrong" param - Day:
/{video_id}/video_insights/total_video_views/Day
... and got this:
{
  "error": {
    "message": "(#100) For field 'video_insights': period must be one of the following values: day, week, days_28, month, lifetime",
    "type": "OAuthException",
    "code": 100
  }
}
This states that all of these values are valid (day, week, days_28, month, lifetime), yet they don't work.
I'm really confused here. I've seen a daily breakdown of video views on the FB webpage insights, so I thought it should be possible to do the same through the API.
What am I doing wrong?
I encountered the same problem shortly after your post. All of my testing and research has led me to believe Facebook has removed the ability to retrieve anything other than lifetime. After reading all of the Change Log for the last few releases, I don't see any evidence of this change being documented by Facebook. This is obviously very disappointing.
The API responses clearly indicate "day" should be a valid period, which leads me to believe it was hastily removed.
Unfortunately, I do not see a reliable workaround. One may be able to approximate daily views by retrieving the lifetime total every 24 hours and calculating the difference between consecutive readings. This would only be an approximation, though, since there could be data reconciliation issues for a given time period.
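A rough sketch of that approximation, assuming the lifetime value can be read from the standard insights response shape and that yesterday's reading is persisted locally (the file-based store here is just illustrative):

import json
import requests

STATE_FILE = "lifetime_views.json"  # illustrative local store for the previous reading

def fetch_lifetime_views(video_id, access_token):
    url = "https://graph.facebook.com/{}/video_insights/total_video_views/lifetime".format(video_id)
    data = requests.get(url, params={"access_token": access_token}).json()
    # assumes the standard insights shape: data -> [ { "values": [ { "value": N } ] } ]
    return data["data"][0]["values"][0]["value"]

def approximate_daily_views(video_id, access_token):
    current = fetch_lifetime_views(video_id, access_token)
    try:
        with open(STATE_FILE) as f:
            state = json.load(f)
    except FileNotFoundError:
        state = {}
    previous = state.get(video_id, current)
    state[video_id] = current
    with open(STATE_FILE, "w") as f:
        json.dump(state, f)
    return current - previous  # views gained since the last run (~24h ago)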
It is too bad this issue didn't get any traction because it is core to the Facebook Video Insights API.
Here is my solution (after running this through FB support).
Basically:
1) I get a list of all posts:
import json
import requests

posts = []
fields = 'object_id,type'
url = "https://graph.facebook.com/v3.2/{}?access_token={}&fields={}&limit={}".format(
    resource_id, access_token, fields, limit)
while url:
    rec = requests.get(url)
    content = json.loads(rec.content)
    # collect this page of posts, then follow the pagination cursor (if any)
    posts.extend(content.get("data", []))
    url = content.get("paging", {}).get("next")
2) Then I iterate through each of them (filtering those with videos) and retrieve data for the post which contains the video, rather than the video itself (which turned out to no longer be possible with daily grouping):
insights = []
video_post_id = page_id + "_" + video_id
metric = "post_video_views"
url_insight = (f"https://graph.facebook.com/v3.2/{video_post_id}/insights"
               f"?metric={metric}&since={since}&until={until}&pretty=0&access_token={access_token}")
while url_insight:
    rec = requests.get(url_insight)
    insight = json.loads(rec.content)
    # collect this page of insight entries, then follow the pagination cursor (if any)
    insights.extend(insight.get("data", []))
    url_insight = insight.get("paging", {}).get("next")
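For what it's worth, assuming the standard insights response shape, the per-day numbers can then be pulled out of the collected entries roughly like this:

# Each insight entry should carry one value per day of the requested range.
for entry in insights:
    for v in entry.get("values", []):
        print(v.get("end_time"), v.get("value"))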
This worked for me.
I have a requirement to add a 'Conversion Window' (as above) to an existing Java application which creates batches of Facebook ads. I can't find how to set the Conversion Window via the API, or how to get a list of conversion windows from the API.
This is the most relevant information I've found:
https://developers.facebook.com/docs/marketing-api/reference/ads-action-stats
But it doesn't give me all of what I need.
Although named similarly, those are two different things.
The conversion window specified with bidding is a time period used for optimization of ad delivery. The parameter is called attribution_spec and can be set on the ad set. Valid combinations are described here.
An ad set with a conversion window of 1-day view, 7-day click would be specified like this:
{
  "name": "Adset name",
  "attribution_spec": [
    {
      "event_type": "VIEW_THROUGH",
      "window_days": 1
    },
    {
      "event_type": "CLICK_THROUGH",
      "window_days": 7
    }
  ],
  ... other adset params ...
}
The attribution window is a parameter used when loading insights. Using it, you can get the stats broken down into different attribution time periods, which can be handy for advanced analytics.
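As a rough sketch of that insights side (the endpoint, level, and action_attribution_windows values below are my reading of the Marketing API insights reference, so verify them against the API version you target):

import requests

# `ad_account_id` and `access_token` are assumed to exist already.
url = "https://graph.facebook.com/v3.2/act_{}/insights".format(ad_account_id)
params = {
    "access_token": access_token,
    "level": "adset",
    "fields": "adset_name,actions",
    # ask for actions attributed under a 1-day view and a 7-day click window
    "action_attribution_windows": "1d_view,7d_click",
}
for row in requests.get(url, params=params).json().get("data", []):
    for action in row.get("actions", []):
        # each action should carry per-window counts keyed by window name
        print(row.get("adset_name"), action.get("action_type"),
              action.get("1d_view"), action.get("7d_click"))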
I am attempting to dynamically create a group category within a course using the following service:
/d2l/api/lp/(version)/(orgUnitId)/groupcategories/ [POST]
The following is the GroupData (Group.GroupData in Create form) JSON block that I am sending to this service:
{
  "Name": "New Group Category",
  "Description": {
    "Content": "",
    "Type": "HTML"
  },
  "EnrollmentStyle": 0,
  "EnrollmentQuantity": null,
  "AutoEnroll": false,
  "RandomizeEnrollments": false,
  "NumberOfGroups": 5,
  "MaxUsersPerGroup": null
}
I am making the call under the user context of an administrative "Utility" account. I have 2 test courses, and for both of them I have confirmed that I am able to create the category through the web interface using this utility account.
My problem is that I am getting mixed results depending on the course I try to create the category in. One course returns 200 OK; the other returns 403 Forbidden.
Here are the (simplified) requests:
Call 1
/d2l/api/lp/1.4/350110/groupcategories/
Result: 403-Forbidden
Call 2
/d2l/api/lp/1.4/19988/groupcategories/
Result: 200-OK
The only difference is the OrgUnitID. Version, JSON, and user context are all the same, yet I'm getting 2 different results. I have tried several other courses and again I have success in some but not all, always receiving a 403 as the error.
After some investigation, I believe I have found 2 distinct differences between courses that are successful and those that return 403.
Courses created just before April 2012 are successful; anything created afterwards fails.
Courses with a 5-digit Org Unit ID are successful; anything with 6 digits seems to fail.
So my thoughts are that we either applied a patch in late March / early April of 2012 which somehow changed how courses are flagged on creation, OR somehow only 5-digit (or shorter?) Org IDs are being accepted by the service.
I'm hoping someone could provide some insight or verify they have no issue with 6+ digit OUIDs and group category creation.
Further reviewing the documentation on API Responses - Disposition and error handling, I realized that there are 3 possible cases for a 403 response:
Response body contains Timestamp out of range
Response body contains Invalid Token
application or calling user context does not have the permissions required for the attempted action
Given this, I took a closer look at the response and realized the issue was actually #2, "Invalid Token", not #3 as I was assuming.
Investigating my code further, it seems the user-defined SHA256 function I was using was producing an incorrect hash/signature when the data being hashed was exactly 55 characters long (yes, I realize how crazy this sounds). The temporary workaround is to pad my Org IDs with leading zeros, so my request would actually look something similar to:
/d2l/api/lp/1.4/00350110/groupcategories/
Thankfully, this seems to work and is acceptable for the immediate future. The long-term solution will be to replace my SHA256 function with something more reliable.
I am using ColdFusion MX 7 for my development, which does not have a native SHA256 hash function, hence the use of the user-defined function.
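As an aside, 55 bytes is a plausible failure point for a hand-rolled SHA-256: a 55-byte message plus the mandatory 0x80 padding byte and the 8-byte length field is exactly one 64-byte block, so 55 is the last length that still fits in a single block, and off-by-one padding bugs tend to surface right at that boundary. A quick cross-check harness against a known-good implementation (sketched in Python purely for illustration) could look like:

import hashlib

def custom_sha256_hex(data: bytes) -> str:
    # Stand-in for the user-defined SHA-256 under test;
    # replace this body with a call to the custom implementation.
    return hashlib.sha256(data).hexdigest()

# Sweep message lengths around the single-block boundary (55 bytes is the
# longest message whose padding still fits in one 64-byte block).
for length in range(50, 70):
    msg = b"a" * length
    if custom_sha256_hex(msg) != hashlib.sha256(msg).hexdigest():
        print("mismatch at length", length)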
I am trying to get real-time stock data from BSE and NSE using the Yahoo Finance web services. I was able to get some data using the following URL:
http://finance.yahoo.com/webservice/v1/symbols/COALINDIA.NS/quote?format=json
But it gives me very limited information.
{
  "list": {
    "meta": {
      "type": "resource-list",
      "start": 0,
      "count": 1
    },
    "resources": [
      {
        "resource": {
          "classname": "Quote",
          "fields": {
            "name": "COAL INDIA LTD",
            "price": "367.649994",
            "symbol": "COALINDIA.NS",
            "ts": "1418895539",
            "type": "equity",
            "utctime": "2014-12-18T09:38:59+0000",
            "volume": "2826975"
          }
        }
      }
    ]
  }
}
I need more information, like yearly high, yearly low, last traded price, etc., and I couldn't find any documentation from Yahoo detailing how to get it.
Is there documentation available related to these services? Or please suggest if there are any alternatives available.
I don't know where the definitive documentation might be, but for your particular example try appending &view=detail to your URL:
http://finance.yahoo.com/webservice/v1/symbols/COALINDIA.NS/quote?format=json&view=detail
This will at least give you the year_high and year_low that you asked after.
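A minimal sketch of pulling those two fields out of that response; the nesting mirrors the JSON shown in the question, and year_high/year_low are the field names the detail view appears to add (worth double-checking):

import requests

url = ("http://finance.yahoo.com/webservice/v1/symbols/COALINDIA.NS/quote"
       "?format=json&view=detail")
doc = requests.get(url).json()

# Same list -> resources -> resource -> fields nesting as the basic view
for res in doc["list"]["resources"]:
    fields = res["resource"]["fields"]
    print(fields.get("symbol"), fields.get("year_high"), fields.get("year_low"))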
Now, even though the following won't work for your COALINDIA.NS symbol (I suspect the exchange is not supported), it might be worth exploring the following two examples:
Example 1: As before, but for Apple and Yahoo symbols, with &view=detail appended:
http://finance.yahoo.com/webservice/v1/symbols/YHOO,AAPL/quote?format=json&view=detail
Example 2: And now using a completely different URL, resulting in much more response data. One key caveat is that this data is delayed by 15 minutes:
http://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20yahoo.finance.quotes%20where%20symbol%20IN%20(%22YHOO%22,%22AAPL%22)&format=json&env=http://datatables.org/alltables.env
If you discover the major differences between those two options and what impact they might have then please do let us all know; I'd be interested in finding out more.
If you are fine with getting NSE quotes, you can use this package for the purpose; it is extremely easy to set up:
http://nsetools.readthedocs.org/en/latest/index.html
Since it uses the NSE website/services as its data source, the quotes will not be delayed (a few seconds at most).
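A quick sketch of what usage looks like, based on my reading of the nsetools docs linked above (the exact keys in the returned quote dictionary may differ, so check the docs):

from nsetools import Nse

nse = Nse()
quote = nse.get_quote("coalindia")  # NSE symbol, lower case as in the docs' examples
print(quote.get("lastPrice"), quote.get("dayHigh"), quote.get("dayLow"))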
Beware that these data are both delayed and inconsistent. You are not getting anything even remotely close to tick or real-time data.
From example 2, refresh a few times, and inspect the "LastTradeWithTime" key-value pair. I sometimes get different quotes from different times of day, for no apparent reason. They are sometimes delayed up to three hours.
You get what you pay for; in other words, this is not a free lunch.
For those who are curious about the different options available in the Yahoo Finance URLs, I think these links might help. If it's not what you're looking for, sorry.
http://internetbandaid.com/2009/03/31/yahoo-stocks-api/
https://ilmusaham.wordpress.com/tag/stock-yahoo-data/
Note: the WordPress site contains information that was taken from a site called gummy-stuff.org, which is listed in full at the bottom of the above site (I can only list 2 URLs in this post, so I had to do it this round-about way). Oddly, I found that site on my own yesterday; funny how stuff comes back around. If you visit it now you'll just see a statement from Yahoo that the info originally listed there (some of which you can see on the WordPress site above) was never intended for public consumption and is a violation of Yahoo's terms and conditions, as it can apparently be used for hacking purposes. I was curious to see what was on the original post, so I searched for it on the Wayback Machine. BTW, the links to the spreadsheets are still active in the archive.
Cheers. Thom