Maximum value for hub.lease_seconds in PubSubHubbub - publish-subscribe

I have been using PubSubHubbub version 0.4 to retrieve YouTube real-time data. I was going through the PubSubHubbub documentation to learn about hub.lease_seconds.
The documentation doesn't mention any maximum limit for this parameter.
What is the maximum value that can be given for the hub.lease_seconds parameter?

It's 828000 seconds. If you give any value larger than 828000, the hub simply ignores it and sets hub.lease_seconds to 828000.
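For reference, here is a minimal sketch in Python (using the requests library) of a 0.4-style subscription request that passes hub.lease_seconds; the hub, topic, and callback values are placeholders for your own:

import requests

# Minimal PubSubHubbub 0.4 subscription request; the hub, topic and callback
# below are placeholders to be replaced with your own values.
hub = "https://pubsubhubbub.appspot.com/subscribe"
payload = {
    "hub.mode": "subscribe",
    "hub.topic": "https://www.youtube.com/xml/feeds/videos.xml?channel_id=CHANNEL_ID",
    "hub.callback": "https://example.com/push/callback",
    "hub.lease_seconds": "828000",  # anything larger gets capped back to 828000
}
resp = requests.post(hub, data=payload)
print(resp.status_code)  # the hub acknowledges and then verifies the callback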

Related

Retrieve all of a user's playlists from SoundCloud, limited to 50?

I'm trying to retrieve all the playlists from my account via
http://api.soundcloud.com/users/145295911/playlists?client_id=xxxxxx, as the API reference shows.
However, I can only retrieve the 50 most recent playlists instead of all of them. I've been looking into this, but it seems like no one else has had this issue before. Is there a way to get all of them?
Check out the pagination section of their API documentation:
Most results from our API are returned as a collection. The number of items in the collection returned is limited to 50 by default with a maximum value of 200. Most endpoints support a linked_partitioning parameter that will allow you to page through collections. When this parameter is passed, the response will contain a next_href property if there are additional results. To fetch the next page of results, simply follow that URI. If the response does not contain a next_href property, you have reached the end of the results.
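As a rough sketch in Python (requests), paging through every playlist with linked_partitioning could look like the following; the user id and client_id are placeholders, and the collection/next_href fields are the ones described in the pagination docs quoted above:

import requests

# Follow next_href until it disappears, collecting every page of playlists.
url = ("http://api.soundcloud.com/users/145295911/playlists"
       "?client_id=YOUR_CLIENT_ID&limit=200&linked_partitioning=1")
playlists = []
while url:
    page = requests.get(url).json()
    playlists.extend(page["collection"])  # each page wraps its items in "collection"
    url = page.get("next_href")           # absent on the last page
print(len(playlists))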

REST API: manipulating the URL to get X amount of results

How do I force the API to return a specified number of results?
Currently I use:
api/orders/?display=full&filter[date_add]=[2016-04-23,2018-06-23]&date=1
and I get all the results for this date filter, but I want to trim the number of results to 5, for example. Is there any method to set a maximum number of results?
Are you trying to achieve pagination? You can send the number of results you want and the number to skip as parameters in the REST API call. The trimming is generally done at the DB level, i.e. you pass the number of results required in your database fetch query.
Here is the solution: I needed to set a limit, which looks like this:
api/orders/?display=full&filter[date_add]=[2016-04-23,2018-06-23]&date=1&limit=1
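If you need more than the first handful of results, here is a hedged sketch in Python (requests) that pages through in batches of 5. The limit=offset,count syntax and the API-key-as-basic-auth-user scheme are assumptions about this particular webservice, so adjust them to match your setup:

import requests

# Fetch orders 5 at a time; limit=offset,count and the auth scheme are assumptions.
base = ("https://example.com/api/orders/?display=full"
        "&filter[date_add]=[2016-04-23,2018-06-23]&date=1")
page_size = 5
for offset in range(0, 50, page_size):  # first 50 matches, 5 per request
    url = base + "&limit={},{}".format(offset, page_size)
    resp = requests.get(url, auth=("YOUR_API_KEY", ""))
    print(offset, resp.status_code, len(resp.content))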

Can response data from the Core Reporting API be grouped?

Explanation:
I am able to query the Google Core Reporting API v3 using the client library to get pageview data for specific URLs of a website I am working on. I want to get data (pageviews) for each day within a specified range. So far I have simply been looping through the range, sending an individual request to the API for each day and setting the same value for the start date and the end date in each request.
Problem:
Obviously this gets the job done, BUT it is certainly not the best way to go about it, because if I want data for the past 3 months for each of about 2000 URIs, I would need 360000 requests, which is well over the quota limit defined by Google.
Potential solution: One way I thought of solving this is to send a request with the start-date and end-date a week apart, but then the API returns a sum of the values rather than the individual daily values.
Main question: Is there a way to insist that these values are not added up and returned as a sum, but instead returned separately (as an associative array or something like that) for each day?
I hope the question is clear and that there is a solution! Thank you!
Very straightforward:
Metric: ga:pageviews, dimension: ga:date, set a filter for your pagePath, and set a start-date and end-date.
Example:
https://www.googleapis.com/analytics/v3/data/ga?ids=ga%3Axxyyzz&dimensions=ga%3Adate&metrics=ga%3Apageviews&filters=ga%3Apagepath%3D%3D%2Ffaq.html&start-date=2013-06-27&end-date=2013-07-11&max-results=50
This will return the pageviews for the faq.html page for each day in the time frame.
You should check out the Query Explorer. It's a great tool for figuring out how to structure queries.
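For illustration, the same query through the Python client library might look like this; it assumes you already have an authorized Analytics v3 service object, and the view id and page path are placeholders:

# "service" is an already-authorized Core Reporting API v3 service object;
# the ids and filters values below are placeholders.
response = service.data().ga().get(
    ids="ga:XXYYZZ",
    start_date="2013-06-27",
    end_date="2013-07-11",
    metrics="ga:pageviews",
    dimensions="ga:date",
    filters="ga:pagePath==/faq.html",
    max_results=50,
).execute()

for date, pageviews in response.get("rows", []):
    print(date, pageviews)  # one row per day, thanks to the ga:date dimension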

Why do the offset and limit parameters behave this way?

Testing with the Facebook simulator, https://graph.facebook.com/me/feed?limit=50 returns 16 posts. Switching to https://graph.facebook.com/me/feed?limit=1&offset=9 returns no data.
If the offset is changed to 8, it does return a post. What is going on here? Who can help me?
There is enough data, but I can't use limit and offset to display all of it.
This Facebook blog post explains how the limit and offset parameters work, as well as their limitations.
Use time-based paging instead of “offset” to ensure you are getting back as many results as possible with each call. For these Graph API connections, use the “since” and “until” parameters.
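As a rough sketch in Python (requests), time-based paging just means following the paging.next URL the Graph API hands back; the access token is a placeholder:

import requests

# Walk /me/feed by following paging.next instead of an offset.
url = "https://graph.facebook.com/me/feed"
params = {"limit": 50, "access_token": "YOUR_ACCESS_TOKEN"}
posts = []
while url:
    page = requests.get(url, params=params).json()
    posts.extend(page.get("data", []))
    url = page.get("paging", {}).get("next")  # already carries the since/until cursors
    params = {}                               # so no extra parameters are needed
print(len(posts))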

Maximum number of network updates retrieved per API call

Is there any restriction on the number of entries that can be retrieved with a single call to the Network Updates API? I found this forum comment on the thread below: "The per-user limit is per call, so 300 requests with however many updates they have."
http://developer.linkedin.com/forum/increase-search-api-throttle-limit
I want to confirm that there is indeed no limit. I have received as many as 106 entries in a single call.
Thanks in advance.
The maximum number of updates returned by the Network Updates API appears to be 250. Take the following query as an example:
http://api.linkedin.com/v1/people/~/network/updates?count=500
Even if I specify the start parameter at, say, 250, I can't get the next 250 updates from the API:
http://api.linkedin.com/v1/people/~/network/updates?count=250&start=250
So it looks like 250 is the max, with no ability to page beyond that.
UPDATE:
I have verified that 250 is the maximum number returned, either in a single call or via the paging parameters. The documentation has been updated to reflect this.
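For completeness, here is a hedged sketch in Python (requests) of probing that ceiling by paging with count and start. Passing the token as oauth2_access_token and requesting format=json are assumptions about how your app talks to the v1 API, as is the values field in the response:

import requests

# Page through network updates until the API stops returning anything.
base = "http://api.linkedin.com/v1/people/~/network/updates"
params = {"count": 100, "start": 0, "format": "json",
          "oauth2_access_token": "YOUR_TOKEN"}  # auth scheme is an assumption
total = 0
while True:
    page = requests.get(base, params=params).json()
    values = page.get("values", [])
    if not values:
        break
    total += len(values)
    params["start"] += params["count"]
print(total)  # tops out at 250; requests starting at or beyond 250 come back empty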