Maximum number of results for the realtime API

We have a user who is getting 1000 results back from a realtime API query but is expecting more.
From the docs:
https://developers.google.com/analytics/devguides/reporting/realtime/v3/reference/data/realtime#resource
It does not appear that the realtime API response contains a facility for paging.
Is it correct to assume that this API endpoint:
https://developers.google.com/analytics/devguides/reporting/realtime/v3/reference/data/realtime/get
can return a maximum of 1000 results?

You need to remember that real-time is still beta. The real-time response doesn't include a next link, which means there is no way to page through extra data.
Try setting &max-results=10000 and see what that returns. The maximum number of rows for the Reporting API is 10000, but I have been unable to find any information on what the maximum you can set in the Real-time API is. The Real-time API doesn't return an error if I set it to 10000, but I don't have an account with that many real-time users, so I can't test it. I am going to send Google an email to see if I can get verification of what the maximum is for the Real-time API.
https://www.googleapis.com/analytics/v3/data/realtime?ids=ga:78110423&metrics=rt:activeUsers&access_token={accessToken}&max-results=10000
You might want to file an issue with the Google Analytics issue tracker requesting that they add a nextLink at the very least.
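In the meantime, here is a minimal Python sketch of the max-results experiment suggested above, using the requests library against the example URL. The view ID is the one from the example; the access token is a placeholder. Comparing totalResults against the number of rows returned is one way to tell whether the response was truncated.

import requests

def fetch_realtime(access_token, ids="ga:78110423", max_results=10000):
    resp = requests.get(
        "https://www.googleapis.com/analytics/v3/data/realtime",
        params={
            "ids": ids,
            "metrics": "rt:activeUsers",
            "access_token": access_token,
            "max-results": max_results,
        },
    )
    resp.raise_for_status()
    data = resp.json()
    rows = data.get("rows", [])
    # With no nextLink in the response, any rows beyond what comes back
    # here are simply unreachable.
    print("totalResults:", data.get("totalResults"), "rows returned:", len(rows))
    return data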

Related

Facebook Developer APIs - Trying to fetch all the campaigns, Adsets and Ads

This is what I'm trying to do -
The first GET request is to https://graph.facebook.com/<ACT_ID>/campaigns, which successfully returns all the campaigns (of course, I've handled pagination using the cursors provided).
Next, for every campaign ID, I make a GET request to https://graph.facebook.com/<CAMPAIGN_ID>/adsets to fetch the respective adsets.
Here is where things go wrong: after a few requests, I get back an error that says limit (#17) User request limit reached.
I have also tried using batch requests. But it appears as if the batch requests are in turn making individual multiple calls internally, which is again ending up in the request limit error.
Can anyone help me figure out how I can achieve this? Either a way around the limit, or perhaps a different approach.
Thanks.
Please let me know if you need clarification on my question.
[UPDATE]: I have tried looking at the metrics in the headers of a GET insights call. The CPU usage, call count, and total time are all way below the limits specified in the documentation. I have no idea why the request limit error is showing up.
I have found a way to fetch campaigns, adsets and ads in a single call.
https://graph.facebook.com/<ACT_ID>/campaigns?fields=id,name,adsets{id,name},ads{adset_id,name}
This will return an array of campaigns, with each campaign having arrays of adsets and ads that belong to it.
Now that I have the data, I can easily parse it into the format I want: each element in the array of campaigns gets an array of the adsets that belong to it, and each element in that array of adsets gets an array of the ads that belong to it.
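For illustration, here is a rough Python sketch of that single field-expansion call and the regrouping step, assuming the requests library. <ACT_ID> and the access token are placeholders, and paging through the outer campaign list via the returned cursors is elided.

import requests

ACT_ID = "<ACT_ID>"              # placeholder ad account ID
ACCESS_TOKEN = "<ACCESS_TOKEN>"  # placeholder

# One call for campaigns with nested adsets and ads, as in the URL above.
resp = requests.get(
    "https://graph.facebook.com/%s/campaigns" % ACT_ID,
    params={
        "fields": "id,name,adsets{id,name},ads{adset_id,name}",
        "access_token": ACCESS_TOKEN,
    },
)
campaigns = resp.json().get("data", [])

# Regroup: attach each ad to the adset it belongs to.
for campaign in campaigns:
    adsets = {a["id"]: dict(a, ads=[])
              for a in campaign.get("adsets", {}).get("data", [])}
    for ad in campaign.get("ads", {}).get("data", []):
        if ad.get("adset_id") in adsets:
            adsets[ad["adset_id"]]["ads"].append(ad)
    campaign["adsets"] = list(adsets.values())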

Facebook Graph API Rate limit for development

I have a Facebook application in development mode that shows as having 3 daily_active_users. From my understanding of the Graph API documentation, I can make 200 * (daily active users) total requests per hour; thus, I should be able to make 600 requests per hour.
I am then making a Test User and trying to create a page via the accounts endpoint:
https://developers.facebook.com/docs/graph-api/reference/user/accounts/#Creating
This works fine for a single request. I then tried to script it, with two-second timeouts between requests, attempting to create 100 pages. After about 10 requests, I get the following response from the Facebook API:
{"error":{"message":"We limit how often you can post, comment or do other things in a given amount of time in order to help protect the community from spam. You can try again later. Learn More","type":"OAuthException","code":368,"error_data":{"sentry_block_data":"...","help_center_id":0},"error_subcode":1390008,"error_user_msg":"","fbtrace_id":"..."}}.
It states that my request is being rejected because I've hit some kind of limit, but what is this limit? I can't find it anywhere in the documentation. Is there a limit to the number of pages I can create with a test user per hour or per day?
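For what it's worth, here is a hedged Python sketch of the scripted loop with a backoff when error code 368 comes back. The token is a placeholder, and the request fields are only loosely modeled on the user/accounts "Creating" doc linked above; this won't reveal the actual limit, it just degrades gracefully when the block hits.

import time
import requests

TEST_USER_TOKEN = "<TEST_USER_ACCESS_TOKEN>"  # placeholder

def create_page(name, attempt=1, max_attempts=5):
    # Page fields here are placeholders; see the linked doc for the
    # full set of parameters the endpoint accepts.
    resp = requests.post(
        "https://graph.facebook.com/me/accounts",
        data={"name": name, "access_token": TEST_USER_TOKEN},
    )
    body = resp.json()
    if body.get("error", {}).get("code") == 368 and attempt < max_attempts:
        # Spam-protection block (code 368): back off for progressively
        # longer before retrying.
        time.sleep(60 * attempt)
        return create_page(name, attempt + 1, max_attempts)
    return body

for i in range(100):
    print(create_page("Test Page %d" % i))
    time.sleep(2)  # the two-second spacing from the question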

How do I fetch the facebook events of a particular location

I would like to fetch all the Facebook events by passing a latitude/longitude or a place. As FQL is no longer supported, I've tried the following query:
search?q=*&type=events&center=37.76,-122.427&distance=1000
But the results I get are from Syria, India, the USA, Hungary, and so on. Why am I not getting events from only the location I've specified? Are there any legal issues in fetching public events from Facebook? And how many events can I fetch at a time?
There is no way to directly search for events in a specific area. The center and distance parameters are only available for Places, as you can read in the docs: https://developers.facebook.com/docs/graph-api/using-graph-api#search
More information:
https://www.facebook.com/help/community/question/?id=10201749749651827
How can I query public facebook events by location/city?
About "how many events can i fetch", those are the API rate limits: https://developers.facebook.com/docs/graph-api/advanced/rate-limiting
API results usually have a limit of 25 entries for one API call. But you can set it higher:
/search?type=event&q=test&limit=100
Legal issues depend on what you want to do with the data, impossible to say without knowing the details for your App.
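As an illustration of the limit parameter mentioned above, a minimal Python sketch using the requests library. The access token is a placeholder, and (as explained) center/distance will not constrain results for type=event.

import requests

resp = requests.get(
    "https://graph.facebook.com/search",
    params={
        "type": "event",
        "q": "test",
        "limit": 100,  # raise the default page size of 25
        "access_token": "<ACCESS_TOKEN>",  # placeholder
    },
)
events = resp.json().get("data", [])
print(len(events), "events returned")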

Facebook Marketing API Rate Limit

I know that Facebook has made some documentation available about the request limits for the API (https://developers.facebook.com/docs/marketing-api/api-rate-limiting),
but it is not clear how each API call is counted.
For example, if I want to get stats for ~10,000 adsets, how can I evenly space the calls?
The best answer I could find for this question is from another SO thread:
"After some testing and discussion with the Facebook platform team, there is no official limit I'm aware of or can find in the documentation. However, I've found 600 calls per 600 seconds, per token & per IP to be about where they stop you. I've also seen some application based rate limiting but don't have any numbers.
As a general rule, one call per second should not get rate limited. On the surface this seems very restrictive but remember you can batch certain calls and use the subscription API to get changes."
Source - What's the Facebook's Graph API call limit?
Official Doc-: https://developers.facebook.com/docs/graph-api/advanced/rate-limiting
Rate limits are imposed on each app. The rate limiting tool gives you information about how close your app is to being throttled. Click on any sample to get more detail on the types of utilization.
Your app can make 200 calls per hour per user in aggregate. As an example, if your app has 100 users, this means that your app can make 20,000 calls. This isn't a per-user limit, so one user could make 19,000 of those calls and another could make 1,000. This limit is calculated based on the number of calls made in the previous hour.
Source-: https://developers.facebook.com/docs/graph-api/advanced/rate-limiting
Development-tier access is heavily rate-limited per ad account; it is for development only, not for production apps running for live advertisers.
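Putting the one-call-per-second rule of thumb quoted above into practice for the ~10,000 adsets case, a hedged Python sketch; the ad set IDs and access token are placeholders, and the insights edge is called with no extra parameters for brevity.

import time
import requests

ACCESS_TOKEN = "<ACCESS_TOKEN>"               # placeholder
ADSET_IDS = ["<ADSET_ID_1>", "<ADSET_ID_2>"]  # placeholder: ~10,000 IDs

results = {}
for adset_id in ADSET_IDS:
    start = time.monotonic()
    resp = requests.get(
        "https://graph.facebook.com/%s/insights" % adset_id,
        params={"access_token": ACCESS_TOKEN},
    )
    results[adset_id] = resp.json()
    # Spend the remainder of the one-second budget sleeping, so the
    # calls stay evenly spaced at roughly one per second.
    elapsed = time.monotonic() - start
    if elapsed < 1.0:
        time.sleep(1.0 - elapsed)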

Realtime Twitter Replies?

I have created Twitter bots for many geographic locations. I want to allow users to @-reply to a bot with commands and then have the bot respond with the results. I would like the bot to reply to the user as quickly as possible (realtime).
Apparently, Twitter used to have an XMPP/Jabber interface that would provide this type of realtime feed of replies but it was shut down.
As I see it my options are to use one of the following:
REST API
This would involve polling every X minutes for each bot. The problem is that it is not realtime, and each Twitter account would have to be polled separately.
Search API
The Search API does allow specifying a "-to" parameter in the search, and replies to all bots could be aggregated in a search such as "-to bot1 OR -to bot2...". Though if you have hundreds of bots, the search string would get very long and would probably exceed the maximum length of a GET request.
Streaming API
The Streaming API looks very promising, as it provides realtime results. The API allows you to specify follow and track parameters. follow is not useful, as the bot does not know who will be sending it commands. track allows you to specify keywords to track. This could work by creating a daemon process that connects to the Streaming API and tracks all references to the bots' names. Once again, since there are lots of bots to track, the length and complexity of the query may be an issue. Another idea would be to track a special hashtag such as #botcommand; a user could then send a command using the syntax @bot1 weather #botcommand. Using the Streaming API to track all references to #botcommand would give you a realtime stream of all the commands, and further parsing could then determine which bot to send each command to (see the parsing sketch after this list). This blog post has more details on the Streaming API.
Third-party service
Are there any third-party companies that have access to the Twitter firehose and offer realtime data?
I haven't investigated these, but here are a few that I have found:
Gnip
Tweet.IM
excla.im
TwitterSpy - seems to use polling, not realtime
tweethook
I'm leaning towards using the Streaming API. Is there a better way to get near-realtime @-replies for many (hundreds of) Twitter accounts?
UPDATE: Twitter just announced that in the future they will have User Streams which expands upon the Streaming API. User Streams Preview
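To make the #botcommand idea from the Streaming API option above concrete, here is an illustrative Python parser for the "@bot1 weather #botcommand" syntax. The tag name and command grammar are, of course, just assumptions for the sketch.

import re

COMMAND_TAG = "#botcommand"  # assumed command hashtag from the idea above

def parse_command(tweet_text):
    """Return (bot_name, command) or None if this isn't a command tweet."""
    if COMMAND_TAG not in tweet_text.lower():
        return None
    match = re.match(r"@(\w+)\s+(.*)", tweet_text.strip())
    if not match:
        return None
    bot_name, rest = match.group(1), match.group(2)
    # Strip the command tag itself; what remains is the command body.
    command = re.sub(re.escape(COMMAND_TAG), "", rest, flags=re.IGNORECASE).strip()
    return bot_name, command

print(parse_command("@bot1 weather #botcommand"))  # ('bot1', 'weather')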
Either track or follow will work for the cases you describe. See http://apiwiki.twitter.com/Streaming-API-Documentation#track for details on what track actually does. The doc on follow is on the same page.
There are rate limits of sorts on the streaming API, but they have to do with how big a slice of the total tweet stream you're consuming. For a bot like this, you won't hit these limits without a pretty big user base, and when you get that user base you can apply for elevated access levels that increase the rate limits.
There's the Twitter firehose, but you're probably best off using the Streaming API. The firehose is open to Google (try googling your Twitter name), and as the link says, they're opening it up to everyone soon enough.
You'll want to get your IP whitelisted too.
If you're not already on it, you'll want to check out the Google Group for Twitter devs.
The track predicate for the streaming API would actually be useful: if you follow your bots' user IDs, you'll get all the messages made by your bots plus all the other messages that mention your bots' @usernames (including @replies). It really does track everything public on Twitter relating to the user IDs you follow with it; give it a shot.
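Here is a hedged Python sketch of connecting to the (v1.1-era) statuses/filter streaming endpoint with a track predicate built from the bots' names. The OAuth credentials and bot names are placeholders, and it requires the requests and requests-oauthlib packages.

import json
import requests
from requests_oauthlib import OAuth1

# Placeholders: your OAuth credentials and bot screen names.
auth = OAuth1("<CONSUMER_KEY>", "<CONSUMER_SECRET>",
              "<ACCESS_TOKEN>", "<ACCESS_SECRET>")
BOT_NAMES = ["bot1", "bot2"]

# track matches the names whether or not they appear with the @.
with requests.post(
    "https://stream.twitter.com/1.1/statuses/filter.json",
    data={"track": ",".join(BOT_NAMES)},
    auth=auth,
    stream=True,
) as resp:
    for line in resp.iter_lines():
        if not line:
            continue  # skip keep-alive newlines
        tweet = json.loads(line)
        print(tweet.get("user", {}).get("screen_name"), tweet.get("text"))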
REST API:
The most comprehensive results with the fewest false positives. It will include protected statuses if the bot is following the protected account. If you poll every thirty seconds, it is pretty close to realtime, and you will stay well under your rate limit (350/hour) if you are using api.twitter.com/1 with OAuth.
Search API:
You will want to avoid the Search API. It is trending more and more towards popular results and not complete results.
Streaming API:
The fastest, but also likely to miss some statuses as well as include false positives. Protected statuses, for example, are not included. track on a screen_name will return statuses containing that screen_name, but it will also include tweets that merely contain the name as a plain string without the @, so be sure to filter on your side.
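A small Python sketch of that client-side filter, keeping only tweets that actually @-mention the bot rather than merely containing the name as a plain string:

import re

def is_real_mention(tweet_text, screen_name):
    # True only when the name appears as an actual @mention.
    pattern = r"(?<!\w)@%s\b" % re.escape(screen_name)
    return re.search(pattern, tweet_text, flags=re.IGNORECASE) is not None

print(is_real_mention("@bot1 what's the weather?", "bot1"))  # True
print(is_real_mention("I saw bot1 downtown today", "bot1"))  # False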