I am currently getting the number of likes on individual pages by calling the code below, but I would like to get the likes for multiple pages at the same time. I can't seem to find any documentation on whether the Batch API allows this.
// Fetch the Open Graph object for a single URL and read its share count
$like_result = file_get_contents('http://graph.facebook.com/?id=http://www.myadd.com/page1.php');
$like_array = json_decode($like_result, true);
$like_no = $like_array['shares'];
i.e. I need the number of likes for:
http://graph.facebook.com/?id=http://www.myadd.com/page1.php
http://graph.facebook.com/?id=http://www.myadd.com/page2.php
http://graph.facebook.com/?id=http://www.myadd.com/page3.php
The Graph API documentation explains how to use ids to select multiple objects simultaneously:
/<API VERSION>/?ids=http://www.myadd.com/page1.php,http://www.myadd.com/page2.php,http://www.myadd.com/page3.php
You could also use the Batch Requests API (which has usage examples in the documentation), or a combination of the two (making batch calls whose individual requests each ask for multiple objects).
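For instance, here is a minimal PHP sketch of the ids approach, reusing the pattern from the question. The page URLs are the placeholders from the question, and newer Graph API versions may require an access token, so treat this as a sketch rather than a drop-in solution:

// Build one request that asks for several Open Graph objects at once via ?ids=
$urls = array(
    'http://www.myadd.com/page1.php',
    'http://www.myadd.com/page2.php',
    'http://www.myadd.com/page3.php',
);
$ids = implode(',', array_map('urlencode', $urls));
$result = file_get_contents('http://graph.facebook.com/?ids=' . $ids);
$data = json_decode($result, true);

// The response is keyed by the requested URLs
foreach ($urls as $url) {
    $like_no = isset($data[$url]['shares']) ? $data[$url]['shares'] : 0;
    echo $url . ': ' . $like_no . "\n";
}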
For the Marketing API, I know that I'm able to make one call to retrieve all of the adsets from a certain account along with their insights, but am I able to specify the date_preset for the insights edge in that same call?
For example, the following gives me lifetime insights stats:
/v2.4/{accountID}/adcampaigns?fields=insights
To be clear - I know this is possible to retrieve by making separate calls for each adset id (where I know I can specify the date_preset); instead, I'd like to do this via the call where I get a long list of the ad sets plus their insights details in one go.
Yes, this is possible using query expansion; however, you probably should not do it this way anyway.
Using query expansion results in multiple requests being executed in one HTTP call, in this case one to get all the adcampaigns, and then N requests where N is the number of adcampaigns returned. This will in turn affect your rate limiting.
The most efficient way to request all insights for all adcampaigns (ad sets) is instead to request them at the account level, specifying aggregation level:
/v2.4/act_{ADACCOUNT_ID}/insights?date_preset=last_7_days&level=campaign
This requires just one request, or however many requests it takes to page through all of the results.
If you really want to achieve this with query expansion, you can do the following for example:
/v2.4/act_{ADACCOUNT_ID}/adcampaigns?fields=insights.date_preset(last_30_days).time_increment(all_days)
Note that the parameters to insights, which would normally be query parameters of the form param_name=param_value, are instead written as param_name(param_value).
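As a rough PHP sketch of what such a query-expansion call could look like (the ad account ID and access token below are placeholders you would substitute; the endpoint and fields string are the ones shown above):

// Query expansion: insights parameters written as name(value) inside the fields string
$account_id   = 'act_XXXXXXXXXX';    // placeholder ad account ID
$access_token = 'YOUR_ACCESS_TOKEN'; // placeholder token

$fields = 'insights.date_preset(last_30_days).time_increment(all_days)';
$url = 'https://graph.facebook.com/v2.4/' . $account_id
     . '/adcampaigns?fields=' . urlencode($fields)
     . '&access_token=' . $access_token;

$response = json_decode(file_get_contents($url), true);
// Each ad set in $response['data'] carries its own nested 'insights' result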
To specify the date_preset, here is the correct format. It's important to use insights as the edge to get the date_preset filtering.
/v2.10/act_{ADACCOUNT_ID}/insights?fields=impressions,clicks,ctr,unique_clicks,unique_ctr,spend,cpc&date_preset=last_3d
The above has been tested with the latest Graph API version (2.10) as of now. For more info on the date_preset values, refer to their API docs:
https://developers.facebook.com/docs/marketing-api/insights/parameters
The QuickBooks Online api supports paging and sorting results with special query parameters. Paging requires two parameters: PageNumber and ResultsPerPage. However, there doesn't seem to be any way of figuring out how many pages are available at a given number of results per page, or how many objects there are. The response only includes the current page and how many things are on it.
Is it possible to get either a total count of items for a given search? Or at least a total number of pages?
In QBO there is no direct API to get the total count or the total number of pages.
Paging is the only option for this use case.
To use paging, call the findAll method (with page number and chunk size attributes) of the corresponding entity.
ex - Ref doc for Customer (QBO):
http://developer-static.intuit.com/SDKDocs/QBV2Doc/ipp-java-devkit-2.0.10-SNAPSHOT-javadoc/
(QBOEmployeeService).
Ref example - https://developer.intuit.com/docs/0025_quickbooksapi/0055_devkits/0200_ipp_java_devkit/0800_crud_examples
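Since there is no count endpoint, the usual pattern is to keep requesting pages until a page comes back with fewer items than you asked for. Here is a rough PHP sketch of that loop; fetchQboPage is a hypothetical helper you would implement on top of your devkit's findAll call (the page number / results-per-page idea comes from the question, not from any verified endpoint):

// Hypothetical helper: returns an array of entities for the given page,
// e.g. by wrapping the devkit's findAll(pageNumber, resultsPerPage) call.
function fetchQboPage($pageNumber, $resultsPerPage) {
    // ... call the QBO devkit / REST API here ...
    return array(); // placeholder
}

$resultsPerPage = 100;
$pageNumber = 1;
$all = array();

do {
    $page = fetchQboPage($pageNumber, $resultsPerPage);
    $all = array_merge($all, $page);
    $pageNumber++;
} while (count($page) == $resultsPerPage); // a short page means we are done

// count($all) is the total that the API does not report directly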
I am performing a REST call to Facebook's search API using type=event,
e.g.
search?fields=id,name,picture,owner,description,start_time,end_time,location,venue,updated_time,ticket_uri&q=concert&type=event
I have looked through the documentation and still have a few questions about specific pagination behavior of the event search API.
If I used a broad search term like "ma" and kept querying the pagination ['next'] URL, would I cycle through all Facebook events starting with "ma"? Does the pagination array give any indication when there are no more results to return?
Do these searches include past events? If so is it possible to eliminate past events using the "since" parameter?
What is the maximum for the limit parameter?
Update:
As far as I can tell, the number of results you can get from a Facebook search is limited to 500. This includes results accessed via pagination. In other words, a query with limit >= 500 will not return a pagination URL; likewise, a query with limit 250 will only return one page's worth via pagination.
You will "next page" until the count of results comes less then the limit
I'm not sure if that is possible using a simple Graph request. Maybe using FQL.
I don't know exactly, but I used a limit of 2000 one day and it worked.
For other doubts, you can get answers by testing your requests with this tool:
https://developers.facebook.com/tools/explorer/
I am also doing the same thing as you: I am collecting public posts using the Graph search API.
When there are no results available, or you have reached the max limit, the pagination section will not be in the response. So you can always check whether paging is present in the JSON response, something like this:
NextResult = DeserJsonFBResponce.paging != null ? DeserJsonFBResponce.paging.next : string.Empty;
I am not so sure about this with events, but for public posts I am able to eliminate posts using the since and until parameters.
Maximum for the limit parameter is 2000 per get request.
I'm fetching a large amount of comments from a public page using Facebook's Graph API.
By default Facebook returns 25 comments per response and uses paging. This forces multiple requests, which is unnecessary as I know ahead of time that there will be a lot of comments.
I read about the "limit" parameter that you can pass to ask for a certain amount of items per response.
I was wondering, what is the limit of that parameter? I'm assuming I can't pass &limit=10000.
There's a different way for fetching comments:
https://graph.facebook.com/<PAGE_ID>_<POST_ID>/comments?limit=500
The maximum value for the limit parameter is 500.
Yes, with the limit parameter you can specify how many of a certain resource you want in one call. The default limit is 25.
For example, if you want 100 comments in one call for a post with id POST_ID, you can query like this:
https://graph.facebook.com/POST_ID?fields=comments.limit(100)
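For instance, a rough PHP sketch of reading the comments requested this way (POST_ID and the access token are placeholders; if more comments remain, the nested comments object carries its own paging links):

// Ask for up to 100 comments inline via field expansion
$access_token = 'YOUR_ACCESS_TOKEN'; // placeholder
$url = 'https://graph.facebook.com/POST_ID?fields=' . urlencode('comments.limit(100)')
     . '&access_token=' . $access_token;

$post = json_decode(file_get_contents($url), true);
$comments = isset($post['comments']['data']) ? $post['comments']['data'] : array();

// If there are more than 100 comments, follow $post['comments']['paging']['next']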
I think they have changed this. For /feed I only get 200-225 posts back, but for comments I get as many as 2000 back.
Old question, but this is in the current Facebook documentation in case anyone finds this question via search (emphasis mine):
Some edges may also have a maximum on the limit value for performance reasons. In all cases, the API returns the correct pagination links.
In other words, even if you specify a limit above what's allowed by the endpoint, the "pagination.previous" and "pagination.next" elements will always provide the correct URL to resume where it left off.
I would recommend using FQL instead.
FQL provides a more flexible approach where you can combine data types (posts, users, pages, etc.) as you please. You can also query for comments belonging to a list of stories instead of just one, reducing your number of requests even more.
There are a couple of drawbacks though:
1. There is a limit of 5000 comments. Here you would use a query looking something like: "SELECT id, ...... FROM comments, ... WHERE parent_id IN (1,2,3....) ORDER BY time LIMIT 0, 5000". Even if you split this up into several queries with "LIMIT 0, 1000", "LIMIT 1000, 1000", "LIMIT 2000, 1000", etc., you would never get anything over 5000 comments ("LIMIT 5000, 1000" would return empty).
2. Each real request made to Facebook's servers counts as one request, but if you send something that is actually a combination of requests, it will be counted as multiple requests.
3. Facebook does not like overly heavy requests. You can end up getting blocked for a short time period (minutes to hours, not days). If this happens, act on it.
I am looking to optimize my Facebook app.
Today I make a batch call with four graph API calls:
/me
/me/friends
/me/likes
/me/feed
If I change this to a single graph API call using field expansion like this:
/me?fields=id,name,username,friends,likes,feed
Will that now count as one hit against the API instead of four for rate limiting purposes?
Unfortunately, each call in the batch is counted as an API call; it's just faster to call them within a batch since it will be one HTTP request. See the Facebook API documentation:
Limits
We currently limit the number of requests which can be in a batch to 50, but each call within the batch is counted separately for the purposes of calculating API call limits and resource limits. For example, a batch of 10 API calls will count as 10 calls and each call within the batch contributes to CPU resource limits in the same manner.
Source:
https://developers.facebook.com/docs/reference/api/batch/
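For reference, a batch call covering those four reads could be sketched in PHP roughly as follows (the access token is a placeholder, and each entry in the batch array is still billed as its own API call per the documentation above):

// One HTTP request, but four API calls for rate-limiting purposes
$access_token = 'YOUR_ACCESS_TOKEN'; // placeholder
$batch = json_encode(array(
    array('method' => 'GET', 'relative_url' => 'me'),
    array('method' => 'GET', 'relative_url' => 'me/friends'),
    array('method' => 'GET', 'relative_url' => 'me/likes'),
    array('method' => 'GET', 'relative_url' => 'me/feed'),
));

$ch = curl_init('https://graph.facebook.com/');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'access_token' => $access_token,
    'batch'        => $batch,
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$responses = json_decode(curl_exec($ch), true); // one entry per batched call
curl_close($ch);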
Based on real-world testing, I've found that field expansion can count as multiple calls against the rate limit. For example, starting from a quiet state, a sequence of 63 field-expanded calls to a single API (graph.facebook.com/IDENTITY/posts) brought us to the 600-call rate limit.
According to the Facebook Docs,
The Field Expansion feature of the Graph API, allows you to effectively "join" multiple graph queries into a single call.
So your queries above would represent four calls in the Batch form, and one call in the Field Expanded form.
As I noted in a comment above: A batch sends multiple-but-not-necessarily-related queries to Facebook in a single request. Field expansion is like doing joins in SQL through a single query.
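By contrast, the field-expanded form of the same four reads might be fetched like this (sketch only; the access token is a placeholder, and each nested connection comes back with its own paging links):

// One request and, per the answer above, one field-expanded call
$access_token = 'YOUR_ACCESS_TOKEN'; // placeholder
$url = 'https://graph.facebook.com/me?fields=id,name,username,friends,likes,feed'
     . '&access_token=' . $access_token;

$me = json_decode(file_get_contents($url), true);
// $me['friends']['data'], $me['likes']['data'] and $me['feed']['data'] each
// carry their own 'paging' links for anything beyond the first page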