I use the Facebook Graph API and I have encountered a problem relating to likes.
My request:
My goal is to find the like count, but the query times out. What is the solution?
Thanks.
My goal is to find the like count
So you only want the overall number of likes, the counter, but not the individual likes?
Then you should ask for the summary via field expansion:
/{page_id}/feed?fields=likes.limit(0).summary(1)
For each feed item, you will get a likes data structure that looks like this:
"likes": {
"data": [
],
"summary": {
"total_count": 12345
}
}
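A minimal Python sketch of that call using the requests library; the page id and access token are placeholders, and the parsing assumes the summary structure shown above:

import requests

ACCESS_TOKEN = "your_access_token"  # placeholder
PAGE_ID = "your_page_id"            # placeholder

resp = requests.get(
    "https://graph.facebook.com/" + PAGE_ID + "/feed",
    params={
        "fields": "likes.limit(0).summary(1)",  # no individual likes, just the summary
        "access_token": ACCESS_TOKEN,
    },
)
resp.raise_for_status()
for post in resp.json().get("data", []):
    summary = post.get("likes", {}).get("summary", {})
    print(post["id"], summary.get("total_count", 0))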
Try the options below.
Uncheck all the boxes and make the request again; if you get no error, re-check them one by one until you find the problem.
Try not to use limit(999999); it is very high, and I've had problems trying to get that much information from a page in one query.
Make sure the access token you created has all the permissions necessary for your query.
I confess that I have never seen this error in the Graph API; it is very generic, so it is difficult to give you a more accurate suggestion.
Related
When I use /{page_id}/feed?access_token=xxxx, this gives me all the posts on the page, both by users and by the page. I want to limit and control the posts by applying constraints like:
Timestamp (that is, to get posts after a particular timestamp)
Post id (to get posts after a particular post)
Since getting all the posts from the feed is wasteful and ineffective, is there any way to accomplish this?
You can use
GET /{page_id}/feed?limit={nr_of_posts_to_return}&since={timestamp}
to be able to limit the number of results and specify the starting timestamp. Have a look at the reference here:
https://developers.facebook.com/docs/graph-api/using-graph-api/v2.0#paging
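A rough Python version of that call with the requests library; the page id, access token, since timestamp, and limit are placeholders:

import requests

ACCESS_TOKEN = "your_access_token"  # placeholder
PAGE_ID = "your_page_id"            # placeholder

resp = requests.get(
    "https://graph.facebook.com/" + PAGE_ID + "/feed",
    params={
        "since": 1390176000,   # Unix timestamp: only posts after this point
        "limit": 10,           # number of posts to return per page
        "access_token": ACCESS_TOKEN,
    },
)
resp.raise_for_status()
for post in resp.json().get("data", []):
    print(post["id"], post.get("created_time"))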
For your second use case you'd need the Batch API, imho, because with a single Graph API request you can't filter on a specific post. Instead, you need to split this into two queries as described here:
https://developers.facebook.com/docs/graph-api/making-multiple-requests/#operations
The request would then look like this:
curl \
-F 'access_token={your_access_token}' \
-F 'batch=[{ "method":"GET","name":"get-post","relative_url":"{your_post_id}?fields=created_time"},{"method":"GET","relative_url":"{your_page_id}/feed?since={result=get-post:$.created_time}&limit={nr_of_posts_to_return}"}]' \
https://graph.facebook.com/
In the Graph Explorer, you have to change the HTTP method to POST, then add a new field called batch. Leave the URL blank for now. Paste this as the batch value:
[{ "method":"GET","name":"get-post","relative_url":"293088074081904_400071946716849?fields=created_time"},{"method":"GET","relative_url":"293088074081904/feed?since={result=get-post:$.created_time}&limit=1"}]
This works at least for me.
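For anyone who prefers Python over curl, here is a hedged equivalent of that batch request with the requests library; the access token, post id, and page id are placeholders:

import json
import requests

ACCESS_TOKEN = "your_access_token"  # placeholder
POST_ID = "your_post_id"            # placeholder reference post
PAGE_ID = "your_page_id"            # placeholder

batch = [
    # First operation: fetch the reference post's created_time.
    {"method": "GET", "name": "get-post",
     "relative_url": POST_ID + "?fields=created_time"},
    # Second operation: reuse that timestamp as 'since' for the feed.
    {"method": "GET",
     "relative_url": PAGE_ID + "/feed?since={result=get-post:$.created_time}&limit=1"},
]

resp = requests.post(
    "https://graph.facebook.com/",
    data={"access_token": ACCESS_TOKEN, "batch": json.dumps(batch)},
)
resp.raise_for_status()
print(resp.json())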
For others looking for a solution: it appears that 'since' applied at the 'comment' and 'reply' levels is ignored, which means this is not a solution for me.
The query Tobi provided will return all the posts after the first 'since', but also every comment and reply in those posts, regardless of what you set their 'since' to.
Further to this, if you wish to search for new comments regardless of the age of the post, this fails as well. For example: remove the first 'since', change to limit=1000, and request only comments as a field using 'since'; this will return the last 1000 posts and all comments for all of those 1000.
That said, thank you Tobi for your time and for showing me how to get everything I need in a single function call. I may experiment with parsing the complete recordset every time (maybe too much traffic, though!).
I'm trying to perform simple time-based pagination against a Facebook endpoint, but I was getting results that didn't match my since parameter.
An example call <username>/statuses?since=1390176000
returns this pagination:
"paging": {
"previous": "https://graph.facebook.com/8489236245/statuses?since=1390500000&limit=25&__paging_token=<num>",
"next": "https://graph.facebook.com/8489236245/statuses?limit=25&until=1390250670&__paging_token=<num>"
}
My expected behavior was that after a query with since I would iterate on next until I reach NOW. But when making the query they provide with until=1390250670, I actually get OLDER results. Is there any logical explanation for this? Should I just use the previous paging link?
As you are looking at the user's statuses, the pagination is reversed as the data is ordered in reverse chronological order. The newest entries are always on the first page, so paginating to the next page will always give you older entries.
Unfortunately, the Facebook documentation doesn't mention the ordering for this API call.
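If it helps, here is a rough Python sketch (using the requests library) of walking toward NOW by following the previous link; the access token is a placeholder, and the stop-on-empty-page check is an assumption, since Facebook may keep returning a previous link even when there is nothing newer:

import requests

ACCESS_TOKEN = "your_access_token"  # placeholder
USER_ID = "8489236245"              # the id from the example above

url = "https://graph.facebook.com/" + USER_ID + "/statuses"
params = {"since": 1390176000, "access_token": ACCESS_TOKEN}

newer_entries = []
while url:
    resp = requests.get(url, params=params)
    resp.raise_for_status()
    payload = resp.json()
    data = payload.get("data", [])
    if not data:
        break  # assumed stop condition: an empty page means we have caught up
    newer_entries.extend(data)
    # Entries come newest-first, so 'previous' moves toward NOW
    # and 'next' moves further into the past.
    url = payload.get("paging", {}).get("previous")
    params = None  # the paging URL already carries its own query string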
I am performing a REST call to Facebook's search API using type=event, e.g.:
search?fields=id,name,picture,owner,description,start_time,end_time,location,venue,updated_time,ticket_uri&q=concert&type=event
I have looked through the documentation and still have a few questions about specific pagination behavior of the event search API.
If I used a broad search term like "ma" and kept querying the pagination ['next'] URL, would I cycle through all Facebook events starting with "ma"? Does the pagination array give any indication when there are no more results to return?
Do these searches include past events? If so, is it possible to eliminate past events using the "since" parameter?
What is the maximum for the limit parameter?
Update:
As far as I can tell, the number of results you can get from a Facebook search is limited to 500. This includes results that can be accessed via pagination. In other words, a query with limit >= 500 will not return a pagination URL; likewise, a query with limit 250 will only return one page's worth of pagination.
You keep requesting the "next" page until the count of results returned is less than the limit.
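A minimal Python sketch of that stop condition, assuming the event search call from the question and the requests library; the access token and page size are placeholders:

import requests

ACCESS_TOKEN = "your_access_token"  # placeholder
LIMIT = 100                         # page size chosen for the example

url = "https://graph.facebook.com/search"
params = {
    "q": "concert",
    "type": "event",
    "limit": LIMIT,
    "access_token": ACCESS_TOKEN,
}

events = []
while url:
    resp = requests.get(url, params=params)
    resp.raise_for_status()
    payload = resp.json()
    page = payload.get("data", [])
    events.extend(page)
    # Stop when the page comes back short or there is no 'next' link at all.
    if len(page) < LIMIT:
        break
    url = payload.get("paging", {}).get("next")
    params = None  # 'next' already includes the full query string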
I'm not sure if that is possible using a simple Graph request. Maybe using FQL.
I don't know exactly, but I used a limit of 2000 one day and it worked.
For other doubts, you can get answers by testing your requests with this tool:
https://developers.facebook.com/tools/explorer/
I am doing the same thing as you: collecting public posts using the Graph Search API.
When there are no more results available, or you reach the maximum limit, the pagination section will not be present in the response. So you can always check whether paging is in the JSON response, something like this:
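// If "paging" is absent (no more results, or the maximum limit was reached), fall back to an empty string; otherwise keep the next-page URL.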
NextResult = DeserJsonFBResponce.paging != null ? DeserJsonFBResponce.paging.next : string.Empty;
I am not so sure about this for events, but for public posts I am able to eliminate posts using the since and until parameters.
The maximum for the limit parameter is 2000 per GET request.
I have been trying to find a way to POST changes to one of the Thread object's fields in my inbox to mark it as "read", i.e. change unread from 1 to 0, as seen below in the JSON response I get:
"unread": 1,
"id": "1643543545",
"updated_time": "2013-02-12T14:53:26+0000",
"comments": {
"data": [
{
.
.
.
}
]
}
However, I am a bit lost in finding out which part of the API documentation talks about that, and which objects you can POST to in order to alter their fields. Looking at the Thread object, the only thing mentioned there is the ability to fetch data in a read-only manner. There is no mention whatsoever of whether the object's fields can be updated or altered, i.e. changing unread from 1 to 0.
Is it possible in the first place, or is POST only dedicated to specific parts of the API like "feeds", "messages", etc.?
If such a thing does not exist, any ideas on how to do this would be appreciated (you'll get a discontinued Canadian penny as a token of appreciation :)
It is not currently possible for developers to mark a thread or message as "read".
Possible duplicate of:
Facebook Graph API, mark inbox as read?
I'm fetching a large number of comments from a public page using Facebook's Graph API.
By default Facebook returns 25 comments per response and uses paging. This requires multiple requests, which is unnecessary since I know ahead of time that there will be a lot of comments.
I read about the "limit" parameter that you can pass to ask for a certain number of items per response.
I was wondering, what is the maximum value of that parameter? I'm assuming I can't pass &limit=10000.
There's a different way for fetching comments:
https://graph.facebook.com/<PAGE_ID>_<POST_ID>/comments?limit=500
The maximum value for the limit parameter is 500.
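As a hedged illustration of that call in Python with the requests library; the access token and the composite post id are placeholders:

import requests

ACCESS_TOKEN = "your_access_token"  # placeholder
POST_ID = "<PAGE_ID>_<POST_ID>"     # composite post id, as in the URL above

resp = requests.get(
    "https://graph.facebook.com/" + POST_ID + "/comments",
    params={"limit": 500, "access_token": ACCESS_TOKEN},  # 500 = stated maximum
)
resp.raise_for_status()
comments = resp.json().get("data", [])
print(len(comments), "comments in the first batch")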
Yes, with the limit parameter you can specify how many of a certain resource you want in one call. The default limit is 25.
For example, if you want 100 comments in one call for a post with id POST_ID, you can query like this:
https://graph.facebook.com/POST_ID?fields=comments.limit(100)
I think they have changed this. For /feed I only get 200-225 posts back, but for comments I get as many as 2000 back.
Old question, but this is in the current Facebook documentation in case anyone finds this question via search (emphasis mine):
Some edges may also have a maximum on the limit value for performance reasons. In all cases, the API returns the correct pagination links.
In other words, even if you specify a limit above what's allowed by the endpoint, the "paging.previous" and "paging.next" elements will always provide the correct URL to resume where you left off.
I would recommend using FQL instead.
FQL provides a more flexible approach where you can combine data types (posts, users, pages, etc.) as you please. You can also query for comments belonging to a list of stories instead of just one, reducing your number of requests even more.
There are a couple of drawbacks though:
1. There is a limit of 5000 comments. Here you would use a query looking something like: "SELECT id, ...... FROM comments, ... WHERE parent_id IN (1,2,3....) ORDER BY time LIMIT 0, 5000". Even if you split this up into several queries with "LIMIT 0, 1000", "LIMIT 1000, 1000", "LIMIT 2000, 1000", etc., you would never get anything beyond 5000 comments ("LIMIT 5000, 1000" would return empty). A rough paging sketch along these lines follows after this list.
2. Each real request made on Facebook's servers counts as a request. What you send may look like a single call but actually be a combination of requests, and it will be counted as multiple requests.
3. Facebook does not like requests that are too heavy. You can end up getting blocked for shorter time periods (minutes to hours, not days). If this happens, act on it.
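For what it's worth, a minimal sketch of the chunked paging mentioned in point 1, assuming the old /fql endpoint from Graph API v2.0 and the requests library; the query's table, field list, and ids are taken from the answer above as placeholders, and FQL has since been removed from the API entirely:

import requests

ACCESS_TOKEN = "your_access_token"  # placeholder
CHUNK = 1000                        # rows per request; 5000 is the hard ceiling

# Query shape copied from point 1 above; the field list, table name, and id
# list are placeholders, not verified against the (now removed) FQL schema.
query_template = (
    "SELECT id, text, time FROM comments "
    "WHERE parent_id IN (1,2,3) "
    "ORDER BY time LIMIT {offset}, {chunk}"
)

all_comments = []
for offset in range(0, 5000, CHUNK):
    resp = requests.get(
        "https://graph.facebook.com/fql",
        params={
            "q": query_template.format(offset=offset, chunk=CHUNK),
            "access_token": ACCESS_TOKEN,
        },
    )
    resp.raise_for_status()
    rows = resp.json().get("data", [])
    if not rows:
        break  # out of data, or past the 5000-comment ceiling
    all_comments.extend(rows)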