I am trying to get a user's likes from Facebook. For some reason I can only get 100 at a time. I tried using the limit parameter in the Open Graph syntax and it didn't help. I also tried writing an FQL query to get more likes, to no avail. No matter what I do, I get only 100 likes per request. It's even worse than that: most of the likes are of no interest to me, since I only care about likes in a few categories. If I could get (using FQL) 100 likes of a user that all belong to those categories, that would be sufficient for me. But when I run the FQL query, it seems that FB queries only the first 100 likes and returns results from those, instead of returning 100 matching results. This mechanism has me in despair. Is there a way around it, or does FB really give no way to get more likes in a single call?
Read about paging in the docs: https://developers.facebook.com/docs/graph-api/using-graph-api/v2.2?locale=en_GB#paging
I guess the max limit is 100; if you want to get more than that, you have to make another call using paging. There is no way to filter by category with the API, so you will have to do that on your own after getting the likes.
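A minimal sketch of that approach in Python, assuming the requests library, a valid user access token (ACCESS_TOKEN is a placeholder), and some example category names; it fetches one page of likes and filters client-side:

import requests

ACCESS_TOKEN = "..."  # placeholder: a user access token with the user_likes permission
WANTED_CATEGORIES = {"Musician/band", "Movie"}  # example categories, adjust as needed

# Ask for the largest page the API will give us and filter the results ourselves,
# since there is no server-side category filter.
resp = requests.get(
    "https://graph.facebook.com/me/likes",
    params={"access_token": ACCESS_TOKEN, "limit": 100},
)
likes = resp.json().get("data", [])

filtered = [like for like in likes if like.get("category") in WANTED_CATEGORIES]
print(len(filtered), "likes in the categories of interest (first page only)")

Repeat this with the paging links described in the docs above to cover the remaining pages.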
Related
Is there a simple way to get the number of likes on posts on a page from the Facebook Graph API? My current approach is to get the feed of the page, and then for each individual post get the number of likes.
So first I iterate over all the posts on
-- I make this call a ton of times, as I can only retrieve 100 at a time
/{page-id}/feed
And then, using each post ID,
-- I make this request even more times
/{object-id}/likes?summary=true
But this is horribly inefficient and takes a long time for each page.
So basically the question is, can I get the info making less requests?
This works for me:
/{page-id}/posts?fields=message,likes.limit(1).summary(true)
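As a rough sketch of what that call looks like in Python (assuming the requests library; ACCESS_TOKEN and PAGE_ID are placeholders), the total like count comes back in the summary block requested by the field expansion:

import requests

ACCESS_TOKEN = "..."   # placeholder: a user or page access token
PAGE_ID = "page-id"    # placeholder: the page whose posts you want

resp = requests.get(
    "https://graph.facebook.com/" + PAGE_ID + "/posts",
    params={
        "access_token": ACCESS_TOKEN,
        "fields": "message,likes.limit(1).summary(true)",
    },
)

# Each post carries a likes.summary.total_count, so no per-post /likes call is needed.
for post in resp.json().get("data", []):
    total = post.get("likes", {}).get("summary", {}).get("total_count", 0)
    print(post.get("id"), total, "likes")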
I have the following FQL:
SELECT id, created, like_info, comment_info FROM photo WHERE owner = me()
It returns all the information I want in one HTTP request and works great.
Except that it only returns the first 100 photos. When I query via the OpenGraph API using /self/photos/uploaded and paginate through those results, I properly get the several hundred photos I expect.
The problem is that to get the like and comment info, I have to (potentially) issue several more queries in order to paginate through the comments and likes sections for each photo. As several of these photos have more than 25 comments and more than 25 likes, this can easily add up to several hundred HTTP requests.
I've tried various WHERE clauses in the FQL to get beyond the 100th result returned, but the FQL simply won't return the 101st photo. And, as usual, Facebook's documentation is sorely lacking.
Anyone have any ideas?
FQL has LIMIT and OFFSET keywords, so you could get 100 photos starting with the 101st by adding LIMIT 100 OFFSET 100. I believe you can request up to 5000 items in FQL.
You should look at FQL Multiqueries to cut down on the number of calls.
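As a rough illustration of the multiquery idea (FQL has long since been removed, so treat this purely as a historical sketch; the token is a placeholder and the column names are taken from the old photo/like/comment tables), the photo query and its like/comment lookups can be bundled into a single HTTP call to the /fql endpoint:

import json
import requests

ACCESS_TOKEN = "..."  # placeholder: a user access token

# One HTTP request, three dependent queries; later queries reference
# earlier ones with the #name syntax.
queries = {
    "photos": "SELECT object_id, created, like_info, comment_info FROM photo "
              "WHERE owner = me() LIMIT 100 OFFSET 100",
    "likes": "SELECT object_id, user_id FROM like "
             "WHERE object_id IN (SELECT object_id FROM #photos)",
    "comments": "SELECT object_id, fromid, text FROM comment "
                "WHERE object_id IN (SELECT object_id FROM #photos)",
}

resp = requests.get(
    "https://graph.facebook.com/fql",
    params={"q": json.dumps(queries), "access_token": ACCESS_TOKEN},
)
print(resp.json())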
I need to get ALL user likes at once without pagination.
I could hit graph.facebook.com/me/likes ... however, is there a limit to the number of objects returned by Facebook? If so, what is that limit, and can it be overridden?
The default limit is something like 25 results. You can specify a limit by providing a limit parameter to Facebook:
https://graph.facebook.com/me/likes?limit=100
Check out the API documentation under the heading "Paging".
That said, there's never a guarantee that you'll get all the likes at once, even if you set the limit parameter to be greater than or equal to the number of likes on an object.
On top of that, you'll often find that the number of likes reported on the Facebook website or by the Graph API is higher than what you can get by fetching the /likes connection in the Graph API. I'm trying (and failing) to find the SO question that talked about why that is, but if I remember right that number sometimes includes shares and other actions, not just likes.
You should use pagination to page through all the data that the Graph API can return.
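In newer Graph API versions the paging block also includes cursors, so one way to walk the whole connection is to keep passing the after cursor back until no further page is offered. A sketch in Python, assuming the requests library and a placeholder token:

import requests

ACCESS_TOKEN = "..."  # placeholder: a user access token with the user_likes permission

all_likes = []
after = None
while True:
    params = {"access_token": ACCESS_TOKEN, "limit": 100}
    if after:
        params["after"] = after  # resume from the cursor of the previous page
    page = requests.get("https://graph.facebook.com/me/likes", params=params).json()
    all_likes.extend(page.get("data", []))
    paging = page.get("paging", {})
    if "next" not in paging:
        break  # no further pages advertised
    after = paging.get("cursors", {}).get("after")

print("fetched", len(all_likes), "likes in total")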
I would like to retrieve the full newsfeed, including historical data, of a given user. In principle, this is straightforward using either an authenticated call to the Graph API or to the FQL API.
With the Graph API, I access the me/home endpoint. This results in 25 entries. I can iterate over the pages and retrieve around 8 pages back into history, giving me around 200 entries. I say "around 200 entries" because with each run through this I get a different number of total entries, sometimes more, sometimes less.
With the FQL API, I call SELECT post_id, created_time, actor_id, message FROM stream WHERE filter_key = 'nf' AND is_hidden=0 AND created_time > 1262304000 LIMIT 500 where the created time reflects 1 Jan 2010. This gives me around 150 entries.
Neither method seems to let you work your way backwards into history. In the FQL query, I also tried playing around with the created_time field and LIMIT to go backwards in small chunks, but it didn't work.
The documentation of the stream table, http://developers.facebook.com/docs/reference/fql/stream/, says, somewhat cryptically:
The profile view, unlike the homepage view, returns older data from our databases.
Homepage view - as far as I understand - is another word for Newsfeed, so does that mean that what I want is not even possible at all?
To make things worse (but that's not the main topic of this question) the returned datasets from the two methods differ. Both contain entries that the other does not show but they also have many entries in common. Even worse, the same is true in comparison to the real newsfeed on the Facebook website.
Does anyone have any experience or deeper insights on this?
Maybe I am misunderstanding your question, but can't you simply call the Graph API with /me/home?limit=5000 and then ?limit=5000&offset=5000, or whatever the maximum limit value Facebook allows?
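A quick sketch of that limit/offset approach in Python, assuming the requests library, a placeholder token, and that Facebook honours large limits (in practice it tends to cap both the page size and how far back the home feed goes):

import requests

ACCESS_TOKEN = "..."  # placeholder: a user access token with the read_stream permission

posts = []
offset = 0
limit = 100  # in practice Facebook tends to cap page sizes well below 5000
while True:
    page = requests.get(
        "https://graph.facebook.com/me/home",
        params={"access_token": ACCESS_TOKEN, "limit": limit, "offset": offset},
    ).json()
    batch = page.get("data", [])
    if not batch:
        break  # nothing further came back; history may simply be cut off here
    posts.extend(batch)
    offset += limit

print("collected", len(posts), "news feed entries")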
I'm trying to retrieve all the photos a user is tagged in using the Graph API but I can only get the latest 25.
Is it possible to get more, and if so, how?
Have you tried adding the limit and offset parameters? Quoting the documentation:
Paging
When querying connections, there are several useful parameters that enable you to filter and page through connection data:
limit, offset: https://graph.facebook.com/me/likes?limit=3
until, since (a unix timestamp or any date accepted by strtotime): https://graph.facebook.com/search?until=yesterday&q=orange
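Applying the until parameter from that quote to the tagged-photos question, a rough Python sketch (assuming the requests library and a placeholder token) walks backwards by feeding the oldest created_time of each batch back in as until:

import requests
from datetime import datetime

ACCESS_TOKEN = "..."  # placeholder: a user access token with the user_photos permission

def to_unix(created_time):
    # Facebook returns times like "2014-10-15T12:34:56+0000"
    return int(datetime.strptime(created_time, "%Y-%m-%dT%H:%M:%S%z").timestamp())

photos = []
until = None
while True:
    params = {"access_token": ACCESS_TOKEN, "limit": 100, "fields": "id,created_time"}
    if until is not None:
        params["until"] = until  # only return photos at or before this timestamp
    page = requests.get("https://graph.facebook.com/me/photos", params=params).json()
    batch = page.get("data", [])
    if not batch:
        break
    photos.extend(batch)
    # Step the window back past the oldest photo seen so far; subtracting one
    # second avoids refetching it, at the risk of skipping photos from that same second.
    until = min(to_unix(p["created_time"]) for p in batch) - 1

print("collected", len(photos), "tagged photos")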
Currently there is a 100-item limit per query, on both photos and likes. However, even a 100-picture query takes a long time to run for me.
The following API call:
https://graph.facebook.com/me/photos?limit=500
gives only 100 results, along with a paging link.
Run it in the Graph API Explorer (if you have more than 100 pics on your account):
https://developers.facebook.com/tools/explorer/?method=GET&path=me%2Fphotos%3Flimit%3D500
Setting limit=0 may not always work in the case that a user has a huge number of tagged photos. Also note that the tagged-photos Graph API call can return a large number of embedded comments as well, so especially if you are developing a mobile app, it can take a long time to return all the photo data. Finally, I find that sometimes Facebook will limit the number of entries it can return in times of high load.
So... perhaps the best way is to use the "paging" "next" URL that appears at the end of the returned photo data. This gives you the next Graph API call, which you can then use to get the next x photos. It does this by pre-populating the limit and until parameters and incorporating them into the Graph API call. Very handy.
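A short Python sketch of that next-URL approach, assuming the requests library and a placeholder token; since the returned paging.next URL already carries the access token, limit, and until values, it can be followed verbatim until Facebook stops handing one back:

import requests

ACCESS_TOKEN = "..."  # placeholder: a user access token with the user_photos permission

photos = []
url = "https://graph.facebook.com/me/photos"
params = {"access_token": ACCESS_TOKEN, "limit": 100}

while url:
    page = requests.get(url, params=params).json()
    photos.extend(page.get("data", []))
    # The "next" URL comes pre-populated with limit/until and the token,
    # so drop our own params and just follow it.
    url = page.get("paging", {}).get("next")
    params = None

print("collected", len(photos), "photos")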