Can I paginate Facebook FQL queries on the photo table? - facebook

I have the following FQL:
SELECT id, created, like_info, comment_info FROM photo WHERE owner = me()
It returns all the information I want in one HTTP request and works great.
Except that it only returns the first 100 photos. When I query via the Graph API using /self/photos/uploaded and paginate through those results, I do get the several hundred photos I expect.
The problem is that to get the like and comment info, I have to (potentially) issue several more queries in order to paginate through the comments and likes sections for each photo. As several of these photos have more than 25 comments and more than 25 likes, this can easily add up to several hundred HTTP requests.
I've tried various WHERE clauses in the FQL to get beyond the 100th result returned, but the FQL simply won't return the 101st photo. And, as usual, Facebook's documentation is sorely lacking.
Anyone have any ideas?

FQL has LIMIT and OFFSET keywords, so you could get the next 100 photos, starting with the 101st, by adding LIMIT 100 OFFSET 100 (OFFSET counts the rows to skip). I believe you can request up to 5000 items in FQL.
You should look at FQL Multiqueries to cut down on the number of calls.
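A rough sketch of that combination (Python, untested; the now-deprecated graph.facebook.com/fql endpoint, the comment-table column names, and the name/fql_result_set response shape are assumptions based on how the old API behaved) might page through the photos in blocks of 100 and pull the matching comments and likes in the same round trip:

import json
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder

def fetch_photo_batch(offset, limit=100):
    # One multiquery = one HTTP call: a page of photos plus all of the
    # comments and likes that reference those photos via #photos.
    queries = {
        "photos": ("SELECT object_id, created, like_info, comment_info "
                   "FROM photo WHERE owner = me() "
                   "LIMIT %d OFFSET %d" % (limit, offset)),
        "comments": ("SELECT object_id, fromid, text, time FROM comment "
                     "WHERE object_id IN (SELECT object_id FROM #photos)"),
        "likes": ("SELECT object_id, user_id FROM like "
                  "WHERE object_id IN (SELECT object_id FROM #photos)"),
    }
    resp = requests.get(
        "https://graph.facebook.com/fql",
        params={"q": json.dumps(queries), "access_token": ACCESS_TOKEN},
    )
    return resp.json()

# Step the offset in blocks of 100 until a batch comes back empty.
all_photos = []
offset = 0
while True:
    batch = fetch_photo_batch(offset)
    blocks = {b["name"]: b["fql_result_set"] for b in batch.get("data", [])}
    if not blocks.get("photos"):
        break
    all_photos.extend(blocks["photos"])
    # blocks.get("comments") and blocks.get("likes") hold the matching rows.
    offset += 100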

Related

getting user likes in facebook

I am trying to get a user's likes on Facebook, but for some reason I can only get 100 at a time. I tried using the limit parameter in the Graph API and it didn't help, and I also tried writing an FQL query to get more likes, to no avail. No matter what I do, I get only 100 likes per request. It's even worse than that: most of the likes are of no interest to me, since I only care about likes in a few categories. If I could get (using FQL) 100 likes of a user that all belong to those categories, that would be sufficient for me. But when I call the FQL query, it seems that FB filters the user's first 100 likes and returns the matches, instead of returning 100 matching results. Is there a way around this mechanism, or does FB really give no way to get more likes in a single call?
Read about paging in the docs: https://developers.facebook.com/docs/graph-api/using-graph-api/v2.2?locale=en_GB#paging
I believe the maximum limit is 100; if you want more than that, you have to make additional calls using paging. There is no way to filter with the API, so you will have to do that on your own after fetching the likes.
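A minimal sketch of that approach (Python; the category names are hypothetical and the filtering happens entirely on your side):

import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder
WANTED = {"Musician/band", "Movie"}  # hypothetical categories of interest

def fetch_all_likes():
    # Follow paging.next until Facebook stops returning one.
    url = "https://graph.facebook.com/me/likes"
    params = {"limit": 100, "access_token": ACCESS_TOKEN}
    likes = []
    while url:
        payload = requests.get(url, params=params).json()
        likes.extend(payload.get("data", []))
        url = payload.get("paging", {}).get("next")
        params = None  # the "next" URL already carries the token and cursor
    return likes

# Filter by category client-side, since the API won't do it for you.
interesting = [like for like in fetch_all_likes()
               if like.get("category") in WANTED]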

Facebook Graph API: Get all posts from all the pages the user likes

I want to get all the posts from all the pages from the user /me/home feed.
Right now Facebook is deciding for the user what posts will get to the feed and which ones will not.
For example, if the user is subscribed to (likes) 100 pages and all 100 of them post an update, the user's feed will not show all 100 updates, only the portion Facebook thinks is important. The same is true of the API.
Is it possible to get all updates using the Graph API (like a regular timeline)?
You can try FQL, for example:
{"query1":"SELECT type,post_id,created_time,actor_id,target_id,message,attachment.media,attachment.caption,attachment.name,attachment.description,attachment.fb_checkin,likes.count,likes.user_likes,likes.can_like,comment_info,description FROM stream WHERE filter_key='pp' AND created_time<now() ORDER BY created_time DESC","query2":"SELECT id,name,pic FROM profile WHERE id IN (SELECT actor_id,target_id FROM #query1)"}
The key is filter_key='pp', which means you want the news feed restricted to pages.
I have no idea whether it will include ALL 100 pages in real time, but this should be enough to achieve your goal. One more point: the news feed has a one-week limitation, meaning you cannot query news-feed items older than one week.
Update:
https://graph.facebook.com/me/home?filter=pp is an alternative way if you don't want to use FQL.
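If you do go the FQL route, a rough sketch of running that multiquery and stitching the profile rows (query2) back onto the stream rows (query1) could look like this; the /fql endpoint and the name/fql_result_set response shape are assumptions based on how the old multiquery calls behaved:

import json
import requests

def fetch_page_feed(access_token, query1, query2):
    # query1 and query2 are the two FQL strings from the multiquery above.
    resp = requests.get(
        "https://graph.facebook.com/fql",
        params={"q": json.dumps({"query1": query1, "query2": query2}),
                "access_token": access_token},
    ).json()
    results = {block["name"]: block["fql_result_set"]
               for block in resp.get("data", [])}
    profiles = {p["id"]: p for p in results.get("query2", [])}
    # Attach the matching profile (name, pic) to each stream post.
    for post in results.get("query1", []):
        post["actor"] = profiles.get(post["actor_id"])
    return results.get("query1", [])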
Use the following FQL in your HTTPS request to page through the photos the user has liked:
SELECT src_big, src_small, owner, caption FROM photo WHERE object_id IN (SELECT object_id FROM like WHERE user_id = me() LIMIT 10 OFFSET 8)
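One hypothetical way to page through it is to keep the LIMIT fixed and step the OFFSET on the inner like subquery until nothing comes back:

import requests

def liked_photos(access_token, page_size=25):
    photos, offset = [], 0
    while True:
        fql = ("SELECT src_big, src_small, owner, caption FROM photo "
               "WHERE object_id IN (SELECT object_id FROM like "
               "WHERE user_id = me() LIMIT %d OFFSET %d)" % (page_size, offset))
        rows = requests.get("https://graph.facebook.com/fql",
                            params={"q": fql, "access_token": access_token}
                            ).json().get("data", [])
        # Caveat: a slice of likes may contain non-photo objects, so an empty
        # photo batch does not strictly prove there are no likes left; stopping
        # here is a simplification.
        if not rows:
            break
        photos.extend(rows)
        offset += page_size
    return photos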

Getting 'pic-square' from user posts on the news feed using Facebook Graph API

I'm trying to get the 50 x 50 profile picture for user posts on the news feed. It's easy to do this for likes and comments on posts, but I can't figure out how to do it for the initial posts. I also want to limit the query so that I only get results from "people", not "pages", and I only want unique results. In other words, if a user appears twice on the feed I only need their picture once. I've played with the Graph API Explorer extensively and have looked all over the forums, here, as well as the documentation on the Facebook developers site. I would think this would be a common request, so I'm not sure why it's been so hard to find. My guess is that the syntax of the query would look something like this, although this query doesn't work in the Explorer:
me/home/?fields=from.id.fields(pic_square),from&profile_type=user
Well, after a lot of research and trial and error with FQL, I figured it out. It's a series of nested queries that returns an array with the URLs of the pic_square images of the users whose posts appear in your news feed. The FQL query looks like this:
SELECT pic_square FROM user WHERE uid IN (SELECT actor_id FROM stream WHERE filter_key IN (SELECT filter_key FROM stream_filter WHERE uid=me()))
I was not able to differentiate between user and page profile pics.
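Since the question also asks for unique results, here is a small sketch (again assuming the old /fql endpoint, and adding uid to the SELECT so there is something to deduplicate on):

import requests

def unique_feed_pics(access_token):
    fql = ("SELECT uid, pic_square FROM user WHERE uid IN "
           "(SELECT actor_id FROM stream WHERE filter_key IN "
           "(SELECT filter_key FROM stream_filter WHERE uid=me()))")
    rows = requests.get("https://graph.facebook.com/fql",
                        params={"q": fql, "access_token": access_token}
                        ).json().get("data", [])
    # Keep one picture per uid so a user who posted twice only appears once.
    return {row["uid"]: row["pic_square"] for row in rows}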

Facebook Graph / Fql: Get all pics

I'm having return issues with the Graph API and I'm wondering if anyone knows why, or how I can fix it.
I need to download all photo data for a given user (friend data, not the active user). Here are the two things I have tried.
FQL: "select pid from photo where subject=friend_uid"
Graph: "friend_uid/photos?fields=picture,created_time,tags.fields(name,id)&limit=1000"
I have friends_photos and user_photos permissions.
Any user with more than 1000 tags will have drastically reduced result numbers.
For example, it will only return around 200 photos or so, which is not acceptable; I need all of them.
Chunking with since/until (or created_time < or > __) as well as LIMIT clauses only improves the result count with FQL, but the amount of chunking required makes it VERY inefficient.
Any ideas? The tag data is also important for my purposes.
So, I need the proper query or sequence of queries to obtain ALL tag data for all photos of a given user_id using either FQL or graph-api.
I recently created a similar project (it pulls all the photos from all your Facebook friends, in order) in PHP. Facebook's limits are poorly documented, but I found that with the Graph API it's 400 photos the friend is tagged in and 5000 photos the friend uploaded per request. Note that pulling from {user}/photos only pulled photos they are tagged in, while {user}/photos/uploaded only pulls photos that {user} uploaded. I figured that 400 tagged and 5000 uploaded photos was enough for my situation.
If you do need additional photos, you will have to check the number of photos returned by the /photos request and see whether it's equal to 400. If so, you will have to go on to the next page, recursively.
For the /photos/uploaded request, Facebook uses cursor-based pagination (see bottom of this page), which means that pagination->next and pagination->prev data is only sent when there are more values to return. This makes it fairly easy to get the next page (once again you will have to do this recursively).
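A rough recursive sketch of that cursor walk (Python; FRIEND_UID and the field list are placeholders):

import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder
FRIEND_UID = "FRIEND_USER_ID"       # placeholder

def fetch_uploaded_photos(url=None):
    # Recursively follow paging.next until Facebook stops returning one.
    if url is None:
        url = ("https://graph.facebook.com/%s/photos/uploaded"
               "?fields=picture,created_time,tags&limit=100&access_token=%s"
               % (FRIEND_UID, ACCESS_TOKEN))
    payload = requests.get(url).json()
    photos = payload.get("data", [])
    next_url = payload.get("paging", {}).get("next")
    if next_url:
        photos += fetch_uploaded_photos(next_url)
    return photos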

How can I retrieve the full newsfeed of a user via the Facebook API?

I would like to retrieve the full news feed, including historical data, of a given user. In principle, this is straightforward using an authenticated call to either the Graph API or the FQL API.
With the Graph API, I access the me/home endpoint. This results in 25 entries. I can iterate over the pages and retrieve around 8 pages back into history, giving me around 200 entries. I say around 200 entries because each run through this gives me a different total number of entries, sometimes more, sometimes less.
With the FQL API, I call SELECT post_id, created_time, actor_id, message FROM stream WHERE filter_key = 'nf' AND is_hidden=0 AND created_time > 1262304000 LIMIT 500 where the created time reflects 1 Jan 2010. This gives me around 150 entries.
Neither method seems to allow working your way backwards into history. In the FQL query, I also tried playing with the created_time field and LIMIT to go backwards in small chunks, but it didn't work.
The documentation of the stream table (http://developers.facebook.com/docs/reference/fql/stream/) says, somewhat cryptically:
The profile view, unlike the homepage view, returns older data from our databases.
Homepage view - as far as I understand - is another word for Newsfeed, so that might mean that what I want is not even possible at all?
To make things worse (though that's not the main topic of this question), the result sets returned by the two methods differ. Each contains entries that the other does not show, but they also have many entries in common. Even worse, the same is true in comparison to the real news feed on the Facebook website.
Does anyone have any experience or deeper insights on this?
Maybe I am misunderstanding your question, but can't you simply call the Graph API with /me/home?limit=5000 and then ?limit=5000&offset=5000, or whatever the maximum limit value Facebook allows?
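Something along these lines, with the caveat that Facebook may silently cap both the limit and how far back offset paging reaches:

import requests

def fetch_home_feed(access_token, page_size=200):
    # Step the offset until an empty page comes back; the effective page
    # size is whatever Facebook decides to honour, not necessarily page_size.
    posts, offset = [], 0
    while True:
        payload = requests.get(
            "https://graph.facebook.com/me/home",
            params={"limit": page_size, "offset": offset,
                    "access_token": access_token},
        ).json()
        batch = payload.get("data", [])
        if not batch:
            break
        posts.extend(batch)
        offset += len(batch)
    return posts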