I'm trying to get statistics for likes on my domain. I would like to get all likes (if possible with user ids) for all pages on my domain (which has tens of thousands of pages)
What does domain_like_adds actually return?
SELECT metric, value FROM insights
WHERE object_id = [domain-id]
  AND metric = 'domain_like_adds'
  AND end_time = end_time_date('2011-01-03')
  AND period = period('month')
This returns nothing. Does anyone know what data domain_like_adds actually returns?
Regards,
Niklas
I don't think there's any way you're going to get user IDs, as that would be a major privacy invasion. I believe domain_like_adds indicates how many NEW likes your domain got in the given time period, as opposed to the cumulative likes your domain has earned up to that point. There doesn't appear to be a viable way to determine the number of likes of all objects in your domain for all time without tracking it from the beginning and/or going back and summing up historical data.
You can make a sitemap.xml of your site and check each URL against the Facebook Graph API. I actually made a Ruby script to do this: http://bobbelderbos.com/2012/01/ruby-script-facebook-like-stats-blog/. I don't think you can get the users who 'liked' your pages, but this script might be useful for finding out which URLs are most popular.
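If Ruby isn't your thing, the same idea fits in a few lines of Python. This is a minimal sketch, not the script from the link above; it assumes a standard sitemap.xml (the domain here is a placeholder) and that the legacy https://graph.facebook.com/?id=URL lookup returns a "shares" count for each URL:

import json
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://example.com/sitemap.xml"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Pull every <loc> URL out of the sitemap
tree = ET.parse(urllib.request.urlopen(SITEMAP_URL))
urls = [loc.text for loc in tree.findall(".//sm:loc", NS)]

stats = {}
for url in urls:
    # Legacy lookup: the JSON carries a "shares" field when the URL has any
    lookup = "https://graph.facebook.com/?id=" + urllib.parse.quote(url, safe="")
    with urllib.request.urlopen(lookup) as resp:
        stats[url] = json.load(resp).get("shares", 0)

# Most popular pages first
for url, shares in sorted(stats.items(), key=lambda kv: kv[1], reverse=True):
    print(shares, url)

With tens of thousands of pages you'd want to throttle the loop and checkpoint results, but the shape of the job is the same.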
We have spent several days looking into the FB Graph API and third-party tools for scraping FB data, but we can't figure out whether it is even possible to scrape what we are looking for, and whether it falls within FB policies (we are really not looking forward to starting a lawsuit with FB).
We need to obtain statistics on how often a specific question (read: a problem we will try to solve) is posted on Facebook. We need to get all FB posts filtered by three criteria:
Location - the country or city of the user who posted
Time - some reasonable period of time, for example a full month, week or day
Keyword - a keyword that can be associated with the questions we are looking for
We would then take this data set and manually go over it to distinguish what's relevant to us and what is not. Maybe we'd use a language-processing engine like wit.ai or api.ai and use the data set to teach the app to recognize which posts are relevant and which are not. But that's on us, later.
So the question: is it possible (technically, and also from the FB policy point of view), and what would be the steps to get FB posts filtered by the three criteria stated above?
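For the keyword criterion, the only API surface we have found so far is the public post search endpoint that the Graph API exposed at the time (it was removed in later API versions). A sketch of how it was called, with placeholder token and keyword; note the results carry no poster location, so that filter would have to be applied afterwards, permissions and policy allowing:

import json
import urllib.parse
import urllib.request

ACCESS_TOKEN = "YOUR_TOKEN"   # placeholder
KEYWORD = "some problem"      # placeholder keyword

# Historical endpoint: GET /search?q=...&type=post (removed in Graph API v2.0)
params = urllib.parse.urlencode({
    "q": KEYWORD,
    "type": "post",
    "access_token": ACCESS_TOKEN,
})
with urllib.request.urlopen("https://graph.facebook.com/search?" + params) as resp:
    posts = json.load(resp).get("data", [])

for post in posts:
    # created_time covers the time criterion; location is NOT in the result
    print(post.get("created_time"), post.get("message", "")[:80])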
I need to quickly get the names of about 1000 users for which I currently only have the Facebook ID and access token. I'm not comfortable with the FB API yet, so I was considering just writing a scraper to retrieve the name from each user's FB page (since I have the users' IDs).
Is this allowed? I assume it's not "best practice", but how severe is it? Will it get me banned, for instance? The data will only be used to complete our user database, so no advertisement.
Alternatively: can anyone point me to a good (and up-to-date) guide on how to get user info using the FB API (keep in mind that I have the IDs and access tokens of all my users)?
No, scraping is not allowed and you MUST use the Graph API: https://www.facebook.com/apps/site_scraping_tos_terms.php
/me?fields=name&access_token=[user-access-token] returns the name of a user. You may run into API limits, but if it's a one-time thing it should not really matter. If you hit the limits, just wait a bit and get the next batch.
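To cut down on round trips, the Graph API also accepts multiple IDs per call via the ?ids= parameter. A minimal sketch, assuming you hold the list of user IDs and one token that is allowed to read their names (IDs below are placeholders):

import json
import urllib.parse
import urllib.request

ACCESS_TOKEN = "USER_OR_APP_TOKEN"  # placeholder
user_ids = ["4", "5", "6"]          # the ~1000 IDs you already have

names = {}
BATCH = 50  # stay well under URL-length and per-call limits
for i in range(0, len(user_ids), BATCH):
    chunk = user_ids[i:i + BATCH]
    params = urllib.parse.urlencode({
        "ids": ",".join(chunk),
        "fields": "name",
        "access_token": ACCESS_TOKEN,
    })
    with urllib.request.urlopen("https://graph.facebook.com/?" + params) as resp:
        # The response maps each requested ID to {"id": ..., "name": ...}
        for uid, obj in json.load(resp).items():
            names[uid] = obj.get("name")

print(len(names), "names fetched")

At 50 IDs per request, 1000 users is only 20 API calls, which should stay comfortably inside the limits.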
Is there a way that I can retrieve the total number of Likes for my entire website for the day?
Essentially, I'd like to display the total number of Likes each blog post received that day.
These are all-time stats, not per-day, but this simple Graph API call shows how many times a URL has been shared, for example:
https://graph.facebook.com/http://google.com
If you run it every day with every URL of your website, you can store the values and compute daily stats.
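Since the endpoint only returns an all-time total, one way to get daily numbers is to snapshot the counts once a day and diff against the previous day's snapshot. A rough sketch, assuming a local SQLite database as the store (table name and blog URLs are made up):

import datetime
import json
import sqlite3
import urllib.parse
import urllib.request

db = sqlite3.connect("likes.db")
db.execute("CREATE TABLE IF NOT EXISTS snapshots (day TEXT, url TEXT, shares INTEGER)")

def shares_for(url):
    # All-time share count from the legacy endpoint shown above
    lookup = "https://graph.facebook.com/?id=" + urllib.parse.quote(url, safe="")
    with urllib.request.urlopen(lookup) as resp:
        return json.load(resp).get("shares", 0)

today = datetime.date.today().isoformat()
for url in ["http://example.com/post-1", "http://example.com/post-2"]:  # your posts
    db.execute("INSERT INTO snapshots VALUES (?, ?, ?)", (today, url, shares_for(url)))
db.commit()

# Today's likes = today's total minus yesterday's total
yesterday = (datetime.date.today() - datetime.timedelta(days=1)).isoformat()
for url, now_n, prev_n in db.execute(
        """SELECT a.url, a.shares, COALESCE(b.shares, 0)
           FROM snapshots a LEFT JOIN snapshots b
             ON a.url = b.url AND b.day = ?
           WHERE a.day = ?""", (yesterday, today)):
    print(url, now_n - prev_n)

Run it from a daily cron job and the difference column gives you the per-day figure you're after.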
You can use Facebook Insights to have statistics for your domain.
Cheers,
You need to put some code into your website, as described in the documentation, so you can only do it for domains under your administration, not for any domain you want.
Beyond that, you can only get the likes for a single URL, using, for example, this link
Total Likes shows only likes for the current page, not the whole website, as it says. Try it with different URLs.
If I have an e-commerce site, for instance, where I have a selection of products that each gets a 'Like' button, how would I go about ranking these products in order of their popularity?
What I do is store the number of likes locally in my own database.
You can grab the number of likes for a URL from the link_stat table:
http://developers.facebook.com/docs/reference/fql/link_stat/
I looked around a lot for a simpler solution, but it's best if you store the number of likes locally. You can update your local value when a user clicks the Like button, or you can do it periodically for all URLs if you don't have that many. I personally update the counts for all my URLs once every 24 hours.
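For the periodic refresh, the counts can be pulled in bulk from link_stat with a single FQL call per batch of URLs. A sketch of such a job, assuming the legacy /fql endpoint and a hypothetical products table keyed by URL (add an access_token parameter if the call requires one for you):

import json
import sqlite3
import urllib.parse
import urllib.request

db = sqlite3.connect("shop.db")  # assumed schema: products(url TEXT, like_count INTEGER)

product_urls = [url for (url,) in db.execute("SELECT url FROM products")]

# One FQL call covers many URLs at once
quoted = ",".join("'%s'" % u for u in product_urls)
fql = "SELECT url, like_count FROM link_stat WHERE url IN (%s)" % quoted
with urllib.request.urlopen(
        "https://graph.facebook.com/fql?q=" + urllib.parse.quote(fql)) as resp:
    rows = json.load(resp)["data"]

for row in rows:
    db.execute("UPDATE products SET like_count = ? WHERE url = ?",
               (row["like_count"], row["url"]))
db.commit()

# Ranking the products is then a plain local query
for url, likes in db.execute(
        "SELECT url, like_count FROM products ORDER BY like_count DESC LIMIT 10"):
    print(likes, url)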
-Roozbeh
You may be able to do it if you have all your products on one page using this tool:
http://www.pagesort.com
I suggest keeping a local reference count (in a database) and setting up Facebook real-time updates on those objects. Then every time somebody likes your URL (object), the Facebook service will post an update to your API. https://developers.facebook.com/docs/reference/api/realtime/
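The callback itself is a small web endpoint: Facebook first verifies it with a GET request carrying hub.challenge, then POSTs change notifications. A minimal Flask sketch of that protocol, assuming the objects can be subscribed to as suggested above, and assuming a hypothetical refresh_like_count() helper that re-fetches and stores the count (the notification only tells you something changed, not the new number):

from flask import Flask, request

app = Flask(__name__)
VERIFY_TOKEN = "my-secret-token"  # the token you register with the subscription

@app.route("/fb-realtime", methods=["GET", "POST"])
def fb_realtime():
    if request.method == "GET":
        # Subscription handshake: echo hub.challenge if the token matches
        if request.args.get("hub.verify_token") == VERIFY_TOKEN:
            return request.args.get("hub.challenge", "")
        return "bad token", 403
    # POST: a change notification; the payload lists the changed object IDs
    payload = request.get_json(force=True)
    for entry in payload.get("entry", []):
        refresh_like_count(entry.get("id"))  # hypothetical helper, see below
    return "ok"

def refresh_like_count(object_id):
    pass  # re-query the Graph API / link_stat here and update the database

if __name__ == "__main__":
    app.run(port=8080)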
I am developing a social media monitoring application. Currently, we are entering Facebook page ids into the application to collect data from potential customers' Facebook walls (so that we have a realistic sample to show the customer for direct promotion).
These page ids are used to collect wall postings and comments and to compute statistics (e.g. to show the most used words), which are presented to the user in a special view. The requirement is to collect all postings and comments, without exception, in near-live time. We currently have about 130 page ids in the system, with more to come.
Right now, I am using the Graph API for this, with several disadvantages:
FB API access is restricted to 600 requests per 10 minutes. To get a near-live view, I need to access the API at least every two hours. As we are also using API requests in other parts of the program, it is obvious that the limit will be hit sooner or later (actually, this already happens)
The responses are mostly redundant: to receive current comments, I have to request the full wall postings (comments come nested inside the postings) with the URL http://graph.facebook.com/NAME/feed...
The probability of hitting the limits depends on the number of postings on the various walls
I cannot get all comments with this method (e.g. comments on postings made some time ago)
I am currently trying out switching to FQL (or complementing Graph API usage with it) by querying the stream and comment tables, but this also has limitations:
I cannot restrict my query to a specific timespan, leading to redundancy again
The maximum number of posts I get for each of my 130 page ids is 61 (why 61?)
I need an unpredictable number of additional requests, because I have to fetch special objects like videos and links in separate requests
My question now is, if anyone is doing similar things: how did you solve these problems? How do you get a pseudo-live stream of a larger number (up to, say, 1,000) of walls?
Letting the customer grant extra permissions to us is currently not an option.
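One partial mitigation I am evaluating is the Graph API batch facility: up to 50 sub-requests can be sent in a single HTTP POST, which at least amortizes the connection overhead (whether each sub-request still counts against the 600/10-minute quota is something I have yet to verify). A sketch with placeholder page ids and token:

import json
import urllib.parse
import urllib.request

ACCESS_TOKEN = "APP_ACCESS_TOKEN"  # placeholder
page_ids = ["cocacola", "nike"]    # the ~130 page ids go here

feeds = {}
for i in range(0, len(page_ids), 50):          # max 50 sub-requests per batch
    batch = [{"method": "GET", "relative_url": "%s/feed" % pid}
             for pid in page_ids[i:i + 50]]
    body = urllib.parse.urlencode({
        "access_token": ACCESS_TOKEN,
        "batch": json.dumps(batch),
    }).encode()
    req = urllib.request.Request("https://graph.facebook.com/", data=body)  # POST
    with urllib.request.urlopen(req) as resp:
        # The response is a list of sub-responses in the same order as the batch
        for pid, sub in zip(page_ids[i:i + 50], json.load(resp)):
            if sub and sub.get("code") == 200:
                feeds[pid] = json.loads(sub["body"]).get("data", [])

print({pid: len(posts) for pid, posts in feeds.items()})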
You will probably have to meet with Facebook and work out a contractual deal for greater access to their data. I would bet that the answer will be no, no and no, seeing as it appears you are trying to monetize their data, and furthermore to do so without the explicit permission of the users, but hey, give it a shot.
I have a similar task. By default FB returns only the last ~50 posts, or everything from the last 30 days (whichever is smaller). In FQL you should use a created_time filter to receive more results. My current problem is that via FQL I receive no more than ~500 posts from any FB page wall, even with LIMIT increased:
'select post_id from stream where source_id = 40796308305 and created_time <'.time().' LIMIT 1000000 ;'
This FQL request against the Coca-Cola FB page currently returns only ~300 posts (less than 2 days of posts).
If you find a better solution, please advise :)
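What I am experimenting with now is walking backwards with created_time instead of trusting LIMIT: take the oldest created_time of each batch and feed it into the next query until nothing comes back. A sketch against the legacy /fql endpoint (the token is a placeholder; the page id is the Coca-Cola one from the query above):

import json
import time
import urllib.parse
import urllib.request

ACCESS_TOKEN = "USER_ACCESS_TOKEN"  # the stream table generally needs a token
SOURCE_ID = "40796308305"           # the Coca-Cola page from the example above

def fetch_batch(before):
    fql = ("SELECT post_id, created_time FROM stream "
           "WHERE source_id = %s AND created_time < %d LIMIT 100"
           % (SOURCE_ID, before))
    url = ("https://graph.facebook.com/fql?q=" + urllib.parse.quote(fql)
           + "&access_token=" + ACCESS_TOKEN)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["data"]

all_posts, cursor = [], int(time.time())
while True:
    batch = fetch_batch(cursor)
    if not batch:
        break
    all_posts.extend(batch)
    # Step the cursor back to just before the oldest post we have seen
    cursor = min(p["created_time"] for p in batch)

print(len(all_posts), "posts fetched")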