I am trying to create a ranking system for URLs that are fetched from RSS Feeds.
I tried to get FB likes or Twitter share counts for the URLs, but I don't see consistent results.
Instead of doing this myself, is there a third-party library that does this kind of calculation?
What are the best practices for finding share counts for URLs on Twitter and Facebook?
What inconsistency are you seeing?
You should be able to score them by matching each URL to, for example, a tweet and then scoring based on its retweet_count value. The important first step is deciding what you want your scoring to be based on.
If it's time sensitive, then look at the Reddit or Hacker News algorithms, which use the number of votes (in your case, shares and retweets) and the time since posting. That way newer tweets score higher, but older tweets with many retweets still remain high (reflecting sustained popularity).
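For illustration, here is a minimal sketch of that kind of time-decayed score in Python; the gravity value of 1.8 is the commonly cited Hacker News default, and the function and field names are my own, not from any library:

from datetime import datetime, timezone

def rank_score(share_count, published_at, gravity=1.8):
    # Hacker News style: the score decays as the item ages.
    # share_count could be retweets + FB shares for the URL;
    # published_at must be a timezone-aware datetime.
    age_hours = (datetime.now(timezone.utc) - published_at).total_seconds() / 3600
    return share_count / pow(age_hours + 2, gravity)

With this shape, a brand-new item with a handful of shares can briefly outrank an old item with many, which matches the behaviour described above.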
While figuring out how to use the Instagram Graph API for hashtag searches, I found that the rate limit makes hashtag search effectively impossible for my use case. Maybe I just don't know the best practices for hashtag search, so here is my use case:
A client wants an application to manage coupon/discount initiatives for his clients. Each of those clients has a store and wants to offer discounts to everyone posting on Instagram with a store-specific hashtag. My client wants to create/update/delete stores in the application and define the hashtag for each store. That's all done. Now I need to trigger hashtag searches for each store, and here's the problem:
There are 70 stores (at first; there may be more in the future). Each store has a unique hashtag and wants to know who posts something with it. When someone posts with a hashtag, the application should know within at most 20 seconds. This means triggering a hashtag search 70 times (because there are as many hashtags as there are stores) every 20 seconds. I know that once I have a hashtag's ID I can cache it and no longer need to search for the actual term, but I still need to fetch the newest media for that hashtag ID, as sketched below.
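For reference, the two-step flow looks roughly like this; a hedged sketch, where IG_USER_ID and ACCESS_TOKEN are placeholders and the Graph API version may differ:

import requests

GRAPH = "https://graph.facebook.com/v12.0"
IG_USER_ID = "YOUR_IG_USER_ID"      # placeholder
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder

def hashtag_id(tag):
    # One-time lookup; this is the call that counts against the
    # ~30 unique hashtags per user per week limit, so cache the result.
    r = requests.get(f"{GRAPH}/ig_hashtag_search",
                     params={"user_id": IG_USER_ID, "q": tag,
                             "access_token": ACCESS_TOKEN})
    return r.json()["data"][0]["id"]

def recent_media(tag_id):
    # Repeated poll: fetch the newest media for a cached hashtag ID.
    r = requests.get(f"{GRAPH}/{tag_id}/recent_media",
                     params={"user_id": IG_USER_ID,
                             "fields": "id,caption,permalink",
                             "access_token": ACCESS_TOKEN})
    return r.json().get("data", [])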
That is the first problem. The other is that I can only query 30 different unique hashtags per week (a Facebook API limitation), but I will need to query at least 70 different hashtags. And the stores don't want to create a Facebook account and an Instagram business page just so they can OAuth into my application for that.
So at the moment the approach is to create 70 fake pseudo-accounts (one per store) and use those accounts to avoid hitting the rate limit. But I don't think this is how Facebook wants the Graph API to be used.
In the post Facebook API rate limit increase, an answer from 2017 shows that it used to be possible to request a rate limit increase, and the linked picture even contains the sentence "This is a page management app with few users but many calls". But more recent comments show that this is no longer possible, and I couldn't find any rate limit increase option in the developer portal.
This answer, https://stackoverflow.com/a/39408611/6315447, even says there is no such option anymore and that if you hit the rate limit, you're doing something you are not supposed to. But how should I use the API then?
Am I missing something? What is the best practice for such a use case?
Thanks in advance.
We have spent several days looking into the FB Graph API and third-party tools for scraping FB data, but we can't figure out whether it is even possible to scrape what we are looking for, and whether it falls within FB's policies (we are really not looking to start a lawsuit with FB).
We need to obtain statistics on how often a specific question (read: a problem we will try to solve) is posted on Facebook. We need to get all FB posts filtered by three criteria:
Location - the country or city of the user who posted the post
Time - some reasonable period of time, for example a full month, week, or day
Keyword - a keyword that can be associated with the questions we are looking for
We would then take this data set and manually go over it to distinguish what is relevant to us and what is not. Maybe we would use a language-processing engine like wit.ai or api.ai to teach the app to recognize which posts are relevant and which are not. But that's on us, later.
So the question: is it possible (technically, and also from the standpoint of FB's policies), and what would be the steps to get FB posts filtered by the three criteria stated above?
I use FB Insights data every day when running analyses for my company. However, I have had some inconsistencies in the data and don't know whether they are caused by a misunderstanding of the meaning of the "deeper" metrics. I have searched everywhere and am hoping that someone can help me.
‘Key Metrics’ tab:
Why is my Daily Organic Reach less than my Daily Reach of Page Posts? Where does the count delta come from? What is not included in Daily Organic Reach that is included in Daily Reach of Page Posts?
Can you reach the same person both organically and virally? Why do Organic Reach + Viral Reach add up to more than Total Reach (when Paid Reach is 0)?
‘Daily Likes Sources’ tab:
What are the full definitions of each of the sources: profile_connect, mobile, api, recommended_pages, page_suggestions, timeline, external_connect, page_profile, hovercard, search, ticker, like_story?
Are mobile likes independent of the others?
Why would the Daily New Likes column in the Key Metrics tab not equal the sum of all the columns for the same day in the Daily Likes Sources tab?
‘Daily Viral Reach by Story Type’ tab:
What are the full definitions of each of the story types: fan, page post, user post, mention?
If we normally get 1-5 viral uniques from user post, and then one day get 1.5k, what is the likely source of this?
‘Daily Page Consumers by Consumption Type’ tab:
What are the full definitions of the consumption types: other click and link click?
Are photo views and video views included in other clicks?
Well, I have only recently started using Facebook Insights, so this is what I can share with you.
Q. Why is my Daily Organic Reach less than my Daily Reach of Page Posts?
A. As far as I understand, these are two very different measures. Your organic reach is the number of people who find you through searches (Facebook, Google, Yahoo, Bing, ...), while Daily Reach of Page Posts is the number of people visiting your page due to internal efforts, such as posting status updates.
Q. Can you reach the same person Organically and Virally?
A. Organic was explained above, while viral reach is an effect of other people sharing your posts, be they images, statuses, or videos. In general, if you have interesting updates, your viral reach will be higher than your organic reach. So basically yes: it just means that you are reaching a target audience using two different methods of marketing. (If you are achieving this, keep it up; the more platforms people see you on, the more comfortable they become with your brand.)
The rest of your question is very difficult for me to follow; could you please rewrite it in an easier point-per-question format?
1.1 I have explained above why the organic reach is less. If you want to increase your organic reach, you need to do some SEO (Search Engine Optimization); I really hope you know what that means.
I'm not quite sure what the count delta is, but if it refers to organic vs. page posts, then it would be the difference between them. I am not 100% sure which factors are included in each type of reach (sorry, but I suggest you do some more research on that).
1.2 To reiterate, these are two different forms of online marketing, so yes, you can reach the same person by doing both forms of marketing. Paid reach is affected by many things; first, it depends on whether you are doing PPC (pay per click) or CPM (pay per impression). Here you will need to run multiple campaigns, set your CTR targets, then compare the results and choose which is better. You need to keep repeating this to continuously optimize your spend per person entering your page.
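If you want to cross-check those reach buckets outside the Insights export, the Graph API exposes page-level metrics. The sketch below is a rough illustration, and the metric names are assumptions based on the Page Insights API of that era, so verify them before relying on this:

import requests

PAGE_ID = "YOUR_PAGE_ID"          # placeholder
ACCESS_TOKEN = "YOUR_PAGE_TOKEN"  # placeholder
# Assumed metric names: total, organic, viral and paid daily reach.
METRICS = ("page_impressions_unique,page_impressions_organic_unique,"
           "page_impressions_viral_unique,page_impressions_paid_unique")

r = requests.get(f"https://graph.facebook.com/{PAGE_ID}/insights",
                 params={"metric": METRICS, "period": "day",
                         "access_token": ACCESS_TOKEN})
for metric in r.json().get("data", []):
    print(metric["name"], metric["values"][-1]["value"])

Note that organic + viral can legitimately exceed total reach if each bucket counts unique users separately, since the same person can appear in more than one bucket but only once in the total.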
2.1 profile_connect - A like through a friend
mobile - A like from the mobile site (m.facebook.com)
api - This would be if you have the api on an external website of yours
recommended_pages - Liked by someone because a friend posted it to them, either in chat or in a PM
page_suggestions - I think this is when another page has liked you and refers people to you, but I'm not 100% sure
timeline - No clue; this is new to FB and I haven't had a chance to look at it yet ;)
external_connect - This would be from an external website, similar to the api, but just a URL link rather than the API (can be found on forums, etc.)
page_profile - ??
hovercard - ??
search - Facebook search
ticker - ??
like_story - Someone who came to your page because they saw a post of yours (true viral)
2.2 I highly doubt that mobile likes are independent of the others; I just don't think that Facebook has set up proper tracking for their mobile site.
2.3 I think that people removing their likes may cause this, but I have asked myself this question a number of times :/
3.1 OK, well, you need to understand that viral reach is when a friend of a fan sees the post.
fan - A friend of a fan acted on a post that they saw (e.g. John likes the top-racers fan page)
page post - You share your page, it lands on a fan's wall, and one of their friends acts on it
user post - You post to your page ...
mention - A fan mentions you in one of their posts (the strongest viral)
3.2 Well, this is a tough one because I don't know what you did, but I would suggest you log everything you do, and when this happens, look at what you did differently (sometimes viral really is viral).
4. I have never looked into these, sorry.
I hope this information helps you, but I really would suggest that you try to do some more research on the topic. Remember that Google tactics aren't necessarily going to work on Facebook, especially with paid advertising.
Good luck ;)
I'm trying to get statistics for likes on my domain. I would like to get all likes (if possible, with user IDs) for all pages on my domain (which has tens of thousands of pages).
What does domain_like_adds actually return?
SELECT metric, value FROM insights
WHERE object_id=[domain-id]
  AND metric='domain_like_adds'
  AND end_time=end_time_date('2011-01-03')
  AND period=period('month')
This returns a blank result. Does anyone know what data domain_like_adds returns?
I don't think there's any way you're going to get user IDs, as that would be a major privacy invasion, but I believe domain_like_adds indicates how many NEW likes your domain got in the given time period, as opposed to the cumulative likes your domain has earned up to that point. There doesn't appear to be a viable way to determine the number of likes of all objects in your domain for all time without tracking it from the beginning and/or going back and summing up historical data.
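If you do need a cumulative figure, one approach is to walk backwards month by month and sum the periodic values yourself. Here is a sketch using the legacy FQL endpoint and the same query as in the question; the endpoint has long been deprecated, and the IDs and dates are placeholders:

import requests

DOMAIN_ID = "YOUR_DOMAIN_ID"        # placeholder
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder

def monthly_like_adds(end_date):
    # Same insights query as above, one month at a time.
    fql = ("SELECT metric, value FROM insights "
           f"WHERE object_id={DOMAIN_ID} "
           "AND metric='domain_like_adds' "
           f"AND end_time=end_time_date('{end_date}') "
           "AND period=period('month')")
    r = requests.get("https://graph.facebook.com/fql",
                     params={"q": fql, "access_token": ACCESS_TOKEN})
    rows = r.json().get("data", [])
    return rows[0]["value"] if rows else 0

# Sum every month since tracking began for a cumulative total.
total = sum(monthly_like_adds(d)
            for d in ("2011-01-03", "2011-02-03", "2011-03-03"))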
You can make a sitemap.xml of your site and crawl the URLs against the Facebook Graph API. I actually made a Ruby script to do this: http://bobbelderbos.com/2012/01/ruby-script-facebook-like-stats-blog/. I don't think you can get the users who 'liked' your pages, but this script might be useful for finding out which URLs are most popular.
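The same idea in outline, in Python rather than Ruby; a hedged sketch, assuming the public Graph API endpoint that returned per-URL share counts at the time (the sitemap URL is a placeholder):

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Pull every <loc> entry out of the sitemap.
sitemap = ET.fromstring(requests.get(SITEMAP_URL).content)
urls = [loc.text for loc in sitemap.findall(".//sm:loc", NS)]

for url in urls:
    # The public endpoint returned aggregate counts per URL.
    data = requests.get("https://graph.facebook.com/",
                        params={"id": url}).json()
    print(url, data.get("shares", 0))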
I am developing a social media monitoring application. Currently, we enter Facebook page IDs into the application to collect data from prospective customers' Facebook walls (so we have a realistic sample for the customer for direct promotion).
These page IDs are used to collect wall postings and comments and to compute statistics (e.g. the most-used words), which are presented to the user in a special view. The requirements are to collect all postings and comments, without exception, in near-live time. We currently have about 130 page IDs in the system, with more to come.
Right now, I am using the Graph API for this, with several disadvantages:
FB API access is limited to 600 requests per 10 minutes. To keep a near-live view, I need to poll the API at least every two hours. As we use API requests in other parts of the program too, it is obvious that the limit will be hit sooner or later (actually, this already happens).
The responses are mostly redundant: to receive current comments, I have to request the wall postings (comments are embedded in postings) via the URL http://graph.facebook.com/NAME/feed...
The probability of hitting the limits depends on the number of postings on the various walls.
I cannot get all comments with this method (e.g. comments on postings from some time ago).
I am currently experimenting with switching to FQL (or complementing the Graph API usage with it) by querying the stream and comment tables, but this also has limitations:
I cannot restrict my query to a specific timespan, which again leads to redundancy.
The maximum number of posts I get for each of my 130 page IDs is 61 (why 61?).
I need an unpredictable number of additional requests, because I have to fetch special objects like videos and links in separate requests.
My question now is, if anyone is doing similar things: how did you solve these problems? How do you get a pseudo-live stream of a larger number (up to, say, 1,000) of walls?
Letting the customer grant extra permissions to us is currently not an option.
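One way to cut the redundancy described above is to restrict each poll to the window since the last one; the Graph API feed edge accepts since/until timestamps. A rough sketch, where the token and page list are placeholders:

import time
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder
PAGE_IDS = ["pageid1", "pageid2"]   # placeholders for the ~130 pages

def fetch_since(page_id, since_ts):
    # Only posts (with their embedded comments) newer than since_ts.
    r = requests.get(f"https://graph.facebook.com/{page_id}/feed",
                     params={"since": since_ts,
                             "access_token": ACCESS_TOKEN})
    return r.json().get("data", [])

last_poll = int(time.time()) - 7200  # e.g. the two-hour window above
for page_id in PAGE_IDS:
    for post in fetch_since(page_id, last_poll):
        print(post.get("id"), post.get("message", "")[:60])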
You will probably have to meet with Facebook and work out a contractual deal for greater access to their data. I would bet that the answer will be no, no, and no, seeing as it appears you are trying to monetize their data and, furthermore, to do so without the explicit permission of the users; but hey, give it a shot.
I have a similar task. By default, FB returns only the last ~50 posts, or everything from the last 30 days (whichever is smaller); in FQL you should use a created_time filter to receive more results. My current problem is that via FQL I receive no more than ~500 posts from any FB page wall, even when LIMIT is increased:
'SELECT post_id FROM stream WHERE source_id = 40796308305 AND created_time < ' . time() . ' LIMIT 1000000;'
Right now this FQL request against the Coca-Cola FB page returns only ~300 posts (less than two days' worth).
If you find a better solution, please advise :)
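A possible workaround for the ~500-post cap is to page backwards, shrinking the created_time upper bound to the oldest post seen so far. A sketch, again via the legacy (now deprecated) FQL endpoint, with the token as a placeholder:

import time
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder
SOURCE_ID = 40796308305             # the Coca-Cola page from above

def fetch_older(before_ts):
    # One window of up to 500 posts strictly older than before_ts.
    fql = ("SELECT post_id, created_time FROM stream "
           f"WHERE source_id = {SOURCE_ID} "
           f"AND created_time < {before_ts} LIMIT 500")
    r = requests.get("https://graph.facebook.com/fql",
                     params={"q": fql, "access_token": ACCESS_TOKEN})
    return r.json().get("data", [])

cursor, posts = int(time.time()), []
while True:
    batch = fetch_older(cursor)
    if not batch:
        break
    posts.extend(batch)
    cursor = min(p["created_time"] for p in batch)  # page backwards
print(len(posts), "posts collected")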