I was wondering if anyone knows how I could access the Facebook API from within R. I would like to access the text in my wall posts and do some very basic analysis. The problem is that, aside from the fact that my knowledge of text mining is very basic, my knowledge of how to use a web API (if that is even the correct concept) is non-existent.
Assuming that I have an API key, can someone provide me with a very basic example/code to demonstrate how I could make the connection from R to Facebook and download some data? I imagine I will need one or more R packages, such as RCurl or rjson...
My main focus right now is to learn/improve my text mining skills in R, so I don't want to get lost in, or distracted by, the basic programming needed to access the Facebook API.
Finally, I read in a comment on a related question,
Update Facebook status using R?
that
"...it's not like you can pull large amounts of data from Facebook to do data analysis...".
Can anyone elaborate on that?
Thanks!
You can use the Rfacebook package to access the Facebook API; its documentation is at http://cran.r-project.org/web/packages/Rfacebook/Rfacebook.pdf. On top of this, you can use more advanced text mining packages within R to mine the feeds.
More comprehensive help is here: http://pablobarbera.com/blog/archives/3.html
An example of how to use it is here: http://thinktostart.wordpress.com/2013/11/19/analyzing-facebook-with-r/
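For a flavor of what that looks like, here is a minimal sketch of the Rfacebook workflow (the app id, app secret, and page name below are placeholders; you get the credentials by registering a Facebook app):

    # Minimal Rfacebook sketch: authenticate, then pull posts from a public page
    library(Rfacebook)

    # fbOAuth() opens a browser window to complete the OAuth handshake
    token <- fbOAuth(app_id = "YOUR_APP_ID", app_secret = "YOUR_APP_SECRET")

    # Fetch the 100 most recent posts from a public page (placeholder name)
    page <- getPage("humansofnewyork", token = token, n = 100)

    # The post text lives in the 'message' column, ready for text mining
    head(page$message)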
Why even worry about using the API at all? You can use a feature in Facebook to download all your data; it comes as a zipped file with HTML as the main data store. From there, you can grep and mine to your heart's content, and you will learn much more about R than by jumping in headfirst with APIs.
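For instance, a rough sketch of that approach, assuming the archive has been unzipped into a local folder (the path and the crude tag-stripping regex below are placeholders, not a robust HTML parser):

    # Read every HTML file in the unzipped Facebook archive (placeholder path)
    files <- list.files("facebook-archive/html", pattern = "\\.html$",
                        full.names = TRUE)
    pages <- lapply(files, readLines, warn = FALSE)

    # Crude text extraction: strip tags, then mine the result with tm etc.
    text <- gsub("<[^>]+>", " ", unlist(pages))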
With the Facebook Graph API, you can get Facebook data for text mining.
You cannot search for posts using the Facebook search bar, but the Graph API supports searching for the following types of objects:
All public posts: https://graph.facebook.com/search?q=watermelon&type=post
People: https://graph.facebook.com/search?q=mark&type=user
Pages: https://graph.facebook.com/search?q=platform&type=page
Events: https://graph.facebook.com/search?q=conference&type=event
Groups: https://graph.facebook.com/search?q=programming&type=group
Places: https://graph.facebook.com/search?q=coffee&type=place
Checkins: https://graph.facebook.com/search?type=checkin
Objects with location
I would highly recommend JavaScript, PHP, or one of the other languages mentioned in this document for getting the data, and then using R for the mining, since R has great text mining tools.
If you still prefer to go with R for the data collection as well, you can connect to Facebook using RCurl and rjson, fetch the pages with the getURL() function, and write a small parser to extract the data using simple regexes.
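A minimal sketch of that approach (the access token is a placeholder, and the extraction step assumes the JSON layout returned by the search endpoint above, where posts sit in a 'data' array with a 'message' field):

    library(RCurl)
    library(rjson)

    # Fetch one of the search URLs above; a token may be required
    url <- "https://graph.facebook.com/search?q=watermelon&type=post&access_token=YOUR_TOKEN"
    raw <- getURL(url, ssl.verifypeer = FALSE)

    # Parse the JSON response into an R list
    parsed <- fromJSON(raw)

    # Pull out the post text; posts without a 'message' field are dropped
    messages <- unlist(lapply(parsed$data, function(p) p$message))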
Source: Did the same thing during an internship
If you want easy access, you can also consider using Twitter: the twitteR package provides easy access to public accounts! Plus, there are some hands-on text mining applications for Twitter online; see, for example: http://jeffreybreen.wordpress.com/2011/07/04/twitter-text-mining-r-slides/
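A minimal sketch with twitteR (the OAuth credentials are placeholders obtained by registering a Twitter app):

    library(twitteR)

    # One-time OAuth setup with your app's credentials
    setup_twitter_oauth(consumer_key = "KEY", consumer_secret = "SECRET",
                        access_token = "TOKEN", access_secret = "TOKEN_SECRET")

    # Search recent public tweets and flatten them into a data frame
    tweets <- searchTwitter("text mining", n = 100)
    texts  <- twListToDF(tweets)$text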
Related
To start: I'm completely new to working with APIs, so please bear with me.
My first question is about getting access to the Walmart API. I see the example code to generate a timestamp and signature. How do I run this file? I've looked at YouTube videos, the Walmart tutorial, and other posts in this forum, and am still a little stuck.
Second, I'm guessing this file needs to be included in the actual application to continue to be able to access the products?
Third, my goal is to map only a subset of the product catalog for users of the app to view. Let's use 'soda' as an example. Is it the Taxonomy API I need to use? And how do I limit the available products a user can search?
Note: This will be implemented in a Flutter application, if it makes any difference.
I need to show Telegram channel posts on a website, but I don't know how to export a Telegram channel to XML. I need both the text and the images, and also other files and media, like MP4, PDF, and so on.
Is there any way to do that?
In three steps:
First, you need to create a bot with @BotFather, then add the bot to the channel. (There is no need to make the bot an admin.)
Second, use a programming language to write a program that receives messages from the channel and sends them to your server.
Third, you must provide a way in the site's back-end to receive the posts that your program sends.
For the second step, I suggest you use Python; there are several modules that can deal with bots. I think that in your case, telepot may be the simplest module that does everything you need.
For the third step, you will need to add more details about your site's back-end. In any case, I suggest writing a RESTful API for the back-end and sending the posts to the site with Python's requests module.
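Since the Bot API is plain HTTPS, the receive-and-forward loop in steps two and three works from any language. Here is a rough sketch of the same flow using R's httr package (the bot token is a placeholder, and the back-end URL is a hypothetical endpoint you would implement yourself):

    library(httr)

    token   <- "YOUR_BOT_TOKEN"
    backend <- "https://your-site.example/api/posts"  # hypothetical endpoint
    api <- function(m) paste0("https://api.telegram.org/bot", token, "/", m)

    offset <- 0
    repeat {
      # Long-poll Telegram for new updates
      resp    <- GET(api("getUpdates"), query = list(offset = offset, timeout = 30))
      updates <- content(resp)$result
      for (u in updates) {
        offset <- u$update_id + 1
        if (!is.null(u$channel_post)) {
          # Forward each channel post to the site's back-end as JSON
          POST(backend, body = u$channel_post, encode = "json")
        }
      }
    }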
You need to use the Telegram API to access the content of a channel.
The Telegram API is fairly complicated. There are clients in different languages that make it easier to interact with it.
I personally worked with Telethon, and it's relatively simple to get it working. If you follow the directions on the home page, there is also an interactive client you can play around with to familiarize yourself with how it works.
If you are familiar with other languages, there are clients for those as well. If you prefer a specific language, please comment.
Call me crazy, but I'm looking for a service that will deliver autocomplete functionality similar to Google, Twitter, etc. After searching around for 20 minutes, I thought I'd ask the geniuses here. Ideas?
I don't mind paying, but it would be great if it were free. Also, is there a top-notch NLP service that I can submit strings to and get back states, cities, currencies, company names, establishments, etc.? Basically, I need to take unstructured data (a generic search string) and pull out key information with relevant metadata.
Big challenge, I know.
Sharing solutions I found after further research.
https://github.com/haochi/jquery.googleSuggest
http://shreyaschand.com/blog/2013/01/03/google-autocomplete-api/
If you don't want to implement it yourself, you can use a service called 'Autocomplete as a Service', which is written specifically for this purpose. You can access it here: www.aaas.io.
You can add metadata to each record, and it returns the metadata along with the matching results. Do check out the demo on the home page. It has a very simple API written specifically for autocomplete search.
It supports large datasets, and you can apply filters while searching as well.
Its usage is simple: add your data and use the API URL as the autocomplete data source.
Disclaimer: I am the founder. I will be happy to provide this service to you.
There seems to be no available documentation for the Objective-C client for the Google Data API. Google's API help webpage only has options for .NET, Java, Python, and the HTTP Protocol. I want to access data from a spreadsheet on my Google Docs account, and then add new data.
I have added the correct source files to my project as outlined in the GData wiki, and am now completely lost. There are a ton of classes to sort through for Spreadsheets, there are very few comments, and I can't really tell from the method names what does what.
If possible, can someone post a couple snippets of code to first access the available documents, then pick one of the choices, and then add information to a cell (like A1)?
Thank you in advance for your consideration!
Did you look at the spreadsheet sample?
http://code.google.com/p/gdata-objectivec-client/wiki/GDataObjCIntroduction
A year or so later, a page for just this: Google Data APIs Objective-C Client Library
Generally, to create a new entry, you'll use an HTTP POST to the feed's postLink. The Obj-C library's service class provides POST operations via the method fetchEntryByInsertingEntry:
I believe creating a new spreadsheet still requires uploading a new document, as mentioned in the docs. You could create a simple CSV text file and upload that as a spreadsheet.
Note that there is a discussion group for users of the library.
I have an internal tool written in Java. It would be useful to get a little feedback on how much it is used by colleagues.
A simple solution would be to have the application display an image which it fetches from a web-hit-counter-like application, and just look at how often the image is accessed.
So what I am looking for is a stand-alone application (i.e., no Apache modules, CGI scripts, etc.) which serves one or a couple of static images and can log accesses, preferably with as little support for everything else as possible.
Searching for "hit counter" turned up little of relevance; "lightweight HTTP server" was more relevant, although still mostly overkill. Any suggestions?
You could try using Google Analytics. Most of the time, people using Google Analytics are tracking pageviews on a web page, and Google provides some JavaScript that you can place on your page to track visits to it, as well as browser capabilities, etc. Behind the scenes, that JavaScript is placing an image tag on the page in the manner you describe.
However, since your application is Java and not a web app (I assume it's standalone and not an applet), you won't be able to include Google's JavaScript (unless you embed a JavaScript interpreter... yick). Fortunately, it is possible to use Google Analytics without JavaScript.
The trick is that Google's scripts request the image http://www.google-analytics.com/__utm.gif and pass parameters via the query string. You can find a list of the parameters you can pass in the query string here. So all you'd have to do is figure out what the query string should be and have your client make the request to Google's image (after setting up your Google Analytics account, of course).
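For illustration, here is a rough sketch of hand-building that request (shown in R; from Java the same GET works through HttpURLConnection). The parameter names come from the query-string list linked above, and every value below is a placeholder:

    library(httr)

    # Fire a fake "pageview" at the __utm.gif tracking pixel
    GET("http://www.google-analytics.com/__utm.gif",
        query = list(
          utmwv = "5.3.8",               # tracker version
          utmn  = sample(1e9, 1),        # random request id (cache buster)
          utmhn = "mytool.example.com",  # hypothetical host name
          utmp  = "/app/startup",        # hypothetical "page" to count
          utmac = "UA-XXXXXXX-1",        # your Analytics account id
          utmcc = "__utma=999.999.999.999.999.1;"  # minimal cookie blob
        ))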
Just use Google Analytics; it's really easy and only requires a short script on your pages.
Michal Kebrt's simple UNIX HTTP server does exactly what I was looking for.