Import feed to Google Spreadsheet - Facebook

Trying, for funsies, to make Google Spreadsheet read comments from Facebook. I have discovered that Facebook actually has an API for this, and I've used it to build code that exports the data in the format I find most convenient. Now I just need to make Google Spreadsheet somehow read it. My first thought was to use the different import functions they have (IMPORTXML, IMPORTDATA, IMPORTFEED, IMPORTHTML and IMPORTRANGE). IMPORTRANGE is a quite obvious no-go in my book, and so far I've had no success with the other ones. Am I still too far off importing it, or is there hope? If I get the code as cURL I can just use a (really) long string in my web browser and get something I can surely extract data from, though I would only be able to do it locally; the goal is for this to work globally, so to speak.

Solved it using "How to import JSON data into Google Spreadsheets in less than 5 minutes": it instructed me to add a custom script to my sheet, paste in the script (including the generated text), and then simply use the new ImportJSON formula. Worked like a treat :D Do I just delete this post or simply keep it as is?.. Still kinda fresh to Stack Overflow :P
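For anyone landing here later: the ImportJSON approach boils down to a custom Apps Script function that fetches the feed and returns a 2D array to the sheet. Here is a minimal sketch of the idea (not the full ImportJSON script; the function name and URL are placeholders, and it assumes the feed returns a flat JSON array of objects):

function IMPORTJSONSIMPLE(url) {
  // Fetch the feed and parse it; assumes a flat array of objects.
  var items = JSON.parse(UrlFetchApp.fetch(url).getContentText());
  var headers = Object.keys(items[0]);
  var rows = items.map(function(item) {
    return headers.map(function(key) { return item[key]; });
  });
  // Returning a 2D array makes the result spill into the sheet.
  return [headers].concat(rows);
}

Usage in a cell: =IMPORTJSONSIMPLE("https://example.com/feed.json")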

Related

Accessing XHR within Google Sheets

I've been using import functions a lot in my sports modeling, but I've never been able to figure out how to get past the issue of pulling information that is dynamically imported from another source.
For example, I'm trying to use IMPORTXML to pull the money-line values from this link: https://www.sportsbookreview.com/betting-odds/nfl-football/money-line/
I can get the information in the left columns up until "PINNACLE", and after research I now understand I can't get the rest of the information because it's not static on the page and I need to go to the source... how do I find the source of this information so I can pull it from there?
I tried inspecting the page, clicking on "network", clicking on "XHR", refreshing the page and previewing the results, but nothing seemed to match.
Am I looking in the wrong place?
The page uses websockets to download the data, so I don't think you could simulate that in Google Sheets using formulas (maybe it would be possible in a script). However, in this particular case there is a 'classic view' variant of the page which includes all the data in its source:
https://classic.sportsbookreview.com/betting-odds/nfl-football/money-line/
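Against the classic page, an IMPORTXML call along these lines might then work (hedged: the XPath here is a guess and would need to be matched to the page's actual table markup):

=IMPORTXML("https://classic.sportsbookreview.com/betting-odds/nfl-football/money-line/", "//table//tr")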

Google Docs API - Spreadsheets which are Form Responses

So, I'm using the Google Docs API to extract spreadsheets programmatically.
However, I'm only interested in spreadsheets which are live feeds from form responses (survey responses). Is there a way of detecting this? So far the only approach I have is that the spreadsheet has a 'Form Responses 1' tab, which is quite a long way into the process; ideally I'd like to know before I download the file.
I'd be overjoyed by solutions which start with the form and tell you the relevant spreadsheet; ones which verify if a spreadsheet is a form's output are okay; knowing it's impossible would be helpful.
It is possible to get a list of the worksheets by using the Spreadsheets API.
Here is a link to the relevant documentation.
https://developers.google.com/google-apps/spreadsheets/#retrieving_information_about_worksheets
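A minimal Apps Script sketch of the 'Form Responses 1' heuristic mentioned in the question (hedged: that name is only Google's default for a form-linked sheet and can be renamed, so this is a heuristic, not a guarantee):

function looksLikeFormOutput(spreadsheetId) {
  var sheets = SpreadsheetApp.openById(spreadsheetId).getSheets();
  for (var i = 0; i < sheets.length; i++) {
    // 'Form Responses 1' is the default name for a form-linked sheet.
    if (sheets[i].getName() === 'Form Responses 1') return true;
  }
  return false;
}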
The only way I know this is possible is by using Apps Script, Google's scripting language based on JavaScript.
You can use Form.getDestinationId() to find the target spreadsheet, as in this example:
function myFunction() {
  var form = FormApp.openById('1af5Ur_7IPHOXxdyqVXFF4tFA4WHx6PLf9uL56iPLMgI');
  // Logs the ID of the spreadsheet that collects this form's responses.
  Logger.log(form.getDestinationId());
}
More info about Apps Script's Form capabilities here.
Note that Apps Script has very limited integration capabilities, so if you're trying to build an integrated webapp it might get complicated (but not impossible).

How can I program a button on an Access form to link to a browser window that looks up multiple addresses on Google Maps?

My problem is very similar to the one posted here:
http://www.utteraccess.com/forum/Plotting-Addresses-Maps-t1968130.html
except that thread never found any solutions. Basically, I'm working on an Access form that has a datasheet as a subform. Upon clicking a button on the main form, I want a browser window to open and, using the address columns from the spreadsheet data in the subform, plot all the addresses listed as markers. I've looked up a lot of ways to attempt this, but I've yet to find one that works.
I'm not even sure it's possible to plot multiple markers on Google Maps; according to my research (and after trying it myself) it seems like it isn't, although I don't want to rule it out entirely because I'm still not 100% sure. However, I know both Google Earth and batchgeo.com do allow this. I still want to try to do this on Google Maps, but if that doesn't work I'll try batchgeo.com, and if that still doesn't work, Google Earth (I don't want to make the user download external software if possible).
If it helps: from what I've read, APIs seem like a useful tool, though I'm not sure how to apply one to an Access form; they seem more like a way to embed maps into already existing websites.
I'd really appreciate if someone could help me figure out how to approach this problem!
Maybe this would help?
http://ramblings.mcpher.com/Home/excelquirks/getmaps/mapmarkers
It is Excel but should be translatable.
Here is another example, this time using Access:
http://www.utteraccess.com/forum/Google-Maps-Multiple-Mar-t1973499.html
...from what I've read, APIs seem like a useful tool, though I'm not sure how to apply one to an Access form; they seem more like a way to embed maps into already existing websites.
You're right. There's no way that I'm aware of to embed a Google Maps object in an Access form (like an ActiveX control). Microsoft MapPoint is a software product that lets you do map integration by way of an ActiveX control (no need for HTML and/or JavaScript).
What I usually do on a project like the one you're working on is get the HTML page working the way I want it to, outside of and independent from MS Access. You should be able to write and test the HTML file locally without an actual web server: just use something like Notepad++ or Sublime Text 2 to write your HTML and JavaScript, then open the file in your browser to see if it works. I'm quite sure you'll need JavaScript in your HTML page to make this work; that's what the Google Maps API is all about.
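A rough sketch of the kind of page you would generate (hedged: YOUR_API_KEY is a placeholder, the second address is made up, and each street address has to be geocoded before a marker can be placed):

<!DOCTYPE html>
<html>
<body>
<div id="map" style="width: 100%; height: 600px;"></div>
<script>
// Access would write the real address list into this array.
var addresses = [
  '1001 Fannin St, Houston, TX',
  '123 Example Ave, Austin, TX'
];
function initMap() {
  var map = new google.maps.Map(document.getElementById('map'), {
    center: { lat: 31.0, lng: -99.0 }, // rough starting view; adjust as needed
    zoom: 6
  });
  var geocoder = new google.maps.Geocoder();
  addresses.forEach(function(address) {
    // Geocode each address and drop a marker at the result.
    geocoder.geocode({ address: address }, function(results, status) {
      if (status === 'OK') {
        new google.maps.Marker({ map: map, position: results[0].geometry.location });
      }
    });
  });
}
</script>
<script async src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&callback=initMap"></script>
</body>
</html>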
After you have your web page working, you will have to go into Access and write code that creates that web page on the fly with the address data for the current data set. You can just write it out to the Windows Temp folder and then point your browser (or browser control) at that web page.
Julian Knight's answer links to more specifics on how to create the HTML page on the fly. It looks like gobbledygook, mostly because it is: outputting HTML/JavaScript/CSS from VBA is far less than optimal. This is why you troubleshoot it outside of Access as much as you can.

XML Parsing of Multiline tags

I am using XML parsing in one of my apps. I have not done this before, but I noticed something weird (or maybe not) today. I am trying to locate a business in my app, and its Google Maps link is obviously long:
http://maps.google.com/maps?f=q&source=s_q&hl=en&geocode=&q=1001+Fannin+Street,+Houston,+TX&aq=0&sll=37.0625,-95.677068&sspn=36.999937,73.476563&ie=UTF8&hq=&hnear=1001+Fannin+St,+Houston,+Harris,+Texas+77002&z=16
Every time I load using this link the app crashes. However if I change the link to something like :
http://maps.google.com/?saddr=0000+FM+0000+RD+Houston+TX+77000
the app loads and works perfectly.
I know this is not a problem in my app, as I am only using the link as a reference to something else and not loading the address in the Google Maps app (that works even with the long link). So I am concluding that there is something wrong with the way I am writing it in my XML.
Please do not direct me to tools that shorten the link and such, as I don't want to get into that. I am sure I am messing up some basics, so I'd appreciate it if someone could explain the basics behind this.
Thank you,
Well, if the top link is in XML and hasn't had its ampersands escaped, you won't have well-formed XML.
& should be escaped as &amp;
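For example, inside an XML document the long URL above would have to be written with every ampersand escaped:

http://maps.google.com/maps?f=q&amp;source=s_q&amp;hl=en&amp;geocode=&amp;q=1001+Fannin+Street,+Houston,+TX&amp;aq=0&amp;sll=37.0625,-95.677068&amp;sspn=36.999937,73.476563&amp;ie=UTF8&amp;hq=&amp;hnear=1001+Fannin+St,+Houston,+Harris,+Texas+77002&amp;z=16

The short URL works because it happens to contain no ampersands at all.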

What are some good ways of keeping content from being copied to other sites

I understand that no matter what I do, someone will be able to copy it. However, I can still make them work hard for it. What are some good ways of making data not easily copied, using PHP-compatible code?
---- Added ----
The data is a listing of results for certain local sports events. We send people out to collect the information, post the information, make corrections and such. However, a competing website takes our results (I know they are directly copying them) and never updates them, which causes people to call our office and complain.
---- Answer for my Use ----
I picked one of them; however, I am going to use several of your answers. I am going to add my link using the copy-pasta trick. I am going to put fake hidden text into it. I am also going to do the fake-hidden-text trick with different fake versions of the div tag (making it even harder to scrape, or to copy into a text editor and do an easy find-and-replace), and I am going to talk to a lawyer as well about legal recourse and what I can do to make it illegal for them to copy the data (such as creative bios or something cool like that). Thanks for your help.
Joe, you can't really make them work really hard to get your data. It's essentially just a single request to any of your pages. Your best option is to explicitly state that you own the rights to all of your content, and that any infringement on that ownership will lead to legal ramifications*.
* Not a lawyer
Your data will be copied to every computer that requests the page and it will stay there until the person clears their cache. To answer your question, you can't.
What you can do is create a CSS style such as:
.copy-pasta { display: none; }
And then throughout your content, add something like this:
<p class="copy-pasta">Content provided via [your website here]</p>
This will increase your page rank when copy-pasters blatantly steal your content, meaning you will show up first in search results.
Place some <div style="display: inline; position: absolute; overflow: hidden; width: 0px">useless words</div> in the text. It won't display for reading, but if someone copies and pastes... "WOW, where did that come from?! WTF!! *CRY*"
How about putting links to your site in with the displayed data? No big fanfare, just suggest that for the most up-to-date figures, they can go to the real website that publishes them.
Most of what you try will only work for a time. Until you exceed their laziness factor. (What they're doing suggests a high laziness factor.)
Laws don't protect publicly available data, but you may be able to protect the packaging and presentation.
Programs used to copy out data look for the data using pattern-matching. You could 'decorate' your data with randomly-chosen tags (like one row would have a span tag surrounding it, the next row a div, etc...). Just a thought.
Clarification:
With screen-scraper at least, the user of the program specifies what HTML comes before the data they want, and what HTML comes after it. You can make it more difficult for them to automatically retrieve the data.
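A tiny sketch of that decoration idea (JavaScript here purely for illustration; a PHP version would be analogous, and the tag list is arbitrary):

var tags = ['span', 'div', 'p', 'em'];
function decorateRow(text) {
  // Wrap each row in a randomly chosen tag so scrapers can't
  // anchor on fixed before/after markup.
  var tag = tags[Math.floor(Math.random() * tags.length)];
  return '<' + tag + ' class="row">' + text + '</' + tag + '>';
}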
Why are people calling your office to complain if the data is on a competing website? If they have a domain name that is similar enough to yours that people are confusing the two of you or if they've put something on their site that makes it look like you've endorsed them, then you've got them for trademark infringement.
Disabling the context menu is a start.
$(document).bind('contextmenu', function(e) {
  // Block the right-click menu (a speed bump at best; easily bypassed).
  return false;
});
Or
<body oncontextmenu="return false;">
Preventing people from getting the data is almost impossible. You can mess up your tags and make the code really dirty and hard to parse... but it's not really enough. You could also generate a big image with the data in it, which would be painful to parse! ... but you don't want to do that.
Because you said...
However, a competing website takes our results (I know they are directly copying them) and never updates them, which causes people to call our office and complain.
... my call would be to take this the other way and create an API allowing people to get your content in a way that YOU designed.
Also if they are just shamelessly stealing your data and they don't have the right to do it, consider a legal option.
Another option is to use PHP code to generate images from the site's HTML. You would use the images to display the content, instead of HTML which can be easily copied out. Example code is here, and I bet you could find more code to do this by Googling:
http://www.acasystems.com/en/web-thumb-activex/faq-php-convert-html-to-image.htm
Try Copyscape. It won't prevent your content from being copied, but it will make finding the copies very easy.
You may encrypt the data on the page and have an obfuscated JavaScript decoding routine decode it for your viewers. You may switch keys and encryption algorithms from time to time. The same JavaScript should disable the ability to select and/or copy text, to prevent manual copy-pasting.
They won't be able to copy manually, and their scraper would have to be able to run JavaScript to get the data.
The caveat is that the data won't be visible to Google, but if the data is mostly numeric that might not be such a big loss.
If they scrape automatically and very often, you may also try to pinpoint their IP by watching for the most active IPs on your site, and serve them fake data.
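A bare-bones sketch of the decode-on-load idea (hedged: Base64 via atob() stands in for real encryption here, and the .enc class name is made up):

// Assumes markup like <span class="enc">NDItMTc=</span> ("42-17", Base64-encoded).
document.addEventListener('DOMContentLoaded', function() {
  var nodes = document.querySelectorAll('.enc');
  for (var i = 0; i < nodes.length; i++) {
    nodes[i].textContent = atob(nodes[i].textContent); // decode for display
  }
});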
Please don't use lawyers; that's hitting below the belt.
Use SWF (Flash) to display your data, just like some online book readers do.