I'm looking for an online API service that reports the download speed of specific pages on different websites. The result should give me the load time in milliseconds (e.g. 3400 ms).
I first looked at Google's PageSpeed service, but it only gives me a 'score', which is useless for my purposes. I'm unable to find any other service that fits my needs.
If possible, I'm looking for a REST service with a generous request limit (at least 2,500/day).
You can use http://www.webpagetest.org, which has a RESTful API as well: https://sites.google.com/a/webpagetest.org/docs/advanced-features/webpagetest-restful-apis
You may need to run your own private instance of WebPageTest in order to use the RESTful API freely (the public one requires an API key).
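For reference, here is a rough sketch of that flow in JavaScript (it assumes Node 18+ for the global fetch; the endpoints and response fields follow the docs linked above, so double-check them against the current API):

// Kick off a WebPageTest run, poll until it finishes, and return load time in ms.
async function getLoadTime(url, apiKey) {
  // Submit the test; f=json asks for a JSON response containing the result URLs.
  const submit = await (await fetch(
      'https://www.webpagetest.org/runtest.php?url=' + encodeURIComponent(url) +
      '&k=' + apiKey + '&f=json')).json();
  while (true) {
    const result = await (await fetch(submit.data.jsonUrl)).json();
    if (result.statusCode === 200) {
      // loadTime is reported in milliseconds, e.g. 3400.
      return result.data.average.firstView.loadTime;
    }
    await new Promise(r => setTimeout(r, 10000)); // wait 10s between polls
  }
}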
I am going to be brief here.
You can call the PageSpeed Insights API by Google.
Is it possible to do this, ideally returning the report in the very same page with AJAX?
For example, the user enters www.mywebsite.com into the field and the PageSpeed report is returned. If that's not possible, then redirect to the PageSpeed result page.
You have a few options here, starting from easiest to hardest (and, in my opinion, "worst" to "best" solution).
Add the PageSpeed Insights (PSI) test page to an iframe on your site. You can then change the URL of that iframe to https://developers.google.com/speed/pagespeed/insights/?url=yourwebsite.com and manipulate the ?url=yourwebsite.com part to be whatever you want.
This may be against Google's terms of service and is also a bad user experience, but it is the easiest way to achieve it. I will leave you to investigate that option if you decide to do it.
Redirect users to a new tab. So just do <a target="_blank" href="https://developers.google.com/speed/pagespeed/insights/?url=yourwebsite.com">view your report</a> or redirect via JS on a button click.
Yet again, not a great option as people are leaving your site, but at least this one won't be against Google's terms of service.
Use the PageSpeed Insights API: https://developers.google.com/speed/docs/insights/v5/get-started.
This is your best option in terms of time vs flexibility. You supply the API with the URL and it returns a JSON response with all of the metrics it gathers and the scoring.
Please note PSI is now on version 6 of the API, which should be available for general use soon.
Obviously this is a lot more work but well worth the effort as you can style everything as you please.
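For example, a minimal sketch of calling the v5 API (YOUR_API_KEY is a placeholder, and you should verify the audit names against the current response format):

// Fetch a PSI report and pull out one timing metric in milliseconds plus the score.
async function getPageSpeedReport(url) {
  const endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
      '?url=' + encodeURIComponent(url) + '&key=YOUR_API_KEY';
  const data = await (await fetch(endpoint)).json();
  // Lighthouse audits carry their timings in milliseconds.
  const fcpMs = data.lighthouseResult.audits['first-contentful-paint'].numericValue;
  const score = data.lighthouseResult.categories.performance.score * 100;
  return { fcpMs: Math.round(fcpMs), score: score };
}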
Install Lighthouse, the engine that drives PSI, on your own server.
You can find the Lighthouse repository here. Please note you need to know how to use Node, it helps to understand Puppeteer, and you need a reasonable amount of server-admin knowledge to get Chromium (used as a headless web browser for running the tests) working and linked correctly.
At this stage you have complete control and can write your own tests, scoring criteria, etc. You can also run as many tests as your server will allow. If you want this level of control and freedom, then this is the best option. However, be prepared to sink a lot of hours into this solution!
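As a starting point, here is a rough sketch of driving Lighthouse programmatically from Node (it assumes the lighthouse and chrome-launcher npm packages are installed; check their docs for the current options):

const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

async function runAudit(url) {
  // Launch a headless Chromium instance for Lighthouse to drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, {
    port: chrome.port,
    onlyCategories: ['performance'],
    output: 'json',
  });
  await chrome.kill();
  // result.lhr is the full report object you can score and style yourself.
  return result.lhr;
}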
So I am trying to build my first full website, and my idea for it involves using a publicly available API. The only issue is that most public APIs have a rate limit of a certain number of requests per hour, and if I am making direct requests from my application to their API, I will probably run out of requests if I have any users at all.
My question is: is there a way to design the website so that it doesn't have this outside dependency? What I was thinking was using the public API to build my own API service that my website uses, containing only the information I need. The only issue I see with this is that the public API's data is constantly changing, so I would constantly have to run scripts to update my own API with the correct data and redeploy. Is there any clean way of accomplishing this from a design perspective? Thanks.
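To illustrate the kind of thing I mean, here's a rough sketch of the caching idea using Express (all names and the upstream URL are placeholders):

const express = require('express');
const app = express();

const cache = {};              // in-memory cache: { key: { data, fetchedAt } }
const TTL_MS = 10 * 60 * 1000; // consider upstream data stale after 10 minutes

app.get('/api/items/:id', async (req, res) => {
  const key = req.params.id;
  const hit = cache[key];
  // Serve from cache while fresh; one upstream request covers many users.
  if (hit && Date.now() - hit.fetchedAt < TTL_MS) {
    return res.json(hit.data);
  }
  // Cache miss or stale entry: fetch from the public API, store, then serve.
  const upstream = await fetch('https://public-api.example.com/items/' + key);
  const data = await upstream.json();
  cache[key] = { data, fetchedAt: Date.now() };
  res.json(data);
});

app.listen(3000);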
I'm trying to export data from Presence Insights on Bluemix. I followed this documentation:
https://presenceinsights.ng.bluemix.net/pidocs/analytics/export/
However, I can't seem to find the export button mentioned in the document.
Data can be exported from the IBM Presence Insights Dashboard if you have data available. There are also REST APIs for exporting data. They are documented in the Floors, Sites, and Zones sections of the API Reference.
There were REST APIs in the product some time ago, but they were found to have limitations that made them less useful in production. In particular, as data built up, the response time of the API grew beyond what the Bluemix infrastructure allowed, and API requests would time out. To that end, the APIs were backed out, but it appears the documentation was left behind. That will be removed shortly.
Presence Insights still understands the value of exporting the data, so a new scheme is under investigation. For example, it would be ideal if the data could be exported under the covers to a production data storage facility, on a regular time frame.
In the interim, an alternative solution would be to use a Subscription to gather the backend enter/exit/dwell/timeout events directly and roll your own solution to store only what you need in whatever facility works for your application.
I am very confused about the correct or recommended mechanism for accessing the Google Fusion Tables APIs in Apps Script. There seem to be two methods with examples, but no discussion of which is preferred or why. Is one of these interfaces newer and preferred while the other is dying? Is one obsolete or more restricted in what it can do?
Method 1 is the REST API described here
https://developers.google.com/fusiontables/docs/v2/sql-reference#Select
Method 2 is a set of library functions sort of described here under the Apps Script/Google Advanced Services:
https://developers.google.com/apps-script/advanced/fusion-tables
For example, using the REST API to do a SQL query, we end up with something like this:
function runSQL(sql) {
  // v2 endpoint, matching the SQL reference linked above; the SQL statement
  // must be URL-encoded before it goes into the query string.
  var getDataURL = 'https://www.googleapis.com/fusiontables/v2/query?sql=' + encodeURIComponent(sql);
  // getUrlFetchOptions() (defined elsewhere) supplies the OAuth token and API key headers.
  var dataResponse = UrlFetchApp.fetch(getDataURL, getUrlFetchOptions()).getContentText();
  return dataResponse;
}
And using the Advanced Service, we use something like this:
var result = FusionTables.Query.sql(sql, { hdrs: false });
The REST API seems much harder to use, requiring complex OAuth and developer keys to be configured in advance and coded into the application, while the Advanced Services API handles all of this behind the scenes and makes for simple API calls like the one I show here.
I have seen numerous examples using each of the above, with no hint as to why one author chose one mechanism over the other.
Your help is greatly appreciated.
The service within app-script is a work in progress, so the full functionality of the API might not be fully supported at the moment. As you mentioned though, the big advantage of the service over the REST API is that you do not have to handle the OAuth flow, as you only need to enable it on your script (as stated here).
The Apps Script "advanced service" implementation still lacks some advanced functionality (like alt=media format queries or multipart / resumable uploads) -- if it actually has those features, it lacks extremely basic documentation of them, to the point that the Apps Script editor autocomplete is unaware of them. The tradeoff of these functionality gaps is that you don't need to handle keys, request building, etc.
So, if you're doing simple SQL select / importRows work, the Advanced Service should be able to cover almost all your needs. If you need to delete from your Fusion Tables, you might want to consider setting up the REST API - since deletes are limited to one record per query, the better way to "delete" is to download what you want to keep, then re-upload it back via replaceRows.
(This worked for me for a while, but eventually what I was keeping outgrew the Apps Script service's limitations and I began receiving Empty Response errors from the call to replaceRows. My remedy was to perform my record maintenance tasks via the REST API, where I can specify resumable uploads, timeouts, etc., while more "normal" interactions are done through the Advanced Service.)
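For reference, a rough sketch of that keep-and-replace pattern through the Advanced Service (the WHERE clause and column name are placeholders for your own schema, and you should verify the replaceRows media-upload signature against the editor autocomplete):

function pruneTable(tableId) {
  // Select only the rows you want to keep (placeholder criteria).
  var keep = FusionTables.Query.sql(
      "SELECT * FROM " + tableId + " WHERE category = 'keep'", { hdrs: false });
  // Rebuild the kept rows as CSV for the media upload (naive join; fine for
  // simple values that contain no commas or quotes).
  var csv = keep.rows.map(function(row) { return row.join(','); }).join('\n');
  var blob = Utilities.newBlob(csv, 'application/octet-stream');
  // Replace the whole table in one call instead of deleting one record per query.
  FusionTables.Table.replaceRows(tableId, blob);
}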
How can I retrieve a large amount of data from the GitHub REST API? Currently it provides only a small amount of JSON data from the GitHub timeline, in many cases limited to only 300 events. I need a bigger volume to work with in my Master's research, and I need to know how to get it via the REST API.
GitHub's API (and, IMHO, most good APIs) uses pagination to reduce load on both the server and its clients. You could write a simple script to go through all the "pages" of results one at a time, then combine the results locally after the fact.
More info here:
http://developer.github.com/guides/traversing-with-pagination/
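For example, a minimal sketch in JavaScript (assumes Node 18+ for the global fetch; note that some endpoints, like user events, are capped at around 300 items no matter how you page):

// Walk through the pages of a GitHub list endpoint and combine the results.
async function fetchAllEvents(user) {
  const events = [];
  for (let page = 1; ; page++) {
    const res = await fetch(
        'https://api.github.com/users/' + user + '/events?per_page=100&page=' + page,
        { headers: { 'Accept': 'application/vnd.github+json' } });
    const batch = await res.json();
    if (!Array.isArray(batch) || batch.length === 0) break; // no more pages
    events.push(...batch);
  }
  return events;
}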