Nuxt data fetching from api, local path? - axios

I'm trying to get my feet wet with Nuxt.
I understand that there are different scenarios for data-fetching:
- First call: the server fetches data from the API, prerenders the HTML/app, and sends the whole page
- After that: the app on the client makes requests to the API directly and only fetches JSON
Nuxt handles this automatically.
So I guess I have to expose my API to the client as well, correct?
Would I set the base path of Axios in Nuxt to something like "http://www.myproj.com/api"?
If yes, is there any way for Nuxt to access the API locally (for example "http://localhost:3333") when serving server-rendered content instead?

Yes, there is. When configuring axios in your nuxt.config.js, you can set a baseURL and a browserBaseURL. Nuxt will use the baseURL when pre-rendering and the browserBaseURL from the client.
You can see this in the @nuxtjs/axios docs.
If you are deploying to a VPS, you can have your API running on something like http://localhost:3333 and set that as your baseURL. For the browserBaseURL, if you are using HTTPS, you would want to set up an upstream for your API in nginx so that your browserBaseURL would be something like '/api'.
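A minimal nuxt.config.js sketch of that setup, assuming the @nuxtjs/axios module; port 3333 and the /api prefix are placeholders:

// nuxt.config.js
export default {
    modules: ['@nuxtjs/axios'],
    axios: {
        baseURL: 'http://localhost:3333', // used by Nuxt during server-side rendering
        browserBaseURL: '/api'            // used by the client; nginx proxies /api to the API
    }
}

This way the server render talks to the API over localhost, while the browser only ever sees your own domain.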

Related

Data Factory can't download CSV file from web API with Basic Auth

I'm trying to download a CSV file from a website in Data Factory using the HTTP connector as my source linked service in a copy activity. It's basically a web call to a URL that looks like https://www.mywebsite.org/api/entityname.csv?fields=:all&paging=false.
The website uses basic authentication. I have tested manually by entering the URL in a browser along with the credentials, and everything works fine. I have used the REST connector in a copy activity to download the data as a JSON file (same URL, just without the ".csv"), and that works fine too. But something about the authentication in the HTTP connector is different and causing issues: when I execute my copy activity, it downloads a CSV file that contains the HTML of the login page on the source website.
While searching, I came across this GitHub issue on the docs suggesting that the basic auth header is not sent initially, which may be causing the issue.
As I have it now, the authentication is defined in the linked service. I'm hoping that maybe I can add something to the Additional Headers or Request Body properties of the source in my copy activity to make this work, but I haven't found the right thing yet.
Suggestions of things to try or code samples of a working copy activity using the HTTP connector and basic auth would be much appreciated.
The HTTP connector expects the API to return a 401 Unauthorized response to the initial request; it then resends the request with the basic auth credentials. If the API doesn't do this, it won't use the credentials provided in the HTTP linked service.
If that is the case, go to the copy activity source and, in the Additional Headers property, add Authorization: Basic followed by the base64-encoded string of username:password. It should look something like this (where the string at the end is the encoded username:password):
Authorization: Basic ZxN0b2njFasdfkVEH1fU2GM=
It's best if that isn't hard-coded into the copy activity but is instead retrieved from Key Vault and passed as secure input to the copy activity.
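If you need to generate that header value yourself, a quick Node.js sketch (the credentials are placeholders):

// Build the Basic auth header value from placeholder credentials
const user = 'myuser';
const pass = 'mypassword';
const header = 'Basic ' + Buffer.from(user + ':' + pass).toString('base64');
console.log(header); // "Basic bXl1c2VyOm15cGFzc3dvcmQ="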
I suggest you try the REST connector instead of the HTTP one. It supports Basic as the authentication type, and I have verified it using a test endpoint on httpbin.org.
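A JSON sketch of such a linked service definition; the name, URL, and credentials are placeholders:

{
    "name": "RestServiceWithBasicAuth",
    "properties": {
        "type": "RestService",
        "typeProperties": {
            "url": "https://httpbin.org",
            "enableServerCertificateValidation": true,
            "authenticationType": "Basic",
            "userName": "myuser",
            "password": {
                "type": "SecureString",
                "value": "mypassword"
            }
        }
    }
}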
Once you have created a dataset connected to a linked service configured like this, you can include it in your copy activity.
Once the pipeline executes, the content of the REST response will be saved in the specified file.

Swagger Multiple hosts in same Json spec

I am using a single host for documenting REST APIs in Swagger UI 2.0, but I need two hosts in the JSON file for calling the REST APIs: one for HTTP and the other for HTTPS. Is this possible? If yes, how do I do it?
Thanks!
The way Swagger figures out URLs is this:
You provide the base one in index.html, from where the swagger.json gets generated. The generated swagger.json does not contain a URL per se, or any HTTP/HTTPS information; it only has a path relative to the base URL you provided.
After the UI gets generated from the swagger.json, the "Try it out" buttons execute GET/POST/PUT requests based on the URL in the address bar. Check this piece of code in your swagger-ui.js:
if (url && url.indexOf('http') !== 0) {
    url = this.buildUrl(window.location.href.toString(), url);
}
So, if you want to use HTTPS, use https in the address bar when you open Swagger UI. You will also need to mention the same in your index.html and in swagger-ui.js in the code above.
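As an aside, if both protocols point at the same host, a Swagger 2.0 spec can declare both schemes directly (host and basePath here are placeholders):

{
    "swagger": "2.0",
    "info": { "title": "My API", "version": "1.0" },
    "host": "api.example.com",
    "basePath": "/v1",
    "schemes": ["http", "https"]
}

Depending on the Swagger UI version, it will then either offer a scheme picker or default to the scheme the page was loaded over.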

OpenStreetMap Direct Routing Request via GET error

I have a problem with the OpenRouteService API (the Direct Routing Request via GET described at http://wiki.openstreetmap.org/wiki/OpenRouteService#Direct_Routing_Request_.28via_GET.29).
My request is:
http://openls.geog.uni-heidelberg.de/route?start=18.609%2C53.02&end=18.749%2C53.49&via=18.01%2C53.12&lang=pl&distunit=KM&routepref=Pedestrian&&weighting=Recommended&avoidAreas&useTMC=false&noMotorways=false&noTollways=false&noUnpavedroads=false&noSteps=false&noFerries=false&instructions=false
(from Toruń in Poland to Grudziądz via Bydgoszcz).
Unfortunately, I get an error:
"validation error: Expected element 'EndPoint#http://www.opengis.net/xls' instead of 'viaPoint#http://www.opengis.net/xls' here in element WayPointList#http://www.opengis.net/xls"
If I put anything in "via=", this error appears.
When I change "via=" to an empty value:
http://openls.geog.uni-heidelberg.de/route?start=18.609%2C53.02&end=18.749%2C53.49&via=&lang=pl&distunit=KM&routepref=Pedestrian&&weighting=Recommended&avoidAreas&useTMC=false&noMotorways=false&noTollways=false&noUnpavedroads=false&noSteps=false&noFerries=false&instructions=false
everything works fine.
Is this a problem with my request, or is the API not working correctly?
It seems that the frontend API of OpenRouteService receives the GET request correctly, but after it builds the request in XML and relays it to the backend server, the server fails to validate the request. It appears to be a problem with the backend server.
The frontend PHP code is here for reference.
The web frontend, which POSTs the XML directly to another backend server, seems to work correctly with via points.
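For reference, the OpenLS schema expects a WayPointList ordered as StartPoint, any ViaPoints, then EndPoint, which is exactly what the validation error is complaining about (note it even reports a lowercase 'viaPoint', while the schema element is 'ViaPoint'). The XML the frontend should be producing looks roughly like this (element nesting abbreviated, coordinates taken from the request above):

<xls:WayPointList>
    <xls:StartPoint><xls:Position><gml:Point><gml:pos>18.609 53.02</gml:pos></gml:Point></xls:Position></xls:StartPoint>
    <xls:ViaPoint><xls:Position><gml:Point><gml:pos>18.01 53.12</gml:pos></gml:Point></xls:Position></xls:ViaPoint>
    <xls:EndPoint><xls:Position><gml:Point><gml:pos>18.749 53.49</gml:pos></gml:Point></xls:Position></xls:EndPoint>
</xls:WayPointList>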

Calling external REST web service from Single Page Application

I am creating an SPA with Backbone & Underscore.js. The basic feature of the app is that, on entering a search term, it triggers an external REST web service call and fetches the JSON response. However, when I try this, the browser cancels the request, as I guess it counts as a cross-domain AJAX call.
I am hosting this SPA locally, and the REST web service is hosted on an external server. If I need to make cross-domain calls, what procedure should I follow without making any changes on the server side? I heard JSONP is one of the alternatives but am not sure of the approach.
It looks like it is the same problem as in this question. It is pretty useful:
JSONP and Backbone.js
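In short, the approach there is to tell Backbone to let jQuery perform the request as JSONP. A minimal sketch, assuming the service supports JSONP (the endpoint and search field are placeholders):

var SearchResults = Backbone.Collection.extend({
    url: 'http://external.example.com/search', // placeholder endpoint
    sync: function (method, model, options) {
        options.dataType = 'jsonp'; // jQuery handles the callback plumbing
        return Backbone.sync(method, model, options);
    }
});

new SearchResults().fetch({
    data: { q: 'search term' },
    success: function (collection) {
        console.log(collection.toJSON());
    }
});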
If your external service already supports it, you are correct that JSONP would be the way to go for cross-domain requests without having to change anything on the server side. I assume you're using jQuery. Here's an example from jQuery's docs:
var flickerAPI = "http://api.flickr.com/services/feeds/photos_public.gne?jsoncallback=?";
$.getJSON( flickerAPI, {
    tags: "mount rainier",
    tagmode: "any",
    format: "json"
})
.done(function( data ) {
    console.log(data);
});
You'll notice the ?jsoncallback=? in the Flickr URL. That tells Flickr to wrap the response in a JSONP callback instead of just returning plain JSON. When Flickr sees that, it wraps the response like this:
jQuery19104044632513541728_1395560629443({
    "title": "Recent Uploads tagged mountrainier",
    ...other json data...
});
So instead of returning JSON, they wrap it in a call to a function which jQuery puts on the global window object. That function will call your success callback with the JSON data.
Luckily, you don't have to know anything about the inner workings. All you do is call $.getJSON and it just works!
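That said, it's worth knowing what jQuery actually does: JSONP is just a dynamically injected script tag plus a global callback. A rough sketch (the callback name is chosen arbitrarily):

// Define a global callback, then load the cross-domain URL as a <script> tag.
window.handleFlickr = function (data) {
    console.log(data.title); // runs when the script below loads
};
var script = document.createElement('script');
script.src = 'http://api.flickr.com/services/feeds/photos_public.gne?format=json&jsoncallback=handleFlickr';
document.head.appendChild(script);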

Query parameters generate 400 in website configuration on root URL

Our website is getting mentioned on Twitter and people are running it through URL shorteners. In some cases this appends utm query parameters to the URL like so:
http://coreos.com/?utm_source=buffer&utm_campaign=Buffer&utm_content=buffer1b61d&utm_medium=twitter
However, going to that URL generates a 400 with an XML body!
<Error>
    <Code>InvalidArgument</Code>
    <Message>Invalid argument.</Message>
    <Details>Invalid query parameter: utm_source</Details>
</Error>
However, this works fine on pages with a path:
http://coreos.com/docs/sdk/?utm_source=buffer&utm_campaign=Buffer&utm_content=buffer1b61d&utm_medium=twitter
How do I configure Google Cloud Storage to work properly?
This was an issue with how Cloud Storage handled URL parameters on the root object, because the API uses the same domain as live traffic. It has been updated to ignore this sort of unused parameter for buckets with a website configuration.