Our website is getting mentioned on Twitter and people are running it through URL shorteners. In some cases this appends utm query parameters to the URL like so:
http://coreos.com/?utm_source=buffer&utm_campaign=Buffer&utm_content=buffer1b61d&utm_medium=twitter
However, going to that URL generates a 400 with an XML error body:
<Error>
<Code>InvalidArgument</Code>
<Message>Invalid argument.</Message>
<Details>Invalid query parameter: utm_source</Details>
</Error>
However, this works fine on pages with a path:
http://coreos.com/docs/sdk/?utm_source=buffer&utm_campaign=Buffer&utm_content=buffer1b61d&utm_medium=twitter
How do I configure Google Cloud Storage to work properly?
This was an issue with how Cloud Storage handled URL parameters on the root object, because the API uses the same domain as live website traffic. Cloud Storage has since been updated to ignore this sort of unused parameter for buckets with a website configuration.
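A quick way to confirm the fixed behavior (a sketch using Python's requests; the expected 200 is based on the fix described above, not something verified here):

import requests

# The root URL with tracking parameters previously returned the 400
# InvalidArgument error shown above; with unused query parameters
# ignored, it should now serve the page.
url = ("http://coreos.com/?utm_source=buffer&utm_campaign=Buffer"
       "&utm_content=buffer1b61d&utm_medium=twitter")
resp = requests.get(url)
print(resp.status_code)  # expected: 200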
I'm trying to download a CSV file from a website in Data Factory using the HTTP connector as my source linked service in a copy activity. It's basically a web call to a URL that looks like https://www.mywebsite.org/api/entityname.csv?fields=:all&paging=false.
The website uses basic authentication. I have manually tested by using the URL in a browser and entering the credentials, and everything works fine. I have used the REST connector in a copy activity to download the data as a JSON file (same URL, just without the ".csv" in there), and that works fine. But there is something about the authentication in the HTTP connector that is different and causing issues. When I try to execute my copy activity, it downloads a CSV file that contains the HTML for the login page on the source website.
While searching, I did come across a GitHub issue on the docs suggesting that the basic auth header is not sent initially, and that this may be causing the issue.
As I have it now, the authentication is defined in the linked service. I'm hoping that maybe I can add something to the Additional Headers or Request Body properties of the source in my copy activity to make this work, but I haven't found the right thing yet.
Suggestions of things to try or code samples of a working copy activity using the HTTP connector and basic auth would be much appreciated.
The HTTP connector expects the API to return a 401 Unauthorized response to its initial, anonymous request; only then does it retry with the basic auth credentials. If the API doesn't issue that challenge, the connector never sends the credentials provided in the HTTP linked service.
If that is the case, go to the copy activity source, and in the Additional headers property add Authorization: Basic followed by the base64-encoded string of username:password. It should look something like this (where the string at the end is the encoded username:password):
Authorization: Basic ZxN0b2njFasdfkVEH1fU2GM=
It's best if that isn't hard-coded into the copy activity but is instead retrieved from Key Vault and passed as secure input to the copy activity.
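For example, generating that header value (a minimal sketch in Python; the credentials here are placeholders, not values from the question):

import base64

# Placeholder credentials; in practice, retrieve these from Key Vault.
username = "myuser"
password = "mypassword"
token = base64.b64encode(f"{username}:{password}".encode("ascii")).decode("ascii")
print(f"Authorization: Basic {token}")
# -> Authorization: Basic bXl1c2VyOm15cGFzc3dvcmQ=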
I suggest you try the REST connector instead of the HTTP one. It supports Basic authentication, and I have verified it using a test endpoint on httpbin.org.
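A linked service along these lines should work (a minimal sketch following the documented shape of an ADF REST linked service; the name, URL, and credentials are placeholders, not the exact values used in the test):

{
    "name": "RestLinkedService",
    "properties": {
        "type": "RestService",
        "typeProperties": {
            "url": "https://httpbin.org",
            "enableServerCertificateValidation": true,
            "authenticationType": "Basic",
            "userName": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}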
Above is the configuration for the REST linked service. Once you have created a dataset connected to this linked service, you can include it in your copy activity.
Once the pipeline executes, the content of the REST response will be saved in the specified file.
The aim is to connect to a public REST API using ADF. It's my first stab at sending requests to a REST API in ADF. It is the Companies House ('CH') government website's API in England.
I have created an account and obtained a key. Apparently it uses basic authentication, where the user name is the API key and the password is ignored (CH note on authentication).
I want to explore the contents of the 'Search all' API (CH note on Search All) and want to copy the results to Blob Storage.
I therefore set the linked service to use REST; the user name (obfuscated here) is the key I obtained from CH, and the password is just the key repeated, since their documentation states the password is ignored.
I then added a REST dataset referencing this linked service, and testing the connection works fine.
Problems then arise in the copy data task: I get an 'Invalid Authorization Header' error both when previewing the data and when I attempt a copy to blob.
I'd be grateful for pointers on where I'm going wrong.
I can't reproduce your auth error, but I notice that you are trying to pass the parameters for your GET request in the Request Body.
I think you need to add the parameters in the relativeUrl property:
A relative URL to the resource that contains the data. When this property isn't specified, only the URL that's specified in the linked service definition is used. The HTTP connector copies data from the combined URL: [URL specified in linked service]/[relative URL specified in dataset].
Also, I suggest you check the correct REST API format of the Search API you are using. There are no other special features in the ADF REST connector; just make sure the GET request works locally, then duplicate it.
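For example, a dataset along these lines puts the whole query on the relative URL (a minimal sketch; the names and the exact search path are placeholder guesses based on the Companies House docs, not tested values):

{
    "name": "CompaniesHouseSearch",
    "properties": {
        "type": "RestResource",
        "linkedServiceName": {
            "referenceName": "<REST linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "relativeUrl": "search?q=<query>&items_per_page=20"
        }
    }
}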
I am trying to collect data from amazon web services. Every time I make the call I get back a 403 Forbidden.
This is what my code looks like (the link is jumbled):
Invoke-RestMethod -Uri "https://hosted-data-work.s3.amazonaws.com/20161121T220310.324/dw_split/73610000000000001/assignment_fact/part00101.gz?AWSAccessKeyId=ASIAJVX3JXfd5dfdfRKJNGM74Q&Expires=1479839499&Signature=J4JdyX53AwH6wExVmoVAtkweCEI%3D&response-content-disposition=inline%3B%20filename%3D%22assignment_fact-00000-095582fd.gz%22%3B&x-amz-security-token=bluh" -Method Get
The link above is a download file. I just want to get the data the simplest way possible. What else do I need to add to the call? I have no clue about AWS!
How did you generate that URL? It looks like a presigned URL, which means the authorization for accessing the object is granted based on the credentials used when presigning. There are a few possible reasons that could be giving you a not-authorized response:
The credentials used to generate the presigned URL do not actually have permissions to read the object. Double check your IAM policies and/or ACLs for the bucket and the IAM user which generated the URL.
The signature got truncated or corrupted between the time you generated the presigned URL and the time you tried to use it. Try logging the URL when you generate it and again when you use it, and compare to make sure they match exactly.
Presigned URLs expire after a specified validity period which cannot be longer than 1 week. Make sure you are generating a fresh URL when needed and setting the expiration appropriately.
Any of those could be causing the result you're seeing.
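If you control the presigning step, regenerating a fresh URL with an explicit expiry looks roughly like this (a sketch using boto3; the bucket and key are placeholders, not the ones from the question):

import boto3

s3 = boto3.client("s3")

# Placeholder bucket/key; point these at the object you need to fetch.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "hosted-data-work", "Key": "dw_split/assignment_fact/part00101.gz"},
    ExpiresIn=3600,  # one hour; the maximum allowed is 7 days
)
print(url)  # use it promptly, and verify it matches what you requested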
I was misinformed and took a bad approach to this problem. I did not know I could simply download the file to my computer; I thought it had to be transferred from bucket to bucket, then to my computer.
There are many other questions related to this, but they didn't help me fix my problem.
I'm using the Facebook server-side login for a website, which I want to test locally. The path that initiates the login action is http://localhost:8080/fblogin (this redirects to the Facebook login dialog and goes from there).
I can successfully get the code, but when I try to exchange that for an access token, I get the following error:
{"error":{"message":"Missing redirect_uri parameter.","type":"OAuthException","code":191}}
I am providing the redirect_uri, URL-encoded, and it is the same as the one I used to get the code. Here is the URL I'm using to request the access token (with the all-caps query string parameters replaced with their actual values, of course):
https://graph.facebook.com/oauth/access_token?client_id=CLIENT_ID&redirect_uri=http%3A%2F%2Flocalhost%3A8080%2Ffblogin&client_secret=CLIENT_SECRET&code=CODE_FROM_FB
I suspect this might have to do with how my app is set up on Facebook. Here are the values I have set:
Display Name: (an actual display name here)
App Domains: localhost
Contact email: (an actual email here)
Site URL: http://localhost:8080/fblogin
What do I need to tweak in the settings to get this to work? Or does this look correct?
By the way, if it makes any difference, I am using the Play! framework, version 2.0.1
After digging around a little more, I found that it was necessary for me to use POST when sending the request from my server to get the access token.
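In other words, the same exchange sent as a POST instead of a GET (a sketch using Python's requests for illustration, not the actual Play! code; the all-caps values are the same placeholders as in the URL above):

import requests

resp = requests.post(
    "https://graph.facebook.com/oauth/access_token",
    data={
        "client_id": "CLIENT_ID",          # placeholder
        "client_secret": "CLIENT_SECRET",  # placeholder
        "redirect_uri": "http://localhost:8080/fblogin",
        "code": "CODE_FROM_FB",            # placeholder
    },
)
print(resp.status_code, resp.text)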
Interesting that using POST worked for you as this didn't for me.
In any case, did you add the query parameters using setQueryParameter()? (see How to make multiple http requests in play 2?)
According to this Microsoft API page
http://msdn.microsoft.com/en-us/library/live/hh826522#reading_albums
I tried to translate this browser SkyDrive URL
https://skydrive.live.com/?cid=0A263A7CBEAAFB80&sc=photos#cid=0A263A7CBEAAFB80&id=A263A7CBEAAFB80%214521&sc=photos
to
http://apis.live.net/v5.0/folder.0A263A7CBEAAFB80.A263A7CBEAAFB80!4521
Since the album is shared publicly, I should be able to access it without an access token. Am I missing something? Or am I just using the wrong URL?
I notice I can get the public profile by using this:
https://apis.live.net/v5.0/0A263A7CBEAAFB80/
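For comparison, the two requests side by side (a sketch with Python's requests, using the IDs from above; the expected outcomes are just the ones described in this question):

import requests

# Public profile: reportedly works without an access token.
profile = requests.get("https://apis.live.net/v5.0/0A263A7CBEAAFB80/")
print(profile.status_code)

# Folder URL translated from the browser address: this is the one that fails.
folder = requests.get(
    "http://apis.live.net/v5.0/folder.0A263A7CBEAAFB80.A263A7CBEAAFB80!4521"
)
print(folder.status_code, folder.text)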
And I have read the REST API reference
http://msdn.microsoft.com/en-us/library/live/hh243648.aspx
but I still find it hard to deduce the correct URL format.
This get-all-albums REST URL also fails:
https://apis.live.net/v5.0/0A263A7CBEAAFB80/albums
I've been looking for a guide for this as well, but just noticed that your URL appears to have an additional '/' in it. The one listed on the site is:
https://apis.live.netv5.0/%fileid%
Were you able to get this to work?