Unable to authorize access to REST API end service with Azure Data Factory: token returns error status code 401 Unauthorized - azure-data-factory

I have configured our Azure Data Factory with a REST linked service and obtained a valid token from the end service.
The token works when used with Postman, but Azure Data Factory returns error status code 401 Unauthorized.
As the Postman screenshot shows, when I send a GET request with the token, I get data back.
However, with Azure Data Factory I get the error:
{
"errorCode": "2200",
"message": "Failure happened on 'Source' side. ErrorCode=RestCallFailedWithClientError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Rest call failed with client error, status code 401 Unauthorized, please check your activity settings.\nRequest URL: https://pm2.preqinsolutions.com/apiCore/api/countries.\nResponse: ,Source=Microsoft.DataTransfer.ClientLibrary,'",
"failureType": "UserError",
"target": "Copy data1",
"details": []
}
The ADF linked service is configured as follows:
Can someone see something obvious that would cause the 401 Unauthorized status with ADF?
I have also tried the following ADF configuration. However, I'm getting the same error:

Please try changing the authHeader name to Authorization and the value to Bearer <your API key>. I suspect from the Postman screenshot that this is what Postman is doing. If that does not work, please provide a link to the API documentation. This article explains Bearer tokens.
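For reference, here is a minimal curl sketch of the request Postman is effectively sending (the endpoint is taken from the error message in the question; the standard Bearer scheme is an assumption about this API). The ADF auth header needs to reproduce this header exactly, including the Bearer prefix:
curl -H "Authorization: Bearer <your API key>" \
  "https://pm2.preqinsolutions.com/apiCore/api/countries"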
You might also check the lifetime of your access token and make sure it doesn’t expire after an hour, for example.
I did a quick test against a REST API I know (Power BI REST API). The linked service looks like this and the value for the Authorization header was Bearer MY_ACCESS_TOKEN_HERE.
The dataset looks like:
The source of the Copy activity looks like:
This succeeded.
I would suggest you contact pm2.preqinsolutions.com support to see if they can provide more information on your access token / API key and have them confirm that the API isn't restricted to only certain allowed IP addresses. (ADF will not be accessing it from an on-premises IP address like your laptop.) You might also change the REST API linked service to use a self-hosted integration runtime instead of an Azure integration runtime to validate that the IP address the API is called from isn't the issue.

Related

GCP IAM REST API Service account key issue

I have been struggling with this particular issue in GCP. I am trying to generate service account keys using REST API calls from outside of GCP. Below is a screenshot of the service account along with its roles.
As far as I can tell, the Service Account Key Admin role is the parent of the create, list, etc. child permissions.
So when invoking the REST API call to generate a key using this documentation:
I get the error below:
{
"error": {
"code": 403,
"message": "Permission iam.serviceAccountKeys.create is required to perform this operation on service account projects/XXXYYYZZZZZZ/serviceAccounts/XXXYYYYZZZZZZ.iam.gserviceaccount.com.",
"status": "PERMISSION_DENIED"
}
}
What am I missing?!
Update: adding additional screenshots of how I set up authorization and tested the REST API call.
Following your steps, I was able to replicate this without any errors. As an alternative, you can generate an access token and use it for authentication.
Add an Authorization header. Generate a Bearer token by using the command below:
gcloud auth application-default print-access-token
Remove the API key from your URL. This sample URL retrieves the keys:
https://iam.googleapis.com/v1/projects/PROJECT_ID/serviceAccounts/SA_NAME@PROJECT_ID.iam.gserviceaccount.com/keys
Add keyTypes=USER_MANAGED as a query parameter.
Add the access token from the gcloud results above as the Bearer token (see the combined curl sketch below).
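Putting those steps together, here is a hedged curl sketch (PROJECT_ID and SA_NAME are the placeholders from the URL above; the query parameter and header follow the standard IAM REST API conventions):
curl -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://iam.googleapis.com/v1/projects/PROJECT_ID/serviceAccounts/SA_NAME@PROJECT_ID.iam.gserviceaccount.com/keys?keyTypes=USER_MANAGED"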
You can also refer to this if you want to generate service account keys; just make sure you update your URL, add a JSON body with keyAlgorithm, and use POST instead of GET. For more info, follow this guide.
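As a rough sketch of that key-creation call under the same assumptions (KEY_ALG_RSA_2048 is one of the documented keyAlgorithm values; adjust as needed):
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"keyAlgorithm": "KEY_ALG_RSA_2048"}' \
  "https://iam.googleapis.com/v1/projects/PROJECT_ID/serviceAccounts/SA_NAME@PROJECT_ID.iam.gserviceaccount.com/keys"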

Azure Data Lake Storage Gen2 REST API - List filesystems - "code": "AuthorizationPermissionMismatch"

I am trying to list the filesystems and their properties in an Azure Storage account using the request:
https://<account_name>.dfs.core.windows.net/?resource=account
with a single Authorization header containing the Bearer token,
and I get the response:
"error": {
"code": "AuthorizationPermissionMismatch",
"message": "This request is not authorized to perform this operation using this ******"
}
But when I make the request to list paths:
https://<account_name>.dfs.core.windows.net/<filesystem>?recursive=true&resource=filesystem
I get a response with correct data.
Can you give me some advice on what is wrong?
P.S. My auth params are shown in the attached screenshot.
I tested the same request in my environment, getting the token using a client ID and secret, and I received the same error you are getting.
As a solution, I added the Storage Account Contributor role to the service principal that I am using to get the bearer token.
Then I got the bearer token using the client_credentials grant.
After that, when I perform the same request again, it succeeds.
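As a rough sketch of that flow (not the exact screenshots from the answer; the tenant ID, app credentials, and account name are placeholders, and the x-ms-version value is an assumption):
# Get a bearer token for Azure Storage via the client_credentials grant
curl -X POST "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token" \
  -d "grant_type=client_credentials" \
  -d "client_id=<app-id>" \
  -d "client_secret=<app-secret>" \
  -d "scope=https://storage.azure.com/.default"

# Retry the list-filesystems request with the returned access_token
curl -H "Authorization: Bearer <access_token>" \
  -H "x-ms-version: 2019-12-12" \
  "https://<account_name>.dfs.core.windows.net/?resource=account"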

UnrecognizedClientException while testing API Gateway with integration type Kinesis

I have created an API using API Gateway with the integration type 'Kinesis'. I am trying to access the ListStreams method. I have created a role with the AmazonKinesisFullAccess policy, and its trusted identity is set to apigateway.amazonaws.com. I have provided the role ARN in the execution role field, but while testing this API using the console I get the following error:
<UnrecognizedClientException>
<Message>The security token included in the request is invalid</Message>
</UnrecognizedClientException>
Thanks
This was a classic example of a wrong/incomplete exception message.
My fault was that I was looking for the Kinesis stream in the wrong region. But rather than returning an error like "Resource not found", API Gateway returned "Invalid security token".
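As a quick sanity check (a hedged sketch; the region names are placeholders), you can confirm which region the stream actually lives in with the AWS CLI before pointing the API Gateway integration at it:
# List streams in the region the API Gateway integration targets
aws kinesis list-streams --region us-east-1

# Compare against the region where you think the stream was created
aws kinesis list-streams --region eu-west-1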

Azure KeyVault Get Secret API responds with 404 or 401 error

I am trying to get a secret out of Azure Key Vault. It is a very simple RESTful API call. For example, listing all secrets is as simple as this:
GET {vaultBaseUrl}/secrets?api-version=7.0
where vaultBaseUrl is provided in the Azure portal as the Vault DNS name.
I am using the console mode on the docs page for testing:
https://learn.microsoft.com/en-us/rest/api/keyvault/getsecrets/getsecrets#code-try-0
But the return value is always 404.
When I try curl in the Azure console, it gives 401 Unauthorized.
However, I can use the command line to get the secret out.
Is there any secret to making the RESTful call and curl work to get the secret out? All these attempts use the same credentials.
A side question: on the Microsoft API testing page there is a 'Request Preview' section with a green Run button, almost as if it is inviting you to run the API, but the link goes to learn.microsoft.com and the copy button on the box is disabled. I have never seen so many problems in one place, so I am thinking maybe I don't understand something here.
The doc does not seem to be correct. If you want to get the secret, you can use the client credentials flow to get an access token and use it to get the secret.
Follow the steps below.
1. Register an app in Azure Active Directory (see this link). Get the application ID and key (see this link). Add the service principal to the Access policies in your key vault with the correct secret permission (just search for the name of your AD app, then add it).
2. In Postman, send a request to the URL:
POST https://login.microsoftonline.com/{your tenant id}/oauth2/token?api-version=1.0
Request body and complete sample (client_id and client_secret are the application ID and key from step 1); a curl sketch of this token request is shown at the end of this answer.
3. Copy the access_token from step 2, then use it as the Authorization bearer token to call the API:
GET https://yourkeyvault.vault.azure.net/secrets?api-version=7.0
Besides, if you want to use curl to get the secret, try something like the command below. The TOKEN is the same as the access_token in step 2 above.
curl -X GET -H "Authorization: Bearer [TOKEN]" https://yourkeyvault.vault.azure.net/secrets?api-version=7.0
For more details about getting access_token via curl and complete steps, you could refer to this link. Don't forget to change the resource to https://vault.azure.net in the Request the Access Token step.
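For completeness, here is a hedged curl sketch of the token request from step 2, using the v1.0 endpoint from that step and the https://vault.azure.net resource mentioned above (tenant ID, application ID, and key are placeholders):
curl -X POST "https://login.microsoftonline.com/{your tenant id}/oauth2/token?api-version=1.0" \
  -d "grant_type=client_credentials" \
  -d "client_id=<application id>" \
  -d "client_secret=<application key>" \
  -d "resource=https://vault.azure.net"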
Pass a Bearer token. There will be a URL to generate a token; pass it in the authentication header and then you will not get the error.

Access to cloud storage from client URL

From a Google Cloud application, I need to open a file located in my project's Cloud Storage. I tried the following URL forms to access the file, but I get the errors shown below:
http://storage.googleapis.com/my-bucket/my-file
Error: Access denied. Anonymous caller does not have storage objects
www.googleapis.com/upload/storage/v1/b/http://my_appl//my-bucket/my-file
Error 404
www.googleapis.com/storage/v1/b/my-bucket/my-file
Error 404
https://www.googleapis.com/storage/v1/b/my-bucket/o/my-file
"code": 401,
"message": "Anonymous caller does not have storage.objects.get access to my-bucket/my-file
https://www.googleapis.com/storage/v1/b/my-bucket/o/my-file/place?key=my-key
Not found
Am I composing the URL incorrectly?
http://storage.googleapis.com/my-bucket/my-file
This one is fine. However, unless an object is publicly readable, you'll need to authorize the request, which means either including an "Authorization" header in the request with appropriate credentials or signing the URL with the private key of a service account.
https://www.googleapis.com/download/storage/v1/b/my-bucket/o/my-file?key=my-key&alt=media
This is also okay, but an API key does not provide authentication. You'll still need an Authorization header unless the object is publicly viewable.
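For instance, here is a hedged sketch of an authorized download using a bearer token from gcloud (the bucket and object names are the placeholders from the question; the command assumes the logged-in account has storage.objects.get on the bucket):
# Authorized download via the JSON API (alt=media returns the object contents)
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://www.googleapis.com/storage/v1/b/my-bucket/o/my-file?alt=media"

# The simple storage.googleapis.com form also accepts the same header
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://storage.googleapis.com/my-bucket/my-file"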