GCP IAM REST API Service account key issue - rest

I have been struggling with this particular issue in GCP. I am trying to generate service account keys using REST API calls from outside of GCP. Below is a screenshot of the service account along with its roles.
As far as I can tell, the "Service Account Key Admin" role is the parent of the create, list, etc. child permissions.
So when invoking the REST API call to generate a key using this documentation:
I get the below error
{
"error": {
"code": 403,
"message": "Permission iam.serviceAccountKeys.create is required to perform this operation on service account projects/XXXYYYZZZZZZ/serviceAccounts/XXXYYYYZZZZZZ.iam.gserviceaccount.com.",
"status": "PERMISSION_DENIED"
}
}
What am I missing?!
Update: adding additional screenshots of how I set up authorization and testing of the REST API call.

Following your steps, I was able to replicate it without any errors. As an alternative, you can authenticate with an access token instead.
Add an Auth Header. Generate a Bearer Token by using the command below:
gcloud auth application-default print-access-token
Remove the API key from your URL. This sample URL retrieves the keys:
https://iam.googleapis.com/v1/projects/PROJECT_ID/serviceAccounts/SA_NAME@PROJECT_ID.iam.gserviceaccount.com/keys
Add the keyTypes query parameter with the value USER_MANAGED.
Add the access token from the gcloud output above.
See sample screenshots below:
You can also refer to this if you want to generate service account keys, just make sure you update your URL, add a JSON body with keyAlgorithm, and use POST instead of GET. For more info, follow this guide.
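Putting those steps together, here is a minimal curl sketch, assuming PROJECT_ID and SA_NAME placeholders and that the caller has the required iam.serviceAccountKeys permissions (listing keys is a GET; creating a key is a POST with a keyAlgorithm in the JSON body):
# Obtain a short-lived access token from your local gcloud credentials
ACCESS_TOKEN=$(gcloud auth application-default print-access-token)
# List the user-managed keys of the service account
curl -H "Authorization: Bearer ${ACCESS_TOKEN}" \
  "https://iam.googleapis.com/v1/projects/PROJECT_ID/serviceAccounts/SA_NAME@PROJECT_ID.iam.gserviceaccount.com/keys?keyTypes=USER_MANAGED"
# Create a new key for the service account
curl -X POST \
  -H "Authorization: Bearer ${ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"keyAlgorithm": "KEY_ALG_RSA_2048"}' \
  "https://iam.googleapis.com/v1/projects/PROJECT_ID/serviceAccounts/SA_NAME@PROJECT_ID.iam.gserviceaccount.com/keys"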

Related

Unable to Authorize Access to REST API end service with Azure Data Factory with Token Error status code 401 Unauthorized

I have configured our Azure Data Factory with a REST Linked Service. I have obtained a valid token from the end service.
The token works when used with Postman, but it returns a 401 Unauthorized error status code with Azure Data Factory.
As you can see with POSTMAN when I send a GET request with the Token I get data back:
However, with Azure Data Factory I get the error:
{
"errorCode": "2200",
"message": "Failure happened on 'Source' side. ErrorCode=RestCallFailedWithClientError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Rest call failed with client error, status code 401 Unauthorized, please check your activity settings.\nRequest URL: https://pm2.preqinsolutions.com/apiCore/api/countries.\nResponse: ,Source=Microsoft.DataTransfer.ClientLibrary,'",
"failureType": "UserError",
"target": "Copy data1",
"details": []
}
The ADF Linked Service is as follows:
Can someone see something obvious that would prevent access and cause the 401 Unauthorized status with ADF?
I have also tried the following ADF configuration. However, I'm getting the same error:
Please try changing the authHeader name to Authorization and the value to Bearer <your API key>. I suspect from the Postman screenshot that this is what Postman is doing. If that does not work, then please provide a link to the documentation. This explains Bearer tokens.
You might also check the lifetime of your access token and make sure it doesn’t expire after an hour, for example.
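To illustrate, the Postman request in the screenshot should be roughly equivalent to the following curl call (the URL is taken from the error message above; the token value is a placeholder):
curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  "https://pm2.preqinsolutions.com/apiCore/api/countries"
If ADF sends the token under a differently named header, or without the Bearer prefix, the service will typically reject it with 401.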
I did a quick test against a REST API I know (Power BI REST API). The linked service looks like this and the value for the Authorization header was Bearer MY_ACCESS_TOKEN_HERE.
The dataset looks like:
The source of the Copy activity looks like:
This succeeded.
I would suggest you contact pm2.preqinsolutions.com support to see if they can provide more information on your access token / API key, and have them confirm that the API isn't restricted to being accessed from only certain allowed IP addresses. (ADF will not be accessing it from an on-premises IP address like your laptop.) You might also change the REST API linked service to use a self-hosted integration runtime instead of an Azure integration runtime to validate that the IP address the API is called from isn't the issue.

Keycloak Account management api update password does not work

I am trying to update my password via the Keycloak account management API using Postman and I get this error:
"error": "RESTEASY003650: No resource method found for POST, return 405 with Allow header"
My endpoint: http://keycloak_url/auth/realms/{realm name}/account//credentials/password/
I have sent a POST request.
Password reset functionality via the account API was removed from Keycloak (12+) as it was unsafe. You can refer to this thread on GitHub. You won't find the /credentials/password/ API if you are using Keycloak 12 or above.
The alternatives I can suggest are to use an Application Initiated Action (AIA) or the Admin REST API.
You can also see where these endpoints were removed from Keycloak here.
References : https://github.com/keycloak/keycloak/pull/7393#issuecomment-773502862
I am on Keycloak 17+, and I also had trouble making it work.
The correct URL to use should look like:
https://myHost.com/auth/admin/realms/myRealm/users/99999999-9999-9999-9999-999999999999/reset-password
You absolutely need the /auth/admin/realms part (some other endpoints only use /auth/realms)!
You will also need an access token from either a Keycloak user or a Keycloak client in the Authorization header. Check elsewhere to see how to generate and use an access token.
The body should be like:
{
"type": "password",
"temporary": true,
"value": "myNew-password1"
}
Check documentation:
https://www.keycloak.org/docs-api/17.0/rest-api/index.html#:~:text=Set%20up%20a%20new%20password%20for%20the%20user.
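As a rough sketch of the flow described above (host, realm, user ID, admin password, and new password are placeholders taken from this answer; jq is assumed to be available for parsing JSON, and the Admin REST API documents reset-password as a PUT):
# Get an admin access token from the master realm using the admin-cli client
ADMIN_ACCESS_TOKEN=$(curl -s -X POST \
  -d "grant_type=password&client_id=admin-cli&username=admin&password=ADMIN_PASSWORD" \
  "https://myHost.com/auth/realms/master/protocol/openid-connect/token" | jq -r '.access_token')
# Set the new password for the user
curl -X PUT \
  -H "Authorization: Bearer ${ADMIN_ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"type": "password", "temporary": true, "value": "myNew-password1"}' \
  "https://myHost.com/auth/admin/realms/myRealm/users/99999999-9999-9999-9999-999999999999/reset-password"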

How to auto generate new Bearer Token in Postman for GCP Storage

I am trying to upload a file from my local machine to a GCP bucket through the Cloud Storage REST API (https://storage.googleapis.com/upload/storage/v1/b) using Postman.
I am using a Bearer Token for authorization and running the $(gcloud auth print-access-token) command in Cloud Shell to generate that token every time.
I need to know how to auto-generate that token from Postman while sending the request.
Is there any way to execute $(gcloud auth print-access-token) every time as a Pre-request Script within Postman?
Thanks
I'm not very familiar with Postman, but I think you can run a pre-request script to get a token and reuse it in the subsequent request.
If so, you can get inspiration from the gcloud auth print-access-token command by adding the --log-http param to visualize the requests performed by the CLI and reproduce them in Postman.
EDIT 1
If you perform the request, you can see that a POST is sent to this URL: https://oauth2.googleapis.com/token
To reproduce the call, you can try it with curl:
curl -X POST -d "grant_type=refresh_token&client_id=32555940559.apps.googleusercontent.com&client_secret=ZmssLNjJy2998hD4CTg2ejr2&refresh_token=<REFRESH_TOKEN>&scope=openid+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fappengine.admin+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcompute+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Faccounts.reauth" https://oauth2.googleapis.com/token
In this call, you need your REFRESH_TOKEN, which you can get here:
cat ~/.config/gcloud/legacy_credentials/<YOUR EMAIL>/adc.json
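Combining the two, here is a sketch that pulls the refresh token out of that file and exchanges it for a fresh access token (this assumes jq is installed; the client_id and client_secret are the public ones used by the gcloud CLI, as in the curl above):
# Read the refresh token from the gcloud legacy credentials file
REFRESH_TOKEN=$(jq -r '.refresh_token' ~/.config/gcloud/legacy_credentials/YOUR_EMAIL/adc.json)
# Exchange it for a new access token
curl -s -X POST "https://oauth2.googleapis.com/token" \
  -d "grant_type=refresh_token" \
  -d "client_id=32555940559.apps.googleusercontent.com" \
  -d "client_secret=ZmssLNjJy2998hD4CTg2ejr2" \
  -d "refresh_token=${REFRESH_TOKEN}"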
Google Cloud Storage requires authentication like other Google APIs, and one of the authentication methods is providing a bearer token. These bearer tokens are short-lived and require regeneration.
So there are 3 ways to generate bearer tokens so you can interact with the Google Storage API or other Google APIs using Postman:
Using the oauth2l CLI (manual regeneration of a new bearer token and update of the Authorization header with the new token)
The oauth2l CLI utility allows you to generate bearer tokens which can be pasted into the Authorization header in Postman (see the sketch after this list).
Configuration of Postman with OAuth 2 and User Credentials ( Tokens can be managed via the Postman UI and expired ones cleaned up at the click of a button)
Postman can be configured to trigger the OAuth 2 flow and use a generated bearer token in all of the requests. But please make sure that all users have the correct permissions in the Google Cloud Platform project.
You will need to create OAuth 2 credentials in Google Cloud Console:
Go to APIS and Services
Then go to Credentials tab
Click on Create Credentials
Select OAuth Client ID
Fill in the fields to create the OAuth Client ID (also add an Authorized redirect URI; however, this doesn't need to resolve anywhere).
The Client ID and Client Secret need to be saved in your machine.
Use Postman’s environment variable functionality to use different credentials per environment/project. In Postman create a new environment for your credentials using the cog icon at the top right.
Configure the variables accordingly: AUTH_CALLBACK_URL, AUTH_URL, AUTH_CLIENT_ID, AUTH_CLIENT_SECRET, AUTH_ACCESS_TOKEN_URL, and AUTH_SCOPE.
AUTH_CALLBACK_URL should be identical to the redirect URI defined in the OAuth 2 Client ID creation menu, and AUTH_SCOPE should be one of the scopes accepted by the API you are calling.
Once defined, these variables can be used in your Authorization tab in Postman. This can be configured at the collection level, the folder level or even the individual request level.
To Regenerate the Token, you can go to Authorization Tab and click on GET NEW ACCESS TOKEN
Configuration of Postman to use a pre-request script and service credentials (The pre-request script automatically regenerates the bearer token when it expires)
For this, please check this tutorial and follow the steps provided there.
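As a rough sketch of option 1 combined with the upload from the question (the bucket name and file name are placeholders; this assumes the oauth2l CLI is installed and configured with suitable credentials, and its flag syntax may vary by version):
# Fetch a bearer token scoped for Cloud Platform
TOKEN=$(oauth2l fetch --scope cloud-platform)
# Simple media upload to a bucket through the Cloud Storage JSON API
curl -X POST \
  -H "Authorization: Bearer ${TOKEN}" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @local-file.txt \
  "https://storage.googleapis.com/upload/storage/v1/b/YOUR_BUCKET/o?uploadType=media&name=local-file.txt"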

Authentication with Openshift API running on IBM cloud failed with 401 unauthorized error

I'm trying to build an application to manage my OpenShift cluster on IBM Cloud, and the first step is to authenticate against both IBM Cloud and the OpenShift cluster.
https://cloud.ibm.com/docs/openshift?topic=openshift-cs_api_install#kube_api
I followed the steps described in the link above and successfully obtained all the tokens, including 'access_token', 'id_token' and 'refresh_token'. Among them, the 'id_token' is supposed to be used to authenticate against the OpenShift API.
With the access_token I can call the IBM Cloud API successfully, for example to obtain account and cluster information.
However, when I use the id_token to call the OpenShift API, it fails with the following error. It happens even for the '/version' API, which can normally be accessed without providing a bearer token.
{
"kind": "Status",
"apiVersion": "v1",
"metadata": {},
"status": "Failure",
"message": "Unauthorized",
"reason": "Unauthorized",
"code": 401
}
I can verify that my account has the correct service roles assigned as described here, and I can see the corresponding roles with the 'ibm' prefix assigned in the OpenShift web portal as well.
Can anyone please verify that the instructions in the first link above are still valid, or does anyone have a clue about what might be wrong?
[Update]
To help with troubleshooting, I'm pasting a sample of the tokens here. This is what I get from step 3 of the 'Working with your cluster by using the Kubernetes API' section in the link; it is a bit lengthy:
{
"access_token": "eyJraWQiOiIyMDIxMDIxOTE4MzUiLCJhbGciOiJSUzI1NiJ9.eyJpYW1faWQiOiJJQk1pZC0yNzAwMDU1WERHIiwiaWQiOiJJQk1pZC0yNzAwMDU1WERHIiwicmVhbG1pZCI6IklCTWlkIiwianRpIjoiMDY1OWI5MjktMDE1Zi00MDg0LTgwZWMtYmFhZjBhYTBkNDQ4IiwiaWRlbnRpZmllciI6IjI3MDAwNTVYREciLCJnaXZlbl9uYW1lIjoi6Iic5a2QIiwiZmFtaWx5X25hbWUiOiLpmYgiLCJuYW1lIjoi6Iic5a2QIOmZiCIsImVtYWlsIjoicmFmb3VsQDE2My5jb20iLCJzdWIiOiJjaHN6Y2hlbkBjbi5pYm0uY29tIiwiYXV0aG4iOnsic3ViIjoiY2hzemNoZW5AY24uaWJtLmNvbSIsImlhbV9pZCI6IklCTWlkLTI3MDAwNTVYREciLCJuYW1lIjoi6Iic5a2QIOmZiCIsImdpdmVuX25hbWUiOiLoiJzlrZAiLCJmYW1pbHlfbmFtZSI6IumZiCIsImVtYWlsIjoicmFmb3VsQDE2My5jb20ifSwiYWNjb3VudCI6eyJ2YWxpZCI6dHJ1ZSwiYnNzIjoiOWM5NzI1YmQxM2VhNDU2Nzg4YWMwZWU3OGQ4NjQ2ZTEiLCJpbXNfdXNlcl9pZCI6Ijg4NzM1NzYiLCJmcm96ZW4iOnRydWUsImltcyI6IjM0NjU1MiJ9LCJpYXQiOjE2MTQyNTU5ODYsImV4cCI6MTYxNDI1OTU4NiwiaXNzIjoiaHR0cHM6Ly9pYW0uY2xvdWQuaWJtLmNvbS9pZGVudGl0eSIsImdyYW50X3R5cGUiOiJ1cm46aWJtOnBhcmFtczpvYXV0aDpncmFudC10eXBlOmFwaWtleSIsInNjb3BlIjoiaWJtIG9wZW5pZCBjb250YWluZXJzLWt1YmVybmV0ZXMiLCJjbGllbnRfaWQiOiJrdWJlIiwiYWNyIjoxLCJhbXIiOlsicHdkIl0sInN1Yl85Yzk3MjViZDEzZWE0NTY3ODhhYzBlZTc4ZDg2NDZlMSI6ImNoc3pjaGVuQGNuLmlibS5jb20iLCJpYW1faWRfOWM5NzI1YmQxM2VhNDU2Nzg4YWMwZWU3OGQ4NjQ2ZTEiOiJJQk1pZC0yNzAwMDU1WERHIiwicmVhbG1lZF9zdWJfOWM5NzI1YmQxM2VhNDU2Nzg4YWMwZWU3OGQ4NjQ2ZTEiOiJJQk1pZC1jaHN6Y2hlbkBjbi5pYm0uY29tIn0.Rm3F0UKz9Aq3-1xXMmkFi0UkENIvQUkRo6qhtWaG3LKBH5HHsZbAQeJUhKqXYbI643nj2ssDP2U50BVv-6zbpfmyVncP5Z5Dmi620mi2QesduRQaH1XlC-l7KuF3uT0hJ_9FSD-0Wqi5ph0pkKxHJ-BmLkHC-4F0NByiUtwIpwyTpthuzwC251XZsQ9Ya8gzCxHB9DFb3tzOF3cupVVZmc2mMJbv4JuTSnP00H5rOT4yIzeI0Lqm6LhDpMRJ4P8glmIxmU6fag42P94pFNf3jEzIZGl49NINiWXlKbAleij3vSouobtYvrBmxWQF4KpuwKPEI-bMf1zpsHPYBHWidg",
"id_token": "eyJraWQiOiIyMDIxMDIxOTE4MzUiLCJhbGciOiJSUzI1NiJ9.eyJpYW1faWQiOiJJQk1pZC0yNzAwMDU1WERHIiwiaXNzIjoiaHR0cHM6Ly9pYW0uY2xvdWQuaWJtLmNvbS9pZGVudGl0eSIsInN1YiI6ImNoc3pjaGVuQGNuLmlibS5jb20iLCJhdWQiOiJrdWJlIiwiZ2l2ZW5fbmFtZSI6IuiInOWtkCIsImZhbWlseV9uYW1lIjoi6ZmIIiwibmFtZSI6IuiInOWtkCDpmYgiLCJlbWFpbCI6InJhZm91bEAxNjMuY29tIiwiZXhwIjoxNjE0MjU5NTg2LCJzY29wZSI6ImlibSBvcGVuaWQgY29udGFpbmVycy1rdWJlcm5ldGVzIiwiaWF0IjoxNjE0MjU1OTg2LCJhdXRobiI6eyJzdWIiOiJjaHN6Y2hlbkBjbi5pYm0uY29tIiwiaWFtX2lkIjoiSUJNaWQtMjcwMDA1NVhERyIsIm5hbWUiOiLoiJzlrZAg6ZmIIiwiZ2l2ZW5fbmFtZSI6IuiInOWtkCIsImZhbWlseV9uYW1lIjoi6ZmIIiwiZW1haWwiOiJyYWZvdWxAMTYzLmNvbSJ9LCJzdWJfOWM5NzI1YmQxM2VhNDU2Nzg4YWMwZWU3OGQ4NjQ2ZTEiOiJjaHN6Y2hlbkBjbi5pYm0uY29tIiwiaWFtX2lkXzljOTcyNWJkMTNlYTQ1Njc4OGFjMGVlNzhkODY0NmUxIjoiSUJNaWQtMjcwMDA1NVhERyIsInJlYWxtZWRfc3ViXzljOTcyNWJkMTNlYTQ1Njc4OGFjMGVlNzhkODY0NmUxIjoiSUJNaWQtY2hzemNoZW5AY24uaWJtLmNvbSIsImdyb3Vwc185Yzk3MjViZDEzZWE0NTY3ODhhYzBlZTc4ZDg2NDZlMSI6WyJkZXZvcHMtYWRtaW4tdnBuLW9ubHkiLCJkZXZvcHMtZGVmYXVsdC12cG4tb25seSIsIklQV0MgQWRtaW4iXX0.Y42KUJRGgZA9OV164GAKSF0W5rRNGf3x32YXrAo5UvKhpOK0k4r_hwZU5BZhI2y3t-UqM7lNOIxexpft2Zmc9ApQ6BlVN-iN1jcfBzxmrUPMObpc1-vDrAc9Sq84J8nYzy1Rk32ydFHeb3V2iDhJn14_NOnXwhuz9EFkSg0uUZHugTAPx5A-VcdrehceX0yOqAOfX5EzTtmHoI8-JQbfNt8pyBSJs8Eoag7_mtfNgx13bP_-M8W7tltCSHhPEO46gUurPFkvasHggConPQ_oBw3ANAvY8tDfivrGmdiR2Q-uc4SnFAjOgC77YskDLskBcOeehhBvxwDkyufztzqM6w",
"refresh_token": "OKDsw87zCujUXCmb4LZ3-DFQN7lUa0ejdqau_fL3Voms7M7DaKYgO07gZW29VQbcwdGc3z8jrQjjf_4gOutKyRCZ6LyEiSEKTZQ6Kovwqji02Puxu3fzIFB9f8-a1hMlkTtP4u32_FTCmOZA6ARvzxEyRX36CtQEzSVz-zVMsvPxdgyztUEWPTtvbr7aPn4eq209OzTGzTyPCBFR-N0gVp2tKLbIrGmyi_vgC-6xLRvR2nWGJsUwaaBjXwvICeCBY3qRJ90VyP1krBSHa72f1XJWpvLnBWHN8qo1dfPknHvknlEZ3kMUA87KZkynkgiVifhRq90oNAKYHhKJ4XRs2tyz05zW5a8qEhgoIVsslUzDLLNU1btRF_3g587dKckPzEav3BgQlCik4im8gIC74HFGZOz4P7z9QKLJHQY7ElDillH8pLRjW8Dx0yZvn8Yo5rSqJSj0zUmJxNZMUNEpF_DTQhHCePNOWu1_1q4o5cIb_Mv-mGMMVwrVUsJYUyaeV9O5cWl58eWlHQxS3SbuAjsBrzfSdcrIyFe5aQViyL_sL1-o54xFrMJPC3prPD25TS4vUOwAy7tc9r1AGZG00YUGaxPwzKcOWBI4DqksIiEKPOtcm3k0y24TuwRPa0AK-9jfYAzkx3rciBYGKbq1WOFjX-p6LH67ayxVUJcQcjSMe-35LZnsHQtc0VOxNHjJKdJiHsKOYEDY1Nz0k4zGZr1EZ6j7w4tLpBXP9ThC8hReiihWDmld9lzFdLwKZPF7jl4u03a2WQZ6j-wMHvLtOBcLDiKwEaeWaGp8v_YS3j4iGqkcAytf7z_-toD1O3ZHtIUlbe6H64IAVPKadN1Y1SD49Ouk1fk8xDFr7HQ4RuDTLfZnLGzC4vvzysCmJEX837Wjf2f9WdirEaKxoSlDDJKilt--20Ota-5CTimD8u0SttC6CD1Glj8bbAS8ddCAfVirDJty7FW3eyALvAHifKqzRa1kBDPHb305q91oSWYdzBKIlTinN9BAXDc3ZccVkWM6Y3VgUzh2iQwM0lKadts7OMwqhLDk7rukAXHRUpKxy-85rUf-a0oz41s69PXdQteoh559vEb0uyrq0kOnI1RnuJ7MaEGDC25Kfezumo0snwYRmQhXMPMeKkxBKxs9ZydKxxcp1qtLwFyHA6MhZuXRpZM9Qse9mqovNdHHOhAQIZu3J7HJusuVdg3SJhZkTH__gXpCc2hBeOpR0rPc6qZm7z2nU5pJQ2XgzH2TUm6psA",
"ims_user_id": 8873576,
"token_type": "Bearer",
"expires_in": 3600,
"expiration": 1614259586,
"refresh_token_expiration": 1616847976,
"scope": "ibm openid containers-kubernetes"
}
In addition, the following approach works, but the token is obtained through the OpenShift web console and thus cannot be obtained programmatically (at least I don't see how):
"Authorization: Bearer sha256~6V_OvZ5OoV8vnHF33Es5qsloAY-iXkLQ8dfl_Nsyn94"
Thanks!
You cannot and should not send the ID token to get access to APIs; it is only meant to be used by the client that did the initial authentication. It also typically has a very short lifetime (like 5 minutes in some implementations).
The only purpose of the ID token is basically to create the local user session.
On the page you refer to, it says at the end:
ID token: Every IAM ID token that is issued via the CLI expires after one hour. When the ID token expires, the refresh token is sent to the token provider to refresh the ID token. Your authentication is refreshed, and you can continue to run commands against your cluster.
It sounds like they mean the access token. In OpenID Connect you don't renew your ID token (as far as I am aware).
I have been busy in the past few days; I will share how I solved this problem here. In fact it didn't address the original issue, but it is another way to achieve the goal.
It turned out that there was another doc regarding how the access token can be obtained (yes, as mentioned by @Tore Nestenius, it should be an access token instead of an ID token). The token described there is actually the same as what one would get through the OpenShift web console, and it basically has nothing to do with the previous link I shared in the question.
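For completeness, here is a sketch of how such a token is used against the cluster's API server (the server URL, port, and token value are placeholders):
# Call the OpenShift/Kubernetes API with the access token
curl -H "Authorization: Bearer sha256~YOUR_ACCESS_TOKEN" \
  "https://CLUSTER_API_SERVER:PORT/version"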

Azure KeyVault Get Secret API responds with 404 or 401 error

I am trying to get a secret out of Azure Key Vault. It is a very simple RESTful API call. For example, listing all secrets is as simple as this:
GET {vaultBaseUrl}/secrets?api-version=7.0
where vaultBaseUrl is provided in the Azure portal as the Vault DNS name.
I am using console mode for testing
https://learn.microsoft.com/en-us/rest/api/keyvault/getsecrets/getsecrets#code-try-0
But the return value is always 404.
When I try curl in the Azure console, it gives 401 - Unauthorized.
However I can use the command line to get the secret out.
Is there any secret to making the REST call and curl work to get the secret out? All these situations use the same credentials.
A side question: on the Microsoft API testing page there is a 'Request Preview' section with a green Run button, almost as if it is inviting you to run the API, but the link goes to learn.microsoft.com and the copy button on the box is disabled. I have never seen so many problems in one place, so I am thinking maybe I don't understand something here.
The doc does not seem to be correct. If you want to get the secret, you could use the client credentials flow to get an access token and use it to get the secret.
Follow the steps as below.
1. Register an app in Azure Active Directory, see this link. Get the application ID and key, see this link. Add the service principal in the Access policies of your key vault with the correct secret permissions (just search for the name of your AD App, then add it).
2. In Postman, send a request to the URL
POST https://login.microsoftonline.com/{your tenant id}/oauth2/token?api-version=1.0
Request body and complete sample (client_id and client_secret are the application ID and key from step 1; see the curl sketch at the end of this answer):
3. Copy the access_token from step 2, then use it as an Authorization token to call the API:
GET https://yourkeyvault.vault.azure.net/secrets?api-version=7.0
Besides, if you want to use curl to get the secret, try the one below. The TOKEN is the same as the access_token from step 2 above.
curl -X GET -H "Authorization: Bearer [TOKEN]" https://yourkeyvault.vault.azure.net/secrets?api-version=7.0
For more details about getting access_token via curl and complete steps, you could refer to this link. Don't forget to change the resource to https://vault.azure.net in the Request the Access Token step.
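For reference, a minimal curl sketch of the token request in step 2 (tenant ID, client_id, and client_secret are placeholders; note the resource parameter set to https://vault.azure.net):
curl -X POST \
  -d "grant_type=client_credentials" \
  -d "client_id=YOUR_APP_ID" \
  -d "client_secret=YOUR_APP_KEY" \
  -d "resource=https://vault.azure.net" \
  "https://login.microsoftonline.com/YOUR_TENANT_ID/oauth2/token"
Copy the access_token from the JSON response and use it as the Bearer token in the curl call above.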
Pass a Bearer token. There will be a URL to generate a token; pass it for authentication and then you will not get the error.