How to auto-generate a new Bearer Token in Postman for GCP Storage - REST

I am trying to upload a file from my local machine to a GCP bucket through the Cloud Storage REST API (https://storage.googleapis.com/upload/storage/v1/b) using Postman.
I am using a Bearer Token for authorization and running the $(gcloud auth print-access-token) command in the GCP Cloud Shell to generate that token every time.
I need to know how to auto-generate that token from Postman while sending the request.
Is there any way to execute $(gcloud auth print-access-token) every time as a Pre-request Script within Postman?
Thanks

I'm not very experienced with Postman, but I think you can run a pre-request script to get a token and reuse it in the subsequent request.
If so, you can take inspiration from the gcloud auth print-access-token command by adding the --log-http param to see the requests performed by the CLI and reproduce them in Postman.
EDIT 1
If you perform the request, you can see that a POST is made to this URL: https://oauth2.googleapis.com/token
To reproduce the call, you can try it with curl:
curl -X POST -d "grant_type=refresh_token&client_id=32555940559.apps.googleusercontent.com&client_secret=ZmssLNjJy2998hD4CTg2ejr2&refresh_token=<REFRESH_TOKEN>&scope=openid+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fappengine.admin+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcompute+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Faccounts.reauth" https://oauth2.googleapis.com/token
For this call, you need your REFRESH_TOKEN, which you can get here:
cat ~/.config/gcloud/legacy_credentials/<YOUR EMAIL>/adc.json
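For example, here is a rough shell sketch that pulls the refresh token out of that file with jq and exchanges it for an access token using the same client_id and client_secret as in the curl above (it assumes jq is installed and that the field names in adc.json match your gcloud version):

# Read the refresh token from gcloud's legacy credentials file
REFRESH_TOKEN=$(jq -r .refresh_token ~/.config/gcloud/legacy_credentials/<YOUR EMAIL>/adc.json)
# Exchange it for a short-lived access token and print only the token
curl -s -X POST \
  -d "grant_type=refresh_token" \
  -d "client_id=32555940559.apps.googleusercontent.com" \
  -d "client_secret=ZmssLNjJy2998hD4CTg2ejr2" \
  -d "refresh_token=${REFRESH_TOKEN}" \
  --data-urlencode "scope=https://www.googleapis.com/auth/cloud-platform" \
  https://oauth2.googleapis.com/token | jq -r .access_token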

Google Cloud Storage requires authentication like other Google APIs, and one way to authenticate is by providing a bearer token. These bearer tokens are short-lived and require regeneration.
There are 3 ways to generate bearer tokens so you can interact with the Google Storage API or other Google APIs from Postman:
Using the oauth2l CLI (manual regeneration of a new bearer token and update of the Authorization header with the new token)
The oauth2l CLI utility allows you to generate bearer tokens which can be pasted into the Authorization header in Postman. You can use it, for example, as sketched below.
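For instance (assuming you have installed oauth2l and have a credentials JSON file; check the oauth2l README for the exact flags available in your version):

# Print a bearer token for the cloud-platform scope
oauth2l fetch --credentials ~/credentials.json --scope cloud-platform
# Or print a ready-to-paste "Authorization: Bearer ..." header
oauth2l header --credentials ~/credentials.json --scope cloud-platform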
Configuration of Postman with OAuth 2 and user credentials (tokens can be managed via the Postman UI and expired ones cleaned up at the click of a button)
Postman can be configured to trigger the OAuth 2 flow and use the generated bearer token in all of the requests. Please make sure that all users have the correct permissions in the Google Cloud Platform project.
You will need to create OAuth 2 credentials in Google Cloud Console:
Go to APIs and Services
Then go to the Credentials tab
Click on Create Credentials
Select OAuth Client ID
Fill in the fields to create the OAuth Client ID (also add an Authorized redirect URI, although this doesn't need to resolve to anywhere).
The Client ID and Client Secret need to be saved on your machine.
Use Postman's environment variable functionality to use different credentials per environment/project. In Postman, create a new environment for your credentials using the cog icon at the top right.
Configure the following variables: AUTH_CALLBACK_URL, AUTH_URL, AUTH_CLIENT_ID, AUTH_CLIENT_SECRET, AUTH_ACCESS_TOKEN_URL, and AUTH_SCOPE.
AUTH_CALLBACK_URL should be identical to the Authorized redirect URI defined in the OAuth 2 Client ID creation menu, and AUTH_SCOPE should be one of the Cloud Storage OAuth scopes (e.g. https://www.googleapis.com/auth/devstorage.read_write).
Once defined, these variables can be used in your Authorization tab in Postman. This can be configured at the collection level, the folder level or even the individual request level.
To regenerate the token, you can go to the Authorization tab and click on GET NEW ACCESS TOKEN.
Configuration of Postman to use a pre-request script and service credentials (the pre-request script automatically regenerates the bearer token when it expires)
For this, please check this tutorial and follow the steps provided there.
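Whichever of the three methods you use, the resulting bearer token is sent the same way. For reference, a minimal upload call to the endpoint from the question would look roughly like this (the bucket name, object name, file path, and content type are placeholders):

# Upload a local file through the JSON API media upload endpoint
curl -X POST --data-binary @/path/to/local/file \
  -H "Authorization: Bearer <TOKEN>" \
  -H "Content-Type: application/octet-stream" \
  "https://storage.googleapis.com/upload/storage/v1/b/<BUCKET_NAME>/o?uploadType=media&name=<OBJECT_NAME>"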

Related

Getting Error 403 Trying to update PAT on Azure Devops

Based on the doc (https://learn.microsoft.com/en-us/rest/api/azure/devops/tokens/pats/update?view=azure-devops-rest-7.1&tabs=HTTP) provided by MS, I'm trying to update my token's expiration date through the API. I made a sample request using a full-access token to authenticate, passing the authorizationId of the token I want to update in the body:
My sample request using Postman
And it keeps returning a 403 error. I've checked my organization policies and tried adding or removing parameters from the body, but it didn't work.
I've also made another request to get the list of tokens in my organization with the same token authorization and API version, and that went well.
According to your screenshot, you are using Basic Auth with a PAT.
Please note that you must authenticate with an Azure AD token to use this API instead of a PAT. To call the API directly, you need to provide an Azure AD access token as a Bearer token in the Authorization header of your request. Please see Manage personal access tokens (PATs) using REST API and Q: Can I use basic auth with all Azure DevOps REST APIs? for details.
You can follow the steps below to get the AAD Bearer token:
Install the Azure Az PowerShell module.
Log in with a user account which has the required permission in your DevOps org (Owner or PCA) with the command Connect-AzAccount.
1.) Alternatively, log in from the Cloud Shell with the command Connect-AzAccount -UseDeviceAuthentication; you will see the following message:
2.) Then copy the URL https://microsoft.com/devicelogin, open it in a new tab, and enter the code (IVR7VRWJQ in this example) to authenticate.
3.) Log in with the Azure DevOps organization owner or another PCA account. After a successful login you will see the account info, then follow the steps below to get the Bearer Token.
Get the Bearer token:
$token = (Get-AzAccessToken -ResourceUrl "499b84ac-1321-427f-aa17-267ca6975798").Token
$token
Copy and use the token in a script or Postman to update the PAT.
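For reference, a rough curl sketch of the update call itself (the organization name, authorizationId, and expiry date are placeholders; double-check the HTTP method and body fields against the PATs - Update doc linked in the question):

# Update the PAT's expiration date using the AAD Bearer token obtained above
curl -X PUT \
  -H "Authorization: Bearer <AAD_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{"authorizationId": "<AUTHORIZATION_ID>", "validTo": "2025-12-31T00:00:00Z"}' \
  "https://vssps.dev.azure.com/<ORGANIZATION>/_apis/tokens/pats?api-version=7.1-preview.1"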

K8S Dashboard login with url

I'm running an EKS cluster and have installed the Kubernetes Dashboard, etc. All works fine; I can log in to the UI at
http://localhost:8001/api/v1/namespaces/kubernetes-dashboard/services/https:kubernetes-dashboard:/proxy/#/login
Is there a way for me to pass the token via the url so I won't need a human to do this?
Thanks!
Based on the official documentation, it is not possible to pass your authentication token in the URL.
As of release 1.7 Dashboard supports user authentication based on:
Authorization: Bearer <token> header passed in every request to Dashboard. Supported from release 1.6. Has the highest priority. If present, login view will not be shown.
Bearer Token that can be used on Dashboard login view.
Username/password that can be used on Dashboard login view.
Kubeconfig file that can be used on Dashboard login view.
As you can see, only the first option bypasses the Dashboard login view. So, what is Bearer Authentication?
Bearer authentication (also called token authentication) is an HTTP authentication scheme that involves security tokens called bearer tokens. The name “Bearer authentication” can be understood as “give access to the bearer of this token.” The bearer token is a cryptic string, usually generated by the server in response to a login request. The client must send this token in the Authorization header when making requests to protected resources:
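Authorization: Bearer <token>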
You can find more information about Bearer authentication here.
The question now is how you can include the authentication header in your request. There are many ways to achieve this:
curl command - example:
curl -H "Authorization: Bearer <TOKEN_VALUE>" <https://address-your-dashboard>
Postman application - here is a good answer on setting up the authorization header, with screenshots.
reverse proxy - you can achieve this, e.g., by configuring a reverse proxy in front of the Dashboard. The proxy will be responsible for authentication with the identity provider and will pass the generated token in a request header to the Dashboard. Note that the Kubernetes API server needs to be configured properly to accept these tokens. You can read more about it here. You should know that this method is potentially insecure due to man-in-the-middle attacks when you are using plain HTTP.
You can also read the very good answers to the question of how to sign in to the Kubernetes Dashboard.
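As a concrete example of the curl option above, here is a sketch that mints a token and sends it in the header (it assumes kubectl >= 1.24 and a service account named kubernetes-dashboard in the kubernetes-dashboard namespace; adjust the names for your cluster):

# Create a short-lived token for the dashboard's service account
TOKEN=$(kubectl -n kubernetes-dashboard create token kubernetes-dashboard)
# Pass it as a Bearer token in the request to the Dashboard
curl -k -H "Authorization: Bearer ${TOKEN}" https://<address-of-your-dashboard>/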

How to obtain the authorization code required for User Credentials from the cURL command line

I am trying to use GCS "User Credentials" to connect to Google Cloud Storage using the libcurl library.
"User Credentials" authentication needs a Client ID and Secret key to connect to GCS, but in this process an authorization code also needs to be generated.
I need to generate this authorization code using cURL.
Can anyone help me?
The Client ID you mentioned is the same as the Authentication ID and can only be generated from either the Cloud Console's Credentials Page or via the OAuth 2.0 Playground.
If you are trying to generate an Access Token (OAUTH2_TOKEN), you will need to complete an authentication flow to authorize requests as a user. Cloud Storage uses OAuth 2.0 for API authentication and authorization.
Here's what you need to do to get an authorization access token from the OAuth 2.0 Playground:
Select & authorize APIs (Cloud Storage)
Select the scope for the APIs you would like to access or input your own OAuth scopes, e.g.: https://www.googleapis.com/auth/devstorage.read_write
Then click the "Authorize APIs" button
Once you've got the authorization code, click the "Exchange authorization code for tokens" button; you will get a refresh token and an access token, which is required to access OAuth-protected resources.
Grab the Access Token to use in your cURL command
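If you prefer to do the exchange from the command line instead of clicking the button in the Playground, the authorization code can be traded for tokens with a plain cURL call (the client ID, client secret, authorization code, and redirect URI are placeholders for your own values):

# Exchange the authorization code for an access token and a refresh token
curl -s -X POST https://oauth2.googleapis.com/token \
  -d "code=<AUTH_CODE>" \
  -d "client_id=<CLIENT_ID>" \
  -d "client_secret=<CLIENT_SECRET>" \
  -d "redirect_uri=<REDIRECT_URI>" \
  -d "grant_type=authorization_code"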
Then configure your request to Cloud Storage API by constructing your HTTP request like so (upload):
curl -X POST --data-binary @[OBJECT_LOCATION] \
-H "Authorization: Bearer [OAUTH2_TOKEN]" \
-H "Content-Type: [OBJECT_CONTENT_TYPE]" \
"https://www.googleapis.com/upload/storage/v1/b/[BUCKET_NAME]/o?uploadType=media&name=[OBJECT_NAME]"
You can have a look at this Cloud Storage upload example in our public docs to guide you in constructing a request and testing it out.
Hope this helps.

Azure KeyVault Get Secret API responds with 404 or 401 error

I am trying to get a secret out of Azure Key Vault. It is a very simple RESTful API call. For example, listing all secrets is as simple as this:
GET {vaultBaseUrl}/secrets?api-version=7.0
where vaultBaseUrl is provided in Azure Console as Vault DNS name.
I am using console mode for testing
https://learn.microsoft.com/en-us/rest/api/keyvault/getsecrets/getsecrets#code-try-0
But the return value is always 404.
When I try curl in the Azure console, it gives 401 - Unauthorized.
However I can use the command line to get the secret out.
Is there any secret to making the RESTful call and curl work to get the secret out? All these situations use the same credentials.
A side question: on the Microsoft API testing page there is a 'Request Preview' section with a green Run button, almost as if it is inviting you to run the API, but the link just points to learn.microsoft.com and the copy button on the box is disabled. I have never seen so many problems in one place, so I am thinking maybe I don't understand something here.
The doc seems not to be correct. If you want to get the secret, you can use the client credentials flow to get an access token and use it to get the secret.
Follow the steps below.
1. Register an app in Azure Active Directory, see this link. Get the application ID and key, see this link. Add the service principal to the Access policies of your Key Vault with the correct secret permission (just search for the name of your AD App and add it).
2. In Postman, send a request to the URL
POST https://login.microsoftonline.com/{your tenant id}/oauth2/token?api-version=1.0
Request body and complete sample (client_id and client_secret are the application ID and key from step 1; a command-line version of this call is sketched after these steps):
3. Copy the access_token from step 2, then use it as the Authorization token to call the API:
GET https://yourkeyvault.vault.azure.net/secrets?api-version=7.0
Besides, if you want to use cURL to get the secret, try something like the one below. TOKEN is the same as the access_token from step 2 above.
curl -X GET -H "Authorization: Bearer [TOKEN]" https://yourkeyvault.vault.azure.net/secrets?api-version=7.0
For more details about getting access_token via curl and complete steps, you could refer to this link. Don't forget to change the resource to https://vault.azure.net in the Request the Access Token step.
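Putting the steps together, the whole flow from the command line looks roughly like this (the tenant ID, application ID, key, and vault name are placeholders, and jq is assumed to be installed):

# 1. Get an access token with the client credentials flow
TOKEN=$(curl -s -X POST "https://login.microsoftonline.com/<TENANT_ID>/oauth2/token" \
  -d "grant_type=client_credentials" \
  -d "client_id=<APPLICATION_ID>" \
  -d "client_secret=<KEY>" \
  -d "resource=https://vault.azure.net" | jq -r .access_token)
# 2. List the secrets in the vault using the token
curl -s -H "Authorization: Bearer ${TOKEN}" "https://<YOUR_VAULT>.vault.azure.net/secrets?api-version=7.0"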
Pass a Bearer token. There is a URL to generate a token; pass it for authentication and then you will not get the error.

How to properly authorize request to Google Cloud Storage API?

I am trying to use the Google Cloud Storage JSON API to retrieve files from a bucket using HTTP calls.
I am curling from a container in GCE within the same project as the storage bucket, and the service account has read access to the bucket.
Here is the pattern of the requests:
https://storage.googleapis.com/{bucket}/{object}
According to the API console, I don't need anything in particular, as the service account provides Application Default Credentials. However, I keep getting this:
Anonymous caller does not have storage.objects.get
I also tried to create an API key for the project and appended it to the URL (https://storage.googleapis.com/{bucket}/{object}?key={key}), but I still got the same 401 error.
How can I authorize requests to query this API?
The URL that you are using is not correct. The API uses a URL that starts with https://www.googleapis.com/storage/v1/b.
Using API keys is not recommended. Instead, you should use a Bearer token. I will show both methods.
To get an access token for the gcloud default configuration:
gcloud auth print-access-token
Then use the token in your curl request. Replace TOKEN with the token from the gcloud command.
To list buckets (replace PROJECT_ID with your project ID; the Buckets: list call requires the project parameter):
curl -s -H "Authorization: Bearer TOKEN" "https://www.googleapis.com/storage/v1/b?project=PROJECT_ID"
curl "https://www.googleapis.com/storage/v1/b?project=PROJECT_ID&key=APIKEY"
To list objects:
curl -s -H "Authorization: Bearer TOKEN" https://www.googleapis.com/storage/v1/b/examplebucket/o
curl https://www.googleapis.com/storage/v1/b/examplebucket/o?key=APIKEY
API Reference: List Buckets
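Since the question is about retrieving files, note that the same pattern also downloads an object's contents when you add alt=media (the bucket and object names below are placeholders, and the object name must be URL-encoded):

curl -s -H "Authorization: Bearer TOKEN" \
  "https://www.googleapis.com/storage/v1/b/examplebucket/o/exampleobject?alt=media"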
If you are able to create another cluster, you can obtain the permission like this:
Click on "Advanced edit"
Then click on "Allow full access to all Cloud APIs"
And that's it :D