I am trying to fetch secret values such as grant_type, client_id, and client_secret from AWS in my Scala code, using the Maven dependencies. I found a link showing how to connect to a region and retrieve the secret value like this:
SecretManager(Regions."regionFromAWS")
.retrieveSecretValue("SecretName")
However, it returns a Task[Secret], from which the values need to be extracted, and I am not sure how to do that.
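If the Task here is a Monix Task, you don't pull the value out directly; you either transform it inside the Task or run the Task and wait for its result. A minimal sketch, assuming Monix is on the classpath and that Secret exposes the raw secret string through something like a value accessor (that accessor name and the region constant are placeholders, check your library's Secret type):
import monix.execution.Scheduler.Implicits.global
import scala.concurrent.duration._

val secretTask = SecretManager(Regions.EU_WEST_1) // substitute your region
  .retrieveSecretValue("SecretName")

// Option 1: stay asynchronous and transform the value inside the Task
val clientIdTask = secretTask.map { secret =>
  // secret.value is a placeholder for however your Secret type
  // exposes the raw secret string (often a JSON document)
  secret.value
}

// Option 2: block at the edge of the application and get the value synchronously
val secretResult = secretTask.runSyncUnsafe(10.seconds)
runSyncUnsafe needs the implicit Scheduler imported above and should only be used at the boundary of the program; inside the rest of the code, composing with map/flatMap is the idiomatic approach.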
I have a Drone CI/CD pipeline that builds an npm project which I want to upload to Google Cloud Storage (GCS). I found a Drone GCS plugin which seems to be able to do that.
But I don't know what to use for the token parameter. The documentation says: "credentials to access Google Cloud Storage".
I have created a ServiceAccount and downloaded the JSON for it. My first attempt was to use the base64-encoded JSON (as done with the App Engine plugin), but this failed with this error:
failed to authenticate token: invalid character 'e' looking for beginning of value
Is this an OAuth2 token? How can I create a token so that Drone CI can upload the files to my bucket?
I see the GCS plugin is broken :(, but I have added another plugin, Google Cloud Auth, which lets you pass the service account JSON as a string secret and then activate service-account-based auth with it.
You can then mount ~/.config/gcloud in all the required steps and run the required gcloud tasks. For an example, check https://plugins.drone.io/plugins/google-cloud-run, which uses this method.
I hope this helps you.
I am trying to authenticate a Java app with the Google Photos API using my own account, which doesn't support service_account as per the documentation. The problem is that the file generated by the Google console for OAuth2 authentication doesn't contain a type field; only the file generated by creating credentials for a service account seems to work.
I tried authenticating the web app through gcloud auth application-default login, with and without reading the generated file, which does contain a type/client_id/client_secret/refresh_token.
E.g.:
PhotosLibrarySettings settings =
PhotosLibrarySettings.newBuilder()
.setCredentialsProvider(FixedCredentialsProvider.create(
GoogleCredentials.fromStream(
new FileInputStream("credentials.json")
)))
.build();
Any ideas that might help?
I had the same issue when trying to use the Google Calendar API.
As explained here, your JSON file is invalid. He explains how to generate the correct JSON using the "Service Account key" credential.
I don't know if this solves your problem, but it didn't solve mine. This type of credential requires a G Suite account, which wasn't what I was looking for. I was using GoogleCredentials for OAuth2 authentication, and in my JSON file the only things it requires are type, client_id, client_secret, and refresh_token.
The type field accepts "service_account" (for G Suite) and "authorized_user" (for OAuth2).
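For reference, the same "authorized_user" credential can also be built programmatically with google-auth-library instead of reading credentials.json. A minimal sketch (written in Scala here, but the builder calls are identical in Java; the values are placeholders taken from your own OAuth client and consent flow):
import com.google.auth.oauth2.UserCredentials

// Programmatic equivalent of a credentials.json with type = "authorized_user"
// containing client_id, client_secret and refresh_token (placeholder values below)
val credentials = UserCredentials.newBuilder()
  .setClientId("1234567890.apps.googleusercontent.com")
  .setClientSecret("your-client-secret")
  .setRefreshToken("your-refresh-token")
  .build()
The resulting credentials can be passed to FixedCredentialsProvider.create(...) just like the GoogleCredentials.fromStream(...) result in the code above.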
I'm using the Serverless framework, and I want to be able to reference my API Gateway URL in my acceptance tests.
My test environment is regularly destroyed and then recreated, so hardcoding a URL into the tests is not possible.
I can see there are ways to reference API Gateway as an AWS environment variable, but this doesn't help me get the URL locally for my tests.
I was hoping that the CloudFormation output would be referenced in the .serverless package and accessible via JSON, but this doesn't seem to be the case.
Any idea how I can reference the API Gateway URL in my acceptance test files?
NOTE: These tests need to be run on AWS, not using a local server to mimic API Gateway
The serverless-plugin-test-helper plugin can help here. It will generate a YAML file containing all of the outputs of your stack. This includes a couple of standard ones - the S3 bucket that was used (ServerlessDeploymentBucketName) and the base service endpoint (ServiceEndpoint).
If you are using Node and have your tests in the same directory as the stack being tested then there's also a module to read this file. Otherwise, it's just standard YAML and you can use whatever tools are convenient.
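If you would rather not depend on the plugin's YAML file, the same stack outputs can also be fetched directly from CloudFormation when the tests start. A hedged sketch using the AWS SDK v2 (the stack name below is a placeholder; use the name of your deployed Serverless stack):
import software.amazon.awssdk.services.cloudformation.CloudFormationClient
import software.amazon.awssdk.services.cloudformation.model.DescribeStacksRequest
import scala.jdk.CollectionConverters._

val cfn = CloudFormationClient.create()

// "my-service-dev" is a placeholder for your stack's name
val request = DescribeStacksRequest.builder().stackName("my-service-dev").build()
val outputs = cfn.describeStacks(request).stacks().asScala.head.outputs().asScala

// ServiceEndpoint is the output holding the base API Gateway URL
val apiUrl = outputs
  .find(_.outputKey == "ServiceEndpoint")
  .map(_.outputValue)
  .getOrElse(sys.error("ServiceEndpoint output not found"))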
Consider adding an API Gateway custom domain for your API. You can then use a known DNS name for your acceptance tests.
You will need to add an API Gateway base path mapping, an API Gateway domain name, and a Route 53 record set to the resources section of your serverless.yml.
Detect the Google Cloud project ID from a container in a Google-hosted Kubernetes cluster.
When connecting to Bigtable, I need to provide the Google project ID. Is there a way to detect this automatically from within K8s?
In Python, you can find the project id this way:
import google.auth
_, PROJECT_ID = google.auth.default()
The original question didn't mention what programming language was being used, and I had the same question for Python.
You can use the metadata service. Example:
curl -H "Metadata-Flavor: Google" -w '\n' http://metadata.google.internal/computeMetadata/v1/project/numeric-project-id
This will work from any VM running on Google Compute Engine or Container Engine.
See https://cloud.google.com/compute/docs/storing-retrieving-metadata:
Google Compute Engine defines a set of default metadata entries that provide information about your instance or project. Default metadata is always defined and set by the server.
...
numeric-project-id: The numeric project ID of the instance, which is not the same as the project name visible in the Google Cloud Platform Console. This value is different from the project-id metadata entry value.
project-id: The project ID.
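From JVM code inside the container, the same lookup is just an HTTP GET with that header; a minimal sketch in Scala using only the standard library:
import java.net.{HttpURLConnection, URL}
import scala.io.Source

// Ask the metadata server for the textual project ID
// (use .../numeric-project-id for the numeric one, as in the curl example above)
val url = new URL("http://metadata.google.internal/computeMetadata/v1/project/project-id")
val conn = url.openConnection().asInstanceOf[HttpURLConnection]
conn.setRequestProperty("Metadata-Flavor", "Google")

val projectId =
  try Source.fromInputStream(conn.getInputStream).mkString.trim
  finally conn.disconnect()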
Google has some libraries for this too: ServiceOptions.getDefaultProjectId
https://googleapis.github.io/google-cloud-java/google-cloud-clients/index.html
https://github.com/googleapis/google-cloud-java/blob/master/google-cloud-clients/google-cloud-core/src/main/java/com/google/cloud/ServiceOptions.java
https://github.com/googleapis/google-cloud-java/tree/master/google-cloud-clients/google-cloud-core
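A short usage sketch of the ServiceOptions helper mentioned above, with google-cloud-core on the classpath:
import com.google.cloud.ServiceOptions

// Resolves the project ID from the environment (e.g. GOOGLE_CLOUD_PROJECT,
// the credentials file, or the GCE/GKE metadata server)
val projectId = ServiceOptions.getDefaultProjectId()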
I am trying to configure iReport from Jaspersoft with BigQuery. I am following every step posted on the Internet, but they don't work.
The specific steps I am following to configure this are from here.
The main step is to create a Project in BigQuery which gives you 3 main things:
the Project ID
the Client ID
the Client Secret
With these parameters you can create the JDBC URL explained here, which is required to connect iReport with BigQuery.
BUT, when you get to the part where you only have to enter the credentials (username and password) in iReport's "New Database JDBC Connection" dialog, I can't use my Client Secret: iReport requests the <path to key file>, which I think is a *.p12 file, because iReport gives this error:
QL problems: java.io.IOException: toDerInputStream rejects tag type ##
Please help either with creating this *.p12 file to see what happens, or by giving me the exact steps to make this configuration work, if any of you have configured this correctly.
Thank You.
The *.p12 file is used when you want to connect with a ServiceAccount.
Please go to the Google APIs Console and select your project.
On the Dashboard you will find your Project ID.
To obtain the username and password, please go to the API Access menu and select Create another client ID. From here you either select Service account or Installed application.
If you choose Service account, you'll get the .p12 file and the username.
If you choose Installed application, select Other, press Create Client ID, and you then have your Client ID and Client Secret.
Hopefully this will solve your problem, if not, then feel free to ask.