Upload files in Drone Pipeline to Google Cloud Storage using gcs plugin

I have a Drone CI/CD pipeline that builds an npm project which I want to upload to Google Cloud Storage (GCS). I found a Drone GCS plugin which seems to be able to do this.
But I don't know what to use for the token parameter. The documentation says: "credentials to access Google Cloud Storage".
I have created a service account and downloaded the JSON key for it. My first attempt was to use the base64-encoded JSON (as is done with the App Engine plugin), but this failed with this error:
failed to authenticate token: invalid character 'e' looking for beginning of value
Is this an OAuth2 token? How can I create a token so that Drone CI can upload the files to my bucket?

I see the GCS plugin is broken :( but there is another plugin, Google Cloud Auth, that lets you pass the service account JSON as a string secret and then activates service-account-based auth.
You can then mount ~/.config/gcloud into all the steps that need it and run the required gcloud tasks. For an example, check https://plugins.drone.io/plugins/google-cloud-run, which uses this method.
I hope this helps you.
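For concreteness, a minimal sketch of such a pipeline. The auth plugin's image name and its json_key setting key are assumptions (check the plugin listing for the exact values); the secret name, bucket, and build output directory are placeholders:

kind: pipeline
type: docker
name: deploy

steps:
  - name: gcloud-auth
    image: <google-cloud-auth plugin image>   # assumption: take the exact image from the plugin page
    settings:
      json_key:                               # assumed setting name for the SA JSON
        from_secret: google_credentials       # the SA JSON stored as a Drone string secret
    volumes:
      - name: gcloud-config
        path: /root/.config/gcloud

  - name: upload
    image: google/cloud-sdk:alpine
    commands:
      - gsutil cp -r ./dist gs://<your-bucket>   # placeholder bucket and build output
    volumes:
      - name: gcloud-config
        path: /root/.config/gcloud

volumes:
  - name: gcloud-config
    temp: {}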

Related

Google Cloud Composer Environment Setup Error: Connect to Google Cloud Storage

I am trying to create an environment in Google Cloud Composer.
When creating the environment from scratch and selecting all the default fields, the following error appears:
CREATE operation on this environment failed 22 hours ago with the following error message:
CREATE operation failed. Composer Agent failed with: Cloud Storage Assertions Failed: Unable to write to GCS bucket.
GCS bucket write check failed.
I then created a Google Cloud Storage bucket within the same project to see if that would help, but the same error still appears.
Has anyone been able to successfully create a Google Cloud Composer environment? If so, please provide guidance on why this error message continues to appear.
Update: it seems I need to update permissions to allow access, but my permissions page is not editable.
It seems like you haven't granted the required IAM policies to the service account. I would advise you to read more about IAM policies in the Google Cloud documentation.
When it comes to bucket permissions, a role like Storage Object Admin might fit your needs.
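For example, granting that role to the environment's service account could look like this with gcloud (the project name and service account email are placeholders):

gcloud projects add-iam-policy-binding <your-project> \
    --member="serviceAccount:<composer-environment-service-account>" \
    --role="roles/storage.objectAdmin"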

Meteor-Files integration with Google Storage blocks user publication

I have a Meteor-React app locally. I added Meteor-Files with Google Cloud Storage integration following this guide:
https://github.com/VeliovGroup/Meteor-Files/blob/master/docs/google-cloud-storage-integration.md
As Google changed the documentation, I followed this npm package:
https://www.npmjs.com/package/@google-cloud/storage
Then I followed this link and created a key file:
https://cloud.google.com/sdk/gcloud/reference/iam/service-accounts/keys/create#--iam-account
and finally created the storage client:
const { Storage } = require('@google-cloud/storage');
const gcs = new Storage({
  keyFilename: `${Meteor.rootPath}/assets/app/gntapp-3fb3c-storage.json`
});
and activated it from the gcloud terminal:
gcloud auth activate-service-account --key-file [KEY_FILE]
I uploaded the first image to storage.
Now my problem is that my users publication does not work and users are not published.
When I do:
console.log(Meteor.users.find({}).fetch())
in Meteor.startup, I get bucket information (kind: 'storage#bucket') instead of user documents,
but login with username and password works as normal.
Can anybody help me figure out the issue?
I use METEOR@1.8.1.

Google Cloud Storage 500 Internal Server Error 'Google::Cloud::Storage::SignedUrlUnavailable'

Trying to get Google Cloud Storage working on my app. I successfully saved an image to a bucket, but when trying to retrieve the image, I receive this error:
GCS Storage (615.3ms) Generated URL for file at key: 9A95rZATRKNpGbMNDbu7RqJx ()
Completed 500 Internal Server Error in 618ms (ActiveRecord: 0.2ms)
Google::Cloud::Storage::SignedUrlUnavailable (Google::Cloud::Storage::SignedUrlUnavailable):
Any idea what's going on? I can't find an explanation for this error in the documentation.
To provide some explanation here...
Google App Engine (as well as Google Compute Engine, Kubernetes Engine, and Cloud Run) provides "ambient" credentials associated with the VM or instance being run, but only in the form of OAuth tokens. For most API calls, this is sufficient and convenient.
However, there are a small number of exceptions, and Google Cloud Storage is one of them. Recent Storage clients (including the google-cloud-storage gem) may require a full service account key to support certain calls that involve signed URLs. This full key is not provided automatically by App Engine (or other hosting environments); you need to provide one yourself.
So, as a previous answer indicated, if you're using Cloud Storage, you may not be able to depend on the "ambient" credentials. Instead, you should create a service account, download a service account key, and make it available to your app (for example, via the ActiveStorage configs, or by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable).
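For example, a minimal way to point the client libraries at the downloaded key, outside of any framework config (the path is a placeholder):

export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json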
I was able to figure this out. I had been following the Rails guide on Active Storage with Google Cloud Storage, and was unclear on how to generate my credentials file.
google:
  service: GCS
  credentials: <%= Rails.root.join("path/to/keyfile.json") %>
  project: ""
  bucket: ""
Initially, I thought I didn't need a keyfile due to this sentence in Google's Cloud Storage authentication documentation:
If you're running your application on Google App Engine or Google Compute Engine, the environment already provides a service account's authentication information, so no further setup is required.
(I am using Google App Engine)
So I commented out the credentials line and started testing. Strangely, I was able to write to Google Cloud Storage without issue. However, when retrieving the image I would receive the 500 server error Google::Cloud::Storage::SignedUrlUnavailable.
I fixed this by generating my private key and adding it to my Rails app.
Another possible solution, as of google-cloud-storage gem version 1.27 (August 2020), is documented in the gem's own docs. My Google::Auth.get_application_default call, as shown in that documentation, returned an empty object, but using Google::Cloud::Storage::Credentials.default.client instead worked.
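For illustration, a rough sketch of that workaround (here file stands for a Google::Cloud::Storage::File you already looked up; the method and expiry values are arbitrary):

creds = Google::Cloud::Storage::Credentials.default.client
url = file.signed_url method: "GET",
                      expires: 300,
                      issuer: creds.issuer,
                      signing_key: creds.signing_key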
If you get a Google::Apis::ClientError: badRequest: Request contains an invalid argument response when signing, check that you have a dash as the project name in the signing URL (i.e. projects/-/serviceAccounts; an explicit project name in the path is deprecated and no longer valid), and that your "issuer" string is correct: the full email address identifier of the service account, not just the service account name.
If you get Google::Apis::ClientError: forbidden: The caller does not have permission, verify the roles your service account has:
gcloud projects get-iam-policy <project-name> \
    --filter="bindings.members:<sa_name>" \
    --flatten="bindings[].members" --format='table(bindings.role)'
=> ROLE
roles/iam.serviceAccountTokenCreator
roles/storage.admin
serviceAccountTokenCreator is required to call the signBlob service, and you need storage.admin to have ownership of the thing you need to sign. I think these are project-global rights; I couldn't get it to work with more fine-grained permissions, unfortunately (i.e. one app being admin for a specific Storage bucket).
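If either role is missing, it can be granted like this (same placeholders as the command above, with <sa_name> being the service account's full email address):

gcloud projects add-iam-policy-binding <project-name> \
    --member="serviceAccount:<sa_name>" \
    --role="roles/iam.serviceAccountTokenCreator"
gcloud projects add-iam-policy-binding <project-name> \
    --member="serviceAccount:<sa_name>" \
    --role="roles/storage.admin"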

How can an external app access IBM Cloud Object Storage

I have an IBM COS service and am able to use curl via the CLI to retrieve objects, authenticating with IAM tokens. But how do I let an external web app (e.g. Node) access this service?
What value should go in the authorization for external app access?
External apps will come in the form of something like the AWS CLI, or any other app that uses an HTTP library coupled with the IBM Cloud Object Storage API, or even an SDK for languages like Python, Java, or Node.js.
All of the above will ask you for an access key and a secret key.
You can get both of them from the IBM Cloud console by generating new HMAC Credentials [1]:
Navigate to your Cloud Object Storage account
Click Service credentials in the menu on the left
Click the New credential button on the right
For the "Add Inline Configuration Parameters (Optional)" text box enter the following JSON:
{"HMAC":true}
[1] https://console.bluemix.net/docs/services/cloud-object-storage/iam/service-credentials.html#service-credentials
Well, you could use the ibm-cos-sdk Node library: https://www.npmjs.com/package/ibm-cos-sdk. You'll need to use your HMAC credentials:
var AWS = require('ibm-cos-sdk'); // the package linked above

var config = {
    endpoint: '<endpoint>',
    ibmAuthEndpoint: 'https://iam.ng.bluemix.net/oidc/token',
    serviceInstanceId: '<resource-instance-id>',
    accessKeyId: '<HMAC access_key>',
    secretAccessKey: '<HMAC secret access key>'
};

var cos = new AWS.S3(config);
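From there the client exposes the familiar S3-style API. For example, a minimal read of an object (the bucket and key names are hypothetical):

cos.getObject({ Bucket: 'my-bucket', Key: 'my-object.txt' }).promise()
    .then(function (data) {
        console.log(data.Body.toString()); // object contents
    })
    .catch(console.error);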

PowerShell for Google Cloud: Authenticate with a service account

I'm trying to build an automatic sync solution that uses a Google Cloud storage bucket for storing data.
When I install the Cloud SDK it asks for my authentication, but obviously I don't want to use my own credentials on the client's server; it should be done with a service account with specific permissions, right?
The documentation just says to authenticate with your credentials. What is the security best practice here?
Found it, it's this simple command:
gcloud auth activate-service-account --key-file=credentials.json
And it works! I can upload stuff with PowerShell.
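For the upload itself, a sketch using the Cloud Tools for PowerShell cmdlets, which pick up the credentials activated above (the bucket name and file path are placeholders):

New-GcsObject -Bucket "my-sync-bucket" -File "C:\data\backup.zip"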