Accessing media files in Google Cloud Storage from Dialogflow fulfillment - google-cloud-storage

How can we access media files stored in Cloud Storage from Dialogflow?
I am able to access the files if I make them public, but what are the other options?
Is it supposed to work if we grant access to the service (Dialogflow and/or Firebase) accounts?
// https://developers.google.com/actions/assistant/responses#media_responses
const { MediaObject, Image } = require('actions-on-google');

// Create a media response
conv.ask(new MediaObject({
  name: track.title,
  url: track.source,
  description: track.artist,
  icon: new Image({
    url: track.image,
    alt: 'Media icon'
  })
}));

No, granting access to the service accounts that your Action runs under is insufficient. Your Action runs in the cloud, but the audio file is downloaded directly by the user's device.
One technique that should work is to use a Signed URL. This lets you keep access restrictions on the Cloud Storage bucket, but generate a URL that grants access to the file for a limited time.
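A minimal sketch of that approach, assuming a Node.js fulfillment and the @google-cloud/storage client (the bucket and file names below are placeholders, not taken from the question):

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// Generate a V4 signed URL that allows GET access for 15 minutes.
// 'my-media-bucket' and 'tracks/intro.mp3' are hypothetical names.
async function getSignedTrackUrl() {
  const [signedUrl] = await storage
    .bucket('my-media-bucket')
    .file('tracks/intro.mp3')
    .getSignedUrl({
      version: 'v4',
      action: 'read',
      expires: Date.now() + 15 * 60 * 1000,
    });
  return signedUrl;
}

// The result can then be passed as the MediaObject url in the fulfillment,
// e.g. url: await getSignedTrackUrl() instead of track.source.

Whatever credentials the fulfillment runs with still need permission to read the object and to sign the URL (for example a service account key file), which is what the related questions below run into.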

Related

Can a Google signed URL generated in one Google project upload a bucket object into another Google project?

I have used the Google documentation (C# code sample) to upload data to a Google Storage bucket, but I do not want the upload to go to a bucket in another/different project.
e.g. if the service account used to generate the signed URL belongs to the Google project "gcp-project-signed-data", then the generated signed URL should only be able to upload data to "gcp-project-signed-data" and not to any other project; if the destination project is different, it should fail with some access-related error message.
How can we achieve this?
Two things:
Bucket names are in a global namespace: if your signed URL is for my-bucket, there is no other bucket with that name in any other project:
https://cloud.google.com/storage/docs/naming-buckets#considerations
The signed URL reflects the permissions of the original (signing) service account. If the service account does not have permissions to write in other-bucket then any signed URLs created with that account cannot write to other-bucket either.
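To illustrate (a Node.js sketch with placeholder names; the question itself uses C#): a signed URL is minted for one specific bucket and object, and uploading through it only works if the signing service account can write to that bucket.

// Sketch only: bucket, object, and key file names are assumptions.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage({ keyFilename: 'service.json' });

async function signedUploadUrl() {
  const [url] = await storage
    .bucket('gcp-project-signed-data-bucket')   // bucket in the signing project
    .file('uploads/data.bin')
    .getSignedUrl({
      version: 'v4',
      action: 'write',
      expires: Date.now() + 15 * 60 * 1000,
      contentType: 'application/octet-stream',
    });
  // A PUT to this URL (with the matching Content-Type header) can only land in
  // gcp-project-signed-data-bucket; it cannot be redirected to a bucket the
  // signing account has no write access to.
  return url;
}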

Signed URL created by Google Cloud Storage Python library blob.generate_signed_url method gets "Access Denied" error

I am trying to create a signed URL for a private object stored in cloud storage.
The storage client is being created using a service account that has the Storage Admin role:
import datetime

from google.cloud import storage

storage_client = storage.Client.from_service_account_json('service.json')
bucket = storage_client.bucket(bucket_name)  # bucket_name defined elsewhere
blob = bucket.blob(blob_name)                # blob_name defined elsewhere

url = blob.generate_signed_url(
    version="v4",
    # This URL is valid for 15 minutes
    expiration=datetime.timedelta(minutes=15),
    # Allow GET requests using this URL.
    method="GET",
)
This generates a URL that when accessed via a browser gives this error:
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.</Details>
</Error>
What am I missing here? The service account has no problem interacting with the bucket or blob normally - I can download it/etc. It's just the Signed URL that doesn't work. I can make the object public and then download it - but that defeats the purpose of being able to generate a signed URL.
All of the other answers I've found seem to focus on issues using application default credentials or are very old examples from the v2 API.
Clearly there's something about how I'm using the service account - do I need to explicitly give it permissions on that particular object? Is the Storage Admin role not enough in this context?
Going crazy with this. Please help!

Google Cloud Storage 500 Internal Server Error 'Google::Cloud::Storage::SignedUrlUnavailable'

Trying to get Google Cloud Storage working on my app. I successfully saved an image to a bucket, but when trying to retrieve the image, I receive this error:
GCS Storage (615.3ms) Generated URL for file at key: 9A95rZATRKNpGbMNDbu7RqJx ()
Completed 500 Internal Server Error in 618ms (ActiveRecord: 0.2ms)
Google::Cloud::Storage::SignedUrlUnavailable (Google::Cloud::Storage::SignedUrlUnavailable):
Any idea of what's going on? I can't find an explanation for this error in their documentation.
To provide some explanation here...
Google App Engine (as well as Google Compute Engine, Kubernetes Engine, and Cloud Run) provides "ambient" credentials associated with the VM or instance being run, but only in the form of OAuth tokens. For most API calls, this is sufficient and convenient.
However, there are a small number of exceptions, and Google Cloud Storage is one of them. Recent Storage clients (including the google-cloud-storage gem) may require a full service account key to support certain calls that involve signed URLs. This full key is not provided automatically by App Engine (or other hosting environments). You need to provide one yourself. So as a previous answer indicated, if you're using Cloud Storage, you may not be able to depend on the "ambient" credentials. Instead, you should create a service account, download a service account key, and make it available to your app (for example, via the ActiveStorage configs, or by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable).
I was able to figure this out. I had been following the Rails guide on Active Storage with Google Cloud Storage, and was unclear on how to generate my credentials file.
google:
  service: GCS
  credentials: <%= Rails.root.join("path/to/keyfile.json") %>
  project: ""
  bucket: ""
Initially, I thought I didn't need a keyfile due to this sentence in Google's Cloud Storage authentication documentation:
If you're running your application on Google App Engine or Google Compute Engine, the environment already provides a service account's authentication information, so no further setup is required.
(I am using Google App Engine)
So I commented out the credentials line and started testing. Strangely, I was able to write to Google Cloud Storage without issue. However, when retrieving the image I would receive the 500 server error Google::Cloud::Storage::SignedUrlUnavailable.
I fixed this by generating my private key and adding it to my rails app.
Another possible solution as of google-cloud-storage gem version 1.27 in August 2020 is documented here. My Google::Auth.get_application_default as in the documentation returned an empty object, but using Google::Cloud::Storage::Credentials.default.client instead worked.
If you get a Google::Apis::ClientError: badRequest: Request contains an invalid argument response when signing, check that you use a dash for the project in the signing URL (i.e. projects/-/serviceAccounts; an explicit project name in the path is deprecated and no longer valid), and that the "issuer" string is correct: it must be the full email address of the service account, not just the service account name.
If you get Google::Apis::ClientError: forbidden: The caller does not have permission, verify the roles your service account has:
gcloud projects get-iam-policy <project-name> \
  --filter="bindings.members:<sa_name>" \
  --flatten="bindings[].members" \
  --format='table(bindings.role)'
=> ROLE
roles/iam.serviceAccountTokenCreator
roles/storage.admin
serviceAccountTokenCreator is required to call the signBlob service, and you need storage.admin to have ownership of the object you need to sign. I think these are project-wide rights; I couldn't get it to work with more fine-grained permissions, unfortunately (i.e. one app being admin for only a certain Storage bucket).
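For reference, granting the token creator role to the service account can be done with something like the following (project name and service account email are placeholders):

gcloud projects add-iam-policy-binding <project-name> \
  --member="serviceAccount:<sa_email>" \
  --role="roles/iam.serviceAccountTokenCreator"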

How can an external app access IBM Cloud Object Storage?

I have an IBM COS service and am able to use the curl command via the CLI to retrieve objects, authenticating with IAM tokens. But how do I let an external web app (e.g. Node) access this service?
What value should go in the Authorization header for external app access?
External apps will come in the form of something like the AWS CLI, or any other app that uses either an HTTP library coupled with the IBM Cloud Object Storage API, or an SDK for languages like Python, Java or Node.js.
All of the above will ask you for an access key and a secret key.
You can get both of them from the IBM Cloud console by generating new HMAC Credentials [1]:
Navigate to your Cloud Object Storage account
Click Service credentials
Click the New credential button on the right
For the "Add Inline Configuration Parameters (Optional)" text box, enter the following JSON:
{"HMAC":true}
[1] https://console.bluemix.net/docs/services/cloud-object-storage/iam/service-credentials.html#service-credentials
Well, you could use the ibm-cos-sdk Node library https://www.npmjs.com/package/ibm-cos-sdk. You'll need to use your HMAC credentials.
// Uses the HMAC credentials generated as described above
const config = {
    endpoint: '<endpoint>',
    ibmAuthEndpoint: 'https://iam.ng.bluemix.net/oidc/token',
    serviceInstanceId: '<resource-instance-id>',
    accessKeyId: '<HMAC access_key>',
    secretAccessKey: '<HMAC secret access key>'
};
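A minimal usage sketch built on that config (the bucket name and object key are placeholders, not from the question):

// Hypothetical example: 'my-bucket' and 'images/logo.png' are assumed names.
const IBM = require('ibm-cos-sdk');
const cos = new IBM.S3(config);

cos.getObject({
    Bucket: 'my-bucket',
    Key: 'images/logo.png'
}).promise()
    .then(data => console.log(`Retrieved ${data.ContentLength} bytes`))
    .catch(err => console.error(err.message));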

How to make IBM Bluemix Object Storage file publicly accessible?

On Bluemix, I created a Java application using Liberty for Java and the Object Storage service. I then bound the Java application to the Object Storage service instance. I uploaded the images into the container which I created in the Object Storage service. Now I want to access the uploaded images publicly, such as opening the images in a browser directly. I created the URL as the IBM Bluemix documentation describes, but when I access the URL in a browser it shows the following error:
401 Unauthorized
Unauthorized
This server could not verify that you are authorized to access the document you requested.
My sample URL
Is it possible to make the URL public?
You can create a temporary public URL using the swift command line.
First you need to set a key and then create the temporary URL. For example:
swift post -m "Temp-URL-Key:yourkey"
swift tempurl GET 3000 /v1/AUTH_90e12a182adf4a32bbd5e34645380244/offermsgs-cateimgs/books.jpg yourkey
The output of the command above will be your temporary public URL; in the example above it will be valid for 3000 seconds.
You can also modify the Object Storage ACL to make all files publicly readable, as suggested in the following post:
Public URLs For Objects In Bluemix Object Storage Service
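For example, using the same container as in the tempurl command above, a world-readable container ACL can be set with the swift CLI along these lines (hedged sketch; adjust the container name to yours):

swift post -r '.r:*,.rlistings' offermsgs-cateimgs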