How to make IBM Bluemix Object Storage file publicly accessible? - ibm-cloud

On Bluemix, I created a Java application using Liberty for Java and the Object Storage service. I then bound the Object Storage service to the Java application. I uploaded images into a container that I created in the Object Storage service. Now I want to access the uploaded images publicly, for example by opening them in a browser directly. I created the URL as described in the IBM Bluemix documentation, but when I access the URL in a browser it shows the following error:
401 Unauthorized
Unauthorized
This server could not verify that you are authorized to access the document you requested.
My sample URL
Is it possible to make the URL public?

You can create a temporary public URL using the swift command line.
First you need to set a key and then create the temporary URL. For example:
swift post -m "Temp-URL-Key:yourkey"
swift tempurl GET 3000 /v1/AUTH_90e12a182adf4a32bbd5e34645380244/offermsgs-cateimgs/books.jpg yourkey
The output of the command above is your temporary public URL; in this example it is valid for 3000 seconds.
You can also modify the Object Storage ACL to make all files publicly readable, as suggested in the following post:
Public URLs For Objects In Bluemix Object Storage Service
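For reference, swift tempurl does nothing magical: the temporary URL is just the object path plus an HMAC-SHA1 signature over the method, expiry timestamp, and path, keyed with your Temp-URL key. A minimal Python sketch, reusing the example path and key from the commands above (prepend your access point's https://host to the result to get the full URL):

```python
import hmac
import time
from hashlib import sha1

def make_temp_url(method, path, key, seconds):
    """Build a Swift temporary URL query: HMAC-SHA1 over "METHOD\\nEXPIRES\\nPATH"."""
    expires = int(time.time()) + seconds
    message = f"{method}\n{expires}\n{path}"
    signature = hmac.new(key.encode(), message.encode(), sha1).hexdigest()
    return f"{path}?temp_url_sig={signature}&temp_url_expires={expires}"

# Example values from the commands above.
url = make_temp_url(
    "GET",
    "/v1/AUTH_90e12a182adf4a32bbd5e34645380244/offermsgs-cateimgs/books.jpg",
    "yourkey",
    3000,
)
```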

Related

Signed URL created by Google Cloud Storage Python library blob.generate_signed_url method gets "Access Denied" error

I am trying to create a signed URL for a private object stored in cloud storage.
The storage client is being created using a service account that has the Storage Admin role:
import datetime

from google.cloud import storage

storage_client = storage.Client.from_service_account_json('service.json')
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(blob_name)
url = blob.generate_signed_url(
    version="v4",
    # This URL is valid for 15 minutes.
    expiration=datetime.timedelta(minutes=15),
    # Allow GET requests using this URL.
    method="GET",
)
This generates a URL that when accessed via a browser gives this error:
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.</Details>
</Error>
What am I missing here? The service account has no problem interacting with the bucket or blob normally - I can download it/etc. It's just the Signed URL that doesn't work. I can make the object public and then download it - but that defeats the purpose of being able to generate a signed URL.
All of the other answers I've found seem to focus on issues using application default credentials or are very old examples from the v2 API.
Clearly there's something about how I'm using the service account - do I need to explicitly give it permissions on that particular object? Is the Storage Admin role not enough in this context?
Going crazy with this. Please help!

Google Cloud Storage 500 Internal Server Error 'Google::Cloud::Storage::SignedUrlUnavailable'

Trying to get Google Cloud Storage working on my app. I successfully saved an image to a bucket, but when trying to retrieve the image, I receive this error:
GCS Storage (615.3ms) Generated URL for file at key: 9A95rZATRKNpGbMNDbu7RqJx ()
Completed 500 Internal Server Error in 618ms (ActiveRecord: 0.2ms)
Google::Cloud::Storage::SignedUrlUnavailable (Google::Cloud::Storage::SignedUrlUnavailable):
Any idea of what's going on? I can't find an explanation for this error in their documentation.
To provide some explanation here...
Google App Engine (as well as Google Compute Engine, Kubernetes Engine, and Cloud Run) provides "ambient" credentials associated with the VM or instance being run, but only in the form of OAuth tokens. For most API calls, this is sufficient and convenient.
However, there are a small number of exceptions, and Google Cloud Storage is one of them. Recent Storage clients (including the google-cloud-storage gem) may require a full service account key to support certain calls that involve signed URLs. This full key is not provided automatically by App Engine (or other hosting environments). You need to provide one yourself. So as a previous answer indicated, if you're using Cloud Storage, you may not be able to depend on the "ambient" credentials. Instead, you should create a service account, download a service account key, and make it available to your app (for example, via the ActiveStorage configs, or by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable).
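To make the distinction concrete: what URL signing needs is the private key that ships inside a downloaded service-account JSON key file, and that is exactly what the ambient OAuth credentials lack. The helper below is purely illustrative (it is not part of any Google library); it just checks whether a key file could support signing:

```python
import json
import os

def can_sign_locally(key_path):
    """Return True if key_path looks like a service-account key with a private key.

    Ambient App Engine / Compute Engine credentials are OAuth access tokens
    with no private key, which is why signed-URL calls fail there.
    """
    if not key_path or not os.path.exists(key_path):
        return False
    with open(key_path) as f:
        info = json.load(f)
    return info.get("type") == "service_account" and "private_key" in info

# Typical setup: point GOOGLE_APPLICATION_CREDENTIALS at the downloaded key.
print(can_sign_locally(os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")))
```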
I was able to figure this out. I had been following the Rails guide on Active Storage with Google Cloud Storage, and was unclear on how to generate my credentials file.
google:
  service: GCS
  credentials: <%= Rails.root.join("path/to/keyfile.json") %>
  project: ""
  bucket: ""
Initially, I thought I didn't need a keyfile due to this sentence in Google's Cloud Storage authentication documentation:
If you're running your application on Google App Engine or Google Compute Engine, the environment already provides a service account's authentication information, so no further setup is required.
(I am using Google App Engine)
So I commented out the credentials line and started testing. Strangely, I was able to write to Google Cloud Storage without issue. However, when retrieving the image I would receive the 500 server error Google::Cloud::Storage::SignedUrlUnavailable.
I fixed this by generating my private key and adding it to my rails app.
Another possible solution as of google-cloud-storage gem version 1.27 in August 2020 is documented here. My Google::Auth.get_application_default as in the documentation returned an empty object, but using Google::Cloud::Storage::Credentials.default.client instead worked.
If you get a Google::Apis::ClientError: badRequest: Request contains an invalid argument response when signing, check that you have a dash as the project name in the signing URL (i.e. projects/-/serviceAccounts; an explicit project name in the path is deprecated and no longer valid) and that the "issuer" string is correct: it must be the full email address of the service account, not just the service account name.
If you get Google::Apis::ClientError: forbidden: The caller does not have permission, verify the roles your service account has:
gcloud projects get-iam-policy <project-name> \
  --filter="bindings.members:<sa_name>" \
  --flatten="bindings[].members" --format='table(bindings.role)'
=> ROLE
roles/iam.serviceAccountTokenCreator
roles/storage.admin
serviceAccountTokenCreator is required to call the signBlob service, and you need storage.admin to have ownership of the object you need to sign. These appear to be project-wide rights; unfortunately I couldn't get it to work with more fine-grained permissions (e.g. one app being admin of a single Storage bucket).
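The role check that gcloud performs can be mimicked on the policy document itself. Everything below (the bindings and the service-account address) is invented for illustration; a real policy comes from gcloud projects get-iam-policy --format=json:

```python
def roles_for_member(policy, member):
    """Collect the roles a member holds in an IAM policy document."""
    return sorted(
        binding["role"]
        for binding in policy.get("bindings", [])
        if member in binding.get("members", [])
    )

# Invented example policy.
policy = {
    "bindings": [
        {"role": "roles/iam.serviceAccountTokenCreator",
         "members": ["serviceAccount:app@my-project.iam.gserviceaccount.com"]},
        {"role": "roles/storage.admin",
         "members": ["serviceAccount:app@my-project.iam.gserviceaccount.com"]},
        {"role": "roles/viewer",
         "members": ["user:someone@example.com"]},
    ]
}

required = {"roles/iam.serviceAccountTokenCreator", "roles/storage.admin"}
granted = set(roles_for_member(policy, "serviceAccount:app@my-project.iam.gserviceaccount.com"))
missing = required - granted
```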

Google Speech API returns 403 PERMISSION_DENIED

I have been using the Google Speech API to transcribe audio to text from my PHP app (using the Google Cloud PHP Client) for several months without any problem. But my calls have now started to return 403 errors with status "PERMISSION_DENIED" and message "The caller does not have permission".
I'm using the Speech API together with Google Storage. I'm authenticating using a service account and sending my audio data to Storage. That part works: the file gets uploaded. So I understand (but I might be wrong) that "the caller" does not have permission to then read the audio data from Storage.
I've been playing with permissions through the Google Console without success. I've read the docs but am quite confused. The service account I am using (I guess this is "the caller"?) has owner permissions on the project. And everything used to work fine, I haven't changed a thing.
I'm not posting code because if I understand correctly my app code isn't the issue - it's rather my Google Cloud settings. I'd be grateful for any idea or clarifications of concepts!
Thanks.
Being an owner of the project doesn't necessarily imply that the service account has read permission on the object. It's possible that the object was uploaded by another account that specified a private ACL or similar.
Make sure that the service account has access to the object by giving it the right permissions on the entire bucket or on the specific object itself.
You can do so using gsutil acl. More information and additional methods may be found in the official documentation.
For instance, the following command gives READ permission on an object to your service account:
gsutil acl ch -u serviceAccount@domain.com:R gs://bucket/object
And this command gives READ permission on an entire bucket and the objects in it to your service account:
gsutil acl ch -r -u serviceAccount@domain.com:R gs://bucket
In Google Cloud Vision, when you're creating credentials with a service account key, you have to assign the service account a role (for example Owner) that grants the required permissions.

How to access files in container in Object Storage Service in Bluemix?

How can I access files by URL in Bluemix Object Storage?
Is there a way to make the container public?
How can I access the file in Object Storage just by typing the URL in the browser?
How can I retrieve an image by URL to display it in HTML?
You can create temporary URLs using the swift command line to provide public access to your Object Storage files.
First you have to set up the swift CLI; you can find the steps at this link.
After you have swift cli configured for your environment you can run the following commands to create temporary URLs for your files:
swift stat
to locate your account field (starts with AUTH_)
swift post -m "Temp-URL-Key:<key>"
to set a secret key
swift stat
to verify a secret key was created
swift tempurl GET <seconds> <path> <key>
to create the temporary URL
You can then access the file via the following URL:
https://<access point>/<API version>/AUTH_<project ID>/<container namespace>/<object namespace>
Complete details are available in the Object Storage documentation here.
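Assembling the final URL from those parts is purely mechanical. A small sketch with placeholder values (substitute the access point for your region and the AUTH_ account reported by swift stat):

```python
from urllib.parse import quote

def object_url(access_point, api_version, project_id, container, obj):
    """Assemble the Object Storage URL from its parts, escaping the path pieces."""
    return (
        f"https://{access_point}/{api_version}/AUTH_{project_id}/"
        f"{quote(container)}/{quote(obj)}"
    )

# Placeholder values only.
url = object_url(
    "dal.objectstorage.open.softlayer.com",
    "v1",
    "90e12a182adf4a32bbd5e34645380244",
    "images",
    "books.jpg",
)
```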
I wrote a comment on that here:
Public URLs For Objects In Bluemix Object Storage Service
Bluemix still uses Swift, but the S3 API is the most reliable.
So to answer your questions:
How can I access files by url in Bluemix Object Storage?
After uploading an image (for example), you have to use a tool that can access your image and make it public (this adds a public ACL to the object's properties). You can use Cloudberry or S3 Browser, for example, and use the "make public" functionality.
Is there a way to make the container public?
Yes. Your container lives in Bluemix, but the service can expose a public URL.
How can I access the file in Object Storage just by typing the url in the browser?
here is an example of an image I made public on my object store :
https://s3-api.dal-us-geo.objectstorage.softlayer.net/mourad-bucket-rasp-1/OBAMA.jpg
You can do this from any browser after setting the ACL to "public read".
(Again, if you use Python and the boto3 SDK, see my post here: Public URLs For Objects In Bluemix Object Storage Service.)
How can I retrieve an image by url to display it in html?
There are several ways to do that. Since the object now has a public URL, and the first part of the URL does not change (only the object name does), you can build the URL in your HTML from variables (bucket, object name, etc.).
The complete API reference has been released and is available here.
Finally, these two commands saved the day.
First, use swift to change the access control of the container:
swift post container-name --read-acl ".r:*,.rlistings"
The same read ACL can also be set on the container through the raw API with curl (note this is a POST to the container URL):
curl -X POST "https://<access point>/<version>/AUTH_projectID/container-name" -H "X-Auth-Token: <auth token>" -H "X-Container-Read: .r:*,.rlistings"
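For completeness, the header-setting call can also be sketched with Python's standard library; setting X-Container-Read is a POST to the container URL. The endpoint and token below are placeholders, and the request is only constructed here, never sent:

```python
from urllib.request import Request

# Placeholder endpoint and token; fill in your access point, project ID and auth token.
container_url = "https://example-access-point/v1/AUTH_projectID/container-name"
request = Request(
    container_url,
    method="POST",
    headers={
        "X-Auth-Token": "placeholder-token",
        # Grant anonymous read and listing on the container.
        "X-Container-Read": ".r:*,.rlistings",
    },
)
```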
I'm also very grateful for the help provided by Alex da Silva.

How to upload files to XTRF so a local file can be referenced in call to createSimpleQuote

Trying to understand the options for attaching input files to a quote/project when using the Web Service API for Partners 1.0.
We have figured out how to use the login and createSimpleQuote SOAP methods to create a quote as a customer.
The step we are struggling with now is how to upload the input files that are to be referenced in the files section of the createSimpleQuote payload.
<par:files>
   <par:name>?</par:name>
   <par:category>?</par:category>
   <par:url>?</par:url>
</par:files>
Ideally we would like the url element to reference local files (using file:///tmp/sample.pdf), as is done in the Java usage example.
What options do we have for uploading files?
How do we get the local path value of an uploaded file that can be used in the createSimpleQuote SOAP call?
Please advise.
It is not possible to upload files from your local disk to XTRF via Web Service API.
In order to reference a file using file:// protocol, the file must be visible to XTRF. There are several options to do this:
the file should be uploaded before sending request to the XTRF Web Service API
request to the XTRF Web Service API should be sent from the same machine where XTRF is running
remote disk where the file is stored should be mounted (e.g. using CIFS, NFS or another network filesystem protocol) on the machine where XTRF is running
Note that it is also possible to refer to a file using other protocols, e.g. http:// or ftp://.
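Since http:// URLs are accepted, one practical workaround (a sketch only, not an XTRF feature) is to expose the file over HTTP from a machine the XTRF server can reach, and pass that URL in the par:url element. Using only the Python standard library:

```python
import threading
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve the directory containing the input files; in practice this has to run
# on a host that the XTRF server can reach over the network.
handler = partial(SimpleHTTPRequestHandler, directory=".")
server = HTTPServer(("127.0.0.1", 0), handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# This URL (with your real hostname instead of 127.0.0.1) goes into <par:url>.
file_url = f"http://127.0.0.1:{port}/sample.pdf"
```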