How can an external app access IBM Cloud Object Storage? - ibm-cloud

I have an IBM COS service and am able to use the curl command via the CLI to retrieve objects, authenticating with IAM tokens. But how do I let an external web app (e.g., Node) access this service?
What value should go in the Authorization header for external app access?

External apps will come in the form of something like the AWS CLI or any other app that uses either an HTTP library coupled with the IBM Cloud Object Storage API, or an SDK for languages like Python, Java or Node.js.
All of the above will ask you for an access key and a secret key.
You can get both of them from the IBM Cloud console by generating new HMAC credentials [1]:
Navigate to your Cloud Object Storage account.
Open the Service credentials section.
Click the New credential button on the right.
In the "Add Inline Configuration Parameters (Optional)" text box, enter the following JSON:
{"HMAC":true}
[1] https://console.bluemix.net/docs/services/cloud-object-storage/iam/service-credentials.html#service-credentials

Well, you could use the ibm-cos-sdk Node library https://www.npmjs.com/package/ibm-cos-sdk. You'll need to use your HMAC credentials.
var config = {
    endpoint: '<endpoint>',                          // regional COS endpoint for your bucket
    ibmAuthEndpoint: 'https://iam.ng.bluemix.net/oidc/token',
    serviceInstanceId: '<resource-instance-id>',     // from the service credentials
    accessKeyId: '<HMAC access_key>',
    secretAccessKey: '<HMAC secret access key>'
};
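For illustration, a minimal sketch (not from the original answer) of using that config to construct a client and read an object; the bucket and key names below are placeholders:
var IBM = require('ibm-cos-sdk');
var cos = new IBM.S3(config);

// Download an object and print its contents.
cos.getObject({ Bucket: '<bucket>', Key: '<key>' }).promise()
    .then(function (data) {
        console.log(data.Body.toString());
    })
    .catch(function (err) {
        console.error(err);
    });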

Related

How to setup google service account authorization in Node.js with JSON key file?

Trying to make use of the Server to Server OAuth flow defined here:
https://developers.google.com/identity/protocols/OAuth2ServiceAccount
Since I'm running from a local dev environment, I've created a service account in GCP and downloaded the JSON file with the private key, but cannot find any Node.js code examples on how to:
1) load the json file
2) set delegated credentials (for G Suite domain-wide authorization)
Places I've looked (besides Stack Overflow) include Google's GitHub wiki for the Node.js client library, which does talk about server-to-server auth, but seems to assume you're running on App Engine or Google Cloud and don't need to load a key file:
https://github.com/googleapis/google-api-nodejs-client#service-to-service-authentication
The Admin SDK Activities Reports API has a Node example, but it's using the web-based flow assuming a user is present:
https://developers.google.com/admin-sdk/reports/v1/quickstart/nodejs
Buried deep in the Node.js samples is a use of the Directory API, which does seem to take a keyfile as input, but when I try running it locally it says getClient is not a constructor. This example also doesn't show how to set the G Suite admin user for context (which is generally when a refresh token and access token are loaded into the app):
https://github.com/googleapis/google-api-nodejs-client/blob/master/samples/directory_v1/group-delete.js
So... does anybody have an example of this? I really don't want to switch to a Python runtime but Google seems to have left out important examples on this topic.
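For reference, a minimal sketch of the pattern being asked for, assuming the googleapis JWT auth client with a downloaded keyfile and domain-wide delegation; the key path, scope and admin address below are placeholder assumptions, not values from the question:
// Load the downloaded service account key (the path is a placeholder).
const { google } = require('googleapis');
const key = require('./service-account-key.json');

// JWT client with domain-wide delegation: 'subject' is the G Suite admin to impersonate.
const auth = new google.auth.JWT({
    email: key.client_email,
    key: key.private_key,
    scopes: ['https://www.googleapis.com/auth/admin.directory.group.readonly'],
    subject: 'admin@your-domain.example',
});

const directory = google.admin({ version: 'directory_v1', auth });
directory.groups.list({ customer: 'my_customer' })
    .then(res => console.log(res.data.groups))
    .catch(console.error);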

Retrieve logged-in user information from a Cloud Foundry web application

We developed a web application using SAP Web IDE Full-Stack; we need to retrieve the details of the user logged into the application (as defined in SAP Cloud Platform Identity Authentication Administration), for example the display name and assigned groups.
We tried the userapi/currentUser API, but it seems to work only in the Neo environment; for this reason it works fine while debugging in Web IDE, but we get a 404 error when deploying the app on Cloud Foundry.
Do we need to add a new destination to make the userapi work on CF as well? Or is there some similar solution available on Cloud Foundry?
I highly suggest using the SAP S/4HANA Cloud SDK for such tasks. It is an SDK developed to make building applications for SAP Cloud Platform easy, by providing easy-to-use abstractions for the platform's mechanisms.
Regarding your task at hand, there is a UserAccessor class that you can use like this:
final Optional<User> user = UserAccessor.getCurrentUser();
This works on Neo as well as on Cloud Foundry, i.e. there is a single interface for both platforms, which allows you to develop your app in a platform agnostic way.
If this sounds like it could solve your problem, I recommend checking out this blog post series to get started.
Alternatively, you can also simply add the following dependency to your project to start testing the SDK:
<dependency>
    <groupId>com.sap.cloud.s4hana.cloudplatform</groupId>
    <artifactId>scp-neo</artifactId>
    <version>2.7.0</version>
</dependency>
For Cloud Foundry use scp-cf instead of scp-neo.
Hope this helps!
P.S.: To answer your question also on a technical level, Cloud Foundry uses so-called JWTs for authentication and authorization. You can check whether a JWT is present by looking at the Authorization header of the request. The JWT should hold the information you're looking for.
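To expand on that with an example: a minimal sketch (not from the answer) of reading the claims out of such a JWT in a Node.js, Express-style request handler. It only decodes the payload and performs no signature verification:
// Decode the payload of a Bearer JWT taken from the Authorization header.
// Assumes an Express-style request object; no signature verification is done here.
function readJwtClaims(req) {
    const header = req.headers.authorization || '';
    const token = header.replace(/^Bearer\s+/i, '');
    const payload = token.split('.')[1];          // a JWT is header.payload.signature
    if (!payload) return null;
    const json = Buffer.from(payload, 'base64').toString('utf8');
    return JSON.parse(json);                      // typically includes user_name, email, scope, ...
}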
In SAP Cloud Foundry, if you develop an MTA that uses the XSUAA service to manage user authentication and administration, defined for example in the mta.yaml,
...
resources:
  - name: uaa_myapp
    parameters:
      path: ./xs-security.json
      service-plan: application
      service: xsuaa
    type: org.cloudfoundry.managed-service
...
you can use the UAA API exposed by the XSUAA service itself to manage user authentication and authorization (e.g. retrieve user info, assigned groups, password management, etc.), even when the application is federated with another IdP.
To consume this API, for example to retrieve user info, you need to:
Determine the XSUAA endpoint bound to your app (SCP Cockpit > XSUAA service details > take the value of url).
Create a destination (xsuaa_api_destination) of type OAuth2TokenExchange bound to your app, using the url taken before, and fill in the OAuth2 authentication parameters with the data contained in the XSUAA service details (step 1).
From your app, execute the call xsuaa_api_destination/userinfo, for example via an AJAX call if you are using JS (see the sketch below).
You can find more info in the Account and Authentication Service of the Cloud Foundry Environment SAP docs.
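As an illustration of that last step, a minimal client-side sketch assuming the destination is exposed through the approuter under the route /xsuaa_api_destination (the route name and the returned field names are assumptions, not values from the answer):
// Call the /userinfo endpoint of XSUAA through the destination exposed by the approuter.
fetch('/xsuaa_api_destination/userinfo', { headers: { Accept: 'application/json' } })
    .then(res => res.json())
    .then(info => {
        // Typical OIDC userinfo fields: given_name, family_name, email, ...
        console.log(info.given_name, info.family_name, info.email);
    })
    .catch(console.error);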

Google Cloud Storage 500 Internal Server Error 'Google::Cloud::Storage::SignedUrlUnavailable'

Trying to get Google Cloud Storage working on my app. I successfully saved an image to a bucket, but when trying to retrieve the image, I receive this error:
GCS Storage (615.3ms) Generated URL for file at key: 9A95rZATRKNpGbMNDbu7RqJx ()
Completed 500 Internal Server Error in 618ms (ActiveRecord: 0.2ms)
Google::Cloud::Storage::SignedUrlUnavailable (Google::Cloud::Storage::SignedUrlUnavailable):
Any idea of what's going on? I can't find an explanation for this error in their documentation.
To provide some explanation here...
Google App Engine (as well as Google Compute Engine, Kubernetes Engine, and Cloud Run) provides "ambient" credentials associated with the VM or instance being run, but only in the form of OAuth tokens. For most API calls, this is sufficient and convenient.
However, there are a small number of exceptions, and Google Cloud Storage is one of them. Recent Storage clients (including the google-cloud-storage gem) may require a full service account key to support certain calls that involve signed URLs. This full key is not provided automatically by App Engine (or other hosting environments). You need to provide one yourself. So as a previous answer indicated, if you're using Cloud Storage, you may not be able to depend on the "ambient" credentials. Instead, you should create a service account, download a service account key, and make it available to your app (for example, via the ActiveStorage configs, or by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable).
I was able to figure this out. I had been following the Rails guide on Active Storage with Google Cloud Storage, and was unclear on how to generate my credentials file.
google:
  service: GCS
  credentials: <%= Rails.root.join("path/to/keyfile.json") %>
  project: ""
  bucket: ""
Initially, I thought I didn't need a keyfile due to this sentence in Google's Cloud Storage authentication documentation:
If you're running your application on Google App Engine or Google Compute Engine, the environment already provides a service account's authentication information, so no further setup is required.
(I am using Google App Engine)
So I commented out the credentials line and started testing. Strangely, I was able to write to Google Cloud Storage without issue. However, when retrieving the image I would receive the 500 server error Google::Cloud::Storage::SignedUrlUnavailable.
I fixed this by generating my private key and adding it to my rails app.
Another possible solution, as of google-cloud-storage gem version 1.27 from August 2020, is documented here. In my case, Google::Auth.get_application_default as used in the documentation returned an empty object, but using Google::Cloud::Storage::Credentials.default.client instead worked.
If you get a Google::Apis::ClientError: badRequest: Request contains an invalid argument response when signing, check that you use a dash for the project in the signing URL (i.e. projects/-/serviceAccounts; an explicit project name in the path is deprecated and no longer valid) and that the "issuer" string is correct: it must be the full email address identifier of the service account, not just the service account name.
If you get Google::Apis::ClientError: forbidden: The caller does not have permission, verify the roles your service account has:
gcloud projects get-iam-policy <project-name> \
    --filter="bindings.members:<sa_name>" \
    --flatten="bindings[].members" --format='table(bindings.role)'
=> ROLE
roles/iam.serviceAccountTokenCreator
roles/storage.admin
serviceAccountTokenCreator is required to call the signBlob service, and you need storage.admin to have ownership of the thing you need to sign. I think these are project-wide rights; I couldn't get it to work with more fine-grained permissions, unfortunately (i.e. one app as admin for a specific Storage bucket).

Having an issue determining the credentials used when connecting to SoftLayer Object Storage using SFTP

I'm having trouble connecting to the Bluemix Object Store using the instructions presented by this link: https://knowledgelayer.softlayer.com/procedure/connect-object-storage-using-sftp
It's unclear to me what the username and account ID are, so I would appreciate it if someone could clarify.
The instructions are valid.
Where can I find the values for SLOS/IBMOS etc.?
I do not have access to the SoftLayer customer portal, as this service was created in Bluemix.
I can confirm that an SFTP server is listening at the appropriate region endpoint.
Brien, it is not possible to use SFTP to access the Bluemix Object Storage if you create it from the Services catalog area of the Bluemix UI:
https://console.ng.bluemix.net/catalog/services/object-storage
This one can be accessed via the Swift CLI or the REST API.
To use SFTP to access your Object Storage, you need to create it from the Infrastructure area of the Bluemix UI - that is the legacy SoftLayer that is now integrated with Bluemix.
https://console.ng.bluemix.net/catalog/infrastructure/object_storage/
Also, to create the Object Storage from the Infrastructure catalog, you first need to link your Bluemix and SoftLayer accounts:
https://console.ng.bluemix.net/docs/admin/softlayerlink.html

How to get VCAP_SERVICES environment variables WITHOUT binding to an application?

Frequently, I create standalone services in Bluemix. For example, Analytics for Apache Hadoop, Cloudant and DashDB.
I don't need an application to work with these services, but it seems I have to bind to an application just to get access to the VCAP_SERVICES environment variables with urls, usernames, passwords, etc.
Question: How to get VCAP_SERVICES environment variables WITHOUT binding to an application?
For many services, you will have to bind them to an app in order to get the VCAP_SERVICES credentials.
There is a service key capability which some services are starting to adopt which allows you to create and access credentials without binding to an app. Using the cf command line tool, the commands below are available to use if a service supports them:
SERVICES:
   create-service-key, csk      Create key for a service instance
   service-keys, sk             List keys for a service instance
   service-key                  Show service key info
   delete-service-key, dsk      Delete a service key
The CloudFoundry docs at https://docs.cloudfoundry.org/devguide/services/service-keys.html provide more detail.
In the Bluemix UI, when a service supports this capability, you will see a 'Service Credentials' option in the panel when viewing the service dashboard. Selecting this option lets you see the credentials that have been created, as well as an "Add Credentials" button to create new ones.
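For example, using a hypothetical Cloudant instance named my-cloudant (the instance and key names are placeholders), the command-line flow looks roughly like this:
cf create-service-key my-cloudant my-cloudant-key
cf service-keys my-cloudant
cf service-key my-cloudant my-cloudant-key
The last command prints the credentials JSON (URL, username, password, etc.) that would otherwise appear in VCAP_SERVICES.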