IBM - COS - SDK IAM token - ibm-cloud

I am trying to access my COS service using Python. Referring to IBM's documentation, I was able to write the following code snippet:
import ibm_boto3
from ibm_botocore.client import Config

api_key = 'key'
service_instance_id = 'resource-service-id'
auth_endpoint = 'http://iam.bluemix.net/'
service_endpoint = 'endpoint'

s3 = ibm_boto3.resource('s3',
                        ibm_api_key_id=api_key,
                        ibm_service_instance_id=service_instance_id,
                        ibm_auth_endpoint=auth_endpoint,
                        config=Config(signature_version='oauth'),
                        endpoint_url=service_endpoint)

s3.Bucket('bucket name').download_file('object name', 'location where the object must be saved')
Is this correct? Also, while trying to execute the above code, the interpreter is not able to retrieve the authentication token from auth_endpoint. Am I missing something?
Please help.
Thanks in advance!
I am including the output for your reference...
ibm_botocore.exceptions.CredentialRetrievalError: Error when retrieving credentials from https://iam.ng.bluemix.net/oidc/token: Retrieval of tokens from server failed
And I am using Python 3.x.

As instructed in the README, the auth_endpoint should have /oidc/token at the end, for example:
auth_endpoint = 'https://iam.bluemix.net/oidc/token'
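Putting that together, here is a minimal sketch of the corrected setup (the service endpoint shown is the us-geo one used elsewhere in this thread; substitute the endpoint and credentials for your own instance):
import ibm_boto3
from ibm_botocore.client import Config

api_key = 'key'                              # your service credential apikey
service_instance_id = 'resource-service-id'  # your resource instance id
auth_endpoint = 'https://iam.bluemix.net/oidc/token'                    # https + /oidc/token
service_endpoint = 'https://s3-api.us-geo.objectstorage.softlayer.net'  # example endpoint, replace with yours

s3 = ibm_boto3.resource('s3',
                        ibm_api_key_id=api_key,
                        ibm_service_instance_id=service_instance_id,
                        ibm_auth_endpoint=auth_endpoint,
                        config=Config(signature_version='oauth'),
                        endpoint_url=service_endpoint)

s3.Bucket('bucket name').download_file('object name', 'local path to save the object')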

The auth_endpoint should use https.
See the example here:
https://github.com/IBM/ibm-cos-sdk-python

To connect to an IBM Cloud Object Storage account you need the api_key, service_instance_id, auth_endpoint, and service_endpoint.
import ibm_boto3
from ibm_botocore.client import Config, ClientError

api_key = '......'             # you can find the api_key in Service Credentials in your IBM Cloud account
service_instance_id = '.....'  # you can find the service_instance_id in Service Credentials in your IBM Cloud account
auth_endpoint = 'https://iam.bluemix.net/oidc/token'
service_endpoint = 'https://s3-api.us-geo.objectstorage.softlayer.net'

cos = ibm_boto3.resource('s3',
                         ibm_api_key_id=api_key,
                         ibm_service_instance_id=service_instance_id,
                         ibm_auth_endpoint=auth_endpoint,
                         config=Config(signature_version='oauth'),
                         endpoint_url=service_endpoint)
To create a bucket:
new_bucket = 'abcd1234'

def create_bucket():
    cos.create_bucket(Bucket=new_bucket)
    return "Bucket created successfully"

create_bucket()
To list the buckets in your account:
def get_buckets():
    print("Retrieving list of buckets")
    try:
        buckets = cos.buckets.all()
        for bucket in buckets:
            print("Bucket Name: {0}".format(bucket.name))
    except ClientError as be:
        print("CLIENT ERROR: {0}\n".format(be))
    except Exception as e:
        print("Unable to retrieve list of buckets: {0}".format(e))

get_buckets()
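And to download an object to a local file, following the same pattern (the object name and local path below are placeholders):
def download_item(bucket_name, object_name, local_path):
    try:
        cos.Bucket(bucket_name).download_file(object_name, local_path)
        print("Downloaded {0} to {1}".format(object_name, local_path))
    except ClientError as be:
        print("CLIENT ERROR: {0}\n".format(be))

download_item(new_bucket, 'my-object', '/tmp/my-object')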

Related

How to create a bucket using the Python SDK?

I'm trying to create a bucket in Cloud Object Storage using Python. I have followed the instructions in the API docs.
This is the code I'm using:
COS_ENDPOINT = "https://control.cloud-object-storage.cloud.ibm.com/v2/endpoints"

# Create client
cos = ibm_boto3.client("s3",
                       ibm_api_key_id=COS_API_KEY_ID,
                       ibm_service_instance_id=COS_INSTANCE_CRN,
                       config=Config(signature_version="oauth"),
                       endpoint_url=COS_ENDPOINT
                       )

s3 = ibm_boto3.resource('s3')

def create_bucket(bucket_name):
    print("Creating new bucket: {0}".format(bucket_name))
    s3.Bucket(bucket_name).create()
    return

bucket_name = 'test_bucket_442332'
create_bucket(bucket_name)
I'm getting the error below. I tried setting CreateBucketConfiguration={"LocationConstraint": "us-south"}, but it doesn't seem to work:
"ClientError: An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The unspecified location constraint is incompatible for the region specific endpoint this request was sent to."
Resolved by going to https://cloud.ibm.com/docs/cloud-object-storage?topic=cloud-object-storage-endpoints#endpoints and choosing the endpoint specific to the region I need. The "Endpoint" provided with the credentials is not the actual endpoint.
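For reference, a minimal sketch of the client under that assumption (us-south region; the regional endpoint and bucket name here are illustrative):
COS_ENDPOINT = "https://s3.us-south.cloud-object-storage.appdomain.cloud"  # region-specific data endpoint

cos = ibm_boto3.client("s3",
                       ibm_api_key_id=COS_API_KEY_ID,
                       ibm_service_instance_id=COS_INSTANCE_CRN,
                       config=Config(signature_version="oauth"),
                       endpoint_url=COS_ENDPOINT)

# With the endpoint already pinned to a region, CreateBucket no longer
# raises IllegalLocationConstraintException.
cos.create_bucket(Bucket="test-bucket-442332")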

How to download from Google Cloud Storage by Alpakka-gcs without providing secret-key?

I'm using Alpakka-gcs to connect to GCS from a Google Compute Engine instance, and it works perfectly if I provide the GCS secret key in application.conf like below.
alpakka.google.cloud.storage {
  project-id = "project_id"
  client-email = "client_email"
  private-key = "************gcs-secret-key************"
  base-url = "https://www.googleapis.com/" // default
  base-path = "/storage/v1" // default
  token-url = "https://www.googleapis.com/oauth2/v4/token" // default
  token-scope = "https://www.googleapis.com/auth/devstorage.read_write" // default
}
My question is: how do I connect from a Compute Engine instance, which already has a credential, without providing a secret key to Alpakka?
The code sample below works fine, but I want to know the Alpakka way.
def downloadObject(objectName: String, destFilePath: String): Unit = {
  import com.google.cloud.storage.BlobId
  import com.google.cloud.storage.StorageOptions
  import java.nio.file.Paths

  def credential: GoogleCredentials = ComputeEngineCredentials.create()
  val storage = StorageOptions.newBuilder.setCredentials(credential).setProjectId(projectId).build.getService
  val blob = storage.get(BlobId.of(bucketName, objectName))
  blob.downloadTo(Paths.get(destFilePath))
}
If you look into the Alpakka sources, you can see the access-token creation. Sadly, this version only supports an internal call to GoogleTokenApi, an Alpakka-made class for requesting tokens from Google Cloud, and it is based only on the private key, not on the metadata server or the GOOGLE_APPLICATION_CREDENTIALS environment variable.
You can propose a change to the project, or even develop it yourself and contribute it, using the Google Cloud OAuth client library.

No error / exception raised in Google Cloud Storage call via PHP client returns null

I'm trying to create a bucket in Google Cloud Storage using the PHP client library, but the bucket is not being created (FYI: no GCS function is working at all), and it's not even returning an error code or exception that I can use to debug the issue.
I have gRPC and Protobuf installed.
I am using a Compute Engine instance with full permissions granted.
Billing is enabled for the project, and the Storage API is enabled as well.
If you need anything else, ask me in the comments.
Any hint / clue is much appreciated.
Here's the code I'm using:
require 'vendor/autoload.php';

# Imports the Google Cloud client library
use Google\Cloud\Core\ServiceBuilder;
use Google\Cloud\Storage\StorageClient;

function create_bucket($bucketName, $options = [])
{
    $config = [
        'projectId' => 'abc-def-agen-1553542096432'
    ];
    $storage = new StorageClient($config);
    //var_dump($storage);
    $bucket = $storage->createBucket($bucketName, $options);
    //var_dump($bucket); // this returns nothing
    printf('Bucket created: %s' . PHP_EOL, $bucket->name()); // this prints nothing
}

create_bucket("kjdsnuiew345asd");

Google Cloud authorization keeps failing with Python 3 - Type is None, expected one of ('authorized_user', 'service_account')

I am trying to download a file for the first time from Google Cloud Storage.
I set the path to the googstruct.json service account key file that I downloaded from https://cloud.google.com/storage/docs/reference/libraries#client-libraries-usage-python
Do I need to set the authorization to Google Cloud outside the code somehow? Or is there a better "How to use Google Cloud Storage" guide than the one on the Google site?
It seems like I am passing the wrong type to storage_client = storage.Client().
The exception string is below.
Exception has occurred: google.auth.exceptions.DefaultCredentialsError
The file C:\Users\Cary\Documents\Programming\Python\QGIS\GoogleCloud\googstruct.json does not have a valid type.
Type is None, expected one of ('authorized_user', 'service_account').
My Python 3.7 code:
from google.cloud import storage
import os

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "C:\\GoogleCloud\\googstruct.json"

# Instantiates a client
storage_client = storage.Client()

def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Downloads a blob from the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)
    print('Blob {} downloaded to {}.'.format(
        source_blob_name,
        destination_file_name
    ))

bucket_name = 'structure_ssi'
destination_file_name = "C:\\Users\\18809_PIPEM.shp"
source_blob_name = '18809_PIPEM.shp'

download_blob(bucket_name, source_blob_name, destination_file_name)
I did look at this but I cannot tell if this is my issue. I have tried both.
('Unexpected credentials type', None, 'Expected', 'service_account') with oauth2client (Python)
This error means that the JSON service account credentials file you are trying to use, C:\\GoogleCloud\\googstruct.json, is corrupt or the wrong type.
The first (or second) line in the file googstruct.json should be "type": "service_account".
A few more items to improve your code:
You do not need to use \\; just use / to make your code easier and cleaner to read.
Load your credentials directly and do not modify environment variables:
storage_client = storage.Client.from_service_account_json('C:/GoogleCloud/googstruct.json')
Wrap API calls in try / except. Stack traces do not impress customers. It is better to have clear, simple, easy-to-read error messages.
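For example, a minimal sketch combining those suggestions (paths and names follow the question; the exception class used here is the broad google-api-core base class, adjust it to taste):
from google.cloud import storage
from google.api_core import exceptions

# Load credentials directly instead of setting GOOGLE_APPLICATION_CREDENTIALS
storage_client = storage.Client.from_service_account_json('C:/GoogleCloud/googstruct.json')

def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Downloads a blob and reports a readable error instead of a stack trace."""
    try:
        bucket = storage_client.get_bucket(bucket_name)
        bucket.blob(source_blob_name).download_to_filename(destination_file_name)
        print('Blob {} downloaded to {}.'.format(source_blob_name, destination_file_name))
    except exceptions.GoogleAPIError as e:
        print('Download failed: {}'.format(e))

download_blob('structure_ssi', '18809_PIPEM.shp', 'C:/Users/18809_PIPEM.shp')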

How to authenticate to Bluemix S3 Lite?

I'm trying to use S3 API with Bluemix object storage using the following code:
import boto3, pprint, sys

s3 = boto3.Session().client(
    service_name="s3",
    region_name="us-geo",
    endpoint_url="https://s3-api.us-geo.objectstorage.softlayer.net",
    aws_access_key_id="auto-generated-apikey-<redacted>",
    aws_secret_access_key="<redacted>")

pprint.pprint(s3.list_buckets())
but I keep getting an AccessDenied error:
<Error>
<Code>AccessDenied</Code>
<Message>Access Denied</Message>
<Resource></Resource>
<RequestId><redacted></RequestId>
<httpStatusCode>403</httpStatusCode>
</Error>
I took aws_access_key_id and aws_secret_access_key from the "Service Credentials" tab. I used similar code for AWS S3, and it worked. What am I missing?
IAM-enabled COS uses a slightly different syntax for client creation, which is supported by a fork of the boto3 library (ibm_boto3, from the ibm-cos-sdk).
Here’s an example in the docs: https://console.bluemix.net/docs/services/cloud-object-storage/libraries/python.html
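For completeness, a minimal sketch of the IAM-style client creation with that fork (ibm_boto3 in place of boto3); the apikey comes from the same Service Credentials tab, and the resource instance ID value shown is a placeholder:
import ibm_boto3, pprint
from ibm_botocore.client import Config

cos = ibm_boto3.client(
    service_name="s3",
    ibm_api_key_id="auto-generated-apikey-<redacted>",
    ibm_service_instance_id="<resource_instance_id from Service Credentials>",
    ibm_auth_endpoint="https://iam.bluemix.net/oidc/token",
    config=Config(signature_version="oauth"),
    endpoint_url="https://s3-api.us-geo.objectstorage.softlayer.net")

pprint.pprint(cos.list_buckets())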