How do I manage org and space users in Bluemix using the cf command line? - ibm-cloud

Bluemix provides a CF command line for download to manage applications.
We want to use cf (or any other command-line tool) to manage organization and space users. This will allow us to programmatically sync the user list.
Specifically, I am looking for:
cf enroll-user
cf add-user
cf remove-user
cf unenroll-user
The cf CLI already lists the users for a given org and space.

The simple answer is to read the docs. See "Creating and Managing Users with the cf CLI." It documents commands like create-user, set-org-role, and set-space-role.
For example: Use cf create-user USERNAME PASSWORD to create a new user. The problem is, when you try to do this in Bluemix, you get an error:
>cf create-user jdoe password
Creating user jdoe as bwoolf...
FAILED
Error creating user jdoe.
Server error, status code: 403, error code: access_denied, message: Invalid token does not contain resource id (scim)
You get a similar error when you try to run set-org-role or set-space-role:
FAILED
Server error, status code: 403: Access is denied. You do not have privileges to execute this command.
Why did you get this error? As @RandalAnders explained, Bluemix currently blocks these user administration commands in the CF CLI. For the time being, you'll need to perform these actions using the Bluemix Dashboard.
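For reference, on a standard Cloud Foundry deployment (where these commands are not blocked), the documented sequence looks roughly like this; the username, password, org, space, and role values below are placeholders:
cf create-user jdoe s3cr3t
cf set-org-role jdoe my-org OrgManager
cf set-space-role jdoe my-org my-space SpaceDeveloper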

Currently, it is not possible within Bluemix to use the CF CLI for certain management commands, as they require administrative privileges. We are exploring expanding the scope of the commands used in the CLI and would be interested in hearing any other use cases you may have.

You cannot create a user on Bluemix using the cf CLI, since that requires admin privileges. To add a user, use the Bluemix CLI command 'bluemix iam account-user-invite' to invite a user to your account with an org/space role assigned. There are other account/org/space/role management commands under 'bluemix iam'.
Download bluemix CLI here: http://clis.ng.bluemix.net
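As a hedged example of that invite flow (the email, org, space, and role values are placeholders; check 'bluemix iam account-user-invite --help' for the exact argument order in your CLI version):
bluemix iam account-user-invite jdoe@example.com my-org OrgManager my-space SpaceDeveloper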

Related

Will gcloud use my SSH key to log in, or will I always need to log in via the web?

I am trying to perform a very basic command like:
gcloud compute machine-types list
And I get this error:
ERROR: (gcloud.compute.machine-types.list) There was a problem refreshing your current auth tokens: invalid_grant: Bad Request
Please run:
It tells me to log in using 'gcloud auth login', which opens up the browser.
Is it possible to use an SSH key to skip this authentication process, or do I have to do this every time? Are SSH keys only for accessing compute instances?
I'm just trying to understand what SSH keys are used for and how this web-based authorization fits into the picture here.
Generally, you authenticate to gcloud (and GCP services) using credentials from a Google (often Gmail) account. Such accounts use three-legged OAuth, which requires a browser prompt for the human to confirm the scopes, etc.
If you haven't already, confirm the prompt, copy the verification code provided, and paste it back into gcloud; after that, auth happens transparently.
This process is different than SSH'ing to Compute Engine instances.
When you run gcloud compute machine-types list, you're authenticating (and being authorized) by Google Cloud Platform to invoke (meta)services.
When you run gcloud compute ssh ..., the command uses ssh to connect you to the (Linux) instance.
NOTE: gcloud auth login --no-launch-browser is also available. It still requires you to open a browser separately and complete the process, but it doesn't launch the browser directly from the command.
If you are trying to automate some sort of service that runs cloud commands on demand, without an operator or browser involved, your best bet is to create a service account for that task, download a key for that account, and activate it using:
gcloud auth activate-service-account --key-file=my-service-account-key-file.json
If this service runs on Google Cloud Platform, you don't even need to deal with the key. Just associate the service account with the instance you are running.
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances
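For completeness, a minimal sketch of that non-interactive flow with gcloud; the service account name, project ID, and key file name below are placeholders:
gcloud iam service-accounts create my-automation-sa --display-name="Automation account"
gcloud iam service-accounts keys create my-service-account-key-file.json --iam-account=my-automation-sa@my-project.iam.gserviceaccount.com
gcloud auth activate-service-account --key-file=my-service-account-key-file.json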

ibmcloud create action through cli

ibmcloud create action through the CLI is not working. I am getting an exception:
Unable to authenticate with Cloud Functions: Unable to obtain wsk
authentication key for Org 'xxxxx' and Space 'xxxx': Target Org
'xxxxm' and Space 'xxxx' do not have an auth key; if Space 'dev' was
recently created, try again in a couple minutes.
But I have logged in successfully using the ibmcloud login command.
The issue is resolved now. The account I created at IBM was a Lite account whose initial credentials were created for a different region, but my ibmcloud login was picking up a different region's API endpoint. Even when I set the API endpoint manually, the org and space did not match. Since it was a Lite account, there was no permission to add another org and space. After emailing IBM, I was able to add my credit card details and create a new space, and creating the action now works.

Recovering access after initially provisioning wrong scopes for an instance

I recently created a VM, but mistakenly gave the default service account Storage: Read Only permissions instead of the intended Read Write under "Identity & API access", so GCS write operations from the VM are now failing.
I realized my mistake, so following the advice in this answer, I stopped the VM, changed the scope to Read Write and started the VM. However, when I SSH in, I'm still getting 403 errors when trying to create buckets.
$ gsutil mb gs://some-random-bucket
Creating gs://some-random-bucket/...
AccessDeniedException: 403 Insufficient OAuth2 scope to perform this operation.
Acceptable scopes: https://www.googleapis.com/auth/cloud-platform
How can I fix this? I'm using the default service account, and don't have the IAM permissions to be able to create new ones.
$ gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* (projectnum)-compute@developer.gserviceaccount.com
I suggest you try adding the scope "cloud-platform" to the instance by running the gcloud command below:
gcloud alpha compute instances set-scopes INSTANCE_NAME [--zone=ZONE] [--scopes=[SCOPE,…]] [--service-account=SERVICE_ACCOUNT]
As the scope, put "https://www.googleapis.com/auth/cloud-platform", since it gives full access to all Google Cloud Platform resources.
Here is the gcloud documentation.
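A concrete sketch of that suggestion (the instance name and zone are placeholders; note that the instance must be stopped before its scopes can be changed):
gcloud compute instances stop example-instance --zone=us-central1-a
gcloud alpha compute instances set-scopes example-instance --zone=us-central1-a --scopes=https://www.googleapis.com/auth/cloud-platform
gcloud compute instances start example-instance --zone=us-central1-a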
Try creating the Google Cloud Storage bucket with your user account.
Type gcloud auth login and open the link you are provided; once there, copy the code and paste it into the command line.
Then do gsutil mb gs://bucket-name.
The security model has two things at play: API scopes and IAM permissions. Access is determined by the AND of them, so you need both an acceptable scope and sufficient IAM privileges to perform a given action.
API scopes are bound to the credentials. They are represented by a URL, like https://www.googleapis.com/auth/cloud-platform.
IAM permissions are bound to the identity. These are setup in the Cloud Console's IAM & admin > IAM section.
This means you can have 2 VMs with the default service account but both have different levels of access.
For simplicity you generally want to just set the IAM permissions and use the cloud-platform API auth scope.
To check whether you have this set up, go to the VM in the Cloud Console and you'll see something like:
Cloud API access scopes
Allow full access to all Cloud APIs
When you SSH into the VM, gcloud is by default logged in as the VM's service account. I'd discourage logging in as yourself; otherwise you more or less break gcloud's configuration to use the default service account.
Once you have this setup you should be able to use gsutil properly.
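If you'd rather check from the command line than the Cloud Console, a quick sketch (instance name and zone are placeholders):
gcloud compute instances describe example-instance --zone=us-central1-a --format="yaml(serviceAccounts)"
The output lists the service account attached to the VM and the API scopes it was granted.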

Google Cloud Platform: Logging in to GCP from commandline

I was sure it would be simple but couldn't find any documentation or resolution.
I'm trying to write a script using gcloud to perform some operations on my GCP instances.
Is there any way to log in/authenticate using gcloud via the command line only?
Thanks
You have a couple of options here (depending on what exactly you're trying to do).
The first option is to log in using the --no-launch-browser option. This still requires interaction from a human user, but doesn't require a browser on the machine you're using:
> gcloud auth login --no-launch-browser
Go to the following link in your browser:
https://accounts.google.com/o/oauth2/auth?redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&prompt=select_account&response_type=code&client_id=32555940559.apps.googleusercontent.com&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fappengine.admin+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcompute&access_type=offline
Enter verification code: *********************************************
Saved Application Default Credentials.
You are now logged in as [user@example.com].
Your current project is [None]. You can change this setting by running:
$ gcloud config set project PROJECT_ID
The non-interactive option involves service accounts. The linked documentation explains them better than I can, but the short version of what you need to do is as follows:
Create a service account in the Google Developers Console. Make sure it has the appropriate "scopes" (these are permissions that determine what this service account can do). Download the corresponding JSON key file.
Run gcloud auth activate-service-account --key-file <path to key file>.
Note that Google Compute Engine VMs come with a slightly-different service account; the difference is described here.
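Putting the non-interactive pieces together, a minimal sketch (the key file path and project ID are placeholders):
gcloud auth activate-service-account --key-file=/path/to/key.json
gcloud config set project my-project-id
gcloud compute instances list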

gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE

I am logged in to a GCE instance via SSH. From there I would like to access the Storage with the help of a Service Account:
GCE> gcloud auth list
Credentialed accounts:
- 1234567890-compute@developer.gserviceaccount.com (active)
I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also made sure to give it the Write ACL on the bucket I would like it to copy a file to:
local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket
But then the following command fails:
GCE> gsutil cp test.txt gs://mybucket/logs
(I also made sure that "logs" is created under "mybucket").
The error message I get is:
Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission 0 B
What am I missing?
One other thing to check is that you set up the appropriate scopes when creating the GCE VM. Even if a VM has a service account attached, it must be assigned devstorage scopes in order to access GCS.
For example, if you had created your VM with devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to the bucket. You would need devstorage.full_control or devstorage.read_write.
See the section on Preparing an instance to use service accounts for details.
Note: the default Compute Engine service account has very limited scopes (including read-only access to GCS). This is done because the default service account has Project Editor IAM permissions. If you use a user-created service account this is typically not a problem, since user-created service accounts get full scope access by default.
After adding the necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.)
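As a hedged sketch, the scope change can also be made from the CLI with the GA set-service-account command (instance name, zone, and service account email are placeholders; the instance must be stopped first):
gcloud compute instances stop example-instance --zone=us-central1-a
gcloud compute instances set-service-account example-instance --zone=us-central1-a --service-account=1234567890-compute@developer.gserviceaccount.com --scopes=storage-rw
gcloud compute instances start example-instance --zone=us-central1-a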
You have to log in with an account that has the permissions you need for that project:
gcloud auth login
gsutil config -b
Then open the URL it provides, click Allow, then copy the verification code and paste it into the terminal.
Stop the VM.
Go to VM instance details.
Under "Cloud API access scopes", select "Allow full access to all Cloud APIs", then click "Save".
Restart the VM and delete ~/.gsutil.
I have written an answer to this question since I cannot post comments:
This error can also occur in some cases if you run the gsutil command with a sudo prefix, since gsutil run under sudo may pick up root's configuration and credentials rather than your own.
After you have created the bucket, go to the Permissions tab, add your email, and set the Storage Admin permission.
Access the VM instance via SSH, run gcloud auth login, and follow the steps.
Ref: https://groups.google.com/d/msg/gce-discussion/0L6sLRjX8kg/kP47FklzBgAJ
So I tried a bunch of things while trying to copy from a GCS bucket to my VM. Hope this post helps someone.
Via an SSH connection, running this command:
sudo gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION_IN_LOCAL]
Got this error:
AccessDeniedException: 403 Access Not Configured. Please go to the Google Cloud Platform Console (https://cloud.google.com/console#/project) for your project, select APIs and Auth and enable the Google Cloud Storage JSON API.
What fixed this was following the "Activating the API" section in this link:
https://cloud.google.com/storage/docs/json_api/
Once I had activated the API, I authenticated myself in the SSH window via
gcloud auth login
After the authentication procedure, I was finally able to download from the Google Storage bucket to my VM.
PS: I did make sure to:
1. Make sure that gsutil is installed on my VM instance.
2. Go to my bucket, go to the Permissions tab, add the desired service accounts, and set the Storage Admin permission/role.
3. Make sure my VM had the proper Cloud API access scopes:
From the docs:
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#changeserviceaccountandscopes
You need to first stop the instance, go to the edit page, go to "Cloud API access scopes", and choose storage full access, read/write, or whatever you need.
Changing the service account and access scopes for an instance
If you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created, instead of the Compute Engine default service account.
To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance. Use one of the following methods to change the service account or access scopes of the stopped instance.
Change the permissions of the bucket.
Add an entry for "allUsers" and give it "Storage Admin" access.