After deploying my Flask app, I get the following error when trying to access the MongoDB service:
OperationFailure: not authorized on [db_name] to execute command ...
I understand this is because the db user does not have read/write access to the database, but I'm not able to create a new user or change permissions: db.grantRolesToUser() returns "not a function" and addUser() gives me a "no permission" error... What can I do?
You need to do two things:
Bind the app to the service using cf bind-service (or the web portal)
In the app, parse the VCAP_SERVICES environment variable to get the credentials
This will ensure your app gets readWrite permissions on your database; a minimal parsing sketch for step 2 follows.
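Here is a minimal sketch in Python using pymongo; the "mongodb" service key and the "uri" credential field are assumptions, so check cf env <app> for the exact names in your space:

import json
import os
from pymongo import MongoClient

# VCAP_SERVICES holds a JSON document describing every bound service instance.
vcap = json.loads(os.environ["VCAP_SERVICES"])

# The top-level key and the credential fields depend on the service offering;
# "mongodb" and "uri" are assumptions here.
credentials = vcap["mongodb"][0]["credentials"]

client = MongoClient(credentials["uri"])
db = client.get_default_database()  # the database the bound user can read/write

Since the bound user already has readWrite on that database, no addUser() or grantRolesToUser() calls are needed.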
Here are a few helpful links in this regard:
https://docs.developer.swisscom.com/devguide/services/application-binding.html
https://docs.developer.swisscom.com/devguide/deploy-apps/environment-variable.html#VCAP-SERVICES
https://docs.developer.swisscom.com/service-offerings/mongodb.html
I configured Okta Snowflake SSO and assigned users. I also configured SCIM, which has permission to create users, deactivate users, and sync passwords. After configuring SCIM, I am getting errors for existing users: "Automatic provisioning of user to app snowflake failed. Error while creating user. Conflict. Error reported by remote server. User exist with given user name." The same thing happens when I assign the app to an existing user with the same user name. Is there any way to fix it, or is it best to remove SCIM?
In order for the merge to be successful, the login mapping needs to be exactly the same (the rest gets updated by Okta). So make sure users can log in via SSO first.
You also need to transfer ownership manually. Documentation provides this command:
use role accountadmin;
grant ownership on user <user_name> to role okta_provisioner;
Snowflake SCIM doc
I'm hitting the following error when trying to display graphs with any of my PostgreSQL data sources.
No Data Set Access
Insufficient permissions to the underlying data set.
Access denied, please check your username and password.
Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.
I've whitelisted all Google Data Studio IPs on my PostgreSQL instance, and I have no issue adding the corresponding data source to my Google Data Studio report (Add data > PostgreSQL > Authenticate (using a PostgreSQL user) > Add), but every time I try to add a graph I get this error message.
Does anyone know what is going wrong here?
I was able to solve the issue by granting all privileges on all tables to the user I use to authenticate on Google Data Studio. You need to run the following SQL query with a superuser (such as postgres):
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO my_user;
Another option to solve the issue is authenticating with a superuser (such as postgres).
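If you'd rather apply the grant programmatically, here is a minimal psycopg2 sketch; the connection details are hypothetical, and the ALTER DEFAULT PRIVILEGES statement additionally covers tables the same role creates later:

import psycopg2

# Connect as a superuser such as postgres (connection details are hypothetical).
conn = psycopg2.connect(host="my-host", dbname="my_db", user="postgres", password="secret")
conn.autocommit = True
with conn.cursor() as cur:
    # Grant access on every existing table in the public schema.
    cur.execute("GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO my_user;")
    # Also cover tables created later by the role running this statement.
    cur.execute("ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL PRIVILEGES ON TABLES TO my_user;")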
In case some of you are blocked by the errors shown on the charts, I recommend trying to add the data with a SELECT * FROM my_table in "CUSTOM QUERY" instead of using "TABLES". The error messages there are more explicit.
Bluemix provides a CF command line for download to manage applications.
We want to use CF (or any other command-line tool) to manage Organization and Space users. This would allow us to programmatically sync the user list.
Specifically I am looking for
cf enroll-user
cf add-user
cf remove-user
cf unenroll-user
The cf CLI already lists the users for a given ORG and SPACE.
The simple answer is to read the docs. See "Creating and Managing Users with the cf CLI." It documents commands like create-user, set-org-role, and set-space-role.
For example: Use cf create-user USERNAME PASSWORD to create a new user. The problem is, when you try to do this in Bluemix, you get an error:
>cf create-user jdoe password
Creating user jdoe as bwoolf...
FAILED
Error creating user jdoe.
Server error, status code: 403, error code: access_denied, message: Invalid token does not contain resource id (scim)
You get a similar error when you try to run set-org-role or set-space-role:
FAILED
Server error, status code: 403: Access is denied. You do not have privileges to execute this command.
Why did you get this error? Like @RandalAnders explained, Bluemix currently blocks users from using these user administration commands in the CF CLI. For the time being, you'll need to perform these actions using the Bluemix Dashboard.
Currently, it is not possible within Bluemix to use the CF CLI for certain management commands, as they require administrative privileges. We are exploring expanding the scope of the commands used in the CLI and would be interested in hearing any other use cases you may have.
You cannot create a user on Bluemix using the CF CLI, since that needs admin privileges. To add a user, you will need to use the Bluemix CLI command 'bluemix iam account-user-invite' to invite a user to your account with an org/space role assigned. There are other account/org/space/role management commands under 'bluemix iam'.
Download the Bluemix CLI here: http://clis.ng.bluemix.net
I am logged in to a GCE instance via SSH. From there I would like to access the Storage with the help of a Service Account:
GCE> gcloud auth list
Credentialed accounts:
- 1234567890-compute@developer.gserviceaccount.com (active)
I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also made sure to give it the Write ACL on the bucket to which I would like it to copy a file:
local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket
But then the following command fails:
GCE> gsutil cp test.txt gs://mybucket/logs
(I also made sure that "logs" is created under "mybucket").
The error message I get is:
Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission 0 B
What am I missing?
One other thing to check is that you set up the appropriate scopes when creating the GCE VM. Even if a VM has a service account attached, it must be assigned devstorage scopes in order to access GCS.
For example, if you had created your VM with devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to the bucket. You would need devstorage.full_control or devstorage.read_write.
See the section on Preparing an instance to use service accounts for details.
Note: the default Compute Engine service account gets very limited scopes (including read-only access to GCS). This is done because the default service account has Project Editor IAM permissions. This is not typically a problem for user-created service accounts, since those get full scope access by default.
After adding the necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.)
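To see which scopes the VM actually has, you can query the metadata server from inside the instance; here is a minimal sketch in Python (assuming the requests package is installed):

import requests

# Ask the GCE metadata server which OAuth scopes the attached
# service account was granted when the VM was created.
resp = requests.get(
    "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes",
    headers={"Metadata-Flavor": "Google"},
)
print(resp.text)  # look for devstorage.read_write or devstorage.full_control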
You have to log in with an account that has the permissions you need for that project:
gcloud auth login
gsutil config -b
Then open the URL it provides, click Allow, and copy the verification code back into the terminal.
Stop the VM.
Go to VM instance details.
Under "Cloud API access scopes", select "Allow full access to all Cloud APIs".
Click "Save".
Restart the VM and delete ~/.gsutil .
I have written an answer to this question since I cannot post comments:
This error can also occur if you're running the gsutil command with a sudo prefix in some cases.
After you have created the bucket, go to the permissions tab, add your email, and set the Storage Admin permission.
Access the VM instance via SSH, run gcloud auth login, and follow the steps.
Ref: https://groups.google.com/d/msg/gce-discussion/0L6sLRjX8kg/kP47FklzBgAJ
So I tried a bunch of things while trying to copy from a GCS bucket to my VM. Hope this post helps someone.
Via an SSH connection, I ran this command:
sudo gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION_IN_LOCAL]
Got this error:
AccessDeniedException: 403 Access Not Configured. Please go to the Google Cloud Platform Console (https://cloud.google.com/console#/project) for your project, select APIs and Auth and enable the Google Cloud Storage JSON API.
What fixed this was following the "Activating the API" section mentioned in this link:
https://cloud.google.com/storage/docs/json_api/
Once I activated the API, I authenticated myself in the SSH window via
gcloud auth login
Following the authentication procedure, I was finally able to download from the Google Storage bucket to my VM.
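As an aside, the same download can be done without gsutil by using the google-cloud-storage Python client; the bucket and object names below are placeholders:

from google.cloud import storage

# Uses the VM's attached service account via application default credentials.
client = storage.Client()
bucket = client.bucket("my-bucket")           # placeholder bucket name
blob = bucket.blob("path/to/object.txt")      # placeholder object name
blob.download_to_filename("/tmp/object.txt")  # local destination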
PS
I did make sure to:
Make sure that gsutil is installed on my VM instance.
Go to my bucket, go to the permissions tab, and add the desired service accounts with the Storage Admin permission / role.
Make sure my VM had the proper Cloud API access scopes:
From the docs:
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#changeserviceaccountandscopes
You need to first stop the instance, go to its edit page, go to "Cloud API access scopes", and choose full access to Storage (or read/write, whatever you need it for).
Changing the service account and access scopes for an instance
If you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created, instead of the Compute Engine default service account.
To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance. Use one of the following methods to change the service account or access scopes of the stopped instance.
Change the permissions of the bucket: add an entry for "allUsers" and give it "Storage Admin" access.
I added a user in Mongo using
use production
db.addUser({user:"user",pwd:"pwd",roles:["read"]})
and also authenticated it using
db.auth("user","pwd")
My question is: how do I log in to the production database with the user I just created?
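For reference, here is a minimal pymongo sketch of logging in as that user from Python; localhost and the default port are assumptions:

from pymongo import MongoClient

# Authenticate as the new user against the production database
# (localhost and port 27017 are assumptions).
client = MongoClient("mongodb://user:pwd@localhost:27017/production")
db = client["production"]
print(db.list_collection_names())  # works with the "read" role granted above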