The question is: how do I configure bucket notifications to publish events for changes made by any user of a bucket? Or maybe there is an option to enable notifications for the entire Ceph storage cluster, regardless of bucket or user?
TIA
Edit:
Will try to be a bit more specific - I have user "test1" that created bucket "bucket1", and I configured a bucket notification for bucket1 according to the official docs, with Kafka as the endpoint. Now I can see events in the configured topic whenever I perform any action in the bucket as test1. After that I create a new user, let's say user2. Through the bucket policy of bucket1 I give user2 rights to put and delete objects in this bucket. However, when I perform any action as user2 in bucket "bucket1", I do not see any events in the previously configured topic.
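For context, the setup was along these lines, using the S3-compatible notification API the docs describe (host names, topic name, and ARN below are placeholders, so treat this as a rough sketch rather than the exact commands used):
aws --endpoint-url http://rgw.example.com:8000 sns create-topic --name bucket1-topic --attributes '{"push-endpoint": "kafka://kafka.example.com:9092"}'
aws --endpoint-url http://rgw.example.com:8000 s3api put-bucket-notification-configuration --bucket bucket1 --notification-configuration '{"TopicConfigurations": [{"Id": "notif1", "TopicArn": "arn:aws:sns:default::bucket1-topic", "Events": ["s3:ObjectCreated:*", "s3:ObjectRemoved:*"]}]}'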
It turned out to be a bug - https://tracker.ceph.com/issues/47904
I'm using Google Cloud Storage to serve my static website to the public, and I'm wondering whether there is a way to enable Apache/NGINX-style access logs for the bucket via the GCP web interface?
GCS does offer access logs, although they arrive as CSV files with a bunch of information and not as Apache logs.
Enabling them is fairly simple. Say you want access logs for bucket "mybucket".
First, create another bucket to hold the access logs. Let's call it "mylogsbucket".
Second, give GCS permission to write logs to that bucket with this gsutil command:
gsutil acl ch -g cloud-storage-analytics@google.com:W gs://mylogsbucket
Third, activate logging:
gsutil logging set on -b gs://mylogsbucket gs://mybucket
Usage logs for mybucket will now show up about once per hour in mylogsbucket, and storage logs recording how much data is being stored will show up once per day.
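Once they start arriving, you can list them with gsutil; the log objects are named roughly along the lines of <bucket>_usage_<timestamp>_<id>_v0 and <bucket>_storage_<timestamp>_<id>_v0 (see the docs below for the exact convention), so something like this should show them:
gsutil ls gs://mylogsbucket/mybucket_usage_*
gsutil ls gs://mylogsbucket/mybucket_storage_*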
More documentation on this feature is here: https://cloud.google.com/storage/docs/access-logs
I want to get details of my client's Azure subscription, but I do not want to ask the client for special permissions.
What I need is the bare minimum from my client so that I can log in from PowerShell or the REST API and read the status of runbook jobs.
If I log in with the subscription's admin account, I can easily get those details. But you understand it is not possible to have my client's admin account credentials.
Please suggest some workaround.
What you need to do is create a user in Azure Active Directory and grant that user specific rights using either the Azure Portal or PowerShell/CLI/SDK.
Say, read everything, or read the properties of the desired Automation account. If you want the absolute minimum, you would need to create a custom role first.
https://azure.microsoft.com/en-us/documentation/articles/role-based-access-control-custom-roles/
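As a rough sketch (the subscription ID, role name, user, and file name are placeholders, and the exact action list should be checked against the Automation provider's operations), such a custom role limited to reading runbook job status could be created and assigned with the Azure CLI:
role.json:
{
  "Name": "Runbook Job Reader",
  "Description": "Can read Automation runbook job status only.",
  "Actions": ["Microsoft.Automation/automationAccounts/jobs/read"],
  "NotActions": [],
  "AssignableScopes": ["/subscriptions/<subscription-id>"]
}
az role definition create --role-definition role.json
az role assignment create --assignee consultant@example.com --role "Runbook Job Reader" --scope /subscriptions/<subscription-id>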
If your client placed specific resources within a Resource Group, they may grant you permissions on just that Resource Group (including read-only permissions). This would allow you to have access to needed resources, without having access to other areas of their subscription.
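For example (the resource group name and user here are placeholders), the client could grant the built-in Reader role scoped to just that resource group with the Azure CLI:
az role assignment create --assignee consultant@example.com --role Reader --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>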
I want to know the right/best way of having one machine copy data to Google Cloud Storage.
I need one machine to be able to write to a bucket, but not be able to create or delete other buckets.
While researching, I found out that you should create a service account so that this account can log in to GCP and then use the storage.
But the problem is that when the machine is a GCE instance, there are scopes. With the "Default" scope it can read from Google Storage, but it cannot write to it, even after authenticating with a service account.
When the scope is devstorage.read_write, the machine can create and remove buckets from that storage without logging in. I find that too risky.
Does anyone have any recommendations?
Thanks
The core problem here is that the "write" scope covers both write and delete, and that the GCE service account is likely a member of project-editors, which can create and delete buckets. It sounds like what you want to do is restrict a service account to only being able to affect a single bucket. You should be able to do this with these steps:
Create a service account in your project (and save the private key file).
In the permissions page for the project, make sure that service account is not a project editor for your project.
Using an account that does have full permissions to your project, create the bucket, then grant the service account write access to the bucket. Example gsutil commands to do this:
gsutil mb gs://yourbucket
gsutil acl ch -u your-service-account-name@gserviceaccount.com:W gs://yourbucket
Create a VM that does not have a GCE service account enabled.
Push the service account's private key file to that VM.
On the VM, gcloud auth activate-service-account --key-file=your-key-file.json
Now gsutil commands run on the VM should be able to write to (and delete) objects in that bucket, but not any other buckets in your project.
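A quick sanity check (file and bucket names here are just examples): object writes to the permitted bucket should succeed, while bucket-level operations should be rejected:
gsutil cp ./somefile.txt gs://yourbucket/        (should succeed)
gsutil mb gs://some-other-new-bucket             (should fail with an access-denied error)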
Our team created some data on Google Cloud Storage so another team can copy/download/read it from there, but when they tried, they always got a 403 Forbidden message. I tried to edit the permissions on that bucket and added a new permission as 'Project', 'viewers-(other team's project id)', and 'Reader', but they still got the same error when they ran this command:
gsutil cp -R gs://our-bucket gs://their-bucket
I also tried with their client ID and email account; still the same.
I'm not sure one can take another project's collection of users with a given access right (readers, in this case) and apply it to an object in a different project.
An alternative would be to control bucket access via Google Groups: simply set up a group for readers, adding the users you wish to grant this right to. Then you can use said group to control access to the bucket and/or its contents. Further information, and a use-case scenario, here: https://cloud.google.com/storage/docs/collaboration#group
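For example (the group address is a placeholder), once the group exists you could grant it read access to the bucket with:
gsutil acl ch -g other-team-readers@googlegroups.com:R gs://our-bucket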
Try:
gsutil acl ch -u serviceaccount@google.com:R gs://your-bucket
Here acl ch changes the ACL on 'your-bucket'; -u specifies the user (serviceaccount@google.com) and :R grants them Reader access.
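You can then confirm the grant took effect (same bucket name as above) with:
gsutil acl get gs://your-bucket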
We are transferring a Google Cloud Storage bucket name (a naked domain name) from one user to another. Since we no longer have an active account that owns the bucket name, the bucket was deleted from Google Cloud Storage. We then tried to recreate the same bucket name, but the console panel continues to deny creating the bucket with the following error:
The bucket you tried to create is a domain name owned by another user.
It has been several days since the bucket was deleted.
In order to create a bucket that maps to a domain name, the account creating the bucket must be the registered owner of the domain name. Presumably the old account is registered as the owner of the domain. You're going to want to have the new account go through the domain verification process at Google Webmaster Tools: https://www.google.com/webmasters/tools/
Here's some more documentation about how to claim ownership of the domain name: https://support.google.com/webmasters/answer/35179