how to integrate local Active Directory folders with Google Cloud Storage

I have a local Active Directory server with a few shared folders and permissions on them. The users created in Active Directory each have an account, and those files are saved in Google Cloud Storage. I would like to know if there is a way to use the files from Google Cloud Storage so that the admin can still manage the permissions and users from Active Directory. If not, what other solutions can I use?
Thanks!

Related

how to use local file access rights while running PowerShell as administrator

So, I have an admin account to install programs, but with that admin account I do not have access to a network location that I can access as a local user.
I am building scripts that install multiple programs and afterwards copy a bunch of config files from a network location to the local machine. Some items require admin rights to be copied.
Ideally I want to execute the script as administrator, but use the local user's rights to access the config files on the network location within the same script.
I have searched Google for hours now, but I still do not have a solution. Does anyone have an idea of how to accomplish this? Asking for the missing access rights is unfortunately not an option.
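One common workaround is to authenticate to the share explicitly with the non-admin credentials from inside the elevated session. The sketch below does this from Python by shelling out to net use (the share path, account name, and destination path are placeholders); the same idea works in PowerShell with net use or New-PSDrive -Credential:

    import getpass
    import shutil
    import subprocess

    SHARE = r"\\fileserver\configs"    # placeholder network location
    USER = r"DOMAIN\localuser"         # placeholder account that can read the share

    # Cache credentials for the share inside the elevated session.
    password = getpass.getpass(f"Password for {USER}: ")
    subprocess.run(["net", "use", SHARE, password, f"/user:{USER}"], check=True)
    try:
        # The elevated process can now read the UNC path with the cached credentials.
        shutil.copy(SHARE + r"\app.config", r"C:\ProgramData\MyApp\app.config")
    finally:
        # Remove the connection again when done.
        subprocess.run(["net", "use", SHARE, "/delete"], check=True)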

Access google cloud data without GCP account

I have created a bucket and files in it with Google Cloud Storage. I have also edited the permissions of the bucket to allow access to people in a Google Group.
Now, if they need to access the data, do they need to "sign up" at the Google Cloud Platform?
Is there any way they can copy all the files in the bucket again using gsutil without a GCP account?
No, there's no way to allow access to a bucket like that, but Google accounts and GCP accounts are the same, so anyone with a Gmail account could access it.
The closest thing to your use case is Signed URLs, which can grant access to individual objects.
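For reference, here is a minimal sketch with the google-cloud-storage Python client that generates a V4 signed URL for a single object; the bucket name, object name, and service-account key path are placeholders:

    from datetime import timedelta

    from google.cloud import storage

    # Sign with a service account key (placeholder path); signed URLs need a
    # credential that can sign, not just default end-user credentials.
    client = storage.Client.from_service_account_json("service-account.json")

    bucket = client.bucket("my-shared-bucket")   # placeholder bucket
    blob = bucket.blob("reports/summary.pdf")    # placeholder object

    # V4 signed URLs can be valid for at most 7 days.
    url = blob.generate_signed_url(
        version="v4",
        expiration=timedelta(hours=24),
        method="GET",
    )
    print(url)  # anyone with this link can download the object until it expires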

Access to AmazonWorkspaces

I am setting up an Amazon WorkSpaces instance, and need to provide the user with a password to log in. The invite email only contains the registration code.
How do I set up a user password that lets the user log in to Amazon WorkSpaces only (no console access)?
I am creating an Amazon WorkSpace from a custom bundle, and adding a user to Simple AD.
Individual workspaces created in Amazon WorkSpaces are assigned to, and used by, individuals who are defined in a Directory.
From Manage Directories for Amazon WorkSpaces:
Amazon WorkSpaces uses a directory to store and manage information for your WorkSpaces and users. You can use one of the following options:
AD Connector — Use your existing on-premises Microsoft Active Directory. Users can sign into their WorkSpaces using their on-premises credentials and access on-premises resources from their WorkSpaces.
Microsoft AD — Create a Microsoft Active Directory hosted on AWS.
Simple AD — Create a directory that is compatible with Microsoft Active Directory, powered by Samba 4, and hosted on AWS.
Cross trust — Create a trust relationship between your Microsoft AD directory and your on-premises domain.
If you have your own Active Directory, then use it. If you do not have Active Directory, the simplest option is to choose Simple AD, which is an Active Directory-compatible Samba service. (Charges apply.)
The user is selected when the WorkSpace is created. Amazon WorkSpaces will send a registration code to the end user. The end user then uses an Amazon WorkSpaces client to connect to the service. They provide the registration code to configure the client (once only), then log in with their AD credentials.
To set the user password for an Amazon WorkSpace, set the user's password in the WorkMail application for that user. Directory Service manages users for both Amazon WorkSpaces and WorkMail, and the password is shared between them.
The user must exist, but can be disabled. The email address can be anything; in my case, users do not use AWS email.
Users defined in IAM are not visible from Amazon WorkSpaces or WorkMail. If a user needs console, S3, EC2, etc. access in addition to a WorkSpace, they would need to be defined separately in IAM and in the directory.
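As an illustrative sketch only (directory ID, bundle ID, user name, and password are placeholders): with boto3 you can provision the WorkSpace for an existing directory user and, for Simple AD or AWS Managed Microsoft AD, set that user's password through the Directory Service ResetUserPassword API instead of going through WorkMail:

    import boto3

    ds = boto3.client("ds")          # AWS Directory Service
    ws = boto3.client("workspaces")  # Amazon WorkSpaces

    DIRECTORY_ID = "d-1234567890"    # placeholder Simple AD / Managed AD directory
    BUNDLE_ID = "wsb-abcdefghi"      # placeholder custom bundle
    USER_NAME = "jdoe"               # must already exist in the directory

    # Provision a WorkSpace assigned to the directory user.
    ws.create_workspaces(
        Workspaces=[{
            "DirectoryId": DIRECTORY_ID,
            "UserName": USER_NAME,
            "BundleId": BUNDLE_ID,
        }]
    )

    # Set the password the user will log in with; the registration code only
    # identifies the directory, the credentials come from the directory itself.
    ds.reset_user_password(
        DirectoryId=DIRECTORY_ID,
        UserName=USER_NAME,
        NewPassword="S0me-Temp-Passw0rd!",  # placeholder; change after first login
    )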

Hiding Buckets in Google Cloud Storage

We've just moved files off of a 10-year-old FTP server and are now using Google Cloud Storage for our company's files. This was set up to use the web-hosting feature of GCS, and the access logging capability was also enabled.
The access logs are dumped into a bucket. I would like to deny our users access to this logs bucket, but allow them to continue using the main GCS bucket (we use Cyberduck for this).
We are presently allowing anybody with our company's email address to read/write to the buckets in this project, by giving the "Domain" the "Storage Admin" and "Storage Object Admin" roles. This was granted through the IAM permissions.
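For illustration, a rough sketch with the google-cloud-storage Python client of how a role can be bound to a single bucket instead of the whole project, so a domain-wide grant can cover the main bucket without touching the logs bucket (bucket name and domain are placeholders):

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("main-company-bucket")  # placeholder; not the logs bucket

    # Fetch the bucket's own IAM policy and add a bucket-scoped binding.
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        "role": "roles/storage.objectAdmin",
        "members": {"domain:example.com"},  # placeholder company domain
    })
    bucket.set_iam_policy(policy)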

GCS bucket permission for Project

I use a GCS bucket to upload content. I am distributing a script to my users that helps them download the content in the GCS bucket to their local directory. Each of the users is also a GCP project owner.
How do I set permissions in GCS bucket to enable only selected GCP projects to access the contents?
Thanks
In order to give users in a different project access to a Cloud Storage bucket, you can edit the bucket permissions and add the "Owners", "Editors", or "Viewers" of that project to the bucket ACLs. For example, if you want the owners of project A to access a bucket in project B with full permissions, you can change the permissions of the bucket and add the project owners of project A as owners. By doing that, all the owners of project A will have full control over the bucket in project B.
Note: If you change the ACLs of the bucket, the changes will apply to new objects uploaded to the bucket. Objects already in the bucket will still have their old ACLs.
You can read more about bucket and object ACLs at this link [1]:
[1] https://cloud.google.com/storage/docs/accesscontrol
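A rough sketch of the same idea with the google-cloud-storage Python client (project number, bucket, and object names are placeholders); it grants the owners of project A full control on a bucket in project B, and, because ACL changes only affect new objects, also shows a per-object grant:

    from google.cloud import storage

    client = storage.Client(project="project-b")        # placeholder project
    bucket = client.bucket("project-b-content-bucket")  # placeholder bucket

    # Grant full control on the bucket to the owners of project A
    # (use project A's project number, not its name). ACLs only apply when
    # uniform bucket-level access is disabled on the bucket.
    bucket.acl.entity("project-owners", "123456789012").grant_owner()
    bucket.acl.save()

    # Objects already in the bucket keep their old ACLs, so grant per object.
    blob = bucket.blob("existing/file.txt")              # placeholder object
    blob.acl.entity("project-owners", "123456789012").grant_owner()
    blob.acl.save()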