Change AWS WorkSpaces directory (Simple AD)

Hello, I am using Amazon WorkSpaces. Unfortunately, I assigned the wrong VPC to my directory (Simple AD), which is attached to my WorkSpaces. Is it possible to change the directory of a WorkSpace, or the VPC attached to a directory?

Related

How to allow an internal GKE Pod to communicate through a VPN to an internal IP in another VPC?

I have a private GKE cluster (with a NAT) that I need to connect to a legacy VPC in another GCP project.
I built a Classic VPN between Project B (new) and Project A (old): all VMs can talk to each other (nc -vz is my friend).
The GKE cluster inside Project B can reach all VMs in Project B by internal IP.
I need some pods in this GKE cluster to be able to reach private IPs behind the VPN, inside Project A.
We tried this how-to, but it is still not working.
If you have an idea that works in my case, I will buy you a beer ;) (location: Le Havre, Lille, or Paris)
Infra scheme (diagram not included)
There is an option in GCP called "Shared VPC" that, in summary, allows multiple projects within an organization to be interconnected using a common Virtual Private Cloud. As specified in GCP's documentation, an organization policy applies to all projects in the organization, so you only need to follow the steps in "Organization policies for Shared VPC" once to restrict lien removal. Then follow these steps to provision the Shared VPC (a gcloud sketch of the same flow appears after the steps):
- Go to the Shared VPC page in the Google Cloud Console.
- Log in as a Shared VPC Admin.
- Select the project you want to enable as a Shared VPC host project from the project picker.
- Click Set up Shared VPC.
- On the next page, click Save & continue under Enable host project.
- Under Select subnets, do one of the following:
  a) Click Share all subnets (project-level permissions) if you need to share all current and future subnets in the VPC networks of the host project with service projects and Service Project Admins specified in the next steps.
  b) Click Individual subnets (subnet-level permissions) if you need to selectively share subnets from the VPC networks of the host project with service projects and Service Project Admins. Then, select Subnets to share.
- Click Continue.
- On the next screen, in Project names, specify the service projects to attach to the host project. Note that attaching service projects does not define any Service Project Admins; that is done in the next step.
- In the Select users by role section, add Service Project Admins. These users will be granted the IAM role compute.networkUser for the shared subnets. Only Service Project Admins can create resources in the subnets of the Shared VPC host project.
- Click Save.
GCP's official documentation covers this in more detail: see "Shared VPC overview" for an overview and "Setting up Shared VPC" for the full setup process.
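For those who prefer the CLI, here is a minimal gcloud sketch of the same setup; the project IDs below are hypothetical placeholders, and the commands must be run as a Shared VPC Admin:

    # Enable the (hypothetical) host project for Shared VPC.
    gcloud compute shared-vpc enable my-host-project

    # Attach a (hypothetical) service project to the host project.
    gcloud compute shared-vpc associated-projects add my-service-project \
        --host-project my-host-project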

GCloud SDK - Shared VPC - Associate only a specific network when attaching a project using gcloud SDK

Is there a way to share a specific subnet of a Shared VPC with a project using the gcloud SDK?
I can use the command below to associate a project, but it shares all of the host project's subnets, and there does not appear to be a flag for sharing only a specific subnet from the host project with the service project.
https://cloud.google.com/sdk/gcloud/reference/compute/shared-vpc/associated-projects/add
In this case, you can achieve this as a Shared VPC Admin, who can define an IAM member from a service project as a Service Project Admin with access to only some of the subnets in the host project; see the sketch below.
Hope this works for you.
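A minimal sketch, assuming hypothetical project, subnet, region, and user names; the subnet-level IAM binding is what restricts sharing to that one subnet:

    # Hypothetical names throughout. After attaching the service project
    # (gcloud compute shared-vpc associated-projects add), grant
    # compute.networkUser on just the subnet you want to expose,
    # instead of sharing every subnet at the project level.
    gcloud compute networks subnets add-iam-policy-binding my-shared-subnet \
        --project my-host-project \
        --region europe-west1 \
        --member "user:service-admin@example.com" \
        --role "roles/compute.networkUser"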

Pointing to a private GitHub repository or AWS S3 as the notebook directory for JupyterHub notebook servers

Is it possible to point to a private GitHub repository or AWS S3 as the notebook directory for JupyterHub notebook servers?
In the JupyterHub config file I can set c.Spawner.notebook_dir to point to local directories, but how can I point to a password-protected file share, a private GitHub repository, or AWS S3?
There is some information here - https://github.com/jupyterhub/jupyterhub/issues/314 - on customizing the directory location for each user. Is there a way to extend the custom spawner class so it can point to a private GitHub repo or S3?
The simplest way, if you can satisfy its requirements, would be to use an S3 FUSE filesystem to mount an S3 bucket at a path in your local directory tree.
You could also further extend the custom spawner in that issue to re-clone/update a GitHub repo every time you spawn a notebook (and then pass the path into the notebook), but that would be pretty slow. In that case, the user account running the spawner also needs to be able to read the credentials for the GitHub account. The S3 solution lets you handle credentials outside the Jupyter workflow, preserving them under a different permissions scheme.
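A minimal sketch of the S3 approach, assuming s3fs-fuse is installed and using hypothetical bucket and mount-point names:

    # Credentials file holds ACCESS_KEY_ID:SECRET_ACCESS_KEY (chmod 600).
    echo "ACCESS_KEY_ID:SECRET_ACCESS_KEY" > ${HOME}/.passwd-s3fs
    chmod 600 ${HOME}/.passwd-s3fs

    # Mount the (hypothetical) bucket where the spawner can see it.
    s3fs my-notebook-bucket /srv/notebooks -o passwd_file=${HOME}/.passwd-s3fs

    # Then, in jupyterhub_config.py, point the spawner at the mount:
    #   c.Spawner.notebook_dir = '/srv/notebooks'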

How to integrate local Active Directory folders with Google Cloud Storage

I have a local Active Directory server and a few shared folders with permissions. The users created in Active Directory have accounts whose files are saved in Google Cloud Storage. I would like to know if there is a way to use the files from Google Cloud Storage so that the admin can still manage the permissions and users from Active Directory. If not, what other solutions can I use?
Thanks!

GCS bucket permission for Project

I use a GCS bucket to upload content. I am distributing a script to my users that helps them download the content in the GCS bucket to their local directory. Each of the users is also a GCP project owner.
How do I set permissions in GCS bucket to enable only selected GCP projects to access the contents?
Thanks
In order to give users in a different project access to a Cloud Storage bucket, you can edit the bucket permissions and add the "Owners", "Editors", or "Viewers" of that project to the bucket's ACLs. For example, if you want the owners of project A to have full permissions on a bucket in project B, change the permissions of the bucket and add the entity "Project owners-" followed by project A's project number. By doing that, all the owners of project A will have full control over the bucket in project B.
Note: if you change the ACLs of the bucket, the changes will only apply to new objects uploaded to the bucket. Objects already in the bucket will keep their old ACLs.
You can read more about bucket and object ACLs at this link [1]:
[1] https://cloud.google.com/storage/docs/accesscontrol
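As a sketch, the same grant can be made with gsutil, assuming a hypothetical bucket name and project number:

    # Hypothetical bucket name and project number. Grant the owners of
    # project A full control (O) on the bucket in project B:
    gsutil acl ch -p owners-123456789012:O gs://my-content-bucket

    # Also update the default object ACL so newly uploaded objects
    # carry the same grant (existing objects keep their old ACLs):
    gsutil defacl ch -p owners-123456789012:O gs://my-content-bucket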