Where on Google Cloud are Datalab IPython notebook files stored? I'd like to be able to access that directory so I can set up a Git repository on GitHub, if at all possible. Thank you!
If you used the deprecated Cloud Datalab Deployer, you can find your committed Datalab IPython notebooks at the following location in the cloud:
https://source.developers.google.com/p/[PROJECT_ID]/
If you are running Datalab locally, the notebooks will be in your $HOME directory on Linux/macOS or C:/Users/<username>/Documents/ on Windows.
Inside the Datalab Docker container itself, you can also find your notebooks in the /content/ folder.
You may find the following links helpful in migrating from Cloud Source Repositories to GitHub:
Google Cloud Datalab : Migrating from Cloud Datalab Deployer
Cloud Source Repositories : Adding a Repository as a Remote
Please let me know if I haven't answered your question correctly.
I've created my Data Fusion instance, network, pipelines, secrets, etc. through Terraform, but I still have one fundamental gap: my pipelines use plugins that are present in the Hub but not enabled by default, like Python and KinesisStreamingSource. I've found Terraform code that will allow me to upload plugins, but it assumes I already have the jars, which to me suggests that solution is more targeted at custom plugins.
Am I missing something fundamental here? Is there a magic API/Terraform command to do a one-step deploy of one of the stock plugins from the Hub into my Data Fusion instance? I'm convinced I'm doing this wrong, as there seems to be nobody else having this same issue.
Any help at all is appreciated :)
I believe the module hub_artifact can be used to deploy an artifact (plugin) from the GCS bucket backing a Data Fusion instance's Hub.
However, you first need to upload your plugins to a GCS bucket.
I used the same module repository's namespace submodule to create namespaces and preferences.
You can also find GitHub repositories published by Google for the Data Fusion plugins: the HTTP plugin, for example, has its own GitHub page.
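For reference, here is a minimal sketch of the "stage the plugin in a GCS bucket" step. The bucket name, plugin version, and object paths below are placeholders, and the JAR and JSON descriptor would be downloaded beforehand (for example, from the plugin's GitHub releases); the hub_artifact module or the Data Fusion REST API is then pointed at these objects.

resource "google_storage_bucket" "datafusion_plugins" {
  name     = "${var.project_id}-datafusion-plugins"  # placeholder bucket name
  location = "US"
}

# Plugin JAR, downloaded beforehand and kept alongside the Terraform config.
resource "google_storage_bucket_object" "http_plugin_jar" {
  name   = "plugins/http-plugins-1.2.2/http-plugins-1.2.2.jar"  # assumed version/path
  bucket = google_storage_bucket.datafusion_plugins.name
  source = "${path.module}/artifacts/http-plugins-1.2.2.jar"
}

# JSON descriptor that describes the artifact to the Hub.
resource "google_storage_bucket_object" "http_plugin_spec" {
  name   = "plugins/http-plugins-1.2.2/http-plugins-1.2.2.json"
  bucket = google_storage_bucket.datafusion_plugins.name
  source = "${path.module}/artifacts/http-plugins-1.2.2.json"
}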
I hope this helps!
I am trying to create a Google Cloud Function through Terraform. The source code for the function is in an Enterprise GitHub instance: https://github.xyz.com/cf
The Terraform code is as below:
resource "google_cloudfunctions_function" "cfcluster" {
name = "cfcluster1"
project = "${var.project_id}"
region = "us-central1"
runtime = "python39"
source_repository {
//url="https://github.xyz.com/cf" #is this possible?
}
Is it possible to connect to Enterprise GitHub from a Google Cloud Function in Terraform? How can I achieve it?
Note
I don't want to connect to a Cloud Source repository from Terraform.
I don't think you can pull code from a GitHub repository directly. What you can do is mirror it to your project's Cloud Source repository, and then you can use the code within Google Cloud Platform as you wish. Here you can find a document on how to mirror your GitHub repository [1].
You can also take a look at this tutorial [2], where you can find the complete steps to deploy an application to Google Cloud Platform from a GitHub repository.
[1] https://cloud.google.com/source-repositories/docs/mirroring-a-github-repository
[2] https://medium.com/swlh/deploying-github-repository-to-google-cloud-platform-997d296547e6
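For illustration, once the GitHub repository has been mirrored, the function can point at the mirrored Cloud Source repository. This is only a sketch: the repository name github_xyz_cf, the branch main, and the entry point main are placeholders you would replace with your own values.

resource "google_cloudfunctions_function" "cfcluster" {
  name         = "cfcluster1"
  project      = var.project_id
  region       = "us-central1"
  runtime      = "python39"
  entry_point  = "main"  # placeholder handler name in the repo's main.py
  trigger_http = true

  source_repository {
    # Cloud Source Repositories URL format:
    # https://source.developers.google.com/projects/PROJECT/repos/REPO/moveable-aliases/BRANCH/paths/PATH
    url = "https://source.developers.google.com/projects/${var.project_id}/repos/github_xyz_cf/moveable-aliases/main/paths//"
  }
}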
I'm working on a Terraform project that will set up all the GCP resources needed for a large project spanning multiple GitHub repos. My goal is to be able to recreate the cloud infrastructure from scratch completely with Terraform.
The issue I'm running into is that in order to set up build triggers with Terraform within GCP, the GitHub repo that sets off the trigger first needs to be connected. Currently, I've only been able to do that manually via the Google Cloud Build dashboard. I'm not sure if this is possible via Terraform or with a script, but I'm looking for any solution that lets me automate it. Once the repositories are connected, updating everything with Terraform works fine.
TLDR; How can I programmatically connect a GitHub project with a GCP project instead of using the dashboard?
Currently there is no way to programmatically connect a GitHub repo to a Google Cloud project. This must be done manually via the Google Cloud console.
My workaround is to manually connect an "admin" project, build containers and save them to that project's Artifact Registry, and then deploy the containers from that registry into the programmatically generated project.
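As a rough sketch of the registry side of that workaround (the project, location, and repository names below are placeholders), the manually connected "admin" project hosts the registry, and the generated projects are granted pull access:

resource "google_artifact_registry_repository" "admin_containers" {
  project       = "my-admin-project"  # the manually connected "admin" project
  location      = "us-central1"
  repository_id = "build-containers"
  format        = "DOCKER"
}

# Let a service account in a programmatically generated project pull images.
resource "google_artifact_registry_repository_iam_member" "generated_project_pull" {
  project    = google_artifact_registry_repository.admin_containers.project
  location   = google_artifact_registry_repository.admin_containers.location
  repository = google_artifact_registry_repository.admin_containers.repository_id
  role       = "roles/artifactregistry.reader"
  member     = "serviceAccount:deployer@my-generated-project.iam.gserviceaccount.com"  # placeholder
}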
Is it possible to access or copy (transfer) a Google Cloud Source git repository to Google Cloud Storage?
The idea is to use the git repo as a website like GitHub Pages.
You can do this as follows:
clone the Google Cloud Source repo
use gsutil cp -r dir1/dir2 gs://my_bucket/subdir to copy the data to Google Cloud Storage, possibly after processing (e.g., if you want to use something like Jekyll or Middleman to generate your website). Note that this will also copy your .git directory, which you might want to exclude.
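If you also want the bucket itself to serve the copied files as a website (the GitHub Pages-style setup mentioned in the question), the bucket can be configured roughly as follows. The bucket name is a placeholder, and the objects must be publicly readable for anonymous visitors:

resource "google_storage_bucket" "site" {
  name     = "www.example.com"  # placeholder; use a domain-named bucket if serving via CNAME
  location = "US"

  website {
    main_page_suffix = "index.html"
    not_found_page   = "404.html"
  }
}

# Allow anonymous reads so the content is served publicly.
resource "google_storage_bucket_iam_member" "public_read" {
  bucket = google_storage_bucket.site.name
  role   = "roles/storage.objectViewer"
  member = "allUsers"
}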
I would like to deploy a branch of a project on my github.com account to AWS Elastic Beanstalk, but I would adore being able to use a GUI.
Is there such a thing for AWS? Will I have to use the AWS command line tools? :-(
Thanks in advance for your time.
You can set up an Elastic Beanstalk environment using the AWS web console and then use something akin to SourceTree or Redmine to view the repository.