Is there any way to configure review apps in Spinnaker, like Heroku or GitLab have? I looked through the documentation and forums and couldn't find anything. Just verifying.
You can trigger a Spinnaker pipeline from a GitLab merge request event and deploy the app under review to a separate namespace.
The main task is to implement the correct lifecycle and clean up obsolete applications using Spinnaker pipelines, but it is possible with Spinnaker's current functionality.
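For illustration only (this is my sketch, not part of the answer above; Spinnaker stores pipelines as JSON, shown here as YAML for readability): a pipeline can listen on Gate's generic webhook endpoint and derive the review namespace from the merge request payload. The trigger source name and the SpEL expression are assumptions.

```yaml
# Hedged sketch of a Spinnaker pipeline trigger reacting to a GitLab merge request
# webhook pointed at /webhooks/webhook/gitlab-mr (the source name "gitlab-mr" is made up).
triggers:
  - type: webhook
    source: gitlab-mr
    enabled: true

# In a Deploy (Manifest) stage you could then template the namespace from the payload,
# e.g. with a SpEL expression such as:
#   review-${trigger['payload']['object_attributes']['iid']}
# and add a cleanup pipeline/stage that deletes that namespace when the MR is closed.
```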
I am new to Terraform and I want to set up a CI/CD pipeline to GCP with GitHub, replacing a current system that uses Jenkins, as we want to increase automation of deployments. What would be the best way or architecture to do this?
One of the primary products related to CI/CD is Google's Cloud Build.
https://cloud.google.com/build
Its one-liner reads:
Build, test, and deploy on our serverless CI/CD platform.
It has built-in triggers that include GitHub integration, meaning that when events occur on GitHub, Cloud Build runs its prescribed recipes.
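As a rough sketch (not an official recipe): a Cloud Build GitHub trigger pointing at a cloudbuild.yaml like the one below would run Terraform on each push. The image tag and step layout are assumptions, and the Cloud Build service account needs whatever IAM permissions your Terraform code requires.

```yaml
# cloudbuild.yaml - minimal Terraform pipeline run by a Cloud Build GitHub trigger.
steps:
  - id: terraform-init
    name: hashicorp/terraform:1.5   # public Terraform image; pin whichever version you use
    args: ["init"]
  - id: terraform-plan
    name: hashicorp/terraform:1.5
    args: ["plan", "-out=tfplan"]
  - id: terraform-apply             # consider gating this behind a branch filter or trigger approval
    name: hashicorp/terraform:1.5
    args: ["apply", "-auto-approve", "tfplan"]
```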
I'd suggest reading the documentation found at the above page and also cross-referencing the curated documentation found on GCP Weekly here:
Tag: CI
Tag: Cloud Build
Good morning!
I have been playing around with GitHub Actions to build and deploy to multiple stages. Works like a charm.
But deployments in GitHub have been hard to get an overview of.
I have access to Environments, which is where I’ve added some secrets.
I found this article in the docs, but I can't find the dashboard it describes in my public repo. https://docs.github.com/en/github/administering-a-repository/viewing-deployment-activity-for-your-repository#viewing-the-deployments-dashboard
Does anyone know how GitHub Deployments work and how to start using them?
"Github Deployments" is really just an API you can use to alert Github about deployments (start/finish) and the Deployments Dashboard to view activity. In order for anything to show up there you first need to actually trigger a Github deployment event and specify an environment. Put the action I've linked below at the start and then end of your existing deployment automation in Github Actions and you should start seeing something show up.
Deployment Action
https://github.com/bobheadxi/deployments
You should probably give this a read-through as well; it goes into detail on where the GitHub Deployments API fits into things.
https://docs.github.com/en/rest/reference/repos#deployments
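A rough sketch of what that looks like in a workflow, based on the action's README (treat the exact inputs as assumptions and check the README for the version you pin):

```yaml
# Wrap your existing deploy steps with a "start" and a "finish" call to the action
# so GitHub records a deployment for the chosen environment.
name: deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: bobheadxi/deployments@v1
        id: deployment
        with:
          step: start
          token: ${{ secrets.GITHUB_TOKEN }}
          env: staging                      # should match the Environment you configured

      - run: ./deploy.sh                    # placeholder for your actual deployment steps

      - uses: bobheadxi/deployments@v1
        if: always()                        # report success or failure either way
        with:
          step: finish
          token: ${{ secrets.GITHUB_TOKEN }}
          env: staging
          status: ${{ job.status }}
          deployment_id: ${{ steps.deployment.outputs.deployment_id }}
```

Once deployments are being recorded like this, the environments and the dashboard start populating on their own.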
I am deploying microservices into Kubernetes, so whenever a new version of a Docker image is pushed to JFrog Artifactory, it should notify GitHub or ArgoCD.
Is there an existing solution for this?
If your Artifactory server is installed on-prem, the easiest way is to implement a user plugin. You can use the one from our user plugins repository (and adapt it as needed), or write your own that notifies the services you need on the events you care about.
Here is a blog post detailing how to use the existing plugin.
Also, a built-in feature is on our roadmap.
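On the GitHub side, one way to receive such a notification (my illustration, not something from the plugin or blog post above) is a repository_dispatch event: whatever plugin or script fires on the Artifactory event does a POST to /repos/OWNER/REPO/dispatches, and a workflow reacts to it. The event_type and payload fields below are made up.

```yaml
# .github/workflows/on-image-pushed.yml - reacts to a repository_dispatch event that
# your Artifactory-side plugin/script sends, e.g.:
#   POST /repos/OWNER/REPO/dispatches
#   {"event_type": "image-pushed", "client_payload": {"image": "my-service", "tag": "1.2.3"}}
name: on-image-pushed
on:
  repository_dispatch:
    types: [image-pushed]        # assumed event_type name
jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - run: |
          echo "New image: ${{ github.event.client_payload.image }}:${{ github.event.client_payload.tag }}"
          # from here you could bump the image tag in your GitOps repo so ArgoCD picks it up
```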
I am adding end-to-end tests to our dev environment, which runs in Kubernetes. I'd like to kick off tests whenever a service is updated. My question is: what is the best way to kick off tests when a new/updated service is ready?
I have found two options that seem ok, but not perfect:
1) post-upgrade helm chart hooks
2) validating admission webhooks
Does anyone have experience in this regard or an opinion on the best way? Is there another way that I have missed?
EDIT:
After I suggested admission webhooks to my DevOps team, they pointed me to Kubernetes controllers, and I found this:
https://github.com/bitnami-labs/kubewatch
I think that kubewatch will meet my needs.
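To make option 1 concrete for anyone who goes that route instead, here is a hedged sketch of a Helm hook Job; the image, command, and service URL are placeholders for whatever runs your e2e suite.

```yaml
# templates/e2e-test-job.yaml - runs after every successful helm install/upgrade.
apiVersion: batch/v1
kind: Job
metadata:
  name: {{ .Release.Name }}-e2e
  annotations:
    "helm.sh/hook": post-install,post-upgrade
    "helm.sh/hook-delete-policy": before-hook-creation   # replace the old Job on each run
spec:
  backoffLimit: 0
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: e2e
          image: registry.example.com/e2e-runner:latest   # placeholder test-runner image
          command: ["./run-e2e.sh"]                        # placeholder entrypoint
          env:
            - name: TARGET_URL
              value: http://{{ .Release.Name }}.{{ .Release.Namespace }}.svc.cluster.local
```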
I have a private GitLab instance with multiple projects and GitLab CI enabled. The infrastructure is provided by Google Cloud Platform, and the GitLab pipeline runner is configured in a Kubernetes cluster.
This setup works very well for basic pipelines running tests, etc. Now I'd like to start with CD, and to do that I need a manual acceptance step in the pipeline, which means the person reviewing it needs access to the current state of the app.
What I'm thinking of is a Kubernetes deployment for the pipeline that would only be started once someone tries to access it (so we don't waste cluster resources) and would be destroyed once the reviewer accepts the pipeline or after some time threshold.
So the deployment would run in the same cluster as the GitLab Runner (or a different one?) and would be accessible via a unique URL (we're mostly talking about web-server apps), e.g. https://pipeline-58949526.git.mydomain.com
While in theory it all makes sense to me, I don't really know how to set this up properly.
Does anyone have a similar setup? Is my view on this topic too simple? Let me know!
Thanks
If you want to see how to automate CI/CD with multiple environments on GKE, using GitOps for promotion between environments and preview environments on pull requests, you might want to check out my recent talk on Jenkins X at Devoxx UK, where I do a live demo of this on GKE.
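And if you'd rather stay on plain GitLab CI than adopt Jenkins X, the flow described in the question maps fairly closely onto GitLab's dynamic environments. A hedged sketch of the relevant .gitlab-ci.yml jobs follows; the deploy/teardown scripts and the wildcard DNS under git.mydomain.com are assumptions, and this does not give you the start-on-first-access behaviour, only the per-pipeline URL, a manual stop, and automatic teardown after a threshold.

```yaml
deploy_review:
  stage: deploy
  script:
    - ./deploy-review.sh          # placeholder: creates the Deployment/Service/Ingress in the runner's cluster
  environment:
    name: review/$CI_COMMIT_REF_SLUG
    url: https://$CI_ENVIRONMENT_SLUG.git.mydomain.com   # needs wildcard DNS pointing at your Ingress
    on_stop: stop_review
    auto_stop_in: 1 week          # GitLab stops the environment after this threshold
  rules:
    - if: $CI_MERGE_REQUEST_IID

stop_review:
  stage: deploy
  script:
    - ./destroy-review.sh         # placeholder: deletes the review namespace/resources
  environment:
    name: review/$CI_COMMIT_REF_SLUG
    action: stop
  rules:
    - if: $CI_MERGE_REQUEST_IID
      when: manual
```

The manual acceptance itself can be a separate `when: manual` job in a later stage, so the reviewer opens the environment URL, checks the app, and then triggers the promotion by hand.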