Gcloud list all projects including ones pending deletion?

I know I can:
gcloud projects list
However, this does not show projects pending deletion. How can I list those as well?

Try:
gcloud projects list --filter='lifecycleState:*'
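If you only want the projects that are pending deletion, it should also be possible to filter on that lifecycle state explicitly (DELETE_REQUESTED is the state the Resource Manager API reports for such projects):
gcloud projects list --filter='lifecycleState:DELETE_REQUESTED'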

Related

Multiple rapid updates to a single resource getting collapsed into a single latest version

I was trying to create a Kubernetes CRD to manage some of the resources on our system.
But I noticed that when the custom resource objects (CROs) for this CRD are operated on in rapid succession (delete right after create, multiple updates, update right after create, etc.), the operations get collapsed into a single latest version (I am not sure at what point in the Kubernetes infrastructure this happens).
Is it possible to disable such collapsing of CRO operations?

The service already exists vstsagent.dev.mycomputername

I want to create a deployment agent for my project in Azure DevOps.
I installed an agent and then managed to delete it using
./config remove
then rebooting
then removing the folder.
When I installed it again, I got this message:
The service already exists: vstsagent.dev.mycomputername, it will be replaced
Could not delete the service 'vstsagent.dev.mycomputername'
The deployment group shows as offline in Azure DevOps.
My case may give you some insight. I had tried the agent setup several times, so there were multiple services installed. What I needed to do was execute ./config remove inside EVERY agent directory to remove all of the services before a fresh re-install. (So I executed this command 8 times here.)
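For example (a sketch only; the folder names are placeholders for wherever your leftover agent installations actually live), the clean-up amounts to repeating the same command in each folder:
cd <first-agent-folder>
./config remove
cd <second-agent-folder>
./config remove
and so on for every remaining agent folder, before running the fresh install.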

Is the GCP Folder resource alpha?

It's not clear to me whether the GCP Folder resource is alpha or publicly available. The API around it, gcloud alpha resource-manager folders, certainly seems alpha. Can I go ahead and build my solution's structure using folders, or not yet?
While it may not be a definitive answer (occasionally the documentation may lag behind), one way to check whether a particular gcloud command is alpha, beta or generally available is to look for a NOTES section at the bottom of the respective command's documentation page (use the left sidebar for navigation).
In particular for gcloud alpha resource-manager folders you see:
NOTES
This command is currently in ALPHA and may change without notice.
By comparison, the gcloud alpha app update command shows:
NOTES
This command is currently in ALPHA and may change without notice.
These variants are also available:
$ gcloud app update
$ gcloud beta app update
Since gcloud app update is available, the feature is generally available and thus covered by an SLA, so it's safe to base solutions on it.
But gcloud alpha resource-manager folders doesn't show a gcloud beta resource-manager folders or a gcloud resource-manager folders, so it's indeed only an alpha release.
Neither alpha nor beta features are covered by SLAs and may change at any time. That's not to say you can't start working on a solution using them, as long as you're prepared to revise it or switch to some other solution if/when things change (and, of course, you don't expect a service SLA for them). It's really up to you.
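The same NOTES section should also appear at the end of the command's built-in help, so a quick local check (assuming the alpha commands are available in your gcloud installation) could be:
$ gcloud alpha resource-manager folders list --help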
As for the Cloud Folders feature itself, it reached General Availability. From the July 24, 2017 release note:
Folders General Availability
Cloud folders are nodes in the Cloud Platform Resource Hierarchy.
A folder can contain projects, other folders, or a combination of
both. You can use folders to group projects under an organization in a
hierarchy. For example, your organization might contain multiple
departments, each with its own set of Cloud Platform resources.
Folders allow you to group these resources on a per-department basis.
Folders are used to group resources that share common IAM policies.
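For illustration, listing the folders directly under an organization with the alpha command looks something like this (ORGANIZATION_ID is a placeholder for your numeric organization ID):
$ gcloud alpha resource-manager folders list --organization=ORGANIZATION_ID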

How to set up 2 different Jenkins jobs linked with 2 different repos in one Jenkins installation?

I have Job1 that is linked to a GitHub repo, and when I push code it builds in its own workspace (space1).
I want to add a second job (Job2) that will be linked with a different GitHub repo and will build the code in a different workspace (space2).
Notice: 2 different jobs building different code from different repos (both master branches) in different workspaces.
Is it possible with vanilla Jenkins or will I need any extra plugin?
I have researched Pipeline (link1, link2) a little, but I am trying to figure out whether it covers my use case.
EDIT:
I have set up the communication between the second job and GitHub, but in order for the build to succeed it needs an SSH key. However, Jenkins provides only one slot for configuring the SSH key.
I have also added a second workspace.
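For reference, a minimal declarative Pipeline that checks out one repository into its own job workspace would look roughly like this (the repo URL, branch and credentials ID are placeholders, and the build step is just an example):
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // Check out this job's repository; each job keeps its own workspace
                git url: 'git@github.com:example/repo2.git', branch: 'master', credentialsId: 'repo2-ssh-key'
            }
        }
        stage('Build') {
            steps {
                // Replace with the actual build command for this repo
                sh './build.sh'
            }
        }
    }
}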

Can you share Docker Images uploaded to Google Container Registry between different accounts?

We'd like to have a separate test and prod project on the Google Cloud Platform but we want to reuse the same docker images in both environments. Is it possible for the Kubernetes cluster running on the test project to use images pushed to the prod project? If so, how?
Looking at your question, I believe by account you mean project.
The command for pulling an image from the registry is:
$ gcloud docker pull gcr.io/your-project-id/example-image
This means that as long as your account is a member of the project the image belongs to, you can pull the image from that project into any other project that your account is a member of.
Yes, it's possible, since container images are stored on a per-project basis rather than being tied to a single account.
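As a sketch of how the cross-project access could be granted (the bucket name assumes the default gcr.io registry, and TEST_PROJECT_SA_EMAIL / PROD_PROJECT_ID are placeholders for your own values), you could give the test project's service account read access to the prod project's registry storage bucket:
$ gsutil iam ch serviceAccount:TEST_PROJECT_SA_EMAIL:objectViewer gs://artifacts.PROD_PROJECT_ID.appspot.com
After that, nodes in the test cluster using that service account should be able to pull gcr.io/PROD_PROJECT_ID/example-image directly.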