Is there a simpler way to list GCP projects not in an Organization? - gcloud

I'm looking for a way to list projects that are not in an Organization (or Folder).
gcloud projects list ... returns a list of Projects, and the only obvious indicator that a project is not part of an Organization (or Folder) is the absence of a parent property.
I've been unable to find a way to --filter the results down to projects that lack this parent property.
I discovered the yesno transform that's part of --format which, combined with the csv formatter, gives me results that I can then grep and cut:
gcloud projects list \
--format='csv[no-heading,separator=":"](parent.yesno(yes="Y",no="N"),projectId)' \
| grep ^N: \
| cut -d: -f2
But I'd like a way to do this solely using gcloud.

As discussed in the comments, the solution for returning the list of projects without an organization is to run a gcloud command that filters by the parent, using the yesno option. As clarified in a similar case, the command that achieves this is the following.
gcloud projects list --filter="parent.id.yesno(yes='Yes', no='No')=No"
Since parent expresses the relationship of a project sitting under an organization (or folder), this command matches projects that don't have a parent.
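If you only need the bare project IDs (for scripting, say), the filter can be combined with gcloud's value format; this is a small variation on the command above, not part of the original answer:
gcloud projects list \
    --filter="parent.id.yesno(yes='Yes', no='No')=No" \
    --format="value(projectId)"
This prints one project ID per line with no header, so no grep or cut is needed.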

Related

Pull Request "reviewers" using github "history"

Is there any way (for on-premise GitHub) to:
For the N files in the Pull Request,
look at the history of those files,
and add any/all GitHub users from that history to the PR's list of code reviewers?
I have searched around.
I found "in general" items like this:
https://www.freecodecamp.org/news/how-to-automate-code-reviews-on-github-41be46250712/
But I cannot find anything regarding the specific "workflow" I describe above.
We can get the list of changed files in the PR into a text file. Then, for each file in that list, we can run the git blame command below to get the authors of its latest version. This can also be done with a simple script (see the sketch at the end of this answer).
1. Generate a txt file from the PR's list of files.
2. Traverse the filenames in the txt file (Python, bash, etc.).
3. Run the blame command for each file and store the authors in a list.
4. Add reviewers to the PR from that list, manually or with a small JS script.
For the GitHub-specific part, see: list-pull-requests-files
The blame command is something like:
git blame filename --porcelain | grep "^author " | sort -u
As a note, some of these users may no longer be available on GitHub. An extra step can be added after we get the usernames to check whether they still exist (this looks achievable through the GitHub API).
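Putting the steps together, here is a rough bash sketch (assumptions: the PR branch is checked out locally, and "main" is a placeholder for the PR's target branch):
#!/usr/bin/env bash
# Collect the unique blame authors of every file changed on this branch.
base="main"   # placeholder: the PR's base branch

git diff --name-only "$base"...HEAD | while read -r f; do
  # Skip files that were deleted or renamed away in the PR
  [ -f "$f" ] || continue
  git blame --porcelain "$f" | grep "^author "
done | sed 's/^author //' | sort -u
Note that blame gives git author names, which are not necessarily GitHub usernames; mapping one to the other is part of the extra verification step mentioned above.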

gsutil / gcloud storage file listing sorted date descending?

Is there no way to get a file listing out from a Google Cloud Storage bucket that is sorted by date descending? This is very frustrating. I need to check the status of files that are uploaded and the bucket has thousands of objects.
gsutil ls does not have the standard Linux -t option.
The Google Cloud console also lists the files but does not offer sorting options.
I use this as a workaround:
gsutil ls -l gs://[bucket-name]/ | sort -k 2
This outputs the full listing, including the date as the second field; sort -k 2 then sorts by that field.
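Since the question asks for descending order, note that sort's standard -r flag reverses the result:
gsutil ls -l gs://[bucket-name]/ | sort -k 2 -r
The newest objects then appear first.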
The only ordering supported by GCS is lexicographic.
As a workaround, if it's possible for you to name your objects with a datestamp, that would give you a way to list objects by date.
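For example (a hypothetical naming scheme; bucket and object names are placeholders), ISO-8601 datestamps sort lexicographically in chronological order, so the bucket's native ordering becomes date order:
# Upload with a datestamp prefix
gsutil cp report.csv gs://my-bucket/2016-12-01-report.csv
# The plain listing is now chronological; reverse it for descending
gsutil ls gs://my-bucket/ | sort -r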

GCS CLI: using “gsutil rm” to remove files by creation date

Is there a way to remove files from Google Cloud Storage, using the CLI, by their creation date?
For example:
I would like to remove all files under a specific path whose creation date is earlier than 2016-12-01.
There's no built-in way in the CLI to delete by date. There are a couple ways to accomplish something like this. One possibility is to use an object naming scheme that prefixes object names by their creation date. Then it is easy to remove them with wildcards, for example:
gsutil -m rm gs://your-bucket/2016-12-01/*
Another approach would be to write a short parser for gsutil ls -L gs://your-bucket that filters object names by their creation date, then call gsutil -m rm -I with the resulting object names.
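As a rough sketch of that parser idea (the bucket name and cutoff date are placeholders, and it assumes gsutil ls -l prints an ISO-8601 timestamp in the second column, as current versions do):
# Keep objects whose timestamp sorts before the cutoff, then pipe the
# object URLs to rm -I, which reads them from stdin.
gsutil ls -l gs://your-bucket/** \
  | awk '$2 < "2016-12-01" && $3 ~ /^gs:/ {print $3}' \
  | gsutil -m rm -I
The $3 ~ /^gs:/ guard drops the summary (TOTAL) line at the end of the listing.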
If you just want to automatically delete objects older than a certain age, then there is a much easier way than using the CLI: you can configure an Object Lifecycle Management policy on your bucket.
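For completeness, here is a minimal lifecycle policy sketch (the file and bucket names are placeholders) that deletes objects older than 365 days:
# lifecycle.json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
Apply it with:
gsutil lifecycle set lifecycle.json gs://your-bucket
GCS then deletes matching objects automatically; no cron job or CLI loop is needed.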

Bitbake: How to list all recipe and append files used in an image?

I'm using OpenEmbedded-Core and have created a custom layer with priority 6. Months of development have gone by, and now I want to increase my layer's priority to 8 because an append file from another layer with priority 7 is interfering with an append file I'm adding in my layer.
My question is, how can I generate a list of recipes and .bbappend files used in an image?
I want to generate the list both before and after I make the priority change so that I can compare them (hopefully with a difftool) to see if any unexpected side effects occurred, like an important append file from the other layer potentially getting ignored.
I'm using the angstrom-v2014.12-yocto1.7 branch of the Angstrom distribution.
[EDIT]
I'm now primarily just interested in determining how to list which .bbappend files are actually used by my image at this point.
A list of packages can be viewed using "bitbake -g your-image-name", as suggested by #pnxs, or from the .manifest file (which is what I like to use), which in my case is located under deploy/glibc/images/imagename/. I originally asked how a list of "recipe files" could be generated, but I think a list of packages is sufficient.
Regarding the .bbappends though, I had a case where my own .bbappend was ignored due to layer priorities. I made a change to my layer priorities and now want to see if that caused any .bbappend files anywhere else in my image to get ignored. As I understand it, using "bitbake-layers show-appends" as suggested lists all .bbappends present rather than just those which are actually used in the creation of an image, so this doesn't do what I'm looking for.
Try using "bitbake-layers show-appends" to see what bbappends are used. But that will only work on a per-recipe basis. But that might give you the information you need to understand the priorities.
You can do a "bitbake -g your-image-name" which creates some dot-files in the current directory.
The file "pn-depends.dot" contains a list of package-names (pn) and the dependencies between them.
Looking at the first part of the file, where all packages are listed, you see for example:
"busybox" [label="busybox :1.23.1-r0.2\n/home/user/yocto/sources/poky/meta/recipes-core/busybox/busybox_1.23.1.bb"]
"base-files" [label="base-files :3.0.14-r89\n/home/user/yocto/sources/poky/meta/recipes-core/base-files/base-files_3.0.14.bb"]
So you get a list of all packages used by your image and the corresponding recipe file.
To see which of the recipes are extended by a bbappend, you have to get the list of bbappends with "bitbake-layers show-appends" and look up the appends for every recipe. You can write a little Python program that does that for you.
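As a rough shell take on that same cross-referencing idea (the image name is a placeholder, and the show-appends output format may differ between bitbake releases):
# 1. Generate the dot files and pull recipe file names out of the labels
bitbake -g my-image
grep -o '[^/"]*\.bb' pn-depends.dot | sort -u > used-recipes.txt

# 2. Flatten show-appends output into "recipe append" pairs
bitbake-layers show-appends | awk '
  /\.bb:?$/    { recipe = $1 }        # recipe header line
  /\.bbappend/ { print recipe, $1 }   # an append registered for it
' > all-appends.txt

# 3. Keep only the appends whose recipe actually appears in the image
grep -F -f used-recipes.txt all-appends.txt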
Try the following:
Show all recipes
bitbake-layers show-recipes
Show .bb file of a recipe
RECIPE_NAME="linux-yocto"
bitbake -e $RECIPE_NAME | grep ^FILE=
Try the command below:
bitbake -g image-name && \
  grep -v -e '-native' -e digraph -e '-image' pn-depends.dot \
  | awk '{print $1}' | sort -u

gsutil - is it possible to list only folders?

Is it possible to list only the folders in a bucket using the gsutil tool?
I can't see anything listed here.
For example, I'd like to list only the folders in my bucket.
Folders don't actually exist. gsutil and the Storage Browser do some magic under the covers to give the impression that folders exist.
You could filter your gsutil results to only show results that end with a forward slash, but this may not show all the "folders". It will only show "folders" that were manually created (i.e., not ones that exist implicitly because an object name contains slashes):
gsutil ls gs://bucket/ | grep -e "/$"
Just to add here: if you drag a folder tree directly into the Google Cloud Storage web GUI, you don't actually get an object for each parent folder; instead, each file name is a fully qualified path, e.g. "/blah/foo/bar.txt", rather than a folder hierarchy blah > foo > bar.txt.
The trick is to first use the GUI to create a folder called blah, then create another folder called foo inside it (using the button in the GUI), and finally drag the files into it.
When you now list the files, you will get separate entries for
blah/
foo/
bar.txt
rather than only one
blah/foo/bar.txt
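For illustration, a hypothetical recursive listing (the bucket name is a placeholder) after creating the folders explicitly:
gsutil ls "gs://my-bucket/**"
gs://my-bucket/blah/
gs://my-bucket/blah/foo/
gs://my-bucket/blah/foo/bar.txt
The zero-byte blah/ and foo/ placeholder objects are what the grep -e "/$" filter above picks up.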