gsutil - is it possible to list only folders? - google-cloud-storage

Is it possible to list only the folders in a bucket using the gsutil tool?
I can't see anything listed here.
For example, I'd like to list only the folders in a given bucket.

Folders don't actually exist. gsutil and the Storage Browser do some magic under the covers to give the impression that folders exist.
You could filter your gsutil results to show only those that end with a forward slash, but this may not show all the "folders". It will only show "folders" that were manually created (i.e., not those that exist only implicitly because an object name contains slashes):
gsutil ls gs://bucket/ | grep -e "/$"
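For illustration, here is roughly what that might look like against a hypothetical bucket (the names are made up) containing both files and manually created folders:
$ gsutil ls gs://my-bucket/
gs://my-bucket/notes.txt
gs://my-bucket/images/
gs://my-bucket/videos/
$ gsutil ls gs://my-bucket/ | grep -e "/$"
gs://my-bucket/images/
gs://my-bucket/videos/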

Just to add here: if you drag a folder tree directly into the Google Cloud Storage web GUI, you don't really get an object for each parent folder; instead, each file name is a fully qualified path, e.g. "/blah/foo/bar.txt", rather than a folder hierarchy blah > foo > bar.txt.
The trick here is to first use the GUI to create a folder called blah, then create another folder called foo inside it (using the button in the GUI), and finally drag the files into it.
When you now list the files you will get separate entries for
blah/
foo/
bar.txt
rather than only one
blah/foo/bar.txt

Related

moving files to different folders from 1

I have 200,000 files I want to send to different folders based on key words in the file name
In plain English: if a file name has (shtf or prepper or prepping or survival) in the name, send (move) it to the folder shtf.
If a file name has (cookbook or gluten or recipe) in it, move it to the food folder:
*cookbook* *GLUTEN* *RECIPE*
example
(file name)
more shtf tips.epub move to folder shtf
ifshtfbeready.epub move to folder shtf
oldworldcookbook.epub move to folder food
I'm an old retired IBMer. Should I use Small Basic, SAS, DOS commands, or ????
Here is a bash command; you may be able to adapt it to DOS etc. I'm posting this because others may find it useful as well.
find . -type f | grep -iE "(cookbook|gluten|recipe)" | while read -r name; do mv "$name" directory; done
Where directory is the name of the directory you want to move the files to. You can replace . with whatever starting directory you want, of course.
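Building on that, here is a minimal sketch (assuming a bash shell, GNU find/grep, and that the target directories shtf and food already exist alongside the files) that handles both keyword groups case-insensitively:
# move prepper-style files into shtf/, cookbook-style files into food/
find . -maxdepth 1 -type f | grep -iE "(shtf|prepper|prepping|survival)" | while read -r name; do mv "$name" shtf/; done
find . -maxdepth 1 -type f | grep -iE "(cookbook|gluten|recipe)" | while read -r name; do mv "$name" food/; done
The -maxdepth 1 keeps find from descending into the shtf and food directories once files start landing there.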
In DOS, you can use wildcards in the source filename and a directory as the target to move multiple files with one command:
move c:\dir1\*cookbook*.* c:\food
move c:\dir1\*gluten*.* c:\food

Google Cloud Storage with PHP App Engine: get a list of all subdirectories

How can I get a list of all the directories inside a specific directory of a Cloud Storage bucket from an App Engine application in PHP?
You can try two approaches to list all the directories inside a specific directory in a bucket:
Use the code snippet from the PHP docs samples at the Google Cloud Platform GitHub and modify it so that the list_objects_with_prefix function passes a delimiter as well as a prefix. I have written such a function in Python in this SO topic; you can use it as a reference. Here prefix needs to be the name of the parent directory, e.g. 'my_directory/', and delimiter is simply '/', to indicate that we want the listing to stop at elements ending with '/' (hence, directories).
Use the gsutil ls command to list objects in a directory from within PHP. You will need to use the shell_exec function:
$execCommand = "gsutil ls gs://bucket";
$output = shell_exec($execCommand);
$output will be a string in this case, and it will also contain file names if any are present in the parent directory.
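For illustration, a hedged example of what such a listing might return for a hypothetical bucket layout; subdirectories show up as prefixes ending in '/':
$ gsutil ls gs://bucket/my_directory/
gs://bucket/my_directory/file1.txt
gs://bucket/my_directory/subdir_a/
gs://bucket/my_directory/subdir_b/
If you only want the directories, you can filter $output in PHP for lines ending with '/' (the same trailing-slash trick as the grep filter shown earlier).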
This SO topic might also be informative; there the question was how to list the whole directory (together with files).

Google Cloud Storage: What is the easiest way to update the timestamp of all files under all subfolders?

I have date-wise folders in the form root-dir/yyyy/mm/dd,
under which there are many files present.
I want to update the timestamp of all the files falling within a certain date range,
for example 2 weeks, i.e. 14 folders, so that these files can be picked up by my file-streaming data ingestion process.
What is the easiest way to achieve this?
Is there a way to do this in the UI console, or is it through gsutil?
Please help.
GCS objects are immutable, so the only way to "update" the timestamp would be to copy each object on top of itself, e.g., using:
gsutil cp gs://your-bucket/object1 gs://your-bucket/object1
(and looping over all objects you want to do this to).
This is a fast (metadata-only) operation, which will create a new generation of each object, with a current timestamp.
Note that if you have versioning enabled on the bucket, doing this will create an extra version of each file you copy this way.
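As a rough sketch of that loop (assuming a bash shell, GNU date, and the hypothetical layout gs://my-bucket/root-dir/yyyy/mm/dd/ from the question), a 14-day range could be handled like this:
BUCKET=gs://my-bucket/root-dir   # hypothetical bucket and prefix
START=2016-12-01                 # first day of the range (made-up date)
for i in $(seq 0 13); do
  day=$(date -d "$START + $i days" +%Y/%m/%d)
  gsutil ls "$BUCKET/$day/**" | while read -r obj; do
    gsutil cp "$obj" "$obj"      # copy each object onto itself to refresh its timestamp
  done
done
gsutil ls will print an error for any day folder that contains no objects; that is harmless here.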
When you say "folders in the form of root-dir/yyyy/mm/dd", do you mean that you're copying those objects into your bucket with names like gs://my-bucket/root-dir/2016/12/25/christmas.jpg? If not, see Mike's answer; but if they are named with that pattern and you just want to rename them, you could use gsutil's mv command to rename every object with that prefix:
$ export BKT=my-bucket
$ gsutil ls gs://$BKT/**
gs://my-bucket/2015/12/31/newyears.jpg
gs://my-bucket/2016/01/15/file1.txt
gs://my-bucket/2016/01/15/some/file.txt
gs://my-bucket/2016/01/15/yet/another-file.txt
$ gsutil -m mv gs://$BKT/2016/01/15 gs://$BKT/2016/06/20
[...]
Operation completed over 3 objects/12.0 B.
# We can see that the prefixes changed from 2016/01/15 to 2016/06/20
$ gsutil ls gs://$BKT/**
gs://my-bucket/2015/12/31/newyears.jpg
gs://my-bucket/2016/06/20/file1.txt
gs://my-bucket/2016/06/20/some/file.txt
gs://my-bucket/2016/06/20/yet/another-file.txt

Zip files with encryption in a remote share, keeping original names and location

My team needs to encrypt all the files in a repository with AES256. For this purpose, we decided to zip each file with that encryption, using the same key for all of them.
The problem we have is that these files sit on a NAS, so from Windows boxes they are accessible via \\ (UNC) paths.
The directory structure is something like this:
Original Structure:
Root
-1
|--folder1
|---file1.ext
|---file2.ext
|--folder2
|---filea.ext
|---fileb.ext
|--folder2.a
|---filec.ext
and so on...
Essentially, what we need is to have all the original files contained in a zip file, keeping their original names, which would be something like this:
Desired Outcome:
|-Root
|-1
|--folder1
|---file1.zip
|---file2.zip
|--folder2
|---filea.zip
|---fileb.zip
|--folder2a
|---filec.zip
and so on...
To accomplish this, we tried a batch script that calls 7zip, but it only works if it's run from the root directory, which is something we cannot do, as the files are not on a local server.
Here is the syntax of the batch script we came up with:
FOR /R %%i IN ("*.wmv") DO "C:\Program Files\7-Zip\7z.exe" a -mx0 -tzip -pPasswordHere "%%~dpni.zip" "%%i"
But, as written previously, it only works when run from the root folder, which is something we cannot do as the files sit on a network location.
Mapping the drive or making a symbolic link to it doesn't do the trick either.
I've also looked at doing this with 7zip itself, namely making use of its "-r" switch, but I couldn't find a way to get the desired outcome (namely, recurse through all the folders in the remote tree structure - there are a lot of them... - and keep the original file names).
I'm open to any suggestions as any kind of script, trick or guizmo that gets the job done will be more than welcome. =)
Thanks a million in advance!,
Sebas.
----SOLUTION----
I actually found a solution here, mapping the drive in a different way (it's so simple it just made me feel stupid(er), but it's altogether beautiful).
Using the batch commands below, the remote share can be mapped like so:
You can map a drive using
net use X: \\server\directory
and then you can change to that directory using
pushd X:
(Post from which the answer was taken: Batch File Iterating through files on a local network server)

7zip / WinRAR command to extract a folder with paths intact to a specific folder, excluding the parent source path

Example:
There is a file "sample.rar".
The folder structure in this archive is "rising\dawn\", and under it there are many folders (folder1, folder2) and files (file1, file2).
I have used the following command:
7z.exe x "sample.rar" "rising\dawn\*" -oi:\delete
The result is:
all the files and folders in "rising\dawn\" are extracted to the "i:\delete" folder, but the empty parent folders "rising\dawn\" are also created in the destination folder.
e.g. the destination looks like:
i:\delete\rising\dawn\folder1\file1.bmp
i:\delete\rising\dawn\folder2\subfolder
i:\delete\rising\dawn\file1.txt
i:\delete\rising\dawn\file2.txt
I don't want the empty "rising\dawn\" folders to be created, but the folder structure from there onwards must be as it is in the archive.
I want this result:
i:\delete\folder1\file1.bmp
i:\delete\folder2\subfolder
i:\delete\file1.txt
i:\delete\file2.txt
At last I found a solution, thanks to WinRAR support. I have accepted it as an answer below.
If you find the question useful, don't forget to click the up-vote button.
Finally, this gave me the result.
Thanks to WinRAR support.
rar x -ep1 sample.rar rising\dawn\* d:\e\delete\
I have tried the other answers given here; this is the only correct answer.
Don't forget to upvote.
You can extract the archive normally and then
1) move the lower-level folders/files to where you would like them, and
2) remove the extra top-level archive folders.
The code to do so will depend on the exact task.
Using the e command instead of x and adding the -r option works well.
Like this:
7z.exe e -r "sample.rar" "rising\dawn\*" -oi:\delete
My executable version is "7-Zip [64] 9.20 2010-11-18",
and the platform is Windows 8.1.
This command line eliminates unnecessary parent folders and preserves the hierarchy of folders.
You need to use the e command rather than the x command:
7z.exe e "sample.rar" "scholar\update\*" -oi:\delete
Using e instead of x means 7zip will extract all matching files into the same folder (as specified via the -o switch, or the current directory if this isn't specified) rather than preserving the folder structure from inside the archive.