zsh command line history with timestamp

I am using zsh 4.3.6. I would like to have timestamps in my command line history, but history -i always shows the current time:
> history -i
498 2020-04-27 14:54 history -i
499 2020-04-27 14:54 cat ~/.zsh_history
500 2020-04-27 14:54 exit
It seems that the timestamp is not stored in the $HISTFILE:
> cat $HISTFILE
ls
zsh --version
history -i
How can I make the command line history timestamps persistent?

Add
setopt EXTENDED_HISTORY # puts timestamps in the history
(or equivalently setopt extendedhistory)
to your ~/.zshrc.
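With the option set, each entry in $HISTFILE is written in zsh's extended format, : <start-time>:<elapsed-seconds>;<command>, which is where history -i gets the stored times from. For example (timestamp values illustrative):
: 1588000000:0;ls
: 1588000005:0;zsh --version
: 1588000010:0;history -i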


Display the date modified of a file in another directory?

I want to display the date modified of a file that is in another directory.
e.g. I am in /some/directory and I run grep -rHl "foo", which returns a list of files. I am curious about the modification date of /a/completely/different/directory/result.txt without having to go to that directory and list its files.
Can this be done?
You could use stat from GNU Coreutils:
stat -c %y /path/to/file
output:
2020-12-08 15:43:01.306251716 +0100
Or ls from GNU Coreutils:
ls --full-time /path/to/file
output:
-rw------- 1 user user 759 2020-12-08 15:43:01.306251716 +0100 /path/to/file
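To tie this back to the grep use case, you can feed the matched files straight to stat; a minimal sketch, assuming GNU grep (-Z prints NUL-separated file names) and GNU stat:
grep -rlZ "foo" . | xargs -0 stat -c '%y %n'
This prints each matching file's modification time next to its path, wherever the file lives.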

find with xargs runs successfully but no changes to file

I run a command to batch process image files using resmushit:
find /home/user/image/data -type f -print0 | xargs -n 1 -P 10 -0 resmushit -q 85 --preserve-filename
The command runs successfully and tells me the files were optimized and saved; however, when I check the files in the folder, there is no change.
Edit: it looks like the problem might be with resmushit. When I run it on pictures within my working directory, it works, i.e.:
resmushit -q 85 --preserve-filename test.jpg
Is there a way to make xargs or a different command run resmushit within each folder recursively?
I ended up using find for directories instead and passing each one to a script:
find /home/user/image/data -type d -print0 | xargs -n 1 -P 10 -0 bashscript
and the script is:
#!/bin/sh
# xargs -n 1 passes exactly one directory per invocation
cd "$1" || exit 1
resmushit -q 85 --preserve-filename *
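If you would rather avoid the helper script, find can run the same steps itself; a sketch under the same assumptions about resmushit's flags (note this loses the -P 10 parallelism of the xargs version):
find /home/user/image/data -type d -exec sh -c 'cd "$1" && resmushit -q 85 --preserve-filename *' _ {} \;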

gsutil command to delete old files from last day

I have a bucket in Google Cloud Storage with a tmp folder in it. Thousands of files are created in this directory every day, and I want to delete files older than 1 day every night. I could not find a gsutil argument for this job, so I had to use a classic, simple shell script, but it deletes the files very slowly.
I have 650K files accumulated in the folder, and 540K of them must be deleted. My shell script ran for a full day and managed to delete only 34K files.
The gsutil lifecycle feature cannot do exactly what I want: it cleans the whole bucket, while I just want to regularly delete the files under a certain folder. I also want the deletion to be faster.
I'm open to your suggestions. Can I do this with a single gsutil command, or a different method?
Here is the simple script I created for testing (it builds a temporary bulk-delete script):
## step 1 - I list the files along with their dates and save them to list1.txt.
gsutil -m ls -la gs://mygooglecloudstorage/tmp/ | awk '{print $2,$3}' > /tmp/gsutil-tmp-files/list1.txt
## step 2 - I filter the information saved in list1.txt: based on the current date, I save the older files to list2.txt.
cat /tmp/gsutil-tmp-files/list1.txt | awk -F "T" '{print $1,$2,$3}' | awk '{print $1,$3}' | awk -F "#" '{print $1}' |grep -v `date +%F` |sort -bnr > /tmp/gsutil-tmp-files/list2.txt
## step 3 - I replace the first field of each line with the gsutil delete command, turning the list into a shell script.
cat /tmp/gsutil-tmp-files/list2.txt | awk '{$1 = "/root/google-cloud-sdk/bin/gsutil -m rm -r "; print}' > /tmp/gsutil-tmp-files/remove-old-files.sh
## step 4 - I set the script permissions and delete the old lists.
chmod 755 /tmp/gsutil-tmp-files/remove-old-files.sh
rm -rf /tmp/gsutil-tmp-files/list1.txt /tmp/gsutil-tmp-files/list2.txt
## step 5 - I run the shell script and delete it after it is done.
/bin/sh /tmp/gsutil-tmp-files/remove-old-files.sh
rm -rf /tmp/gsutil-tmp-files/remove-old-files.sh
There is a very simple way to do this, for example:
gsutil -m ls -l gs://bucket-name/ | grep '2017-06-23' | grep '\.jpg' | awk '{print $3}' | gsutil -m rm -I
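To run this nightly without hardcoding the date, you could compute yesterday's date instead; a sketch assuming GNU date (the tmp/ prefix and .jpg filter mirror the question and the command above):
gsutil -m ls -l gs://bucket-name/tmp/ | grep "$(date -d yesterday +%F)" | awk '{print $3}' | gsutil -m rm -I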
There isn't a simple way to do this with gsutil or object lifecycle management as of today.
That being said, would it be feasible for you to change the naming format for the objects in your bucket? That is, instead of uploading them all under "gs://mybucket/tmp/", you could append the current date to that prefix, resulting in something like "gs://mybucket/tmp/2017-12-27/". The main advantages to this would be:
Not having to do a date comparison for every object; you could run gsutil ls "gs://mybucket/tmp/" | grep "gs://[^/]\+/tmp/[0-9]\{4\}-[0-9]\{2\}-[0-9]\{2\}/$" to find those prefixes, then do date comparisons on the last portion of those paths.
Being able to supply a smaller number of arguments on the command line (prefixes, rather than the name of each individual file) to gsutil -m rm -r, thus being less likely to pass in more arguments than your shell can handle.
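With that dated layout in place, the nightly cleanup could collapse to removing a single prefix; a sketch assuming GNU date and the hypothetical gs://mybucket/tmp/YYYY-MM-DD/ naming described above:
gsutil -m rm -r "gs://mybucket/tmp/$(date -d yesterday +%F)/"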

Is there a command to diff all the kept files in accurev?

I am new to AccuRev; I used to use SVN earlier. I want to get a diff file consisting of all the changes in kept files in a given directory. I know accurev diff -b <file>
gives the diff for a single file, but if I have many files and I want the diff of all the kept files in a given directory, is there a straightforward command to do this, like svn diff?
You are going to need to create a script if you only want to diff kept files in a given directory. Basically you will run accurev stat -k, parse the output for the given directory, then run accurev diff -b.
On a *NIX machine the commands below work nicely.
The -k option to AccuRev's stat command finds files with "(kept)" status. The -fal options make stat print just the depot-relative pathname of each file, so no additional filtering is needed. The command line would be:
accurev stat -k -fal | xargs accurev diff -b
This produces output like:
accurev stat -k -fal | xargs accurev diff -b
diffing element /./COPYING
341a342
> Tue Mar 18 08:38:39 EDT 2014
> Change for demo purposes.
diffing element /./INSTALL
3a4,7
> New Change
>
> Another Change
>
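If you do want to restrict the diff to kept files under one directory, as the question asks, you can filter the stat output before handing it to xargs; a sketch where /./src/ is a stand-in for your directory of interest:
accurev stat -k -fal | grep '^/\./src/' | xargs accurev diff -b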

how to redirect the output of a command to two files

I need to redirect the output of a command to two files, say file1 and file2.
file1 is a new file, and file2 is an already existing file where I need to append the output.
I have tried the following, but it does not give the expected results:
command > file1 > file2
You are looking for the tee command.
$ echo existing >file2
$ date | tee file1 >> file2
$ cat file2
existing
Mon Mar 9 10:40:01 CET 2009
$ cat file1
Mon Mar 9 10:40:01 CET 2009
$
For Windows (cmd.exe):
command > file1 & type file1 >> file2
You can also append to both files with:
command | tee -a file1 file2
In PowerShell use tee-object (or its tee alias)
command | tee first-file | out-file second-file
You can also tee to a variable (e.g. for further processing).
On Linux, you can do this:
command | tee file1 >> file2
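The same effect can be written the other way around, with tee doing the appending; a sketch equivalent to the command above:
command | tee -a file2 > file1
Here tee -a appends to the existing file2, while the shell redirection creates (or truncates) file1.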