flvtool2 batch process in SSH command?

I want to run a command like this one on hundreds of videos on our server:
flvtool2 -U video.flv
The problem is that I can't find any command to do batch processing. Is there a way to process them all in a batch?
Thanks in advance.
Also, can this be done from a PHP exec command? Thanks.

I found the answer: this command over SSH works:
ls -1 *.flv | while read file; do cat "$file" | flvtool2 -U stdin "$file" ; done
Thanks to this link:
http://michaelangela.wordpress.com/2008/06/30/how-to-add-duration-info-to-a-flv-file-with-the-flvtool2-command/
If you need this kind of command, try it; it worked for me :)
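A glob-based variant of the same idea avoids parsing ls (which breaks on filenames with spaces) and only replaces the original file on success. This is a sketch, assuming flvtool2 accepts an explicit out-path as a second argument (flvtool2 -U in out); check your version's usage first:

```shell
# Sketch: update each FLV via a temp copy, so a failed flvtool2 run never
# clobbers the original; the glob handles names with spaces safely.
update_flv() {
    for file in "$@"; do
        flvtool2 -U "$file" "$file.tmp" && mv "$file.tmp" "$file"
    done
}
# usage: update_flv ./*.flv
```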

Related

Combine sed truncate x lines into a find command

We have a large log file in the same location on multiple servers and I want to create a cron job to truncate the file to last 100k lines.
The following command works:
sed -i 1,$(($(wc -l < /root/server123.example.com.log) -100000))d /root/server123.example.com.log
But the hostname on each server is different (server1, server2, server3 etc.), and I'd like to have a single command I can paste into each cron file. During my testing I wasn't sure how to achieve a wildcard in the above command.
I think the best way might be to combine it with a find command, but I'm clueless about how to do that.
find /root/server*.example.com.log -type f -exec sed <NOT SURE..> \;
Any help would be appreciated.
During my testing I wasn't sure how to achieve a wildcard in the above command.
If there is just one log file on each server, you can simply insert the wildcard:
sed -i 1,$(($(wc -l < /root/server*.example.com.log) -100000))d /root/server*.example.com.log
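A variant that avoids the wildcard altogether is to build the path from the host's own name, so the identical line can go into every crontab. A sketch, assuming each log really is named /root/<hostname>.example.com.log; the guard stops sed from getting a negative line range when the file is already short:

```shell
# Sketch: keep only the last $keep lines of $log; files already at or below
# the limit are left alone (a negative range would make sed error out).
truncate_log() {
    log=$1
    keep=$2
    lines=$(wc -l < "$log")
    if [ "$lines" -gt "$keep" ]; then
        sed -i "1,$((lines - keep))d" "$log"
    fi
}
# crontab usage: truncate_log "/root/$(hostname).example.com.log" 100000
```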

Perl Program to search for a string over a set of files over SSH

I have a Perl script which can be used to SSH into a remote server using Net::SSH2. I need to search for a particular string in the files in a given directory on the remote system and print the files in which the string occurs. Any ideas/sample code on how I can go about this?
Thanks
Solution proposed and accepted in the comments:
ssh <user>@<host> grep -d recurse -l <string> <directories>
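The remote side of that one-liner is just GNU grep's recursive, names-only mode; a self-contained local demo (temp files stand in for the remote directory):

```shell
# Demo: -r recurses into subdirectories, -l prints only the names of files
# containing the string; over ssh the same grep runs on the remote host.
d=$(mktemp -d)
mkdir -p "$d/sub"
echo "needle here" > "$d/sub/match.txt"
echo "no match"    > "$d/other.txt"
grep -rl "needle" "$d"
```

From Perl, the same command string can be sent through a Net::SSH2 exec channel; the main thing to get right is quoting the search string for the remote shell.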

How to find the command line of a process only if the process is from current user

I have the following situation:
A Perl script reads a file where an application wrote its PID, and tries to kill it.
The problem is that I don't want to kill an unrelated process, so I check whether the process with the recorded PID has the same command line. If so, the application can be killed.
The following snippet finds out the cmdline:
$PIDCMDLINE = `ps -p $PID -o cmd`;
The problem is that if another instance belonging to another user is up, maybe on the same sid, it would be killed, because ps will still return a valid command line, and I don't want that behaviour.
How can I restrict ps -p to search only the current user's processes? (No, plain ps doesn't count, because -p nullifies the default effect of ps.)
Thank you!
You can use the following to check both command and user for the certain PID:
ps -p <PID> -o user,cmd --columns 1000 | grep `whoami`
Adding a 'grep' according to the comment.
It may be a little awkward, but what about this:
$PIDCMDLINE = `ps -p $PID -o user,command | grep \$(whoami) | awk '{ print \$2 }'`;
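A sketch of the same check that compares numeric uids instead of grepping for the username; ps can truncate long usernames in its user column, which makes the grep fragile, while uids compare exactly:

```shell
# Sketch: print the PID's command line only when it belongs to the invoking
# user; "ps -o uid=" pads with spaces, so tr strips them before comparing.
cmdline_if_mine() {
    pid=$1
    if [ "$(ps -p "$pid" -o uid= | tr -d ' ')" = "$(id -u)" ]; then
        ps -p "$pid" -o cmd=
    fi
}
```

In the Perl script, $PIDCMDLINE would then be the output of `cmdline_if_mine $PID`, which is empty when the PID belongs to someone else.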

Executing perl script inside bash script

I inherited a long bash script that I recently needed to modify. The bash script is run as a cronjob on a daily basis. I am decent with bash scripting, but I do not know much about Perl.
I had to substitute all "rm" commands with a call to a perl script that does something similar (for security purposes). This script was not written by me, so there is no -f flag to skip the confirmation prompt. Therefore, to automate this script I pipe "yes" to the script.
Here is an example where I am sequentially deleting two directories:
echo REMOVING FILES TO SAVE DISK SPACE
echo "yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir1>"
yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir1>
echo "yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir2>"
yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir2>
echo DONE.
In my output file, I see the following:
REMOVING FILES TO SAVE DISK SPACE
yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir1>
yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir2>
DONE.
It does not appear that the perl script has run. Yet when I copy and paste those two commands into the terminal, they both run fine.
Any help is appreciated. Thank you in advance.
You simply do:
yes | ./myscript.pl
Thanks for all the comments. I ended up changing the group and permissions of the tool and all output files. This allowed me to run the Perl script without using sudo, which others pointed out is bad practice.
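For what it's worth, this silent behaviour is typical of sudo under cron: with no tty and no cached credentials it cannot prompt, so it fails quietly (sudo -n makes that failure explicit). A diagnostic wrapper, with placeholder paths, that folds stderr and the exit status into the cron log:

```shell
# Sketch: run one cleanup step with "yes" piped in; 2>&1 sends the step's
# stderr to stdout (the cron log), and the pipeline's exit status is echoed.
run_step() {
    yes | "$@" 2>&1
    echo "exit status: $?"
}
# cron usage (placeholder path):
# run_step sudo -n nice -n -10 perl /path/to/delete_script.pl -dir /some/dir
```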

Create symbolic link from find

I'm trying to create a symbolic link (soft link) from the results of a find command. I'm using sed to remove the ./ that precedes the file name. I'm doing this so I can paste the file name to the end of the path where the link will be saved. I'm working on this with Ubuntu Server 8.04.
I learned from this post, which is close to a solution to my problem but not quite:
How do I selectively create symbolic links to specific files in another directory in LINUX?
The resulting file name didn't work, though, so I started trying to learn awk and then decided on sed.
I'm using a one-line loop to accomplish this. The problem is that the structure of the loop is separating the filename, creating a link for each word in the filename. There are quite a few files and I would like to automate the process with each link taking the filename of the file it's linked to.
I'm comfortable with basic bash commands but I'm far from being a command line expert. I started this with ls and awk and moved to find and sed. My sed syntax could probably be better but I've learned this in two days and I'm kind of stuck now.
for t in `find -type f -name "*txt*" | sed -e 's/.//' -e 's$/$$'`; do echo ln -s $t ../folder2/$t; done
Any help or tips would be greatly appreciated. Thanks.
Easier:
Go to the folder where you want the links to end up and do:
find /path/with/files -type f -name "*txt*" -exec ln -s {} . ';'
Execute your for loop like this:
(IFS=$'\n'; for t in `find -type f -name "*txt*" | sed 's|.*/||'`; do ln -s $t ../folder2/$t; done)
By setting the IFS to only a newline, the entire filename is read without being split at spaces.
The parentheses make sure the loop is executed in a subshell, so the IFS of the current shell does not get changed.
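If the names could contain newlines as well as spaces, a null-delimited read (bash-specific: read -d '') avoids word splitting entirely. A self-contained sketch with temp directories standing in for the real ones:

```shell
# Sketch: -print0 and read -d '' pass each path as one NUL-terminated record,
# so no IFS juggling is needed; basename strips the leading directory part.
src=$(mktemp -d)
dst=$(mktemp -d)
touch "$src/notes.txt" "$src/two words.txt"
find "$src" -type f -name "*txt*" -print0 |
while IFS= read -r -d '' f; do
    ln -s "$f" "$dst/$(basename "$f")"
done
```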