I'm having a small issue. I'm running a cron task every 5 minutes which looks for a text string and replaces it with nothing.
To optimize things, I would like to add a new function to my cron task: send me an email if it replaces something. If the cron task does not find the string, there is no need to send a mail. I have no idea how to do that; maybe you can help me.
Here is my current cron task:
find /home -type f | xargs sed -i 's$string I would like to erase$ $g'
Thanks in advance
Here is an example that I did with the help of other people.
Maybe it will help someone else.
#!/bin/bash
# Look for the string anywhere under /home
if grep -r -q 'stringtoreplace' /home/
then
    # A match was found: send a mail with the time, then strip the string
    echo "At the following time: $(date +%H:%M:%S)" | mail -s "[Server Leaked] Bad iframe has been found" my@mail
    find /home -type f | xargs sed -i 's$stringtoreplace$ $g' # replace the string with a space
else
    exit 1
fi
exit 0
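For what it's worth, here is a tighter variant of the same idea (my own tweak, not from the thread, assuming GNU grep, sed, and xargs): it only rewrites the files that actually contain the string and lists them in the mail body.
#!/bin/bash
# Sketch: list only the files that contain the placeholder string,
# rewrite just those, and send one notification naming them.
matches=$(grep -rl 'stringtoreplace' /home/)
if [ -n "$matches" ]; then
    printf '%s\n' "$matches" | xargs -d '\n' sed -i 's$stringtoreplace$ $g'
    printf 'Cleaned at %s in:\n%s\n' "$(date +%H:%M:%S)" "$matches" \
        | mail -s "[Server Leaked] Bad iframe has been found" my@mail
fi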
I'm working on a script to monitor a log of executed jobs, and I want to receive a mail notification with the line where the job appears in the body of the mail. This is what I have so far, but it keeps throwing errors; I can make it work, but only with an empty body. Could you please help?
Job="jobname"
tail -fn0 logfile.log | awk -v Jobs="$Job"'/jobname/
{
system("grep -i "Jobs" logfile.log | mail -s "Jobs Is Completed" mail#mail.com")
exit
}'
What's wrong with just:
Job="jobname"
tail -fn0 logfile.log |
grep --line-buffered -i "jobname.*$Job" |
mail -s "$Job Is Completed" mail#mail.com"
Your use of jobname as a literal in two places, plus a shell variable named Job and an awk variable named Jobs populated from it (both containing jobname anyway), was very confusing, so hopefully you can tweak the above to do whatever you need if the variable usage is not quite right.
watchdog.sh
#!/bin/bash
mailWorker(){
while read -r line; do
if [[ $line == *$match* ]]
then
# mailing
grep -i "Jobs" "$logfile" | mail -s "Jobs Is Completed" mail#mail.com
break
fi
done
}
logfile="/path/to/logfile.log"
match="jobname"
if [ ! -f "$logfile" ]; then
touch "$logfile"
fi
tail -f --lines=0 "$logfile" | mailWorker
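Assuming the script above is saved as watchdog.sh (a hypothetical invocation, not part of the original answer), it can be kept running in the background like this:
chmod +x watchdog.sh
nohup ./watchdog.sh > watchdog.out 2>&1 &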
Problem: I can't find any way to reliably get the currently playing file in an MPlayer playlist.
Here is how far I have gotten. This working ash script monitors a text file with the path to the current playlist. When I update the file, the script closes the old instance of MPlayer and opens a new one with the new playlist:
# POLL PLAYLIST FILE FOR CHANGES
CURRENTPLAYLISTPATH=/home/tc/currentplaylist
INFIFO=/tmp/mplayer-in
CURRENTPLAYLIST="NEVERMATCHAPLAYLIST"
FIRSTRUN=1
while [ 1 ];
do
# CHECK FOR NEW PLAYLIST
NEWPLAYLIST=$(head -n 1 $CURRENTPLAYLISTPATH)
if [[ "$NEWPLAYLIST" != "$CURRENTPLAYLIST" ]]; then
if [ "$FIRSTRUN" == 0 ]; then
echo "quit" > "$INFIFO"
fi
# CREATE NAMED PIPE, IF NEEDED
trap "rm -f $INFIFO" EXIT
if [ ! -p $INFIFO ]; then
mkfifo $INFIFO
fi
# START MPLAYER
mplayer -fixed-vo -nolirc -vc ffmpeg12vdpau,ffh264vdpau, -playlist $NEWPLAYLIST -loop 0 -geometry 1696x954 -slave -idle -input file=$INFIFO -quiet -msglevel all=0 -identify | tee -a /home/tc/mplayer.log &
CURRENTPLAYLIST=$NEWPLAYLIST
FIRSTRUN=0
fi
sleep 5;
done
My original plan was just to use the "-identify" flag and parse the log file. This actually works really well up until I need to truncate the log file to keep it from getting too large. As soon as my truncating script is run, MPlayer stops writing to the log file:
FILENAME=/home/tc/mplayer.log
MAXCOUNT=100
if [ -f "$FILENAME" ]; then
LINECOUNT=`wc -l "$FILENAME" | awk '{print $1}'`
if [ "$LINECOUNT" -gt "$MAXCOUNT" ]; then
REMOVECOUNT=`expr $LINECOUNT - $MAXCOUNT`
sed -i 1,"$REMOVECOUNT"d "$FILENAME"
fi
fi
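A hedged aside (my own guess, not something stated in the thread): sed -i writes a new file and renames it over the original, so the tee -a in the MPlayer pipeline keeps appending to the old, now-deleted inode. A truncation sketch that rewrites the file in place, keeping the same inode, would look like this:
#!/bin/bash
# Keep only the last MAXCOUNT lines, writing back into the same inode
# so the tee -a file descriptor in the player pipeline stays valid.
FILENAME=/home/tc/mplayer.log
MAXCOUNT=100
if [ -f "$FILENAME" ]; then
    tail -n "$MAXCOUNT" "$FILENAME" > "$FILENAME.tmp" &&
        cat "$FILENAME.tmp" > "$FILENAME" &&
        rm -f "$FILENAME.tmp"
fi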
I have searched and searched but have been unable to find any other way of getting the current playing file that works.
I have tried piping the output to another named pipe and then monitoring it, but it only works for a few seconds before MPlayer completely freezes.
I have also tried using bash (instead of ash) and piping the output to a function like the following, but I get the same freezing problem:
function parseOutput()
{
while read LINE
do
echo "get_file_name" > /tmp/mplayer-in
if [[ "$LINE" == *ANS_FILENAME* ]]
then
echo ${LINE##ANS_FILENAME=} > "$CURRENTFILEPATH"
fi
sleep 1
done
}
# START MPLAYER
mplayer -fixed-vo -nolirc -vc ffmpeg12vdpau,ffh264vdpau, -playlist $NEWPLAYLIST -loop 0 -geometry 1696x954 -slave -idle -input file=/tmp/mplayer-in -quiet | parseOutput &
I suspect I am missing something very obvious here, so any help, ideas, points in the right direction would be greatly appreciated.
fodder
Alright then, so I'll post mine too.
Give this one a try (assuming there is only one instance running, like on fodder's machine):
basename "$(readlink /proc/$(pidof mplayer)/fd/* | grep -v '\(/dev/\|pipe:\|socket:\)')"
This is probably the safer way, since the file descriptors might not always be in the same order on all systems.
However, this can be shortened, with a little risk:
basename "$(readlink /proc/$(pidof mplayer)/fd/*)" | head -1
You might also like to install this:
http://mplayer-tools.sourceforge.net/
Well, I gave up on getting the track from MPlayer itself.
My 'solution' is probably too hackish, but it works for my needs since I know my machine will only ever have one instance of MPlayer running:
lsof -p $(pidof mplayer) | grep -o "/path/to/my/assets/.*"
If anyone has a better option I'm certainly still interested in doing this the right way; I just couldn't make any of the methods work.
fodder
You can use the run command.
Put this in ~/.mplayer/input.conf:
DEL run "echo ${filename} ${stream_pos} >> /home/knarf/out"
Now if you press the Delete key while playing a file, it will do what you expect, i.e. append the currently playing file and the position in the stream to the ~/out file. You can replace echo with your program.
See the slave mode docs for more info (Ctrl-F somevar).
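For example, to hand the same information to a script of your own instead of echo (the F1 binding and the helper script path here are hypothetical placeholders), the input.conf line could look like:
# ~/.mplayer/input.conf sketch: F1 runs a helper script with the currently
# playing file and the stream position.
F1 run "/home/knarf/track_changed.sh ${filename} ${stream_pos}"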
About getting properties from MPlayer
I have used a non-elegant solution, but it is working for me.
stdbuf -oL mplayer --slave --input=file=$FIFO awesome_awesome.mp3 |
{
while IFS= read -r line
do
if [[ "${line}" == ANS_* ]]; then
echo "${line#*=}" > ${line%=*} # echo property_value > property_name
fi
done
} &
mplayer_pid=&!
read filename < ./ANS_FILENAME
read timeLength < ./ANS_LENGTH
echo "($timeLength) $filename"
and so on.
It runs in another process; that's why I've used files to pass the properties back.
stdbuf is there so that nothing is missed.
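For completeness, here is a sketch of the part the snippet leaves out (the FIFO path and the exact queries are my assumptions, based on MPlayer's slave protocol): the ANS_* replies only appear after the corresponding get_* commands are written to the FIFO.
FIFO=/tmp/mplayer-in
[ -p "$FIFO" ] || mkfifo "$FIFO"
# Ask slave-mode MPlayer for properties; it answers on stdout with lines like
# ANS_FILENAME='...' and ANS_LENGTH=..., which the loop above writes to files.
echo "get_file_name"   > "$FIFO"
echo "get_time_length" > "$FIFO"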
I started putting together a bash library to handle tasks like this. Basically, you can accomplish this by dumping the mplayer output to a file. Then you grep that dump for "Playing " and take the last result with tail. This should give you the name of the file that's currently playing or that last finished playing.
Take a look at my bash code. You'll want to modify the playMediaFile function to your needs, but the getMediaFileName function should do exactly what you're asking. You'll find the code on my github.
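A minimal sketch of that idea (my own illustration, not the linked library; it assumes MPlayer's usual console line of the form "Playing <file>." ends up in the dump, so -quiet/-msglevel all=0 must not suppress it):
# Dump MPlayer's console output to a file while it plays...
mplayer -playlist "$NEWPLAYLIST" -slave -idle -input file=/tmp/mplayer-in \
    > /tmp/mplayer-dump.log 2>&1 &
# ...then take the last "Playing " line to get the current (or most recent) file.
current=$(grep '^Playing ' /tmp/mplayer-dump.log | tail -n 1)
current=${current#Playing }
current=${current%.}    # MPlayer prints a trailing dot after the file name
echo "Now playing: $current"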
I want to replace the string "Solve the problem" with "Choose the best answer" in only the XML files which exist in the subfolders of a folder. I have put together a script which helps me do this, but there are 3 problems:
1. It also replaces the content of the script itself.
2. It replaces the text in all files in the subfolders (but I want only the XML files to change).
3. I want to display an error message (text output preferably) if the text is not found in a particular subfolder or file.
So can you please help me modify my existing script so that I can solve the above 3 problems?
The script I have is :
find -type f | xargs sed -i "s/Solve the problem/Choose the best answer/g"
Using bash and sed:
search='Solve the problem'
replace='Choose the best answer'
for file in $(find . -name '*.xml'); do
    if ! grep -q "$search" "$file"; then
        echo "Search string not found in $file!"
    else
        sed -i "s/$search/$replace/" "$file"
    fi
done
find -type f -name "*.xml" | xargs sed -i "s/Solve the problem/Choose the best answer/g"
Not sure I understand issue 3.
The following command is correctly changing the contents of 2 files.
sed -i 's/abc/xyz/g' xaa1 xab1
But what I need to do is change several such files dynamically, and I do not know the file names. I want to write a command that will read all the files in the current directory whose names start with xa and have sed change their contents.
I'm surprised nobody has mentioned the -exec argument to find, which is intended for this type of use-case, although it will start a process for each matching file name:
find . -type f -name 'xa*' -exec sed -i 's/asd/dsg/g' {} \;
Alternatively, one could use xargs, which will invoke fewer processes:
find . -type f -name 'xa*' | xargs sed -i 's/asd/dsg/g'
Or more simply use the + exec variant instead of ; in find to allow find to provide more than one file per subprocess call:
find . -type f -name 'xa*' -exec sed -i 's/asd/dsg/g' {} +
Better yet:
for i in xa*; do
sed -i 's/asd/dfg/g' "$i"
done
because nobody knows how many files there are, and it's easy to exceed the command-line length limit.
Here's what happens when there are too many files:
# grep -c aaa *
-bash: /bin/grep: Argument list too long
# for i in *; do grep -c aaa $i; done
0
... (output skipped)
#
You could use grep and sed together. This allows you to search subdirectories recursively.
Linux: grep -r -l <old> * | xargs sed -i 's/<old>/<new>/g'
OS X: grep -r -l <old> * | xargs sed -i '' 's/<old>/<new>/g'
For grep:
-r recursively searches subdirectories
-l prints file names that contain matches
For sed:
-i extension (Note: An argument needs to be provided on OS X)
Those commands won't work in the default sed that comes with Mac OS X.
From man 1 sed:
-i extension
Edit files in-place, saving backups with the specified
extension. If a zero-length extension is given, no backup
will be saved. It is not recommended to give a zero-length
extension when in-place editing files, as you risk corruption
or partial content in situations where disk space is exhausted, etc.
I tried
sed -i '.bak' 's/old/new/g' logfile*
and
for i in logfile*; do sed -i '.bak' 's/old/new/g' "$i"; done
Both work fine.
@PaulR posted this as a comment, but people should view it as an answer (and this answer works best for my needs):
sed -i 's/abc/xyz/g' xa*
This will work for a moderate amount of files, probably on the order of tens, but probably not on the order of millions.
Another more versatile way is to use find:
sed -i 's/asd/dsg/g' $(find . -type f -name 'xa*')
I'm using find for a similar task. It is quite simple: you have to pass its output as an argument to sed, like this:
sed -i 's/EXPRESSION/REPLACEMENT/g' `find -name "FILE.REGEX"`
This way you don't have to write complex loops, and it is simple to see which files you are going to change: just run find before you run sed.
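If the file names might contain spaces, a null-separated variant (an addition of mine, assuming GNU find and xargs) avoids the word-splitting problem of the command substitution:
find . -type f -name "FILE.REGEX" -print0 | xargs -0 sed -i 's/EXPRESSION/REPLACEMENT/g'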
You can search for the text 'xxxx' and replace it with 'yyyy' like this:
grep -Rn 'xxxx' /path | awk -F: '{print $1}' | xargs sed -i 's/xxxx/yyyy/'
There are some good answers above. I thought I'd throw in one more that is succinct and parallelizable, using GNU parallel, which I often prefer to xargs:
parallel sed -i 's/abc/xyz/g' {} ::: xa*
Combine this with the -j N option to run N jobs in parallel.
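For example, a hypothetical invocation running four jobs at a time:
parallel -j 4 sed -i 's/abc/xyz/g' {} ::: xa*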
If you are able to run a script, here is what I did for a similar situation:
Using a dictionary/hashmap (associative array) and variables for the sed command, we can loop through the array to replace several strings. Including a wildcard in name_pattern allows in-place replacement in files matching a pattern (this could be something like name_pattern='File*.txt') in a specific directory (source_dir).
All the changes are written to the logfile in destin_dir.
#!/bin/bash
source_dir=source_path
destin_dir=destin_path
logfile='sedOutput.txt'
name_pattern='File.txt'
echo "--Begin $(date)--" | tee -a $destin_dir/$logfile
echo "Source_DIR=$source_dir destin_DIR=$destin_dir "
declare -A pairs=(
['WHAT1']='FOR1'
['OTHER_string_to replace']='string replaced'
)
for i in "${!pairs[#]}"; do
j=${pairs[$i]}
echo "[$i]=$j"
replace_what=$i
replace_for=$j
echo " "
echo "Replace: $replace_what for: $replace_for"
find $source_dir -name $name_pattern | xargs sed -i "s/$replace_what/$replace_for/g"
find $source_dir -name $name_pattern | xargs -I{} grep -n "$replace_for" {} /dev/null | tee -a $destin_dir/$logfile
done
echo " "
echo "----End $(date)---" | tee -a $destin_dir/$logfile
First, the pairs array is declared; each entry is a replacement pair, so WHAT1 will be replaced with FOR1 and OTHER_string_to replace will be replaced with string replaced in the file File.txt. In the loop the array is read: the first member of the pair is retrieved as replace_what=$i and the second as replace_for=$j. The find command searches the directory for the filename (which may contain a wildcard) and the sed -i command replaces in the same file(s) what was previously defined. Finally I added a grep redirected to the logfile to log the changes made in the file(s).
This worked for me with GNU Bash 4.3 and sed 4.2.2, and is based upon VasyaNovikov's answer for Loop over tuples in bash.
The Silver Searcher Solution
I'm adding another option for those people who don't know about the amazing tool called The Silver Searcher (command line tool is ag).
Note: You can use grep and other tools to do the same thing here, but The Silver Searcher is fantastic :)
TLDR
ag -l 'abc' | xargs sed -i 's/abc/xyz/g'
Install The Silver Searcher
sudo apt install silversearcher-ag # Debian / Ubuntu
sudo pacman -S the_silver_searcher # Arch / EndeavourOS
sudo yum install epel-release the_silver_searcher # RHEL / CentOS
Demo Files
Paste the following into your terminal to create some demonstration files:
mkdir /tmp/food
cd /tmp/food
content="Everybody loves to abc this food!"
echo "$content" > ./milk
echo "$content" > ./bread
mkdir ./fastfood
echo "$content" > ./fastfood/pizza
echo "$content" > ./fastfood/burger
mkdir ./fruit
echo "$content" > ./fruit/apple
echo "$content" > ./fruit/apricot
Using 'ag'
The following ag command will recursively find all the files that contain the string 'abc'. It skips the .git directory and honors .gitignore and other ignore files:
$ ag 'abc'
milk
1:Everybody loves to abc this food!
bread
1:Everybody loves to abc this food!
fastfood/burger
1:Everybody loves to abc this food!
fastfood/pizza
1:Everybody loves to abc this food!
fruit/apple
1:Everybody loves to abc this food!
fruit/apricot
1:Everybody loves to abc this food!
To just list the files that contain the string 'abc', use the -l switch:
$ ag -l 'abc'
bread
fastfood/burger
fastfood/pizza
fruit/apricot
milk
fruit/apple
Changing Multiple Files
Finally, using xargs and sed, we can replace the 'abc' string with another string:
ag -l 'abc' | xargs sed -i 's/abc/eat/g'
In the above command, ag lists all the files that contain the string 'abc', and xargs passes those file names as arguments to the sed command.
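If any of the matched file names contain spaces, and your build of ag supports the -0/--null flag (worth checking with ag --help), the null-separated form is safer:
ag -0 -l 'abc' | xargs -0 sed -i 's/abc/eat/g'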
I want to tail a log file, grep it, and send the matching lines via mail,
like:
tail -f /var/log/foo.log | grep error | mail -s subject name@example.com
How can I do this?
You want to send an email when emailing errors occur? That might fail ;)
You can however try something like this:
tail -f $log |
grep --line-buffered error |
while read line
do
echo "$line" | mail -s subject "$email"
done
This sends an email for every line of the grep output.
Run the above shell script with
nohup ./monitor.sh &
so it will keep running in the background.
I'll have a go at this. Perhaps I'll learn something if my icky bash code gets scrutinised. There is a chance there are already a gazillion solutions to do this, but I am not going to find out, as I am sure you have trawled the depths and widths of the cyberocean.
It sounds like what you want can be separated into two parts: 1) at regular intervals, obtain the 'latest tail' of the file, and 2) if the latest tail actually exists, send it by e-mail. For the regular intervals in 1), use cron. For obtaining the latest tail in 2), you'll have to keep track of the file size.
The bash script below does that; it's a solution to 2) that can be invoked by cron. It uses the cached file size to compute the chunk of the file it needs to mail. Note that for a file myfile, another file .offset.myfile is created. Also, the script does not allow path components in the file name. Rewrite it, or fix it in the invocation [e.g. (cd /foo/bar && segtail.sh zut), assuming it is called segtail.sh].
#!/usr/local/bin/bash
file=$1
size=0
offset=0
if [[ $file =~ / ]]; then
echo "$0 does not accept path components in the file name" 2>&1
exit 1
fi
if [[ -e .offset.$file ]]; then
offset=$(<".offset.$file")
fi
if [[ -e $file ]]; then
size=$(stat -c "%s" "$file") # this assumes GNU stat, possibly present as gstat. CHECK!
# (gstat can also be Ganglia's status tool - careful).
fi
if (( $size < $offset )); then # file might have been reduced in size
echo "reset offset to zero" 2>&1
offset=0
fi
echo $size > ".offset.$file"
if [[ -e $file && $size -gt $offset ]]; then
tail -c +$(($offset+1)) "$file" | head -c $(($size - $offset)) | mail -s "tail $file" foo@bar
fi
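A hypothetical crontab entry for part 1), assuming the script is saved as segtail.sh in /foo/bar next to the file being watched:
# run every 5 minutes; the cd keeps the .offset.* bookkeeping next to the file
*/5 * * * * cd /foo/bar && ./segtail.sh myfile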
How about:
grep ERROR catalina.out | mail -s "catalina.out errors" blah@myaddress.com