Command not running from cron - perl

I have a perl script which runs successfully from the root cron on my redhat server.
However, I have added a command to the Perl script to execute an ldapsearch. When I run the script from the command line it works perfectly, yet when it runs from cron, the ldapsearch command does not work.
I've used the full path to the ldapsearch executable and I have tried using both system and exec before the ldapsearch command, but no good.
The line in the Perl code does the LDAP search for the container room, greps the relevant line in the results, then parses the data and extracts the first two characters of the result. The code is:
$userRoom = `exec /usr/local/bin/ldapsearch -h myldap.domain.com '(cn=$user)' room | /bin/grep -i room | /bin/grep -iv internal | /bin/cut -d'=' -f2 | /bin/cut -c 1-2`;
I'm assuming it's an environment or permissions thing. I just can't find the right answer.
Any suggestions greatly appreciated.
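One common explanation is cron's stripped-down environment (minimal PATH, no login profile), which can affect what ldapsearch finds at run time. A minimal sketch under that assumption: call the Perl script from cron through a small wrapper that sets the environment explicitly and logs stderr, so the actual ldapsearch error becomes visible. The script and log paths below are placeholders.

#!/bin/sh
# Hypothetical cron wrapper: set an explicit PATH (plus any LDAP-related
# variables your login shell exports) and capture stderr so the real error
# from ldapsearch shows up in a log. Paths are placeholders.
PATH=/usr/local/bin:/usr/bin:/bin
export PATH
exec /usr/bin/perl /path/to/your_script.pl 2>> /var/tmp/your_script.err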

Running perl files from a text file

There are multiple Perl scripts that are run from a Cygwin terminal. An example is:
$ perl IdGeneratorTool.pl JSmith -i userInfo.adb -o JSmith.txt
The above is an example, where, based on the input parameter JSmith, it reads a db file, generates an ID, and outputs it to a text file.
Now the list of these Perl invocations keeps growing, and they are collected in a text file as shown below:
$ perl IdGeneratorTool.pl JSmith -i userInfo.adb -o JSmith.txt
$ perl IdGeneratorTool.pl PTesk -i userInfo.adb -o PTesk.txt
$ perl IdGeneratorTool.pl CMorris -i userInfo.adb -o CMorris.txt
$ perl IdGeneratorTool.pl JLawrence -i userInfo.adb -o JLawrence.txt
$ perl IdGeneratorTool.pl TCruise -i userInfo.adb -o TCruise.txt
...
And the list keeps growing.
I would like to know whether there's a way to execute all these perl scripts which are in a text file in one go.
I'm new to Perl and don't have much idea what the options are.
An ideal scenario might be a tool where I can open this text file, click an execute button, and have it execute all the commands and output multiple *.txt files into the same directory.
Or maybe a simple perl script that can do it.
Put them into a file makeall (or whatever you want to call it).
Put #!/bin/bash as the first line of the file.
In Cygwin enter chmod +x makeall
In Cygwin enter ./makeall
With this you've created a bash script that will do all your calls of the Perl script.
Another option would be to just put all the user information into a CSV file and read that in order to call your script.
WAIT! Even easier!
Put into the makeall script this:
#!/bin/bash
for user in \
JSmith \
PTesk \
CMorris \
JLawrence \
TCruise \
; do
perl IdGeneratorTool.pl "$user" -i userInfo.adb -o "$user".txt
done
Now you just need to add any additional user the same way I did for your examples.
Without seeing the source for IdGeneratorTool.pl it's hard to give any specific advice; but it is generally not hard to turn something like
do_stuff($ARGV[0], $opt_i, $opt_o);
into
while (<>) {
    chomp;
    my ($user, $adb, $outputfile) = split /\t/;
    do_stuff($user, $adb, $outputfile);
}
to read the input from a tab-delimited file instead of from command-line arguments.
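For example, with the loop above in place, the whole batch could then be driven from a single tab-separated file of user/db/output triples (users.tsv is a hypothetical name):

perl IdGeneratorTool.pl users.tsv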
You can create a text file with a list of users (one per line), for example user_list.txt:
JSmith
PTesk
CMorris
JLawrence
TCruise
Then create a bash script process_list.sh with the following content in the same directory:
#!/bin/bash
for user in `cat user_list.txt`
do
perl IdGeneratorTool.pl $user -i userInfo.adb -o ${user}.txt
done
Now make the bash script executable with chmod +x process_list.sh and it is ready for execution.
When you need to add a new user, edit user_list.txt and add one more line to the file.
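If the list might ever contain blank lines or stray whitespace, a slightly more defensive variant of the same loop reads the file line by line (a sketch using the same file names as above):

#!/bin/bash
# Read user_list.txt one line at a time, skip empty lines,
# and run the tool once per user.
while IFS= read -r user; do
    [ -n "$user" ] || continue
    perl IdGeneratorTool.pl "$user" -i userInfo.adb -o "${user}.txt"
done < user_list.txt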

Piping to awk fails to output from Swift 2.0

I'm trying to run the following command to process the output via Swift 2.0 but I'm having trouble doing so.
netstat -w1 -I en | awk '{ print $3 }'
The command runs as expected when run manually from the terminal.
I tried the method of calling /bin/sh -c with the full command as an argument but that doesn't output anything. (Similar to https://stackoverflow.com/a/29549342/2110967).
I also tried to pass the data over stdout/stdin from netstat to awk but didn't have much luck.
I also tried placing the code into a .sh file and running it from there, but again the output was never returned.
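One detail that can make this look like "no output" no matter how the process is launched: netstat -w1 never terminates, and awk buffers its output when writing to a pipe rather than a terminal, so a caller reading the pipe may see nothing for a long time. A sketch of the shell side that forces awk to flush each line (the Swift side would still need to read the pipe incrementally rather than waiting for the process to exit):

# Same pipeline, but fflush() makes awk emit each value as soon as it is
# produced instead of holding it in a block buffer.
netstat -w1 -I en | awk '{ print $3; fflush() }'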

sh: variable substitution with heredoc

cat "${pos}" | /usr/bin/iconv -f CP1251 -t UTF-8 | uniq | sed -En "/^CLIENT_ID.*/!p" | while read line
do
.....
......
cat >> "$TMPFILE" << EOF
INSERT INTO ......;
EOF
done
As you can see, each iteration writes an SQL statement to a tmp file.
I launched this script from a regular interactive shell and got the expected output. Launched from a cron job - nothing.
After investigating, I found the problem: when I use $TMPFILE without the quotes, the script works OK. Why does this happen?
OS: FreeBSD, Bourne shell.
IIRC, cron doesn't source all the files that a login shell does, so you will end up with different settings for environment variables. It could be, for example, that the path $TMPFILE points to contains spaces when run from cron.
Also, on some systems (depending on setup), cron uses a different shell. So if you start your script from the command line, /usr/bin/sh might be used, whereas when it is started by cron, /bin/sh is used. (I have no experience with *BSD, but I have observed this on Linux.)
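A minimal sketch of the usual workaround, assuming $TMPFILE (and possibly PATH) currently come from the login environment: define them explicitly at the top of the script so it sees the same values under cron as it does interactively. The temp-file location below is just a placeholder.

#!/bin/sh
# Set the environment explicitly instead of relying on what cron provides.
PATH=/bin:/usr/bin:/usr/local/bin
export PATH
TMPFILE=/var/tmp/inserts.$$.sql   # placeholder location
export TMPFILE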

How to find the command line of a process only if the process is from current user

I have the following situation:
A Perl script reads a file where an application wrote its PID, and tries to kill that process.
The problem is that I don't want to kill an unrelated process, so I check whether the current process with the recorded PID has the same command line. If so, the application can be killed.
The following line of the script finds out the cmdline:
$PIDCMDLINE = `ps -p $PID -o cmd`;
The problem is that if another instance is up for another user, maybe in the same sid, it would be killed, because ps will still return a valid command line, and I don't want that behaviour.
How can I restrict ps -p to search only the current user's processes? (No, plain ps doesn't count, because -p nullifies the default filtering of ps.)
Thank you!
You can use the following to check both the command and the user for a certain PID:
ps -p <PID> -o user,cmd --columns 1000 | grep `whoami`
Adding a 'grep' according to the comment.
May be a little awkward, but what about this:
$PIDCMDLINE = `ps -p $PID -o user,command | grep "\$(whoami)" | awk '{ print \$2 }'`;
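Another way to avoid grep ever matching the wrong line is to ask ps for just the owner and command of that one PID and compare the owner yourself. A sketch using standard ps output options (comparing numeric UIDs sidesteps username truncation; $PID is assumed to hold the recorded pid, whether as a shell variable or interpolated by Perl inside backticks):

# Print the command line of $PID only when its owner matches the current user.
# "-o uid=,args=" produces headerless "uid command ..." output for that PID.
ps -o uid=,args= -p "$PID" | awk -v uid="$(id -u)" '$1 == uid { $1 = ""; sub(/^ +/, ""); print }'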

Executing perl script inside bash script

I inherited a long bash script that I recently needed to modify. The bash script is run as a cronjob on a daily basis. I am decent with bash scripting, but I do not know much about Perl.
I had to substitute all "rm" commands with a call to a perl script that does something similar (for security purposes). This script was not written by me, so there is no -f flag to skip the confirmation prompt. Therefore, to automate this script I pipe "yes" to the script.
Here is an example where I am sequentially deleting two directories:
echo REMOVING FILES TO SAVE DISK SPACE
echo "yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir1>"
yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir1>
echo "yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir2>"
yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir2>
echo DONE.
In my output file, I see the following:
REMOVING FILES TO SAVE DISK SPACE
yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir1>
yes | sudo nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir2>
DONE.
It does not appear that the perl script has run. Yet when I copy and paste those two commands into the terminal, they both run fine.
Any help is appreciated. Thank you in advance.
You simply do:
yes | ./myscript.pl
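If the echoed command lines appear in the log but the deletions never happen, one thing worth ruling out in a cron context is sudo failing silently, for example trying to prompt for a password with no terminal attached. A sketch for debugging, reusing the placeholders from the question: merge stderr into the log and record the exit status (sudo -n makes sudo fail immediately instead of prompting):

# Same command as in the script, but stderr goes into the cron log and the
# exit status is recorded, so a sudo or permission failure becomes visible.
yes | sudo -n nice -n -10 perl <path_to_delete_script.pl> -dir <del_dir1> 2>&1
echo "delete script exited with status $?"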
Thanks for all the comments. I ended up changing the group and permissions of the tool and all output files. This allowed me to run the perl script without using "sudo," which others pointed out is bad practice.