How to prevent a crontab job from running when it is already running

I want to schedule a job to run every 5 minutes, but the question is:
Is there a way (such as a crontab setting) to prevent a job from running when the previous run has not yet completed?

You can write a shell script that starts the job only when it is not already running, and configure that shell script in the crontab to run every 5 minutes. This ensures the job executes only when no other instance of it is already running. This is how I have done it for my cron jobs.
Note: use ps -ef | grep in your shell script to identify whether the process is already running.
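As a minimal sketch of such a wrapper (the job path and lock-file name are assumptions), flock(1) (available on Linux via util-linux) is a more robust alternative to grepping ps output, because the lock is released automatically when the job exits:
#!/bin/sh
# run_job.sh - start the real job only if no other instance holds the lock.
# flock -n fails immediately, instead of waiting, if the lock is taken.
exec /usr/bin/flock -n /tmp/job.lock /path/to/job.sh
The crontab entry then points at the wrapper:
*/5 * * * * /path/to/run_job.sh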

Related

Mac Terminal to run a bash script that starts a swift program & restarts every hour

I am looking for some support on creating a way to run a swift command in Terminal that starts a program, stops it after 1 hour, and then restarts it.
Example of the manual process:
1. Open Terminal.
2. cd my app
3. swift run my program --with-parameters
4. Ctrl+C (after 1 hour)
5. Restart with step 3.
I am sure there must be some way, maybe using a bash script, to start the program by command, kill it after 60 minutes, and restart it in a continuous loop like that.
Thanks :-)
You can set up a cron job to do this. Basically, you'll have a bash script, say it's located at /Users/ben/scripts/run_my_program.sh, that will, at every hour:
Terminate the currently running process (kill PID)
Execute swift run my program --with-parameters in the background and record the new process ID
You can get the PID of the swift process you launch with $!, then use sleep 3600 to sleep for 1 hour (macOS's sleep takes a plain number of seconds, not 1h), and finally kill the process with kill -9 and the PID you recorded.
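A minimal sketch of the continuous-loop variant (the project path /Users/ben/myapp and the product name MyProgram are placeholders):
#!/bin/bash
# restart_loop.sh - run the program, kill it after an hour, repeat.
cd /Users/ben/myapp || exit 1
while true; do
    swift run MyProgram --with-parameters &
    pid=$!                      # PID of the backgrounded swift run
    sleep 3600                  # macOS sleep takes seconds: 3600s = 1 hour
    kill "$pid" 2>/dev/null     # stop this cycle; escalate to kill -9 if needed
    wait "$pid" 2>/dev/null     # reap it before the next iteration
done
Note that killing swift run may not always take the built binary down with it; if that happens, starting the loop with set -m and killing the whole process group (kill -- -$pid) is one workaround.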

Can a PowerShell script be dependent on another script's execution?

I have a situation where I want to make the execution of my scripts smarter. I have a set of scripts that execute at given times, but because the input files are sometimes not posted at the correct times, the scripts run into errors and produce unexpected results. So one of the solutions I was considering is to make the execution of the scripts dependent on each other. Here is what I mean:
script 1 runs at 6 pm
validates that the file is there
if it's there, set a flag
the flag is active so execute script 2 at 9 pm
if it's NOT there, the flag is not set
the flag is not set so script 2 is not executed
Right now script 1 and script 2 are set up in the Task Scheduler at those times. I checked the Scheduler for those kinds of conditions but didn't find anything.
You can set triggers in Task Scheduler, such as when an event happens, for basically everything you can see in Event Viewer.
I would suggest calling Write-EventLog from the script that works on the file; depending on the result, the scheduled task for the second script would get triggered.
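As a minimal sketch of what script 1 could log (the source name MyScripts, the file path, and the event IDs are assumptions; the source must be registered once, from an elevated prompt, with New-EventLog):
# Register the source once (elevated):
# New-EventLog -LogName Application -Source 'MyScripts'
if (Test-Path 'C:\data\input.csv') {
    Write-EventLog -LogName Application -Source 'MyScripts' -EntryType Information -EventId 1000 -Message 'Input file present'
} else {
    Write-EventLog -LogName Application -Source 'MyScripts' -EntryType Warning -EventId 1001 -Message 'Input file missing'
}
Script 2's trigger in Task Scheduler would then be "On an event" with log Application, source MyScripts, and event ID 1000.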
I suggest having a single script that runs every N minutes as a single scheduled task in Task Scheduler.
This master script would analyze activities and hold all the logical conditions that determine when and which external script to run. You can also use flag files.

Jenkins - Close the Jenkins job as soon as one of the parallel scripts fails

I'm running two Perl scripts in parallel in Jenkins
some shell commands
perl script 1 &
perl script 2 &
wait
some more shell commands
If one of the Perl scripts fails in the middle of its execution, the job still waits until the other script finishes (as they are executed in parallel in the background).
I want the job to stop as soon as one of the scripts fails, and not waste time completing the execution of the other script.
Please help.
You can set up a signal handler for SIGCHLD, a signal that is always delivered to the parent process when a child exits. I'm not aware of a mechanism to see which child process exited, but you can save the subprocess process identifiers and just kill both of them when you receive SIGCHLD:
some shell commands
perl script 1 &
pid1=$!                        # PID of the first background script
perl script 2 &
pid2=$!                        # PID of the second background script
trap "kill $pid1 $pid2" CHLD   # on any child exit, kill both
wait
some more shell commands
The script above has the downside that it kills the other script regardless of the exit status of the subprocess that finished. If you want, you could add a check for the exit status inside the trap: each subprocess could, for example, create a temp file when it succeeds, and the trap could check whether that file exists.
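As an alternative to the trap (a swapped-in technique, not part of the answer above): if the Jenkins shell is bash 4.3 or newer, wait -n returns as soon as the next background job finishes, with that job's exit status, so the failure check becomes explicit:
perl script 1 &
pid1=$!
perl script 2 &
pid2=$!
# wait -n returns when the next job exits; a nonzero status means it failed
wait -n || { kill $pid1 $pid2 2>/dev/null; exit 1; }
wait -n || { kill $pid1 $pid2 2>/dev/null; exit 1; }
some more shell commands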
Typically with Jenkins you would have the parallel steps run as separate jobs (or projects, as they are sometimes known) rather than as steps within one job. This allows the steps to run in parallel across different slave machines and keeps the output for the jobs in separate places.
You would then have a controlling job running the other parts.
I like the Multijob plugin for this sort of thing.
There are alternatives that may suit you better, such as the Build Flow Plugin, which uses a DSL to describe the jobs you want to run.

Catching the error status while running scripts in parallel on Jenkins

I'm running two Perl scripts in parallel on Jenkins, plus one more script that should only execute if the first two succeed. If I get an error in script 1, script 2 still runs, and hence the overall exit status becomes successful.
I want to run them in such a way that if either of the parallel scripts fails, the job stops with a failure status.
Currently my setup looks like
perl_script_1 &
perl_script_2 &
wait
perl_script_3
If script 1 or 2 fails in the middle, the job should be terminated with a failure status without executing script 3.
Note: I'm using tcsh shell in Jenkins.
I have a similar setup where I run several Java processes (tests) in parallel and wait for them to finish. If any fail, I fail the rest of my script.
Each test process writes its result (pass or fail) to a file that is checked once the process is done.
Note: the code examples below are written in bash, but it should be similar in tcsh.
To do this, I get the process id for every execution:
test1 &
test1_pid=$!
# test1 will write pass or fail to file test1_result
test2 &
test2_pid=$!
...
Now I wait for the processes to finish by polling with the kill -0 PID command, which succeeds only while the process still exists.
For example, test1:
# Check test1
kill -0 $test1_pid
# Check if the process is done or not
if [ $? -ne 0 ]
then
    echo process test1 finished
    # check results
    grep fail test1_result
    if [ $? -eq 0 ]
    then
        echo test1 failed
        mark_whole_build_failed
    fi
fi
The same goes for the other tests (you can run a loop that polls all running processes periodically); a sketch of such a loop follows.
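A minimal sketch of that loop in bash, keeping the names from above (the 10-second poll interval is arbitrary, and mark_whole_build_failed stands in for whatever mechanism you use to mark the build as failed):
for name in test1 test2; do
    pid_var="${name}_pid"
    # kill -0 succeeds for as long as the process is still alive
    while kill -0 "${!pid_var}" 2>/dev/null; do
        sleep 10
    done
    # the process has finished - inspect its result file
    if grep -q fail "${name}_result"; then
        echo "$name failed"
        mark_whole_build_failed
    fi
done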
Afterwards, condition the rest of the execution on mark_whole_build_failed.
I hope this helps.

Simple command prompt one-liner into an exe

I have a Perl script that I run from the command prompt 3 times a day; the problem is that now I need to run it every two hours. I'm not going to be on an episode of Lost, so I need some help. Here is the command:
perl C:/test/scripts/edi.pl
Does anyone know how this one-line command can be made into an executable (.exe) file, so I can use the Task Scheduler to run it?
If there is another way to do this with the Task Scheduler, running once every two hours every day, that would work as well.
Thanks for your time.
Can you not simply create a batch file that runs the script, and schedule that batch file to run every two hours? From a quick test, I know I can schedule a batch file to run from Task Scheduler on Windows XP at least.
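A minimal sketch of such a batch file (the .bat location and the task name RunEdi in the schtasks line are assumptions):
@echo off
rem edi.bat - wrapper so Task Scheduler has a single file to run
perl C:/test/scripts/edi.pl
The schedule itself can also be created from the command line, here once every 2 hours:
schtasks /create /tn "RunEdi" /tr "C:\test\scripts\edi.bat" /sc hourly /mo 2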
You can actually use the Task Scheduler to run that exact command without a batch file.
The Task Scheduler should allow you to pass arguments to the script without a problem (I have done this on a few Windows servers in order to have them run PHP scripts).