On Windows XP I'm trying to add a job like this:
at 17:07 /every:s dir
I expect dir to be executed every Saturday at 17:07; however, I don't see anything happen in the command-line window.
Here is the log:
D:\temp>at 17:07 /every:s dir
Added a new job with job ID = 1
D:\temp>time/t
05:06 PM
D:\temp>date/t
Sat 10/02/2010
D:\temp>at
Status ID Day Time Command Line
-------------------------------------------------------------------------------
Error 1 Each S 5:07 PM dir
D:\temp>time/t
05:08 PM
D:\temp>
What am I missing?
The Task Scheduler service runs at commands in the background. You should not expect to see anything just because you happen to have a console open. A simple way to test that it runs is to write to a log file. You will likely want to wrap this in a batch file.
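For example, a minimal wrapper batch file (the path and file names here are illustrative) that appends a timestamp to a log file, so you can verify the job actually ran:

```batch
@echo off
rem Record that the scheduled job ran, then capture the directory listing
echo %DATE% %TIME% job ran >> D:\temp\at-job.log
dir D:\temp >> D:\temp\at-job.log
```

You would then schedule the wrapper instead of dir itself, e.g. at 17:07 /every:S D:\temp\job.bat.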
I think you need to use the /interactive parameter in order for the output to be visible.
I have a situation where I want to make the execution of my scripts smarter. I have a set of scripts that execute at a given time, but because sometimes the input files are not posted at the correct times, the scripts run into errors and produce unexpected results. So one of the solutions I was thinking of is to make the execution of the scripts dependent on each other. Here is what I mean:
script 1 runs at 6 pm
validates that the file is there
if it's there, set a flag
the flag is active so execute script 2 at 9 pm
if it's NOT there, the flag is not set
the flag is not set so script 2 is not executed
Right now script 1 and script 2 are set with the Task Scheduler at those times. I checked the Scheduler for those types of conditions, but didn't find anything.
You can set triggers in Task Scheduler, such as firing when an event occurs, for basically anything you can see in Event Viewer.
I would suggest calling Write-EventLog from the script that works on the file; depending on the result, the scheduled task would get triggered.
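A sketch of that approach; the event source name ("FileCheck"), event ID, and paths are purely illustrative:

```powershell
# One-time setup (requires elevation): register a custom event source
New-EventLog -LogName Application -Source "FileCheck"

# In the script that checks for the input file:
if (Test-Path "D:\input\data.csv") {
    Write-EventLog -LogName Application -Source "FileCheck" `
        -EventId 1000 -EntryType Information -Message "Input file present"
}
```

The task for script 2 can then use an "On an event" trigger matching the Application log, source FileCheck, event ID 1000.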
I suggest you have a single script running every N minutes as a single scheduled task via Task Scheduler.
The master script would analyze activity and contain all the logical conditions that determine when and which external script to run. You can also use flag files.
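A minimal sketch of such a master script, assuming hypothetical paths and the 6 pm / 9 pm times from the question:

```powershell
# master.ps1 - scheduled to run every N minutes via Task Scheduler
$flag = "D:\flags\file-ready.flag"

# From 6 pm: if the input file has arrived, run script 1 and set the flag
if ((Get-Date).Hour -ge 18 -and (Test-Path "D:\input\data.csv")) {
    & "D:\scripts\script1.ps1"
    New-Item -ItemType File -Path $flag -Force | Out-Null
}

# From 9 pm: run script 2 only if the flag was set, then clear it
if ((Get-Date).Hour -ge 21 -and (Test-Path $flag)) {
    & "D:\scripts\script2.ps1"
    Remove-Item $flag
}
```

A real version would also need a guard so script 1 runs only once per day; this only sketches the flag-file idea.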
I'm looking to have a .bat file run at startup to check if the date is September 18th, which is my birthday. What I have written is
#echo off
if %date% == Sun 09/18/2016 start /d C:\Users\david\Documents\birthday.bat
end if
but that doesn't work. I'm fairly new to scripting, so any help would be appreciated.
EDIT: I also tried using Schtasks to create a scheduled task that executes birthday.bat on the given date, but that failed entirely to create a task. I am running Win10 and want to do this entirely from a .bat file.
All you need is "" around the dates in the if command.
Remove the /d; I just added "" after start and quotes around the batch path.
So it should look like this:
@echo off
if "%date%" == "Sun 09/18/2016" start "" "C:\Users\david\Documents\birthday.bat"
That should work.
What went wrong when you used scheduled tasks? Fixing that would be a better option than running the batch file at each startup.
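For reference, a schtasks command along these lines should create such a yearly task (the task name and start time are illustrative):

```batch
schtasks /create /tn "Birthday" /sc monthly /m SEP /d 18 /st 09:00 /tr "C:\Users\david\Documents\birthday.bat"
```

If your earlier attempt failed, posting the exact schtasks command and its error message would make it possible to say what went wrong.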
I have the following function definition in my profile:
function design {
Set-Location $env:CC72\Designer
.\RunDesigner.bat
}
But when I execute the function design, PowerShell executes the batch file RunDesigner.bat twice and two instances of it are started:
PS C:\Users\s3201> design
Starting Designer...
Starting Designer...
PS C:\customer\Designer>
Why?
Update 1
The content of the batch is the following:
set LOC_WIN32=c:\Customer\xxxxxx
set LOC_UNIX=c:/Customer/xxxxxxx
echo Starting Designer...
start %LOC_WIN32%\bin\xxxxxxx.exe %LOC_UNIX%/Designer/lib/xxxxxxxx.tcl -guiFile %1
Update 2
If I start Vim from the directory where the mentioned batch file resides, then both start: Vim and the batch file.
Therefore the reason for the message at the beginning of the question is clear: PowerShell tries to start that batch file first, does not find it, writes the message, and then starts Vim. But why does it try to start that batch file first?
Obviously, the root cause is in some PowerShell definitions. Which ones? (I checked aliases.)
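One way to narrow this down (these are standard cmdlets, not a guaranteed diagnosis) is to ask PowerShell what the names actually resolve to:

```powershell
# Show every command named "design" visible in this session
Get-Command design -All

# Show the body of the function that is actually in effect
(Get-Command design -CommandType Function).Definition

# RunDesigner.bat may also be resolved via the current directory or PATH
Get-Command RunDesigner.bat -ErrorAction SilentlyContinue | Format-List *
```

If two definitions of design exist (for example, from two dot-sourced profile files), Get-Command -All will show both.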
I have a problem understanding the use of a background process.
I have to convert processing logic from Linux to Windows. I have a script that, among other things, launches another script in the background and then continues its own processing. The script that launches the background processing does not have to care about its result, and thus should end even if the background processing is still executing.
What I found in PowerShell is that when my first script ends, the processing of the script launched in the background appears to break off.
Here is an example of the tests I made in my environment (PowerShell version 2.0).
Here's the simple code of my first script script_1.ps1:
start-job -filepath d:\script\script_2.ps1
sleep 3
get-item d:\log\log
And my second script script_2.ps1:
sleep 1
"test" > d:\log\log-1.log
sleep 1
"test" > d:\log\log-2.log
sleep 1
"test" > d:\log\log-3.log
sleep 1
"test" > d:\log\log-4.log
sleep 1
"test" > d:\log\log-5.log
Here's what happens when I submit it:
C:\Documents and Settings\user1>Powershell -command "D:\script\script_1.ps1"
Id Name State HasMoreData Location Command
-- ---- ----- ----------- -------- -------
1 Job1 Running True localhost sleep 1...
LastWriteTime : 2013-10-10 10:19:13
Length : 14
Name : log-1.log
LastWriteTime : 2013-10-10 10:19:14
Length : 14
Name : log-2.log
The result shows me that when the first script terminates (after 3 sec.), the second script stops as well, because we can see that only the first 2 log files were created.
I thought it would be like on Linux (&), where the background job continues anyway.
Is this the way it is supposed to be with PowerShell, or am I doing something wrong?
When you use the -Command switch to start PowerShell it runs the supplied command, and then exits the session. Jobs in PowerShell are specific to the session they're started in. You can see this for yourself if you want by starting two instances (sessions) of PowerShell. Create a job in one, e.g. start-Job {sleep 600}, and do get-Job, you'll see that a job is Running. If you type get-Job into the other PowerShell instance you'll see that there are no jobs.
The start-Job cmdlet creates a job, but it doesn't wait for that job to complete; it simply starts it on its merry way. In your case what's happening is that start-Job is creating the job, you're waiting three seconds, then inspecting the log directory. Then you reach the end of the script, and once that happens the session is terminated. This is why only some of your log files are created: the session is being ended because you've reached the end of the script, before the created job has had a chance to finish.
To use jobs in the manner you describe you need for the PowerShell session to remain started for the entirety of the job's lifetime. For that, you'll need the -NoExit parameter on your PowerShell command that runs the job. You may want to browse around powershell /? for other switches you might find useful. If you got extra creative with your script that starts the job you could eventually make a hidden, noninteractive session that automatically quit once the job was done. But baby steps.
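Two sketches of how script_1.ps1 could be adapted, using the paths from the question (pick one; they solve the problem differently):

```powershell
# Option 1: keep the session alive by waiting for the background job
$job = Start-Job -FilePath d:\script\script_2.ps1
Get-Item d:\log\log           # do the rest of the work meanwhile
Wait-Job $job | Out-Null      # block until the background job completes

# Option 2: launch a separate, detached PowerShell process instead of a job;
# it survives the parent script ending (closest to nohup on Linux)
Start-Process powershell -ArgumentList '-File', 'd:\script\script_2.ps1' -WindowStyle Hidden
```

Option 1 gives up the "fire and forget" behaviour but is the simplest fix; Option 2 keeps it.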
On Linux, the background job will keep on running only if it is launched with the nohup command, or the shell's huponexit option is not set (or the process is later disowned with the eponymous command).
Hints on how to do the same with PS can be found in this thread of emails on freelist.
I have a Perl script that I run from the command prompt 3 times a day; the problem is that now I need to run it every two hours. I'm not going to be on an episode of Lost, so I need some help. Here is the command:
Perl C:/test/scripts/edi.pl
Does anyone know how this one-line command can be made into an executable (.exe) file so I can run it with the Task Scheduler?
If there is another way to do this with the Task Scheduler, running once every two hours every day, then I could do that as well.
Thanks for your time.
Can you not simply create a batch file that runs the script, and set that batch file to run every two hours? I know from a quick test that I can schedule a batch file to run from Task Scheduler on Windows XP at least.
You can actually use the Task Scheduler to run that exact command without a batch file.
The task scheduler should allow you to pass some arguments to the script without a problem (I have done this on a few Windows servers in order to have them run PHP scripts)
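For example, a schtasks command along these lines should do it (the task name is illustrative; use the full path to perl.exe if it is not on PATH):

```batch
schtasks /create /tn "EdiScript" /sc hourly /mo 2 /tr "perl C:\test\scripts\edi.pl"
```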