DriveImage XML 2008 R2 Task Schedule - scheduled-tasks

I'm attempting to create a scheduled task on two different Windows 2008 R2 servers that will use DriveImage XML by Runtime Software to create a disk image. The tasks schedule fine and run at the proper times, and after observing the running processes, dixml.exe even shows up in the Processes tab of the server's Task Manager under the specified user account.
The login used to run the task has administrator credentials, and the GPO is set to allow administrators to elevate an application without a prompt.
I've tried writing a batch file for the program and running it, which is as follows:
"C:\Program Files (x86)\Runtime Software\DriveImage XML\dixml.exe" /bC /s /c1
/v /t"X:\%date:~10,4%_%date:~4,2%_%date:~7,2%_%~0,3%"
I broke the above command onto two lines for easier reading.
When run manually, either from cmd or by double-clicking, this script properly starts the backup. However, when attached as a file to the Task Scheduler, the task 'starts' but the program itself never begins a backup.
I've tried converting the batch file into this action:
Start a program: dixml.exe
Arguments: /bC /s /c1 /v /t"X:\%date:~10,4%_%date:~4,2%_%date:~7,2%_%~0,3%"
Start in: C:\Program Files (x86)\Runtime Software\DriveImage XML
According to Microsoft, the Task Scheduler in 2008 R2 does not recognize trailing slashes or quotation marks, so you leave them out.
The second method has the same effect when the task runs: the program shows up in the process tree and matches the PID recorded in Event Viewer, but the backups are not made.
Anyone have experience with this perplexing issue?

Related

Silent bat file execute powershell command

We have an application server running as a service. When some configuration is loaded, it starts a bat script that has to run the PowerShell command Stop-ClusterGroup DRMSERVICES and then start the group again.
The bat file works flawlessly when I execute it manually by double-clicking, but when the service runs it, it does not finish or execute the PowerShell command.
The bat file looks as follows:
@echo off
powershell -command Stop-ClusterGroup DRMSERVICES
powershell -command Start-ClusterGroup DRMSERVICES
The main difference is that the service runs the bat file in silent mode.
I have tried various switches, including -ExecutionPolicy Unrestricted and START /wait, etc.
I have also tried creating a separate .ps1 file and having the bat execute that instead.
All with the same output:
Manually executing the bat works
When the service executes the bat, it does not work.
I know the bat file is executed by the service, as inserting NET STOP servicename works correctly.
In the PowerShell event log I can also see events for the PowerShell commands taking place.
The difference between executing the command manually and having the service execute it, as seen in Event Viewer, is event ID 800, which contains information about the 'execution pipeline'; this event is not present when the service executes the bat.
The service does not wait for PowerShell, and thus it does not have time to stop the cluster before exiting.
I'm lost as to whether this is a permission issue, a syntax error, or something else.
Hopefully somebody can help
UPDATE:
I have tried all the proposed solutions, all with the same result: the bat file works when double-clicked, but the service does not execute the PowerShell command. Pure cmd commands are executed, as I can pipe output to a txt file. When trying runas, I even got to a point where the output log read "insert administrator password".
I even had our software guy change our software to call PowerShell directly instead of a bat, with the same result. PowerShell won't execute the command, which tells me it is probably a permission issue, but everything has been set to log in as admin and run as admin for the sake of success, and still nothing.
I solved the problem.
Because the service is a 32-bit process, it executes a 32-bit PowerShell, and the FailoverClusters module only exists as a 64-bit module.
By using %SystemRoot%\sysnative\WindowsPowerShell\v1.0\powershell.exe, the service is able to open a 64-bit session and thus use the FailoverClusters module (see the sketch below).
As a side note, the sysnative folder is only visible from a 32-bit process, so it cannot be found by browsing in a 64-bit session.
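For illustration, a minimal sketch of the bat file as the 32-bit service could run it (the -NoProfile switch and the comments are my own additions, not from the answer):
@ECHO OFF
REM Launched by the 32-bit service; the sysnative alias redirects a 32-bit process
REM to the real 64-bit System32, so the 64-bit PowerShell (and the 64-bit-only
REM FailoverClusters module) can be used.
%SystemRoot%\sysnative\WindowsPowerShell\v1.0\powershell.exe -NoProfile -Command "Stop-ClusterGroup DRMSERVICES"
%SystemRoot%\sysnative\WindowsPowerShell\v1.0\powershell.exe -NoProfile -Command "Start-ClusterGroup DRMSERVICES"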
I think I have dealt with this kind of issue before. After the
powershell -command Stop-ClusterGroup DRMSERVICES
you need to have cmd wait a certain number of seconds and then test whether DRMSERVICES has actually stopped; if it has stopped, start DRMSERVICES again. That way cmd keeps waiting and re-checking until the group is down.
After a certain number of tries, have a way to stop checking and exit the script, in case stopping the group has run into a problem.
There is a timeout command in cmd for the delay.
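A rough sketch of that retry idea as a batch file (the 5-second delay and 12-try limit are arbitrary assumptions, and the state check shells back out to PowerShell's Get-ClusterGroup):
@ECHO OFF
powershell -command "Stop-ClusterGroup DRMSERVICES"
SET /A tries=0
:waitloop
REM Wait a few seconds, then ask the cluster whether the group is offline yet
timeout /t 5 /nobreak >NUL
powershell -command "if ((Get-ClusterGroup DRMSERVICES).State -eq 'Offline') { exit 0 } else { exit 1 }"
IF %ERRORLEVEL%==0 GOTO stopped
SET /A tries+=1
IF %tries% GEQ 12 GOTO giveup
GOTO waitloop
:stopped
powershell -command "Start-ClusterGroup DRMSERVICES"
GOTO :EOF
:giveup
ECHO DRMSERVICES did not stop in time, giving up.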

Batch Script not Releasing after Execution

I have a batch script which performs file version controlling after a Backup event has taken place. This batch script, writing to a normal txt logfile, calls a PowerShell script to send this Logfile as an attachment with a success notification email. I have managed to release the writing lock on the log file, to allow PS to attach and send the file, but the Batch script does not stop after the entire sequence has been completed.
When I check the log file, I see that the shell instance is sitting at the 'Pause' in the script instead of self-terminating (as it is instructed), which results in:
Press any key to continue... with a waiting shell, and
an app-locked logfile, which won't allow the script to run again until the logfile is released.
This is the sequence of events:
The only PAUSE I have is in bak_send_exec.bat; its sole purpose is to start a PS script:
PowerShell.exe -noprofile -executionpolicy bypass
If I remove it, the PS does not start. If I leave it in, the PS starts and executes flawlessly, but the logfile stays locked by a shell instance sitting in the paused state until someone kills the cmd.exe instance that locked the file.
This runs on a weekend at 01:00 am, so user intervention should not be required.
VC Script Summary:
These inter-connected batch files rename two identical files (in different locations) with timestamps. The timestamp is written to a variable for use in a notification email, which is sent using a PowerShell command. The entire process is logged to a txt log file (overwritten when the script runs again), and that log file is included with the notification email mentioned earlier.
Script Calls:
Initial Start Command: Triggers the Version Control Procedures and Logs Progress with versioncontrol_post.bat > TSLog.txt 2>&1
versioncontrol_post.bat: Performs main procedure, then ends with CALL bak_send_exec.bat
bak_send_exec.bat: the suspected cause... The entire file is only three lines long, but is required, as mentioned earlier, for the execution-policy relaxation:
@ECHO OFF
PowerShell.exe -noprofile -executionpolicy bypass -file bak_send.ps1
PAUSE
bak_send.ps1: performs the main procedure: it copies the temporary log (TSLog.txt) to its final home, releases the TSLog file in favour of the new duplicate, then attaches that duplicate to an email and sends the email. The final line in the procedure is EXIT.
Fault finding tells me that the issue is not with the PowerShell script, but rather with the script that calls it. Taking out the PAUSE command results in the PowerShell not starting.
Does anyone have a possible solution to this "feature"?
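One variation that may be worth testing (a sketch, not a confirmed fix, and it assumes the PAUSE is only there to keep the calling shell alive until PowerShell has launched): have the wrapper wait on PowerShell explicitly and then exit, so no paused console is left behind holding the log file open. The %~dp0 prefix assumes bak_send.ps1 sits next to the bat file.
@ECHO OFF
REM bak_send_exec.bat (sketch): wait for the PowerShell script to finish, then exit,
REM instead of relying on PAUSE to keep the window open.
start /wait "" PowerShell.exe -noprofile -executionpolicy bypass -file "%~dp0bak_send.ps1"
EXIT /B 0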

Run a PowerShell script in Task Scheduler that is located on network location

I'm trying to administer PowerShell scripts from a central shared location. At the moment the .ps1 scripts are stored on many different servers.
The scripts are used in Scheduled Tasks with the 'Action' specifics:
Program/script: powershell.exe
Add arguments (optional): G:..\scriptABC.ps1
Start in (optional): C:\Windows\System32\WindowsPowerShell\v1.0\
Now the goal is to have them all on a shared location:
\\domain\root..\scripts\scriptABC.ps1
Problem:
When I update the 'Add arguments' field to "\\domain\root..\scripts\scriptABC.ps1" and try to start the task, it says Running for a few seconds and then Ready again, but nothing happens. So it works like a charm when a local drive is specified, but not when a UNC path is specified.
Additional info:
Working on Windows Server 2012 R2
Any thoughts?
Check out the following link:
https://community.spiceworks.com/how_to/17736-run-powershell-scripts-from-task-scheduler
Also, the machine running the task will need to have permission to access the file on the network.
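A hedged example of action fields that generally work for a script on a share (the share path below is a made-up placeholder, and -ExecutionPolicy Bypass is an assumption in case UNC-hosted scripts are blocked by policy):
Program/script: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add arguments: -NoProfile -ExecutionPolicy Bypass -File "\\domain\share\scripts\scriptABC.ps1"
Start in: C:\Windows\System32\WindowsPowerShell\v1.0\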

Windows Scheduled task succeeds but returns result 0x1

I have a scheduled task on a Windows 2008 R2 server. The task includes a Start In directory entry. The task runs, and the batch file it runs does what it is supposed to do. When I run the batch file from a command prompt, I see no errors. The problem is that the "Last run result" is 0x1 (incorrect function call).
I did get this at one time with an incorrect DOS statement IF EXISTS file.txt DO (Copy file.txt file1.txt) that was corrected by dropping the DO statement. The current batch file does not show me any errors or warnings.
Why am I getting a 0x1 result?
Batch file that is run:
PUSHD \\JUKEBOX4\Archives\CallRecording
REM only move csv and wma together. wma should be created last.
IF NOT EXIST C:\CallRecording (MKDIR C:\CallRecording)
FOR /f %%f IN ('DIR /b *.wma') DO (
IF EXIST %%~nf.csv (MOVE /Y %%~nf.* C:\CallRecording\)
)
POPD
CD /D "C:\Program Files (x86)\Olim, LLC\Collybus DR Upload"
CollybusUpload.exe
POPD
Info on scheduled task setup:
Program to run: C:\Program Files (x86)\Olim, LLC\Collybus DR Upload\CallRecordingUploadFromH.cmd
Start in: C:\Program Files (x86)\Olim, LLC\Collybus DR Upload
Run whether user is logged on or not, highest privileges.
History screen, task completed entry
"Task Scheduler successfully completed task "\Call recording upload to portal from NH" , instance "{1449ad42-2210-427a-bd69-2c15e35340e6}" , action "C:\Windows\SYSTEM32\cmd.exe" with return code 1."
First screen of Task Scheduler shows "Run Result" of "Success"
It seems many users are having issues with this. Here are some fixes:
Right click on your task > "Properties" > "Actions" > "Edit": put ONLY the file name under 'Program/Script', no quotes, and ONLY the directory under 'Start in' as described, again no quotes.
Right click on your task > "Properties" > "General" and test with any/all of the following:
"Run with highest privileges" (test both options)
"Run whether user is logged on or not" (test both options)
Check that "Configure for" is set to your machine's OS version
Make sure the user account running the program has the right permissions
I found that I had ticked "Run whether user is logged on or not" and it returned a silent failure.
When I ticked "Run only when user is logged on" instead, it worked for me.
I've had the same problem: just a batch file, working when started manually but not working as a scheduled task.
There were drive letters in the batch file, like this:
put z:\folder\file.ext
It seems you should not use drive letters; they are bound to the user who created them. For me this little change made it work again:
put \\server\folder\file.ext
For Powershell scripts
I have seen this problem multiple times while scheduling Powershell scripts with parameters on multiple Windows servers.
The solution has always been to use the -File parameter:
Under "Actions" --> "Program / Script" Type: "Powershell"
Under "Add arguments", instead of just typeing "C:/script/test.ps1" use -File "C:/script/test.ps1"
Happy scheduling!
Windows Task Scheduler (Windows Server 2008 R2): same error for me (last run result: 0x1).
On the Action tab: remove quotes/double quotes in both "program/script" and "start in", even if there are spaces in the path name...
On the General tab: tick "Run with highest privileges" and set "Configure for" to your OS...
Now it works!
Last run result: The operation completed successfully
Probably not the cause of the OP's problem, but for me the problem was caused by the fact that my program called a SQL function, and the service account the Windows task was set up with did not have the required SQL permissions. That also gives a 0x1.
This answer was originally edited into the question by the asker.
The problem was that the batch file WAS throwing a silent error. The final POPD was doing no work and was incorrectly called with no opening PUSHD.
Broken code:
CD /D "C:\Program Files (x86)\Olim, LLC\Collybus DR Upload" CALL CollybusUpload.exe POPD
Correct code:
PUSHD "C:\Program Files (x86)\Olim, LLC\Collybus DR Upload" CALL CollybusUpload.exe POPD
In my case it was an encoding issue. We wanted to start an existing batch file, and it resulted in "return code 1" and the desired action wasn't performed. I accidentally noticed that Notepad showed the batch file as UTF-8 encoded (for no apparent reason, as there are no special characters in the text). I saved it as ANSI, and that solved the problem for us. It may be that some kind of encoding corruption in the file prevented Task Scheduler and cmd.exe from running it correctly, even though it displayed fine in Notepad.
On our servers it was a problem with the system path. After upgrading the PHP runtime (installed in a directory whose name includes the version number) and updating the path in the system PATH variable, we were getting status 0x1. A system restart corrected the issue; restarting the Task Scheduler service might have done it, too.
I was running a PowerShell script from the Task Scheduler, but I had forgotten to set the execution policy to Unrestricted in an elevated PowerShell console:
Set-ExecutionPolicy Unrestricted
After that, the 0x1 error disappeared.
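As an alternative (my own note, not part of the original answer), the policy can be relaxed only for the scheduled task itself instead of machine-wide by passing it on the command line; the script path here is a hypothetical example:
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\myscript.ps1"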
Just had the same problem here. In my case, the bat file names contained spaces.
After getting rid of the spaces in the filename and changing them to underscores, the bat file worked.
Sample before renaming (it won't start):
"x:\Update & pull.bat"
After renaming:
"x:\Update_and_pull.bat"
For me the problem was the PowerShell script being ran had #Requires -RunAsAdministrator at the top, meaning it needs to run in an elevated command prompt as an Admin, but the user the Scheduled Task was set to run as wasn't an admin on the local computer. So even though Run with highest privileges was checked in the scheduled task, I still had to make the user an Administrator on the computer. Once I did that, the script ran as expected.
Since there is always more than one reason this could happen, I thought I'd share some troubleshooting tips that helped me diagnose my issue.
Always add a "Start in" parameter first, since that's an easy fix; even just adding the drive letter can help, e.g. C:\
If you're running "whether user is logged on or not" and it is failing, it might be an issue with your user and/or user environment.
Switch the task to run only when the user is logged on, temporarily, for troubleshooting purposes.
Make sure you're actually logged in AS the user you're telling the task to run as. (PATH and other environment variables differ by user, and if you see the task running successfully for one user, that doesn't necessarily mean it will run successfully for another user, even if they're in the same security group.)
Add pauses or some other type of debugging to your script to give you time to see any errors that pop up (see the sketch at the end of these tips).
Perform a manual run from the Task Scheduler window.
Fix any errors you see from your debugging statements. Rinse and repeat.
If it runs successfully, switch back to run "whether user is logged on or not" and try another manual run. If it works now, you're all set.
If nothing has helped so far, you might need to dig deeper into your user and file privileges. My troubleshooting tips assume that you have already been able to get a past task running using a specific user login; they don't necessarily cover building a scheduled task from a fresh install. Luckily I haven't had to do that.
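As a sketch of the debugging tip above (the paths are hypothetical, and C:\Temp is assumed to exist), wrapping the real script and redirecting everything to a log often reveals the error that otherwise flashes by too quickly to read:
@ECHO OFF
REM Debug wrapper (sketch): record who and where the task actually runs as,
REM then capture all output from the real script.
ECHO Task started %DATE% %TIME% as %USERNAME% in %CD% > C:\Temp\task_debug.log
CALL C:\Scripts\real_job.bat >> C:\Temp\task_debug.log 2>&1
ECHO Exit code: %ERRORLEVEL% >> C:\Temp\task_debug.log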
What solved it for me was that I was using a local administrator account instead of the domain account, so I changed the "Run as" account to the domain account.
It turns out that an FTP download call using WinSCP as the last thing in the batch caused the problem. After inserting the echo command, it works fine. I guess the source of the problem could be WinSCP.exe not correctly reporting the end of the current task to the OS.
del "C:\_ftpcrawler\Account Export.csv" /S /Q
"C:\Program Files (x86)\WinSCP\WinSCP.exe" /console /script="C:\_isource\scripte\data.txt"
echo Download ausgeführt am %date%%time% >> C:\_isource\scripte\data.log
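A related sketch (my own assumption, not part of the original answer): logging WinSCP's exit code right after the call can also make a silent failure visible in the log.
"C:\Program Files (x86)\WinSCP\WinSCP.exe" /console /script="C:\_isource\scripte\data.txt"
REM Log the exit code so a silent WinSCP failure shows up in the log
ECHO WinSCP exit code: %ERRORLEVEL% >> C:\_isource\scripte\data.log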

Azure startup task, wait for all other task to finish

I have a startup task for my web role that downloads some executable files from a blob and then proceeds with the installation.
From a .cmd file, I start a PowerShell script that downloads the files; then I start the downloaded file from the .cmd.
The script works fine if I run it manually through RDP after the publishing is done.
But when running as a startup script, it sometimes (often) fails at different points.
The taskType is set to background.
Last time, the error was that the command PowerShell does not exist...
Also, I use powershell -command set-executionpolicy unrestricted before running my PS script, but I read here that other tasks may reset this setting and make mine fail.
Quite a mess.
So that makes me think that if I could wait for all other tasks to finish before starting mine, it would eliminate these kinds of problems.
I suppose I could check if some process is running and wait for it to finish, but I have no clue which process to check.
Or maybe there's another solution.
~edit~
I read here that the error about PowerShell not existing may be caused by the batch file being saved as UTF-8 in Visual Studio. I rewrote it from scratch in Notepad++ and made sure it was saved as ANSI. Then, same error. The full message is:
'PowerShell' is not recognized as an internal or external command,
operable program or batch file.
Again, the script run perfectly from command line in remote desktop.
It would be possible to set an environment variable at the end of the script that is required to finish first; then, in the script that is awaiting the dependency, loop until the environment variable is set, and then kick off its activities.
You could also run everything from a single PowerShell script and use the -AsJob switch on your installer statement, then use the Wait-Job cmdlet to block until the task is complete before carrying on. PowerShell also offers the $? automatic variable, which indicates whether the last statement executed properly.
This might be caused by an encoding issue. As mentioned in this answer you should save your file in ASCII to ensure correct interpretation of your script.
From the linked answer:
Open your whatever.cmd file with your VS 2012 Ultimate. Click File -> Save whatever.cmd As -> in the dialog there is a little arrow next to the [Save] button. It will show a menu with the option Save with Encoding.
Select it. Now choose "US-ASCII Codepage 20127" from the list of available encodings.