Checking time and executing different instructions based on hour - PowerShell

I am using a Robocopy command on a Windows Server 2003 machine to copy a series of EDB files from user workstations into special folders on the server. I want to run the Robocopy commands twice, once in the mid-morning and once in the afternoon.
The way I KNOW how to do this would be to write two independent batch files that are scheduled to run at different times. Each batch would copy the EDBs to different directories.
But it occurred to me that I should be able to do this in one batch file by:
Checking the current time.
Noting whether it's before or after 12:00 PM.
If it's before 12:00 PM, running one set of Robocopy commands.
If it's after 12:00 PM, running the other set of Robocopy commands.
I am going to implement this the way I know how, with the two batch files, but I'd like to learn how to do it in other ways. I am open to any approach -- PowerShell, Python, etc. Admittedly, I am leery of installing anything on this production server that I wouldn't normally have to install. For instance, I could install Python, but it would be for this job only, and that seems a little like overkill. (Feel free to disabuse me!)

There are probably several ways to do what you are asking. The first part, running different code depending on the time of day, is pretty straightforward. Just use this:
if ( (Get-Date -UFormat %p) -eq "AM" ) {
<Code if doing before noon>
} #End if
else {
<code if doing after noon>
} #end else
You can run Robocopy commands in PowerShell without any fancy tricks. Here is a link to a question about Robocopy.
As far as scheduling the task, this link will show you how to schedule the PowerShell script with the Task Scheduler.
For anything else, you're going to have to do some trial and error, then come back with any failures or roadblocks to get more help.

Another possibility:
Switch ((get-date).tostring('tt'))
{
'AM' {'Morning script'}
'PM' {'Afternoon script'}
}

You could use something like this
set t=%time:~0,2%
if %t% lss 12 (
REM First set of robocopy commands here
) else (
REM Second set of robocopy commands here
)
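Since the question leaves the door open to Python, the same hour-based branching can be sketched there as well; a minimal sketch, where the command-set labels are placeholders for the actual Robocopy invocations:

```python
from datetime import datetime

def pick_command_set(now=None):
    """Return which set of robocopy commands should run for this hour."""
    now = now or datetime.now()
    # Before noon -> morning set; noon or later -> afternoon set.
    return "morning" if now.hour < 12 else "afternoon"
```

The returned label would then select which Robocopy commands to launch (e.g. via `subprocess`), keeping both schedules in one script just as the PowerShell and batch answers do.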

Related

Task scheduler "Run whether user is logged on or not" issue to startup application

I have a .bat file that starts up a PowerShell script.
Within this PowerShell script, I start up Power BI with a given database.
The PowerShell script waits until Power BI has finished starting up, and will then export data to some datadump files.
Doing this manually works fine, and it also works when it's on the Task Scheduler set to run when the user is logged on.
The moment I change this to "Run whether user is logged on or not", it doesn't work anymore.
The reason seems to be that PowerShell is unable to start Power BI, and therefore there is no open data to query in the rest of the script.
So the positive side is that it runs the .bat and the PowerShell script just fine; only PowerShell itself seems incapable of starting Power BI.
Are there any solutions to this? Should I, for example, use a different method to call the application to start?
Currently the PowerShell snippet to start the app looks like this:
$PBIDesktop = "C:\Program Files\Microsoft Power BI Desktop\bin\PBIDesktop.exe"
$template = "C:\LiveData\Data.pbix"
$waitoPBD = 60
$app = Start-Process $PBIDesktop $template -PassThru
log_message "Waiting $($waitoPBD) seconds for PBI to launch"
Start-Sleep -s $waitoPBD
I faced a similar issue, so I'm sharing my experience.
First of all, please verify a couple of things.
Specify the user account which will be used to invoke the job, and ensure that the account has sufficient permissions.
Also, don't forget to un-check the relevant checkbox under the Conditions tab.
Just found this one - sorry it took so long :D
But I had this totally nerve-wracking issue too.
The solution for me was to realize that the Task Scheduler is a very deep part of the OS.
That's why I had to grant access, for the computer account (COMPUTERNAME$), on the file or folder containing the file to run.
Right-click on the file or folder -> Security. Select Edit, add [Name of your computer]$, and give it Read and Execute permissions.
That's the only way I could make it run.
But I hope you found the solution in the meantime :)

Unable to execute an .exe using PowerShell

I've looked through the forum for an hour now and tried everything I've found here, but I still can't manage to run my .exe using a PowerShell script.
Please forgive my ignorance; I'm very new to PowerShell.
Basically, my script is meant to monitor files loaded daily.
For this, I need to list the .txt files in my working directory, which I managed to do.
My issue is that when files arrive in my working folder, they have non-descriptive names, so I can't figure out which business they relate to.
There is a RenameFile1.exe executable that renames files according to some data codes inside my .txt files, from something like "Inf320638.txt" to something like "lot_RUHPEG_296_320638", and that is exactly the .exe I would like to run using PowerShell (I didn't code it, and I don't know how it works, just that when I run it manually it renames my files just fine).
I've tried the two command lines below, but when I look at my files, they are actually not renamed.
1. &".\INFOCENTRE\LOTS\RenameFile1.exe
=> When I check my file name it's still like "Inf320638.txt".
2. Start-Process ".\INFOCENTRE\LOTS\RenameFile1.exe" => A command prompt shows up for an instant, but when I check my file name it's still "Inf320638.txt".
Any help would be highly appreciated,
Brgs,
Thomas.
Try to run the executable with the following arguments:
Start-Process .\INFOCENTRE\LOTS\RenameFile1.exe -WorkingDirectory .\INFOCENTRE\LOTS -Wait
It might be that the working directory is what screws it up. The -Wait switch makes PowerShell wait for the program to finish; you could omit it.

Unable to print PDFs or office documents via Scheduled task

I have a scheduled task set to run on a machine overnight. This task will iterate through a folder printing all the files found therein. I can run the process without issue while logged in however it does not print when run via the scheduled task.
More Info:
The scheduled task executes a powershell script that performs multiple functions such as generating and emailing reports, copying files across network folders and finally printing the contents of a folder. All of these tasks are performed without error if the executing account is currently logged in. If the account is not logged in and run via a scheduled task everything except the printing of Office and PDF documents works correctly (Text documents print fine).
Here is the function I am using to print the documents.
Function Print-File($file)
{
    begin
    {
        function internal-printfile($thefile)
        {
            if ($thefile -is [string])
            {
                $filename = $thefile
            }
            else
            {
                if ($thefile.FullName -is [string])
                {
                    $filename = $thefile.FullName
                }
            }
            $start = New-Object System.Diagnostics.ProcessStartInfo $filename
            $start.Verb = "print"
            [System.Diagnostics.Process]::Start($start)
        }
        if ($file -ne $null)
        {
            $filespecified = $true
            internal-printfile $file
        }
    }
    process
    {
        if (!$filespecified)
        {
            $test = Write-Host process; internal-printfile $_
        }
    }
}
When running from a scheduled task I can see the process start (WinWord or AcroRd32), since I am dumping output to a text file, but I do not see anything print. One other difference I noticed is that when I use this function while logged in, the applications other than Adobe Reader (the Office apps) start, print the document, then close. However, when run from a scheduled task, the applications do not close on their own.
I would appreciate any feedback, suggestions, or pointers at this point, as I have hit a wall as far as knowing what else I can check. I would also take suggestions for an alternative way to accomplish the printing of the files. (NOTE: I cannot predict the file types in the folder.)
NOTE: These symptoms are present on two machines, Windows Server 2008 and Windows 7, both running Office 2007 and Adobe Reader 10.1.7.
I'm trying to do the same thing that you are attempting. I'm pretty sure what you're running into is session 0 isolation. You can read more about it at this MSDN site and this Windows blog post.
I haven't tried the suggestions in the following answer to another question on SO, but it might be worth a try.
Creating Desktop-Folders for session 0
Here is another guy who is trying to print without having a user logged into the machine. There is an answer from someone who claims to know how to do what we're all trying to do, but he doesn't actually post the answer.
Too late for the OP, but for future readers... I had the same problem, but with a Windows shell .bat file, not PowerShell. As a scheduled task, the script would launch AcroRd32.exe /t, but it wouldn't print anything. Then, after a delay, Acrobat was stopped and the file was moved to the "Printed" folder as if everything were fine. It printed fine standalone, just not as a scheduled task.
(Background: I'm running Windows 10 x86 on one older computer so that we can use our two bulletproof HP LaserJet 1000 printers. However, the program we used for this in Win 7, batchdocprint, is incompatible with Win 10 and the company is gone. Due to having to learn arcane syntax and workarounds, I've spent way more money in hours getting a few lines of code (below) working right than the program cost, but I couldn't find a suitable replacement. The programs that I found either printed incorrectly, or had options for only one printer.)
The problem for me did seem to be Session 0 isolation blocking GDI+. I went with the seemingly "spammy" suggestion of getting Foxit Reader. It worked, and like Acrobat, the reader is free. I just replaced the path to AcroRd32.exe with the path to FoxitReader.exe
I don't think this will ever be possible with Acrobat Reader. The CLI is not officially supported, so the likelihood of Adobe ever changing it to print without launching the GUI is minimal.
As far as other file types, it depends on what you're using to print them, and whether it can open and print without a GUI. I haven't decided whether to implement this for other common file types so that we can just drag-and-drop, or to keep forcing the users to use the Acrobat PDF printers that are set up to save PDFs in the hot folders. Right now, it's the latter.
My code, for reference, to hopefully save someone else my headache. I only changed/shortened names, and removed duplicate code for the second folder/printer. Note that taskkill requires administrative privileges. Also, you probably need to have a folder named "Printed" in your hot folder, since I don't check for its existence:
@ECHO OFF
REM Monitors folders for PDF's and prints.
REM Use PING for delay - no built-in sleep functionality.
REM Using START backgrounds the process so the script can move on.
:LOOP
cd C:\Hot Folder\
for %%a in ("*.pdf") do (
start "" "C:\Path\To\FoxitReader.exe" /t "C:\Hot Folder\%%a" "HP 1000"
ping 1.1.1.1 -n 2 -w 5000 >NUL
taskkill /IM FoxitReader.exe /F
move /Y "%%a" ".\Printed\%%a")
ping 1.1.1.1 -n 2 -w 5000 >NUL
goto LOOP
Not sure if you ever found the solution to this, but it turns out that the printer used by a Task Scheduler job should be registered under:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Print\Printers (local printers)
vs
HKEY_CURRENT_USER\Printers\Connections (session printers)

Starting XAMPP with batch only if it isn't already running

I am writing a batch file for a new deployment of my company's software. Here is what I have so far...
wscript.exe "invisible.vbs" "apache_start.bat" /wait
wscript.exe "invisible.vbs" "mysql_start.bat" /wait
"C:\Program Files\Internet Explorer\iexplore.exe" http://localhost
So as you can see, this script should start apache, then start mysql and then open the default page with IE.
The problem is that if the user runs this script twice, it starts Apache and MySQL twice and loads two separate instances. The solution I need is a way to check whether the processes are already running and, if not, run the two wscript commands. I am absolutely horrible with shell, so please try to give specific responses! I am a software engineer, not a sysadmin. Thanks for the help!
As a software engineer I think you have a leg up on scripting over some sysadmins...
Using PowerShell would make this easy. Use the following template to execute the services - you'll need to use it twice, and follow up with launching IE as above.
If (Get-Process mysqlprocessname -ErrorAction SilentlyContinue) { Write-Host "Skipping MySQL" }
Else { Start-Process ... }
It will take a few minutes of research to find the best way of starting a process with PowerShell. Also, you might want to pipe Start-Process to Out-Null so the script waits before starting IE and Apache.
Others may want to chime in with a simpler way from a batch file.
For XAMPP, there is a pv.exe file in the apache/bin folder that XAMPP uses to see if a service is running. Look at WorldDrknss' answer in this thread for some great info: http://www.apachefriends.org/f/viewtopic.php?p=80047
The code to solve your problem is to modify your mysql_start.bat file to this:
@echo off
apache\bin\pv mysqld.exe %1 >nul
if ERRORLEVEL 1 goto Process_NotFound
echo MySQL is running
goto END
:Process_NotFound
echo Process %1 is not running
mysql\bin\mysqld.exe --defaults-file=mysql\bin\my.ini --standalone --console
goto finish
:finish
That will check if mysqld.exe is running. If it is, it just echos that out. If not, it starts the service.
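A language-agnostic alternative to checking the process list is a single-instance lock: binding a fixed localhost port fails if another copy of the script already holds it. A Python sketch of that idea (the port number is arbitrary, assumed otherwise unused):

```python
import socket

def acquire_single_instance_lock(port=47211):
    """Try to bind a localhost TCP port as a crude single-instance lock.

    Returns the bound socket if we got the lock (keep it alive for the
    life of the script), or None if another instance already holds it.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.bind(("127.0.0.1", port))
        return sock   # We are the only instance; safe to start Apache/MySQL.
    except OSError:
        sock.close()
        return None   # Another instance is running; skip the startup steps.
```

The advantage over process-name checks is that it doesn't depend on how the server executables are named or spawned.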

How can I pause Perl processing without hard-coding the duration?

I have a Perl script that contains this code snippet, which calls the system shell to get some files by SFTP and unzip them with WinZip:
# Run script to get files from remote server
system "exec_SFTP.vbs";
# Unzip any files that were retrieved
foreach $zipFile (<*.zip>) {
system "wzunzip $zipFile";
}
Even if some files are retrieved, they are never unzipped, because by the time the files are retrieved and the SFTP connection is closed, the Perl script has already completed the unzip step, with the result that it doesn't find anything to unzip.
My short-term fix is to insert
sleep(60);
before the unzip step, but that assumes that the SFTP connection will finish within 60 seconds, which may sometimes be a gross over-estimate, and other times an under-estimate.
Is there a more sound way to cause Perl to pause until the SFTP connection is closed before proceeding with the unzip step?
Edit: Responders have questioned (and reasonably so) the use of a VB script rather than having Perl do the file transfer. It has to do with security -- the VB script is maintained by others and is authorized to do the SFTP.
Check the code in your *.vbs file. The system function waits for the child process to finish before execution continues. It appears that your *.vbs file is forking a background task to do the FTP and returning immediately.
In a perfect world your script would be rewritten to use Net::SFTP::Foreign and Archive::Extract.
An ugly, quick-hackish way would be to create a touch-file before your first system call, alter your SFTP-fetching script to delete the file once it is done, and have a while loop like so:
while(-e 'touch.file') {
sleep 5;
}
# foreach [...]
Of course, you would need to take care in case your .vbs fails and leaves the touch-file undeleted, among many other bad side effects. This would be a quick solution (if none of the other suggestions work) until you get the time to rewrite without system() calls.
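The touch-file polling loop translates directly to other languages; a Python sketch of the same idea, with a timeout guard for the failure case the answer warns about (names are illustrative):

```python
import os
import time

def wait_for_flag_removal(path, poll_seconds=5, timeout_seconds=600):
    """Block until the sentinel file at `path` disappears.

    Raises TimeoutError if the file is still there after `timeout_seconds`,
    covering the case where the fetching script died without cleaning up.
    """
    deadline = time.monotonic() + timeout_seconds
    while os.path.exists(path):
        if time.monotonic() >= deadline:
            raise TimeoutError(f"{path} still present after {timeout_seconds}s")
        time.sleep(poll_seconds)
```

The timeout bound turns the "touch-file left behind forever" failure mode into an explicit error instead of an infinite loop.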
You need a way for Perl to wait until the SFTP transfer is done, but as your script is currently written, Perl has no way of knowing this. (It looks like you're combining at least two scripting languages and a (GUI?) SFTP client; this can work, but it's not exactly reliable or robust. Why use VBscript to start the SFTP transfer?)
I can think of four options:
Your Perl script could do the SFTP transfer itself, using something like CPAN's Net::SFTP module, rather than spawning an external job whose status it cannot track.
Your Perl script could spawn a command-line SFTP utility (like PSFTP) that doesn't return until the transfer is done.
Your exec_SFTP.vbs script could be changed to not return until the transfer is done.
If you're currently using a graphical SFTP client and can't switch for whatever reason, I'd recommend using a scripting language like AutoIt instead of Perl. AutoIt has features to wait for windows to change state and so on, so it could more easily monitor for an activity's completion.
Options 1 or 2 would be the most robust and reliable.
The best I can suggest is modifying exec_SFTP.vbs to exit only after the file transfer is complete. system waits for the program it called to complete, so that should solve your problem:
system LIST
system PROGRAM LIST
Does exactly the same thing as "exec LIST", except
that a fork is done first, and the parent process
waits for the child process to complete.
If you can't modify the .vbs script to stay alive until the transfer completes, you may be able to track subprocess creation. If you can get subprocess IDs, you can monitor them and thereby know when the .vbs script's various offspring terminate.
Win32::Process::Info lets you get a subprocess ids from a running process.
Maybe this is a dumb question, but why not just use the Net::SFTP and Archive::Extract Perl modules to download and unzip the files?
system will not return until the shell it's running the command in has returned; this may be wrong for launching graphical programs and file associations.
See if any of the following helps:
system('cscript exec_SFTP.vbs');
use Win32::Process;
use Win32;
Win32::Process::Create(my $proc, 'wscript.exe',
'wscript exec_SFTP.vbs', 0, NORMAL_PRIORITY_CLASS, '.');
$proc->Wait(INFINITE);
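For comparison, Python's subprocess.run also blocks until the child process exits, which is the behavior the Win32::Process example above enforces explicitly with Wait(INFINITE); a minimal sketch using a stand-in child instead of the real .vbs:

```python
import subprocess
import sys

# subprocess.run blocks until the child exits, so any code after this
# line only runs once the child (here: a trivial stand-in script) is done.
result = subprocess.run(
    [sys.executable, "-c", "print('transfer complete')"],
    capture_output=True,
    text=True,
)
```

Note the same caveat applies as with Perl's system: if the child itself forks a background task and returns immediately, blocking on the child does not help.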
Have a look at IPC::Open3
IPC::Open3 - open a process for reading, writing, and error handling using open3()