PowerShell: pipe into an exe and wait

I am piping an array of data into an executable program, but I need it to block after every call in the foreach loop. As written, the script falls through the loop before the program from the first call has even opened.
Set-Alias program "whatever.exe"
foreach ($data in $all_data)
{
    $data | %{ program /command:update /path:"$_" /closeonend:2 }
}

I like PowerShell, but I never really learned Invoke-Command, so whenever I need to run an EXE I use cmd. If you type cmd /? you get its help; look at the /c switch. I'd do something like this:
foreach ($data in $all_data) {
    $data |
        Foreach-Object {
            cmd /c "whatever.exe" /command:update /path:"$_" /closeonend:2
        }
}
If you don't like the cmd /c approach, you could use jobs instead.
foreach ($data in $all_data) {
    $data |
        Foreach-Object {
            $job = Start-Job -InitializationScript { Set-Alias program "whatever.exe" } -ScriptBlock { program /command:update /path:"$($args[0])" /closeonend:2 } -ArgumentList $_
            while ($job.State -eq 'Running') {
                Start-Sleep -Seconds 3
                # Could make it more robust and add some error checking.
            }
        }
}

I can think of two ways to tackle this:
pipe your executable call to Out-Null
shell out the call to cmd.exe /c (as shown in @BobLobLaw's answer)
I made your sample code a little more specific so I could run and test my solutions; hopefully it'll translate. Here's what I started with as an equivalent of your sample code, i.e. the script runs through the loop without waiting for the executable to finish.
# I picked a specific program
Set-Alias program "notepad.exe"
# And put some values in $all_data, specifically the paths to three text files.
$all_data = Get-Item B:\matt\Documents\*.txt
# This opens each file in notepad; three instances of notepad are running
# when the script finishes executing.
$all_data | %{ program "$_" }
Here's the same code as above, but piping to Out-Null forces the script to wait on each iteration of the loop.
# I picked a specific program
Set-Alias program "notepad.exe"
# And put some values in $all_data, specifically the paths to three text files.
$all_data = Get-Item B:\matt\Documents\*.txt
# Piping the executable call to out-null forces the script execution to wait
# for the program to complete. So in this example, the first document opens
# in notepad, but the second won't open until the first one is closed, and so on.
$all_data | %{ program "$_" | Out-Null}
And, lastly, the same code (more or less) using cmd /c to call the executable and make the script wait.
# Still using notepad, but I couldn't work out the correct call for
# cmd.exe using Set-Alias. We can do something similar by putting
# the program name in a plain old variable, though.
#Set-Alias program "notepad.exe"
$program = "notepad.exe"
# Put some values in $all_data, specifically the paths to three text files.
$all_data = Get-Item B:\matt\Documents\*.txt
# This forces script execution to wait until the call to $program
# completes. Again, the first document opens in notepad, but the second
# won't open until the first one is closed, and so on.
$all_data | %{ cmd /c $program "$_" }

Depending on your scenario, Wait-Job might be overkill. If you have a programmatic way to know that whatever.exe has finished, you could try something like
do { Start-Sleep -Seconds 2 } until ($done -eq $true)
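For reference, here is a minimal sketch of what the Wait-Job route would look like (reusing the Start-Job call from the jobs answer above); Wait-Job blocks until the job finishes, so no sleep loop is needed:
$job = Start-Job -InitializationScript { Set-Alias program "whatever.exe" } `
                 -ScriptBlock { program /command:update /path:"$($args[0])" /closeonend:2 } `
                 -ArgumentList $_
Wait-Job $job | Out-Null   # blocks until whatever.exe has exited
Receive-Job $job           # surfaces any output or errors from the run
Remove-Job $job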

Related

How to capture external command progress in PowerShell?

I'm using a PowerShell script to synchronize files between network directories. Robocopy is running in the background.
To capture the output and give statistics to the user, currently I'm doing something like:
$out = (robocopy $src $dst $options)
Once that is done, a custom Windows form is presented with a multi-line text box containing the output string.
However, doing it this way halts script execution until the file copy is done. Since all the other input screens are presented to the user as graphical dialogues, I would like to give the user progress output in a graphical way as well.
Is there a way to capture the stdout from robocopy on the fly?
Then the next question would be:
How to pipe that output into a form with a text box?
You can run the robocopy job in the background and keep checking on the progress.
Start-Job will start the process in the background
Receive-Job will give you all the data that has been printed so far.
$job = Start-Job -ScriptBlock { robocopy $using:src $using:dst $using:options }
$out = ""
while (($job | Get-Job).HasMoreData -or ($job | Get-Job).State -eq "Running") {
    $out += (Receive-Job $job)
    Write-Output $out
    Start-Sleep 1
}
Remove-Job $job
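To get that output into a form with a text box, as the second part of the question asks, here is a rough sketch (assumptions: Windows PowerShell with WinForms available, and $src, $dst and $options are the question's variables). It appends each chunk returned by Receive-Job to a multi-line TextBox while the job runs:
Add-Type -AssemblyName System.Windows.Forms

$form = New-Object System.Windows.Forms.Form
$form.Text = 'Robocopy progress'
$box = New-Object System.Windows.Forms.TextBox
$box.Multiline  = $true
$box.ScrollBars = 'Vertical'
$box.Dock       = 'Fill'
$form.Controls.Add($box)
$form.Show()

$job = Start-Job -ScriptBlock { robocopy $using:src $using:dst $using:options }

while (($job | Get-Job).HasMoreData -or ($job | Get-Job).State -eq 'Running') {
    $chunk = Receive-Job $job
    if ($chunk) {
        # Append only the new lines so the box grows as robocopy reports progress.
        $box.AppendText(($chunk -join [Environment]::NewLine) + [Environment]::NewLine)
    }
    [System.Windows.Forms.Application]::DoEvents()   # keep the form responsive
    Start-Sleep -Milliseconds 500
}
Remove-Job $job
DoEvents is a blunt way to keep the UI responsive; a WinForms Timer on a proper event loop would be cleaner, but it keeps the sketch short.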

Exit PowerShell script in controlled fashion using keyboard shortcut

I have a PowerShell script that performs numerous file management tasks, and occasionally I need to terminate the script before it has finished processing. In order to terminate the script cleanly and not leave any files dotted around, I have the script read in a variable from a config file on every iteration of the foreach over the gci results, as below:
Get-ChildItem $fileDir -Filter *.doc | Foreach-Object {
    Get-Content $confFile | Foreach-Object {
        $var = $_.Split('=')
        New-Variable -Name $var[0] -Value $var[1] -Force
    }
    if ($process -eq "TRUE") {
        <process File operations>
    }
}
This lets me change the value of process in the config file to anything other than TRUE, and the script will skip the processing (although it still loops through until complete).
Is there any way to use a keyboard shortcut to exit the script in a controlled fashion, i.e. after an iteration of the foreach loop has completed? E.g. press Ctrl+Q and it exits cleanly, as opposed to Ctrl+C.
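One possibility, sketched under a couple of assumptions (a regular console host, since the ISE does not expose [Console]::KeyAvailable, and a foreach statement rather than Foreach-Object so that break leaves the loop cleanly): poll the keyboard after each file and stop once Ctrl+Q has been pressed.
foreach ($file in Get-ChildItem $fileDir -Filter *.doc) {
    # ... existing config-file read and per-file processing for $file ...

    # After each file, see whether a key is waiting and read it without echoing it.
    if ([Console]::KeyAvailable) {
        $key = [Console]::ReadKey($true)
        if ($key.Key -eq [ConsoleKey]::Q -and ($key.Modifiers -band [ConsoleModifiers]::Control)) {
            Write-Host 'Ctrl+Q detected - exiting after the current file.'
            break
        }
    }
}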

Use PowerShell's type -wait with select-string to Real-Time Monitor an Application Log for a Condition and Execute an Action (like tail -f or watch)

I am trying to use PowerShell's type -wait command to monitor a log file in real time on Windows in the same way that I would use tail -f on Linux, and I want to pipe the file to another command - select-string, like grep - that will trigger an action based on a condition. On Linux, I would use watch to accomplish what I'm trying to do.
In the following example, I am trying to print the phrase "You caught a critter!" whenever a new line, which contains the string "status=Caught", is written to the log file.
while (1) {if (type -wait "$env:USERPROFILE\AppData\Roaming\CritterCatcherApp\Logs\CritterBag-170118.log" | select-string -quiet "status=Caught") {write-host "You caught a critter!"} else {continue}}
Please note that the -quiet argument for the select-string command returns True. Actually, in this example, it only returns True one time, when I first run the command, as the log file has existing lines that contain the string "status=Caught".
I do need to somehow skip those existing matches (or, e.g., rotate the log files), but right now the issue is that the PowerShell script needs to keep tailing the file and keep evaluating whether each new line written to it contains the given string. Then, if the string is present, it should execute an arbitrary action, like printing a line; otherwise, keep listening for the string.
The following post indicates that the -wait argument is an option for the Get-Content cmdlet: How to monitor a windows log file in real time?. I am not sure if I should expect -wait to work with a foreach loop, as described in this post: In Powershell can i use the select-string -quiet switch within a foreach statement.
How can I modify the above PowerShell script to tail a log file and execute an action for each new line with a given string, and continue to tail the file and evaluate new lines as they are written?
You can do it on the pipeline with Where-Object (?) and ForEach-Object (%):
Get-Content file -wait | ? { $_ -match "status=Caught" } | % { Write-Host "You caught a critter!" }
Each time status=Caught is detected in the file, the Write-Host in the ForEach-Object will execute.
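If you also want to ignore the matches that are already in the file when you start (the "skip the existing strings" part), one option is to begin reading at the end of the file. A sketch, assuming PowerShell 3.0 or later for -Tail:
# -Tail 0 starts at the current end of the log, so only lines appended after
# the command starts are evaluated and can trigger the message.
Get-Content "$env:USERPROFILE\AppData\Roaming\CritterCatcherApp\Logs\CritterBag-170118.log" -Wait -Tail 0 |
    ? { $_ -match "status=Caught" } |
    % { Write-Host "You caught a critter!" }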

How PowerShell handles returns from & calls

We have field devices, and we decided to use a PowerShell script to help us handle updates in the future. It runs every 5 minutes and executes rsync to see whether it should download any new files. If the script sees any files of type .ps1, .exe, .bat, etc., it attempts to execute them using the & operator. At the conclusion of execution, the script writes the executed file's name to an excludes file (so that rsync will not download it again) and removes the file. My problem is that the return from the executed code (called by &) behaves differently depending on how the main script is called.
This is the main 'guts' of the script:
Get-ChildItem -Path $ScriptDir\Installs\* -Include @("*.ps1","*.exe","*.cmd","*.VBS","*.MSI") | ForEach {
    Write-Verbose "Executing: $_"
    & $_
    $CommandName = Split-Path -Leaf $_
    Write-Verbose "Adding $CommandName to rsync excludes"
    Write-Output "$CommandName" | Out-File -FilePath $ScriptDir\excludes -Append -Encoding ASCII
    Write-Verbose "Deleting '$_'"
    Remove-Item $_
}
When invoking powershell (powershell.exe -ExecutionPolicy bypass) and then executing the script (.\Update.ps1 -Verbose), the script runs perfectly (i.e. the file is written to excludes and deleted) and you can see the verbose output (writing and deleting).
If you run it the way Task Scheduler would, powershell.exe -ExecutionPolicy bypass -NoProfile -File "C:\Update.ps1" -Verbose, you can see the new script get executed, but none of the steps afterwards run (i.e. no adding to excludes, no removing of the file, and no verbose output).
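No accepted fix is quoted here, but purely as an illustration (an assumption on my part, not a diagnosis of the -File behaviour): if the goal is to make sure the excludes/Remove-Item bookkeeping always runs no matter what a downloaded .ps1 does, one option is to run each script in its own powershell.exe process and wait for it.
# Hedged sketch: each downloaded .ps1 runs in a child process, so nothing it
# does can stop the parent script from appending to excludes and deleting it.
Get-ChildItem -Path $ScriptDir\Installs\*.ps1 | ForEach-Object {
    Start-Process -FilePath 'powershell.exe' `
        -ArgumentList '-ExecutionPolicy', 'Bypass', '-File', "`"$($_.FullName)`"" `
        -Wait
    Add-Content -Path "$ScriptDir\excludes" -Value $_.Name -Encoding ASCII
    Remove-Item -Path $_.FullName
}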

LogParser and PowerShell with multiple log files in a Foreach-Object loop

So I'm trying to write a PowerShell script that will go through a folder full of .evtx files, send each one out via syslog, then append ".done" to the filename of the .evtx file after doing so.
The thing is, I'm not quite sure how to reference the current log file within the Foreach-Object loop.
Hopefully the following code will explain my dilemma.
# begin foreach loop
Get-ChildItem $evtxfolder -Filter *.evtx |
    Foreach-Object {
        $LPARGS = ("-i:evt", "-o:syslog", "SELECT STRCAT(' evt-Time: ', TO_STRING(TimeGenerated, 'dd/MM/yyyy, hh:mm:ss')),EventID,SourceName,ComputerName,Message INTO $SERVER FROM $CURRENTOBJECT") # obviously, this won't work
        $LOGPARSER = "C:\Program Files (x86)\Logparser 2.2\logparser.exe"
        $LP = Start-Process -FilePath $LOGPARSER -ArgumentList $LPARGS -Wait -Passthru -NoNewWindow
        $LP.WaitForExit() # wait for logs to finish
    }
If you look in $LPARGS, you'll see that I put $SERVER and $CURRENTOBJECT; obviously, the way I have it now will not work. Basically, I'm trying to put the variable $SERVER (passed in as a parameter) into the arguments for logparser, and reference whichever event log the loop is currently working on in the FROM statement so that it works on one .evtx file at a time. What would be the proper way to do this?
An example of the INTO FROM statement:
..snippet..
SourceName,ComputerName,Message INTO #192.168.56.30 FROM 'C:\Eventlogs\20131125.evtx'"
Of course, 'C:\Eventlogs\20131125.evtx' would change as it goes through the contents of the directory.
If $server is defined outside the snippet above, it will be available inside your string for $LPARGS. As for $CURRENTOBJECT, that would be $_; in this case it will be a FileInfo object. It is likely you want the Name property, e.g. $($_.Name).
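Putting that together, a sketch of the loop (assumptions on my part: $server holds the syslog target passed in as a parameter, FullName is used so the FROM clause gets a full path, and the query is wrapped in embedded quotes so Start-Process passes it to logparser.exe as a single argument):
Get-ChildItem $evtxfolder -Filter *.evtx | ForEach-Object {
    $logparser = "C:\Program Files (x86)\Logparser 2.2\logparser.exe"
    $query     = "SELECT STRCAT(' evt-Time: ', TO_STRING(TimeGenerated, 'dd/MM/yyyy, hh:mm:ss')),EventID,SourceName,ComputerName,Message INTO $server FROM '$($_.FullName)'"
    $lpArgs    = @("-i:evt", "-o:syslog", "`"$query`"")
    Start-Process -FilePath $logparser -ArgumentList $lpArgs -Wait -NoNewWindow
}
Renaming the finished file afterwards is then one more line inside the loop, e.g. Rename-Item $_.FullName "$($_.Name).done".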