How to tail a log file while a process is running? - powershell

I'm running an exe from a PowerShell script. This executable writes its logs to a log file. I would like to continuously read and forward the logs from this file to the console while the executable is running.
Currently, I'm starting the exe like this:
$UNITY_JOB = Start-Job `
    -ScriptBlock { & "C:\Program Files\Unity\Hub\Editor\2019.2.11f1\Editor\Unity.exe" $args | Out-Null } `
    -ArgumentList $UNITY_ARGS
If I just do Get-Content $LOG_PATH -Wait at this point, I cannot detect when the exe terminates and the script blocks indefinitely.
If I start a second job for the logs, the output is not sent to the console:
$LOG_JOB = Start-Job -ScriptBlock { Get-Content $LOG_PATH -Wait }
(I need "real time" output, so I don't think Receive-Job would work)

I'd use a loop which ends when the job's status is Completed:
# Just to mock the execution
$extProgram = Start-Job -ScriptBlock { Start-Sleep -Seconds 30}
$file = 'C:\path\to\file.txt'
do {
cls
Get-Content $file -Tail $host.ui.RawUI.WindowSize.Height
Start-Sleep -Seconds 5 # Set any interval you need
} until ((Get-Job -Id $extProgram.id).State -eq "Completed")
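As a variation (my own sketch, not part of the original answer), you can stream only the lines added since the last poll instead of redrawing the screen, still exiting when the job finishes; $UNITY_JOB and $LOG_PATH are assumed to be the variables from the question:
$linesRead = 0
do {
    if (Test-Path $LOG_PATH) {
        $content = @(Get-Content $LOG_PATH)            # snapshot of the file so far
        if ($content.Count -gt $linesRead) {
            $content[$linesRead..($content.Count - 1)] # emit only the new lines
            $linesRead = $content.Count
        }
    }
    Start-Sleep -Seconds 1
} until ((Get-Job -Id $UNITY_JOB.Id).State -ne 'Running')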

Related

Powershell: Call Debug Analyzer cdb.exe as Process

I need to call cdb.exe as a process so that I can kill it after a few seconds. Some dumps cannot be analyzed, so I then have to make another call.
Here you can see my code, but it doesn't work: cdb.exe is not started correctly and I am not getting the output file.
Do you have any advice for me?
The call "before" implementing the process part does start cdb.exe.
$maximumRuntimeSeconds = 3
$path = "C:\Program Files (x86)\Windows Kits\10\Debuggers\x86\cdb.exe"
$process = Start-Process -FilePath $path "-z $unzippedFile.FullName, -c `".symfix;.reload;!analyze -v; q`""
try {
$process | Wait-Process -Timeout $maximumRuntimeSeconds -ErrorAction Stop > $outputFile
Write-Warning -Message 'Process successfully completed within timeout.'
}
catch {
Write-Warning -Message 'Process exceeded timeout, will be killed now.'
$process | Stop-Process -Force
}
# call before implementing Process
& "C:\Program Files (x86)\Windows Kits\10\Debuggers\x86\cdb.exe" -z $unzippedFile.FullName -c ".symfix;.reload;!analyze -v; q" > $outputFile
-PassThru was needed to make Wait-Process work.
I think you also need to look at how the double-quoted string is expanding. I think $unzippedFile.FullName might be adding a literal ".FullName" at the end of the actual full name of the zip file. I don't have your environment, but the rudimentary tests I've done show that. Try wrapping it in a subexpression like:
"-z $($unzippedFile.FullName), -c `".symfix;.reload;!analyze -v; q`""
Let me know how that goes. Thanks.
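For reference, here is a quick illustration of that expansion behaviour, using a hypothetical file object since I don't have your dump:
$unzippedFile = Get-Item "$env:windir\notepad.exe"   # stand-in object with a FullName property
"without subexpression: $unzippedFile.FullName"      # expands $unzippedFile only, then appends a literal ".FullName"
"with subexpression:    $($unzippedFile.FullName)"   # expands the whole property access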
C:\>dir /b ok.txt
File Not Found
C:\>type dodump.ps1
$path = "C:\Program Files\Windows Kits\10\Debuggers\x86\cdb.exe"
$process = Start-Process -PassThru -FilePath $path -ArgumentList "-z `"C:\calc.DMP`"" ,
"-c `".symfix;.reload;!analyze -v;q`"" -RedirectStandardOutput c:\\ok.txt
try {
$process | Wait-Process -Timeout 100 -ErrorAction Stop
Write-Host "Process finished within timeout"
}catch {
$process | Stop-Process
Write-Host "process killed"
}
Get-Content C:\ok.txt |Measure-Object -Line
C:\>powershell -f dodump.ps1
Process finished within timeout
Lines Words Characters Property
----- ----- ---------- --------
  139

Calling other PowerShell scripts within a PowerShell script

I'm trying to get one master PowerShell script to run all of the others while waiting 30-60 seconds to ensure that each task is completed. Everything else I tried wouldn't stop and wait for the first script and its processes to complete; it would launch all the others at the same time and then trigger the restart automatically.
Main script, run as admin:
$LogStart = 'Log '
$LogDate = Get-Date -Format "dd-MM-yyy-hh-mm-ss"
$FileName = $LogStart + $LogDate + '.txt.'
$scriptList = @(
'C:\Scripts\1-OneDriveUninstall.ps1'
'C:\Scripts\2-ComputerRename.ps1'
);
Start-Transcript -Path "C:\Scripts\$FileName"
foreach ($script in $scriptList) {
Start-Process -FilePath "$PSHOME\powershell.exe" -ArgumentList "-Command '& $script'"
Write-Output "The $script is running."
Start-Sleep -Seconds 30
}
Write-Output "Scripts have completed. Computer will restart in 10 seconds."
Start-Sleep -Seconds 10
Stop-Transcript
C:\Scripts\3-Restart.ps1
1-OneDriveUninstall.ps1:
Set-ItemProperty -Path REGISTRY::HKEY_LOCAL_MACHINE\Software\Microsoft\windows\CurrentVersion\Policies\System -Name ConsentPromptBehaviorAdmin -Value 0
taskkill /f /im OneDrive.exe
C:\Windows\SysWOW64\OneDriveSetup.exe /uninstall
2-ComputerRename.ps1:
$computername = Get-Content env:computername
$servicetag = Get-WmiObject Win32_Bios |
Select-Object -ExpandProperty SerialNumber
if ($computername -ne $servicetag) {
Write-Host "Renaming computer to $servicetag..."
Rename-Computer -NewName $servicetag
} else {
Write-Host "Computer name is already set to service tag."
}
The log file shows:
Transcript started, output file is C:\Scripts\Log 13-09-2019-04-28-47.txt.
The C:\Scripts\1-OneDriveUninstall.ps1 is running.
The C:\Scripts\2-ComputerRename.ps1 is running.
Scripts have completed. Computer will restart in 10 seconds.
Windows PowerShell transcript end
End time: 20190913162957
They aren't running correctly at all though. They run fine individually but not when put into one master script.
PowerShell can run PowerShell scripts from other PowerShell scripts directly. The only time you need Start-Process for that is when you want to run the called script with elevated privileges (which isn't necessary here, since your parent script is already running elevated).
This should suffice:
foreach ($script in $scriptList) {
& $script
}
The above code will run the scripts sequentially (i.e. start the next script only after the previous one terminated). If you want to run the scripts in parallel, the canonical way is to use background jobs:
$jobs = foreach ($script in $scriptList) {
Start-Job -ScriptBlock { & $using:script }
}
$jobs | Wait-Job | Receive-Job

Persistent PowerShell job

Is it possible to launch a PowerShell Job that persists after the creating script terminates? So far as I can tell, the Job is tied to the initiating script.
What I have now is this
$name = 'iexplore'
$waitTimeinMinutes = 30
Start-Job -argumentList @($name, $waitTimeinMinutes) -scriptblock {
param (
[string]$name,
[int]$waitTimeinMinutes
)
$cutoff = [DateTime]::Now.AddMinutes($waitTimeinMinutes)
while ([DateTime]::Now -lt $cutoff) {
if (Get-Process -name:$name -errorAction:silentlyContinue) {
Stop-Process -name:$name -force -errorAction:silentlyContinue
}
if (Get-Service $name -errorAction:silentlyContinue | Where-Object {$_.status -eq "running"}) {
Stop-Service $name -force -errorAction:silentlyContinue
}
Start-Sleep -s:10
}
}
It works as intended. For half an hour it watches for a process or service called iexplore and kills it if found. However, I need to continue processing in the calling script, and this job is killed when the calling script terminates. I presume there must be a way to initiate a job separate from the calling script?
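One approach (a sketch of my own, not from this thread) is to skip the job infrastructure and run the watcher in its own powershell.exe process, which survives after the calling script exits; $name and $waitTimeinMinutes are the same values as above:
$watcher = {
    param([string]$name, [int]$minutes)
    $cutoff = [DateTime]::Now.AddMinutes($minutes)
    while ([DateTime]::Now -lt $cutoff) {
        Stop-Process -Name $name -Force -ErrorAction SilentlyContinue
        Stop-Service $name -Force -ErrorAction SilentlyContinue
        Start-Sleep -Seconds 10
    }
}
# build a command string, encode it, and launch it in a detached hidden console
$command = "& {$watcher} '$name' $waitTimeinMinutes"
$encoded = [Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($command))
Start-Process powershell.exe -WindowStyle Hidden -ArgumentList '-NoProfile', '-EncodedCommand', $encoded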

Powershell: Run multiple jobs in parallel and view streaming results from background jobs

Overview
Looking to call a Powershell script that takes in an argument, runs each job in the background, and shows me the verbose output.
Problem I am running into
The script appears to run, but I want to verify this for sure by streaming the results of the background jobs as they are running.
Code
###StartServerUpdates.ps1 Script###
#get list of servers to update from text file and store in array
$servers=get-content c:\serverstoupdate.txt
#run all jobs, using multi-threading, in background
ForEach($server in $servers){
Start-Job -FilePath c:\cefcu_it\psscripts\PSPatch.ps1 -ArgumentList $server
}
#Wait for all jobs
Get-Job | Wait-Job
#Get all job results
Get-Job | Receive-Job
What I am currently seeing:
Id Name State HasMoreData Location Command
-- ---- ----- ----------- -------- -------
23 Job23 Running True localhost #patch server ...
25 Job25 Running True localhost #patch server ...
What I want to see:
Searching for approved updates ...
Update Found: Security Update for Windows Server 2003 (KB2807986)
Update Found: Windows Malicious Software Removal Tool - March 2013 (KB890830)
Download complete. Installing updates ...
The system must be rebooted to complete installation.
cscript exited on "myServer" with error code 3.
Reboot required...
Waiting for server to reboot (35)
Searching for approved updates ...
There are no updates to install.
cscript exited on "myServer" with error code 2.
Servername "myServer" is fully patched after 2 loops
I want to be able to see the output or store that somewhere so I can refer back to be sure the script ran and see which servers rebooted, etc.
Conclusion:
In the past, I ran the script and it went through updating the servers one at a time and gave me the output I wanted, but when I started doing more servers - this task took too long, which is why I am trying to use background jobs with "Start-Job".
Can anyone help me figure this out, please?
You may take a look at the module SplitPipeline.
It is specifically designed for such tasks. The working demo code is:
# import the module (not necessary in PS V3)
Import-Module SplitPipeline
# some servers (from 1 to 10 for the test)
$servers = 1..10
# process servers by parallel pipelines and output results immediately
$servers | Split-Pipeline {process{"processing server $_"; sleep 1}} -Load 1, 1
For your task replace "processing server $_"; sleep 1 (simulates a slow job) with a call to your script and use the variable $_ as input, the current server.
If each job is not processor intensive then increase the parameter Count (the default is processor count) in order to improve performance.
This is not a new question, but I feel it is missing an answer that uses PowerShell workflows and their parallel capabilities, available since PowerShell version 3. It is less code and perhaps easier to understand than starting and waiting for jobs, which of course works well too.
I have two files: TheScript.ps1 which coordinates the servers and BackgroundJob.ps1 which does some kind of check. They need to be in the same directory.
The Write-Output in the background job file writes to the same stream you see when starting TheScript.ps1.
TheScript.ps1:
workflow parallelCheckServer {
param ($Servers)
foreach -parallel($Server in $Servers)
{
Invoke-Expression -Command ".\BackgroundJob.ps1 -Server $Server"
}
}
parallelCheckServer -Servers @("host1.com", "host2.com", "host3.com")
Write-Output "Done with all servers."
BackgroundJob.ps1 (for example):
param (
[Parameter(Mandatory=$true)] [string] $server
)
Write-Host "[$server]`t Processing server $server"
Start-Sleep -Seconds 5
So when starting the TheScript.ps1 it will write "Processing server" 3 times but it will not wait for 15 seconds but instead 5 because they are run in parallel.
[host3.com] Processing server host3.com
[host2.com] Processing server host2.com
[host1.com] Processing server host1.com
Done with all servers.
In your ForEach loop you'll want to grab the output generated by the Jobs already running.
Example Not Tested
$sb = {
"Starting Job on $($args[0])"
#Do something
"$($args[0]) => Do something completed successfully"
"$($args[0]) => Now for something completely different"
"Ending Job on $($args[0])"
}
Foreach($computer in $computers){
Start-Job -ScriptBlock $sb -Args $computer | Out-Null
Get-Job | Receive-Job
}
Now if you do this, all your results will be mixed together. You might want to put a stamp on your verbose output to tell which job it came from.
Or
Foreach($computer in $computers){
Start-Job -ScriptBlock $sb -Args $computer | Out-Null
Get-Job | ? {$_.State -eq 'Completed' -and $_.HasMoreData} | % {Receive-Job $_}
}
while((Get-Job -State Running).count){
Get-Job | ? {$_.State -eq 'Completed' -and $_.HasMoreData} | % {Receive-Job $_}
start-sleep -seconds 1
}
It will show all the output as soon as a job is finished. Without being mixed up.
If you want to have multiple jobs in progress, you'll probably want to massage the output to keep straight on the console which output goes with which job.
$BGList = 'Black','Green','DarkBlue','DarkCyan','Red','DarkGreen'
$JobHash = @{};$ColorHash = @{};$i=0
ForEach($server in $servers)
{
Start-Job -FilePath c:\cefcu_it\psscripts\PSPatch.ps1 -ArgumentList $server |
foreach {
$ColorHash[$_.ID] = $BGList[$i++]
$JobHash[$_.ID] = $Server
}
}
While ((Get-Job).State -match 'Running')
{
foreach ($Job in Get-Job | where {$_.HasMoreData})
{
[System.Console]::BackgroundColor = $ColorHash[$Job.ID]
Write-Host $JobHash[$Job.ID] -ForegroundColor Black -BackgroundColor White
Receive-Job $Job
}
Start-Sleep -Seconds 5
}
[System.Console]::BackgroundColor = 'Black'
You can get the results by doing something like this after all the jobs have been received:
$array=@()
Get-Job -Name * | where{$array+=$_.ChildJobs.output}
.ChildJobs.output will have anything that was returned in each job.
function OutputJoblogs {
[CmdletBinding(DefaultParameterSetName='Name')]
Param
(
[Parameter(Mandatory=$true, Position=0)]
[System.Management.Automation.Job] $job,
[Parameter(Mandatory=$true, Position=1)]
[string] $logFolder,
[Parameter(Mandatory=$true, Position=2)]
[string] $logTimeStamp
)
#Output All logs
while ($job.State -eq "Running" -or $job.HasMoreData){
start-sleep -Seconds 1
foreach($remotejob in $job.ChildJobs){
if($remotejob.HasMoreData){
$output=(Receive-Job $remotejob)
if($output -gt 0){
$remotejob.location +": "+ (($output) | Tee-Object -Append -file ("$logFolder\$logTimeStamp."+$remotejob.Location+".txt"))
}
}
}
}
#Output Errors
foreach($remotejob in $job.ChildJobs){
if($remotejob.Error.Count -gt 0){$remotejob.location +": "}
foreach($myerr in $remotejob.Error){
$myerr 2>&1 | Tee-Object -Append -file ("$logFolder\$logTimeStamp."+$remotejob.Location+".ERROR.txt")
}
if($remotejob.JobStateInfo.Reason.ErrorRecord.Count -gt 0){$remotejob.location +": "}
foreach($myerr in $remotejob.JobStateInfo.Reason.ErrorRecord){
$myerr 2>&1 | Tee-Object -Append -file ("$logFolder\$logTimeStamp."+$remotejob.Location+".ERROR.txt")
}
}
}
#example of usage
$logfileDate="$((Get-Date).ToString('yyyy-MM-dd-HH.mm.ss'))"
$job = Invoke-Command -ComputerName "servername1","servername2" -ScriptBlock {
for ($i=1; $i -le 5; $i++) {
$i+"`n";
if($i -gt 2){
write-error "Bad thing happened"};
if($i -eq 4){
throw "Super Bad thing happened"
};
start-sleep -Seconds 1
}
} -asjob
OutputJoblogs -Job $job -logFolder "$PSScriptRoot\logs" -logTimeStamp $logfileDate

Powershell: get output from Receive-Job

I have a collection of jobs that are running. When they complete I use receive-job and it writes the output to the screen. I'd like to take that output and log it to a file. I don't want to tee the output produced while the jobs are running because with multiple jobs running at once the logging would be interspersed. Get-Job | Receive-Job prints the output in a nice organized manner.
I have tried all of the following and no output is written to the file or stored in a variable, it just goes to the screen:
#Wait for last job to complete
While (Get-Job -State "Running") {
Log "Running..." -v $info
Start-Sleep 10
}
Log ("Jobs Completed. Output:") -v $info
# Getting the information back from the jobs
foreach($job in Get-Job){
Receive-Job -Job $job | Out-File c:\Test.log -Append
$output = Receive-Job -Job $job
Log ("OUTPUT: "+$output)
Receive-Job -Job $job -OutVariable $foo
Log ("FOO: "+$foo)
}
EDIT:
I have removed the extra Receive-Job calls in the foreach to the following after seeing Keith's comment:
# Getting the information back from the jobs
foreach($job in Get-Job){
Receive-Job -Job $job -OutVariable temp
Write-Host ("Temp: "+$temp)
$temp | Out-File -FilePath c:\Test.log -Append
}
I also verified I'm not using Receive-Job anywhere else in the script. The write-host $temp and the out-file still produce no output though.
If the job uses Write-Host to produce output, Receive-Job returns $null, but the results get written to the host. However, if the job uses Write-Output to produce output in lieu of Write-Host, Receive-Job returns a string array [string[]] of the job output.
To demonstrate, enter this innocuous code at the PowerShell prompt:
$job = Start-Job -ScriptBlock {
[int] $counter = 0
while ($counter -lt 10) {
Write-Output "Counter = $counter."
Start-Sleep -Seconds 5
$counter++
}
}
Wait about 20-30 seconds for the job to produce some output, then enter this code:
$result = Receive-Job -Job $job
$result.Count
$result
$result | Get-Member
The $result object contains the strings produced by the job.
NOTE: After some experimentation I have found that using Write-Output, or just outputting a variable directly, is not a good solution either. If you have a function that uses either of those methods and also returns a value, the output will be concatenated with the return value!
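To illustrate what I mean with a minimal example of my own:
function Get-Answer {
    Write-Output "logging something"   # goes to the success stream...
    return 42                          # ...and so does the return value
}
$result = Get-Answer
$result   # @('logging something', 42) - the log line and the return value are mixed together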
The best way I have found to log output from jobs is to use Write-Host in combination with the Start-Transcript and Stop-Transcript cmdlets. There are some disadvantages, such as the ISE not supporting the transcript cmdlets. Also, I haven't found a way to write to the transcript without the output also going to the host (I wanted a robust log with a less verbose user experience), but it seems to be the best solution currently available. It is the only way I've been able to log concurrent jobs without interspersing the output and without having to write multiple temporary log files for each job and then combine them.
Below was my previous answer that I left just for documentation reasons:
The issue here was not getting output from Receive-Job. The output from the jobs I was expecting to see is being written in the job by write-host. When receive-job was called I saw all my output on the screen, but nothing was available to pipe elsewhere. It seems when I call receive-job all those write-host cmdlets execute, but there's nothing for me to pick up and pipe to out-file. The following script demonstrates this. You'll notice "Sleepy Time" appears in both the log file and the host, while "Slept" only appears in the host. So, rather than trying to pipe Receive-Job to write-host, I need to modify my Log function to not use write-host or to split the output as shown below.
cls
rm c:\jobs.log
$foo = @(1,2,3)
$jobs = @()
foreach($num in $foo){
$s = { get-process -name "explorer"
Start-Sleep $args[0]
$x = ("Sleepy Time: "+$args[0])
write-host $x
$x
write-host ("Slept: "+$args[0])
}
$id = [System.Guid]::NewGuid()
$jobs += $id
Start-Job -Name $id -ScriptBlock $s -args $num
}
While (Get-Job -State "Running") {
cls
Get-Job
Start-Sleep 1
}
cls
Get-Job
write-host "Jobs completed, getting output"
foreach($job in $jobs){
Receive-Job -Name $job | out-file c:/jobs.log -append
}
Remove-Job *
notepad c:\jobs.log
Below is a sample that will run concurrent jobs and then log them sequentially as per mbourgon's request. You'll notice the timestamps indicate the work was interspersed, but the logging is displayed sequentially by job.
#This is an example of logging concurrent jobs sequentially
#It must be run from the powershell prompt as ISE does not support transcripts
cls
$logPath = "c:\jobs.log"
rm $logPath
Start-Transcript -Path $logPath -Append
#Define some stuff
$foo = @(0,1,2)
$bar = @(@(1,"AAA"),@(2,"BBB"),@(3,"CCC"))
$func = {
function DoWork1 {
Log "This is DoWork1"
return 0
}
function DoWork2 {
Log "This is DoWork2"
return 0
}
function Log {
Param([parameter(ValueFromPipeline=$true)]$InputObject)
$nl = [Environment]::NewLine.Chars(0)
$time = Get-Date -Format {HH:mm:ss}
Write-Host ($time+">"+$InputObject+$nl)
}
}
#Start here
foreach($num in $foo){
$s = { $num = $args[0]
$bar = $args[1]
$logPath = $args[2]
Log ("Num: "+$num)
for($i=0; $i -lt 5; $i++){
$out = ([string]$i+$bar[$num][1])
Log $out
start-sleep 1
}
Start-Sleep $num
$x = DoWork1
Log ("The value of x is: "+$x)
$y = DoWork2
Log ("The value of y is: "+$y)
}
Start-Job -InitializationScript $func -ScriptBlock $s -args $num, $bar, $logPath
}
While (Get-Job -State "Running") {
cls
Get-Job
Start-Sleep 1
}
cls
Get-Job
write-host "Jobs completed, getting output"
Get-Job | Receive-Job
Remove-Job *
Stop-Transcript
notepad $logPath
I found a solution for receiving Write-Host results into a variable with the Receive-Job cmdlet; it works on PowerShell 5.1:
$var = Receive-Job $job 6>&1
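For example (my own check on 5.1; 6> is the information stream, which is where Write-Host writes since PowerShell 5):
$job = Start-Job { Write-Host "hello from the job" }
Wait-Job $job | Out-Null
$var = Receive-Job $job 6>&1   # redirect the information stream into the success stream
$var                           # "hello from the job"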
I know this is old. I don't use jobs very often so I was obviously having the same problem when I stumbled on this thread. Anyway, I arrived at a different solution so I thought I'd throw it out there.
Also, on PowerShell 5.1, I simply referenced the properties on the job object(s). In my case this looked like:
$Jobs.ChildJobs.Output | Out-File <LogFile> -Append
It looks like all the streams are recorded this way. Simple echoes go to Output, presumably the success stream. Write-Host commands go to the Information stream.
Output : {}
Error : {}
Progress : {}
Verbose : {}
Debug : {}
Warning : {}
Information : {}
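So, by the same token (my assumption based on the listing above, again on 5.1), Write-Host output from the jobs should be retrievable from the Information property:
$Jobs.ChildJobs.Information | Out-File <LogFile> -Append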
When you receive the results of a job those results are deleted such that further calls to Receive-Job will have nothing to retrieve (assuming the job is completed). If you don't want Receive-Job to delete the results use the -Keep switch. Now this doesn't explain why your first call to Receive-Job isn't outputting anything to the log file ... unless there is a part of the script you're not showing that is doing a Receive-Job before this.
BTW, when you use -OutVariable, supply it with a variable name and not a variable, e.g. -ov temp. You can then access that info later with $temp.
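A quick sketch of both points, assuming $job holds a completed job:
Receive-Job -Job $job -Keep              # read the results without discarding them
Receive-Job -Job $job -OutVariable temp  # note: a variable *name*, no $
$temp                                    # the captured output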
You could pipe your Get-Process to Format-Table, so you can see all of the output.
Get-Process -Name "explorer" | format-Table -hidetableheaders -AutoSize
And you could use something like this at the end of the script...
Get-job | Receive-Job 2>&1 >> c:\path\to\your\log\file.log
The "2>&1 >>" with grab all output from Get-Job | Receive-Job
A simple solution to the problem: use Write-Verbose "blabla" -Verbose
$s = New-PSSession -ComputerName "Computer"
$j = Invoke-Command -Session $s -ScriptBlock { Write-Verbose "Message" -Verbose } -AsJob
Wait-Job $j
$Child = $j.ChildJobs[0]
$Child.Verbose | out-file d:/job.log -append
What I have done is to not call Receive-Job at all, but instead create a wrapper script block that starts and stops a transcript. In that wrapper script block I call the job's script block like:
& $ScriptBlock <Parameters>
So the transcript captures all the output to a file.
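As a rough sketch of that idea (my own reconstruction, assuming PowerShell 5+ where transcription also works inside background jobs):
$work = { param($server) "patching $server"; Start-Sleep -Seconds 2; "done with $server" }
$wrapper = {
    param([string]$workText, $server, $logPath)
    Start-Transcript -Path $logPath -Append | Out-Null
    try     { & ([scriptblock]::Create($workText)) $server }   # call the job's script block with its parameters
    finally { Stop-Transcript | Out-Null }
}
Start-Job -ScriptBlock $wrapper -ArgumentList $work.ToString(), 'server01', "$env:TEMP\server01.log" | Wait-Job | Out-Null
Get-Content "$env:TEMP\server01.log"   # the transcript holds everything the job wrote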