PowerShell console timer

I am attempting to create a simple console timer display.
...
$rpt = $null
write-status "Opening report", $rpt
# long-running
$rpt = rpt-open -path "C:\Documents and Settings\foobar\Desktop\Calendar.rpt"
...
function write-status ($msg, $obj) {
write-host "$msg" -nonewline
do {
sleep -seconds 1
write-host "." -nonewline
[System.Windows.Forms.Application]::DoEvents()
} while ($obj -eq $null)
write-host
}
The example generates 'Opening report ....', but the loop never exits.
I should probably use a call-back or delegate, but I'm not sure of the pattern in this situation.
What am I missing?

You should be using Write-Progress to inform the user of the run status of your process and update it as events warrant.
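A minimal sketch of the idea (the activity name and step list here are placeholders; for a single blocking call like rpt-open you would typically update the progress record between the known steps of your script rather than during the call itself):
$steps = @('Opening report', 'Running report', 'Exporting report')   # hypothetical step names
for ($i = 0; $i -lt $steps.Count; $i++) {
    Write-Progress -Activity 'Report run' -Status $steps[$i] -PercentComplete (100 * $i / $steps.Count)
    Start-Sleep -Seconds 2   # stand-in for the real long-running work
}
Write-Progress -Activity 'Report run' -Completed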

If you want to do some "parallel" computing with PowerShell, use jobs.
$job = Start-Job -ScriptBlock { param($file) Open-File $file } -ArgumentList $file
while ($job.State -eq 'Running') {
    Write-Host '.' -NoNewline
    Start-Sleep -Seconds 1
}
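Applied to the original question, a hedged sketch might look like this (assuming rpt-open, or whatever command opens the report, is also available inside the job, e.g. by importing its module there):
Write-Host "Opening report" -NoNewline
$job = Start-Job -ScriptBlock {
    param($path)
    # Import-Module <YourReportModule>   # hypothetical: load whatever defines rpt-open
    rpt-open -path $path
} -ArgumentList "C:\Documents and Settings\foobar\Desktop\Calendar.rpt"
while ($job.State -eq 'Running') {
    Write-Host '.' -NoNewline
    Start-Sleep -Seconds 1
}
Write-Host
$rpt = Receive-Job -Job $job
Remove-Job -Job $job
One caveat: objects returned from a background job cross a process boundary and come back as deserialized copies, so if $rpt needs to be a live object this pattern may not fit.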

The script runs sequentially. Because you pass in a $null object, and $obj is never reassigned inside the function, the while condition is always true. The function will therefore loop forever; nothing in your script ever breaks it.
Only after the function returned would execution continue with your next lines:
# long-running
$rpt = rpt-open -path "C:\Documents and Settings\foobar\Desktop\Calendar.rpt"
Summary: as written, your while loop behaves like while($true), which is a design flaw.

Related

Progress bar is not showing correctly in PowerShell 7 in ForEach -Parallel

(Screenshots omitted: in one run the progress bar shows 0%, in another it shows 4%.)
The related script:
$iWrapper = [hashtable]::Synchronized(@{ i = 0 })
$srcfile = "C:\Users\wnune\OneDrive\Escritorio\imagenes\cardlist.txt"
$urls = Get-Content $srcfile
$lines = 0
switch -File $srcfile { default { ++$lines } }
Write-Host "Total Urls to process: $lines "
Write-Progress -Activity "Downloading files" -Status "In progress" -PercentComplete $i;
$urls | ForEach-Object -Parallel {
try {
$url = $_
$filename = Split-Path $url -Leaf
$destination = "C:\Users\wnune\OneDrive\Escritorio\imagenes\$filename"
$ProgressPreference = 'SilentlyContinue'
$response = Invoke-WebRequest -Uri $url -ErrorAction SilentlyContinue
if ($response.StatusCode -ne 200) {
Write-Warning "============================================="
Write-Warning "Url $url return Error. "
continue
}
if (Test-Path $destination) {
Write-Warning "============================================="
Write-Warning "File Exist in Destination: $filename "
continue
}
$job = Start-BitsTransfer -Source $url -Destination $destination -Asynchronous
while (($job | Get-BitsTransfer).JobState -eq "Transferring" -or ($job | Get-BitsTransfer).JobState -eq "Connecting")
{
Start-Sleep -m 250
}
Switch(($job | Get-BitsTransfer).JobState)
{
"Transferred" {
Complete-BitsTransfer -BitsJob $job
}
"Error" {
$job | Format-List
}
}
}
catch
{
Write-Warning "============================================="
Write-Warning "There was an error Downloading"
Write-Warning "url: $url"
Write-Warning "file: $filename"
Write-Warning "Exception Message:"
Write-Warning "$($_.Exception.Message)"
}
$j = ++($using:iWrapper).i
$k = $using:lines
$percent = [int](100 * $j / $k)
Write-Host "PercentCalculated: $percent"
Write-Host "Progress bar not Show the %"
Write-Progress -Activity "Downloading files " -Status " In progress $percent" -PercentComplete $percent
}
Write-Progress -Activity "Downloading files" -Status "Completed" -Completed
If I am passing -PercentComplete $percent, which is an integer, why does the progress bar not receive it correctly?
I have verified that the script and the environment are configured correctly, but I cannot confirm the value is being applied because the progress bar never displays it.
Note:
A potential future enhancement has been proposed in GitHub issue #13433, suggesting adding parameter(s) such as -ShowProgressBar to ForEach-Object -Parallel so that it would automatically show a progress bar based on how many parallel threads have completed so far.
Leaving the discussion about whether Start-BitsTransfer alone is sufficient aside:
At least as of PowerShell v7.3.1, it seemingly is possible to call Write-Progress from inside threads created by ForEach-Object -Parallel, based on a running counter of how many threads have exited so far.
However, there are two challenges:
You cannot directly update a counter variable in the caller's runspace (thread), you can only refer to an object that is an instance of a .NET reference type in the caller's runspace...
...and modifying such an object, e.g. a hashtable must be done in a thread-safe manner, such as via System.Threading.Monitor.
Note that I don't know whether calling Write-Progress from different threads is officially supported, but it seems to work in practice, at least when the call is made in a thread-safe manner, as below.
Bug alert, as of PowerShell 7.3.1: irrespective of whether you use ForEach-Object -Parallel or not, if you call Write-Progress too quickly in succession, only the first in such a sequence of calls takes effect:
See GitHub issue #18848
As a workaround, the code below inserts a Start-Sleep -Milliseconds 200 call after each Write-Progress call.
Not only should this not be necessary, it slows down overall execution, because threads then take longer to exit. This affects not just a given thread but overall execution time, since it delays when threads "give up their slot" in the context of thread throttling (5 threads are allowed to run concurrently by default; use -ThrottleLimit to change that).
A simple proof of concept:
# Sample pipeline input
$urls = 1..100 | ForEach-Object { "foo$_" }
# Helper hashtable to keep a running count of completed threads.
$completedCount = @{ Value = 0 }
$urls |
ForEach-Object -parallel { # Process the input objects in parallel threads.
# Simulate thread activity of varying duration.
Start-Sleep -Milliseconds (Get-Random -Min 0 -max 3000)
# Produce output.
$_
# Update the count of completed threads in a thread-safe manner
# and update the progress display.
[System.Threading.Monitor]::Enter($using:completedCount) # lock access
($using:completedCount).Value++
# Calculate the percentage completed.
[int] $percentComplete = (($using:completedCount).Value / ($using:urls).Count) * 100
# Update the progress display, *before* releasing the lock.
Write-Progress -Activity Test -Status "$percentComplete% complete" -PercentComplete $percentComplete
# !! Workaround for the bug above - should *not* be needed.
Start-Sleep -Milliseconds 200
[System.Threading.Monitor]::Exit($using:completedCount) # release lock
}
An alternative approach in which the calling thread centrally tracks the progress of all parallel threads:
Doing so requires adding the -AsJob switch to ForEach-Object -Parallel, which, instead of the synchronous execution that happens by default, starts a (thread-based) background job, and returns a [System.Management.Automation.PSTasks.PSTaskJob] instance that represents all parallel threads as PowerShell (thread) jobs in the .ChildJobs property.
A simple proof of concept:
# Sample pipeline input
$urls = 1..100 | ForEach-Object { "foo$_" }
Write-Progress -Activity "Downloading files" -Status "Initializing..."
# Launch the parallel threads *as a background (thread) job*.
$job =
$urls |
ForEach-Object -AsJob -Parallel {
# Simulate thread activity of varying duration.
Start-Sleep -Milliseconds (Get-Random -Min 0 -max 3000)
$_ # Sample output: pass the URL through
}
# Monitor and report the progress of the thread job's
# child jobs, each of which tracks a parallel thread.
do {
# Sleep a bit to allow the threads to run - adjust as desired.
Start-Sleep -Seconds 1
# Determine how many jobs have completed so far.
$completedJobsCount =
$job.ChildJobs.Where({ $_.State -notin 'NotStarted', 'Running' }).Count
# Relay any pending output from the child jobs.
$job | Receive-Job
# Update the progress display.
[int] $percent = ($completedJobsCount / $job.ChildJobs.Count) * 100
Write-Progress -Activity "Downloading files" -Status "$percent% complete" -PercentComplete $percent
} while ($completedJobsCount -lt $job.ChildJobs.Count)
# Clean up the job.
$job | Remove-Job
While this is more work and less efficient due to the polling loop, it has the following advantages:
The script blocks running in the parallel threads need not be burdened with progress-reporting code.
The polling loop affords the opportunity to perform other foreground activity while the parallel threads are running in the background.
The bug discussed above needn't be worked around, assuming your Start-Sleep interval in the polling loop is at least 200 msecs.

How do you exit from an Invoke-Command -ScriptBlock in PowerShell?

How do you exit from an Invoke-Command script block running on a remote server? I have tried the following.
Here is my code:
$Res = Invoke-Command -Session $Ses -ArgumentList ($ROOTdir, $PARAMS.SYS, $PARAMS.main, $PARAMS.zip) -ScriptBlock {
Param($ROOTdir, $SYS, $MAIN, $ZIP)
$list | %{
$completed = $false
$retrycount = 1
while (-not $completed) {
try {
$Copytime = (Measure-Command {
Copy-Item -Path $_.FullName -Destination ($SYS.KITCHENdir) -Force -ErrorVariable copyerror
}).TotalSeconds
$completed = $true
} catch {
if ($retrycount -gt $MAIN.Retry ) {
break or exit #HOW DO I STOP EXECUTING THE NEXT STEPS AND EXIT?
} else {
Start-Sleep $MAIN.DelayRetry
$retrycount++
}
}
}
#If the copy result is bad I need to stop and exit
#I could set flags here and check them in the next steps, but that does not look quite kosher
#next steps
}
I suspect you want to use the Exit keyword.
"The exit keyword is used to exit from contexts; it will exit the
(currently running) context where your code is running. This means
that if you use this keyword in a script, and launch the script
directly from your console, it will exit both the script and the
console since they're both running in the same context. However, if
you use the exit keyword in a function within the script, and call
that function from the same script, it will just exit the script and
not the entire console. The exit keyword is best used in functions,
and when those functions are called in a script. It's a great way to
terminate execution of functions."
https://www.pluralsight.com/blog/it-ops/powershell-terminating-code-execution
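A hedged sketch of one way to structure this (names like $failed are illustrative, $list is kept from the question as-is, and the Measure-Command timing is omitted for brevity): use a foreach statement so break is well-defined, record a failure flag, and return from the script block to skip the remaining steps. Note that -ErrorAction Stop is needed for Copy-Item errors to reach the catch block at all:
$Res = Invoke-Command -Session $Ses -ArgumentList ($ROOTdir, $PARAMS.SYS, $PARAMS.main, $PARAMS.zip) -ScriptBlock {
    Param($ROOTdir, $SYS, $MAIN, $ZIP)
    $failed = $false                       # illustrative failure flag
    foreach ($file in $list) {             # foreach statement, so break only exits this loop
        $completed = $false
        $retrycount = 1
        while (-not $completed) {
            try {
                Copy-Item -Path $file.FullName -Destination $SYS.KITCHENdir -Force -ErrorAction Stop
                $completed = $true
            } catch {
                if ($retrycount -gt $MAIN.Retry) { $failed = $true; break }
                Start-Sleep $MAIN.DelayRetry
                $retrycount++
            }
        }
        if ($failed) { break }             # stop processing the remaining files
    }
    if ($failed) { return }                # leave the script block; the next steps never run
    # next steps
}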

How can I interrupt an executable invoked inside a script without interrupting the script itself?

Let's say you have the following script saved in a file outermost.ps1:
powershell.exe -Command "while ( 1 -eq 1 ) {} "
echo "Done"
When running outermost.ps1 you can only abort this by pressing Ctrl+C, and no output will be written to console. How can I modify it so that the outermost script continues and executes echo "Done" when Ctrl+C is pressed?
This is a simplified version of a real-life scenario where the inner script is actually an executable which only is stoppable by pressing Ctrl+C.
Edit: The script could also be:
everlooping.exe
echo "Done"
but I wanted to provide an example everyone could copy-paste into an editor if they wanted to "try at home".
Start your infinite command/statement as a job, make your PowerShell script process Ctrl+C as regular input (see here), and stop the job when that input is received:
[Console]::TreatControlCAsInput = $true
$job = Start-Job -ScriptBlock {
powershell.exe -Command "while ( 1 -eq 1 ) {} "
}
while ($true) {
if ([Console]::KeyAvailable) {
$key = [Console]::ReadKey($true)
if (($key.Modifiers -band [ConsoleModifiers]::Control) -and $key.Key -eq 'c') {
$job.StopJob()
break
}
}
Start-Sleep -Milliseconds 100
}
Receive-Job -Id $job.Id
Remove-Job -Id $job.Id
echo "Done"
If you need to retrieve output from the job while it's running, you can do so like this in an else branch to the outer if statement:
if ($job.HasMoreData) { Receive-Job -Id $job.Id }
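Putting the two together, the polling loop would then look roughly like this:
while ($true) {
    if ([Console]::KeyAvailable) {
        $key = [Console]::ReadKey($true)
        if (($key.Modifiers -band [ConsoleModifiers]::Control) -and $key.Key -eq 'c') {
            $job.StopJob()
            break
        }
    } else {
        if ($job.HasMoreData) { Receive-Job -Id $job.Id }   # relay pending output while waiting
    }
    Start-Sleep -Milliseconds 100
}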
The simplest solution to this is:
Start-Process -Wait "powershell.exe" -ArgumentList "while ( 1 -eq 1 ) {}"
echo "Done"
This will spawn a second window, unlike Ansgar Wiechers' solution, but it solves my problem with the least amount of code.
Thanks to Jaqueline Vanek for the tip

Powershell: get output from Receive-Job

I have a collection of jobs that are running. When they complete I use receive-job and it writes the output to the screen. I'd like to take that output and log it to a file. I don't want to tee the output produced while the jobs are running because with multiple jobs running at once the logging would be interspersed. Get-Job | Receive-Job prints the output in a nice organized manner.
I have tried all of the following and no output is written to the file or stored in a variable, it just goes to the screen:
#Wait for last job to complete
While (Get-Job -State "Running") {
Log "Running..." -v $info
Start-Sleep 10
}
Log ("Jobs Completed. Output:") -v $info
# Getting the information back from the jobs
foreach($job in Get-Job){
Receive-Job -Job $job | Out-File c:\Test.log -Append
$output = Receive-Job -Job $job
Log ("OUTPUT: "+$output)
Receive-Job -Job $job -OutVariable $foo
Log ("FOO: "+$foo)
}
EDIT:
After seeing Keith's comment, I removed the extra Receive-Job calls and reduced the foreach to the following:
# Getting the information back from the jobs
foreach($job in Get-Job){
Receive-Job -Job $job -OutVariable temp
Write-Host ("Temp: "+$temp)
$temp | Out-File -FilePath c:\Test.log -Append
}
I also verified I'm not using Receive-Job anywhere else in the script. The write-host $temp and the out-file still produce no output though.
If the job uses Write-Host to produce output, Receive-Job returns $null, but the results get written to the host. However, if the job uses Write-Output to produce output in lieu of Write-Host, Receive-Job returns a string array [string[]] of the job output.
To demonstrate, enter this innocuous code at the PowerShell prompt:
$job = Start-Job -ScriptBlock {
[int] $counter = 0
while ($counter -lt 10) {
Write-Output "Counter = $counter."
Start-Sleep -Seconds 5
$counter++
}
}
Wait about 20-30 seconds for the job to produce some output, then enter this code:
$result = Receive-Job -Job $job
$result.Count
$result
$result | Get-Member
The $result object contains the strings produced by the job.
NOTE: After some experimentation I have found that using Write-Output, or just outputting a variable directly, is not a good solution either. If you have a function that uses either of those methods and also returns a value, the output will be concatenated with the return value!
The best way I have found to log output from jobs is to use Write-Host in combination with the Start-Transcript and Stop-Transcript cmdlets. There are some disadvantages, such as the ISE not supporting the transcript cmdlets. Also, I haven't found a way to send output to the transcript without it also going to the host (I wanted a robust log with a less verbose user experience), but it seems to be the best solution currently available. It is the only way I've been able to log concurrent jobs without interspersing their output and without having to write a temporary log file for each job and then combine them.
Below was my previous answer that I left just for documentation reasons:
The issue here was not getting output from Receive-Job. The output from the jobs that I was expecting to see is being written inside the job by Write-Host. When Receive-Job was called I saw all my output on the screen, but nothing was available to pipe elsewhere. It seems that when I call Receive-Job all those Write-Host calls execute, but there's nothing for me to pick up and pipe to Out-File. The following script demonstrates this. You'll notice "Sleepy Time" appears in both the log file and the host, while "Slept" only appears in the host. So, rather than trying to capture what Write-Host emits via Receive-Job, I need to modify my Log function to not use Write-Host, or to split the output as shown below.
cls
rm c:\jobs.log
$foo = @(1,2,3)
$jobs = @()
foreach($num in $foo){
$s = { get-process -name "explorer"
Start-Sleep $args[0]
$x = ("Sleepy Time: "+$args[0])
write-host $x
$x
write-host ("Slept: "+$args[0])
}
$id = [System.Guid]::NewGuid()
$jobs += $id
Start-Job -Name $id -ScriptBlock $s -args $num
}
While (Get-Job -State "Running") {
cls
Get-Job
Start-Sleep 1
}
cls
Get-Job
write-host "Jobs completed, getting output"
foreach($job in $jobs){
Receive-Job -Name $job | out-file c:/jobs.log -append
}
Remove-Job *
notepad c:\jobs.log
Below is a sample that will run concurrent jobs and then log them sequentially as per mbourgon's request. You'll notice the timestamps indicate the work was interspersed, but the logging is displayed sequentially by job.
#This is an example of logging concurrent jobs sequentially
#It must be run from the powershell prompt as ISE does not support transcripts
cls
$logPath = "c:\jobs.log"
rm $logPath
Start-Transcript -Path $logPath -Append
#Define some stuff
$foo = @(0,1,2)
$bar = @(@(1,"AAA"),@(2,"BBB"),@(3,"CCC"))
$func = {
function DoWork1 {
Log "This is DoWork1"
return 0
}
function DoWork2 {
Log "This is DoWork2"
return 0
}
function Log {
Param([parameter(ValueFromPipeline=$true)]$InputObject)
$nl = [Environment]::NewLine.Chars(0)
$time = Get-Date -Format "HH:mm:ss"
Write-Host ($time+">"+$InputObject+$nl)
}
}
#Start here
foreach($num in $foo){
$s = { $num = $args[0]
$bar = $args[1]
$logPath = $args[2]
Log ("Num: "+$num)
for($i=0; $i -lt 5; $i++){
$out = ([string]$i + $bar[$num][1])
Log $out
start-sleep 1
}
Start-Sleep $num
$x = DoWork1
Log ("The value of x is: "+$x)
$y = DoWork2
Log ("The value of y is: "+$y)
}
Start-Job -InitializationScript $func -ScriptBlock $s -args $num, $bar, $logPath
}
While (Get-Job -State "Running") {
cls
Get-Job
Start-Sleep 1
}
cls
Get-Job
write-host "Jobs completed, getting output"
Get-Job | Receive-Job
Remove-Job *
Stop-Transcript
notepad $logPath
I found a solution for getting Write-Host results into a variable with the Receive-Job cmdlet; it works on PowerShell 5.1:
$var = Receive-Job $job 6>&1
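For context, a minimal sketch of how that redirection is used (6> refers to the information stream, which is where Write-Host output goes in PowerShell 5.0 and later):
$job = Start-Job -ScriptBlock { Write-Host "hello from the job" }
Wait-Job $job | Out-Null
$var = Receive-Job $job 6>&1   # merge the information stream (Write-Host) into the success stream
$var                           # now holds the "hello from the job" record
Remove-Job $job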
I know this is old. I don't use jobs very often, so I was obviously having the same problem when I stumbled on this thread. Anyway, I arrived at a different solution, so I thought I'd throw it out there.
Also on PowerShell 5.1, I simply referenced the properties on the job object(s). In my case this looked like:
$Jobs.ChildJobs.Output | Out-File <LogFile> -Append
It looks like all the streams are recorded this way. Plain output goes to Output (presumably the success stream), and Write-Host commands go to the Information stream.
Output : {}
Error : {}
Progress : {}
Verbose : {}
Debug : {}
Warning : {}
Information : {}
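Building on that, Write-Host output can be pulled from the Information property the same way (a sketch; <LogFile> is a placeholder as in the snippet above):
$Jobs.ChildJobs.Output      | Out-File <LogFile> -Append   # success stream (plain output / echoes)
$Jobs.ChildJobs.Information | Out-File <LogFile> -Append   # Write-Host / Write-Information output
$Jobs.ChildJobs.Error       | Out-File <LogFile> -Append   # error records, if any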
When you receive the results of a job those results are deleted such that further calls to Receive-Job will have nothing to retrieve (assuming the job is completed). If you don't want Receive-Job to delete the results use the -Keep switch. Now this doesn't explain why your first call to Receive-Job isn't outputting anything to the log file ... unless there is a part of the script you're not showing that is doing a Receive-Job before this.
BTW, when you use -OutVariable, supply it with a variable name and not a variable, e.g. -ov temp. Then you later access that info with $temp.
You could pipe your Get-Process to Format-Table, so you can see all of the output.
Get-Process -Name "explorer" | format-Table -hidetableheaders -AutoSize
And you could use something like this at the end of the script...
Get-job | Receive-Job 2>&1 >> c:\path\to\your\log\file.log
The "2>&1 >>" with grab all output from Get-Job | Receive-Job
A simple solution to the problem: use Write-Verbose "blabla" -Verbose.
$s = New-PSSession -ComputerName "Computer"
$j = Invoke-Command -Session $s -ScriptBlock { Write-Verbose "Message" -Verbose } -AsJob
Wait-Job $j
$Child = $j.ChildJobs[0]
$Child.Verbose | out-file d:/job.log -append
What I have done is to not call Receive-Job at all, but instead create a wrapper script block that starts and stops a transcript. In that wrapper script block I call the job's script block like
&$ScriptBlock < Parameters >
so the transcript captures all of the output to a file.
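A hedged sketch of that wrapper idea (the log path, the inner script block, and its arguments are all placeholders, and it assumes Start-Transcript is supported in the job's host, as this answer implies):
$work = {
    param($p1, $p2)
    Write-Host "Doing work with $p1 and $p2"   # host output that Receive-Job would normally not return
}
$wrapper = {
    param($LogPath, $WorkText, $WorkArgs)
    Start-Transcript -Path $LogPath -Append | Out-Null
    try {
        $sb = [scriptblock]::Create($WorkText)   # rebuild the inner script block inside the job
        & $sb @WorkArgs
    }
    finally {
        Stop-Transcript | Out-Null
    }
}
$job = Start-Job -ScriptBlock $wrapper -ArgumentList "c:\job1.log", $work.ToString(), @("A", "B")
Wait-Job $job | Out-Null
Get-Content "c:\job1.log"   # the transcript now contains the Write-Host output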

How can you set a time limit for a PowerShell script to run for?

I want to set a time limit on a PowerShell (v2) script so it forcibly exits after that time limit has expired.
I see in PHP they have commands like set_time_limit and max_execution_time where you can limit how long the script and even a function can execute for.
With my script, a do/while loop that is looking at the time isn't appropriate as I am calling an external code library that can just hang for a long time.
I want to limit a block of code and only allow it to run for x seconds, after which I will terminate that code block and return a response to the user that the script timed out.
I have looked at background jobs but they operate in a different thread so won't have kill rights over the parent thread.
Has anyone dealt with this or have a solution?
Thanks!
Something like this should work too...
$job = Start-Job -Name "Job1" -ScriptBlock {Do {"Something"} Until ($False)}
Start-Sleep -s 10
Stop-Job $job
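A slight refinement, so the script does not always wait the full 10 seconds: Wait-Job has a -Timeout parameter, so the wait can end as soon as the job finishes, whichever comes first:
$job = Start-Job -Name "Job1" -ScriptBlock { Do { "Something" } Until ($False) }
Wait-Job $job -Timeout 10 | Out-Null            # returns early if the job finishes before the limit
if ($job.State -eq 'Running') { Stop-Job $job } # otherwise, enforce the limit
Remove-Job $job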
Here's my solution, inspired by this blog post. It will finish running when all has been executed, or time runs out (whichever happens first).
I place the stuff I want to execute during a limited time in a function:
function WhatIWannaDo($param1, $param2)
{
# Do something... that maybe takes some time?
Write-Output "Look at my nice params : $param1, $param2"
}
I have another function that keeps tabs on a timer and on whether everything has finished executing:
function Limit-JobWithTime($Job, $TimeInSeconds, $RetryInterval=5)
{
try
{
$timer = [Diagnostics.Stopwatch]::StartNew()
while (($timer.Elapsed.TotalSeconds -lt $TimeInSeconds) -and ('Running' -eq $job.JobStateInfo.State)) {
$totalSecs = [math]::Round($timer.Elapsed.TotalSeconds,0)
$tsString = $("{0:hh}:{0:mm}:{0:ss}" -f [timespan]::fromseconds($totalSecs))
Write-Progress "Still waiting for action $($Job.Name) to complete after [$tsString] ..."
Start-Sleep -Seconds ([math]::Min($RetryInterval, [System.Int32]($TimeInSeconds-$totalSecs)))
}
$timer.Stop()
$totalSecs = [math]::Round($timer.Elapsed.TotalSeconds,0)
$tsString = $("{0:hh}:{0:mm}:{0:ss}" -f [timespan]::fromseconds($totalSecs))
if ($timer.Elapsed.TotalSeconds -gt $TimeInSeconds -and ('Running' -eq $job.JobStateInfo.State)) {
Stop-Job $job
Write-Verbose "Action $($Job.Name) did not complete before timeout period of $tsString."
} else {
if('Failed' -eq $job.JobStateInfo.State){
$err = $job.ChildJobs[0].Error
$reason = $job.ChildJobs[0].JobStateInfo.Reason.Message
Write-Error "Job $($Job.Name) failed after with the following Error and Reason: $err, $reason"
}
else{
Write-Verbose "Action $($Job.Name) completed before timeout period. job ran: $tsString."
}
}
}
catch
{
Write-Error $_.Exception.Message
}
}
... and then finally I start my function WhatIWannaDo as a background job and pass it on to the Limit-JobWithTime (including example of how to get output from the Job):
#... maybe some stuff before?
$job = Start-Job -Name PrettyName -Scriptblock ${function:WhatIWannaDo} -argumentlist @("1st param", "2nd param")
Limit-JobWithTime $job -TimeInSeconds 60
Write-Verbose "Output from $($Job.Name): "
$output = (Receive-Job -Keep -Job $job)
$output | %{Write-Verbose "> $_"}
#... maybe some stuff after?
I know this is an old post, but I have used this in my scripts.
I am not sure if it's the correct use of it, but the System.Timers.Timer that George put up gave me an idea and it seems to be working for me.
I use it for servers that sometimes hang on a WMI query, the timeout stops it getting stuck.
Instead of write-host I then output the message to a log file so I can see which servers are broken and fix them if needed.
I also don't use a guid I use the servers hostname.
I hope this makes sense and helps you.
$MyScript = {
Get-WmiObject -ComputerName MyComputer -Class win32_operatingsystem
}
$JobGUID = [system.Guid]::NewGuid()
$elapsedEventHandler = {
param ([System.Object]$sender, [System.Timers.ElapsedEventArgs]$e)
($sender -as [System.Timers.Timer]).Stop()
Unregister-Event -SourceIdentifier $JobGUID
Write-Host "Job $JobGUID removed by force as it exceeded timeout!"
Get-Job -Name $JobGUID | Remove-Job -Force
}
$timer = New-Object System.Timers.Timer -ArgumentList 3000 #just change the timeout here
Register-ObjectEvent -InputObject $timer -EventName Elapsed -Action $elapsedEventHandler -SourceIdentifier $JobGUID
$timer.Start()
Start-Job -ScriptBlock $MyScript -Name $JobGUID
Here is an example of using a Timer. I haven't tried it personally, but I think it should work:
function Main
{
# do main logic here
}
function Stop-Script
{
Write-Host "Called Stop-Script."
[System.Management.Automation.Runspaces.Runspace]::DefaultRunspace.CloseAsync()
}
$elapsedEventHandler = {
param ([System.Object]$sender, [System.Timers.ElapsedEventArgs]$e)
Write-Host "Event handler invoked."
($sender -as [System.Timers.Timer]).Stop()
Unregister-Event -SourceIdentifier Timer.Elapsed
Stop-Script
}
$timer = New-Object System.Timers.Timer -ArgumentList 2000 # setup the timer to fire the elapsed event after 2 seconds
Register-ObjectEvent -InputObject $timer -EventName Elapsed -SourceIdentifier Timer.Elapsed -Action $elapsedEventHandler
$timer.Start()
Main
How about something like this:
## SET YOUR TIME LIMIT
## IN THIS EXAMPLE 1 MINUTE, BUT YOU CAN ALSO USE HOURS/DAYS
# $TimeSpan = New-TimeSpan -Days 1 -Hours 2 -Minutes 30
$TimeSpan = New-TimeSpan -Minutes 1
$EndTime = (Get-Date).AddMinutes($TimeSpan.TotalMinutes).ToString("HH:mm")
## START TIMED LOOP
cls
do
{
## START YOUR SCRIPT
Write-Warning "Test-Job 1...2...3..."
Start-Sleep 3
Write-Warning "End Time = $EndTime`n"
}
until ($EndTime -eq (Get-Date -Format HH:mm))
## TIME REACHED AND END SCRIPT
Write-Host "End Time reached!" -ForegroundColor Green
When using hours or days as the limit, make sure you adjust $TimeSpan.TotalMinutes and the HH:mm format accordingly, since the example as written does not account for days.
I came up with this script.
Start-Transcript logs all actions and saves them to a file.
Store the current process object in the variable $p, then write its ID to the screen.
Assign the current date to the $startTime variable.
Afterwards, assign it again, adding the extra time to the current date, to the variable $expiration.
The UpdateTime function returns how much time is left before the script closes, and writes it to the console.
Start looping, and kill the process if the timer exceeds the expiration time.
That's it.
Code:
Start-Transcript C:\Transcriptlog-Cleanup.txt #write log to this location
$p = Get-Process -Id $pid # get the current PowerShell process as an object, so properties like HasExited are available
Write-Host $p.Id
$startTime = (Get-Date) # set start time
$startTime
$expiration = (Get-Date).AddSeconds(20) #program expires at this time
# you could change the expiration time by changing (Get-Date).AddSeconds(20) to (Get-Date).AddMinutes(10)or to hours whatever you like
#-----------------
#Timer update function setup
function UpdateTime
{
$LeftMinutes = ($expiration) - (Get-Date) | Select -Expand Minutes # minutes remaining until expiration
$LeftSeconds = ($expiration) - (Get-Date) | Select -Expand Seconds # seconds remaining until expiration
#Write time to console
Write-Host "------------------------------------------------------------------"
Write-Host "Timer started at : " $startTime
Write-Host "Current time : " (Get-Date)
Write-Host "Timer ends at : " $expiration
Write-Host "Time on expire timer : "$LeftMinutes "Minutes" $LeftSeconds "Seconds"
Write-Host "------------------------------------------------------------------"
}
#-----------------
do{ #start loop
Write-Host "Working"#start doing other script stuff
Start-Sleep -Milliseconds 5000 #add delay to reduce spam and processing power
UpdateTime #call upadate function to print time
}
until ($p.HasExited -or (Get-Date) -gt $expiration) #check exit time
Write-Host "done"
Stop-Transcript
if (-not $p.HasExited) { Stop-Process -Id $p.Id -PassThru } # kill the process after the time expires