Multiple PowerShell Scripts Writing to Same File - powershell

I have a condition that kicks off a PowerShell script to append a short string to a text file. This condition can fire rapidly, so the file is being written to multiple times by the same script. Additionally, a separate script imports from that text file in batches (less frequently).
Whenever the condition fires very rapidly, I get the error: "The process cannot access the file 'file_name' because it is being used by another process." When I do the same append in Python (my main language), I don't get this error, but I could use some help fixing it in PowerShell.
$action = $args[0]
$output_filename = $args[1]
$item = $args[2]
if ($action -eq 'direct') {
    $file_path = $output_filename
    $sw = New-Object -TypeName System.IO.StreamWriter($file_path, $true)
    $sw.WriteLine($item)
    $sw.Close()
}
I have also tried the following instead of StreamWriter, but apparently Add-Content and Out-File perform poorly (http://sqlblog.com/blogs/linchi_shea/archive/2010/01/04/add-content-and-out-file-are-not-for-performance.aspx):
out-file -Append -FilePath $file_path -InputObject $item }

Might try something like this:
while ($true)
{
    Try {
        [IO.File]::OpenWrite($file_path).Close()
        Add-Content -FilePath $file_path -InputObject $item
        Break
    }
    Catch {}
}
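If the retry loop still collides with the batch-import script, another option is to make the writers take turns explicitly. Below is a minimal sketch (my addition, not from the original post) using a named, system-wide mutex; the mutex name is an arbitrary example and both scripts would need to use the same one:
$mutex = New-Object System.Threading.Mutex($false, 'Global\MyAppendFileMutex')
try {
    [void]$mutex.WaitOne()                      # block until no other process holds the mutex
    Add-Content -Path $file_path -Value $item   # append while we exclusively hold it
}
finally {
    $mutex.ReleaseMutex()
    $mutex.Close()
}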

Related

Octopus Deploy - SQL - Execute Scripts Ordered step giving Exception

In Octopus Deploy I have added a step to the process to run a stored procedure with the library script "SQL - Execute Scripts Ordered step".
When I provide the script to execute the stored procedure, it throws the exception below:
Exception calling "ReadAllText" with "1" argument(s): "The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters."
Closing connection
I believe this is because of the large script I've provided as text to execute in the field "SQL Script File".
As shown in the examples I can run a script directly, so I'm providing the stored procedure execution script, but in the library's PowerShell script:
$content = [IO.File]::ReadAllText($OctopusParameters['SqlScriptFile'])
ReadAllText expects a file path shorter than 260 characters, not the script text itself.
One solution I can think of is to provide the execution script as a file within the package itself, but that will be a last resort.
How can I run the stored procedure directly from the step in the process?
Apparently [IO.File]::ReadAllText($OctopusParameters['SqlScriptFile']) expects SqlScriptFile to be a file path. I updated the library's PowerShell script to take the full SQL script from the field "SQL Script File" as a parameter and passed it directly to the function.
$content= $OctopusParameters['SqlScriptFile']
Execute-SqlQuery -query $content
Providing the full PowerShell script below for reference:
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $OctopusParameters['ConnectionString']
Register-ObjectEvent -InputObject $connection -EventName InfoMessage -Action {
    Write-Host $event.SourceEventArgs
} | Out-Null
function Execute-SqlQuery($query) {
    $queries = [System.Text.RegularExpressions.Regex]::Split($query, "^\s*GO\s*`$", [System.Text.RegularExpressions.RegexOptions]::IgnoreCase -bor [System.Text.RegularExpressions.RegexOptions]::Multiline)
    $queries | ForEach-Object {
        $q = $_
        if ((-not [String]::IsNullOrWhiteSpace($q)) -and ($q.Trim().ToLowerInvariant() -ne "go")) {
            $command = $connection.CreateCommand()
            $command.CommandText = $q
            $command.CommandTimeout = $OctopusParameters['CommandTimeout']
            $command.ExecuteNonQuery() | Out-Null
        }
    }
}
Write-Host "Connecting"
try {
    $connection.Open()
    Write-Host "Executing script in" $OctopusParameters['SqlScriptFile']
    # $content = [IO.File]::ReadAllText($OctopusParameters['SqlScriptFile'])
    $content = $OctopusParameters['SqlScriptFile']
    Execute-SqlQuery -query $content
}
catch {
    if ($OctopusParameters['ContinueOnError']) {
        Write-Host $_.Exception.Message
    }
    else {
        throw
    }
}
finally {
    Write-Host "Closing connection"
    $connection.Dispose()
}

How to Log Errors in Powershell?

I am trying to create error logs for my PowerShell script. I need to create a function so that instead of using Write-Host I can call that function directly, and whatever error I get in the script is logged to the log file.
I have used the following method, but it doesn't seem like it will work for every PowerShell script.
function WriteLog
{
    Param ([string]$LogString)
    $LogFile = "C:\$(gc env:computername).log"
    $DateTime = "[{0:MM/dd/yy} {0:HH:mm:ss}]" -f (Get-Date)
    $LogMessage = "$DateTime $LogString"
    Add-Content $LogFile -Value $LogMessage
}
WriteLog "This is my log message"
Can anyone suggest an easier way to handle logging errors into the file?
You are not far off for your use case.
You don't need Write-Host at all for this. Whether Write-Host is appropriate depends on the PowerShell version you are using and on whether you are sending data down the pipeline or elsewhere.
Write-Host was genuinely bad in legacy, pre-v5 versions if you were sending things down the pipeline, because it cleared the buffer, so there would be nothing left to send. As per the founder/creator of Monad/PowerShell:
• Write-Host Considered Harmful
https://www.jsnover.com/blog/2013/12/07/write-host-considered-harmful/
https://devblogs.microsoft.com/scripting/understanding-streams-redirection-and-write-host-in-powershell
https://powershell.org/2012/06/how-to-use-write-host-without-endangering-puppies-or-a-manifesto-for-modularizing-powershell-scripts
PowerShell Best Practice #3: Avoid Write-Host
• However, that guidance changed with v5 and higher: Write-Host now writes to the Information stream, so its output can be captured and redirected.
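For example, a minimal sketch (my addition), assuming PowerShell 5.0 or later and an example log path:
# In v5+, Write-Host writes to the information stream (stream 6), so it can be redirected to a file
Write-Host "hello from Write-Host" 6> C:\Temp\writehost.log
# ...or captured into a variable with the common -InformationVariable parameter
Write-Host "captured" -InformationVariable msg
$msg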
You can do direct logging using any of these:
Tee-Object
Export-Csv -Append
Out-File -Append
... and other redirection options.
You can also create and write to your own event log...
New-EventLog -LogName LogonScripts -Source ClientScripts
Write-EventLog LogonScripts -Source ClientScripts -Message 'Test Message' -EventId 1234 -EntryType Warning
...then read from it later as you would any other event log.
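For example, reading the newest entries back might look like this (a small sketch, assuming the log and source created above, and Windows PowerShell where Get-EventLog is available):
Get-EventLog -LogName LogonScripts -Newest 10 |
    Select-Object TimeGenerated, EntryType, Source, Message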
There are lots of examples (scripts and modules) all over the web to use as-is, start with, or tweak as needed, including those on Microsoft's powershellgallery.com.
ScriptLogger 2.0.0
Write-Log: A Simple Logging Function for your PowerShell - Scripts: Download - Write-Log.ps1:
Write-Log PowerShell Logging Function: Download - Function-Write-Log.ps1:
Logging 2.4.9
A point of note in your post. This...
$LogFile = "C:\$(gc env:computername).log"
...can be simplified to this...
$LogFile = "C:\$($Env:COMPUTERNAME).log"
# Set logfile path in the global scope
$Logfile = "C:\Temp\Logfile.log" # set your logfile path here

function Write-Log {
    param
    (
        [Parameter(ValueFromPipeline)]
        [string]$content
    )
    $FileExists = Test-Path -Path $LogFile
    $DateNow = Get-Date -Format 'dd.MM.yyyy HH:mm'
    $FileInp = $DateNow + ' | ' + $content
    if ($FileExists -eq $True) {
        Add-Content -Path $LogFile -Value $FileInp
    }
    else {
        New-Item -Path $Logfile -ItemType file
        Add-Content -Path $LogFile -Value $FileInp
    }
}

# Then you only have to pipe to Write-Log like this:
Write-Output "hello world" | Write-Log

Using a try block to test if a file is locked in powershell

I want to know if it's bad form to use try blocks to test if a file is locked. Here's the background.
I need to send the text output of an application to two serial printers simultaneously. My solution was to use MportMon and a PowerShell script. The way it's supposed to work is that the application prints by default to the MportMon virtual printer port, which actually creates a uniquely named file in a "dropbox" folder. The PowerShell script uses a FileSystemWatcher to monitor the folder, and when a new file is created it takes the text content, pushes it out to two serial printers, then deletes the file so the folder doesn't fill up.
I was having a problem when trying to read the text from the file that the virtual printer created. I found that I was getting errors because the file was still locked. To fix the problem, I used an FSM to implement the logic: instead of checking for a lock every time before attempting to get the content from the file, I used a try block that attempts to read the content. If it fails, the catch block just reaffirms the state the FSM is in, and the process is repeated until it succeeds. It seems to work fine, but I've read somewhere that it's bad practice. Is there any danger in this method, or is it safe and reliable? Below is my code.
$fsw = New-Object System.IO.FileSystemWatcher
$q = New-Object System.Collections.Queue
$path = "c:\DropBox"
$fsw.Path = $path
$state = "waitforQ"
[string]$tempPath = $null
Register-ObjectEvent -InputObject $fsw -EventName Created -Action {
    $q.Enqueue( $event.SourceEventArgs.FullPath )
}
while ($true) {
    switch ($state)
    {
        "waitforQ" {
            echo "waitforQ"
            if ($q.Count -gt 0) { $state = "retrievefromQ" }
        }
        "retrievefromQ" {
            echo "retrievefromQ"
            $tempPath = $q.Dequeue()
            $state = "servicefile"
        }
        "servicefile" {
            echo " in servicefile "
            try
            {
                $text = Get-Content -ErrorAction Stop $tempPath
                #echo "in try"
                $text | Out-Printer db1
                $text | Out-Printer db2
                echo " $text "
                $state = "waitforQ"
                rm $tempPath
            }
            catch
            {
                #echo "in catch"
                $state = "servicefile"
            }
        }
        Default { $state = "waitforQ" }
    }
}
I wouldn't say it's bad practice to test a file to see if it's locked, but it's not as clean as checking the handles used by other processes. Personally I'd test the file like you do, but I'd adjust a few parts to make it safer/better.
That switch statement looks way too complicated (to me); I'd replace it with a simple if-test: "If there are files in the queue, proceed; if not, wait."
You need to slow down. You'll try to read the file as many times as possible while it's locked, which is a waste of resources since it takes some time for the current application to let the file go and save the data to disk. Add some pauses. You won't notice them, but your CPU will love them. The same applies when there are no files in the queue.
You might benefit from adding a timeout, like max 50 attempts to read the file, to avoid the script getting stuck if one specific file is never released.
Try:
$fsw = New-Object System.IO.FileSystemWatcher
$q = New-Object System.Collections.Queue
$path = "c:\DropBox"
$fsw.Path = $path
$MaxTries = 50 # 50 tries * 0.2s sleep = 10 sec timeout
[string]$tempPath = $null
Register-ObjectEvent -InputObject $fsw -EventName Created -Action {
    $q.Enqueue( $event.SourceEventArgs.FullPath )
}
while ($true) {
    if ($q.Count -gt 0) {
        # Get next file in queue
        $tempPath = $q.Dequeue()
        # Read file
        $text = $null
        $i = 0
        while ($text -eq $null) {
            # If locked, wait and try again
            try {
                $text = Get-Content -Path $tempPath -ErrorAction Stop
            } catch {
                $i++
                if ($i -eq $MaxTries) {
                    # Max attempts reached. Stops script
                    Write-Error -Message "Script is stuck on locked file '$tempPath'" -ErrorAction Stop
                } else {
                    # Wait
                    Start-Sleep -Milliseconds 200
                }
            }
        }
        # Print file
        $text | Out-Printer db1
        $text | Out-Printer db2
        echo " $text "
        # Remove temp file
        Remove-Item $tempPath
    }
    # Relax..
    Start-Sleep -Milliseconds 500
}

Getting error output from a powershell 2.0 script running as a task

TL;DR: the actual question is at the bottom.
I'm trying to troubleshoot a Powershell v1.0 script issue. The script basically downloads a file from an FTP site and puts it on a remote server via UNC and emails the success or failure of the task.
The script runs as a task with a generic ID that is a Domain Admin but is not used to log into systems so the server it runs off of does not contain a profile for it.
If I do a runas for that user and execute the script via the command line it works flawlessly. However, if I try to run it as a task it runs and then exits instantly. If I open a runas command prompt and run the scheduled task via the command line, all I get back is:
SUCCESS: Attempted to run the scheduled task "Task Name".
I've tried writing variable values to a text file to see what is going on but it never writes even when I write them as the very first step of execution.
What I want to do is capture any script error messages you would normally see when trying to run the script and/or write the variable information to a text file.
Is there any way to do this? BTW, I am doing this by calling powershell with the following arguments:
-file -ExecutionPolicy Bypass "d:\datscript\myscript.ps1"
-I've tried -command instead of -file.
-I've tried "d:\datscript\myscript.ps1 5>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 9>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 | out-file d:\datscript\test.txt"
Nothing worked. I'm sure I can fix whatever bug I have but I'm banging my head against the wall trying to get some kind of failure info.
--Update: Here is a copy of the script minus details--
#-------------------------------------------------------------------------------------------------------------------------------------------------------------
#
#Variable Declaration
#
#$path = Path on local server to download DAT to
#$olddat = Old/last DAT downloaded
#$currentdat = Next DAT number
#$ftpsite = McAfee FTP site. Update if path changes
#$ftpuser = FTP user (anon login)
#$ftppass = FTP password (anon login)
#$tempstring = Manipulation variable
#$gotdat = Boolean if updated DAT exists
#$success = Status if a new DAT exists and has been downloaded (used for email notification).
#$thetime = Variable used to hold time of day manipulation.
$path = "\\myservername\ftproot\pub\mcafee\datfiles\"
$olddat = ""
$currentdat =""
$ftpsite = "ftp://ftp.nai.com/virusdefs/4.x/"
$ftpuser = "something"
$ftppass = "anything"
$tempstring =""
$gotdat = "False"
$success = ""
$thetime = ""
#
#Normalization function; handles UNC paths
#
function Get-NormalizedFileSystemPath
{
<#
.Synopsis
Normalizes file system paths.
.DESCRIPTION
Normalizes file system paths. This is similar to what the Resolve-Path cmdlet does, except Get-NormalizedFileSystemPath also properly handles UNC paths and converts 8.3 short names to long paths.
.PARAMETER Path
The path or paths to be normalized.
.PARAMETER IncludeProviderPrefix
If this switch is passed, normalized paths will be prefixed with 'FileSystem::'. This allows them to be reliably passed to cmdlets such as Get-Content, Get-Item, etc, regardless of Powershell's current location.
.EXAMPLE
Get-NormalizedFileSystemPath -Path '\\server\share\.\SomeFolder\..\SomeOtherFolder\File.txt'
Returns '\\server\share\SomeOtherFolder\File.txt'
.EXAMPLE
'\\server\c$\.\SomeFolder\..\PROGRA~1' | Get-NormalizedFileSystemPath -IncludeProviderPrefix
Assuming you can access the c$ share on \\server, and PROGRA~1 is the short name for "Program Files" (which is common), returns:
'FileSystem::\\server\c$\Program Files'
.INPUTS
String
.OUTPUTS
String
.NOTES
Paths passed to this command cannot contain wildcards; these will be treated as invalid characters by the .NET Framework classes which do the work of validating and normalizing the path.
.LINK
Resolve-Path
#>
[CmdletBinding()]
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[Alias('PSPath', 'FullName')]
[string[]]
$Path,
[switch]
$IncludeProviderPrefix
)
process
{
foreach ($_path in $Path)
{
$_resolved = $_path
if ($_resolved -match '^([^:]+)::')
{
$providerName = $matches[1]
if ($providerName -ne 'FileSystem')
{
Write-Error "Only FileSystem paths may be passed to Get-NormalizedFileSystemPath. Value '$_path' is for provider '$providerName'."
continue
}
$_resolved = $_resolved.Substring($matches[0].Length)
}
if (-not [System.IO.Path]::IsPathRooted($_resolved))
{
$_resolved = Join-Path -Path $PSCmdlet.SessionState.Path.CurrentFileSystemLocation -ChildPath $_resolved
}
try
{
$dirInfo = New-Object System.IO.DirectoryInfo($_resolved)
}
catch
{
$exception = $_.Exception
while ($null -ne $exception.InnerException)
{
$exception = $exception.InnerException
}
Write-Error "Value '$_path' could not be parsed as a FileSystem path: $($exception.Message)"
continue
}
$_resolved = $dirInfo.FullName
if ($IncludeProviderPrefix)
{
$_resolved = "FileSystem::$_resolved"
}
Write-Output $_resolved
}
} # process
} # function Get-NormalizedFileSystemPath
#
#Get the number of the existing DAT file and increment for the next DAT if the DAT's age is older than today.
# Otherwise, exit the program if DATs age is today.
#
$tempstring = "xdat.exe"
$env:Path = $env:Path + ";d:\datscript"
$path2 ="d:\datscript\debug.txt"
add-content $path2 $path
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
$path = Get-NormalizedFileSystemPath -Path $path
Set-Location -Path $path
$olddat = dir $path | %{$_.Name.substring(0, 4) }
$olddatfull = "$olddat" + "$tempstring"
if ( ((get-date) - (ls $olddatfull).LastWriteTime).day -lt 1)
{
#***** Commented out for testing!
# exit
}
$currentdat = [INT] $olddat
$currentdat++
$currentdat = "$currentdat" + "$tempstring"
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
#
#Connect to FTP site and get a current directory listing.
#
[System.Net.FtpWebRequest]$ftp = [System.Net.WebRequest]::Create($ftpsite)
$ftp.Method = [System.Net.WebRequestMethods+FTP]::ListDirectoryDetails
$response = $ftp.getresponse()
$stream = $response.getresponsestream()
$buffer = new-object System.Byte[] 1024
$encoding = new-object System.Text.AsciiEncoding
$outputBuffer = ""
$foundMore = $false
#
# Read all the data available from the ftp directory stream, writing it to the
# output buffer when done. After that the buffer is searched to see if it contains the expected
# latest DAT.
#
do
{
## Allow data to buffer for a bit
start-sleep -m 1000
## Read what data is available
$foundmore = $false
$stream.ReadTimeout = 1000
do
{
try
{
$read = $stream.Read($buffer, 0, 1024)
if($read -gt 0)
{
$foundmore = $true
$outputBuffer += ($encoding.GetString($buffer, 0, $read))
}
} catch { $foundMore = $false; $read = 0 }
} while($read -gt 0)
} while($foundmore)
$gotdat = $outputbuffer.Contains($currentdat)
$target = $path + $currentdat
#
# Downloads DATs and cleans up old DAT file. Returns status of the operation.
# Return 1 = success
# Return 2 = Latest DAT not found and 4pm or later
# Return 3 = DAT available but did not download or is 0 bytes
# Return 4 = Latest DAT not found and before 4pm
#
$success = 0
if ($gotdat -eq "True")
{
$ftpfile = $ftpsite + $ftppath + $currentdat
write-host $ftpfile
write-host $target
$ftpclient = New-Object system.Net.WebClient
$uri = New-Object System.Uri($ftpfile)
$ftpclient.DownloadFile($uri, $target)
Start-Sleep -s 30
if ( ((get-date) - (ls $target).LastWriteTime).days -ge 1)
{
$success = 3
}
else
{
$testlength = (get-item $target).length
if( (get-item $target).length -gt 0)
{
Remove-Item "$olddatfull"
$success = 1
}
else
{
$success = 3
}
}
}
else
{
$thetime = Get-Date
$thetime = $thetime.Hour
if ($thetime -ge 16)
{
$success = 2
}
else
{
$success = 4
exit
}
}
#
# If successful download (success = 1) run push bat
#
if ($success -eq 1)
{
Start-Process "cmd.exe" "/c c:\scripts\mcafeepush.bat"
}
#Email structure
#
#Sends result email based on previous determination
#
#SMTP server name
$smtpServer = "emailserver.domain.com"
#Creating a Mail object
$msg = new-object Net.Mail.MailMessage
#Creating SMTP server object
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg.From = "email1#domain.com"
$msg.ReplyTo = "email2#domain.com"
$msg.To.Add("email2#domain.com")
switch ($success)
{
1 {
$msg.subject = "McAfee Dats $currentdat successful"
$msg.body = ("DAT download completed successfully. Automaton v1.0")
}
2 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("Looking for DAT $currentdat on the FTP site but I coud not find it. Human intervention may be required. Automaton v1.0")
}
3 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("$currentdat is available for download but download has failed. Human intervention will be required. Automaton v1.0")
}
default {
$msg.subject = "DAT Automaton Error"
$msg.body = ("Something broke with the McAfee automation script. Human intervention will be required. Automaton v1.0")
}
}
#Sending email
$smtp.Send($msg)
#Needed to keep the program from exiting too fast.
Start-Sleep -s 30
#debugging stuff
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
Apparently you have an error starting PowerShell, either because the execution policy is different on the PowerShell version you start, or on the account, or there is an access error on the scheduled task. To gather the actual error, you can launch the task like so:
cmd /c "powershell.exe -file d:\datscript\myscript.ps1 test.txt 2>&1" >c:\windows\temp\test.log 2>&1
This way, if there is an error starting PowerShell, it will be logged in the c:\windows\temp\test.log file. If the issue is the execution policy, you can create and run (once) a task with the following:
powershell -command "Get-ExecutionPolicy -List | out-file c:/windows/temp/policy.txt; Set-ExecutionPolicy RemoteSigned -Scope LocalMachine -Force"
Running that task under the account you plan to use for your main task will first record the policies in effect (so that if setting the machine-level policy doesn't help, you'll know which scope to alter) and then set the machine-level policy to "RemoteSigned", the least restrictive level short of allowing every script (which is highly not recommended; there are encoder scripts written in PowerShell that can ruin your data).
Hope this helps.
UPDATE: If it's not the policy, there might be an error in how the parameters for the task are written. You can do this: create a .bat file with the string that launches your script and redirects output to, say, test1.txt, then change the scheduled task to run cmd.exe /c launcher.bat >test2.txt, properly specifying the home folder. Run the task and review both files; at least one of them should contain an error that prevents your script from launching.
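If the script does start but fails somewhere inside, another option (my suggestion, not part of the answer above; the paths are examples) is to have the script log its own execution with a transcript and a top-level try/catch at the beginning of myscript.ps1:
# Record everything the session writes to a file the task account can reach
Start-Transcript -Path "d:\datscript\transcript.log"
try {
    # ... existing script body ...
}
catch {
    # Dump the terminating error before rethrowing it
    $_ | Out-File "d:\datscript\lasterror.txt" -Append
    throw
}
finally {
    Stop-Transcript
}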

PowerShell - redirect executable's stderr to file or variable but still have stdout go to console

I'm writing a script to download several repositories from GitHub. Here is the command to download a repository:
git clone "$RepositoryUrl" "$localRepoDirectory"
When I run this command it displays some nice progress information in the console window that I want displayed.
The problem is that I also want to be able to detect if any errors have occurred while downloading. I found this post that talks about redirecting the various streams, so I tried:
(git clone "$RepositoryUrl" "$localRepoDirectory") 2> $errorLogFilePath
This pipes any errors from stderr to my file, but no longer displays the nice progress information in the console.
I can use the Tee-Object like so:
(git clone "$RepositoryUrl" "$localRepoDirectory") | Tee-Object -FilePath $errorLogFilePath
and I still get the nice progress output, but this pipes stdout to the file, not stderr; I'm only concerned with detecting errors.
Is there a way that I can store any errors that occur in a file or (preferably) a variable, while also still having the progress information piped to the console window? I have a feeling that the answer might lie in redirecting various streams into other streams as discusses in this post, but I'm not really sure.
======== Update =======
I'm not sure if the git.exe is different than your typical executable, but I've done some more testing and here is what I've found:
$output = (git clone "$RepositoryUrl" "$localRepoDirectory")
$output always contains the text "Cloning into '[localRepoDirectory]'...", whether the command completed successfully or produced an error. Also, the progress information is still written to the console when doing this. This leads me to think that the progress information is not written via stdout, but by some other stream?
If an error occurs the error is written to the console, but in the usual white foreground color, not the typical red for errors and yellow for warnings. When this is called from within a cmdlet function and the command fails with an error, the error is NOT returned via the function's -ErrorVariable (or -WarningVariable) parameter (however if I do my own Write-Error that does get returned via -ErrorVariable). This leads me to think that git.exe doesn't write to stderr, but when we do:
(git clone "$RepositoryUrl" "$localRepoDirectory") 2> $errorLogFilePath
The error message is written to the file, so that makes me think that it does write to stderr. So now I'm confused...
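One quick way to see which stream each line actually arrives on (a diagnostic sketch, my addition; note that redirecting stderr like this suppresses the live progress display while you test):
# With 2>&1, lines git wrote to stderr arrive as ErrorRecord objects,
# while stdout lines arrive as plain strings.
$all = git clone "$RepositoryUrl" "$localRepoDirectory" 2>&1
$all | ForEach-Object {
    if ($_ -is [System.Management.Automation.ErrorRecord]) { "stderr: $_" }
    else { "stdout: $_" }
}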
======== Update 2 =======
So with Byron's help I've tried a couple more solutions using a new process, but I still can't get what I want. When using a new process I never get the nice progress written to the console.
The three new methods I've tried all share this bit of code:
$process = New-Object System.Diagnostics.Process
$process.StartInfo.Arguments = "clone ""$RepositoryUrl"" ""$localRepoDirectory"""
$process.StartInfo.UseShellExecute = $false
$process.StartInfo.RedirectStandardOutput = $true
$process.StartInfo.RedirectStandardError = $true
$process.StartInfo.CreateNoWindow = $true
$process.StartInfo.WorkingDirectory = $WORKING_DIRECTORY
$process.StartInfo.FileName = "git"
Method 1 - Run in new process and read output afterwards:
$process.Start()
$process.WaitForExit()
Write-Host Output - $process.StandardOutput.ReadToEnd()
Write-Host Errors - $process.StandardError.ReadToEnd()
Method 2 - Get output synchronously:
$process.Start()
while (!$process.HasExited)
{
    Write-Host Output - $process.StandardOutput.ReadToEnd()
    Write-Host Error Output - $process.StandardError.ReadToEnd()
    Start-Sleep -Seconds 1
}
Even though this looks like it would write the output while the process is running, it doesn't write anything until after the process exits.
Method 3 - Get output asynchronously:
Register-ObjectEvent -InputObject $process -EventName "OutputDataReceived" -Action {Write-Host Output Data - $args[1].Data }
Register-ObjectEvent -InputObject $process -EventName "ErrorDataReceived" -Action { Write-Host Error Data - $args[1].Data }
$process.Start()
$process.BeginOutputReadLine()
$process.BeginErrorReadLine()
while (!$process.HasExited)
{
    Start-Sleep -Seconds 1
}
This does output data while the process is working which is good, but it still doesn't display the nice progress information :(
I think I have your answer. I've been working with PowerShell for a while and have created several build systems. Sorry the script is a bit long, but it works.
$dir = <your dir>
$global:log = <your log file which must be in the global scope> # Not global = won't work
function Create-Process {
    $process = New-Object -TypeName System.Diagnostics.Process
    $process.StartInfo.CreateNoWindow = $false
    $process.StartInfo.RedirectStandardError = $true
    $process.StartInfo.UseShellExecute = $false
    return $process
}
function Terminate-Process {
    param([System.Diagnostics.Process]$process)
    $code = $process.ExitCode
    $process.Close()
    $process.Dispose()
    Remove-Variable process
    return $code
}
function Launch-Process {
    param([System.Diagnostics.Process]$process, [string]$log, [int]$timeout = 0)
    $errorjob = Register-ObjectEvent -InputObject $process -EventName ErrorDataReceived -SourceIdentifier Common.LaunchProcess.Error -Action {
        if (-not [string]::IsNullOrEmpty($EventArgs.data)) {
            "ERROR - $($EventArgs.data)" | Out-File $log -Encoding ASCII -Append
            Write-Host "ERROR - $($EventArgs.data)"
        }
    }
    $outputjob = Register-ObjectEvent -InputObject $process -EventName OutputDataReceived -SourceIdentifier Common.LaunchProcess.Output -Action {
        if (-not [string]::IsNullOrEmpty($EventArgs.data)) {
            "Out - $($EventArgs.data)" | Out-File $log -Encoding ASCII -Append
            Write-Host "Out - $($EventArgs.data)"
        }
    }
    if ($errorjob -eq $null) {
        "ERROR - The error job is null" | Out-File $log -Encoding ASCII -Append
        Write-Host "ERROR - The error job is null"
    }
    if ($outputjob -eq $null) {
        "ERROR - The output job is null" | Out-File $log -Encoding ASCII -Append
        Write-Host "ERROR - The output job is null"
    }
    $process.Start()
    $process.BeginErrorReadLine()
    if ($process.StartInfo.RedirectStandardOutput) {
        $process.BeginOutputReadLine()
    }
    $ret = $null
    if ($timeout -eq 0)
    {
        $process.WaitForExit()
        $ret = $true
    }
    else
    {
        if (-not ($process.WaitForExit($timeout)))
        {
            Write-Host "ERROR - The process is not completed, after the specified timeout: $($timeout)"
            $ret = $false
        }
        else
        {
            $ret = $true
        }
    }
    # Cancel the event registrations
    Remove-Event * -ErrorAction SilentlyContinue
    Unregister-Event -SourceIdentifier Common.LaunchProcess.Error
    Unregister-Event -SourceIdentifier Common.LaunchProcess.Output
    Stop-Job $errorjob.Id
    Remove-Job $errorjob.Id
    Stop-Job $outputjob.Id
    Remove-Job $outputjob.Id
    $ret
}
$repo = <your repo>
$process = Create-Process
$process.StartInfo.RedirectStandardOutput = $true
$process.StartInfo.FileName = "git.exe"
$process.StartInfo.Arguments = "clone $($repo)"
$process.StartInfo.WorkingDirectory = $dir
Launch-Process $process $global:log
Terminate-Process $process
The log file must be in the global scope because the routine which runs the event processing is not in the script scope.
Sample of my log file:
Out - Cloning into ''...
ERROR - Checking out files: 22% (666/2971)
ERROR - Checking out files: 23% (684/2971)
ERROR - Checking out files: 24% (714/2971)
You can do this by putting the git clone command inside an advanced function e.g.:
function Clone-Git {
    [CmdletBinding()]
    param($repoUrl, $localRepoDir)
    git clone $repoUrl $localRepoDir
}
Clone-Git $RepositoryUrl $localRepoDirectory -ev cloneErrors
$cloneErrors
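A small follow-up sketch (my addition, not part of the answer) showing how the captured errors might then be checked and written to the log file from the question:
# -ev is shorthand for -ErrorVariable; $cloneErrors collects whatever
# PowerShell surfaced as errors while the function ran.
if ($cloneErrors.Count -gt 0) {
    $cloneErrors | Out-File -FilePath $errorLogFilePath -Append
    Write-Warning "git clone reported $($cloneErrors.Count) error line(s)."
}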
If you use System.Diagnostics.Process to start Git, you can redirect all the error and output.
I just had to solve this problem for Inkscape:
$si = New-Object System.Diagnostics.ProcessStartInfo
$si.Arguments = YOUR PROCESS ARGS
$si.UseShellExecute = $false
$si.RedirectStandardOutput = $true
$si.RedirectStandardError = $true
$si.WorkingDirectory = $workingDir
$si.FileName = EXECUTABLE LOCATION
$process = [Diagnostics.Process]::Start($si)
while (!($process.HasExited))
{
    # Do what you want with stderr and stdout
    Start-Sleep -s 1 # Sleep for 1 second
}
You can, of course, wrap this in a function with proper arguments...
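For reference, here is a hedged sketch of such a wrapper, assuming you only need stderr captured and are happy to let stdout flow straight to the console (the function name and parameters are mine, not from the answer above):
function Invoke-WithStdErrCapture {
    param(
        [string]$FilePath,                          # executable to run, e.g. 'git'
        [string]$Arguments,                         # argument string for the executable
        [string]$WorkingDir = (Get-Location).Path   # defaults to the current location
    )
    $si = New-Object System.Diagnostics.ProcessStartInfo
    $si.FileName = $FilePath
    $si.Arguments = $Arguments
    $si.WorkingDirectory = $WorkingDir
    $si.UseShellExecute = $false
    $si.RedirectStandardError = $true    # capture stderr...
    # ...but leave RedirectStandardOutput at its default ($false) so stdout still reaches the console

    $process = [System.Diagnostics.Process]::Start($si)
    # Read stderr to the end before waiting, so a full pipe buffer cannot stall the child
    $stderr = $process.StandardError.ReadToEnd()
    $process.WaitForExit()

    [pscustomobject]@{
        ExitCode = $process.ExitCode
        StdErr   = $stderr
    }
}

# Hypothetical usage:
# $result = Invoke-WithStdErrCapture -FilePath 'git' -Arguments "clone $RepositoryUrl $localRepoDirectory"
# if ($result.ExitCode -ne 0) { $result.StdErr | Out-File $errorLogFilePath }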