I have a PowerShell script that downloads some XML data over TCP. I need all of the data written to a file, with each line of the response written to a new line in the file. The code below only gets the first line. (Ignore the extra ReadLine calls; the first few lines are garbage data I don't need.) How can I keep writing each line of the response until there are none left? I couldn't find anything else on this.
$server = "192.168.1.173"
$port = "45678"
$password = "password"
while (1) {
    $tcpConnection = New-Object System.Net.Sockets.TcpClient($server, $port)
    $tcpStream = $tcpConnection.GetStream()
    $reader = New-Object System.IO.StreamReader($tcpStream)
    $writer = New-Object System.IO.StreamWriter($tcpStream)
    $writer.AutoFlush = $true
    if ($tcpConnection.Connected) {
        $writer.Write("<StageDisplayLogin>");
        $writer.Write($password);
        $writer.WriteLine("</StageDisplayLogin>");
        $ProPresenterData = $reader.ReadLine()
        $ProPresenterData = $reader.ReadLine()
        $ProPresenterData = $reader.ReadLine()
        $ProPresenterData | Out-File -Encoding "UTF8" ProPresenter.xml
        Start-Sleep -m 200
        while ($tcpStream.DataAvailable) {
            $ProPresenterData = $reader.ReadLine()
            Add-Content -Path ProPresenter.xml -Value "$ProPresenterData"
        }
    }
    $reader.Close()
    $writer.Close()
    $tcpConnection.Close()
    "Wrote file ProPresenter.xml"
    $ProPresenterData
    Start-Sleep -m 500
}
I can only partially answer your problem. The reason you only get one line is your while loop: $tcpStream.DataAvailable only tells you whether data happens to be buffered at that instant, so there is no trigger for the loop to go back to the beginning once the buffer is momentarily empty, and it may run just one time.
With a foreach you can loop through all the lines in your file:
$file = Get-Content your.xml
foreach ($line in $file) {
    $line # this prints each line, one after the other
}
Alternatively, you can use a for-loop if you can determine the length of the file in advance.
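A minimal sketch, reusing the $file array from the snippet above:
for ($i = 0; $i -lt $file.Count; $i++) {
    $file[$i] # prints line number $i
}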
Having said that, is it an option for you to first fully download the data and then write it into a different file? Or is it a file that is constantly renewed, with lines added at a certain interval?
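If the server closes the connection once it has sent everything, you could also read until the stream ends instead of polling DataAvailable. A minimal sketch, reusing the $reader from your code (caveat: ReadLine() blocks, so this only terminates if the server actually closes the connection):
# Read every remaining line; ReadLine() returns $null once the remote end closes the stream.
while (($ProPresenterData = $reader.ReadLine()) -ne $null) {
    Add-Content -Path ProPresenter.xml -Value $ProPresenterData
}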
I created a tool (to be precise: a PowerShell script) that helps me with converting pictures in folders, i.e. it looks for all files with a certain extension (say, *.TIF) and converts them to JPEGs via ImageMagick. It then transfers some EXIF, IPTC and XMP information from the source image to the JPEG via exiftool:
# searching files (done before converting the files, so just listed for reproduction):
$WorkingFiles = @(Get-ChildItem -Path D:\MyPictures\Testfiles -Filter *.tif | ForEach-Object {
    [PSCustomObject]@{
        SourceFullName = $_.FullName
        JPEGFullName   = $_.FullName -Replace 'tif$','jpg'
    }
})
# Then, converting is done. PowerShell will wait until every jpeg is successfully created.
# + + + + The problem occurs somewhere after this line + + + +
# Creating the exiftool process:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = ".\exiftool.exe"
$psi.Arguments = "-stay_open True -charset utf8 -@ -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$exiftoolproc = [System.Diagnostics.Process]::Start($psi)
# creating the string argument for every file, then pass it over to exiftool:
for ($i = 0; $i -lt $WorkingFiles.length; $i++) {
    [string]$ArgList = "-All:all=`n-charset`nfilename=utf8`n-tagsFromFile`n$($WorkingFiles[$i].SourceFullName)`n-EXIF:All`n-charset`nfilename=utf8`n$($WorkingFiles[$i].JPEGFullName)"
    # using -overwrite_original makes no difference
    # Also, just as good as above code:
    # [string]$ArgList = "-All:All=`n-EXIF:XResolution=300`n-EXIF:YResolution=300`n-charset`nfilename=utf8`n-overwrite_original`n$($WorkingFiles[$i].JPEGFullName)"
    $exiftoolproc.StandardInput.WriteLine("$ArgList`n-execute`n")
    # no difference using start-sleep:
    # Start-Sleep -Milliseconds 25
}
# close exiftool:
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
# read StandardError and StandardOutput of exiftool, then print it:
[array]$outputerror = @($exiftoolproc.StandardError.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[string]$outputout = $exiftoolproc.StandardOutput.ReadToEnd()
$outputout = $outputout -replace '========\ ','' -replace '\[1/1]','' -replace '\ \r\n\ \ \ \ '," - " -replace '{ready}\r\n',''
[array]$outputout = @($outputout.Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
Write-Output "Errors:"
foreach ($i in $outputerror) {
    Write-Output $i
}
Write-Output "Standard output:"
foreach ($i in $outputout) {
    Write-Output $i
}
If you want to reproduce but do not have/want that many files, there is also a simpler way: let exiftool print out its version number 600 times:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = ".\exiftool.exe"
$psi.Arguments = "-stay_open True -charset utf8 -@ -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$exiftoolproc = [System.Diagnostics.Process]::Start($psi)
for ($i = 0; $i -lt 600; $i++) {
    try {
        $exiftoolproc.StandardInput.WriteLine("-ver`n-execute`n")
        Write-Output "Success:`t$i"
    } catch {
        Write-Output "Failed:`t$i"
    }
}
# close exiftool:
try {
    $exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
} catch {
    Write-Output "Could not close exiftool!"
}
[array]$outputerror = @($exiftoolproc.StandardError.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[array]$outputout = @($exiftoolproc.StandardOutput.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
Write-Output "Errors:"
foreach ($i in $outputerror) {
    Write-Output $i
}
Write-Output "Standard output:"
foreach ($i in $outputout) {
    Write-Output $i
}
As far as I could test, it all goes well as long as you stay below 115 files. If you go above that, the 114th JPEG gets proper metadata, but exiftool stops working after that one; it idles, and my script does, too. I can reproduce this with different files, paths, and exiftool commands.
Neither StandardOutput nor StandardError shows any irregularities, even with exiftool's -verbose flag; then again, they would not, as I have to kill exiftool to get them to show up at all.
Running ISE's / VSCode's debugger shows nothing. Exiftool's window (only showing up when debugging) shows nothing.
Is there some hard limit on commands run with System.Diagnostics.Process, is this a problem with exiftool, or is this simply due to my incompetence with anything beyond the most basic PowerShell cmdlets? Or maybe the better question: how can I properly debug this?
Powershell is 5.1, exiftool is 10.80 (production) - 10.94 (latest).
After messing around with different variants of $ArgList, I found that the specific file commands make no difference, but commands that produce less StdOut (like -ver) allowed more iterations. Therefore, I took an educated guess that the output buffer was the culprit.
As per Mark Byers' answer to "ProcessStartInfo hanging on 'WaitForExit'? Why?":
The problem is that if you redirect StandardOutput and/or StandardError the internal buffer can become full. [...]
The solution is to use asynchronous reads to ensure that the buffer doesn't get full.
Then, it was just a matter of searching for the right things. I found that Alexander Obersht's answer to "How to capture process output asynchronously in powershell?" provides almost everything that I needed.
The script now looks like this:
# searching files (done before converting the files, so just listed for reproduction):
$WorkingFiles = @(Get-ChildItem -Path D:\MyPictures\Testfiles -Filter *.tif | ForEach-Object {
    [PSCustomObject]@{
        SourceFullName = $_.FullName
        JPEGFullName   = $_.FullName -Replace 'tif$','jpg'
    }
})
# Then, converting is done. PowerShell will wait until every jpeg is successfully created.
# Creating the exiftool process:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = ".\exiftool.exe"
$psi.Arguments = "-stay_open True -charset utf8 -@ -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
# + + + + NEW STUFF (1/2) HERE: + + + +
# Creating process object.
$exiftoolproc = New-Object -TypeName System.Diagnostics.Process
$exiftoolproc.StartInfo = $psi
# Creating string builders to store stdout and stderr.
$exiftoolStdOutBuilder = New-Object -TypeName System.Text.StringBuilder
$exiftoolStdErrBuilder = New-Object -TypeName System.Text.StringBuilder
# Adding event handlers for stdout and stderr.
$exiftoolScriptBlock = {
    if (-not [String]::IsNullOrEmpty($EventArgs.Data)) {
        $Event.MessageData.AppendLine($EventArgs.Data)
    }
}
$exiftoolStdOutEvent = Register-ObjectEvent -InputObject $exiftoolproc -Action $exiftoolScriptBlock -EventName 'OutputDataReceived' -MessageData $exiftoolStdOutBuilder
$exiftoolStdErrEvent = Register-ObjectEvent -InputObject $exiftoolproc -Action $exiftoolScriptBlock -EventName 'ErrorDataReceived' -MessageData $exiftoolStdErrBuilder
[Void]$exiftoolproc.Start()
$exiftoolproc.BeginOutputReadLine()
$exiftoolproc.BeginErrorReadLine()
# + + + + END OF NEW STUFF (1/2) + + + +
# creating the string argument for every file, then pass it over to exiftool:
for ($i = 0; $i -lt $WorkingFiles.length; $i++) {
    [string]$ArgList = "-All:all=`n-charset`nfilename=utf8`n-tagsFromFile`n$($WorkingFiles[$i].SourceFullName)`n-EXIF:All`n-charset`nfilename=utf8`n$($WorkingFiles[$i].JPEGFullName)"
    # using -overwrite_original makes no difference
    # Also, just as good as above code:
    # [string]$ArgList = "-All:All=`n-EXIF:XResolution=300`n-EXIF:YResolution=300`n-charset`nfilename=utf8`n-overwrite_original`n$($WorkingFiles[$i].JPEGFullName)"
    $exiftoolproc.StandardInput.WriteLine("$ArgList`n-execute`n")
}
# + + + + NEW STUFF (2/2) HERE: + + + +
# close exiftool:
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
$exiftoolproc.WaitForExit()
# Unregistering events to retrieve process output.
Unregister-Event -SourceIdentifier $exiftoolStdOutEvent.Name
Unregister-Event -SourceIdentifier $exiftoolStdErrEvent.Name
# read StandardError and StandardOutput of exiftool, then print it:
[array]$outputerror = @($exiftoolStdErrBuilder.ToString().Trim().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[string]$outputout = $exiftoolStdOutBuilder.ToString().Trim() -replace '========\ ','' -replace '\[1/1]','' -replace '\ \r\n\ \ \ \ '," - " -replace '{ready}\r\n',''
[array]$outputout = @($outputout.Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
# + + + + END OF NEW STUFF (2/2) + + + +
Write-Output "Errors:"
foreach ($i in $outputerror) {
    Write-Output $i
}
Write-Output "Standard output:"
foreach ($i in $outputout) {
    Write-Output $i
}
I can confirm that it works for many, many files (at least 1600).
I am trying to rewrite an Add-Content script as a StreamWriter version, the reason being that the file is ~140 MB and Add-Content is far too slow.
This is my Add-Content version. It loops through each row; whenever it finds a header row starting with FILE|, it switches to a new output file named after the second pipe-delimited value in that row. The Add-Content version works as intended, but is really slow: it takes 35-40 minutes.
Param(
    [string]$filepath = "\\fileserver01\Transfer",
    [string]$filename = "sourcedata.txt"
)
$Path = $filepath
$InputFile = (Join-Path $Path $filename)
$Reader = New-Object System.IO.StreamReader($InputFile)
while (($Line = $Reader.ReadLine()) -ne $null) {
    if ($Line -match 'FILE\|([^\|]+)') {
        $OutputFile = "$($matches[1]).txt"
    }
    Add-Content (Join-Path $Path $OutputFile) $Line
}
I've researched that StreamWriter should be faster. Here is my attempt, but I get the error
The process cannot access the file '\\fileserver01\Transfer\datafile1.txt' because it is being used by another process.
Param(
    [string]$filepath = "\\fileserver01\Transfer",
    [string]$filename = "sourcedata.txt"
)
$Path = $filepath
$InputFile = (Join-Path $Path $filename)
$Reader = New-Object System.IO.StreamReader($InputFile)
while (($Line = $Reader.ReadLine()) -ne $null) {
    if ($Line -match 'FILE\|([^\|]+)') {
        $OutputFile = "$($matches[1])"
    }
    $sw = New-Object System.IO.StreamWriter (Join-Path $Path $OutputFile)
    $sw.WriteLine($line)
}
I assume it's something to do with using it in my loop.
Sample data:
FILE|datafile1|25/04/17
25044|0001|37339|10380|TT75
25045|0001|37339|10398|TT75
25046|0001|78711|15940|TT75
FILE|datafile2|25/04/17
25047|0001|98745|11263|TT75
25048|0001|96960|13011|TT84
FILE|datafile3|25/04/17
25074|0001|57585|13639|TT84
25075|0001|59036|10495|TT84
FILE|datafile4|25/04/17
25076|0001|75844|13956|TT84
25077|0001|17430|01111|TT84
Desired outcome is one file per FILE| header row, using the second delimited value as the file name.
You're creating the writer inside the while loop without ever closing it, thus your code is trying to re-open the already opened output file with every iteration. Close an existing writer and open a new one whenever your filename changes:
while (($Line = $Reader.ReadLine()) -ne $null) {
    if ($Line -match 'FILE\|([^\|]+)') {
        if ($sw) { $sw.Close(); $sw.Dispose() }
        $sw = New-Object IO.StreamWriter (Join-Path $Path $matches[1])
    }
    $sw.WriteLine($line)
}
if ($sw) { $sw.Close(); $sw.Dispose() }
Note that this assumes that you won't open the same file twice. If the same output file can appear multiple times in the input file you need to open the file for appending. In that case replace
$sw = New-Object IO.StreamWriter (Join-Path $Path $matches[1])
with
$sw = [IO.File]::AppendText((Join-Path $Path $matches[1]))
Note also that the code doesn't do any error handling (e.g. input file doesn't begin with a FILE|... line, input file is empty, etc.). You may want to change that.
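For instance, a minimal guard for data lines that show up before the first FILE| header might look like this (same loop as above; the warning text is just a suggestion):
while (($Line = $Reader.ReadLine()) -ne $null) {
    if ($Line -match 'FILE\|([^\|]+)') {
        if ($sw) { $sw.Close(); $sw.Dispose() }
        $sw = New-Object IO.StreamWriter (Join-Path $Path $matches[1])
    } elseif (-not $sw) {
        # No FILE| header seen yet; skip (or log) the orphaned line
        # instead of failing on WriteLine with a null writer.
        Write-Warning "Skipping line before first FILE| header: $Line"
        continue
    }
    $sw.WriteLine($Line)
}
if ($sw) { $sw.Close(); $sw.Dispose() }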
I want to know if it's bad form to use try blocks to test if a file is locked. Here's the background.
I need to send text output of an application to two serial printers simultaneously. My solution was to use MportMon and a PowerShell script. The way it's supposed to work: the application default-prints to the MportMon virtual printer port, which actually creates a uniquely named file in a "dropbox" folder. The PowerShell script uses a FileSystemWatcher to monitor the folder; when a new file is created, it takes the textual content, pushes it out to two serial printers, then deletes the file so as not to fill up the folder. I was having a problem reading the text from the file that the virtual printer created: I was getting errors because the file was still locked. To fix this, I used an FSM to implement the logic. Instead of checking for a lock every time before attempting to get the content from the file, I used a try block that attempts to read the content; if it fails, the catch block just reaffirms the state the FSM is in, and the process repeats until successful. It seems to work fine, but I've read somewhere that it's bad practice. Is there any danger in this method, or is it safe and reliable? Below is my code.
$fsw = New-Object System.IO.FileSystemWatcher
$q = New-Object System.Collections.Queue
$path = "c:\DropBox"
$fsw.path = $path
$state = "waitforQ"
[string]$tempPath = $null
Register-ObjectEvent -InputObject $fsw -EventName created -Action {
    $q.enqueue( $event.sourceeventargs.fullpath )
}
while ($true) {
    switch ($state) {
        "waitforQ" {
            echo "waitforQ"
            if ($q.count -gt 0) { $state = "retrievefromQ" }
        }
        "retrievefromQ" {
            echo "retrievefromQ"
            $tempPath = $q.dequeue()
            $state = "servicefile"
        }
        "servicefile" {
            echo " in servicefile "
            try {
                $text = Get-Content -ErrorAction stop $tempPath
                #echo "in try"
                $text | out-printer db1
                $text | out-printer db2
                echo " $text "
                $state = "waitforQ"
                rm $tempPath
            } catch {
                #echo "in catch"
                $state = "servicefile"
            }
        }
        Default { $state = "waitforQ" }
    }
}
I wouldn't say it's bad practice to test a file to see if it's locked, but it's not as clean as checking the handles used by other processes. Personally, I'd test the file like you do, but I'd adjust a few parts to make it safer/better.
That switch statement looks way too complicated (to me); I'd replace it with a simple if-test: "if there are files in the queue, proceed; if not, wait".
You need to slow down. As written, you try to read the file as often as possible while it's locked. That's a waste of resources, since it will take some time for the other application to let go of the file and flush the data to disk. Add some pauses; you won't notice them, but your CPU will love them. The same applies when there are no files in the queue.
You might also benefit from a timeout, like a maximum of 50 attempts to read the file, to avoid the script getting stuck if one specific file is never released.
Try:
$fsw = New-Object System.IO.FileSystemWatcher
$q = New-Object System.Collections.Queue
$path = "c:\DropBox"
$fsw.path = $path
$MaxTries = 50   # 50 tries * 0.2 s sleep = 10 s timeout
[string]$tempPath = $null
Register-ObjectEvent -InputObject $fsw -EventName created -Action {
    $q.enqueue( $event.sourceeventargs.fullpath )
}
while ($true) {
    if ($q.Count -gt 0) {
        # Get next file in queue
        $tempPath = $q.dequeue()
        # Read file
        $text = $null
        $i = 0
        while ($text -eq $null) {
            # If locked, wait and try again
            try {
                $text = Get-Content -Path $tempPath -ErrorAction Stop
            } catch {
                $i++
                if ($i -eq $MaxTries) {
                    # Max attempts reached. Stops script
                    Write-Error -Message "Script is stuck on locked file '$tempPath'" -ErrorAction Stop
                } else {
                    # Wait
                    Start-Sleep -Milliseconds 200
                }
            }
        }
        # Print file
        $text | Out-Printer db1
        $text | Out-Printer db2
        echo " $text "
        # Remove temp-file
        Remove-Item $tempPath
    }
    # Relax..
    Start-Sleep -Milliseconds 500
}
Folks,
Googling shows me that lots of folks have this problem; however, the answers I'm finding don't seem to work for me. Either that, or I don't understand them.
Situation: I have a script that polls and gives a file count. It works great, and I pipe its output to a text file:
Foreach ($Directory in $Directories) {
    Write-Output "You have $Results files in that folder" | Out-File "C:\Filecheck.txt" -Append
}
Filecheck.txt looks great. The loop above runs 6 times (as I have 6 directories) and the carriage returns are all there.
In email, it's all jumbled up. On here, someone suggested I use Out-String, so I've done this:
$body = GC "C:\Filecheck.txt" | Out-string
I've also seen
$body = GC "C:\Filecheck.txt" -Raw
I get the email fine, but again, it's still all one line, with no carriage returns.
Anyone have any idea? I know I'm so close.
You could try using the [Environment] newline. I tested with the code below and the e-mail looked good, with the correct line breaks:
$DirectoriesFiles = 2,3,4,5
$newline = [Environment]::NewLine
$body = "List of number of files" + $newline
Foreach ($numOfFiles in $DirectoriesFiles) {
    $body += "You have $numOfFiles files in that folder" + $newline
}
$ol = New-Object -comObject Outlook.Application
$Mail = $ol.CreateItem(0)
$Mail.To = "someone"
$Mail.Subject = "some test e-mail"
$Mail.Body = $body
$Mail.save() #or send
For the example's sake I just assumed you have an array with the number of files per folder, but I think you can see how to adapt this to your context from here. My resulting e-mail looked like this:
List of number of files
You have 2 files in that folder
You have 3 files in that folder
You have 4 files in that folder
You have 5 files in that folder
Thanks for your help. My company email didn't like that format, but when I format in HTML (using <br> tags) and utilize the IsBodyHTML property, it works like a charm!
Del "D:\Filecheck.txt"
$Directories = GC "D:\Directory.txt"
Foreach ($Directory in $Directories) {
    $Results = (Get-ChildItem $Directory).count
    If ($Results -gt 0) {
        Write-Output "...You have $Results files stuck in $Directory...<br><br> " | Out-File "D:\Filecheck.txt" -Append
    } else {
        Write-Output "Phew! We're good, <br><br>" | Out-File "D:\Filecheck.txt" -Append
    }
    $Results = $null
}
$body = GC "D:\Filecheck.txt"
Add-PSSnapin Microsoft.Exchange.Management.Powershell.Admin -erroraction silentlyContinue
$SmtpClient = new-object system.net.mail.smtpClient
$SmtpServer = "localhost"
$SmtpClient.host = "relay.me.local"
$msg = new-object Net.Mail.MailMessage
$msg.IsBodyHTML = $true
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg.From = "TelluRyesFileCheck#me.you"
$msg.To.Add("me#you.org")
$msg.Subject = "Checking if files exist on 9901/2"
$msg.Body = $body
$SmtpClient.Send($msg)
I have a condition that kicks off a PowerShell script to append a short string to a text file. This condition can fire rapidly, so the file is being written multiple times by the same script. Additionally, a separate script is importing from that text file in batches (less frequently).
Whenever the condition fires very rapidly, I get the error: "The process cannot access the file 'file_name' because it is being used by another process." When I do the same append in Python (my main language), I don't get this error, but I could use some help fixing it in PowerShell.
$action = $args[0]
$output_filename = $args[1]
$item = $args[2]
if ($action -eq 'direct') {
    $file_path = $output_filename
    $sw = New-Object -TypeName System.IO.StreamWriter($file_path, "true")
    $sw.WriteLine($item)
    $sw.Close()
}
I have also tried the following instead of StreamWriter, but apparently the performance is weak for Add-Content and Out-File (http://sqlblog.com/blogs/linchi_shea/archive/2010/01/04/add-content-and-out-file-are-not-for-performance.aspx):
Out-File -Append -FilePath $file_path -InputObject $item
Might try something like this:
while ($true) {
    Try {
        [IO.File]::OpenWrite($file_path).close()
        Add-Content -FilePath $file_path -InputObject $item
        Break
    }
    Catch {}
}
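The OpenWrite probe just acquires and immediately releases a write handle, so Add-Content only runs once the file is free; the retry loop covers the small window between the two calls. As written, though, it spins at full speed while the file is locked. A sketch of a gentler variant with a pause and a retry cap (the 100/50 values are arbitrary assumptions, not requirements):
$maxTries = 100
for ($try = 1; $try -le $maxTries; $try++) {
    try {
        [IO.File]::OpenWrite($file_path).Close()   # throws while another process holds the file
        Add-Content -FilePath $file_path -InputObject $item
        break
    } catch {
        if ($try -eq $maxTries) { throw }   # give up instead of looping forever
        Start-Sleep -Milliseconds 50        # give the other writer time to finish
    }
}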