Use an XmlWriter to fill the PowerShell output stream line by line - powershell

Is it possible to "open" the PowerShell pipeline output stream as a System.IO.Stream or as a TextWriter/XmlWriter?
Basically I want to convert some pipeline input strings or objects to XML text using the services of XmlWriter, such as its namespace handling and formatting capabilities. I want to send the result of the conversion to the pipeline output incrementally, rather than building the full output in one big string and sending the whole thing at the end.

If I'm understanding you correctly, won't something as simple as this do what you want?
$XmlWriter = New-Object System.Xml.XmlTextWriter("c:\temp\output.xml", $null)
Get-Content "file_with_input.txt" | ForEach-Object { [do your stuff with the XmlWriter] }
The lines from the input file will arrive one at a time and you can handle them as you please.
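If the output should go to the pipeline rather than to a file, one option is to back the XmlWriter with a StringWriter and flush it after each input object. A minimal sketch (the function and element names here are made up for illustration):

```powershell
function ConvertTo-XmlLines {
    param([Parameter(ValueFromPipeline)]$InputObject)
    begin {
        $sw = New-Object System.IO.StringWriter
        $settings = New-Object System.Xml.XmlWriterSettings
        $settings.Indent = $true
        # Fragment conformance allows several top-level elements instead of one root
        $settings.ConformanceLevel = [System.Xml.ConformanceLevel]::Fragment
        $xw = [System.Xml.XmlWriter]::Create($sw, $settings)
    }
    process {
        $xw.WriteStartElement('item')      # 'item' is an arbitrary example name
        $xw.WriteString([string]$InputObject)
        $xw.WriteEndElement()
        $xw.Flush()
        $sw.ToString()                     # emit the buffered chunk to the pipeline
        [void]$sw.GetStringBuilder().Clear()
    }
    end { $xw.Close() }
}
```

Calling `'a','b' | ConvertTo-XmlLines` then emits one XML chunk per input object, so downstream cmdlets receive the text incrementally instead of one big string at the end.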

Related

Can I pipe PowerShell output to an accelerator?

I've obtained a file path to an XML resource by interrogating task scheduler arguments.
I'd like to pipe these file paths to [xml], to return data using XPath.
Online I see accelerators and variables being used, e.g.
$xml = [XML](Get-Content .\Test.xml)
I tried piping to ConvertTo-Xml, but that produces an XML object containing the file path, so I still need to convert to [xml] - hoping to do this in the pipeline, potentially for more than one XmlDocument.
Is it possible to pipe to [typeaccelerators] ?
Should I be piping to New-Object, or Tee-Variable, as required?
I hope to eventually be able to construct a one-liner to interrogate several nodes (eg LastRan, LastResult)
Currently I have this, which only works for one:
([xml](Get-Content ((Get-ScheduledTask -TaskPath *mytask* | select -First 1).Actions.Arguments | % {$_.Split('"')[-2]}))).MyDocument.LastRan
returns the value of LastRan, from MyDocument node.
Thanks in advance 👍
If you want to take pipeline input, you need to make a function and set the parameter attribute ValueFromPipeline:
Function Convert-XML {
    Param(
        [Parameter(ValueFromPipeline)]$xml
    )
    process {
        [xml]$xml
    }
}
Then you could take the content of an XML file (all at once, not line by line):
Get-Content .\Test.xml -Raw | Convert-XML
Of course to get your one liner you'd probably want to add the logic for that in the function. However this is how you'd handle pipeline input.
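Building on that pattern, the function could accept file paths from the pipeline and pull the node values out itself. A sketch, assuming the `MyDocument`/`LastRan`/`LastResult` node names from the question; `Get-XmlNodeValue` is a made-up name:

```powershell
function Get-XmlNodeValue {
    param(
        [Parameter(ValueFromPipeline)][string]$Path,
        [string[]]$Node = @('LastRan', 'LastResult')
    )
    process {
        $doc = [xml](Get-Content -Raw -LiteralPath $Path)
        foreach ($n in $Node) {
            # '//' searches the whole document for the first matching element
            $found = $doc.SelectSingleNode("//$n")
            if ($found) { $found.InnerText }
        }
    }
}
```

You could then feed it the paths extracted from the scheduled-task arguments, e.g. `$paths | Get-XmlNodeValue -Node LastRan, LastResult`, which covers the more-than-one-document case.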

Is it possible for PowerShell to write messages to multiple targets

Is there a way to use the pipeline to both Write-Output and write to a file at the same time, without using a custom wrapping function?
Take a look at Tee-Object. From help:
The Tee-Object cmdlet sends the output of a command in two directions
(like the letter "T"). It stores the output in a file or variable and
also sends it down the pipeline. If Tee-Object is the last command in
the pipeline, the command output is displayed in the console.
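For example (the file and variable names are arbitrary):

```powershell
# keep a copy in a file while the objects continue down the pipeline
Get-Process | Tee-Object -FilePath .\processes.txt | Select-Object -First 3

# or keep the copy in a variable instead of a file
'hello' | Tee-Object -Variable copy
$copy   # contains 'hello' as well
```

Note that `-Variable` takes the variable name without the `$` sigil.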

Handling a CSV with line feed characters in a column in PowerShell

Currently I have a system which creates a delimited file like the one below, in which I've mocked up the extra line feeds that appear sporadically within the columns.
Column1,Column2,Column3,Column4
Text1,Text2[LF],text3[LF],text4[CR][LF]
Text1,Text2[LF][LF],text3,text4[CR][LF]
Text1,Text2,text3[LF][LF],text4[CR][LF]
Text1,Text2,text3[LF],text4[LF][LF][CR][LF]
I've been able to remove the offending line feeds in Notepad++, using the following regex to ignore the valid carriage return/line feed combinations:
(?<![\r])[\n]
I am unable, however, to find a solution using PowerShell. I think that when I Get-Content the CSV file, the line feeds within the text fields are ignored and each value is stored as a separate object in the variable assigned from Get-Content. My question is: how can I apply the regex with a replace if the cmdlet ignores the line feeds when loading the data?
I've also tried the method below to load the content of my CSV, which doesn't work either, as it just results in one long string - similar to using -join (Get-Content).
[STRING]$test = [io.file]::ReadAllLines('C:\CONV\DataOutput.csv')
$test.replace("(?<![\r])[\n]","")
$test | out-file .\DataOutput_2.csv
Nearly there; may I suggest just three changes:
use ReadAllText(…) instead of ReadAllLines(…)
use -replace … instead of .Replace(…), only then will the first argument be treated as a regex
do something with the replacement result (e.g. assign it back to $test)
Sample code:
[STRING]$test = [io.file]::ReadAllText('C:\CONV\DataOutput.csv')
$test = $test -replace '(?<![\r])[\n]',''
$test | out-file .\DataOutput_2.csv
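A quick way to convince yourself the pattern does the right thing, using an in-memory sample where `` `n `` is a bare LF and `` `r`n `` a real CRLF record terminator:

```powershell
# only the bare LF inside the field should be removed; the CRLF pairs survive
$raw   = "Text1,Text2`nmore,text3`r`nText1,Text2,text3`r`n"
$clean = $raw -replace '(?<![\r])[\n]', ''
$clean -eq "Text1,Text2more,text3`r`nText1,Text2,text3`r`n"   # True
```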

Powershell script to extract logged errors

I have a lot of log files from which I wish to extract the distinct error messages for a specific trace writer.
The log files are SharePoint ULS logs.
The headings are:
Timestamp
Process
TID
Area
Category
EventID
Level
Message
Correlation
So, given a specific process name, I want all distinct Messages.
If I were to use SQL, I would write something like this:
select Distinct Message from where Process like 'myprocessname'
I'd like to do this with PowerShell across a whole set of log files.
I believe the ULS log is tab or space delimited.
You might be interested in Microsoft's Log Parser, which essentially lets you run SQL-like statements across a set of log files. You can also use it with PowerShell. Here are some links:
Analyze Web Stats with Log Parser
Integrating Microsoft Log Parser in Windows Powershell
Logparser and Powershell
Assuming the log file isn't too huge, you can read the contents in using Import-Csv like so:
$data = Import-Csv .\log.csv -Delimiter "`t"
I'm assuming the delimiter is tab, since any message is likely to contain spaces. Once you have the log data, you can use the standard PowerShell query operators like so:
$data | Where {$_.Process -eq 'processname.exe'} | Select Message -Unique
If the log file is huge (such that Import-Csv eats up too much memory) then I would either try using Log Parser or use a regex and parse the log, one line at a time.
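For the huge-file case, a line-at-a-time sketch (assuming tab-delimited lines, with Process in the second column and Message in the eighth per the heading list above; the function name is made up):

```powershell
function Get-DistinctMessage {
    param([string]$Path, [string]$Process)
    $seen = New-Object 'System.Collections.Generic.HashSet[string]'
    # switch -File streams the file one line at a time, so memory use stays flat
    switch -File $Path {
        default {
            $cols = $_ -split "`t"
            # column 2 is Process, column 8 is Message (see the heading list)
            if ($cols[1] -like "*$Process*" -and $seen.Add($cols[7])) {
                $cols[7]   # emit each message only the first time it appears
            }
        }
    }
}
```

To cover a whole set of logs you could wrap it, e.g. `Get-ChildItem *.log | ForEach-Object { Get-DistinctMessage $_.FullName 'myprocessname' } | Sort-Object -Unique`.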

Output from external exe and my custom objects in powershell

(Sorry for the strange title, I haven't come up with anything better.)
Background
I use nunit-console to test my assemblies. It is called like this (simplified):
function Test-ByNunit {
param($assembly, $tempFile = 'c:\temp\nunit.xml')
& <path-to-nunit-console> $assembly /nologo /xml:$tempFile #othparam
}
Test-ByNunit c:\temp\myAssembly.dll
I have no problem with this, it works fine.
Problem
nunit-console should output its messages as it does now. That means that if not captured, they go to the screen; otherwise they could be stored in a file (Test-ByNunit $dll | Set-Content path).
I'd like to somehow return information about each test case that was run (the info is stored in the /xml file) in the form of an array of PSObject objects.
Question
Do you have any tip on how to return the info and still let nunit output its messages?
If I simply write it to the output, the function will return an array of strings (the output from nunit-console) mixed with the array of my objects. Redirecting to an output file would then store my objects as well, whereas I'd like to just display them in the console window.
The only possibility that could work is to use [ref], but I'd like to avoid it.
(this is not only about nunit-console, but of course it is general question)
If I got the task right, then Out-Host should help:
function Get-WithOutHost {
    # external output is redirected to the host
    cmd /c dir | Out-Host
    # normal output to be reused later
    Get-Process
}
# call
$result = Get-WithOutHost
# now $result holds the data to use, external output is on the screen
EDIT: of course this is not enough if the external output should be reused too, not just shown.
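If the external text does need to be reused as well, one option is to capture it first, echo it with Out-Host, and return both pieces in a single object. A sketch, with `hostname` standing in for the real external exe and `Invoke-WithLog` a made-up name:

```powershell
function Invoke-WithLog {
    # 'hostname' stands in for the real external exe (nunit-console, etc.)
    $external = hostname            # capture the text output as strings
    $external | Out-Host            # still show it, without polluting the output
    [pscustomobject]@{              # hand back both pieces as one object
        ExternalOutput = $external
        Data           = Get-Process
    }
}

$r = Invoke-WithLog
$r.Data | Select-Object -First 3   # the objects, ready for further piping
$r.ExternalOutput                  # the captured external text, reusable
```

The caller then decides what to do with each half, so redirecting `$r.Data` to a file no longer drags the nunit text along with it.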