Log and read only last 10 minutes - powershell

I am a beginner in PowerShell and I have a problem with a script.
I have a log file and I need to send an email notification when an error occurs. I would like to schedule a task (Task Scheduler) that runs my script every ten minutes. The script should check the lines written during the last ten minutes; if the word ERROR is found, it should send an e-mail containing the line(s) where ERROR appears.
My log:
2022-02-08 12:04:35,152 [105] ERROR RSeC.NET.RedundantHttpClient - No server found
2022-02-08 14:28:51,317 [4] DEBUG RSeC.NET.RSeC - Logging initialised
2022-02-08 14:28:53,835 [4] DEBUG RSeC.NET.JsonParser - Response binary data decoded. Size=424132
2022-02-08 14:29:20,494 [105] DEBUG RSeC.NET.RSeC - Logging initialised
2022-02-08 15:38:35,152 [105] ERROR RSeC.NET.RedundantHttpClient - No server found
2022-03-08 15:28:51,317 [4] DEBUG RSeC.NET.RSeC - Logging initialised
2022-03-08 15:28:53,835 [4] DEBUG RSeC.NET.JsonParser - Response binary data decoded. Size=424132
2022-03-08 15:39:20,494 [105] DEBUG RSeC.NET.RSeC - Logging initialised
2022-03-08 15:39:35,152 [105] ERROR RSeC.NET.RedundantHttpClient - No server found
My script :-(
$file = "C:\Soubory\esel.log"
$date = (Get-Date).AddDays(-50).Date
$cont = Get-Content -Path $file | Select-String -Pattern $date | Select-String "ERROR" | Measure-Object -line
Foreach-Object {
if ($cont -match "ERROR")
{
$kontent = Get-Content -Path $file | Select-String -Pattern $date | Select-String "ERROR" | Measure-Object -line
Write-Host $cont
}
else
{
#NOTING
}
}
Thank You for help
GILD

I think Lee_Dailey gave you the answer in his comment.
Simply figure out the worst-case number of lines you need to read from the bottom of the file to be certain the last ten minutes are in there.
Then do:
$maxLines = 20 # just a guess here, but you can narrow the number of lines to read by trial and error
$lastTenMinutes = (Get-Date).AddMinutes(-10)
$errorLines = Get-Content -Path 'C:\Soubory\esel.log' -Tail $maxLines |
              Where-Object { [datetime]($_ -split ',')[0] -gt $lastTenMinutes -and $_ -match 'ERROR' }
# test if there were error lines found
if (@($errorLines).Count) {
    # send your email alert.
    # If this email is in HTML format, use: $errorLines -join '<br>'
    # if the email is plain text, join with newlines: $errorLines -join [environment]::NewLine
    # for demo just output to console
    Write-Host ("Errors found:`r`n{0}" -f ($errorLines -join [environment]::NewLine))
}
else {
    Write-Host "No error lines found" -ForegroundColor Green
}
On my Dutch locale, [datetime]($_ -split ',')[0] parses the date correctly, but on your machine you may have to use [datetime]::ParseExact(($_ -split ',')[0], 'yyyy-MM-dd HH:mm:ss', $null)
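For the e-mail itself, a minimal sketch of the alert step referred to in the comments above could use the built-in Send-MailMessage cmdlet; the SMTP server and addresses below are placeholders, not values from the question:
# Hedged sketch only - replace the server and addresses with your own
if (@($errorLines).Count) {
    Send-MailMessage -SmtpServer 'smtp.example.com' -From 'alerts@example.com' -To 'ops@example.com' `
        -Subject 'ERROR entries found in esel.log' `
        -Body ($errorLines -join [environment]::NewLine)
}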

It looks like a CSV without a header, and I can compare the first column as a date and time, so:
import-csv log -header time,message |
  where { (get-date).AddMinutes(-10) -lt $_.time -and
          $_.message -match 'error' }
time                message
----                -------
2022-03-09 10:11:35 152 [105] ERROR RSeC.NET.RedundantHttpClient - No server found
I would use the windows event log for easier filtering.
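For illustration only, a sketch of what that could look like if the application logged to the Windows event log instead; the log name and level are assumptions, not taken from the question:
# Hedged sketch: 'Application' and Level 2 (Error) are assumed values
$since = (Get-Date).AddMinutes(-10)
Get-WinEvent -FilterHashtable @{ LogName = 'Application'; Level = 2; StartTime = $since } |
    Select-Object TimeCreated, ProviderName, Message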

As I said in the comment on your question, "figure out what is the largest number of entries you could ever expect in 10 minutes". When you have that number, change the "20" in line "$MinLinesToGet = 20" in the code below to that value. Also, change the line "$MaxLengthOfALine = 100" so that it has the length of the longest line you expect to see.
If for some reason you need more than 10 minutes, change the value in the line "$MinutesOld = 10".
The code:
Uses ReadBytesFromFileEnd to read $ByteCount bytes from the end of $FilePath.
Uses ReadLinesFromFileEnd to read $MinLineCount lines, each expected to be shorter than $MaxLineLength, from the end of $FilePath. In reality, it should read several lines more than we want - which is a good thing.
Uses "[System.Text.Encoding]::UTF8.GetString" to convert the bytes to a string, and then uses Split to make an array of strings that is saved in $Lines.
The UTF8 before GetString is a type of encoding, and is the most likely encoding the log file is in, so you shouldn't have to change anything. But if there are problems, find the line "<## >" and remove the space when testing. The code will then output the raw text lines being returned by ReadLinesFromFileEnd. At that point, you can try the other encodings listed in the comment line above the GetString statement. Most encodings have a character size of 1, but in the unlikely case that your log file has a larger character size, change the line "$CharSize = 1" as needed.
You didn't provide the regex you were using, so I built my own and used the switch statement to loop through all the lines returned by ReadLinesFromFileEnd. I made the assumption that the comma after the date WAS NOT a field delimiter but still part of the time stamp, and that the number just after it is the milliseconds. I don't know what the number in the brackets is, so I just gave it the name "Code".
Lines less than 10 minutes old are used to build PSObjects that you can use in later code.
The last 3 lines are an example using the returned log entries in a Write-Host statement.
This worked well in my testing, but you never know until the code is actually tried in the real world.
function ReadBytesFromFileEnd {
    param (
        [Parameter(Mandatory = $true, Position = 0)]
        [string]$FilePath,
        [Parameter(Mandatory = $true, Position = 1)]
        [int]$ByteCount
    )
    $fs = [IO.File]::OpenRead($FilePath)                        # Open file
    if ($ByteCount -gt $fs.Length) { $ByteCount = $fs.Length }  # Prevent reading more bytes than exist
    $null = $fs.Seek(-$ByteCount, [System.IO.SeekOrigin]::End)  # Position for reading
    $Return = New-Object Byte[] $ByteCount                      # Define the buffer $Return
    $fs.Read($Return, 0, $ByteCount) | Out-Null                 # Fill the buffer
    $fs.Close()                                                 # Close the file
    return $Return
}
function ReadLinesFromFileEnd {
    param (
        [Parameter(Mandatory = $true, Position = 0)]
        [string]$FilePath,
        [Parameter(Mandatory = $true, Position = 1)]
        [int]$MinLineCount,
        [Parameter(Mandatory = $true, Position = 2)]
        [int]$MaxLineLength
    )
    $CharSize = 1
    [int]$ByteCount = 1.25 * ($MinLineCount + 1) * $MaxLineLength * $CharSize
    # Encoding Options: ASCII, BigEndianUnicode, Default, Unicode, UTF32, UTF7, UTF8
    $null, $Return = [System.Text.Encoding]::UTF8.GetString((ReadBytesFromFileEnd $FilePath $ByteCount)).Split(@("`n","`r"), [System.StringSplitOptions]::RemoveEmptyEntries)
    return $Return
}
$MinLinesToGet = 20
$MaxLengthOfALine = 100
$MinutesOld = 10
$Lines = ReadLinesFromFileEnd 'C:\Soubory\esel.log' $MinLinesToGet $MaxLengthOfALine
<## >
$Lines
exit
#>
$Now = Get-Date
$RecentLogs = switch -Regex ($Lines) {
    '^(?<DateTime>\d{4}-\d\d-\d\d\s+\d\d:\d\d:\d\d,\d+)\s*\[(?<Code>\d+)\]\s*(?<Description>.*)$' {
        $DateTime = [DateTime]::ParseExact($Matches.DateTime, 'yyyy-MM-dd HH:mm:ss,fff', $null)
        if ($Now.Subtract($DateTime).TotalMinutes -lt $MinutesOld) {
            New-Object PSObject -Property @{
                DateTime    = $DateTime
                Code        = $Matches.Code
                Description = $Matches.Description
            }
        }
        continue
    }
    Default {
        continue
    }
}
$RecentLogs | ForEach-Object {
    Write-Host "$($_.DateTime) [$($_.Code)] $($_.Description)"
}
EDIT:
You may want to check for lines that are not caught by the regex. If so, place the following lines between the Default { statement and the continue statement.
Write-Color "Regex failed to match: " -ForegroundColor Red -NoNewLine
Write-Color "$_" -ForegroundColor Yellow

Related

When running a command in powershell how can I prepend a date/time for all output on stdout/stderr?

Is it possible in powershell when running a script to add a date prefix to all log output?
I know that it would be possible to do something like:
Write-Host "$(Get-Date -format 'u') my log output"
But I don't want to have to call some function every time we output a line. Instead I want to modify all output when running any script or command and have the time prefix on every line.
To insert a date in front of all output, that is stdout, stderr and the PowerShell-specific streams, you can use the redirection operator *>&1 to redirect (merge) all streams of a command or scriptblock, pipe to Out-String -Stream to format the stream objects into lines of text and then use ForEach-Object to process each line and prepend the date.
Let me start with a simple example, a more complete solution can be found below.
# Run a scriptblock
& {
    # Test output to all possible streams, using various formatting methods.
    # Added a few delays to test if the final output is still streaming.
    "Write $($PSStyle.Foreground.BrightGreen)colored`ntext$($PSStyle.Reset) to stdout"
    Start-Sleep -Millis 250
    [PSCustomObject]@{ Answer = 42; Question = 'What?' } | Format-Table
    Start-Sleep -Millis 250
    Get-Content -Path not-exists -EA Continue # produce a non-terminating error
    Start-Sleep -Millis 250
    Write-Host 'Write to information stream'
    Start-Sleep -Millis 250
    Write-Warning 'Write to warning stream'
    Start-Sleep -Millis 250
    Write-Verbose 'Write to verbose stream' -Verbose
    Start-Sleep -Millis 250
    $DebugPreference = 'Continue' # To avoid prompt, needed for Windows PowerShell
    Write-Debug 'Write to debug stream'
} *>&1 | Out-String -Stream | ForEach-Object {
    # Add date in front of each output line
    $date = Get-Date -Format "yy\/MM\/dd H:mm:ss"
    foreach( $line in $_ -split '\r?\n' ) {
        "$($PSStyle.Reset)[$date] $line"
    }
}
Output in PS 7.2 console:
Using Out-String we use the standard PowerShell formatting system so the output looks normal, as it would appear without redirection (e.g. things like tables stay intact). The -Stream parameter is crucial to keep the streaming output behaviour of PowerShell. Without this parameter, output would only be received once the whole scriptblock has completed.
While the output already looks quite nice, there are some minor issues:
The verbose, warning and debug messages are not colored as usual.
The word "text" in the 2nd line should be colored in green. This isn't working due to the use of $PSStyle.Reset. When removed, the colors of the error message leak into the date column, which looks far worse. It can be fixed, but it is not trivial.
The line wrapping isn't right (it wraps into the date column in the middle of the output).
As a more general, reusable solution I've created a function Invoke-WithDateLog that runs a scriptblock, captures all of its output, inserts a date in front of each line and outputs it again:
Function Invoke-WithDateLog {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [scriptblock] $ScriptBlock,

        [Parameter()]
        [string] $DateFormat = '[yy\/MM\/dd H:mm:ss] ',

        [Parameter()]
        [string] $DateStyle = $PSStyle.Foreground.BrightBlack,

        [Parameter()]
        [switch] $CatchExceptions,

        [Parameter()]
        [switch] $ExceptionStackTrace,

        [Parameter()]
        [Collections.ICollection] $ErrorCollection
    )

    # Variables are private so they are not visible from within the ScriptBlock.
    $private:ansiEscapePattern = "`e\[[0-9;]*m"
    $private:lastFmt = ''

    & {
        if( $CatchExceptions ) {
            try { & $scriptBlock }
            catch {
                # The common parameter -ErrorVariable doesn't work in scripted cmdlets, so use our own error variable parameter.
                if( $null -ne $ErrorCollection ) {
                    $null = $ErrorCollection.Add( $_ )
                }
                # Write as regular output, colored like an error message.
                "`n" + $PSStyle.Formatting.Error + "EXCEPTION ($($_.Exception.GetType().FullName)):`n $_" + $PSStyle.Reset
                # Optionally write stacktrace. Using the -replace operator we indent each line.
                Write-Debug ($_.ScriptStackTrace -replace '^|\r?\n', "`n ") -Debug:$ExceptionStackTrace
            }
        }
        else {
            & $scriptBlock
        }
    } *>&1 | ForEach-Object -PipelineVariable record {
        # Here the $_ variable is either:
        # - a string in case of simple output
        # - an instance of one of the System.Management.Automation.*Record classes (output of Write-Error, Write-Debug, ...)
        # - an instance of one of the Microsoft.PowerShell.Commands.Internal.Format.* classes (output of a Format-* cmdlet)
        if( $_ -is [System.Management.Automation.ErrorRecord] ) {
            # The common parameter -ErrorVariable doesn't work in scripted cmdlets, so use our own error variable parameter.
            if( $null -ne $ErrorCollection ) {
                $null = $ErrorCollection.Add( $_ )
            }
        }
        $_ # Forward current record
    } | Out-String -Stream | ForEach-Object {
        # Here the $_ variable is always a (possibly multiline) string of formatted output.
        # Out-String doesn't add any ANSI escape codes to colorize Verbose, Warning and Debug messages,
        # so we have to do it by ourselves.
        $overrideFmt = switch( $record ) {
            { $_ -is [System.Management.Automation.VerboseRecord] } { $PSStyle.Formatting.Verbose; break }
            { $_ -is [System.Management.Automation.WarningRecord] } { $PSStyle.Formatting.Warning; break }
            { $_ -is [System.Management.Automation.DebugRecord] }   { $PSStyle.Formatting.Debug; break }
        }

        # Prefix for each line. It resets the ANSI escape formatting before the date.
        $prefix = $DateStyle + (Get-Date -Format $DateFormat) + $PSStyle.Reset

        foreach( $line in $_ -split '\r?\n' ) {
            # Produce the final, formatted output.
            $prefix + ($overrideFmt ?? $lastFmt) + $line + ($overrideFmt ? $PSStyle.Reset : '')
            # Remember last ANSI escape sequence (if any) of current line, for cases where formatting spans multiple lines.
            $lastFmt = [regex]::Match( $line, $ansiEscapePattern, 'RightToLeft' ).Value
        }
    }
}
Usage example:
# To differentiate debug and verbose output from warnings
$PSStyle.Formatting.Debug   = $PSStyle.Foreground.Yellow
$PSStyle.Formatting.Verbose = $PSStyle.Foreground.BrightCyan

Invoke-WithDateLog -CatchExceptions -ExceptionStackTrace {
    "Write $($PSStyle.Foreground.Green)colored`ntext$($PSStyle.Reset) to stdout"
    [PSCustomObject]@{ Answer = 42; Question = 'What?' } | Format-Table
    Get-Content -Path not-exists -EA Continue # produce a non-terminating error
    Write-Host 'Write to information stream'
    Write-Warning 'Write to warning stream'
    Write-Verbose 'Write to verbose stream' -Verbose
    Write-Debug 'Write to debug stream' -Debug
    throw 'Critical error'
}
Output in PS 7.2 console:
Notes:
The code requires PowerShell 7+.
The date formatting can be changed through parameters -DateFormat (see formatting specifiers) and -DateStyle (ANSI escape sequence for coloring).
Script-terminating errors, such as those created by throwing an exception or by using Write-Error -EA Stop, are not logged by default. Instead they bubble up from the scriptblock as usual. You can pass the -CatchExceptions parameter to catch exceptions and log them like regular non-terminating errors. Pass -ExceptionStackTrace to also log the script stacktrace, which is very useful for debugging.
Scripted cmdlets such as this one don't set the automatic variable $? and also don't add errors to the automatic $Error variable when an error is written via Write-Error. Nor does the common parameter -ErrorVariable work. To still be able to collect error information I've added the -ErrorCollection parameter, which can be used like this:
$scriptErrors = [Collections.ArrayList]::new()

Invoke-WithDateLog -CatchExceptions -ExceptionStackTrace -ErrorCollection $scriptErrors {
    Write-Error 'Write to stderr' -EA Continue
    throw 'Critical error'
}

if( $scriptErrors ) {
    # Outputs "Number of errors: 2"
    "`nNumber of errors: $($scriptErrors.Count)"
}
The objects generated by Write-Host already come with a timestamp; you can use Update-TypeData to override the .ToString() method of the InformationRecord class and then redirect the output from the information stream to the success stream.
Update-TypeData -TypeName System.Management.Automation.InformationRecord -Value {
    return $this.TimeGenerated.ToString('u') + $this.MessageData.Message.PadLeft(10)
} -MemberType ScriptMethod -MemberName ToString -Force

'Hello', 'World', 123 | Write-Host 6>&1

Windows PowerShell: How to parse the log file?

I have an input file with below contents:
27/08/2020 02:47:37.365 (-0516) hostname12 ult_licesrv ULT 5 LiceSrv Main[108 00000 Session 'session1' (from 'vmpms1\app1#pmc21app20.pm.com') request for 1 additional licenses for module 'SA-XT' - 1 licenses have been allocated by concurrent usage category 'Unlimited' (session module usage now 1, session category usage now 1, total module concurrent usage now 1, total category usage now 1)
27/08/2020 02:47:37.600 (-0516) hostname13 ult_licesrv ULT 5 LiceSrv Main[108 00000 Session 'sssion2' (from 'vmpms2\app1#pmc21app20.pm.com') request for 1 additional licenses for module 'SA-XT-Read' - 1 licenses have been allocated by concurrent usage category 'Floating' (session module usage now 2, session category usage now 2, total module concurrent usage now 1, total category usage now 1)
27/08/2020 02:47:37.115 (-0516) hostname141 ult_licesrv CMN 5 Logging Housekee 00000 Deleting old log file 'C:\Program Files\PMCOM Global\License Server\diag_ult_licesrv_20200824_011130.log.gz' as it exceeds the purge threashold of 72 hours
27/08/2020 02:47:37.115 (-0516) hostname141 ult_licesrv CMN 5 Logging Housekee 00000 Deleting old log file 'C:\Program Files\PMCOM Global\License Server\diag_ult_licesrv_20200824_021310.log.gz' as it exceeds the purge threashold of 72 hours
27/08/2020 02:47:37.625 (-0516) hostname150 ult_licesrv ULT 5 LiceSrv Main[108 00000 Session 'session1' (from 'vmpms1\app1#pmc21app20.pm.com') request for 1 additional licenses for module 'SA-XT' - 1 licenses have been allocated by concurrent usage category 'Unlimited' (session module usage now 2, session category usage now 1, total module concurrent usage now 2, total category usage now 1)
I need to generate and output file like below:
Date,time,hostname,session_module_usage,session_category_usage,module_concurrent_usage,total_category_usage
27/08/2020,02:47:37.365 (-0516),hostname12,1,1,1,1
27/08/2020,02:47:37.600 (-0516),hostname13,2,2,1,1
27/08/2020,02:47:37.115 (-0516),hostname141,0,0,0,0
27/08/2020,02:47:37.115 (-0516),hostname141,0,0,0,0
27/08/2020,02:47:37.625 (-0516),hostname150,2,1,2,1
The output data order is: Date,time,hostname,session_module_usage,session_category_usage,module_concurrent_usage,total_category_usage.
Put 0,0,0,0 if no entry for session_module_usage,session_category_usage,module_concurrent_usage,total_category_usage
I need to get content from the input file and write the output to another file.
Update
I have created a file input.txt in F drive and pasted the log details into it.
Then I form an array by splitting the file content when a new line occurs like below.
$myList = (Get-Content -Path F:\input.txt) -split '\n'
Now I have 5 items in my array $myList. Then I replace multiple blank spaces with a single blank space and form a new array by splitting each element on the blank space. Then I print array elements 0 to 3. Now I need to add the end values (session_module_usage, session_category_usage, module_concurrent_usage, total_category_usage).
PS C:\Users\user> $myList = (Get-Content -Path F:\input.txt) -split '\n'
PS C:\Users\user> $myList.Length
5
PS C:\Users\user> for ($i = 0; $i -le ($myList.length - 1); $i += 1) {
>> $newList = ($myList[$i] -replace '\s+', ' ') -split ' '
>> $newList[0]+','+$newList[1]+' '+$newList[2]+','+$newList[3]
>> }
27/08/2020,02:47:37.365 (-0516),hostname12
27/08/2020,02:47:37.600 (-0516),hostname13
27/08/2020,02:47:37.115 (-0516),hostname141
27/08/2020,02:47:37.115 (-0516),hostname141
27/08/2020,02:47:37.625 (-0516),hostname150
If you really need to filter on the granularity that you're looking for, then you may need to use regex to filter the lines.
This would assume that the rows have similarly labeled lines before the values you're looking for, so keep that in mind.
[System.Collections.ArrayList]$filteredRows = @()
$log = Get-Content -Path C:\logfile.log

foreach ($row in $log) {
    $rowIndex = $log.IndexOf($row)
    $date = ([regex]::Match($log[$rowIndex],'^\d+\/\d+\/\d+')).value
    $time = ([regex]::Match($log[$rowIndex],'\d+:\d+:\d+\.\d+\s\(\S+\)')).value
    $hostname = ([regex]::Match($log[$rowIndex],'(?<=\d\d\d\d\) )\w+')).value
    $sessionModuleUsage = ([regex]::Match($log[$rowIndex],'(?<=session module usage now )\d')).value
    if (!$sessionModuleUsage) {
        $sessionModuleUsage = 0
    }
    $sessionCategoryUsage = ([regex]::Match($log[$rowIndex],'(?<=session category usage now )\d')).value
    if (!$sessionCategoryUsage) {
        $sessionCategoryUsage = 0
    }
    $moduleConcurrentUsage = ([regex]::Match($log[$rowIndex],'(?<=total module concurrent usage now )\d')).value
    if (!$moduleConcurrentUsage) {
        $moduleConcurrentUsage = 0
    }
    $totalCategoryUsage = ([regex]::Match($log[$rowIndex],'(?<=total category usage now )\d')).value
    if (!$totalCategoryUsage) {
        $totalCategoryUsage = 0
    }
    $hash = [ordered]@{
        Date = $date
        time = $time
        hostname = $hostname
        session_module_usage = $sessionModuleUsage
        session_category_usage = $sessionCategoryUsage
        module_concurrent_usage = $moduleConcurrentUsage
        total_category_usage = $totalCategoryUsage
    }
    $rowData = New-Object -TypeName 'psobject' -Property $hash
    $filteredRows.Add($rowData) > $null
}

$csv = $filteredRows | convertto-csv -NoTypeInformation -Delimiter "," | foreach {$_ -replace '"',''}
$csv | Out-File C:\results.csv
What essentially needs to happen is that we run Get-Content on the log, which returns an array with each item terminated on a newline.
Once we have the rows, we grab the values via regex.
Since you want zeroes in some of the items if those values don't exist, I have if statements that assign '0' when the regex returns nothing.
Finally, we add each filtered item to a PSObject and append that object to an array of objects in each iteration.
Then export to a CSV.
You can probably pick apart the lines with a regex and substrings easily enough. Basically something like the following:
# Iterate over the lines of the input file
Get-Content F:\input.txt |
    ForEach-Object {
        # Extract the individual fields
        $Date = $_.Substring(0, 10)
        $Time = $_.Substring(12, $_.IndexOf(')') - 11)
        $Hostname = $_.Substring(34, $_.IndexOf(' ', 34) - 34)
        $session_module_usage = 0
        $session_category_usage = 0
        $module_concurrent_usage = 0
        $total_category_usage = 0
        if ($_ -match 'session module usage now (\d+), session category usage now (\d+), total module concurrent usage now (\d+), total category usage now (\d+)') {
            $session_module_usage = $Matches[1]
            $session_category_usage = $Matches[2]
            $module_concurrent_usage = $Matches[3]
            $total_category_usage = $Matches[4]
        }
        # Create custom object with those properties
        New-Object PSObject -Property @{
            Date = $Date
            time = $Time
            hostname = $Hostname
            session_module_usage = $session_module_usage
            session_category_usage = $session_category_usage
            module_concurrent_usage = $module_concurrent_usage
            total_category_usage = $total_category_usage
        }
    } |
    # Ensure column order in output
    Select-Object Date,time,hostname,session_module_usage,session_category_usage,module_concurrent_usage,total_category_usage |
    # Write as CSV - without quotes
    ConvertTo-Csv -NoTypeInformation |
    ForEach-Object { $_ -replace '"' } |
    Out-File F:\output.csv
Whether to pull the date, time, and host name from the line with substrings or regex is probably a matter of taste. Same goes for how strict the format must be matched, but that to me mostly depends on how rigid the format is. For more free-form things where different lines would match different regexes, or multiple lines makes up a single record, I also quite like switch -Regex to iterate over the lines.
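As a rough illustration of that last point (not part of the original answer), a switch -Regex pass over the same input could look like this; the property names are just assumptions for the sketch:
# Hedged sketch only
switch -Regex -File F:\input.txt {
    'session module usage now (\d+), session category usage now (\d+), total module concurrent usage now (\d+), total category usage now (\d+)' {
        [pscustomobject]@{
            session_module_usage    = $Matches[1]
            session_category_usage  = $Matches[2]
            module_concurrent_usage = $Matches[3]
            total_category_usage    = $Matches[4]
        }
    }
    Default {
        # lines without usage counters would get the 0,0,0,0 treatment here
    }
}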

is there a simple way to output to xlsx?

I am trying to output a query from a DB to an xlsx file, but it takes a long time because there are about 20,000 records to process. Is there a simpler way to do this?
I know there is a way to do it with CSV, but I'm trying to avoid that, because if a record contains a comma it would be treated as another column and that would mess with the info.
This is my code:
$xlsObj = New-Object -ComObject Excel.Application
$xlsObj.DisplayAlerts = $false
$xlsWb = $xlsobj.Workbooks.Add(1)
$xlsObj.Visible = 0 # (visible = 1 / 0 = not visible)
$xlsSh = $xlsWb.Worksheets.Add([System.Reflection.Missing]::Value, $xlsWb.Worksheets.Item($xlsWb.Worksheets.Count))
$xlsSh.Name = "QueryResults"
$DataSetTable = $ds.Tables[0]
Write-Output "DATA SET TABLE" $DataSetTable
[Array] $getColumnNames = $DataSetTable.Columns | SELECT *
Write-Output "COLUMN NAMES" $DataSetTable.Rows[0]
[Int] $RowHeader = 1
foreach ($ColH in $getColumnNames)
{
    $xlsSh.Cells.item(1, $RowHeader).font.bold = $true
    $xlsSh.Cells.item(1, $RowHeader) = $ColH.ColumnName
    Write-Output "Column name" $ColH.ColumnName
    $RowHeader++
}
[Int] $rowData = 2
[Int] $colData = 1
foreach ($rec in $DataSetTable.Rows)
{
    foreach ($Coln in $getColumnNames)
    {
        $xlsSh.Cells.NumberFormat = "#"
        $xlsSh.Cells.Item($rowData, $colData) = $rec.$($Coln.ColumnName).ToString()
        $ColData++
    }
    $rowData++; $ColData = 1
}
$xlsRng = $xlsSH.usedRange
[void] $xlsRng.EntireColumn.AutoFit()
# The default Sheet1/Hoja1 tab is deleted.
$xlsWb.Sheets(1).Delete() # Version 02
$xlsFile = "directory of the file"
[void] $xlsObj.ActiveWorkbook.SaveAs($xlsFile)
$xlsObj.Quit()
Start-Sleep -Milliseconds 700
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsRng)) {''}
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsSh)) {''}
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsWb)) {''}
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsObj)) {''}
[gc]::collect() | Out-Null
[gc]::WaitForPendingFinalizers() | Out-Null
$oraConn.Close()
I'm trying to avoid [CSV files], because if a record contains a comma it would be treated as another column and that would mess with the info
That's only the case if you try to construct the output format manually. Built-in commands like Export-Csv and ConvertTo-Csv will automatically quote the values as necessary:
PS C:\> $customObject = [pscustomobject]@{ID = 1; Name = "Solis, Heber"}
PS C:\> $customObject

ID Name
-- ----
 1 Solis, Heber

PS C:\> $customObject |ConvertTo-Csv -NoTypeInformation
"ID","Name"
"1","Solis, Heber"
Notice, in the example above, how:
The string value assigned to $customObject.Name does not contain any quotation marks, but
In the output from ConvertTo-Csv we see values and headers clearly enclosed in quotation marks
PowerShell automatically enumerates the row data when you pipe a [DataTable] instance, so creating a CSV might (depending on the contents) be as simple as:
$ds.Tables[0] |Export-Csv table_out.csv -NoTypeInformation
What if you want TAB-separated values (or any other non-comma separator)?
The *-Csv commands come with a -Delimiter parameter to which you can pass a user-defined separator:
# This produces semicolon-separated values
$data |Export-Csv -Path output.csv -Delimiter ';'
I usually refrain from recommending specific modules/libraries, but if you insist on writing to XLSX I'd suggest checking out ImportExcel (don't let the name fool you, it does more than import from Excel, including exporting and formatting data from PowerShell -> XLSX).
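A minimal sketch of what that could look like, assuming the ImportExcel module is installed (Install-Module ImportExcel) and $ds holds the DataSet from the question; the output path and worksheet name are placeholders:
# Hedged sketch: select the table's own columns so DataRow housekeeping properties don't become columns
$table = $ds.Tables[0]
$table | Select-Object -Property $table.Columns.ColumnName |
    Export-Excel -Path 'C:\results.xlsx' -WorksheetName 'QueryResults' -AutoSize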

How to get output in desired encoding scheme using PowerShell Out-File

I have a requirement in which I need to read a data file line by line and do string/character replacement; the file's data is in Windows Latin-1.
I've written this PowerShell script (my first one), initially using the Out-File -Encoding option. However, the output file thus created had some character translation. Then I searched and came across WriteAllLines, but I'm unable to use it in my code.
$encoding = [Text.Encoding]::GetEncoding('iso-8859-1')
$pdsname = "ABCD.XYZ.PQRST"
$datafile = "ABCD.SCHEMA.TABLE.DAT"
Get-Content ABCD.SCHEMA.TABLE.DAT | ForEach-Object {
    $matches = [regex]::Match($_,'ABCD')
    $string_to_be_replaced = $_.substring($matches.Index, $pdsname.Length + 10)
    $string_to_be_replaced = "`"$string_to_be_replaced`""
    $member = [regex]::match($_, "`"$pdsname\(([^\)]+)\)`"").Groups[1].Value
    $_ -replace $([regex]::Escape($string_to_be_replaced)), $member
} | [System.IO.File]::WriteAllLines("C:\Users\USer01", "ABCD.SCHEMA.TABLE.NEW.DAT", $encoding)
With the help of an answer from @Gzeh Niert, I updated my script as shown below. However, when I execute it, the output file contains just the last record; it was unable to append and did an overwrite instead. I tried using [System.IO.File]::AppendAllText, but this strangely creates a larger file that still has only the last record. In short, it's likely that empty lines are being written.
param(
    [String]$datafile
)
$pdsname = "ABCD.XYZ.PQRST"
$encoding = [Text.Encoding]::GetEncoding('iso-8859-1')
$datafile = "ABCD.SCHEMA.TABLE.DAT"
$datafile2 = "ABCD.SCHEMA.TABLE.NEW.DAT"
Get-Content $datafile | ForEach-Object {
    $matches = [regex]::Match($_,'ABCD')
    if ($matches.Success) {
        $string_to_be_replaced = $_.substring($matches.Index, $pdsname.Length + 10)
        $string_to_be_replaced = "`"$string_to_be_replaced`""
        $member = [regex]::match($_, "`"$pdsname\(([^\)]+)\)`"").Groups[1].Value
        $replacedContent = $_ -replace $([regex]::Escape($string_to_be_replaced)), $member
        [System.IO.File]::AppendAllText($datafile2, $replacedContent, $encoding)
    }
    else {
        [System.IO.File]::AppendAllText($datafile2, $_, $encoding)
    }
    #[System.IO.File]::WriteAllLines($datafile2, $replacedContent, $encoding)
}
Please help me figure out where I am going wrong.
System.IO.File.WriteAllLines takes either an array of strings or an IEnumerable of strings as its second parameter, and you cannot pipe to it because it is not a cmdlet handling pipeline input but a .NET Framework method.
You should try storing your replaced content in a string[] to use it as a parameter when saving the file.
param(
    [String]$file
)
$encoding = [Text.Encoding]::GetEncoding('iso-8859-1')
$replacedContent = [string[]]@(Get-Content $file | ForEach-Object {
    # Do stuff
})
[System.IO.File]::WriteAllLines($file, $replacedContent, $encoding)

Simulating `ls` in Powershell

I'm trying to get something that looks like UNIX ls output in PowerShell. This is getting there:
Get-ChildItem | Format-Wide -AutoSize -Property Name
but it's still outputting the items in row-major instead of column-major order:
PS C:\Users\Mark Reed> Get-ChildItem | Format-Wide -AutoSize -Property Name
Contacts Desktop Documents Downloads Favorites
Links Music Pictures Saved Games
Searches Videos
Desired output:
PS C:\Users\Mark Reed> My-List-Files
Contacts Downloads Music Searches
Desktop Favorites Pictures Videos
Documents Links Saved Games
The difference is in the ordering: row-major (1 2 3 4 5 on the first row, then 6 7 8 9 on the second) versus column-major (1 2 3 down the first column, 4 5 6 down the second, 7 8 9 down the third).
I already have a script that will take an array and print it out in column-major order using Write-Host, though I found a lot of PowerShellish idiomatic improvements to it by reading Keith's and Roman's takes. But my impression from reading around is that's the wrong way to go about this. Instead of calling Write-Host, a script should output objects, and let the formatters and outputters take care of getting the right stuff written to the user's console.
When a script uses Write-Host, its output is not capturable; if I assign the result to a variable, I get a null variable and the output is written to the screen anyway. It's like a command in the middle of a UNIX pipeline writing directly to /dev/tty instead of standard output or even standard error.
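For instance (illustrative snippet, not from the original post):
# The text is printed to the console anyway, and the variable stays empty
$result = Write-Host 'Contacts  Desktop  Documents'
$null -eq $result   # True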
Admittedly, I may not be able to do much with the array of Microsoft.PowerShell.Commands.Internal.Format.* objects I get back from e.g. Format-Wide, but at least it contains the output, which doesn't show up on my screen in rogue fashion, and which I can recreate at any time by passing the array to another formatter or outputter.
This is a simple-ish function that formats column major. You can do this all in PowerShell Script:
function Format-WideColMajor {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline)]
        [AllowNull()]
        [AllowEmptyString()]
        [PSObject]
        $InputObject,

        [Parameter()]
        $Property
    )

    begin {
        $list = new-object System.Collections.Generic.List[PSObject]
    }

    process {
        $list.Add($InputObject)
    }

    end {
        if ($Property) {
            $output = $list | Foreach {"$($_.$Property)"}
        }
        else {
            $output = $list | Foreach {"$_"}
        }

        $conWidth = $Host.UI.RawUI.BufferSize.Width - 1
        $maxLen = ($output | Measure-Object -Property Length -Maximum).Maximum
        $colWidth = $maxLen + 1
        $numCols = [Math]::Floor($conWidth / $colWidth)
        $numRows = [Math]::Ceiling($output.Count / $numCols)

        for ($i = 0; $i -lt $numRows; $i++) {
            $line = ""
            for ($j = 0; $j -lt $numCols; $j++) {
                $item = $output[$i + ($j * $numRows)]
                $line += "$item$(' ' * ($colWidth - $item.Length))"
            }
            $line
        }
    }
}
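Once the function above is loaded (for example by dot-sourcing the script that defines it), the listing from the question can simply be piped through it:
Get-ChildItem | Format-WideColMajor -Property Name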