Hello and thank you for your time. Here is what I am looking to do. I have several log files that I need to search through. I do this with:
Get-ChildItem -Path C:\mylogfiles\*.log | Select-String -Pattern 'MyTextHere'
However, now I want to complicate my life and only select text that is between single quotes in the log files.
Here is a sample of my log file:
This is some sample text in my log file. It has a lot of garbage that I don't want to see. However, it has text that I want to find, and if found I would like it to save just the selected text to a CSV file. I want to copy everything that is between single quotes. Here comes the text 'Please copy this text that is between the single quotes'
Any idea how I would go about doing this?
The following combines Select-String with ForEach-Object to extract only the phrases of interest (the parts of the line that matched the regex), wraps each one in a [pscustomobject] instance with a .Phrase property, and exports the results with Export-Csv:
Select-String -Path C:\mylogfiles\*.log -AllMatches -Pattern "'.*?'" |
    ForEach-Object {
        foreach ($phrase in $_.Matches.Value) {
            [pscustomobject] @{ Phrase = $phrase.Trim("'") }
        }
    } |
    Export-Csv -NoTypeInformation -Encoding utf8 result.csv
Note: If there can only ever be at most one phrase of interest per line, you can omit the -AllMatches switch and replace the ForEach-Object call with the following Select-Object call, which uses a calculated property:
# ... |
Select-Object -Property @{ Name='Phrase'; Expression={ $_.Matches.Value.Trim("'") } } |
# ...
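For reference, given the sample log line in the question, result.csv should end up looking something like this (Export-Csv writes the Phrase column header and quotes each value):
"Phrase"
"Please copy this text that is between the single quotes"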
I'm trying to combine the output of two separate files, but I'm stuck on the wildcard/contains Select-String search for the names from file A (names) in file B (name-rank).
The contents of file A is:
adam
george
william
assa
kate
mark
The contents of file B is:
12-march-2020,Mark-1
12-march-2020,Mark-2
12-march-2020,Mark-3
12-march-2020,william-4
12-march-2020,william-2
12-march-2020,william-7
12-march-2020,kate-54
12-march-2020,kate-12
12-march-2020,kate-44
And I need to match on every occurrence of the names (the name portion before the '-'), so my ordered output should look like this, which is a combination of both files:
mark
Mark-1
Mark-2
Mark-3
william
william-2
william-4
william-7
Kate
kate-12
kate-44
kate-54
So far I only have the following and I'd be grateful for any pointers or assistance please.
import-csv (c:\temp\names.csv) |
select-string -simplematch (import-csv c:\temp\names-rank.csv -header "Date", "RankedName" | select RankedName) |
set-content c:\temp\names-and-ranks.csv
I imagine the select-string isn't going to be enough and I need to write a loop instead.
The data you give in the example does not give you much to work with, and the desired output is not that intuitive; most of the time with PowerShell you would want to combine the data into a much richer output at the end.
But anyway, with what is given here and what you want, the code below will get what you need. I have left comments in the code for you.
$pathDir = 'C:\Users\myUser\Downloads\trash'
$names = "$pathDir\names.csv"
$namesRank = "$pathDir\names-rank.csv"

$nameImport = Import-Csv -Path $names -Header names
$nameRankImport = Import-Csv -Path $namesRank -Header date,rankName

# create an empty array to collect the result
$list = @()

foreach ($name in $nameImport) {
    # get all the matching ranked names
    $match = $nameRankImport.rankName -like "$($name.names)*"
    # add the name from the first list
    $list += $name.names
    # if there are any matches, add them too
    if ($match) {
        $list += $match
    }
}

# because it's a single column of strings, Export-Csv would not give us what we want,
# so write the lines out with Set-Content instead
$list | Set-Content -Path "$pathDir\names-and-ranks.csv" -Force
For this I would use a combination of Group-Object and Where-Object to first group all "RankedName" items by the name before the dash, then filter those groups down to the names we got from the 'names.csv' file, and output the properties you need.
# read the names from the file as a string array
$names = Get-Content -Path 'c:\temp\names.csv'   # just a list of names, so really not a CSV

# import the CSV file and loop through
Import-Csv -Path 'c:\temp\names-rank.csv' -Header "Date", "RankedName" |
    Group-Object { ($_.RankedName -split '-')[0] } |  # group on the name before the dash in the 'RankedName' property
    Where-Object { $_.Name -in $names } |             # use only the groups whose name can be found in the $names array
    ForEach-Object {
        $_.Name                                           # output the group name (which is one of the $names)
        $_.Group.RankedName -join [environment]::NewLine  # output the group's 'RankedName' values joined with newlines
    } |
    Set-Content -Path 'c:\temp\names-and-ranks.csv'
Output:
Mark
Mark-1
Mark-2
Mark-3
william
william-4
william-2
william-7
kate
kate-54
kate-12
kate-44
I have a directory on a server called 'servername'. In that directory, I have subdirectories whose name is a date. In those date directories, I have about 150 .csv file audit logs.
I have a partially working script that starts from inside the date directory, enumerates and loops through the .csv files, and searches for a string in a column. I'm trying to get it to export the row for each match and then go on to the next file.
$files = Get-ChildItem '\\servername\volume\dir1\audit\serverbeingaudited\20180525'
ForEach ($file in $files) {
    $Result = If (import-csv $file.FullName | Where {$_.'path/from' -like "*01May18.xlsx*"})
    {
        $result | Export-CSV -Path c:\temp\output.csv -Append
    }
}
What I am doing is searching the 'path/from' column for a string, like a file name. The column contains data that is always some form of \folder\folder\folder\filename.xls. I am searching for a specific file name, and for all instances of that file name in that column in that file.
My issue is getting that row exported - the exported output.csv is always empty. I'd also like to start a directory 'up' and go through each date directory, parse, export, then go on to the next directory and files.
If I break it down to just one file and take it out of the If, it seems to give me a result, so I think I'm getting something wrong in the If or ForEach, but apparently that's above my pay grade - I can't figure it out.
Thanks in advance for any assistance,
RichardX
The issue is your If block: when you say $Result = If () {$Result | ...} you are saying that the new $Result equals whatever is returned from the If statement. Since $Result hasn't been defined yet, this is effectively $Result = If () {$null | ...}, which is why you are getting a blank line.
The If block isn't even needed. You already filter your CSV with Where-Object; just keep passing those objects down the pipeline to the export.
Since it sounds like you are just running this against all the child folders of the parent, you could just use the -Recurse parameter of Get-ChildItem:
Get-ChildItem '\\servername\volume\dir1\audit\serverbeingaudited\' -Recurse |
    ForEach-Object {
        Import-Csv $_.FullName |
            Where-Object {$_.'path/from' -like "*01May18.xlsx*"}
    } | Export-CSV -Path c:\temp\output.csv
(I used a ForEach-Object loop rather than foreach just to demonstrate objects being passed down the pipeline in another way.)
Edit: Removed -Append per Bill_Stewart's suggestion. This will write out all entries for the recursed folders in the run, and will overwrite the file on the next run.
I don't see a need for appending to the CSV file. How about:
Get-ChildItem '\\servername\volume\dir1\audit\serverbeingaudited\20180525' | ForEach-Object {
    Import-Csv $_.FullName | Where-Object { $_.'path/from' -like '*01May18.xlsx*' }
} | Export-Csv 'C:\Temp\Output.csv' -NoTypeInformation
Assuming your CSVs are in the same format and that your search text is not likely to be present in any other columns, you could use Select-String instead of Import-Csv. So instead of converting strings to objects and back to strings again, you can just process them as strings. You would need to add an additional line to fake the header row, something like this:
$files = Get-ChildItem '\\servername\volume\dir1\audit\serverbeingaudited\20180525'
$result = @()
$result += Get-Content $files[0] -TotalCount 1
$result += ($files | Select-String -Pattern '01May18\.xlsx').Line
$result | Out-File 'c:\temp\output.csv'
I filtered this file, data1.csv, by date:
2017.11.1,09:55,1.1,1.2,1.3,1.4,1
2017.11.2,09:55,1.5,1.6,1.7,1.8,2
I expected -NoTypeInformation to remove the header:
$CutOff = (Get-Date).AddDays(-2)
$filePath = "data1.csv"
$Data = Import-Csv $filePath -Header Date,Time,A,B,C,D,E
$Data2 = $Data | Where-Object {$_.Date -as [datetime] -gt $Cutoff} | convertto-csv -NoTypeInformation -Delimiter "," | % {$_ -replace '"',''}
But when I write it out with Out-File
$Data2 | Out-File "data2.csv" -Encoding utf8 -Force
I get the header back, as data2.csv contains:
Date,Time,A,B,C,D,E
2017.11.2,09:55,1.5,1.6,1.7,1.8,2
Why do I have Date,Time,A,B,C,D,E?
-NoTypeInformation is not about the column header but about the data type of the rows in the file. Remove it to see what shows up. From Microsoft:
Omits the type information header from the output. By default, the string in the output contains #TYPE followed by the fully-qualified name of the object type.
Emphasis mine.
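To make that concrete, here is a quick illustration (a sketch, assuming Windows PowerShell 5.1, where the type header is emitted by default):
# Without -NoTypeInformation the very first line is the #TYPE header:
[pscustomobject]@{ Date = '2017.11.2'; Time = '09:55' } | ConvertTo-Csv
# #TYPE System.Management.Automation.PSCustomObject
# "Date","Time"
# "2017.11.2","09:55"

# With -NoTypeInformation only the column header and data rows remain:
[pscustomobject]@{ Date = '2017.11.2'; Time = '09:55' } | ConvertTo-Csv -NoTypeInformation
# "Date","Time"
# "2017.11.2","09:55"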
CSVs need headers. That is why it is making one. If you don't want to see the header in the output use Select-Object -Skip 1 to remove it.
$Data |
    Where-Object {$_.Date -as [datetime] -gt $Cutoff} |
    ConvertTo-Csv -NoTypeInformation -Delimiter "," |
    Select-Object -Skip 1 |
    % {$_ -replace '"'}
Rather than building $Data2 and then piping it to Out-File, you could pipe to Set-Content here just as well.
I am guessing this whole process is to keep the source file in the same state, just with some lines filtered out based on date. You could skip most of this just by parsing the date out of each line.
$threshold = (Get-Date).AddDays(-2)
$filePath = "c:\temp\bagel.txt"
(Get-Content $filePath) | Where-Object {
    $date,$null=$_.Split(",",2)
    [datetime]$date -gt $threshold
} | Set-Content $filePath
Now you don't have to worry about PowerShell CSV object structure or output since we act on the raw data of the file itself.
That will take each line of the input file and filter it out if the parsed date does not meet the threshold. Change the encoding on the input and output cmdlets as you see necessary. What $date,$null=$_.Split(",",2) is doing is splitting the line on the comma into 2 parts. The first part becomes $date, and since this is just a filtering condition we dump the rest of the line into $null.
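To illustrate the split, using one of the sample lines from the question:
$date, $null = '2017.11.2,09:55,1.5,1.6,1.7,1.8,2'.Split(",", 2)
$date   # 2017.11.2 -- the rest of the line is discarded into $null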
Properly-formed CSV files must have column headers. Your use of -NoTypeInformation in generating the CSV does not affect the column headers; instead, it affects whether the PowerShell object type information is included. If you Export-Csv without -NoTypeInformation, the first line of your CSV file will be a line that looks like #TYPE System.Management.Automation.PSCustomObject, which you don't want if you're going to open the CSV in a spreadsheet program.
If you subsequently Import-Csv, the headers (Date, Time, A, B, C, D, E) are used to create the fields of a PSObject, so that you can refer to them using the standard dot notation (e.g., $CSV[$line].Date).
The ability to specify -Header on Import-Csv is essentially a "hack" to allow the cmdlet to handle files that are comma-separated but do not include column headers.
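For example, with the header-less data1.csv from this question, the supplied names become properties:
$Data = Import-Csv 'data1.csv' -Header Date,Time,A,B,C,D,E
$Data[0].Date   # 2017.11.1 -- dot notation works because -Header supplied the column names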
I have a script that reads a series of log files located in different places and looks for an error code with Select-String. After the error code, I print out the next four lines to a file with -Context. That file's content gets dumped into an email and sent off.
$logsToCheck = "F:\Log1.log",
"F:\log2.log",
"F:\log3.log"
$logsToCheck | % {Select-String -Path $_ -Pattern "SQLCODE:2627" -Context 0, 4} | Out-File $dupChkFile
$emailbody = Get-Content $dupChkFile | ConvertTo-Html
The actual output of the string is poorly formatted and runs together. Is there a way to add blank lines or spaces after the last line when using Select-String -Context?
Originally I was piping $emailbody to Out-String, but changed it to HTML to try to clean up the formatting.
Try reading out the match and context separately:
Select-String -Path $_ "SQLCODE:2627" -Context 0,2 | % {
    $_.Line
    $_.Context.PostContext
    "-----Separator-----"
}
The default output of Select-String with -Context is modified to be human-readable; this instead returns everything as an array of unmodified strings, so you can be sure there will always be newlines, and it will behave better with other cmdlets, including Out-String or loops.
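For example, wired into the loop from the question (a sketch using your $logsToCheck and $dupChkFile variables):
$logsToCheck | % {
    Select-String -Path $_ -Pattern "SQLCODE:2627" -Context 0, 4 | % {
        $_.Line
        $_.Context.PostContext
        ""   # blank line between matches
    }
} | Out-File $dupChkFile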
I would suggest using concatenation:
% { "$(Select-String -Path $_ -Pattern 'SQLCODE:2627' -Context 0, 4)`r`n" } |
Out-File $dupChkFile -Append
Parse .rtf file, output email addresses in .csv file?
I have an .rtf file containing a bunch of email addresses, and I need it parsed so that I can compare a .csv file to active users in Active Directory.
Basically I want what is to the left of "@my.domain.com".
$finds = Select-String -Path "path\to\my.rtf" -Pattern "@my.domain.com" | ForEach-Object {$_.Matches}
$finds | Select-Object -First 1 | ft *
This of course gives me one result so that I don't have a lot of output.
I only manage to get the matches or the complete line.
I've tried adding something along the lines of
$finds = Select-String -Path "path\to\my.rtf" -Pattern "\w.@my.domain.com"
This gives me only the last two letters of the addresses.
If I keep adding dots to the "wildcard"
-Pattern "\w.....#my.domain.com"
I also get a ton of numbers/characters (.rtf formatting) for addresses that contains fewer characters.
How do I do this?
EDIT: I will update the question as soon as I've found a solution. As of now I'm trying with regular expressions.
Example:
-Pattern "\w*?#my.domain.com"
$mPattern = "[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+(\.[a-zA-Z]{2,4})"
$lines = Get-Content "path\to\your.rtf"
foreach ($line in $lines) {
    ([regex]::Match($line, $mPattern, "IgnoreCase")).Value
}
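If you also want the addresses written to a .csv file, as the question title asks, you could collect the matches and export them; a minimal sketch (the output path is an assumption):
$found = foreach ($line in $lines) {
    $m = [regex]::Match($line, $mPattern, "IgnoreCase")
    if ($m.Success) { $m.Value }   # keep only lines that actually contain an address
}
$found | ForEach-Object { [pscustomobject]@{ Email = $_ } } |
    Export-Csv "path\to\emails.csv" -NoTypeInformation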
This code worked for me: my initial code, but with a new search pattern.
$finds = Select-String -Path "path\to\my.rtf" -Pattern "[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+(\.[a-zA-Z]{2,4})" | ForEach-Object {$_.Matches}
$finds | Select-Object -First 10 | ft *
Thanks!