I've got a tab-delimited file that I'm running through a foreach loop to match values.
foreach ($mvalue in $mvalues) {
    $vName = $mvalue.Name
    $filter = "`"" + $vName + "/``t`""
    $mMatch = gc d:\test.txt | select-string $filter
    Write-Output $vName
    Write-Output $filter
    Write-Output $mMatch
}
$mMatch is not outputting anything even though $filter is correct, and if I test with $filter's value in the console it does give me results. It might be important to mention that not all lines match the value I'm searching for, but some do, and for those I would like the value to be output. I'm running PowerShell 2.0 on Windows 2003.
You can convert it to CSV format and give the columns meaningful names:
Get-Content TabDelimited.txt |
    ConvertFrom-Csv -Header col1,col2,col3 -Delimiter "`t" |
    Where-Object { $_.col1 -match 'whatever' }
I recently had a similar requirement where I wanted to search for a list of values in a field from a file, like the IN clause we use in SQL:
select * from table where column in (list of values)
I achieved it using the following:
$lstExpenseID = @("VAL1",
                  "VAL2",
                  "VAL3")
Get-ChildItem -Filter "*.txt" -Path .\ |
ForEach-Object {
Import-Csv -Delimiter `t -Path $_.FullName | Where-Object -Property "Cust id" -In -Value $lstExpenseID
}
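If the simplified Where-Object syntax above is not available (it needs PowerShell 3.0+), a scriptblock with -contains should do the same job; the "Cust id" column name is taken from the snippet above:
Get-ChildItem -Filter "*.txt" -Path .\ |
    ForEach-Object {
        Import-Csv -Delimiter "`t" -Path $_.FullName |
            Where-Object { $lstExpenseID -contains $_.'Cust id' }
    }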
Try to write your filter like this (\t stands for a tab in a regular expression):
$filter = "`"$vName/\t`""
It's always the little things:
I removed the parentheses from around my variable:
$filter = $vName + "/\t"
$mMatch = gc d:\test.txt | select-string -pattern $filter
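Since Select-String treats the pattern as a regular expression, it may also be worth escaping the variable part in case it ever contains regex metacharacters; a small sketch, keeping the rest of the pattern as above:
$filter = [regex]::Escape($vName) + "/\t"
$mMatch = gc d:\test.txt | select-string -pattern $filter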
I want to check if a string exists in a csv file.
I'm trying to use if ($PCname -in $logFileLocation) { write-output "true" } else { write-output "false" }
However, this always returns false.
How can I check for a value within a csv file?
You can integrate a query into the result of dir (Get-ChildItem):
$Query = "yourString"
$List = Get-Childitem *.CSV |
Select-Object Name,#{Name="MatchesQuery"; Expression={($_ | Get-Content -raw) -match $Query}}
$List
Attention: $Query will be interpreted as a regular expression.
That way you get a list of all files along with the information whether each one matches your criteria, and you can filter afterwards like this:
$List | where {$_.MatchesQuery -eq $true}
Depending on your CSV, it might be better and more reliable not to just use Get-Content but to use ConvertFrom-Csv and select the column you want to search in.
This version reads the CSV and searches the column "ComputerName" for your $Query:
$Query = "yourString"
$List = Get-Childitem *.CSV | Select-Object Name,#{Name="MatchesQuery"; Expression={($_ | Get-Content -raw | ConvertFrom-CSV -Delimiter ";" | where {$_.ComputerName -eq "$Query").Count -gt 0}}}
$List
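To pull out just the names of the files that matched, you can filter the $List built above, for example:
$List | Where-Object { $_.MatchesQuery } | Select-Object -ExpandProperty Name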
This could be helpful:
$find = "some_string"
Get-Content C:\temp\demo.csv | Select-String $find
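Note that Select-String treats $find as a regular expression; if the string may contain characters such as . or (, adding -SimpleMatch makes it a literal substring search:
Get-Content C:\temp\demo.csv | Select-String -SimpleMatch $find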
I use PowerShell to automate extracting selected data from a CSV file.
My $target_servers list also contains the same server name twice, but each occurrence has different data in its row.
Here is my code:
$target_servers = Get-Content -Path D:\Users\Tools\windows\target_prd_servers.txt
foreach ($server in $target_servers) {
    Import-Csv $path\Serverlist_Template.csv |
        Where-Object { $_.Hostname -Like $server } |
        Export-Csv -Path $path/windows_prd.csv -Append -NoTypeInformation
}
After executing the above code, it extracts the CSV data based on the TXT file, but my problem is that some of the results are duplicated.
I am expecting around 28 results, but it gives me around 49.
As commented, -Append is the culprit here, and you should check that the newly added records are not already present in the output file:
# read the Hostname column of the target csv file as an array to avoid duplicates
$existingHostsNames = @((Import-Csv -Path "$path/windows_prd.csv").Hostname)
$target_servers = Get-Content -Path D:\Users\Tools\windows\target_prd_servers.txt
foreach ($server in $target_servers) {
    Import-Csv "$path\Serverlist_Template.csv" |
        Where-Object { ($_.Hostname -eq $server) -and ($existingHostsNames -notcontains $_.Hostname) } |
        Export-Csv -Path "$path/windows_prd.csv" -Append -NoTypeInformation
}
You can convert your data to an array of objects and then use select -Unique, like this:
$target_servers = Get-Content -Path D:\Users\Tools\windows\target_prd_servers.txt
$data = @()
foreach ($server in $target_servers) {
    $data += Import-Csv $path\Serverlist_Template.csv | Where-Object { $_.Hostname -Like $server }
}
$data | select -Unique | Export-Csv -Path $path/windows_prd.csv -Append -NoTypeInformation
It will work only if the duplicated rows have the same value in every column. If not, you can pass the column names that matter to you to select. For example:
$data | select Hostname -Unique | Export-Csv -Path $path/windows_prd.csv -Append -NoTypeInformation
It will give you a list of unique hostnames.
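If you would rather keep the full rows but de-duplicate on the Hostname column only, Sort-Object with -Unique is one possible sketch (it keeps a single row per hostname):
$data | Sort-Object -Property Hostname -Unique | Export-Csv -Path $path/windows_prd.csv -NoTypeInformation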
I'm again stuck on something that should be so simple. I have a CSV file in which I need to do a few string modifications and export it back out. The data looks like this:
FullName
--------
\\server\project\AOI
\\server\project\AOI\Folder1
\\server\project\AOI\Folder2
\\server\project\AOI\Folder3\User
I need to do the following:
Remove the "\\server\project" from each line but leave the rest of the line
Delete all rows which do not have a Folder (e.g., in the example above, the first row would be deleted but the other three would remain)
Delete any row with the word "User" in the path
Add a column called T/F with a value of "FALSE" for each record
Here is my initial attempt at this:
Get-Content C:\Folders.csv |
% {$_.replace('\\server\project\','')} |
Where-Object {$_ -match '\\'} |
#Removes User Folders rows from CSV
Where-Object {$_ -notmatch 'User'} |
Out-File C:\Folders-mod.csv
This works to a certain extent, except it deletes my header row and I have not found a way to add a column using Get-Content. For that, I have to use Import-Csv, which is fine, but it seems inefficient to be constantly reloading the same file. So I tried rewriting the above using Import-Csv instead of Get-Content:
$Folders = Import-Csv C:\Folders.csv
foreach ($Folder in $Folders) {
$Folder.FullName = $Folder.FullName.Replace('\\server\AOI\', '') |
Where-Object {$_ -match '\\'} |
Where-Object {$_ -notmatch 'User Files'}
}
$Folders | Export-Csv C:\Folders-mod.csv -NoTypeInformation
I haven't added the code for adding the new column yet, but this keeps the header. However, I end up with a bunch of empty rows where the Where-Object deletes the line, and the only way I can find to get rid of them is to run the output file through a Get-Content command. This all seems overly complicated for something that should be simple.
So, what am I missing?
Thanks to TheMadTechnician for pointing out what I was doing wrong. Here is my final script (with additional column added):
$Folders= Import-CSV C:\Folders.csv
ForEach ($Folder in $Folders)
{
$Folder.FullName = $Folder.FullName.replace('\\server\project\','')
}
$Folders | Where-Object {$_ -match '\\' -and $_ -notmatch 'User'} |
Select-Object *, @{Name='T/F'; Expression={'FALSE'}} |
Export-CSV C:\Folders.csv -NoTypeInformation
I would do this with a Table Array and pscustomobject.
#Create an empty Array
$Table = @()
#Manipulate the data
$Fullname = Get-Content C:\Folders.csv |
ForEach-Object {$_.replace('\\server\project\', '')} |
Where-Object {$_ -match '\\'} |
#Removes User Folders rows from CSV
Where-Object {$_ -notmatch 'User'}
#Define custom objects
Foreach ($name in $Fullname) {
    $Table += [pscustomobject]@{'Fullname' = $name; 'T/F' = 'FALSE'}
}
#Export results to new csv
$Table | Export-CSV C:\Folders-mod.csv -NoTypeInformation
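A side note on the design: growing an array with += copies it on every iteration, so for larger inputs it can be cheaper to let the pipeline build the collection directly. A rough sketch of the same idea, under the same assumptions as above:
$Table = Get-Content C:\Folders.csv |
    ForEach-Object { $_.replace('\\server\project\', '') } |
    Where-Object { $_ -match '\\' -and $_ -notmatch 'User' } |
    ForEach-Object { [pscustomobject]@{ 'Fullname' = $_; 'T/F' = 'FALSE' } }
$Table | Export-Csv C:\Folders-mod.csv -NoTypeInformation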
here's yet another way to do it ... [grin]
$FileList = @'
FullName
\\server\project\AOI
\\server\project\AOI\Folder1
\\server\project\AOI\Folder2
\\server\project\AOI\Folder3\User
'@ | ConvertFrom-Csv
$ThingToRemove = '\\server\project'
$FileList |
    Where-Object {
        # toss out any blank lines
        $_ -and
        # toss out any lines with "user" in them
        $_ -notmatch 'User'
    } |
    ForEach-Object {
        [PSCustomObject]@{
            FullName = $_.FullName -replace [regex]::Escape($ThingToRemove)
            'T/F'    = $False
        }
    }
output ...
FullName T/F
-------- ---
\AOI False
\AOI\Folder1 False
\AOI\Folder2 False
notes ...
putting a slash in the property name is ... icky [grin]
That requires wrapping the property name in quotes every time you need to access it. Try another name, perhaps "Correct".
you can test for blank array items [lines] with $_ all on its own
the [regex]::Escape() stuff is really quite handy
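For example, escaping the prefix turns every backslash into a literal match instead of a regex escape:
[regex]::Escape('\\server\project')
# returns \\\\server\\project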
I am using the following script to iterate through hundreds of text files looking for specific instances of a regex pattern within them. I need to add a second data point to the array that tells me which object the pattern matched in.
In the below script the [Regex]::Matches($str, $Pattern) | % { $_.Value } piece returns multiple rows per file, which cannot be easily output to a file.
What I would like to know is, how would I output a 2 column CSV file, one column with the file name (which should be $_.FullName), and one column with the regex results? The code of where I am at now is below.
$FolderPath = "C:\Test"
$Pattern = "(?i)(?<=\b^test\b)\s+(\w+)\S+"
$Lines = @()
Get-ChildItem -Recurse $FolderPath -File | ForEach-Object {
$_.FullName
$str = Get-Content $_.FullName
$Lines += [Regex]::Matches($str, $Pattern) |
% { $_.Value } |
Sort-Object |
Get-Unique
}
$Lines = $Lines.Trim().ToUpper() -replace '[\r\n]+', ' ' -replace ";", '' |
Sort-Object |
Get-Unique # Cleaning up data in array
I can think of two ways, but the simplest is to use a hashtable (dictionary). The other way is to create psobjects to fill your Lines variable. I am going to go with the simple way so you only need one variable, the hashtable.
$FolderPath = "C:\Test"
$Pattern = "(?i)(?<=\b^test\b)\s+(\w+)\S+"
$Results = @{}
Get-ChildItem -Recurse $FolderPath -File |
    ForEach-Object {
        $str = Get-Content $_.FullName
        $Line = [regex]::Matches($str, $Pattern) | % { $_.Value } | Sort-Object | Get-Unique
        $Line = $Line.Trim().ToUpper() -replace '[\r\n]+', ' ' -replace ";", '' | Sort-Object | Get-Unique # Cleaning up data in array
        $Results[$_.FullName] = $Line
    }
$Results.GetEnumerator() | Select @{L="Folder";E={$_.Key}}, @{L="Matches";E={$_.Value}} | Export-Csv -NoType -Path <Path to save CSV>
Your results will be in $Results. $Results.Keys contains the folder names, and $Results.Values has the results from the expression. You can reference the results of a particular folder by its key, $Results["Folder path"]; of course, it will error if the key does not exist.
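To avoid that error you can guard the lookup, for example (the path here is hypothetical):
$somePath = 'C:\Test\example.txt'   # hypothetical key
if ($Results.ContainsKey($somePath)) {
    $Results[$somePath]
}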
I have a problem that I am trying to solve; however, due to my non-existent PowerShell knowledge, it is proving to be harder than I hoped. So any help would be appreciated.
The problem can be simplified as:
Find a string in a txtfile
Extract the information on the row after that string
Store the information in a handle
Find a second string in the txtfile and repeat the procedure
Store both strings in a new file or delete everything else in the txt file.
I am then trying to do this for approx. 20k files. I would love to have the information under its keyword and comma-delimited so that I can import it into other systems.
My files look somewhat like the following
random words
that are unimportant
Keyword
FirstlineofNumbersthatIwanttoExtract
random words again that are unimportant
Secondkeyword
SecondLineOfNumbersThatIWantToExtract
end of the file
All files are, however, not similar in terms of which row the lines I want to extract are on. I would like the output to be something like:
Keyword, SecondKeyword
FirstLineOfNumbersThatIWantToExtract, SecondLineOfNumbersThatIWantToExtract
And done. I got this far:
[System.IO.DirectoryInfo]$folder = 'C:\users\xx\Desktop\mappcent3'
foreach ($file in ($folder.EnumerateFiles())) {
if ($file.Extension -eq '.txt') {
$content = Get-Content $file
$FirstRegex = 'KeyWordOne
(.+)$'
$First_output = "\1"
$test = Select-String -Path $file.FullName -Pattern $FirstRegex
}
}
This would do something similar to what you are asking. It requires PowerShell 3.0+.
$path = 'C:\users\xx\Desktop\mappcent3'
$firstKeyword = "Keyword"
$secondKeyword = "Secondkeyword"
$resultsPath = "C:\Temp\results.csv"
Get-ChildItem $path -Filter "*.txt" | ForEach-Object {
    # Read the file in
    $fileContents = Get-Content $_.FullName
    # Find the first keyword data
    $firstKeywordData = ($fileContents | Select-String -Pattern $firstKeyword -Context 0,1 -SimpleMatch).Context.PostContext[0]
    # Find the second keyword data
    $secondKeywordData = ($fileContents | Select-String -Pattern $secondKeyword -Context 0,1 -SimpleMatch).Context.PostContext[0]
    # Create a new object with the details gathered
    [pscustomobject][ordered]@{
        File = $_.FullName
        FirstKeywordData = $firstKeywordData
        SecondKeywordData = $secondKeywordData
    }
} | Export-Csv $resultsPath -NoTypeInformation
Select-String is what does most of the magic here. We take advantage of -Context, which captures lines before and after the match. We want the one following, which is why we use 0,1. Wrap that up in a custom object and then we can export it to a CSV file.
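A tiny illustration of how -Context 0,1 exposes the following line through PostContext, using the sample lines from the question:
$sample = 'random words', 'Keyword', 'FirstlineofNumbersthatIwanttoExtract'
($sample | Select-String -Pattern 'Keyword' -SimpleMatch -Context 0,1).Context.PostContext[0]
# returns FirstlineofNumbersthatIwanttoExtract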
Keyword Overlap
Beware that your keywords can overlap and create odd results in your output files. In your sample, Keyword matches multiple lines, so the result set would reflect that.
If you did just want to write back to the original file, you could easily do that as well:
"$firstKeywordData,$secondKeywordData" | Set-Content $_.FullName
Or something similar.
The Select-String cmdlet has a -Context parameter that makes it easy to extract lines before or after the line on which there's a match.
You can use Export-Csv to export to the format you require (although with 20K files you may want to write directly to the output files)
foreach ($file in Get-ChildItem C:\users\xx\Desktop\mappcent3 | Where {-not $_.PsIsContainer})
{
    $FirstKeyword = 'FirstKeyword'
    $FirstLine = Select-String -Path $file.FullName -Pattern $FirstKeyword -Context 0,1 | Select -Expand Context -First 1 | Select -Expand PostContext
    $SecondKeyword = 'SecondKeyword'
    $SecondLine = Select-String -Path $file.FullName -Pattern $SecondKeyword -Context 0,1 | Select -Expand Context -First 1 | Select -Expand PostContext
    New-Object psobject -Property @{$FirstKeyword=$FirstLine; $SecondKeyword=$SecondLine} | Export-Csv (Join-Path $file.DirectoryName ($file.BaseName + '_keywords.txt'))
}
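One small note: in Windows PowerShell, Export-Csv writes a #TYPE header line by default; adding -NoTypeInformation suppresses it, e.g.:
New-Object psobject -Property @{$FirstKeyword=$FirstLine; $SecondKeyword=$SecondLine} |
    Export-Csv (Join-Path $file.DirectoryName ($file.BaseName + '_keywords.txt')) -NoTypeInformation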