create search function to support wildcard char only when it is input by user - powershell

I'm writing a function to search (case-insensitively) through CSV files. To test this I'm using an array and user input to check against the array. I only want the search to treat * as a wildcard if the user actually types a wildcard, and the position of it matters too.
For example (in pseudo code), given the $array = "Hannah", "Anna", "Ann"
if userInput = Ann
output = "Ann"
if userInput = Ann*
output = "Ann", "Anna"
if userInput = *nna
output = "Anna"
if userInput = *nn*
output = "Hannah", "Anna", "Ann"
if userInput = *Hanna*
output = "Hannah"
if userInput = Hannah
output = "Hannah"
and so forth...
I'm using a CSV file test1.csv:
test1Column1,test1Column2,test1Column3
Hannah,12345,
Anna,1234,
Ann,2345,
I have the following code:
Function SearchContentName
{
    Param ($userSearchContent)
    If ($userSearchContent -Contains '`*')
    {
        Import-Csv test1.csv | % {if ($_.test1Column1 -Match $userSearchContent){$_.test1Column1}} | Export-Csv testresults.csv -NoTypeInformation
    }
    ElseIf ($userSearchContent -NotContains '`*')
    {
        Import-Csv test1.csv | % {if ($_.test1Column1 -Eq $userSearchContent){$_.test1Column1}} | Export-Csv testresults.csv -NoTypeInformation
    }
}

Function Main
{
    $userSearchContent = Read-Host "Enter Name"
    SearchContentName $userSearchContent
}

Main
Can anyone tell me what I'm doing wrong?

You're using the wrong operators. Since you want to allow wildcard matches, simply use -like for all user input:
Function SearchContentName {
    Param ($userSearchContent)
    Import-Csv test1.csv | ForEach-Object {
        if ($_.test1Column1 -like $userSearchContent) {
            $_.test1Column1
        }
    } | Set-Content testresults.csv
}
The -match operator is for regular expressions, where you'd have to express "any number of characters" as .*, not just *. The -like operator behaves like -eq if $userSearchContent doesn't contain wildcard characters, so you don't need to distinguish between the two cases. Both -like and -eq are case-insensitive by default, which also covers your case-insensitivity requirement.
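To see the difference at the prompt (a quick sketch, not part of the function itself):

# -like: wildcard matching; case-insensitive by default
'Anna' -like 'Ann*'      # True
'Anna' -like 'Ann'       # False
'Ann'  -like 'ann'       # True - behaves like -eq when no wildcard is present

# -match: regex matching; "any number of characters" is .*
'Anna' -match '^Ann.*$'  # True
'Anna' -match '^Ann\*$'  # False - there is no literal * in the string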
If you want the output CSV to contain whole lines and headers, use Where-Object instead of filtering with an if nested in a ForEach-Object:
Function SearchContentName {
    Param ($userSearchContent)
    Import-Csv test1.csv | Where-Object {
        $_.test1Column1 -like $userSearchContent
    } | Export-Csv testresults.csv -NoType
}
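Should you still need to know whether the user actually typed a wildcard (for example, to print a different message), PowerShell's WildcardPattern class can tell you. A minimal sketch (the messages are just placeholders):

if ([System.Management.Automation.WildcardPattern]::ContainsWildcardCharacters($userSearchContent)) {
    Write-Host "Searching with wildcards for '$userSearchContent'"
}
else {
    Write-Host "Searching for the exact name '$userSearchContent'"
}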

Related

Powershell: Find any value in a CSV column and replace it with a single value

I have a CSV file where I have to find any non-blank value in 2 specific columns and replace it with 'Yes'.
My data looks like this, where a row can have both columns blank, a value in either column, or a value in both.
Letter Grade    Numeric Grade
------------    -------------
A               10
C               5
I want it to look like this when I'm done:
Letter Grade    Numeric Grade
------------    -------------
Yes             Yes
Yes             Yes
I have 2 problems: addressing columns that have a space in the name (I tried wrapping the name with ", ' and {}), and finding a regex that matches any non-empty value. The code below works if I simply replace 'a' and the column is named Letter instead of Letter Grade.
I tried .+ to match anything in the cell, but I get no matches.
Thanks in advance!
Import-Csv -Path ".\test.csv" | ForEach-Object {
    if ($_.Letter -eq 'a') {
        $_.Letter = 'Yes'
    }
    $_
} | Export-Csv .\poop2.csv -Encoding UTF8
You could handle this programmatically: first collect all the property names from the first object (done here by accessing the intrinsic PSObject member), then enumerate each property of every object coming from the pipeline and check whether its value matches \S (any non-whitespace character).
Import-Csv path\to\csv.csv | ForEach-Object { $isFirstObject = $true } {
    if($isFirstObject) {
        $properties = $_.PSObject.Properties.Name
        $isFirstObject = $false
    }
    foreach($property in $properties) {
        if($_.$property -match '\S') {
            $_.$property = 'Yes'
        }
    }
    $_
} | Export-Csv path\to\newcsv.csv -NoTypeInformation
If, instead of programmatically gathering the object's property names, you wanted to use specific / hardcoded properties, the code would be simpler:
$properties = 'Letter Grade', 'Numeric Grade'
Import-Csv path\to\csv.csv | ForEach-Object {
    foreach($property in $properties) {
        if($_.$property -match '\S') {
            $_.$property = 'Yes'
        }
    }
    $_
} | Export-Csv path\to\newcsv.csv -NoTypeInformation
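As for the first problem from the question, addressing a column whose name contains a space: quoting the property name is enough. A minimal sketch using the question's column names and the same \S test:

Import-Csv path\to\csv.csv | ForEach-Object {
    # Quote property names that contain spaces
    if ($_.'Letter Grade' -match '\S') {
        $_.'Letter Grade' = 'Yes'
    }
    if ($_.'Numeric Grade' -match '\S') {
        $_.'Numeric Grade' = 'Yes'
    }
    $_
} | Export-Csv path\to\newcsv.csv -NoTypeInformation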

Count number of comments over multiple files, including multi-line comments

I'm trying to write a script that counts all comments in multiple files, including both single line (//) and multi-line (/* */) comments and prints out the total. So, the following file would return 4
// Foo
var text = "hello world";
/*
Bar
*/
alert(text);
There's a requirement to include specific file types and exclude certain file types and folders, which I already have working in my code.
My current code is:
( gci -include *.cs,*.aspx,*.js,*.css,*.master,*.html -exclude *.designer.cs,jquery* -recurse `
| ? { $_.FullName -inotmatch '\\obj' } `
| ? { $_.FullName -inotmatch '\\packages' } `
| ? { $_.FullName -inotmatch '\\release' } `
| ? { $_.FullName -inotmatch '\\debug' } `
| ? { $_.FullName -inotmatch '\\plugin-.*' } `
| select-string "^\s*//" `
).Count
How do I change this to get multi-line comments as well?
UPDATE: My final solution (slightly more robust than what I was asking for) is as follows:
$CodeFiles = Get-ChildItem -include *.cs,*.aspx,*.js,*.css,*.master,*.html -exclude *.designer.cs,jquery* -recurse |
    Where-Object { $_.FullName -notmatch '\\(obj|packages|release|debug|plugin-.*)\\' }

$TotalFiles = $CodeFiles.Count
$IndividualResults = @()

$CommentLines = ($CodeFiles | ForEach-Object {
    # Get the comments via regex
    $Comments = ([regex]::matches(
        [IO.File]::ReadAllText($_.FullName),
        '(?sm)^[ \t]*(//[^\n]*|/[*].*?[*]/)'
    ).Value -split '\r?\n') | Where-Object { $_.length -gt 0 }
    # Get the total lines
    $Total = ($_ | select-string .).Count
    # Add to the results table
    $IndividualResults += @{
        File     = $_.FullName | Resolve-Path -Relative;
        Comments = $Comments.Count;
        Code     = ($Total - $Comments.Count)
        Total    = $Total
    }
    Write-Output $Comments
}).Count

$TotalLines = ($CodeFiles | select-string .).Count

$TotalResults = New-Object PSObject -Property @{
    Files    = $TotalFiles
    Code     = $TotalLines - $CommentLines
    Comments = $CommentLines
    Total    = $TotalLines
}

Write-Output (Get-Location)
Write-Output $IndividualResults | % { New-Object PSObject -Property $_ } | Format-Table File,Code,Comments,Total
Write-Output $TotalResults | Format-Table Files,Code,Comments,Total
To be clear: Using string matching / regular expressions is not a fully robust way to detect comments in JavaScript / C# code, because there can be false positives (e.g., var s = "/* hi */";); for robust parsing you'd need a language parser.
If that is not a concern, and it is sufficient to detect comments (that start) on their own line, optionally preceded by whitespace, here's a concise solution (PSv3+):
(Get-ChildItem -include *.cs,*.aspx,*.js,*.css,*.master,*.html -exclude *.designer.cs,jquery* -recurse |
    Where-Object { $_.FullName -notmatch '\\(obj|packages|release|debug|plugin-.*)' } |
    ForEach-Object {
        [regex]::matches(
            [IO.File]::ReadAllText($_.FullName),
            '(?sm)^[ \t]*(//[^\n]*|/[*].*?[*]/)'
        ).Value -split '\r?\n'
    }
).Count
With the sample input, the ForEach-Object command yields 4.
Remove the ^[ \t]* part to match comments starting anywhere on a line.
The solution reads each input file as a single string with [IO.File]::ReadAllText() and then uses the [regex]::Matches() method to extract all (potentially line-spanning) comments.
Note: You could use Get-Content -Raw instead to read the file as a single string, but that is much slower, especially when processing multiple files.
The regex uses in-line options s and m ((?sm)) to respectively make . match newlines too and to make anchors ^ and $ match line-individually.
^[ \t]* matches any mix of spaces and tabs, if any, at the start of a line.
//[^\n]* matches everything from // through the end of the line.
/[*].*?[*]/ matches a block comment, potentially spanning multiple lines; note the lazy quantifier, *?, which ensures that the very next instance of the closing */ delimiter is matched.
The matched comments (.Value) are then split into individual lines (-split '\r?\n'), which are output.
The resulting lines across all files are then counted (.Count).
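For instance, running the regex against the sample from the question in isolation (a quick sketch; the here-string stands in for a file's content) yields 4:

$sample = @'
// Foo
var text = "hello world";
/*
Bar
*/
alert(text);
'@

([regex]::Matches($sample, '(?sm)^[ \t]*(//[^\n]*|/[*].*?[*]/)').Value -split '\r?\n').Count  # -> 4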
As for what you tried:
The fundamental problem with your approach is that Select-String with file-info object input (such as provided by Get-ChildItem) invariably processes the input files line by line.
While this could be remedied by calling Select-String inside a ForEach-Object script block in which you pass each file's content as a single string to Select-String, direct use of the underlying regex .NET types, as shown above, is more efficient.
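For illustration, such a Select-String-based variant could look roughly like this (a sketch that omits the exclusion filters from the question for brevity):

(Get-ChildItem -Include *.cs,*.aspx,*.js,*.css,*.master,*.html -Recurse |
    ForEach-Object {
        [IO.File]::ReadAllText($_.FullName) |
            Select-String '(?sm)^[ \t]*(//[^\n]*|/[*].*?[*]/)' -AllMatches |
            ForEach-Object { $_.Matches.Value -split '\r?\n' }
    }
).Count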
An IMO better approach is to count net code lines by removing single-line/multi-line comments.
For a start, here's a script that handles a single file; for a sample.cs like the one in the sample output below, it returns 5 (the number of code lines):
((Get-Content sample.cs -raw) -replace "(?sm)^\s*\/\/.*?$" `
-replace "(?sm)\/\*.*?\*\/.*`n" | Measure-Object -Line).Lines
EDIT: rather than also removing empty lines, build the difference from the total line count:
## Q:\Test\2018\10\31\SO_53092258.ps1
$Data = Get-ChildItem *.cs | ForEach-Object {
    $Content    = Get-Content $_.FullName -Raw
    $TotalLines = (Measure-Object -Input $Content -Line).Lines
    $CodeLines  = ($Content -replace "(?sm)^\s*\/\/.*?$" `
                            -replace "(?sm)\/\*.*?\*\/.*`n" | Measure-Object -Line).Lines
    $Comments   = $TotalLines - $CodeLines
    [PSCustomObject]@{
        File     = $_.FullName
        Lines    = $TotalLines
        Comments = $Comments
    }
}
$Data
"="*40
"TotalLines={0} TotalCommentLines={1}" -f (
    $Data | Measure-Object -Property Lines,Comments -Sum).Sum
Sample output:
> Q:\Test\2018\10\31\SO_53092258.ps1
File Lines Comments
---- ----- --------
Q:\Test\2018\10\31\example.cs 10 5
Q:\Test\2018\10\31\sample.cs 9 4
============================================
TotalLines=19 TotalCommentLines=9

Compare 2 .csv files

I have two .csv files with a lot of information in them. If a row ends with an "M", I have to check whether that row exists in the other file. If it is there, I then have to check whether the code at the beginning of the row is the same; if it isn't, I do nothing, but if it is, I have to create a new file.
The information I have to look for, and the code at the beginning of each row, were shown in screenshots in the original question. I also have rows with a "B" at the end, but those are unimportant.
When the information is present, I have to export all rows that are the same in both files, i.e. the rows that share the same code at the beginning, into a new file.
I have tried different solutions that I found on the Internet, but nothing really works.
Perhaps something like this?
$datenbank = Import-Csv "C:\Users\information1.csv"
$zentral = Import-Csv "C:\Users\information2.csv"
$new = ""
foreach ($line in $datenbank) {
    $Spalte = $line.Split(",")
    foreach ($z in $Zentral) {
        $found = $false
        foreach ($d in $Datenbanktyp) {
            if ($d.$Spalte[1] -eq $z.$Spalte[1]) {
                $found = $true
            }
        }
        if ($found -eq $true) {
            $new += $z
        }
    }
}
Or can it work with an if..elseif..else construct?
Let's see if I got this right. You have one file where the second-last column contains a letter. If that letter is "M" you want to check if the value of the column before that (partially) matches a column from a second file. If it does, you then want to export all rows from the second file that have the same value in the first column as the matched row to a new file.
Since you didn't reveal the column names I'm going to dub the third- and second-last columns from the first file "Erin" and "Marty", the match column from the second file "Pat", and the first column from the second file "Gene".
$datenbank | Where-Object {
    $_.Marty -ceq 'M'
} | Select-Object -Expand Erin -Unique | ForEach-Object {
    $erin    = $_                     # capture the current value; $_ is re-bound inside Where-Object below
    $outfile = "export_${erin}.csv"   # adjust output filename as you see fit
    $firstcol = $zentral |
        Where-Object { $_.Pat -like "*${erin}*" } |
        Select-Object -Expand Gene
    $zentral | Where-Object {
        $_.Gene -eq $firstcol
    } | Export-Csv $outfile
}
Another approach would be to group your second file by the first column and then check if the groups contain a matching value.
$groups = $zentral | Group-Object Gene
$datenbank | Where-Object {
    $_.Marty -ceq 'M'
} | Select-Object -Expand Erin -Unique | ForEach-Object {
    $erin    = $_                     # capture the current value; $_ is re-bound inside Where-Object below
    $outfile = "export_${erin}.csv"   # adjust output filename as you see fit
    $groups | Where-Object {
        $_.Group.Pat -like "*${erin}*"
    } | Select-Object -Expand Group | Export-Csv $outfile
}
Replace "Erin", "Marty", "Pat" and "Gene" with the actual column titles from your CSV files. Should your files not contain column titles you need to specify them via the -Header parameter of Import-Csv, otherwise the cmdlet will interpret the first data row as the headers.

Combining like objects in an array

I am attempting to analyze a group of text files (MSFTP logs) and do counts of IP addresses that have submitted bad credentials. I think I have it worked out except I don't think that the array is passing to/from the function correctly. As a result, I get duplicate entries if the same IP appears in multiple log files. What am I doing wrong?
Function LogBadAttempt($FTPLog,$BadPassesArray)
{
    $BadPassEx="PASS - 530"
    Foreach($Line in $FTPLog)
    {
        if ($Line -match $BadPassEx)
        {
            $IP=($Line.Split(' '))[1]
            if($BadPassesArray.IP -contains $IP)
            {
                $CurrentIP=$BadPassesArray | Where-Object {$_.IP -like $IP}
                [int]$CurrentCount=$CurrentIP.Count
                $CurrentCount++
                $CurrentIP.Count=$CurrentCount
            } else {
                $info=@{"IP"=$IP;"Count"='1'}
                $BadPass=New-Object -TypeName PSObject -Property $info
                $BadPassesArray += $BadPass
            }
        }
    }
    return $BadPassesArray
}
$BadPassesArray=@()
$FTPLogs = Get-Childitem \\ftpserver\MSFTPSVC1\test
$Result = ForEach ($LogFile in $FTPLogs)
{
    $FTPLog=Get-Content ($LogFile.fullname)
    LogBadAttempt $FTPLog
}
$Result | Export-csv C:\Temp\test.csv -NoTypeInformation
The result looks like...
Count IP
7 209.59.17.20
20 209.240.83.135
18441 209.59.17.20
13059 200.29.3.98
and I would like it to combine the entries for 209.59.17.20.
You're making this way too complicated. Process the files in a pipeline and use a hashtable to count the occurrences of each IP address:
$BadPasswords = @{}
Get-ChildItem '\\ftpserver\MSFTPSVC1\test' | Get-Content | ? {
    $_ -like '*PASS - 530*'
} | % {
    $ip = ($_ -split ' ')[1]
    $BadPasswords[$ip]++
}
$BadPasswords.GetEnumerator() |
    select @{n='IP';e={$_.Name}}, @{n='Count';e={$_.Value}} |
    Export-Csv 'C:\Temp\test.csv' -NoType
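If you'd like the worst offenders listed first, you could sort the hashtable entries before exporting, e.g.:

$BadPasswords.GetEnumerator() |
    Sort-Object Value -Descending |
    Select-Object @{n='IP';e={$_.Name}}, @{n='Count';e={$_.Value}} |
    Export-Csv 'C:\Temp\test.csv' -NoTypeInformation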

trim object contents in csv import

I need to run a Trim() on each value extracted from the CSV import object. I haven't tried something like the code below, but I don't want to have to tack a trim call onto every one of the variables being passed to my functions.
$csvobj = "c:\somestuff.csv"
foreach ($csvitem in $csvobj) {
$csvitem.value1.trim()
$csvitem.value2.trim()
}
Thanks in advance, SS
This will trim all values in the csv file and assign the result to $csv.
$csv = Import-Csv c:\somestuff.csv | Foreach-Object {
    $_.PSObject.Properties | Foreach-Object { $_.Value = $_.Value.Trim() }
    $_   # emit the modified object so it actually ends up in $csv
}
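From there you can, for instance, write the trimmed data back out (the target path is just an example):

$csv | Export-Csv c:\somestuff_trimmed.csv -NoTypeInformation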
You can try this:
$a = import-csv "c:\somestuff.csv"
$a | % {$b=$_;$_.psobject.Properties | % {$c=$_.name ;$b."$c"=($b."$c").trim()}}
First: I import all the lines into $a.
Second: in a first loop I read each object (each CSV line); then, for each object, a second loop reads each property so I can trim each value.
Here's a slight modification of the accepted answer above. This strips all leading / trailing spaces from all .csv files in the current folder. Hope it helps!
gci -filter *.csv | foreach {
    echo $_.NAME
    $csv = Import-Csv $_.NAME
    $csv | Foreach-Object {
        $_.PSObject.Properties | Foreach-Object { $_.Value = $_.Value.Trim() }
    }
    $csv | Export-Csv $_.NAME -NoTypeInformation
}