Powershell refer to objects in table - select

I have a script I am working on that will output all fileNames and lineNumbers from a keyword search.
$Paths = gci . *.* -rec | where { ! $_.PSIsContainer } | resolve-path
foreach($path in $Paths)
{
$ftp += Select-String -Path $Path -Pattern "FTP"
}
$ftpgroups = $ftp | select fileName,LineNumber | Format-Table -groupBy Filename
I decided to go with ft -GroupBy because Group-Object was not working correctly. But I need a way to reference this table so I can put it into a CSV. When I use the Get-Member cmdlet it only gives me formatting properties. The ideal output for this is to have one fileName matched up with a group of fileLines. That way I can match that up to the path (which Group-Object worked successfully on).
I am open to new ideas if I am going about this the wrong way. Thank you in advance; I hope it doesn't cause you as much trouble as it has caused me.

As you have found, the output of any of the Format-* cmdlets is formatting objects. These objects are meant for display in the console, not for further manipulation. You really need Group-Object for this. In what way wasn't it working for you? I would think this would work:
$ftpgroups = $ftp | Select Filename,LineNumber | Group Filename
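If the end goal is a CSV with one row per file and its matching line numbers, a calculated property over the grouped matches should get you there. A rough sketch, assuming $ftp holds the Select-String matches, PSv3+ (for the member enumeration), and an example output path:
# One row per file; the matching line numbers joined into a single column
$ftp |
Group-Object Filename |
Select-Object @{ n = 'FileName'; e = { $_.Name } },
@{ n = 'LineNumbers'; e = { $_.Group.LineNumber -join ';' } } |
Export-Csv C:\temp\ftp_matches.csv -NoTypeInformation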


Powershell - GetChilditem grab first result

I have an issue with a PowerShell script I have made. The purpose of the script is to gather information from various resources, the CMDB, and other systems, combine it into one report, and send it.
I have everything working just fine, except one single thing that keeps bothering me. In my script, I do a lot of parsing and trimming of the information I get, and in some functions I need to get some XML files. Example:
$filter = "D:\WEC\Script\Rapportering\BigFixData\"
$xmlfiles = Get-ChildItem -path $filter -Filter "Bigfix_trimmed_JN.xml" -Recurse -Force |where {$_.psIsContainer -eq $false }
$xmlfile = $xmlfiles | ogv -OutputMode Single
There will always be only one file to grab, and that's why I use the Filter option and give the specific name. The code above triggers a pop-up asking me to select the file. It works fine except for the file picker popup; I want to get rid of that.
I then changed the code to this:
$filter = "D:\WEC\Script\Rapportering\BigFixData\"
$xmlfiles = Get-ChildItem -path $filter -Filter "Bigfix_trimmed_JN.xml" | Select-Object -First 1 |where {$_.psIsContainer -eq $false }
This no longer shows the popup, but it does not seem to select the file, resulting in a ReferenceObject error later in the script because the value is null.
The script is about 1000 lines and I have narrowed the error down to the command above.
Can anyone help me figure out what I am doing wrong?
Thanks in advance
Your 2nd command is missing the -Recurse switch, which may explain why you're not getting any result.
While it is unlikely that directories match a filter pattern as specific as "Bigfix_trimmed_JN.xml", the more concise and faster way to limit matching to files only in PSv3+ is to use the -File switch (complementarily, there's also a -Directory switch).
$xmlfile = Get-ChildItem $filter -Filter Bigfix_trimmed_JN.xml -Recurse -File |
Select-Object -First 1
You should add a check to see if no file was returned.
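A minimal sketch of such a check ($xmlfile and $filter come from the snippets above; the error message is just an example):
if (-not $xmlfile) {
    throw "No Bigfix_trimmed_JN.xml found under $filter"
}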
If you want the first file, you'll need to filter out the directories before piping to Select-Object -First 1 - otherwise you run the risk of the first input element being a directory, in which case your pipeline evaluates to $null:
$xmlfiles = Get-ChildItem -path $filter -Filter "Bigfix_trimmed_JN.xml" | Where-Object {-not $_.PsIsContainer} | Select-Object -First 1

Powershell -- Get-ChildItem Directory full path and lastaccesstime

I am attempting to output the full directory path and LastAccessTime on one line.
Needed --
R:\Directory1\Directory2\Directory3, March 10, 1015
What I am getting --
R:\Directory1\Directory2\Directory3
March 10, 1015
Here is my code. It isn't that complicated, but it is beyond me.
Get-ChildItem -Path "R:\" -Directory | foreach-object -process{$_.FullName, $_.LastAccessTime} | Where{ $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) } | Out-File c:\temp\test.csv
I have used Foreach-Object in the past in order to ensure I do not truncate excessively long directory names and paths, but I have never used it when pulling two properties. I would like the information to all be on one line, but haven't been successful. Thanks in advance for the assist.
I recommend filtering (Where-Object) before selecting the properties you want. Also I think you want to replace ForEach-Object with Select-Object, and lastly I think you want Export-Csv rather than Out-File. Example:
Get-ChildItem -Path "R:\" -Directory |
Where-Object { $_.LastAccessTime -lt [DateTime]::Today.AddYears(-2) } |
Select-Object FullName,LastAccessTime |
Export-Csv C:\temp\test.csv -NoTypeInformation
We can get your output on one line pretty easily, but to make it easy to read we may have to split your script out over multiple lines. I'd recommend saving the script below as a ".ps1" file, which would allow you to right-click it and select "Run with PowerShell" to make it easier in the future. This script could be modified to take more inputs and variables in order to make it more modular and work in more situations, but for now we'll work with the constants you provided.
$dirs = Get-ChildItem -Path "R:\" -Directory
We'll keep the first line you made, since that is solid and there's nothing to change.
$arr = $dirs | Where-Object { $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) } | Select-Object FullName, LastAccessTime
For the second line, we'll filter with Where-Object first and then use Select-Object to pick out just the two properties. Passing the property names to Select-Object (rather than a script block) keeps FullName and LastAccessTime as real properties on each object in the array, which is what lets us keep each path and its timestamp paired up. I've also expanded your "Where" to "Where-Object", since it's best practice to use the full cmdlet name instead of the alias.
Lastly, we'll want to convert our "$arr" object to csv before putting in the temp out-file.
$arr | ConvertTo-Csv -NoTypeInformation | Out-File "C:\Temp\test.csv"
Putting it all together, your final script will look like this:
$dirs = Get-ChildItem -Path "R:\" -Directory
$arr = $dirs | Where-Object { $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) } | Select-Object FullName, LastAccessTime
$arr | ConvertTo-Csv -NoTypeInformation | Out-File "C:\Temp\test.csv"
Again, you can take this further by creating a function, binding it to a cmdlet, and creating parameters for your path, output file, and all that fun stuff.
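A rough sketch of what that parameterized version might look like (the function name, parameter names, and defaults here are purely illustrative):
function Export-StaleDirectoryReport {
    param(
        [string]$Path = 'R:\',
        [string]$OutFile = 'C:\Temp\test.csv',
        [int]$YearsBack = 2
    )
    # Directories not accessed within the last $YearsBack years, exported as CSV
    Get-ChildItem -Path $Path -Directory |
        Where-Object { $_.LastAccessTime -lt [datetime]::Today.AddYears(-$YearsBack) } |
        Select-Object FullName, LastAccessTime |
        Export-Csv $OutFile -NoTypeInformation
}
Export-StaleDirectoryReport -Path 'R:\' -OutFile 'C:\Temp\test.csv'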
Let me know if this helps!

Searching through a text file

I have a script that searches for the latest modified log file. It is then supposed to read that text file, pick up a key phrase, and display the line after it.
So far I have this:
$logfile = get-childitem 'C:\logs' | sort {$_.lastwritetime} | where {$_ -notmatch "X|Zr" }| select -last 1
$error = get-content $logfile | select-string -pattern "Failed to Modify"
An example of a line it reads is this:
20150721 12:46:26 398fbb92 To CV Failed to Modify
CN=ROLE-x-USERS,OU=Role Groups,OU=Groups,DC=gyp,DC=gypuy,DC=net
MDS_E_BAD_MEMBERSHIP One or more members do not exist in the directory
The key bit of information I'm trying to get here is the line after the match, i.e. the CN=ROLE-x-USERS,... line.
Can anyone help?
Thanks
Try this:
$error = get-content $logfile |
Where-Object { $_ -like "*Failed to Modify*" } |
Select-Object -First 1
This is provided you are looking for the first match in the file. The Select-String cmdlet returns a MatchInfo object. Depending on your requirements there might be no reason to add that level of complexity if you're just looking to pull the first occurrence of this error in the file.
Failing this, my recommendation would be to debug this and step through it. Break on the Get-Content call and see what $logfile is. Run Get-Content $logfile and see what that content looks like. Then do your Select-String on that output. See what MatchInfo.ToString() looks like. Maybe you'll see some disconnect.
Again, my recommendation would be to just parse manually through the file and work with the Where-Object cmdlet at this point.
This should work:
get-childitem 'c:\logs' | where {$_.Name -notmatch "X|Zr" } | sort {$_.lastwritetime} | select -last 1 | select-string "Failed to Modify"
But I don't like the "X|Zr" part. If your log files have a .txt extension, it won't list them, because you're saying you don't want any file containing "x" or "zr" anywhere in its entire name. Use $_.BaseName (the name without the extension), or modify the regular expression.
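If what you ultimately need is the line after the match (the CN=... line), Select-String's -Context parameter can capture it. A rough sketch, reusing the path and name filter from the question:
$logfile = Get-ChildItem 'C:\logs' |
Where-Object { $_.BaseName -notmatch "X|Zr" } |
Sort-Object LastWriteTime |
Select-Object -Last 1
# -Context 0,1 keeps 0 lines before and 1 line after each match
$match = Select-String -Path $logfile.FullName -Pattern "Failed to Modify" -Context 0,1 |
Select-Object -First 1
$match.Context.PostContext    # the line that follows the match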

Combine all content from several files, find matching strings and get a count of each line

I know how to get the data and search through it using some pattern. But that is not what I need.
Get-ChildItem -recurse -Filter *.xml | Get-Content | Select-String -pattern "something here"
I am searching through hundreds of GPO XML files, and we are trying to remove GPOs that perform the same thing over and over again. I want to find the unique values, combine them into one big happy GPO, and get rid of all the redundant ones.
My goal :
1) Get all information from all *.xml files from 100's of sub folders and combine them into one file.
2) Find all lines that contain the same string and get a count of that string. I need a count for all strings in the combined file.
3) My goal is to find the lines that are unique and save them to a file, for further use.
Here's a quick-and-dirty approach using a Hashtable. Since the Hashtable setter performs an "update or create", you'll end up with a distinct list:
$ht = @{}
Get-ChildItem -recurse -Filter *.xml | Get-Content | %{$ht[$_] = $true}
$ht.Keys
Edit: Just saw you wanted counts as well. You can do this:
$ht = @{}
Get-ChildItem -recurse -Filter *.xml | Get-Content | %{$ht[$_] = $ht[$_]+1}
$ht
To export to CSV:
$ht.GetEnumerator() | select key, value | Export-Csv D:\output.csv
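An alternative sketch using Group-Object, which produces the counts directly and also makes goal 3 (the lines that occur only once) a simple filter; the unique-lines output path is just an example:
$grouped = Get-ChildItem -Recurse -Filter *.xml | Get-Content | Group-Object
# Every distinct line along with how many times it occurs
$grouped | Select-Object Count, Name | Export-Csv D:\output.csv -NoTypeInformation
# Only the lines that occur exactly once
$grouped | Where-Object { $_.Count -eq 1 } | Select-Object -ExpandProperty Name | Set-Content D:\unique_lines.txt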

PowerShell Out-file manipulation

I hope someone can help.
I am trying to manipulate a file created by PowerShell.
I managed to get to the end result that I want, but I am sure it would be easier if it were only one command.
# Invoke the Exchange snap-in (make sure you are an Exchange Admin to do so)
add-pssnapin Microsoft.Exchange.Management.PowerShell.E2010
#Create a file with a list of the DLs in the organization
Get-DistributionGroup | Select-Object Name | Out-File C:\Pre_DLGroups.txt
$content = Get-Content C:\Pre_DLGroups.txt
#Remove the first 3 lines of the file, which you don't need
$content | Select-Object -Skip 3 | Out-file C:\DLGroups.txt
#Trim the trailing spaces and create the final file
Get-Content C:\DLGroups.txt | Foreach {$_.TrimEnd()} | Set-Content c:\FinalDLGroup.txt
Is there a way to get the end result in a single file rather than creating 3?
cheers
Elton
You can send your content across the pipeline without writing it out to files. You can use parentheses to group the output of certain sets of cmdlets and/or functions, and then pipe that output through to the intended cmdlets.
This can all be applied on a single line, but I've written it here on multiple lines for formatting reasons. The addition of Out-String -Stream is something of a safety measure: it turns the formatted output into plain strings, one per line, so that whatever output you're intending to trim can actually be trimmed.
Since we're not getting this content from a text file anymore, PowerShell could otherwise hand Foreach an object that doesn't understand TrimEnd(), so we need to be ready for that.
(Get-DistributionGroup | Select-Object Name) |
Out-String -Stream |
Select-Object -Skip 3 |
Foreach {$_.TrimEnd()} |
Set-Content c:\FinalDLGroup.txt
However, an even smaller solution would involve just pulling each name and manipulating it directly. I'm using % here as an alias for Foreach-Object. This example uses Get-ChildItem, where I have some files named test in my current directory:
(Get-ChildItem test*) |
% { $_.Name.TrimEnd() } |
Set-Content c:\output.txt
Applied to your distribution groups, the whole thing collapses to the pipeline below. Working with the objects directly means there are no header lines to skip and no column padding to trim:
Get-DistributionGroup |
Select-Object -ExpandProperty Name |
Set-Content c:\FinalDLGroup.txt