PowerShell - Get-ChildItem grab first result

I have an issue with a PowerShell script I have made. The purpose of the script is to gather information from various resources, a CMDB and other systems, and combine it into a single report to send out.
I have everything working just fine, except one single thing that keeps bothering me. In my script I do a lot of parsing and trimming of the information I get, and in some functions I need to get some XML files. Example:
$filter = "D:\WEC\Script\Rapportering\BigFixData\"
$xmlfiles = Get-ChildItem -path $filter -Filter "Bigfix_trimmed_JN.xml" -Recurse -Force |where {$_.psIsContainer -eq $false }
$xmlfile = $xmlfiles | ogv -OutputMode Single
There will always be only one file to grab, which is why I use the -Filter option and give the specific name. The code above works fine, but it triggers a pop-up asking me to select the file, and I want to get rid of that file picker.
I then changed the code to this:
$filter = "D:\WEC\Script\Rapportering\BigFixData\"
$xmlfiles = Get-ChildItem -path $filter -Filter "Bigfix_trimmed_JN.xml" | Select-Object -First 1 |where {$_.psIsContainer -eq $false }
This no longer shows the popup, but it does not seem to select the file, resulting in a ReferenceObject error later in the script because the variable is null.
The script is about 1000 lines and I have narrowed the error down to the command above.
Can anyone help me figure out what I am doing wrong?
Thanks in advance

Your 2nd command is missing the -Recurse switch, which may explain why you're not getting any result.
While it is unlikely that any directories would match a filter pattern as specific as "Bigfix_trimmed_JN.xml", the more concise and faster way to limit matching to files only in PSv3+ is to use the -File switch (complementarily, there's also a -Directory switch).
$xmlfile = Get-ChildItem $filter -Filter Bigfix_trimmed_JN.xml -Recurse -File |
Select-Object -First 1
You should add a check to see if no file was returned.
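For example, a minimal sketch of such a check (the error message is just illustrative):
$xmlfile = Get-ChildItem $filter -Filter Bigfix_trimmed_JN.xml -Recurse -File |
           Select-Object -First 1
# Bail out early if nothing matched, instead of failing later on a null reference.
if (-not $xmlfile) {
    throw "No Bigfix_trimmed_JN.xml found under $filter"
}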

If you want the first file, you'll need to filter out the directories before piping to Select-Object -First 1 - otherwise you run the risk of the first input element being a directory and your pipeline therefore evaluating to $null:
$xmlfiles = Get-ChildItem -path $filter -Filter "Bigfix_trimmed_JN.xml" | Where-Object {-not $_.PsIsContainer} | Select-Object -First 1

Related

Counting .csv files based on their first n number of characters

I need to write a script that checks the first 3 (to, say, 5) characters of each .csv file's name, then counts those files and reports each prefix with its corresponding count.
I already have a few methods to do the simpler tasks. I've used a filter before to only select certain .csvs, using this:
Get-ChildItem C:\path -Recurse -Filter *.csv | Where {$_.Name -match 'NNN'}
I also use this to count the number of csvs in that corresponding location:
(Get-ChildItem C:\path -Recurse -Filter *.csv | Measure-Object).Count
How can I run a scan through a folder with say 3 random titles for the csv? Say they're RUNxxxxxxxx.csv, FUNxxxxxxx.csv, and TUNxxxxxxx.csv.
Edit: Let me explain more; basically, the example csvs I have above would be completely random, so it'd need to recognize that those first 3 characters are different and count each of them separately.
I'm not sure if a prompt for inputting these would do any good. The values are known, just different from week to week (which is when this would be run; every week).
Thanks!
You can accomplish that using the Where-Object filter, combining three -or statements that match anything starting with the three letters you want, followed by a wildcard:
Get-ChildItem C:\path -Recurse -Filter *.csv | Where {$_.Name -like "RUN*" -or $_.Name -like "FUN*" -or $_.Name -like "TUN*"}
You could also use Get-ChildItem's -Include parameter to accomplish this, instead of searching for all .csv files and then filtering with Where-Object:
Get-ChildItem C:\path -Recurse -Include "RUN*.csv","FUN*.csv","TUN*.csv"
How about using a calculated property to add the desired part as an additional property you can work with?
$PatternList =
    'run',
    'fun',
    'tun'
Get-ChildItem -Recurse -Filter *.csv |
    Select-Object -Property @{Name = 'Part'; Expression = {($_.Name).Substring(0,3)}}, * |
    Where-Object -Property Part -In -Value $PatternList |
    Group-Object -Property Part
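If you only need the prefix/count report the question asks for, appending a Select-Object pulls it out of the groups - a sketch reusing C:\path from the question:
$PatternList = 'run', 'fun', 'tun'
Get-ChildItem C:\path -Recurse -Filter *.csv |
    Select-Object -Property @{Name = 'Part'; Expression = {($_.Name).Substring(0,3)}}, * |
    Where-Object -Property Part -In -Value $PatternList |
    Group-Object -Property Part |
    Select-Object Name, Count   # one row per prefix with its file count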

How can I use PowerShell or a cmd "dir" to get the contents of multiple, but similar paths?

For example, I want the contents of the "Last" folder in the structure below. The various path structures are identical except for the first two levels.
C:\zyx-wvu\abc\Level3\Last
C:\tsr-qpo\def\Level3\Last
C:\nml-kji\ghi\Level3\Last
In PowerShell I get close with:
Get-ChildItem -Path C:\*-*\*
...but it doesn't return any results (as in it never finishes) when I try:
Get-ChildItem -Path C:\*-*\*\Level3
Get-ChildItem -Path C:\*-*\*
will only show you what's in the second level under anything with a hyphen in C:\,
i.e. it will show
c:\1-2\alpha
c:\1-5\beta
etc...
What you want is
Get-ChildItem -Path C:\*-*\*\*
or more likely you want
Get-ChildItem -Path C:\*-*\* -recurse
If you want to find paths with the SAME name, you could group them together and pull out anything with more than one match. You didn't ask very specifically what you wanted, but here are some ideas:
Get-ChildItem -Path C:\*-*\*\* | Group-Object -Property BaseName | Where-Object Count -gt 1 | Select-Object -ExpandProperty Group

Powershell - Match ID's in a text file against filenames in multiple folders

I need to search through 350,000 files to find any that contain certain patterns in the filename. However, the list of patterns (ID numbers) that need to be matched is 1000 long! So I would very much like to be able to script this, because they were originally planning on doing it manually...
So to make it clearer:
Check each file in the folder and all subfolders.
If the filename contains any of the IDs in the text file, move it to another folder.
Otherwise, ignore it.
So I have the basic code that works with a single value:
$name = Get-Content 'C:\test\list.txt'
get-childitem -Recurse -path "c:\test\source\" -filter "*$name*" |
move-item -Destination "C:\test\Destination"
If I change $name to point to a single ID, it works; if I have a single ID in the txt file, it works. With multiple items in the list:
1111111
2222222
3333333
It fails. What am I doing wrong? How can I get it to work? I'm still new to PowerShell, so please be a little more descriptive in any answers.
Your test fails because it is effectively trying to do this (using your test data).
Get-ChildItem -Recurse -Path "c:\test\source\" -filter "*1111111 2222222 3333333*"
Which obviously does not work. It is squishing the array into one single space delimited string. You have to account for the multiple id logic in a different way.
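You can see the squishing for yourself (using the sample IDs from the question):
$name = Get-Content 'C:\test\list.txt'   # a multi-line file comes back as an array of strings
"*$name*"                                # the array is space-joined when interpolated: *1111111 2222222 3333333*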
I am not sure which of these will perform better so make sure you test both of these with your own data to get a better idea of execution time.
Cycle each "filter"
$filters = Get-Content 'C:\test\list.txt'
# Get the files once
$files = Get-ChildItem -Recurse -Path "c:\test\source" -File
# Cycle Each ID filter manually
$filters | ForEach-Object{
    $singleFilter = $_
    $files | Where-Object{$_.Name -like "*$singleFilter*"}
} | Move-Item -Destination "C:\test\Destination"
Make one larger filter
$filters = Get-Content 'C:\test\list.txt'
# Build a large regex alternative match pattern. Escape each ID in case there are regex metacharacters.
$regex = ($filters | ForEach-Object{[regex]::Escape($_)}) -join "|"
# Get the files once
Get-ChildItem -Recurse -path "c:\test\source" -File |
Where-Object{$_.Name -match $regex} |
Move-Item -Destination "C:\test\Destination"
Try reading up on how the Get-Content cmdlet works. When you have a multi-line file, you get an array back; you then have to iterate through that array and apply the logic you used for a single item, as sketched below.
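A rough sketch of that loop, built on the paths from the question:
# Loop over each ID from the list and reuse the single-ID logic per iteration.
foreach ($id in (Get-Content 'C:\test\list.txt')) {
    Get-ChildItem -Recurse -Path 'C:\test\source\' -Filter "*$id*" |
        Move-Item -Destination 'C:\test\Destination'
}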

Powershell -- Get-ChildItem Directory full path and lastaccesstime

I am attempting to output full directory path and lastaccesstime in one line.
Needed --
R:\Directory1\Directory2\Directory3, March 10, 1015
What I am getting --
R:\Directory1\Directory2\Directory3
March 10, 1015
Here is my code. It isn't that complicated, but it is beyond me.
Get-ChildItem -Path "R:\" -Directory | foreach-object -process{$_.FullName, $_.LastAccessTime} | Where{ $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) } | Out-File c:\temp\test.csv
I have used ForEach-Object in the past to ensure I do not truncate excessively long directory names and paths, but never when pulling two properties. I would like the information to all be on one line, but haven't been successful. Thanks in advance for the assist.
I recommend filtering (Where-Object) before selecting the properties you want. Also I think you want to replace ForEach-Object with Select-Object, and lastly I think you want Export-Csv rather than Out-File. Example:
Get-ChildItem -Path "R:\" -Directory |
Where-Object { $_.LastAccessTime -lt [DateTime]::Today.AddYears(-2) } |
Select-Object FullName,LastAccessTime |
Export-Csv C:\temp\test.csv -NoTypeInformation
We can get your output on one line pretty easily, but to make it easy to read we may have to split your script out to multiple lines. I'd recommend saving the script below as a ".ps1" which would allow you to right click and select "run with powershell" to make it easier in the future. This script could be modified to play around with more inputs and variables in order to make it more modular and work in more situations, but for now we'll work with the constants you provided.
$dirs = Get-ChildItem -Path "R:\" -Directory
We'll keep the first line you made, since that is solid and there's nothing to change.
$arr = $dirs | Select-Object FullName, LastAccessTime | Where-Object { $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) }
For the second line, we'll use "Select-Object" instead. In my opinion, it's a lot easier to create an array this way. We'll want to deal with the results as an array, since that makes it easiest to keep each path and its timestamp together. I've also expanded your "Where" to "Where-Object", since it's best practice to use the full cmdlet name instead of the alias.
Lastly, we'll want to convert our "$arr" object to csv before putting in the temp out-file.
$arr | ConvertTo-Csv | Out-File "C:\Temp\test.csv"
Putting it all together, your final script will look like this:
$dirs = Get-ChildItem -Path "R:\" -Directory
$arr = $dirs | Select-Object FullName, LastAccessTime | Where-Object { $_.LastAccessTime -lt [datetime]::Today.AddYears(-2) }
$arr | ConvertTo-Csv | Out-File "C:\Temp\test.csv"
Again, you can take this further by turning it into an advanced function with parameters for your path, output file, and all that fun stuff, roughly as sketched below.
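A very rough sketch of that idea (the function and parameter names here are made up purely for illustration):
function Export-OldDirectoryReport {
    [CmdletBinding()]
    param(
        [string]$Path    = 'R:\',
        [string]$OutFile = 'C:\Temp\test.csv',
        [int]$YearsBack  = 2
    )
    $dirs = Get-ChildItem -Path $Path -Directory
    $arr  = $dirs |
        Select-Object FullName, LastAccessTime |
        Where-Object { $_.LastAccessTime -lt [datetime]::Today.AddYears(-$YearsBack) }
    $arr | ConvertTo-Csv -NoTypeInformation | Out-File $OutFile
}
# Example call:
Export-OldDirectoryReport -Path 'R:\' -OutFile 'C:\Temp\test.csv'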
Let me know if this helps!

PowerShell script file modify time>10h and return a value if nothing is found

I am trying to compose a script/one-liner which will find files in a specific folder that were modified over 10 hours ago, and if there are no such files, I need it to print some value or string.
Get-ChildItem -Path C:\blaa\*.* | where {$_.Lastwritetime -lt (date).addhours(-10)}) | Format-table Name,LastWriteTime -HideTableHeaders"
With that one-liner I am getting the wanted result when there are files modified over 10 hours ago, but I also need it to print a value/string when there are no results, so that I can monitor it properly.
The reason for this is to utilize the script/one liner for monitoring purposes.
The Get-ChildItem cmdlet and Where-Object clause you have would return null if nothing was found, so you have to account for that separately. I would also caution against using Format-Table for output unless you are just reading it on screen. If you want a "one-liner", you could do this; all PowerShell code can be a one-liner if you want it to be.
$results = Get-ChildItem -Path C:\blaa\*.* | where {$_.Lastwritetime -lt (date).addhours(-10)} | Select Name,LastWriteTime; if($results){$results}else{"No files found matching criteria"}
You have an extra bracket in your code, which might be a copy artifact, that I had to remove. Coded properly, it would look like this:
$results = Get-ChildItem -Path "C:\blaa\*.*" |
Where-Object {$_.Lastwritetime -lt (date).addhours(-10)} |
Select Name,LastWriteTime
if($results){
$results
}else{
"No files found matching criteria"
}