I have a list of PDF file names stored in a text file, one per line, as shown below.
322223491.pdf
322223492.pdf
322223493.pdf
The name of the text file is inclusions.txt.
I am passing this to a variable:
$inclusion = Get-Content .\inclusions.txt
Now I want to check whether a folder (C:\users\xyz) contains any of the files listed in the $inclusion variable, and display them. How do I do that?
PS C:\users\xyz> Get-ChildItem
displays output with the following columns:
Mode    LastWriteTime    Name
The Name column is the property of interest to me: I want to compare it against my inclusions list, and if a file's name is found in the list, display it in the PowerShell terminal.
PS C:\users\xyz> Get-ChildItem | Select-Object -Property Name
is as far as I have got. I don't know how to proceed further to display the filtered data by comparing it with the $inclusion variable.
Furthermore, I want to delete the files if their names match an entry in the inclusions text file. So the final objective is to traverse all the files in the folder, compare each file's name against the inclusions list, and delete the files that match.
Any help would be appreciated.
Many thanks
Use Where-Object to filter on a condition:
Get-ChildItem | Where-Object { $inclusion -contains $_.Name }
If you are only interested in the file names themselves, use Select-Object -ExpandProperty to grab just the name of each file:
Get-ChildItem | Where-Object { $inclusion -contains $_.Name } | Select-Object -ExpandProperty Name
If you want to do something based on whether a certain folder contains one or more of the files in $inclusion, use an if statement:
$folderPath = "C:\users\xyz\folder\of\interest"
if(@(Get-ChildItem $folderPath | Where-Object {$inclusion -contains $_.Name}).Count -ge 1)
{
Write-Host "Folder $folderPath contains at least one of the inclusion files"
}
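For the final objective of deleting the matching files, the same Where-Object filter can feed Remove-Item. This is only a sketch of that last step; the -WhatIf switch is included so you can preview which files would be removed before actually deleting anything:
$folderPath = "C:\users\xyz"
Get-ChildItem $folderPath |
    Where-Object { $inclusion -contains $_.Name } |
    Remove-Item -WhatIf   # drop -WhatIf once the preview looks right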
Thanks in advance for the help.
I have a folder with multiple CSV files. I'd like to extract the first line of each file and store the results in a separate CSV file. The newly created CSV file will have the file name as the first column and the first line of the file as the second column.
The output should look something like this (as an exported CSV File):
FileName,FirstLine
FileName1,Col1,Col2,Col3
FileName2,Col1,Col2,Col3
Notes:
There are other files that should be ignored. I’d like the code to loop through all CSV files which match the name pattern. I’m able to locate the files using the below code:
$targetDir = "C:\CSV_Testing\"
Get-ChildItem -Path $targetDir -Recurse -Filter "em*"
I’m also able to read the first line of one file with the below code:
Get-Content C:\CSV_Testing\testing.csv | Select -First 1
I guess I just need someone to help with looping through the files and exporting the results. Is anyone able to assist?
Thanks
You basically need a loop to enumerate each file; for this you can use ForEach-Object. To construct the output you need to instantiate new objects, and [pscustomobject] is the easiest choice for that. Finally, Export-Csv converts those objects into CSV.
$targetDir = "C:\CSV_Testing"
Get-ChildItem -Path $targetDir -Recurse -Filter "em*.csv" | ForEach-Object {
    [pscustomobject]@{
        FileName  = $_.Name
        FirstLine = $_ | Get-Content -TotalCount 1
    }
} | Export-Csv path\to\theResult.csv -NoTypeInformation
I have assumed the files actually have the .csv extension, hence the changed filter -Filter "em*.csv"; if that's not the case, you can keep the filter as you currently have it.
If I execute:
Get-ChildItem *.ext -recurse
the output consists of a series of Directory: sections, each followed by one or more lines of info for the matching files in that directory. Is there something like the Unix find command, where each matching file name appears on a single line with its full relative path?
Get-ChildItem by default outputs a view for Format-Table that is defined in a format XML file somewhere.
get-childitem | format-table
get-childitem | format-list *
shows you the actual properties in the objects being output. See also How to list all properties of a PowerShell object. Then you can pick and choose the ones you want. This would give the full pathname:
get-childitem | select fullname
If you want the output to be just a string and not an object:
get-childitem | select -expand fullname
get-childitem | foreach fullname
Resolve-Path with the -Relative switch can be used to display the relative paths of a set of paths. You can collect the full path names (FullName property) from the Get-ChildItem command and use the member access operator . to grab the path values only.
Resolve-Path -Path (Get-ChildItem -Filter *.ext -Recurse).FullName -Relative
Note: The relative paths here only accurately reflect files found within the current directory (Get-ChildItem -Path .), i.e. Get-ChildItem -Path NotCurrentDirectory could have undesirable results.
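If you need relative paths for a directory other than the current one, one workaround (just a sketch; C:\some\other\dir is a placeholder) is to temporarily change location so Resolve-Path resolves relative to that directory:
Push-Location C:\some\other\dir
try {
    Resolve-Path -Path (Get-ChildItem -Filter *.ext -Recurse).FullName -Relative
}
finally {
    Pop-Location
}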
Get-ChildItem's -Name switch does what you want:
It outputs the relative paths (possibly including subdir. components) of matching files as strings (type [string]).
# Lists file / dir. paths as *relative paths* (strings).
# (relative to the input dir, which is implicitly the current one here).
Get-ChildItem -Filter *.ext -Recurse -Name
Note that I've used -Filter, which significantly speeds up the traversal.
Caveat: As of PowerShell 7.0, -Name suffers from performance problems and behavioral quirks; see these GitHub issues:
https://github.com/PowerShell/PowerShell/issues/9014
https://github.com/PowerShell/PowerShell/issues/9119
https://github.com/PowerShell/PowerShell/issues/9126
https://github.com/PowerShell/PowerShell/issues/9122
https://github.com/PowerShell/PowerShell/issues/9120
I am having some problems passing the path plus filename to a parser. There are about 90 files of 1 GB each involved in my task. Each file is contained in a folder of its own. All of the folders are contained under a parent folder.
Goal: Ideally, I would like to parse 20 files simultaneously for multitasking and continue to the next 20 until all 90 files are done.
This means I would like to spawn concurrent parsing of 20 files per batch at any given time. In carrying out the parsing, I would like to use Measure-Command to time the work from beginning to finish.
Script I have used:
Get-ChildItem -Path "E:\OoonaFTP\input\Videos3\" -Filter *.mp4 -recurse | select -expand fullname
Foreach-Object {
    Measure-Command { "E:\OoonaFTP\Ooona_x64_ver_2.5.13\OoonaParser.exe -encode -dat -drm $_.FullName" } | Select-Object -Property TotalSeconds
}
===============================
I have a working batch script with a for statement, but it runs each iteration one after another. That is not the ideal case, though. I would really like to accomplish this in PowerShell with simultaneous tasks.
Could someone please suggest some ways by which I could accomplish this?
Thank you very much!
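One possible way to approach this (a sketch only, not tested against the actual parser; the parser path and switches are copied from the question) is to batch the files 20 at a time and run each batch as background jobs with Start-Job, waiting for a batch to finish before starting the next. The sketch also invokes the parser with the call operator (&) and passes each file's full path as an argument, rather than putting the whole command in a quoted string, and it keeps the FileInfo objects instead of expanding FullName up front:
$parser    = "E:\OoonaFTP\Ooona_x64_ver_2.5.13\OoonaParser.exe"
$files     = @(Get-ChildItem -Path "E:\OoonaFTP\input\Videos3\" -Filter *.mp4 -Recurse)
$batchSize = 20

for ($i = 0; $i -lt $files.Count; $i += $batchSize) {
    # Slice out the next batch of (up to) 20 files
    $batch = $files[$i..([Math]::Min($i + $batchSize, $files.Count) - 1)]

    # Start one background job per file in the batch
    $jobs = foreach ($file in $batch) {
        Start-Job -ScriptBlock {
            param($exe, $path)
            Measure-Command { & $exe -encode -dat -drm $path } |
                Select-Object -Property TotalSeconds
        } -ArgumentList $parser, $file.FullName
    }

    # Wait for the current batch to finish before moving on, then collect the timings
    $jobs | Wait-Job | Receive-Job
    $jobs | Remove-Job
}
On PowerShell 7+ the same idea can be expressed more simply with ForEach-Object -Parallel and -ThrottleLimit 20.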
Thanks for the various suggestions. I'm curious that some of them lead to empty output in my PowerShell (PSVersion: 5.1.18362.145).
I tried a number of these and, inspired by some of them, found the best answer for my case at the moment:
Get-ChildItem *.ext -recurse | Select-Object -property fullname
(When I made the window wide enough I got all the info I needed; in general I suppose I might need to do more to get the formatting I want.)
Is it possible to get the permissions of a folder and its sub-folders, then display the path, the group, and the users associated with that group, so it looks something like this? Or will it have to be done one folder at a time?
-Folder1
-Line separator
-Group
-Line separator
-List of users
-Folder2
-Line separator
-Group
-Line separator
-List of users
Here is the script I've come up with so far; be warned, I have very little experience with PowerShell. (Don't worry, my boss knows.)
Param([string]$filePath)
$Version=$PSVersionTable.PSVersion
if ($Version.Major -lt 3) {Throw "Powershell version out of date. Please update powershell." }
Get-ChildItem $filePath -Recurse | Get-Acl | where { $_.Access | where { $_.IsInherited -eq $false } } | select -exp Access | select IdentityReference -Unique | Out-File .\Groups.txt
$Rspaces=(Get-Content .\Groups.txt) -replace 'JAC.*?\\|','' |
Where-Object {$_ -notmatch 'BUILTIN|NT AUTHORITY|CREATOR|-----|Identity'} | ForEach-Object {$_.TrimEnd()}
$Rspaces | Out-File .\Groups.txt
$ErrorActionPreference= 'SilentlyContinue'
$Groups=Get-Content .\Groups.txt
ForEach ($Group in $Groups) {
    Write-Host ""
    $Group
    Write-Host --------------
    Get-ADGroupMember -Identity $Group -Recursive | Get-ADUser -Property DisplayName | Select Name
}
This only shows the groups and users, but not the path.
Ok, let's take it from the top! Excellent, you actually declare a parameter. What you might want to consider is setting a default value for the parameter. What I would do is use the current directory, which is conveniently available in the automatic variable $PWD (it holds the current working directory).
Param([string]$filePath = $PWD)
Now if a path is provided it will use that, but if no path is provided it just uses the current folder as a default value.
The version check is fine. I'm pretty sure there are more elegant ways to do it, but I honestly haven't done much version checking myself.
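For what it's worth, one alternative (offered only as a suggestion) is a #Requires statement at the top of the script, which makes PowerShell itself refuse to run the script on older engine versions:
#Requires -Version 3.0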
Now you are querying AD for each group and user that is found (after some filtering, granted). I would propose that we keep track of groups and members so that we only have to query AD once for each one. It may not save a lot of time, but it'll save some if any group is used more than once. So for that purpose we're going to make an empty hashtable to track groups and their members.
$ADGroups = @{}
Now starts a bad trend... writing to files and then reading those files back in. Outputting to a file is fine for results, saved configurations, or anything you'll need again outside of the current PowerShell session, but writing to a file just to read it back into the same session is a waste. Instead, you should either save the results to a variable or work with them directly. So, rather than getting the folder listing, piping it directly into Get-Acl, and losing the paths, we're going to do a ForEach loop over the folders. Mind you, I added the -Directory switch so it will only look at folders and ignore files. This happens at the provider level, so you will get much faster results from Get-ChildItem this way.
ForEach($Folder in (Get-ChildItem $filePath -Recurse -Directory)){
Now, you wanted to output the path of the folder, and a line. That's easy enough now that we aren't ditching the folder object:
$Folder.FullName
'-'*$Folder.FullName.Length
Next we get the ACLs for the current folder:
$ACLs = Get-Acl -Path $Folder.FullName
And here's where things get complicated. I'm getting the group names from the ACLs, but I've combined a couple of your Where statements and also added a check that the rule is an Allow rule (because including Deny rules here would just be confusing). I've used ?, which is an alias for Where, as well as %, which is an alias for ForEach-Object. You can have a natural line break after a pipe, so I've done that for ease of reading. I included comments on each line describing what I'm doing, but if any of it is confusing just let me know what you need clarification on.
$Groups = $ACLs.Access | #Expand the Access property
?{ $_.IsInherited -eq $false -and $_.AccessControlType -eq 'Allow' -and $_.IdentityReference -notmatch 'BUILTIN|NT AUTHORITY|CREATOR|-----|Identity'} | #Only instances that allow access, are not inherited, and aren't a local group or special case
%{$_.IdentityReference -replace 'JAC.*?\\'} | #Expand the IdentityReference property, and replace anything that starts with JAC all the way to the first backslash (likely domain name trimming)
Select -Unique #Select only unique values
Now we'll loop through the groups, starting off by outputting the group name and a line.
ForEach ($Group in $Groups){
$Group
'-'*$Group.Length
For each group I'll see if we already know who's in it by checking the list of keys on the hashtable. If we don't find the group there we'll query AD and add the group as a key, and the members as the associated value.
If($ADGroups.Keys -notcontains $Group){
    $Members = Get-ADGroupMember $Group -Recursive -ErrorAction Ignore | % Name
    $ADGroups.Add($Group,$Members)
}
Now that we're sure that we have the group members we will display them.
$ADGroups[$Group]
We can close the ForEach loop pertaining to groups, and since this is the end of the loop for the current folder we'll add a blank line to the output, and close that loop as well
}
"`n"
}
So I wrote this up and then ran it against my C:\temp folder. It did tell me that I need to clean up that folder, but more importantly it showed me that most of the folders don't have any non-inherited permissions, so it would just give me the path with an underline and a blank line, then move on to the next folder, and I had a ton of output like:
C:\Temp\FolderA
---------------
C:\Temp\FolderB
---------------
C:\Temp\FolderC
---------------
That doesn't seem useful to me. If it is to you, then use the lines above as I have them. Personally, I chose to get the ACLs, check for groups, and if there are no groups move on to the next folder. The script below is the product of that.
Param([string]$filePath = $PWD)

$Version = $PSVersionTable.PSVersion
if ($Version.Major -lt 3) {Throw "Powershell version out of date. Please update powershell." }

#Create an empty hashtable to track groups
$ADGroups = @{}

#Get a recursive list of folders and loop through them
ForEach($Folder in (Get-ChildItem $filePath -Recurse -Directory)){
    #Get ACLs for the folder
    $ACLs = Get-Acl -Path $Folder.FullName

    #Do a bunch of filtering to just get AD groups
    $Groups = $ACLs |
        % Access | #Expand the Access property
        where { $_.IsInherited -eq $false -and $_.AccessControlType -eq 'Allow' -and $_.IdentityReference -notmatch 'BUILTIN|NT AUTHORITY|CREATOR|-----|Identity'} | #Only instances that allow access, are not inherited, and aren't a local group or special case
        %{$_.IdentityReference -replace 'JAC.*?\\'} | #Expand the IdentityReference property, and replace anything that starts with JAC all the way to the first backslash (likely domain name trimming)
        Select -Unique #Select only unique values

    #If there are no groups to display for this folder, move to the next folder
    If($Groups.Count -eq 0){Continue}

    #Display the folder path
    $Folder.FullName
    #Put a dashed line under the folder path (using the length of the folder path for the length of the line, just to look nice)
    '-'*$Folder.FullName.Length

    #Loop through each group and display its name and users
    ForEach ($Group in $Groups){
        #Display the group name
        $Group
        #Add a line under the group name
        '-'*$Group.Length

        #Check if we already have this group, and if not get the group from AD
        If($ADGroups.Keys -notcontains $Group){
            $Members = Get-ADGroupMember $Group -Recursive -ErrorAction Ignore | % Name
            $ADGroups.Add($Group,$Members)
        }

        #Display the group members
        $ADGroups[$Group]
    }

    #Output a blank line, for some separation between folders
    "`n"
}
I have managed to get this working for me.
I edited the section below to show the name and username of each user.
$Members = Get-ADGroupMember $Group -Recursive -ErrorAction Ignore | % Name | Get-ADUser -Property DisplayName | Select-Object DisplayName,Name | Sort-Object DisplayName
This works really well, but would there be a way to get it to stop listing the same group access if it's repeated down the folder structure?
For example, "\\example1\example2" was assigned a group called "group1" and we had the following folder structure:
\\example1\example2\folder1
\\example1\example2\folder2
\\example1\example2\folder1\randomfolder
\\example1\example2\folder2\anotherrandomfolder
All the folders are assigned the group "group1", and the current code will list each directory's group and users, even though it's the same. Would it be possible to get it to only list the group and users once if it's repeated down the directory structure?
The -notcontains check doesn't seem to handle this for me, if that makes sense?
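One possible way to do this (a sketch built on the script above, not tested against AD) is to keep a separate list of group names that have already been displayed for an earlier folder, filter them out of $Groups for each folder, and record each group the first time it is shown. Roughly, the bookkeeping would sit in three places in the existing folder loop:
#Before the folder loop: track group names that have already been displayed
$DisplayedGroups = @()

#Inside the folder loop, after $Groups is built: drop groups already shown higher up the tree
$Groups = @($Groups | Where-Object { $DisplayedGroups -notcontains $_ })
If($Groups.Count -eq 0){Continue}

#Inside ForEach ($Group in $Groups): remember that this group has now been displayed
$DisplayedGroups += $Group
With that in place, a group such as "group1" assigned at \\example1\example2 would only be listed for the first folder where it appears and skipped for the sub-folders below it.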
I'm new to PowerShell and scripting in general. I'm doing lots of reading and testing, and this is my first post.
Here is what I am trying to do. I have a folder that contains sub-folders for each report that runs daily. A new sub-folder is created each day.
The file names in the sub-folders are the same with only the date changing.
I want to get a specific file from yesterday's folder.
Here is what I have so far:
Get-ChildItem -filter "MBVOutputQueriesReport_C12_Custom.html" -recurse -path D:\BHM\Receive\ | where(get-date).AddDays(-1)
Both parts (before and after pipe) work. But when I combine them it fails.
What am I doing wrong?
To see what is going wrong, look at how Where-Object decides what to let through:
0,1,2,3,4,5 | Where { $_ -gt 3 }
this will compare the incoming number from the pipeline ($_) with 3 and allow things that are greater than 3 to get past it - whenever the $_ -gt 3 test evaluates to $True.
0,1,2,3,4,5 | where { $_ }
this has nothing to compare against - in this case, it casts the value to boolean - 'truthy' or 'falsey' and will allow everything 'truthy' to get through. 0 is dropped, the rest are allowed.
Get-ChildItem | where Name -eq 'test.txt'
without the {} is a syntax where it expects Name is a property of the thing coming through the pipeline (in this case file names) and compares those against 'test.txt' and only allows file objects with that name to go through.
Get-ChildItem | where Length
In this case, the property it's looking for is Length (the file size) and there is no comparison given, so it's back to doing the "casting to true/false" thing from earlier. This will only show files with some content (non-0 length), and will drop 0 size files, for example.
ok, that brings me to your code:
Get-ChildItem | where(get-date).AddDays(-1)
With no {} and only one thing given to Where, it expects that parameter to be a property name, and it casts the value of that property to true/false to decide what to do. This is saying "filter where the things in the pipeline have a property named '09/08/2016 14:12:06' (yesterday's date with the current time) and the value of that property is 'truthy'". No files have a property named after yesterday's date, so that lookup reads $null for every file, and Where drops everything from the pipeline.
You can do as Jimbo answers and filter by comparing the file's write time against yesterday's date. But if you know the files and folders are named in date order, you can avoid recursing through the entire folder tree and looking at everything, because you know what yesterday's file will be called.
Although you didn't say exactly how the folders are named, you could take an approach like
$yesterday = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')
Get-ChildItem "d:\receive\bhm\$yesterday\MBVOutputQueriesReport_C12_Custom.html"
# (or whatever date pattern gets you directly to that file)
or
Get-ChildItem | sort -Property CreationTime -Descending | Select -Skip 1 -First 1
to get the 'last but one' thing, ordered by reverse created date.
Read the output of get-date | Get-Member -MemberType Property and then apply the Where-Object docs:
Get-ChildItem -filter "MBVOutputQueriesReport_C12_Custom.html" -recurse -path D:\BHM\Receive\ |
    Where-Object {$_.LastWriteTime.Date -eq (get-date).AddDays(-1).Date}
Try:
where {$_.lastwritetime.Day -eq ((get-date).AddDays(-1)).Day}
You could pipe the results to the Sort command, and pipe that to Select to just get the first result.
Get-ChildItem -filter "MBVOutputQueriesReport_C12_Custom.html" -recurse -path D:\BHM\Receive\ | Sort LastWriteTime -Descending | Select -First 1
You can do something like this:
$time = (get-date).AddDays(-1).Day
Get-ChildItem -Filter "MBVOutputQueriesReport_C12_Custom.html" -Recurse -Path D:\BHM\Receive\ | Where-Object { $_.LastWriteTime.Day -eq $time }
I have a list of file IDs, and I want to find the system feed files containing any of these numbers in a vast directory using PowerShell.
I was using Get-Content Cash* -totalcount 1 > cash_Check_outputfile.txt
but as each file contains numerous headers this was not working as I hoped.
Can anyone point me in the right direction?
Much appreciated.
Use the Get-ChildItem cmdlet to retrieve all files and use the Select-String cmdlet to find the files containing any of the numbers in your list ($test in this example). Finally, use the Select-Object cmdlet to get the path.
$test = @(
    707839
    709993
)
$pathToSearch = 'C:\test'
Get-ChildItem $pathToSearch | Select-String $test | Select-Object -ExpandProperty Path
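If the feed files are spread across subfolders of that directory, the same pipeline can be pointed at the whole tree; this is just a variant of the command above (the -File switch, assumed here, keeps directories out of the results, and -Unique collapses repeated hits in the same file):
Get-ChildItem $pathToSearch -Recurse -File |
    Select-String -Pattern $test |
    Select-Object -ExpandProperty Path -Unique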