I'm working on a script to output some data from multiple files based on a string search. It outputs the string found, followed by the six characters that follow it. I can get this to work for an exact location. However, I want to search across files inside multiple subfolders in the path. Using the below script, I get PermissionDenied errors...
[regex] $pattern = '(?<=(a piece of text))(?<chunk>.*)'
Get-Content -Path 'C:\Temp\*' |
ForEach-Object {
    if ($_ -match $pattern) {
        $smallchunk = $matches.chunk.substring(0, 6)
    }
}
"$smallchunk" | Out-File 'C:\Temp\results.txt'
If I change -Path to one of the subfolders, it works fine, but I need it to go into each subfolder and run Get-Content there.
e.g., look inside...
C:\Temp\folder1\*
C:\Temp\folder2\*
C:\Temp\folder3\*
And so on...
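To make the intent concrete, this is roughly what I'm after, written out as a loop over the subfolders (the folder names are just placeholders and the matching logic is the same as above, so this is only a sketch of the idea, not working code):

Get-ChildItem -Path 'C:\Temp' -Directory | ForEach-Object {
    # e.g. C:\Temp\folder1\*, C:\Temp\folder2\*, ...
    Get-Content -Path (Join-Path $_.FullName '*') |
        ForEach-Object {
            if ($_ -match $pattern) {
                $smallchunk = $matches.chunk.substring(0, 6)
            }
        }
}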
Following up on boxdog's suggestion of Select-String, the only limitation would be folder recursion: unfortunately, Select-String by itself only searches multiple files within a single directory.
So, the way around this is piping the output of Get-ChildItem with a -Recurse switch into Select-String:
$pattern = "(?<=(a piece of text))(?<chunk>.*)"
Get-ChildItem -Path "C:\Temp\" -Exclude "results.txt" -File -Recurse |
    Select-String -Pattern $pattern |
    ForEach-Object -Process {
        $_.Matches[0].Groups['chunk'].Value.Substring(0,6)
    } | Out-File -FilePath "C:\Temp\results.txt"
If you still need the result saved to $smallchunk, you can do that assignment inside the loop, as shown below.
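For example, a minimal variation of the pipeline above (only the assignment inside the script block is new):

Get-ChildItem -Path "C:\Temp\" -Exclude "results.txt" -File -Recurse |
    Select-String -Pattern $pattern |
    ForEach-Object -Process {
        $smallchunk = $_.Matches[0].Groups['chunk'].Value.Substring(0, 6)
        $smallchunk   # output the value as well, so Out-File still receives it
    } | Out-File -FilePath "C:\Temp\results.txt"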
Abraham Zinala's helpful answer is the best solution to your problem, because letting Select-String search your files' content is faster and more memory-efficient than reading and processing each line with Get-Content.
As for what you tried:
Using the below script I get PermissionDenied errors...
These stem from directories being among the file-system items matched by the wildcard C:\Temp\*, and Get-Content cannot read directories.
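You can see this for yourself by listing what the wildcard resolves to; the directories that show up are the items Get-Content chokes on (just a quick diagnostic, not part of the fix):

Get-Item -Path 'C:\Temp\*' | Select-Object Name, PSIsContainer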
If your files have distinct filename extensions that your directories don't, one option is to pass them to the (rarely used with Get-Content) -Include parameter; e.g.:
Get-Content -Path C:\Temp\* -Include *.txt, *.c
However, as with Select-String, this limits you to a single directory's content, and it gives you no way to restrict processing to files as such if extension-based filtering isn't possible.
For recursive listing, you can use Get-ChildItem with -Recurse, as in Abraham's answer, and pipe the file-info objects to Get-Content:
Get-ChildItem -Recurse C:\Temp -Include *.txt, *.c | Get-Content
If you simply want to limit output to files, whatever their names, use the -File switch (similarly, -Directory limits output to directories):
Get-ChildItem -File -Recurse C:\Temp | Get-Content
I have a txt file with filenames (e.g. 01234.tif) that I would like to use to filter a Get-ChildItem cmdlet. I did
$filenames = Get-Content filenames.txt
(also tried with | Out-String)
and then
Get-ChildItem . -Include $filenames | ForEach {if (!(Test-Path -Path ./jpeg/$_.Basename+".jpg")) {some imagemagick processing}}
but it does nothing. Funny part is that it works for excluding, since
Get-ChildItem . -Exclude $filenames > exclude.txt
writes the expected amount of lines. What am I missing here?
Get-Content filenames.txt | ForEach (path test) {imagemagick}
runs but copies all items, so either the path checking or Get-Content isn't working as expected.
Perhaps surprisingly, -Include (and -Exclude) is first applied to the names of the (possibly wildcard-expanded) input path(s) themselves, and only in case of a match is it then applied to the children of directories targeted by literal path(s).
The problem does not arise if -Recurse is also used (see the sketch at the end of this answer).
See GitHub issue #3304.
Therefore, use Get-Item * (Get-ChildItem * would work too, but only with -Include, not (properly) with -Exclude), so that the names of the child items are matched against the -Include patterns:
Get-Item * -Include (Get-Content filenames.txt)
Add -Force to also include hidden items.
See this answer for a detailed discussion of the pitfalls of -Include and -Exclude.
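For completeness, here is the recursive variant mentioned above; it is only appropriate if you actually want to search subdirectories too (a sketch based on your $filenames variable):

# With -Recurse, -Include is applied to the child items' names as expected.
Get-ChildItem . -Recurse -Include $filenames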
I am quite new to PowerShell and I am trying to make a script that copies files to certain folders that are declared in a CSV file. But so far I am getting errors from everywhere and can't find anything to resolve this issue.
I have these folders and .txt files created in the same folder as the script.
So far I could only come up with this:
$files = Import-Csv .\files.csv
$files
foreach ($file in $files) {
    $name = $file.name
    $final = $file.destination
    Copy-Item $name -Destination $final
}
This is my CSV
name;destination
file1.txt;folderX
file2.txt;folderY
file3.txt;folderZ
As the comments indicate, if you are not using default system delimiters, you should make sure to specify them.
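Applied to your script, that just means telling Import-Csv that your file is semicolon-delimited (a minimal sketch reusing your files.csv and its column names):

# The CSV uses ';', so the delimiter must be specified explicitly.
$files = Import-Csv .\files.csv -Delimiter ';'

foreach ($file in $files) {
    Copy-Item -Path $file.name -Destination $file.destination
}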
I also generally recommend using quotes in your CSV, to avoid problems when an entry happens to contain the delimiter in its name.
#"
"taco1.txt";"C:\temp\taco2;.txt"
"# | ConvertFrom-CSV -Delimiter ';' -Header #('file','destination')
will output
file destination
---- -----------
taco1.txt C:\temp\taco2;.txt
The quotes make sure the values are correctly interpreted. And yes... you can name a file foobar;test..txt. Never underestimate what users might do. 😁
If you take the command Get-ChildItem | Select-Object BaseName,Directory | ConvertTo-CSV -NoTypeInformation and review the output, you should see it quoted like this.
Sourcing Your File List
One last tip. Most of the time I've come across a CSV being used as a file input list, a CSV hasn't actually been needed. Consider grabbing the list of files in your script itself.
For example, if you have a folder and need to filter the list down, you can do this on the fly very easily in PowerShell by using Get-ChildItem.
For example:
$Directory = 'C:\temp'
$Destination = $ENV:TEMP
Get-ChildItem -Path $Directory -Filter *.txt -Recurse | Copy-Item -Destination $Destination
If you need to have more granular matching control, consider using the Where-Object cmdlet and doing something like this:
Get-ChildItem -Path $Directory -Filter *.txt -Recurse | Where-Object Name -match '(taco)|(burrito)' | Copy-Item -Destination $Destination
Often you'll find that you can easily use this type of filtering to keep CSV and input files out of the solution.
Example
Using techniques like this, you might be able to get files from 2 directories, filter the match, and copy all in a short statement like this:
Get-ChildItem -Path 'C:\temp' -Filter '*.xlsx' -Recurse | Where-Object Name -match 'taco' | Copy-Item -Destination $ENV:TEMP -Verbose
Hope that gives you some other ideas! Welcome to Stack Overflow. 👋
I am writing a powershell script to perform the following:
Within a folder Folder > Subfolder1 > Subfolder2 there are 30+ subfolders, in each of which there is another subfolder with 200 HTML files.
I would like to search for a keyword WTSE in the HTML files and any files containing such keyword would be moved to another folder.
My script looks as follows at the moment:
Get-ChildItem C:\Users\XXXXX\Desktop\Folder\ -Filter *.html -Recurse | Select-String 'WTSE' | ForEach-Object -Process {Move-Item} C:\Users\XXXXX\Desktop\Folder2
You're almost there. The problem is with the part after ForEach-Object.
Since you are searching for a literal string rather than a regex, I would suggest adding the -SimpleMatch switch to the Select-String cmdlet.
Try below:
$sourceFolder = 'C:\Users\XXXXX\Desktop\Folder'
$destination = 'C:\Users\XXXXX\Desktop\Folder2'
(Get-ChildItem -Path $sourceFolder -Filter '*.html' -Recurse | Select-String -Pattern 'WTSE' -SimpleMatch) |
Move-Item -Destination $destination
The Move-Item cmdlet can take an array of paths, and it also accepts pipeline input, so there is no need to use ForEach-Object here.
Note that I'm using parentheses around the first part (Get-ChildItem ... -SimpleMatch). This forces that part to run to completion before Move-Item starts, which prevents the error that the process cannot open the file because it is in use.
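If you do want to keep a ForEach-Object loop closer to your original attempt, the missing piece was passing each match's file path to Move-Item inside the script block. A sketch (the -List switch makes Select-String stop at the first match per file, so each file is only emitted, and therefore moved, once):

(Get-ChildItem -Path $sourceFolder -Filter '*.html' -Recurse |
    Select-String -Pattern 'WTSE' -SimpleMatch -List) |
    ForEach-Object { Move-Item -LiteralPath $_.Path -Destination $destination }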
I want to copy one file logo.png to different multiple folders. As of now I am doing like this
Get-Childitem "D:\OrgIcon" -Recurse -Include "*logo.png" |
Copy-Item -Destination "D:\LoginPage"
Get-Childitem "D:\OrgIcon" -Recurse -Include "*logo.png" |
Copy-Item -Destination "D:\HomePage"
Get-Childitem "D:\OrgIcon" -Recurse -Include "*logo.png" |
Copy-Item -Destination "D:\AboutUs"
Is there any way to make it single command?
I have seen this, but it looks different.
I am not aware of one single command to achieve what you are trying to do.
As pointed out in the link you posted, you can achieve your goal with one line of code, but this involves piping.
"D:\LoginPage", "D:\HomePage", "D:\AboutUs" | ForEach-Object { Get-Childitem "D:\OrgIcon" -Recurse -Include "*logo.png" | Copy-Item -Destination $_}
In the first part you just list your destinations as strings and separated by comma.
The second part will execute the code within the braces { } for each destination.
Note: $_ stands for the data currently in the pipeline, so on each iteration $_ will be replaced with your destination.
I guess if this is something you need to do regularly with different destinations, sources and filenames, you could always write a script file that accepts input for its parameters (a rough sketch follows), but that's a topic for another question.
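For what it's worth, a minimal parameterized script could look something like this (the file name Copy-ToMany.ps1 and the parameter names are made up for the example):

# Copy-ToMany.ps1 (hypothetical example)
param(
    [Parameter(Mandatory)] [string]   $Source,        # folder to search, e.g. D:\OrgIcon
    [Parameter(Mandatory)] [string]   $Filter,        # e.g. *logo.png
    [Parameter(Mandatory)] [string[]] $Destination    # one or more target folders
)

$files = Get-ChildItem -Path $Source -Recurse -Include $Filter
foreach ($target in $Destination) {
    $files | Copy-Item -Destination $target
}

You would then call it like .\Copy-ToMany.ps1 -Source 'D:\OrgIcon' -Filter '*logo.png' -Destination 'D:\LoginPage', 'D:\HomePage', 'D:\AboutUs'.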
Hope this is useful.
I'm trying to search text file content in a log directory for matching file names that exist in another directory.
I know I can do a Get-ChildItem $Path -file -name and get a list returned. I also know how to perform a Get-Content ... | Select-String -Pattern
However, I don't know how to feed the file list to the -Pattern.
What I've tried without success:
# Delete all Files in C:\Data\Uploads older than 90 day(s)
$Path = "C:\the_path"
$LogPath = "C:\logs"
Get-Content $LogPath + "\*.log" | Select-String -Pattern (Get-ChildItem $Path -name)
But I know this is just a blind attempt, because Get-ChildItem returns a collection of items rather than a usable pattern.
How can I do what I'm attempting to do and that is take a list of file names and recursively search for them in a directory of log files? #wishingitwasgrep
Select-String is essentially PowerShell's implementation of grep, except that it can't recurse by itself. That's where Get-ChildItem comes into play.
Get-ChildItem -Path "$LogPath\*.log" -Recurse |
Select-String -Pattern (Get-ChildItem $Path -Name) -SimpleMatch
You can make the statement a little less verbose by using aliases as well as positional instead of named parameters (not recommended for use in scripts, though).
ls "$LogPath\*.log" -r | sls (ls $Path -n) -s
If you want a regular expression match instead of a simple string match remove the -SimpleMatch switch.
You're close, but here's something that should work:
@(Get-Content -Path C:\logs\*.log) |
    Where-Object { $_ -in @(Get-ChildItem -Path C:\the_path -Name) }
Now you have a list of files.
How can I do what I'm attempting to do and that is take a list of file names and recursively search for them in a directory of log files?
$List = Get-Content -Path 'C:\LogList.txt'
$LogList = @(Get-ChildItem -Path 'C:\Logs' -Recurse |
    Where-Object { $_.Name -in $List })
This assumes your LogList.txt has a newline separated list of log file names with an extension (such as MyLog.txt). $LogList will then have an array of System.IO.FileInfo objects which you can utilize to do whatever you want with these files. For example:
$LogList | Remove-Item
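Or, instead of deleting them, you could copy the matched logs somewhere for review (the target folder here is just an example and must already exist):

$LogList | Copy-Item -Destination 'C:\LogReview'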