I am renaming the files of a directory. The way to rename them is to add a letter to the beginning of the file name, depending on the option the user chooses ("F" or "O"). A possible solution to this exercise is the following:
$Path="C:\app\SandBox\"
Get-ChildItem -Path $Path -Filter *.xlsx | ForEach-Object {
    $opt = Read-Host "Do you want to modify (F) this file or not (O)?"
    $name = $_.Name
    $PathFinal = $Path + $name
    if ($opt -eq "F") {
        $newName = "F" + $name
        Rename-Item -NewName $newName -Path $PathFinal
    }
    if ($opt -eq "O") {
        $newName = "O" + $name
        Rename-Item -NewName $newName -Path $PathFinal
    }
}
The loop iterates as many times as there are files in the directory.
However, when I try to change the code as follows:
$Path="C:\app\SandBox\"
Get-ChildItem -Path $Path -Filter *.xlsx | ForEach-Object {
    $name = $_.Name
    $opt = Read-Host "$name - Do you want to modify (F) this file or not (O)?"
    if ($opt -eq "O") {
        $name = $_.Name
        Rename-Item -NewName "O$name" -Path $Path$name
    }
    if ($opt -eq "F") {
        $name = $_.Name
        Rename-Item -NewName "F$name" -Path $Path$name
    }
}
It turns out that, in some cases, the loop iterates one more time than expected!
If I have two files in the folder, sometimes the loop iterates three times.
What could this be due to?
It should iterate twice, but it iterates three times. I can't think of what could cause it, since the pipeline should only pass two files through.
Get-ChildItem works against the file system by asking for the first file matching a given filter, and then it continues asking the OS "what's the next matching file name after fileX", until the OS says "no more files".
In this case, A.xlsx is renamed to FA.xlsx on the first iteration, so on a later iteration the OS can answer the question "what's the next matching file name after B.xlsx" with "FA.xlsx".
To enumerate only the files already in the directory when you start the script, place the call to Get-ChildItem in a subexpression or nested pipeline:
$(Get-ChildItem -Path $Path -Filter *.xlsx) | ForEach-Object { ... }
This will force PowerShell to wait for Get-ChildItem to finish executing before sending the first output item to ForEach-Object - and since Get-ChildItem is already "done" by the time the renaming starts, you don't risk seeing the same file multiple times.
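For illustration, here is a minimal sketch of the fixed loop, assuming the same $Path and the same F/O prompt as in your question:
$Path = "C:\app\SandBox\"
# Wrapping Get-ChildItem in $() collects the full directory listing before
# ForEach-Object runs, so files renamed inside the loop are never enumerated again.
$(Get-ChildItem -Path $Path -Filter *.xlsx) | ForEach-Object {
    $opt = Read-Host "$($_.Name) - Do you want to modify (F) this file or not (O)?"
    if ($opt -eq "F" -or $opt -eq "O") {
        # Prepend the chosen letter to the current file name.
        $_ | Rename-Item -NewName ($opt + $_.Name)
    }
}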
I'm in the process of writing up a PowerShell script that can take a bunch of .TIF images, rename them, and place them in a new folder structure depending on the original file name.
For example, a folder containing the file named:
ABC-ALL-210316-0001-3001-0001-1-CheckInvoice-Front.TIF
would be renamed to "00011CIF.TIF", and placed in the following folder:
\20210316\03163001\
I've been trying to put together a code to perform this task, and I got one to work where I had two different "ForEach" methods. One would do a bunch of file renaming to remove "-" and shorten "CheckInvoiceFront" to "CIF" and such. Then the second method would again pull all .TIF images, create substrings of the image names, and create folders from those substrings, and then move the image to the new folder, shortening the file name. Like I said, it worked... but I wanted to combine the ForEach methods into one process. However, each time I try to run it, it fails for various reasons... I've tried to change things around, but I just can't seem to get it to work.
Here's the current (non-working) code:
# Prompt user for directory to search through
$sorDirectory = Read-Host -Prompt 'Input source directory to search for images: '
$desDirectory = Read-Host -Prompt 'Input target directory to output folders: '
Set-Location $sorDirectory
# Check directory for TIF images, and close if none are found
Write-Host "Scanning "$sorDirectory" for images... "
$imageCheck = Get-ChildItem -File -Recurse -Path $sorDirectory -include '*.tif'
$imageCount = $imageCheck.count
if ($imageCount -gt 0) {
    Write-Host "Total number of images found: $imageCount"
    ""
    Read-Host -Prompt "Press ENTER to continue or CTRL+C to quit"
    $count1=1;
    # Rename all images, removing "ABCALL" from the start and inserting "20", and then shorten long filetype names, and move files to new folders with new names
    Clear-Host
    Write-Host "Reformatting images for processing..."
    ""
    Get-ChildItem -File -Recurse -Path $sorDirectory -include '*.tif' |
    ForEach-Object {
        Write-Progress -Activity "Total Formatted Images: $count1/$imageCount" -Status "0--------10%--------20%--------30%--------40%--------50%--------60%--------70%--------80%--------90%-------100" -CurrentOperation $_ -PercentComplete (($count1 / $imageCount) * 100)
        Rename-Item $_ -NewName $_.Name.Replace("-", "").Replace("ABCALL", "20").Replace("CheckInvoiceFront", "CIF").Replace("CheckInvoiceBack", "CIB").Replace("CheckFront", "CF").Replace("CheckBack", "CB") |Out-Null
        $year = $_.Name.SubString(0, 4)
        $monthday = $_.Name.Substring(4,4)
        $batch = $_.Name.SubString(12, 4)
        $fulldate = $year+$monthday
        $datebatch = $monthday+$batch
        $image = $_.Name.SubString(16)
        $fullPath = "$desDirectory\$fulldate\$datebatch"
        if (-not (Test-Path $fullPath)) { mkdir $fullPath |Out-Null }
        Move-Item $_ -Destination "$fullPath\$image" |Out-Null
        $count1++
    }
    # Finished
    Clear-Host
    Write-Host "Job complete!"
    Timeout /T -1
}
# Closes if no images are found (likely bad path)
else {
    Write-Host "There were no images in the selected folder. Now closing..."
    Timeout /T 10
    Exit
}
Usually this results in an error stating that it can't find the path of the original file name, as if it's still looking for the original non-renamed image. I tried adding some other things, but then it said I was passing null values. I'm just not sure what I'm doing wrong.
Note that if I take everything after the "Rename-Item" (starting with "$year =") and put that in a different ForEach method, it works. I guess I just don't know how to make the Rename-Item return its results back to "$_" before everything else tries working on it. I tried messing around with "-PassThru" but I don't think I was doing it right.
Any suggestions?
As Olaf points out, situationally you may not need both a Rename-Item and a Move-Item call, because Move-Item can rename and move in single operation.
That said, Move-Item does not support implicit creation of the target directory to move a file to, so in your case you do need separate calls.
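For example (a hypothetical illustration, not taken from your script): if the destination folder already exists, a single Move-Item call both moves and renames the file:
# Moves old.tif into C:\existing\target and renames it to new.tif in one call;
# this fails if C:\existing\target does not already exist.
Move-Item -Path C:\source\old.tif -Destination C:\existing\target\new.tif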
You can use Rename-Item's -PassThru switch to make it output a System.IO.FileInfo instance (or, if a directory is being renamed, a System.IO.DirectoryInfo instance) representing the already renamed file; you can directly pass such an instance to Move-Item via the pipeline:
Get-ChildItem -File -Recurse -Path $sorDirectory -include '*.tif' |
ForEach-Object {
# ...
# Use -PassThru with Rename-Item to output a file-info object describing
# the already renamed file.
$renamedFile = $_ | Rename-Item -PassThru -NewName $_.Name.Replace("-", "").Replace("ABCALL", "20").Replace("CheckInvoiceFront", "CIF").Replace("CheckInvoiceBack", "CIB").Replace("CheckFront", "CF").Replace("CheckBack", "CB")
# ...
# Pass $renamedFile to Move-Item via the pipeline.
$renamedFile | Move-Item -Destination "$fullPath\$image"
# ...
}
As for your desire to:
make the Rename-Item return its results back to "$_"
While PowerShell doesn't prevent you from modifying the automatic $_ variable, it is better to treat automatic variables as read-only.
Therefore, a custom variable is used above to store the output from Rename-Item -PassThru.
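Putting it together, a condensed sketch of what the single loop might look like, assuming the same $sorDirectory / $desDirectory variables and the same substring layout as in your question:
Get-ChildItem -File -Recurse -Path $sorDirectory -Include '*.tif' |
    ForEach-Object {
        # Rename first; -PassThru emits a FileInfo object describing the renamed file.
        $renamedFile = $_ | Rename-Item -PassThru -NewName $_.Name.Replace("-", "").Replace("ABCALL", "20").Replace("CheckInvoiceFront", "CIF").Replace("CheckInvoiceBack", "CIB").Replace("CheckFront", "CF").Replace("CheckBack", "CB")
        # Derive the target folder and final name from the *new* name.
        $fulldate  = $renamedFile.Name.Substring(0, 8)                                       # e.g. 20210316
        $datebatch = $renamedFile.Name.Substring(4, 4) + $renamedFile.Name.Substring(12, 4)  # e.g. 03163001
        $image     = $renamedFile.Name.Substring(16)                                         # e.g. 00011CIF.TIF
        $fullPath = Join-Path $desDirectory (Join-Path $fulldate $datebatch)
        if (-not (Test-Path $fullPath)) { New-Item -ItemType Directory -Path $fullPath | Out-Null }
        # Move the already renamed file into the date/batch folder.
        $renamedFile | Move-Item -Destination (Join-Path $fullPath $image)
    }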
You need -passthru and -destination:
rename-item file1 file2 -PassThru | move-item -Destination dir1
I'm new to PowerShell, and trying to do something pretty simple (I think). I'm trying to filter down the results of a folder, where I only look at files that start with e02. I tried creating a variable for my folder path, and a variable for the filtered down version. When I get-ChildItem for that filtered down version, it brings back all results. I'm trying to run a loop where I'd rename these files.
File names will be something like e021234, e021235, e021236, I get new files every month with a weird extension I convert to txt. They're always the same couple names, and each file has its own name I'd rename it to. Like e021234 might be Program Alpha.
set-location "C:\MYPATH\SAMPLE\"
$dir = "C:\MYPATH\SAMPLE\"
$dirFiltered= get-childItem $dir | where-Object { $_.baseName -like "e02*" }
get-childItem $dirFiltered |
Foreach-Object {
$name = if ($_.BaseName -eq "e024") {"Four"}
elseif ($_.BaseName -eq "e023") {"Three"}
get-childitem $dirFiltered | rename-item -newname { $name + ".txt"}
}
There are a few things I can see that could use some adjustment.
My first thought on this is to reduce the number of places a script has to be edited when changes are needed. I suggest assigning the working directory variable first.
Next, reduce the number of times information is pulled. The Get-ChildItem cmdlet offers an integrated -Filter parameter which is usually more efficient than gathering all the results and filtering afterward. Since we can grab the filtered list right off the bat, the results can be piped directly to the ForEach block without going through the variable assignment and secondary filtering.
Then, make sure to initialize $name inside the loop so it doesn't accidentally cause issues. This is because $name remains set to the last value it matched in the if/elseif statements after the script runs.
Next, make use of the fact that $name is null so that files that don't match your criteria won't be renamed to ".txt".
Finally, perform the rename operation using the $_ automatic variable representing the current object instead of pulling the information with Get-ChildItem again. The curly braces have also been replaced with parentheses because of the change in the Rename-Item syntax.
Updated script:
$dir = "C:\MYPATH\SAMPLE\"
Set-Location $dir
Get-ChildItem $dir -Filter "e02*" |
Foreach-Object {
$name = $null #initialize name to prevent interference from previous runs
$name = if ($_.BaseName -eq "e024") {"Four"}
elseif ($_.BaseName -eq "e023") {"Three"}
if ($name -ne $null) {
Rename-Item $_ -NewName ($name + ".txt")
}
}
Is there a way to bulk rename items so that a folder with the items arranged in order would have their names changed into zero-padded numbers, regardless of extension?
for example, a folder with files named:
file1.jpg
file2.jpg
file3.jpg
file4.png
file5.png
file6.png
file7.png
file8.jpg
file9.jpg
file10.mp4
would end up like this:
01.jpg
02.jpg
03.jpg
04.png
05.png
06.png
07.png
08.jpg
09.jpg
10.mp4
I had a script I found somewhere that can rename files in alphabetical order. However, it seems to only accept conventionally bulk-renamed files (done by selecting all the files and renaming them so that they read "file (1).jpg", etc.), which messes up the ordering when dealing with differing file extensions. It also doesn't seem to rename files with variations in their file names. Here is what the code looked like:
Get-ChildItem -Path C:\Directory -Filter file* | % {
$matched = $_.BaseName -match "\((?<number>\d+)\)"
if (-not $matched) {break;}
[int]$number = $Matches["number"]
Rename-Item -Path $_.FullName -NewName "$($number.ToString("000"))$($_.Extension)"
}
If your intent is to rename the files based on the ending digits of their BaseName you can use Get-ChildItem in combination with Where-Object for filtering them and then pipe this result to Rename-Item using a delay-bind script block.
Needless to say, this code does not handle file name collisions. If there is more than one file with the same ending digits and the same extension, this will error out.
Get-ChildItem -Filter file* | Where-Object { $_.BaseName -match '\d+$' } |
Rename-Item -NewName {
$basename = '{0:00}' -f [int][regex]::Match($_.BaseName, '\d+$').Value
$basename + $_.Extension
}
To test the code you can use the following:
@'
file1.jpg
file2.jpg
file3.jpg
file4.png
file5.png
file6.png
file7.png
file8.jpg
file9.jpg
file10.mp4
'@ -split '\r?\n' -as [System.IO.FileInfo[]] | ForEach-Object {
$basename = '{0:00}' -f [int][regex]::Match($_.BaseName, '\d+$').Value
$basename + $_.Extension
}
You could just use the number of files found in the folder to create the appropriate 'numbering' format for renaming them.
$files = (Get-ChildItem -Path 'D:\Test' -File) | Sort-Object Name
# depending on the number of files, create a formating template
# to get the number of leading zeros correct.
# example: 645 files would create this format: '{0:000}{1}'
$format = '{0:' + '0' * ($files.Count).ToString().Length + '}{1}'
# a counter for the index number
$index = 1
# now loop over the files and rename them
foreach ($file in $files) {
$file | Rename-Item -NewName ($format -f $index++, $file.Extension) -WhatIf
}
The -WhatIf switch is a safety measure. With it, no file actually gets renamed; you will only see in the console what WOULD happen. Once you are content with that, remove the -WhatIf switch from the code and run again to rename all the files in the folder.
I want to write a PowerShell script that will recursively search a directory, but exclude specified files (for example, *.log, and myFile.txt), and also exclude specified directories, and their contents (for example, myDir and all files and folders below myDir).
I have been working with the Get-ChildItem CmdLet, and the Where-Object CmdLet, but I cannot seem to get this exact behavior.
I like Keith Hill's answer except it has a bug that prevents it from recursing past two levels. These commands manifest the bug:
New-Item level1/level2/level3/level4/foobar.txt -Force -ItemType file
cd level1
GetFiles . xyz | % { $_.fullname }
With Hill's original code you get this:
...\level1\level2
...\level1\level2\level3
Here is a corrected, and slightly refactored, version:
function GetFiles($path = $pwd, [string[]]$exclude)
{
    foreach ($item in Get-ChildItem $path)
    {
        if ($exclude | Where {$item -like $_}) { continue }
        $item
        if (Test-Path $item.FullName -PathType Container)
        {
            GetFiles $item.FullName $exclude
        }
    }
}
With that bug fix in place you get this corrected output:
...\level1\level2
...\level1\level2\level3
...\level1\level2\level3\level4
...\level1\level2\level3\level4\foobar.txt
I also like ajk's answer for conciseness though, as he points out, it is less efficient. The reason it is less efficient, by the way, is because Hill's algorithm stops traversing a subtree when it finds a prune target while ajk's continues. But ajk's answer also suffers from a flaw, one I call the ancestor trap. Consider a path such as this that includes the same path component (i.e. subdir2) twice:
\usr\testdir\subdir2\child\grandchild\subdir2\doc
Set your location somewhere in between, e.g. cd \usr\testdir\subdir2\child, then run ajk's algorithm to filter out the lower subdir2 and you will get no output at all, i.e. it filters out everything because of the presence of subdir2 higher in the path. This is a corner case, though, and not likely to be hit often, so I would not rule out ajk's solution due to this one issue.
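To make that corner case concrete, a hypothetical reproduction (paths invented for illustration):
# From inside \usr\testdir\subdir2\child, try to exclude only the *lower* subdir2:
cd \usr\testdir\subdir2\child
Get-ChildItem -Recurse | Where-Object { $_.FullName -notmatch '\\subdir2($|\\)' }
# Returns nothing at all, because every item's FullName already contains the
# ancestor ...\subdir2\... higher up in the path.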
Nonetheless, I offer here a third alternative, one that does not have either of the above two bugs. Here is the basic algorithm, complete with a convenience definition for the path or paths to prune--you need only modify $excludeList to your own set of targets to use it:
$excludeList = @("stuff","bin","obj*")
Get-ChildItem -Recurse | % {
    $pathParts = $_.FullName.substring($pwd.path.Length + 1).split("\");
    if ( ! ($excludeList | where { $pathParts -like $_ } ) ) { $_ }
}
My algorithm is reasonably concise but, like ajk's, it is less efficient than Hill's (for the same reason: it does not stop traversing subtrees at prune targets). However, my code has an important advantage over Hill's--it can pipeline! It is therefore amenable to fit into a filter chain to make a custom version of Get-ChildItem while Hill's recursive algorithm, through no fault of its own, cannot. ajk's algorithm can be adapted to pipeline use as well, but specifying the item or items to exclude is not as clean, being embedded in a regular expression rather than a simple list of items that I have used.
I have packaged my tree pruning code into an enhanced version of Get-ChildItem. Aside from my rather unimaginative name--Get-EnhancedChildItem--I am excited about it and have included it in my open source Powershell library. It includes several other new capabilities besides tree pruning. Furthermore, the code is designed to be extensible: if you want to add a new filtering capability, it is straightforward to do. Essentially, Get-ChildItem is called first, and pipelined into each successive filter that you activate via command parameters. Thus something like this...
Get-EnhancedChildItem -Recurse -Force -Svn -Exclude *.txt -ExcludeTree doc*,man -FullName -Verbose
... is converted internally into this:
Get-ChildItem | FilterExcludeTree | FilterSvn | FilterFullName
Each filter must conform to certain rules: accepting FileInfo and DirectoryInfo objects as inputs, generating the same as outputs, and using stdin and stdout so it may be inserted in a pipeline. Here is the same code refactored to fit these rules:
filter FilterExcludeTree()
{
    $target = $_
    Coalesce-Args $Path "." | % {
        $canonicalPath = (Get-Item $_).FullName
        if ($target.FullName.StartsWith($canonicalPath)) {
            $pathParts = $target.FullName.substring($canonicalPath.Length + 1).split("\");
            if ( ! ($excludeList | where { $pathParts -like $_ } ) ) { $target }
        }
    }
}
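For illustration, a sketch of how the refactored filter might then be composed in a pipeline, assuming $excludeList and $Path are defined in the enclosing scope and that the Coalesce-Args helper (described just below) is available:
$excludeList = @("stuff", "bin", "obj*")
$Path = "."
# The filter slots into an ordinary pipeline, pruning anything under an excluded path part.
Get-ChildItem -Recurse | FilterExcludeTree | % { $_.FullName }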
The only additional piece here is the Coalesce-Args function (found in this post by Keith Dahlby), which merely sends the current directory down the pipe in the event that the invocation did not specify any paths.
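Since Coalesce-Args is only referenced here, a hypothetical minimal stand-in (my assumption, matching the behavior described above) could look like this:
# Return the first argument that is neither null nor empty, so that
# "Coalesce-Args $Path '.'" falls back to the current directory when
# no path was supplied.
function Coalesce-Args {
    foreach ($arg in $args) {
        if ($arg) { return $arg }
    }
}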
Because this answer is getting somewhat lengthy, rather than go into further detail about this filter, I refer the interested reader to my recently published article on Simple-Talk.com entitled Practical PowerShell: Pruning File Trees and Extending Cmdlets where I discuss Get-EnhancedChildItem at even greater length. One last thing I will mention, though, is another function in my open source library, New-FileTree, that lets you generate a dummy file tree for testing purposes so you can exercise any of the above algorithms. And when you are experimenting with any of these, I recommend piping to % { $_.fullname } as I did in the very first code fragment for more useful output to examine.
The Get-ChildItem cmdlet has an -Exclude parameter that is tempting to use but it doesn't work for filtering out entire directories from what I can tell. Try something like this:
function GetFiles($path = $pwd, [string[]]$exclude)
{
    foreach ($item in Get-ChildItem $path)
    {
        if ($exclude | Where {$item -like $_}) { continue }
        if (Test-Path $item.FullName -PathType Container)
        {
            $item
            GetFiles $item.FullName $exclude
        }
        else
        {
            $item
        }
    }
}
Here's another option, which is less efficient but more concise. It's how I generally handle this sort of problem:
Get-ChildItem -Recurse .\targetdir -Exclude *.log |
Where-Object { $_.FullName -notmatch '\\excludedir($|\\)' }
The '\\excludedir($|\\)' expression allows you to exclude the directory and its contents at the same time.
Update: Please check the excellent answer from msorens for an edge case flaw with this approach, and a much more fleshed out solution overall.
Recently, I explored the possibility of parameterizing both the folder to scan through and the place where the result of the recursive scan is stored. At the end, I also summarize the number of folders scanned and the number of files inside them. Sharing it with the community in case it may help other developers.
##Script Starts
#read folder to scan and file location to be placed
$whichFolder = Read-Host -Prompt 'Which folder to Scan?'
$whereToPlaceReport = Read-Host -Prompt 'Where to place Report'
$totalFolders = 1
$totalFiles = 0
Write-Host "Process started..."
# IMPORTANT: '?' is used as the separator because a file name in Windows cannot contain this character
#Get Foldernames into Variable for ForEach Loop
$DFSFolders = get-childitem -path $whichFolder | where-object {$_.Psiscontainer -eq "True"} |select-object name ,fullName
#Below Logic for Main Folder
$mainFiles = get-childitem -path $whichFolder -file
("Folder Path" + "?" + "Folder Name" + "?" + "File Name " + "?"+ "File Length" )| out-file "$whereToPlaceReport\Report.csv" -Append
#Loop through folders in main Directory
foreach($file in $mainFiles)
{
    $totalFiles = $totalFiles + 1
    ($whichFolder + "?" + "Main Folder" + "?"+ $file.name + "?" + $file.length ) | out-file "$whereToPlaceReport\Report.csv" -Append
}
foreach ($DFSfolder in $DFSfolders)
{
    #write the folder name in beginning
    $totalFolders = $totalFolders + 1
    write-host " Reading folder $whichFolder\$($DFSfolder.name)"
    #$DFSfolder.fullName | out-file "C:\Users\User\Desktop\PoC powershell\ok2.csv" -Append
    #For Each Folder obtain objects in a specified directory, recurse then filter for .sft file type, obtain the filename, then group, sort and eventually show the file name and total incidences of it.
    $files = get-childitem -path "$whichFolder\$($DFSfolder.name)" -recurse
    foreach($file in $files)
    {
        $totalFiles = $totalFiles + 1
        ($DFSfolder.fullName + "?" + $DFSfolder.name + "?"+ $file.name + "?" + $file.length ) | out-file "$whereToPlaceReport\Report.csv" -Append
    }
}
# If running in the console, wait for input before closing.
if ($Host.Name -eq "ConsoleHost")
{
    Write-Host ""
    Write-Host ""
    Write-Host ""
    Write-Host " **Summary**" -ForegroundColor Red
    Write-Host " ------------" -ForegroundColor Red
    Write-Host " Total Folders Scanned = $totalFolders " -ForegroundColor Green
    Write-Host " Total Files Scanned = $totalFiles " -ForegroundColor Green
    Write-Host ""
    Write-Host ""
    Write-Host "I have done my Job,Press any key to exit" -ForegroundColor white
    $Host.UI.RawUI.FlushInputBuffer() # Make sure buffered input doesn't "press a key" and skip the ReadKey().
    $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyUp") > $null
}
##Bat Code to run above powershell command
@ECHO OFF
SET ThisScriptsDirectory=%~dp0
SET PowerShellScriptPath=%ThisScriptsDirectory%MyPowerShellScript.ps1
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& {Start-Process PowerShell -ArgumentList '-NoProfile -ExecutionPolicy Bypass -File ""%PowerShellScriptPath%""' -Verb RunAs}";
A bit late, but try this one.
function Set-Files($Path) {
    if(Test-Path $Path -PathType Leaf) {
        # Do any logic on file
        Write-Host $Path
        return
    }
    if(Test-Path $path -PathType Container) {
        # Do any logic on folder use exclude on get-childitem
        # cycle again
        Get-ChildItem -Path $path | foreach { Set-Files -Path $_.FullName }
    }
}
# call
Set-Files -Path 'D:\myFolder'
Commenting here as this seems to be the most popular answer on the subject for searching for files whilst excluding certain directories in powershell.
To avoid issues with post-filtering of results (i.e. avoiding permission issues etc.), I only needed to filter out top-level directories, and that is all this example is based on. So whilst this example doesn't filter child directory names, it could very easily be made recursive to support that, if you were so inclined.
Quick breakdown of how the snippet works
$folders << Uses Get-Childitem to query the file system and perform folder exclusion
$file << The pattern of the file I am looking for
foreach << Iterates the $folders variable performing a recursive search using the Get-Childitem command
$folders = Get-ChildItem -Path C:\ -Directory -Name -Exclude Folder1,"Folder 2"
$file = "*filenametosearchfor*.extension"
foreach ($folder in $folders) {
Get-Childitem -Path "C:/$folder" -Recurse -Filter $file | ForEach-Object { Write-Output $_.FullName }
}
I would like to use PowerShell to move a matching set of files (1 job file and 1 trigger file, both having the same name, just different extensions) from one directory to another. See the example below.
The source directory contains job1.zip, job1.trg, job2.zip, and job2.trg. I would like to take the matching job names job1.zip and job1.trg and move them to dest1folder, but only if it is empty; if not, move them to dest2folder. Then loop back to perform the same logic for job2.zip and job2.trg. One thing I also have to take into consideration is that the source directory may only contain job1.zip, waiting for job1.trg to be transferred. I am a newbie to PowerShell and have blown hours trying to get this working with no success. Is it possible?
This is what I have so far. I get the files to move to each destination folder using IF logic, but it moves all of the files in the source directory.
$doirun = (get-childItem "d:\ftproot\pstest\").Count
$filecount = (get-childItem "d:\ftproot\ps2\").Count
if ($doirun -le 1) {exit}
$dir = get-childitem "d:\ftproot\pstest\" | Where-Object {($_.extension -eq ".zip") -or ($_.extension -eq ".trg")}
foreach ($file in $dir)
{
if ($filecount -le 2) {Move-item "d:\ftproot\pstest\$file" "d:\ftproot\ps2\"}
else {Move-item "d:\ftproot\pstest\$file" "d:\ftproot\ps3\"}
}
Not tested extensively, but I believe this should work:
$jobs = gci d:\ftproot\pstest\* -include *.zip,*.trg |
    select -expand basename | sort -unique
$jobs | foreach-object {
    if ((test-path "d:\ftproot\pstest\$_.zip") -and (test-path "d:\ftproot\pstest\$_.trg")) {
        if (test-path d:\ftproot\pstest\ps2\*) {
            move-item d:\ftproot\pstest\$_.zip d:\ftproot\pstest\ps3
            move-item d:\ftproot\pstest\$_.trg d:\ftproot\pstest\ps3
        }
        else {
            move-item d:\ftproot\pstest\$_.zip d:\ftproot\pstest\ps2
            move-item d:\ftproot\pstest\$_.trg d:\ftproot\pstest\ps2
        }
    }
}