Regex for dirpaths named with letters - PowerShell

I have a large number of folders that are named like this:
C:\Folders\A
C:\Folders\A\AB\ABC
C:\Folders\E\EA\EAB\EABA
C:\Folders\A\AZ\AZA
C:\Folders\B\BA\BAE
And in a PowerShell script I want to move some files to the right place (or create the directory if it doesn't exist), e.g. FileABC.txt to C:\Folders\A\AB\ABC and FileBA.txt to C:\Folders\B\BA.
What I have now is this:
$naming just contains all the letter codes from a CSV.
if ($naming -match '^A$') {
    $folderA = 'C:\Folders'
    New-Item -Name $naming -Path $folderA -Type Directory
}
Which creates C:\Folders\A. So I do the same for all the other top level folders (there are only 6 of them).
To match the 2 letter paths I do:
if ($naming -match '^A[A-Z]{1}$') {
    $folderA2 = "C:\Folders\A\"
    New-Item -Name $naming -Path $folderA2 -Type Directory
}
Which creates C:\Folders\A\AA, C:\Folders\A\AB etc.
The problem arises when I want to create the 3 or 4 letter directories:
I tried this:
if ($naming -match '^A[A-Z]{2}$') {
    $folderA3 = 'C:\Folders\A\$($naming)\'
    New-Item -Name $naming -Path $folderA3 -Type Directory
}
and:
if ($naming -match '^A[A-Z]{3}$') {
    $folderA3 = 'C:\Folders\A\$($naming)\$($naming)\'
    New-Item -Name $naming -Path $folderA3 -Type Directory
}
But the files are not placed correctly, e.g. FileABC.txt is moved to seemingly random places like C:\Folders\A\AK\ABC.
Edit:
I also notice that folders are not created in the right place. Folders with 3- or 4-letter combos end up in what seem like random locations:
C:\Folders\E\EA\EBC
C:\Folders\A\AB\AZA
C:\Folders\E\EC\EFG\ECXA
I could do:
if ($naming -match '^AB[A-Z]{2}$')
if ($naming -match '^AC[A-Z]{2}$')
But then I would have to make one for each letter A-Z, which I feel should not be necessary.

I can't totally discern the algorithm because your examples don't all map the same way, but perhaps with an approach like this you can refine it to do what you want. It does not use regex, but it does the mapping I believe you want, or close to it.
# E.g. FileABC.txt to C:\Folders\A\AB\ABC and FileBA.txt to C:\Folders\B\BA.
function FileToPath($file)
{
    $filePrefix = 'File'
    $destPrefix = 'c:\Folders'
    $fileParts = [System.IO.Path]::GetFileNameWithoutExtension($file) -split $filePrefix
    $destPath = $destPrefix
    $len = $fileParts[1].Length
    for ($i = 0; $i -lt $len; $i++)
    {
        # build one path segment per prefix length: A, AB, ABC, ...
        $destPath = Join-Path $destPath $fileParts[1].Substring(0, $i + 1)
        # or maybe the below?
        #$destPath = Join-Path $destPath $fileParts[1][$i]
    }
    return (Join-Path $destPath $file)
}
'=== 1 ==='
$f = FileToPath 'FileABC.txt'
$f
Split-Path -Path $f -Parent
Split-Path -Path $f -Leaf
'=== 2 ==='
FileToPath 'FileBA.txt'
See also Test-Path, New-Item, Move-Item, etc.
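For example, here is a minimal sketch of tying those cmdlets together with FileToPath; C:\Incoming is a hypothetical staging folder, and the 'File*.txt' filter is an assumption based on the example names:

$sourceDir = 'C:\Incoming'   # hypothetical folder holding the unsorted files
foreach ($file in Get-ChildItem $sourceDir -Filter 'File*.txt' -File) {
    $target    = FileToPath $file.Name        # e.g. C:\Folders\A\AB\ABC\FileABC.txt
    $targetDir = Split-Path $target -Parent
    if (-not (Test-Path $targetDir)) {
        # create the directory chain if it doesn't exist yet
        New-Item -Path $targetDir -ItemType Directory -Force | Out-Null
    }
    Move-Item -Path $file.FullName -Destination $target
}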

Related

PowerShell create a duplicate folder with zero-size files

I want to create a 0-filesize mirror image of a set of folders, but while robocopy is really good, it doesn't save all of the information that I would like:
robocopy D:\documents E:\backups\documents_$(Get-Date -format "yyyyMMdd_HHmm")\ /mir /create
The /create switch makes each file in the duplicate folder have zero-size, and that is good, but I would like each file in the duplicate folder to have [size] appended to the end of the name with the size in KB or MB or GB, and the create / last modified time on every file to exactly match the original file. This way, I will have a zero-size duplicate of the folder that I can archive, but which contains all of the relevant information for the files in that directory, showing the size of each and the exact create / last modified times.
Are there good / simple ways to iterate through a tree in PowerShell, and for each item create a zero size file with all relevant information like this?
This would be one way to implement the copy command using the approach I mentioned in the comments. This should give you something to pull ideas from. I didn't intend to spend as much time on it as I did, but I ran it on several directories and found some problems and debugged each problem I encountered. This is a pretty solid example at this point.
function Copy-FolderZeroSizeFiles {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)] [string] $FolderPath,
        [Parameter(Mandatory)] [string] $DestinationPath
    )
    $dest = New-Item $DestinationPath -Type Directory -Force
    Push-Location -LiteralPath $FolderPath
    try {
        foreach ($item in Get-ChildItem '.' -Recurse) {
            $relPath = Resolve-Path -LiteralPath $item -Relative
            $type = if ($item.Attributes -match 'Directory') { 'Directory' } else { 'File' }
            $destItem = New-Item "$dest\$relPath" -Type $type -Force
            $destItem.Attributes = $item.Attributes
            $destItem.LastWriteTime = $item.LastWriteTime
        }
    } finally {
        Pop-Location
    }
}
Note: the above implementation is simplistic and represents anything that isn't a directory as a file. That means symbolic links et al. will become plain files with no information about what they linked to.
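If that matters for your tree, one way to at least flag such items (just a sketch, not part of the original function; it only tests the ReparsePoint attribute bit) would be:

$FolderPath = 'D:\SomeTree'   # assumption: the source folder you are mirroring
foreach ($item in Get-ChildItem $FolderPath -Recurse) {
    if ($item.Attributes -band [System.IO.FileAttributes]::ReparsePoint) {
        Write-Warning "Reparse point (link/junction), not a regular item: $($item.FullName)"
    }
}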
Here's a function to get the conversion from number of bytes to N.N B/K/M/G format. To get more decimal places, just add 0's to the end of the format strings.
function ConvertTo-FriendlySize($NumBytes) {
    switch ($NumBytes) {
        {$_ -lt 1024}       { "{0,7:0.0}B" -f ($NumBytes)              ; break }
        {$_ -lt 1048576}    { "{0,7:0.0}K" -f ($NumBytes / 1024)       ; break }
        {$_ -lt 1073741824} { "{0,7:0.0}M" -f ($NumBytes / 1048576)    ; break }
        default             { "{0,7:0.0}G" -f ($NumBytes / 1073741824) ; break }
    }
}
Often, people get these conversions wrong. For instance, it's a common error to use 1024 * 1000 to get Megabytes (which is mixing the base10 value for 1K with the base2 value for 1K) and follow that same logic to get GB and TB.
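For instance, using the function above (the exact spacing comes from the ,7 field width):

ConvertTo-FriendlySize 1023            # 1023.0B - still bytes
ConvertTo-FriendlySize 1536            #    1.5K
ConvertTo-FriendlySize (1024 * 1024)   #    1.0M - 1M is 1024 * 1024 bytes, not 1024 * 1000
ConvertTo-FriendlySize 3GB             #    3.0G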
Here is what I came up with for the additional parts in the question; change $src / $dst as required (D:\VMs is where I keep a lot of Virtual Machines). I have included setting all of CreationTime, LastWriteTime, LastAccessTime so that the backup location with zero-size files is a perfect representation of the source. As I want to use this for archival purposes, I have finally zipped things up and included a date-time stamp in the zipfile name.
# Copy-FolderZeroSizeFiles
$src = "D:\VMs"
$dst = "D:\VMs-Backup"

function ConvertTo-FriendlySize($NumBytes) {
    switch ($NumBytes) {
        {$_ -lt 1024}       { "{0:0.0}B" -f ($NumBytes)              ; break } # Change {0: to {0,7: to align to 7 characters
        {$_ -lt 1048576}    { "{0:0.0}K" -f ($NumBytes / 1024)       ; break }
        {$_ -lt 1073741824} { "{0:0.0}M" -f ($NumBytes / 1048576)    ; break }
        default             { "{0:0.0}G" -f ($NumBytes / 1073741824) ; break }
    }
}

function Copy-FolderZeroSizeFiles($FolderPath, $DestinationPath) {
    Push-Location $FolderPath
    if (!(Test-Path $DestinationPath)) { New-Item $DestinationPath -Type Directory }
    foreach ($item in Get-ChildItem $FolderPath -Recurse -Force) {
        $relPath = Resolve-Path $item.FullName -Relative
        if ($item.Attributes -match 'Directory') {
            $new = New-Item "$DestinationPath\$relPath" -ItemType Directory -Force -EA SilentlyContinue
        }
        else {
            $fileBaseName = [System.IO.Path]::GetFileNameWithoutExtension($item.Name)
            $fileExt      = [System.IO.Path]::GetExtension($item.Name)
            $fileSize     = ConvertTo-FriendlySize($item.Length)
            $new = New-Item "$DestinationPath\$(Split-Path $relPath)\$fileBaseName ($fileSize)$fileExt" -ItemType File
        }
        "$($new.Name) : creation $($item.CreationTime), lastwrite $($item.LastWriteTime), lastaccess $($item.LastAccessTime)"
        $new.CreationTime   = $item.CreationTime
        $new.LastWriteTime  = $item.LastWriteTime
        $new.LastAccessTime = $item.LastAccessTime
        $new.Attributes     = $item.Attributes # Must set this after the creation/write/access times or read-only files throw an error
    }
    Pop-Location
}

Copy-FolderZeroSizeFiles $src $dst

$dateTime = Get-Date -Format "yyyyMMdd_HHmm"
$zipName = "$([System.IO.Path]::GetPathRoot($dst))\$([System.IO.Path]::GetFileName($dst))_$dateTime.zip"
Add-Type -AssemblyName System.IO.Compression.FileSystem
[IO.Compression.ZipFile]::CreateFromDirectory($dst, $zipName)
Copy-FolderZeroSizeFiles $src $dst
$dateTime = Get-Date -Format "yyyyMMdd_HHmm"
$zipName = "$([System.IO.Path]::GetPathRoot($dst))\$([System.IO.Path]::GetFileName($dst))_$dateTime.zip"
Add-Type -AssemblyName System.IO.Compression.FileSystem
[IO.Compression.ZipFile]::CreateFromDirectory($dst, $zipName)
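One small caveat (an assumption about reruns rather than something from the original post): CreateFromDirectory throws if the target archive already exists, so a rerun within the same minute will fail unless you clear the old zip first:

if (Test-Path $zipName) { Remove-Item $zipName -Force }
[IO.Compression.ZipFile]::CreateFromDirectory($dst, $zipName)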

Bulk renaming photos and adding letters to duplicate file names in Powershell

I have a question about a PowerShell script. I want to rename a bunch of photos within a folder. I have a .csv file of the old names and the new names. This is a section of that file:
OldFile NewFile
{5858AA5A-DB1B-475A-808E-0BFF0B885E5B}.jpeg 975NNNN-AGUIRRESUGARASSOCSTACK-Notes-20200828.jpeg
{FA1E4CEE-0AD8-4B40-A5AD-4BB22C0EE4F0}.jpeg 975NNNN-AGUIRRESUGARASSOCSTACK-Other-20200828.jpeg
{FD20FA44-B3D2-4A6A-B73D-F3BADC2DDE71}.jpeg 975NNNN-AGUIRRESUGARASSOCSTACK-Vicinity-20200831.jpeg
{E0DDA4CD-7783-417C-9BE0-705FFA08CD17}.jpeg 975NNNN-AGUIRRESUGARASSOCSTACK-Vicinity-20200831.jpeg
{76DC6315-942D-444C-BA04-92FC9B9FF1A5}.jpeg 975NNNN-AGUIRRESUGARASSOCSTACK-Vicinity-20200831.jpeg
{3C853453-0A0D-40B5-B3B7-B0F84F92D512}.jpeg 975NNNN-AGUIRRESUGARASSOCSTACK-Vicinity-20200831.jpeg
Many of the new file names will be duplicates. For those files, I want to add a letter (A,B,C, so on) in the middle of the name at an exact location.
For example, if the file, 975NNNN-AGUIRRESUGARASSOCSTACK-Vicinity-20200831.jpeg, is a duplicate, I want to add "A" right after "Vicinity", so that the file is called 975NNNN-AGUIRRESUGARASSOCSTACK-VicinityA-20200831.jpeg. The letter will always be at that exact same location (right before the third -).
This is the script I have so far. I know it's not right, and I haven't even been able to attempt adding the letter within the script. (I'm a complete PowerShell newbie.)
$filesToRename = Import-CSV C:\Users\clair\OneDrive\Documents\JOA\batch_photos\Rename_Central_Aguirre.csv
foreach ($file In $filesToRename) {
    if (Test-Path $file.NewFile) {
        $letter = -begin { $count = 1 } -Process { Rename-Item $file.OldFile
                  "file-$([char](96 + $count)).jpeg"; $count++ }
    } else {
        Rename-Item $file.OldFile $file.NewFile
    }
}
Could I get some guidance on how to achieve this file naming system?
Thanks!!!
Renaming files using a single character from the alphabet means you will only have 26 options. If that is enough for you, you can do the following:
$alphabet   = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
$folderPath = 'D:\Test'
$filesToRename = Import-Csv C:\Users\clair\OneDrive\Documents\JOA\batch_photos\Rename_Central_Aguirre.csv
foreach ($file in $filesToRename) {
    $oldFile = Join-Path -Path $folderPath -ChildPath $file.OldFile
    if (Test-Path $oldFile -PathType Leaf) {
        # split the new filename into workable parts
        $newName   = $file.NewFile
        $extension = [System.IO.Path]::GetExtension($newName)
        $parts     = [System.IO.Path]::GetFileNameWithoutExtension($newName) -split '-'
        $suffix    = $parts[-1]
        $prefix    = $parts[0..($parts.Count - 2)] -join '-'
        $charToAppend = 0 # counter to go through the characters in the alphabet: 0..25
        while (Test-Path (Join-Path -Path $folderPath -ChildPath $newName) -PathType Leaf) {
            if ($charToAppend -gt 25) {
                # bail out if all characters have been used up
                throw "Cannot rename file '$($file.OldFile)', because all characters A-Z are already used"
            }
            $newName = '{0}{1}-{2}{3}' -f $prefix, $alphabet[$charToAppend++], $suffix, $extension
        }
        Rename-Item -Path $oldFile -NewName $newName
    }
    else {
        Write-Warning "File '$($file.OldFile)' not found"
    }
}
Before:
D:\TEST
{3C853453-0A0D-40B5-B3B7-B0F84F92D512}.jpeg
{5858AA5A-DB1B-475A-808E-0BFF0B885E5B}.jpeg
{76DC6315-942D-444C-BA04-92FC9B9FF1A5}.jpeg
{E0DDA4CD-7783-417C-9BE0-705FFA08CD17}.jpeg
{FA1E4CEE-0AD8-4B40-A5AD-4BB22C0EE4F0}.jpeg
{FD20FA44-B3D2-4A6A-B73D-F3BADC2DDE71}.jpeg
After:
D:\TEST
975NNNN-AGUIRRESUGARASSOCSTACK-Notes-20200828.jpeg
975NNNN-AGUIRRESUGARASSOCSTACK-Other-20200828.jpeg
975NNNN-AGUIRRESUGARASSOCSTACK-Vicinity-20200831.jpeg
975NNNN-AGUIRRESUGARASSOCSTACK-VicinityA-20200831.jpeg
975NNNN-AGUIRRESUGARASSOCSTACK-VicinityB-20200831.jpeg
975NNNN-AGUIRRESUGARASSOCSTACK-VicinityC-20200831.jpeg
I think you need to use the .Insert() method. This is a small example of how it works:
I've created a txt file named 975NNNN-AGUIRRERM1-Vicinity-20200829.txt in C:\Test just for testing purposes; in your case, the first code line below should be replaced by whatever identifies the duplicate(s).
# Code to identify duplicates (insert your code instead of mine)
$files = Get-ChildItem -Path C:\Test -File -Name
# The following line identifies the location of the last "-" (I understand you always have 3 "-", right?)
$DashPos = ($files).LastIndexOf("-")
# This inserts the letter "A" at position $DashPos
$files.Insert($DashPos, "A")
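Putting that together with the CSV loop from the question, a rough sketch could look like this (D:\Test is an assumed photo folder, the CSV path comes from the question, and there is no bounds check past 'Z' here):

$folderPath = 'D:\Test'   # assumption: folder holding the photos
$filesToRename = Import-Csv C:\Users\clair\OneDrive\Documents\JOA\batch_photos\Rename_Central_Aguirre.csv
foreach ($file in $filesToRename) {
    $newName = $file.NewFile
    $i = 0
    # keep inserting the next letter right before the last '-' until the name is free
    while (Test-Path (Join-Path $folderPath $newName)) {
        $newName = $file.NewFile.Insert($file.NewFile.LastIndexOf('-'), [char](65 + $i))  # 65 = 'A'
        $i++
    }
    Rename-Item -Path (Join-Path $folderPath $file.OldFile) -NewName $newName
}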

How can I find out the exact location where the recursive operation is working?

My problem is that the string for replacement needs to change according to the folder depth where the designated file is located, and I don't have a clue how to get that info. I need to work with relative addresses.
I want the script to be run from 2 folder levels above the folder where all the files are that need correcting. So I've set the $path in line 1. That folder is supposed to be 'depth 0'. In here, the replacement string needs to be in its native form -> stylesheet.css.
For files in the folders one level below 'depth 0' the string for replacement needs to be prefixed with ../ once -> ../stylesheet.css.
For files in the folders two levels below 'depth 0' the string for replacement needs to be prefixed with ../ twice -> ../../stylesheet.css.
...and so on...
I'm stuck here:
$depth = $file.getDepth($path) #> totally clueless here
I need $depth to contain the number of folders under the root $path.
How can I get this? Here's the rest of my code:
$thisLocation = Get-Location
$path = Join-Path -Path $thisLocation -ChildPath "\Files\depth0"
$match = "findThisInFiles"
$fragment = "stylesheet.css" #> string to be prefixed n times
$prefix = "../"              #> prefix n times according to folder depth starting at $path (depth 0 -> don't prefix)
$replace = ""                #> this will replace $match in files
$depth = 0
$htmlFiles = Get-ChildItem $path -Filter index*.html -Recurse
foreach ($file in $htmlFiles)
{
    $depth = $file.getDepth($path) #> totally clueless here
    $replace = ""
    for ($i = 0; $i -lt $depth; $i++) {
        $replace = $replace + $prefix
    }
    $replace = $replace + $fragment
    (Get-Content $file.PSPath) |
        Foreach-Object { $_ -replace $match, $replace } |
        Set-Content $file.PSPath
}
Here's a function I've written that calls Split-Path repeatedly to determine the depth of a path:
Function Get-PathDepth ($Path) {
    $Depth = 0
    While ($Path) {
        Try {
            $Parent = $Path | Split-Path -Parent
        }
        Catch {}
        if ($Parent) {
            $Depth++
            $Path = $Parent
        }
        else {
            Break
        }
    }
    Return $Depth
}
Example usage:
$MyPath = 'C:\Some\Example\Path'
Get-PathDepth -Path $MyPath
Returns 3.
Unfortunately, I had to wrap Split-Path in a Try..Catch because if you pass it the root path then it throws an error. This is unfortunate because it means genuine errors won't cause an exception to occur, but I can't see a way around this at the moment.
The advantage of working using Split-Path is that you should get a consistent count regardless of whether a trailing \ is used or not.
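To turn that into the relative depth your loop needs (a sketch that reuses $path, $htmlFiles, $match and $fragment from the question), subtract the depth of the root folder from the depth of each file's folder:

$baseDepth = Get-PathDepth -Path $path
foreach ($file in $htmlFiles) {
    # depth below $path: 0 for files directly in it, 1 one level down, and so on
    $depth   = (Get-PathDepth -Path $file.DirectoryName) - $baseDepth
    $replace = ('../' * $depth) + $fragment
    (Get-Content $file.PSPath) |
        ForEach-Object { $_ -replace $match, $replace } |
        Set-Content $file.PSPath
}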
Here is a way to get the depth in the folder structure for all files in a location. Hope this helps get you in the right direction
New-Item -Path "C:\Logs\Once\Test.txt" -Force
New-Item -Path "C:\Logs\Twice\Folder_In_Twice\Test.txt" -Force
$Files = Get-ChildItem -Path "C:\Logs\" -Recurse -Include *.* | Select-Object FullName
foreach ($File in $Files) {
[System.Collections.ArrayList]$Split_File = $File.FullName -split "\\"
Write-Output ($File.FullName + " -- Depth is " + $Split_File.Count)
}
Output is this just for illustration
C:\Logs\Once\Test.txt -- Depth is 4
C:\Logs\Twice\Folder_In_Twice\Test.txt -- Depth is 5
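For the question's purpose you would want the depth relative to the root rather than the absolute segment count; a rough adaptation of the same idea (assuming C:\Logs is the root as above) would be:

$root = "C:\Logs"
$rootDepth = ($root.TrimEnd('\') -split '\\').Count
foreach ($File in $Files) {
    $fileDepth = ((Split-Path $File.FullName -Parent) -split '\\').Count
    Write-Output ($File.FullName + " -- relative depth is " + ($fileDepth - $rootDepth))
}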

Searching for files in powershell

I've scoured all of the internet for this answer. Maybe it's right here, but alas, I'm out of time and we're on a time schedule from the wonderful boys over in legal.
We have some files which need to be retrieved based on particular names which appear in the directory path.
The person who stored and saved all of these files kept the same naming convention throughout. She's pretty awesome and a++ to her.
The file structure is as below:
Animals
-Dogs
-Folders With Breeds of Dogs
-<Breed of Dog>_MA_etc.pdf
-Cats
-Folders with Breeds of Cats
-<Breed of Cat>_MA_etc.pdf
-ETC
-etc
-etc
The person who saved the files was meticulous about file structure and naming convention, so you can expect c:\animals\dogs\GSD\GSD_MA.PDF or something like that.
While the original author was rather consistent, human error has occurred, so what I'm trying to do is look for "close enough", basically.
We might have:
Client Agreements\Netflix\files
Master Agreements\Netflix,Inc\files
Rental Agreements\Netflix\files
What I want to do is grab the file structure of all of those and move them to my "E:\sorted" directory maintaining the file structure it has.
So stepping away from animals, we've got a client list from legal with names they're interested in. If I look for name:name, I get 27 results. So far, not good.
I've tried partial and I get zero results. So here's my terrible code below. Maybe you can make fun of me and show me where I went wrong.
$a = Import-CSV C:\scripts\Clients.csv
$a = @($a.Client)
#$a = $a | %{ $_.SubString(0,6) }
$c = Get-ChildItem E:\Legal\ -Include ($a) -Recurse # | Where-Object {($_ -match $a)}
ForEach ($file in $c) {
    $dest = Split-Path -Path $file.FullName -Parent | Split-Path -NoQualifier
    #Copy-Item -Path $file -Recurse -Destination "e:\sorted\11\$dest" -Force -Verbose
}
I expect that there is a more PowerShell-ish way to do it, but I used a more procedural-type approach.
Using a HashSet, I create a set of directories which need to be copied. A HashSet has only one of an entry, so if it contains "C:\A\B", then adding "C:\A\B" again will not add another entry.
The .contains method is the .NET one, not the PS one, and similarly for .replace.
$src = "C:\temp\a"
$dest = "F:\temp\b"
$CsvFile = Join-Path -Path $src -ChildPath "findthese.csv"
$sought = (Import-Csv $CsvFile).Client
$dirs = Get-ChildItem -Path $src -Directory -Recurse
$set = New-Object System.Collections.Generic.HashSet[string]
# get the directories with a client name in the path anywhere
foreach($dir in $dirs) {
foreach($client in $sought) {
if ($dir.FullName.contains($client)) {
$temp = $set.Add($dir.FullName)
}
}
}
# copy the selected directory structures to the destination
foreach($dir in $set) {
Copy-Item -path $dir -Destination $dir.replace($src, $dest) -Recurse -WhatIf
}
I left the -WhatIf in there so you can quickly check it's going to do the right thing.
If the names in $a don't exactly match the names of files, using that as input to the include parameter won't help you find just those files you want.
I've got a file named clients.csv with the following:
client,gender,fun
fred,m,y
barney,m,y
wilma,f,y
navneet,n,y
kumar,f,y
konda,m,y
In my current directory, I've got a directory named clients with the following contents:
C:
├───clients
├───losers
│ barney_loser.txt
│ kumar_loser.txt
│
└───winners
fred_winner.txt
konda_winner.txt
wilma_winner.txt
Case-1:
ls .\clients\ -Filter *.txt -Recurse
Returns all the text files.
Case-2:
$people = import-csv -path .\people.csv
$clients = $people.client
ls .\clients\ -Filter *.txt -Recurse -Include $clients
Returns me nothing.
Case-3:
$people = import-csv -path .\people.csv
$clients = $people.client
$clients += 'kumar_loser.txt'
ls .\clients\ -Filter *.txt -Recurse -Include $clients
Returns me one record for "kumar_loser.txt".
I'm asserting that the patterns in your list ($a) don't match the file names.
If I wanted to fix that in my example, I could do something like this...
$people = Import-Csv -Path .\people.csv
$clients = $people.client
for ($i = 0; $i -lt $clients.Length; $i++) {
    $clients[$i] = '*{0}*' -f $clients[$i]
}
ls .\clients\ -Filter *.txt -Recurse -Include $clients
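The same wildcard wrapping can also be done with a pipeline instead of an index loop (a sketch under the same people.csv assumption):

$people  = Import-Csv -Path .\people.csv
$clients = $people.client | ForEach-Object { '*{0}*' -f $_ }
ls .\clients\ -Filter *.txt -Recurse -Include $clients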
Hope this helps.
Thanks for the help guys.
I took a less scripty, more procedural approach, as suggested above. Here's the code I used; it mostly worked. A colleague and I went through and verified the results and some outlier files. I had to double-check the errors that popped up and found a few more files that I wanted. It wasn't perfect, but it definitely cut down on looking through 700 folders and 3000 files. Include is great, but Filter is what I really wanted. Furthermore, Include doesn't like index values and Filter especially doesn't, so I had to save each value to a variable and filter by that with a * wildcard, which did work.
Here's what I did:
$people = Import-Csv C:\scripts\HelenClients.csv
$clients = $people.Client | %{ $_.SubString(0,5) }
for ($i = 0; $i -lt $clients.Length; $i++) {
    $name = $clients[$i]
    Write-Host "Searching for $name"
    $file = Get-ChildItem 'E:\Legal\' -Include "$name*" -Recurse
    if ($file -ne $null) {
        $dest  = Split-Path -Path $file -Parent
        $dest1 = $dest | Split-Path -NoQualifier
        $from  = $dest[0]
        $to    = $dest1[0]
        $too   = $file.BaseName[0]
        Copy-Item $file -Destination e:\sorted\16\$to\$too\ -Force -Verbose
    }
    else {
        Write-Output "No results found"
    }
}
I found that when you store the results in a variable, if there's more than one match, it'll list all of the locations and names, etc. Not pretty. See below:
PS C:\Users\me> $ff
Directory: E:\ParentDir\subfolder\redacted
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 6/4/2018 1:47 PM 50485 redacted.docx
-a---- 6/4/2018 1:47 PM 155579 redacted.pdf
PS C:\Users\me> $ff.Basename
Redacted Basename 0
Redacted Basename 1
PS C:\Users\me> $ff.BaseName[0]
Redacted Basename 0
So I just wanted the first indexed value. I also wanted to maintain the file structure without copying everything over, so I used split-path to kind of take it apart. It's very hodgepodge and not pretty to look at, but it works.

Powershell to Split huge folder in multiple folders

I have a folder that contains many huge files. I want to split these files into 3 folders. The requirement is to get the count of files in the main folder and then split those files equally across 3 child folders.
Example - the main folder has 100 files. When I run the PowerShell script, 3 child folders should be created with 33, 33 and 34 files respectively.
How can we do this using PowerShell?
I've tried the following:
$FileCount = (Get-ChildItem C:\MainFolder).Count
Get-ChildItem C:\MainFolder -r | Foreach -Begin { $i = $j = 0 } -Process {
    if ($i++ % $FileCount -eq 0) {
        $dest = "C:\Child$j"
        md $dest
        $j++
    }
    Move-Item $_ $dest
}
Here is another solution. This one accounts for the sub folders not existing.
# Number of groups to support
$groupCount = 3
$path = "D:\temp\testing"
$files = Get-ChildItem $path -File

For ($fileIndex = 0; $fileIndex -lt $files.Count; $fileIndex++) {
    $targetIndex = $fileIndex % $groupCount
    $targetPath = Join-Path $path $targetIndex
    If (!(Test-Path $targetPath -PathType Container)) {
        [void](New-Item -Path $path -Name $targetIndex -Type Directory)
    }
    $files[$fileIndex] | Move-Item -Destination $targetPath -Force
}
If you need to split the files into a different number of groups, just use a $groupCount higher than 3. You could also work in logic with a switch that changes $groupCount to something else if the count is greater than 500, for example.
Loop through the files one by one. Using $fileIndex as a tracker, we determine the folder (0, 1 or 2 in my case) that the file will be put into. Then, using that value, we check to be sure the target folder exists. Yes, this logic could easily be placed outside the loop, but if you have file and folder changes while the script is running, you could argue it is more resilient.
Ensure the folder exists; if not, make it. Then move that one item. Using the modulus operator, like in the other answers, we don't have to worry about how many files there are. Let PowerShell do the math.
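For example, the modulus expression cycles through the target indexes on its own:

0..7 | ForEach-Object { $_ % 3 }   # outputs 0 1 2 0 1 2 0 1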
This is super quick and dirty, but it does the job.
# Get the collection of files
$files = Get-ChildItem "c:\MainFolder"

# Initialize a counter to 0 or 1 depending on whether there is a
# remainder after dividing the number of files by 3.
if ($files.Count % 3 -eq 0) {
    $counter = 0
} else {
    $counter = 1
}

# Iterate through the files
foreach ($file in $files) {
    # Determine which subdirectory to put the file in
    if ($counter -lt $files.Count / 3) {
        $d = "Dir1"
    } elseif ($counter -ge $files.Count / 3 * 2) {
        $d = "Dir3"
    } else {
        $d = "Dir2"
    }
    # Create the subdirectory if it doesn't exist
    # (you could just create the three subdirectories
    # before the loop starts and skip this)
    if (-Not (Test-Path c:\Child\$d)) {
        md c:\Child\$d
    }
    # Move the file and increment the counter
    Move-Item $file.FullName -Destination c:\Child\$d
    $counter++
}
I think it's possible to do this without doing the counting and allocating yourself. This solution:
Lists all the files
Adds a counter property which cycles 0,1,2,0,1,2,0,1,2 to each file
Groups them into buckets based on the counter
Moves each bucket in one command
There's scope for rewriting it in a lot of ways to make it nicer, but this saves doing the math, handles uneven allocations, avoids iterating over the files and moving them one at a time, and would easily adjust to different numbers of groups.
$files = (gci -Recurse).FullName
$buckets = $files | %{ $_ | Add-Member NoteProperty "B" ($i++ % 3) -PassThru } | group B
$buckets.Name | %{
    md "c:\temp$_"
    Move-Item $buckets[$_].Group "c:\temp$_"
}