PowerShell: How to use remove-item to delete files in batches?

I have a small script which deletes files from a directory where date1 is less than date2. My script is working; however, as it runs on a directory with many files, I'd like Remove-Item to delete only 100 files at a time so that I can monitor the progress with each run. Is this possible?
if ($date1 -lt $date2)
{
    $_ | Remove-Item
}

Use a for loop and make the counter step 100 on each iteration:
if ($date1 -ge $date2) {
    # nothing to be done
    return
}
$files = Get-ChildItem $pathToFolder
for ($i = 0; $i -lt $files.Count; $i += 100) {
    $null = Read-Host "Press Enter to delete the next 100 files... "
    $filesToDelete = $files[$i..($i+99)]
    $filesToDelete | Remove-Item
    Write-Host "Deleted $($filesToDelete.Count) files..."
}
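Slicing past the end of the array simply returns whatever files remain, so the final, smaller batch needs no special handling. If you want to preview a batch before committing to the delete, one hedged variant is to add the built-in -WhatIf switch to the delete line:
$filesToDelete | Remove-Item -WhatIf   # shows what would be removed without deleting anything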

Related

Looping through File Groups such as FileGroup159, FileGroup160, etc. in Powershell

So I got the code to work how I like it for individual files. Based on some of the suggestions below, I was able to come up with this:
$Path = "C:\Users\User\Documents\PowerShell\"
$Num = 160
$ZipFile = "FileGroup0000000$Num.zip"
$File = "*$Num*.txt"
$n = dir -Path $Path$File | Measure
if($n.count -gt 0){
Remove-Item $Path$ZipFile
Compress-Archive -Path $Path$File -DestinationPath $Path
Rename-Item $Path'.zip' $Path'FileGroup0000000'$Num'.zip'
Remove-Item $Path$File
}
else {
Write-Output "No Files to Move for FileGroup$File"
}
The only thing I need to do now is have $Num increment after the program finishes each time. That way the program will run, then move $Num to 160, 161, etc., and I will not have to re-initiate the code manually. Thanks for the help so far.
Your filename formatting should go inside the loop, and you should use the format operator -f to get the leading zeros, like:
159..1250 | ForEach-Object {
    $UnzippedFile = 'FileGroup{0:0000000000}' -f $_
    $ZipFile = "$UnzippedFile.zip"
    Write-Host "Unzipping: $ZipFile"
    # Do your thing here
}
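Putting the two together, here is a rough, untested sketch of how the zip-and-clean-up logic from the question might sit inside that loop (the path and the 159..1250 range come from the posts above; compressing straight to the final archive name replaces the separate Rename-Item step):
$Path = "C:\Users\User\Documents\PowerShell\"
159..1250 | ForEach-Object {
    $Num = $_
    $ZipFile = 'FileGroup{0:0000000000}.zip' -f $Num
    $File = "*$Num*.txt"
    $n = dir -Path $Path$File | Measure
    if ($n.Count -gt 0) {
        # Remove any existing archive for this group, then build a fresh one
        Remove-Item $Path$ZipFile -ErrorAction SilentlyContinue
        Compress-Archive -Path $Path$File -DestinationPath ($Path + $ZipFile)
        Remove-Item $Path$File
    }
    else {
        Write-Output "No Files to Move for FileGroup$File"
    }
}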

Powershell adding file sizes, unexpected result

I am currently working on a PowerShell script to copy a random selection of songs from my NAS onto an SD card. As an added complication, I can have no more than 512 songs per folder and, obviously, I need to stop the process before I run out of free space on the card.
I have written a nearly complete script (with reduced amounts of songs for testing purposes), but am struggling with keeping track of the total size of the files I have copied. As an example, a test run with a total of 112MB of files gives a recorded value (in $copied_size) of 1245. I don't know what that value means; it doesn't seem to be a realistic value in GB, Gb, MB or Mb. I am obviously missing something here. Any ideas?
Here is the script; I haven't put in the size restriction yet:
$j = 1
$i = 0
$files_per_folder = 5
$sd_card_size = 15920000000
$copied_size = 0
$my_path = '\\WDMYCLOUD\Public\Shared Music'
$random = Get-Random -Count 100 -InputObject (1..200)

For ($j = 1; $j -le 5; $j++)
{
    md ("F:\" + $j)
    $list = Get-ChildItem -Path $my_path | ? { $_.PSIsContainer -eq $false -and $_.Extension -eq '.mp3' }
    For ($i = 0; $i -le $files_per_folder - 1; $i++)
    {
        Copy-Item -Path ($my_path + "\" + $list[$random[(($j - 1) * $files_per_folder) + $i]]) -Destination ('F:\' + $j)
        $copied_size = $copied_size + ($my_path + "\" + $list[$random[(($j - 1) * $files_per_folder) + $i]]).Length
    }
}
Write-Host "Copied Size = " $copied_size
Here's a way to solve your problem using some more PowerShell-like patterns. It compares the file about to be copied with the space remaining on the card and breaks out of the top-level loop if the file won't fit.
#requires -Version 3
$path = '\\share\Public\Shared Music'
$filesPerFolder = 5
$copySize = 0
$random = 1..200 | Get-Random -Count 100
$files = Get-ChildItem -Path $path -File -Filter *.mp3

:main for ($i = 1; $i -le 5; $i++) {
    $dest = New-Item -Path F:\$i -ItemType Directory -Force
    for ($j = 0; $j -lt $filesPerFolder; $j++) {
        $file = $files[$random[(($i - 1) * $filesPerFolder) + $j]]
        if ((Get-PSDrive -Name F).Free -lt $file.Length) {
            break main
        }
        $file | Copy-Item -Destination $dest
        $copySize += $file.Length
    }
}
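As for what the 1245 in the original script actually was: ($my_path + "\" + $list[...]) is a plain string, so its .Length property is the number of characters in the path, not the size of the file. To sum file sizes you need the Length property of the FileInfo object itself, as the code above does. For illustration (song.mp3 is a made-up name):
('\\WDMYCLOUD\Public\Shared Music\song.mp3').Length            # number of characters in the string
(Get-Item '\\WDMYCLOUD\Public\Shared Music\song.mp3').Length   # size of the file in bytes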

Cannot figure out why some of my copy commands are working and some are not

Code is at the bottom. Simply put, when I run the code, my .ps1 files get moved with no problem, but for some reason any file with "Lec" as part of its name pops up this error message:
cp : Cannot find path 'C:\Users\Administrator\Documents\Win213x_Lec_filename.docx' because it does not
exist.
I do not understand why this is happening when it recognizes the file name. I double-checked that the exact file is in the directory with the exact name, yet my .ps1 files have no issue.
$sub1 = "Lectures"
$sub2 = "Labs"
$sub3 = "Assignment"
$sub4 = "Scripts"
$DirectoryName = "test"
$Win213CopyFiles = ls C:\Users\Administrator\Documents\Win213Copy
$count = 0
foreach ($i in $Win213CopyFiles)
{
if ($i -match ".*Lec.*")
{
cp $i C:\Users\Administrator\Documents\test\Lectures
$count = $count + 1
}
elseif ($i -match ".*Lab.*")
{
cp $i C:\Users\Administrator\Documents\$DirectoryName\$sub2
$count = $count + 1
}
elseif ($i -match ".*Assign.*")
{
cp $i C:\Users\Administrator\Documents\$DirectoryName\$sub3
$count = $count + 1
}
elseif ($i -match ".*.ps1")
{
cp $i C:\Users\Administrator\Documents\$DirectoryName\$sub4
$count = $count + 1
}
Write-host "$i"
}
## Step 9: Display a message "<$count> files moved"
###################################
Write-host "$count files moved"
Copy-Item expects a String as input, so it calls the ToString() method of the FileInfo object $i. That returns only the file name, not the full path. Because the source directory is not specified, the current working directory is used. The solution is to use the full path found in the FullName property:
cp $i.fullname C:\Users\Administrator\Documents\test\Lectures
From the pipeline, Copy-Item can handle FileInfo objects correctly, using the FullName property, which you should remember not to drop if you use Select-Object.
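For example, a minimal pipeline version of the Lectures branch, using the same source and destination paths as in the question, would look like this:
Get-ChildItem C:\Users\Administrator\Documents\Win213Copy -Filter *Lec* |
    Copy-Item -Destination C:\Users\Administrator\Documents\test\Lectures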

Monitor a command and wait for it to complete before proceeding to next command?

I have written a PowerShell script that will:
grab all txt files from a directory
perform a line-by-line assessment of the first file (grabbing headers and appending, appending data to each line in file, saving to an output file)
for subsequent files, grab body (excluding header), append data, then add to output file
The problem is in the use of Add-Content: the process hangs, so certain files don't get written because the output file is in use. I added a function (based on recommendations found in various places on Stack Exchange) that tests the output file to determine whether it is available for read/write. This seems like a 'brute-force' approach.
Is there a way to monitor the actual Add-Content process launched by PowerShell to identify when it is complete? Or is there some other way to disaggregate the code as written to use the process control commands in PowerShell?
Sample:
function IsFileAccessible([String]$FullFileName) {
    [Boolean]$IsAccessible = $false
    try {
        [IO.File]::OpenWrite($FullFileName).Close()
        $IsAccessible = $true
    } catch {
        $IsAccessible = $false
    }
    return $IsAccessible
}

cd '[filepath]'
del old_output.type
$filearray = @()
$files = Get-ChildItem '[filepath]' -Filter "*.txt"
$outfile = 'new_output.type'

for ($i = 0; $i -lt $files.Count; $i++) {
    # Define variables
    $lastWriteTime = $files[$i].LastWriteTime
    # Define process steps for appending data
    filter Add-Time {"$_$lastWriteTime"}

    if ($i -eq 0) {
        $lines = Get-Content $files[$i]
        for ($j = 0; $j -lt $lines.Count; $j++) {
            if ($j -eq 0) {
                $appended_txt = 'New_Header'
                filter Add-Header {"$_$appended_txt"}
                $lines[$j] | Add-Header | Add-Content $outfile
            } else {
                $lines[$j] | Add-Time | Add-Content $outfile
            }
        }
    } else {
        do {
            $ErrorActionPreference = 'SilentlyContinue'
            $test = IsFileAccessible('[filepath-new_output.type]')
            echo 'file open'
        } until ($test -eq 'True')
        $ErrorActionPreference = 'Continue'
        echo 'okay'
        (Get-Content $files[$i].FullName | Select-Object -Skip 1) |
            Add-Time | Add-Content $outfile
    }
}
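As a purely illustrative sketch (the Wait-ForFile name and the 500 ms delay are invented here, not part of the original post), the IsFileAccessible check could be folded into a small wait helper instead of the inline do/until loop:
function Wait-ForFile([String]$FullFileName) {
    # Poll until the file can be opened for writing
    while (-not (IsFileAccessible $FullFileName)) {
        Start-Sleep -Milliseconds 500
    }
}
# usage in the else branch: Wait-ForFile $outfile, then pipe to Add-Content as before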

PowerShell to split a huge folder into multiple folders

I have a folder that contains many huge files. I want to split these files into 3 folders. The requirement is to get the count of files in the main folder and then split those files equally across 3 child folders.
Example - the main folder has 100 files. When I run the PowerShell script, 3 child folders should be created with 33, 33 and 34 files respectively.
How can we do this using PowerShell?
I've tried the following:
$FileCount = (Get-ChildItem C:\MainFolder).Count
Get-ChildItem C:\MainFolder -r | Foreach -Begin {$i = $j = 0} -Process {
    if ($i++ % $FileCount -eq 0) {
        $dest = "C:\Child$j"
        md $dest
        $j++
    }
    Move-Item $_ $dest
}
Here is another solution. This one accounts for the sub folders not existing.
# Number of groups to support
$groupCount = 3
$path = "D:\temp\testing"
$files = Get-ChildItem $path -File

For ($fileIndex = 0; $fileIndex -lt $files.Count; $fileIndex++) {
    $targetIndex = $fileIndex % $groupCount
    $targetPath = Join-Path $path $targetIndex
    If (!(Test-Path $targetPath -PathType Container)) {
        [void](New-Item -Path $path -Name $targetIndex -Type Directory)
    }
    $files[$fileIndex] | Move-Item -Destination $targetPath -Force
}
If you need to split the files into a different number of groups, use a $groupCount higher than 3. You could also add logic with a switch that changes $groupCount when, for example, the count is greater than 500 (see the sketch after this explanation).
Loop through the files one by one. Using $fileIndex as a tracker, we determine the folder (0, 1 or 2 in my case) that the file will be put into. Then, using that value, we check to be sure the target folder exists. Yes, this logic could easily be placed outside the loop, but if files and folders change while the script is running, you could argue it is more resilient.
Ensure the folder exists; if not, make it. Then move that one item. Using the modulus operator, as in the other answers, we don't have to worry about how many files there are. Let PowerShell do the math.
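A rough sketch of that switch idea, assuming $files has already been collected and using an arbitrary group count of 5 for large folders:
# Illustrative only: pick a larger group count when there are many files
$groupCount = switch ($files.Count) {
    { $_ -gt 500 } { 5; break }
    default        { 3 }
}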
This is super quick and dirty, but it does the job.
# Get the collection of files
$files = Get-ChildItem "c:\MainFolder"

# Initialize the counter to 0 or 1 depending on whether there is a
# remainder after dividing the number of files by 3.
if ($files.Count % 3 -eq 0) {
    $counter = 0
} else {
    $counter = 1
}

# Iterate through the files
foreach ($file in $files) {
    # Determine which subdirectory to put the file in
    if ($counter -lt $files.Count / 3) {
        $d = "Dir1"
    } elseif ($counter -ge $files.Count / 3 * 2) {
        $d = "Dir3"
    } else {
        $d = "Dir2"
    }

    # Create the subdirectory if it doesn't exist
    # You could just create the three subdirectories
    # before the loop starts and skip this
    if (-not (Test-Path c:\Child\$d)) {
        md c:\Child\$d
    }

    # Move the file and increment the counter
    Move-Item $file.FullName -Destination c:\Child\$d
    $counter++
}
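As the comment in the loop above notes, the three subdirectories could instead be created once up front, something like:
# Create Dir1..Dir3 ahead of time so the Test-Path check in the loop can be dropped
1..3 | ForEach-Object { New-Item -Path "c:\Child\Dir$_" -ItemType Directory -Force | Out-Null }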
I think it's possible to do this without doing the counting and allocating yourself. This solution:
Lists all the files
Adds a counter property to each file which cycles 0,1,2,0,1,2,0,1,2
Groups them into buckets based on the counter
Moves each bucket in one command
There's scope for rewriting it in a lot of ways to make it nicer, but this saves doing the math, handles uneven allocations, avoids iterating over the files and moving them one at a time, and would easily adjust to a different number of groups.
$files = (gci -recurse).FullName
$buckets = $files | % { $_ | Add-Member NoteProperty "B" ($i++ % 3) -PassThru } | group B
$buckets.Name | % {
    md "c:\temp$_"
    Move-Item $buckets[$_].Group "c:\temp$_"
}