Copy files until they exceed a specified size - PowerShell

I need to copy files to a folder until they exceed a specified size. I've written the following script but it fails with the following error:
Cannot compare "Microsoft.PowerShell.Commands.GenericMeasureInfo" because it is not IComparable.
At C:\33.ps1:8 char:1
+ "{0:N2}" -f ($colItems.sum / 1024MB)
$files = Get-ChildItem C:\source -Recurse | % { $_.FullName }
foreach ($file in $files) {
    do {
        Copy-Item $file -Recurse D:\target
        $colItems = (Get-ChildItem d:\target -recurse | `
            Measure-Object -property length -sum)
        "{0:N2}" -f ($colItems.sum / 1024MB)
    }
    while ($colItems -le 10)
}
What am I doing wrong?

There are two problems. The "not IComparable" error comes from the while condition: $colItems holds the whole Measure-Object result (a GenericMeasureInfo object), so ($colItems -le 10) tries to compare that object with a number; you would have to compare $colItems.Sum instead. Beyond that, the do-while condition is only checked after a file has already been copied, and since the foreach already enumerates every file, the script ends up copying them all.
You can omit the do-while loop and break out of the foreach when the limit is reached:
$files = Get-ChildItem C:\source -Recurse | % { $_.FullName }
$sizeLimitInGB = 10
foreach ($file in $files)
{
    $colItems = (Get-ChildItem D:\target -Recurse | Measure-Object -Property Length -Sum)
    if (($colItems.Sum / 1GB) -gt $sizeLimitInGB)
    {
        break # Limit reached.
    }
    Copy-Item $file -Recurse D:\target
}
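Note that this re-measures the whole target folder before every copy, which gets slower as the folder fills up. If the target starts out empty, a variant that just keeps a running total works too (a minimal sketch using the same source and target paths; adjust to your needs):
$sizeLimitInGB = 10
$copiedBytes = 0
foreach ($file in (Get-ChildItem C:\source -Recurse -File))
{
    if ((($copiedBytes + $file.Length) / 1GB) -gt $sizeLimitInGB)
    {
        break # copying this file would push the target over the limit
    }
    Copy-Item $file.FullName -Destination D:\target
    $copiedBytes += $file.Length
}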

Related

Move folders if bigger than 2GB

I have tried, unsuccessfully, to modify another script that correctly lists folders bigger than 2GB so that it moves them instead
(I only changed the row $fso = New-Object -COM 'Scripting.FileSystemObject'):
$threshold = 2GB
$fso = Move-Item -path -Destination "C:\Dest\"
Get-ChildItem 'C:\Source\' -Recurse -Directory | Where-Object {
    $fso.GetFolder($_.FullName).Size -gt $threshold
}
I've finally found a solution (adapted from https://community.spiceworks.com/topic/2468783-ps-script-to-find-folder-1gb-and-move-to-another-folder)!:
Get-ChildItem -Path 'C:\Source' -Directory | ForEach-Object {
    $Folder = Get-ChildItem -Path $_.FullName -File -Recurse
    $FolderSize = $Folder | Measure-Object -Sum Length
    if ($FolderSize.Sum -gt 2GB) {
        Write-Information -MessageData "$_ is larger than 2GB. Size is $($FolderSize.Sum) bytes" -InformationAction Continue
        Move-Item -Path $_.FullName -Destination 'C:\dest'
    }
}
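If you want to see what would be moved before committing, the same loop can be run as a dry run with -WhatIf (a small sketch using the same placeholder paths as above):
Get-ChildItem -Path 'C:\Source' -Directory | ForEach-Object {
    $size = (Get-ChildItem -Path $_.FullName -File -Recurse | Measure-Object -Sum Length).Sum
    if ($size -gt 2GB) {
        # -WhatIf only reports what Move-Item would do; drop it to move the folders for real
        Move-Item -Path $_.FullName -Destination 'C:\dest' -WhatIf
    }
}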

Find relative paths of folders and subfolders (not files) recursively with size?

I was trying to find the relative paths of folders and subfolders (not files) together with their size, but I didn't get the length.
$srcpth = "C:\Program Files\Microsoft SQL Server"
$files = Get-ChildItem -Path $srcpth -Directory -Recurse
$result = foreach ($f in $files) {
[pscustomobject][ordered]#{
RelativePath = $f.fullname.remove(0,($srcpth.length))
FileSize = '{0:N1}' -f ($f_.Length/1KB)
}
}
$result | Export-Csv "c:\files\o2.csv"
I tried this snippet; it shows the folders and subfolders as relative paths, but it does not show their length. Can anyone help me with this?
As commented, you need to calculate, for each directory, the sum of the sizes of the files in it.
Try
$srcpth = "C:\Program Files\Microsoft SQL Server"
$files = Get-ChildItem -Path $srcpth -Directory -Recurse
$result = foreach ($f in $files) {
[pscustomobject][ordered]#{
RelativePath = $f.FullName.Remove(0, $srcpth.Length )
FileSize = '{0:N1}' -f ((Get-ChildItem -Path $f.FullName -File | Measure-Object -Property Length -Sum).Sum / 1KB)
}
}
$result | Export-Csv "c:\files\o2.csv" -NoTypeInformation
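Note that the inner Get-ChildItem above only sums the files directly inside each folder. If you want each folder's size to include the files in its subfolders as well, adding -Recurse to that inner call is one option (essentially the same snippet, lightly condensed, with -Recurse added):
$srcpth = "C:\Program Files\Microsoft SQL Server"
$result = foreach ($f in Get-ChildItem -Path $srcpth -Directory -Recurse) {
    [pscustomobject][ordered]@{
        RelativePath = $f.FullName.Remove(0, $srcpth.Length)
        FileSize = '{0:N1}' -f ((Get-ChildItem -Path $f.FullName -File -Recurse | Measure-Object -Property Length -Sum).Sum / 1KB)
    }
}
$result | Export-Csv "c:\files\o2.csv" -NoTypeInformation
Keep in mind that a folder with no files returns an empty Sum from Measure-Object, so you may want to handle that case separately.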

Test-Path cmdlet fails only for one file out of 20

I want to get the duplicates from a folder structure and copy all of them to a single folder, renaming them so they don't overwrite each other. I would like the first file from a duplicates group to be copied with its original name, and for the rest to have "_X" appended to the name.
I wrote code that almost works, but at some point it just overwrites the first file copied. Only one file is being overwritten; the rest are renamed and copied as intended.
Get-ChildItem $SourcePath -Recurse -File -Force | Group-Object -Property Name | Where-Object {$_.Count -gt 1} | Select-Object -ExpandProperty Group |
    ForEach-Object {
        $SourceFile = $_.FullName
        $FileName = $($_.BaseName + $_.Extension)
        $DestFileName = Join-Path -Path $DestinationPath -ChildPath $FileName
        if (Test-Path -Path $DestFileName) {
            $DestinationFile = "$DestinationPath\" + $_.BaseName + "_" + $i + $_.Extension
            $i += 1
        } else {
            $DestinationFile = $DestFileName
        }
        Copy-Item -Path $SourceFile -Destination $DestinationFile
    }
I don't see the actual problem but you could rewrite the code without using Test-Path. Remove Select-Object -ExpandProperty Group too, then iterate over each group's elements. Increment a counter and append it to all file names except the first one.
Get-ChildItem $SourcePath -Recurse -File -Force | Group-Object -Property Name | Where-Object Count -gt 1 |
    ForEach-Object {
        $i = 0
        foreach( $dupe in $_.Group ) {
            $SourceFile = $dupe.FullName
            $DestinationFile = Join-Path -Path $DestinationPath -ChildPath $dupe.BaseName
            if( $i -gt 0 ) { $DestinationFile += "_$i" }
            $DestinationFile += $dupe.Extension
            Copy-Item -Path $SourceFile -Destination $DestinationFile
            $i++
        }
    }
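For completeness, both snippets assume $SourcePath and $DestinationPath are already defined, for example (hypothetical paths; the destination folder must exist before copying):
$SourcePath      = 'C:\data'
$DestinationPath = 'C:\duplicates'
Because $i is reset per group in the rewritten loop, the first file of each duplicate group keeps its original name and the rest get _1, _2, and so on.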

Folders with more than 40,000 files

I have this script I received to check folders and subfolders on a network drive. I wonder how it could be modified to check only folders and subfolders, and to write to the CSV only the folders that contain more than 40,000 files, along with the number of files. The image shows sample output from the script as it is now; I do not need it to list any files, as it currently does.
$dir = "D:\test"
$results = #()
gci $dir -Recurse -Depth 1 | % {
$temp = [ordered]#{
NAME = $_
SIZE = "{0:N2} MB" -f ((gci $_.Fullname -Recurse | measure -Property Length -Sum -ErrorAction SilentlyContinue).Sum / 1MB)
FILE_COUNT = (gci -File $_.FullName -Recurse | measure | select -ExpandProperty Count)
FOLDER_COUNT = (gci -Directory $_.FullName -Recurse | measure | select -ExpandProperty Count)
DIRECTORY_PATH = $_.Fullname
}
$results += New-Object PSObject -Property $temp
}
$results | export-csv -Path "C:\temp\output.csv" -NoTypeInformation
Instead of executing so many Get-ChildItem cmdlets, here's an approach that uses robocopy to do the heavy lifting of counting the number of files, folders and total sizes:
# set the root folder to search
$dir = 'D:\test'
# switches for robocopy: /L = list only (nothing is copied), /E = include subfolders,
# /NJH /NC /NP /NFL = suppress header, file-class, progress and file-list output,
# /BYTES = report sizes in bytes, /XJ = skip junctions, /R:0 /W:0 = no retries, /MT:16 = 16 threads
$roboSwitches = '/L','/E','/NJH','/BYTES','/NC','/NP','/NFL','/XJ','/R:0','/W:0','/MT:16'
# regex to parse the summary block at the end of the robocopy output
$regEx = '\s*Total\s*Copied\s*Skipped\s*Mismatch\s*FAILED\s*Extras' +
         '\s*Dirs\s*:\s*(?<DirCount>\d+)(?:\s+\d+){3}\s+(?<DirFailed>\d+)\s+\d+' +
         '\s*Files\s*:\s*(?<FileCount>\d+)(?:\s+\d+){3}\s+(?<FileFailed>\d+)\s+\d+' +
         '\s*Bytes\s*:\s*(?<ByteCount>\d+)(?:\s+\d+){3}\s+(?<BytesFailed>\d+)\s+\d+'
# loop through the directories directly under $dir
$result = Get-ChildItem -Path $dir -Directory | ForEach-Object {
    $path = $_.FullName # or if you like $_.Name
    # NULL is only a placeholder destination; with /L nothing is actually copied
    $summary = (robocopy.exe $_.FullName NULL $roboSwitches | Select-Object -Last 8) -join [Environment]::NewLine
    if ($summary -match $regEx) {
        $numFiles = [int64] $Matches['FileCount']
        if ($numFiles -gt 40000) {
            [PsCustomObject]@{
                PATH         = $path
                SIZE         = [int64] $Matches['ByteCount']
                FILE_COUNT   = [int64] $Matches['FileCount']
                FOLDER_COUNT = [int64] $Matches['DirCount']
            }
        }
    }
    else {
        Write-Warning -Message "Path '$path' output from robocopy was in an unexpected format."
    }
}
# output on screen
$result | Format-Table -AutoSize
# output to CSV file
$result | Export-Csv -Path "C:\temp\output.csv" -NoTypeInformation

Powershell - Delete all the oldest files when folder size reaches 5GB

I need your help with the query below. My requirement is to delete the oldest files whenever the folder's total size exceeds 5GB, until it is back under the limit. But currently it only deletes one file, because of the -First 1. Please help me. Thanks in advance.
$Dir = "L:\TraceFiles"
$MaxSize = 5120 #Specify in MB
$Filter = "*.trc"
$OldestFile = Get-ChildItem $dir -Filter $Filter | Sort LastWriteTime | Select -First 1
$FolderCurrentSize = (Get-ChildItem $dir -recurse | Measure-Object -property length -sum).sum / 1MB
IF ($FolderCurrentSize -GT $MaxSize)
{
Write-output "Deleting File $OldestFile, becuase the Current folder size $FolderCurrentSize MB, has exceeded the maximum size of $MaxSize MB"
#Remove-Item $OldestFile.FullName
}
ELSE
{
Write-output "No deletes needed! Current folder size is $FolderCurrentSize MB, which is less than maximum size of $MaxSize MB"
}
Add a LastWriteTime filter to the $OldestFile query and remove the Select -First 1:
$OldestFile = Get-ChildItem $dir -Filter $Filter | ? {$_.LastWriteTime -lt "01/01/2014"}
$Sum = 0; $OldestFile | % {$sum += $_.length}
$TotalSize = $Sum /1mb
Use a foreach to remove each file found, in case all of them together are greater than 5GB:
IF ($TotalSize -GT $MaxSize)
{
    foreach ($file in $oldestfile)
    {
        Remove-Item $file -WhatIf
    }
}
## Remove the -WhatIf once you've confirmed it does what you want.
You can do it like this: while the directory size is greater than 5GB, get the oldest file and delete it.
In PowerShell:
while ( ((Get-ChildItem $dir -recurse | Measure-Object -property length -sum).sum / 1MB) -gt $MaxSize)
{
    $OldestFile = Get-ChildItem $dir -Filter $Filter | Sort LastWriteTime | Select -First 1
    #Remove-Item $OldestFile.FullName  # uncomment to actually delete; otherwise the loop never ends
}
Thanks Martin and Avshalom for the input. I finally got the solution below using a foreach loop. Really, thanks for your ideas.
$Dir = "L:\TraceFiles"
$MaxSize = 5120 #Specify in MB
$Filter = "*.trc"
$OldestFilesAll = Get-ChildItem $dir -Filter $Filter | Sort LastWriteTime
IF ($FolderCurrentSize -GT $MaxSize)
{
foreach($File in $OldestFilesAll)
{
$OldestFileSingle = Get-ChildItem $dir -Filter $Filter | Sort LastWriteTime | Select -First 1
$FolderCurrentSize = (Get-ChildItem $dir -recurse | Measure-Object -property length -sum).sum / 1MB
IF ($FolderCurrentSize -GT $MaxSize)
{
Write-output "Deleting File $OldestFileSingle, becuase the Current folder size $FolderCurrentSize MB, has exceeded the maximum size of $MaxSize MB"
Remove-Item $OldestFileSingle.FullName
}
}
}
ELSE
{
Write-output "No deletes needed! Current folder size is $FolderCurrentSize MB, which is less than maximum size of $MaxSize MB"
}