How to copy the oldest file to a new directory? - powershell

I need a script to move one old file to another directory.
I have a script, but it doesn't do what I need:
$path = "C:\*.*"
$Destination = "C:\*.*"
foreach ($file in (Get-ChildItem $path)) {
    if ($file.LastWriteTime -gt (Get-Date).AddDays(-1).Date) {
        Move-Item -Path $file.FullName -Destination $Destination
    }
}
I need to move only one file, the oldest, every day.
Please help. Thanks!

If you want only one file then you don't need a foreach loop there. You can try something like this using Sort-Object together with Select-Object -First.
Here the files are sorted by CreationTime in ascending order, so the oldest file comes first and Select-Object -First 1 picks it.
$path = "C:\*.*"
$Destination = "C:\*.*"
$file = Get-ChildItem $path -File |
    Sort-Object -Property CreationTime |
    Select-Object -First 1
if ($file.LastWriteTime -gt (Get-Date).AddDays(-1).Date) {
    Move-Item -Path $file.FullName -Destination $Destination
}
Hope it helps.

Try something like this:
$path = "C:\temp"
$Destination = "C:\temp\olddir\"
Get-ChildItem $path -File -Recurse |
    Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-10).Date } |
    Sort-Object LastWriteTime |
    Select-Object -First 1 |
    Move-Item -Destination $Destination

Related

Powershell: Move all files except most recently modified?

Trying to simplify one of my archiving tasks, but I'm stumped on how to go about it. Basically, I just want Powershell to search a folder for files, and move all but the most recently modified (by LastWriteTime) to a backup folder.
I've searched around for solutions to this but every answer I've come across looks for the oldest file or depends on a specific file-naming convention to work.
Basically I want it to look at this:
E:\ProjectFolder1\EDLs\File1.prproj (modified six days ago)
E:\ProjectFolder1\EDLs\File2.prproj (modified six hours ago)
E:\ProjectFolder1\EDLs\File3.prproj (modified six seconds ago)
Identify File3.prproj as the one that's most up-to-date, and move all the other files in the directory to another folder:
E:\Deep Storage\ProjectFolder1\EDLs\File1.prproj
E:\Deep Storage\ProjectFolder1\EDLs\File2.prproj
I know how to do everything except get it to compare the LastWriteTimes. Is there a way to get PS to do this?
EDIT with code sample
Get-ChildItem $sourceDir -Include $search -Recurse |
    Sort-Object LastWriteTime -Descending |
    Select-Object -Skip 1 |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        New-Item -ItemType File -Path $targetFile -Force
        Move-Item $_.FullName -Destination $targetFile -Force
    }
EDIT with functional code:
$sourceDir = "E:\Test1\EDLs\"
$targetDir = "E:\Deep Storage\Test1\EDLs\"
$search = "*.prproj"
Get-ChildItem $sourceDir -Recurse -Directory | ForEach-Object {
    $files = $_ | Get-ChildItem -File -Filter $search
    if ($files.Count -lt 2) {
        return
    }
    $newPath = Join-Path $targetDir -ChildPath $_.FullName.Substring($sourceDir.Length)
    $null = New-Item $newPath -ItemType Directory -Force
    $files | Sort-Object LastWriteTime -Descending | Select-Object -Skip 1 |
        Move-Item -Destination $newPath -Verbose -WhatIf
}
EDIT to show actual syntax for operating environment:
$sourceDir = "E:\Projects\Current\EDLs"
$targetDir = "E:\Deep Storage\Projects\Current\EDLs"
$search = "*.prproj"
Get-ChildItem $sourceDir -Directory | ForEach-Object {
    # search for files only 1 level under this folder
    $files = Get-ChildItem $_.FullName -Filter $search
    # if there are at least 2 files here
    if ($files.Count -ge 2) {
        # we don't need to create a new folder here since these go directly under the
        # destination folder, so we can just sort and skip the first as in the previous logic,
        # then move them
        $files | Sort-Object LastWriteTime -Descending | Select-Object -Skip 1 |
            Move-Item -Destination $targetDir
    }
}
Ultimately the answer was a lot simpler than I thought it would be:
$sourceDir = "E:\Test1\Test2"              # where your files are
$targetDir = "E:\Deep Storage\Test1\Test2" # where you want to send them
$search = "*.ext"                          # if applicable, the type of file to look for
Get-ChildItem -Path $sourceDir -Filter $search |
    Sort-Object LastWriteTime |
    Select-Object -SkipLast 1 |
    Move-Item -Destination $targetDir -Verbose -WhatIf
I use environment variables for my workflow so mine looks a little different, but this should be useful for anyone in the same situation.

Test-Path cmdlet fails only for one file out of 20

I want to get the duplicates from a folder structure and copy all of them to a single folder, while renaming them (so they don't overwrite each other). I would like the first file from a duplicates group to be copied with its original name, and for the rest to have "_X" added at the end of the name.
I wrote code that almost works, but at some point it just overwrites the first file copied. Only one file is being overwritten; the rest are renamed and copied as intended.
Get-ChildItem $SourcePath -Recurse -File -Force | Group-Object -Property Name | Where-Object { $_.Count -gt 1 } | Select-Object -ExpandProperty Group |
    ForEach-Object {
        $SourceFile = $_.FullName
        $FileName = $_.BaseName + $_.Extension
        $DestFileName = Join-Path -Path $DestinationPath -ChildPath $FileName
        if (Test-Path -Path $DestFileName) {
            $DestinationFile = "$DestinationPath\" + $_.BaseName + "_" + $i + $_.Extension
            $i += 1
        } else {
            $DestinationFile = $DestFileName
        }
        Copy-Item -Path $SourceFile -Destination $DestinationFile
    }
One likely cause: Test-Path -Path treats square brackets in file names as wildcard characters, so a name containing [ or ] tests as non-existent and its copy silently overwrites the existing file; Test-Path -LiteralPath avoids that. In any case, you can rewrite the code without Test-Path entirely. Remove Select-Object -ExpandProperty Group too, then iterate over each group's elements, incrementing a counter and appending it to every file name except the first.
Get-ChildItem $SourcePath -Recurse -File -Force | Group-Object -Property Name | Where-Object Count -gt 1 |
    ForEach-Object {
        $i = 0
        foreach ($dupe in $_.Group) {
            $SourceFile = $dupe.FullName
            $DestinationFile = Join-Path -Path $DestinationPath -ChildPath $dupe.BaseName
            if ($i -gt 0) { $DestinationFile += "_$i" }
            $DestinationFile += $dupe.Extension
            Copy-Item -Path $SourceFile -Destination $DestinationFile
            $i++
        }
    }

Powershell Find all empty folders and subfolders in a given Folder name

I'm trying to get:
a) a list of all empty folders and subfolders if the folder is named "Archiv"
b) to delete all those empty folders. My current approach doesn't check the subfolders.
It would also be great if the results were exported to a .csv =)
$TopDir = 'C:\Users\User\Test'
$DirToFind = 'Archiv'
$EmptyDirList = @(
    Get-ChildItem -LiteralPath $TopDir -Directory -Recurse |
        Where-Object {
            # [System.IO.Directory]::GetFileSystemEntries($_.FullName).Count -eq 0
            $_.GetFileSystemInfos().Count -eq 0 -and
            $_.Name -match $DirToFind
        }
).FullName
$EmptyDirList
Any ideas on how to adjust the code? Thanks in advance!
You need to reverse the order in which Get-ChildItem lists the items, so that you remove the deepest-nested empty folders first.
$LogFile = 'C:\Users\User\RemovedEmptyFolders.log'
$TopDir = 'C:\Users\User\Test'
# first get a list of all folders below the $TopDir directory that are named 'Archiv' (FullNames only)
$archiveDirs = (Get-ChildItem -LiteralPath $TopDir -Filter 'Archiv' -Recurse -Directory -Force).FullName |
    # sort on the FullName.Length property in descending order to get 'deepest-nesting-first'
    Sort-Object -Property Length -Descending
# next, remove all empty subfolders in each of the $archiveDirs
$removed = foreach ($dir in $archiveDirs) {
    Get-ChildItem -LiteralPath $dir -Directory -Force |
        # sort on the FullName.Length property in descending order to get 'deepest-nesting-first'
        Sort-Object @{Expression = {$_.FullName.Length}} -Descending |
        ForEach-Object {
            # if this folder is empty, remove it and output its FullName for the log
            if (@($_.GetFileSystemInfos()).Count -eq 0) {
                $_.FullName
                Remove-Item -LiteralPath $_.FullName -Force
            }
        }
    # next remove the 'Archiv' folder that is now possibly empty too
    if (@(Get-ChildItem -LiteralPath $dir -Force).Count -eq 0) {
        # output this folder's FullName and delete it
        $dir
        Remove-Item -LiteralPath $dir -Force
    }
}
$removed | Set-Content -Path $LogFile -PassThru # write your log file; -PassThru also writes the output on screen
Not sure a CSV is needed; I think a simple text file will suffice, as it's just a list.
Anyway, here's a (although not the most elegant) solution which will also delete "nested empty directories", meaning that if a directory only contains empty directories, it will also get deleted.
$TopDir = "C:\Test" # top-level directory to scan
$EmptyDirListReport = "C:\EmptyDirList.txt" # text file to store the list of deleted directories
if (Test-Path -Path $EmptyDirListReport -PathType Leaf)
{
    Remove-Item -Path $EmptyDirListReport -Force
}
$EmptyDirList = ""
Do
{
    $EmptyDirList = Get-ChildItem -Path $TopDir -Recurse |
        Where-Object -FilterScript { $_.PSIsContainer } |
        Where-Object -FilterScript { (Get-ChildItem -Path $_.FullName).Count -eq 0 } |
        Select-Object -ExpandProperty FullName
    if ($EmptyDirList)
    {
        $EmptyDirList | Out-File -FilePath $EmptyDirListReport -Append
        $EmptyDirList | Remove-Item -Force
    }
} while ($EmptyDirList)
This should do the trick; it should work with nested folders too.
$result = Get-ChildItem -Filter "Archiv" -Recurse -Directory $topdir |
    Sort-Object @{Expression = {$_.FullName.Length}} -Descending |
    ForEach-Object {
        if ((Get-ChildItem -Attributes d, h, a $_.FullName).Count -eq 0) {
            $_
            rmdir $_.FullName
        }
    }
$result | Select-Object FullName | ConvertTo-Csv | Out-File $Logfile
You can do this with a one-liner:
Get-ChildItem -Recurse dir -Filter Archiv |
    Where-Object { ($_ | Get-ChildItem).Count -eq 0 } |
    Remove-Item
Although, if you have nested Archiv folders like Archiv\Archiv, you need to run the line several times, because removing an inner empty Archiv only makes its parent empty on the next pass.
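If re-running it by hand is a nuisance, one option is to wrap the same one-liner in a loop that repeats until a pass removes nothing. This is only a sketch; `dir` is still the placeholder path from the one-liner above:

```powershell
# Keep sweeping until a pass finds no empty 'Archiv' folders left,
# so nested Archiv\Archiv chains are removed in a single run
do {
    $removed = Get-ChildItem -Recurse dir -Filter Archiv |
        Where-Object { ($_ | Get-ChildItem).Count -eq 0 } |
        ForEach-Object { Remove-Item -LiteralPath $_.FullName; $_ }
} while ($removed)
```

Each pass either deletes at least one folder or produces nothing, so the loop always terminates.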

Copy same file to multiple destinations

I want to copy a file to multiple destinations. My script filters through a directory, selects the newest file in $File_path, changes its name, and copies it to $destination:
$File_path = "C:\TEMP\export\liste\Text_Utf8\"
$destination = "C:\TEMP\export\C7E001"
Get-ChildItem -Path $File_path -Filter "Ges?*.txt" |
    Where-Object { -not $_.PSIsContainer } |
    Sort-Object -Property CreationTime |
    Select-Object -Last 1 |
    Copy-Item -Destination (Join-Path $destination "FRER3000CCFETES01_IN.DEV")
This only copies it to one location. Is there a way to improve it so it copies the same file to multiple locations? I have seen this thread but it seems different.
The other locations are as follows:
C:\TEMP\export\C7P001
C:\TEMP\export\C7F001
C:\TEMP\export\C7S001
and so on.
Thank you.
Although my answer isn't very different from Peter's, this uses the LastWriteTime property to get the latest file and passes the file's FullName property to the Copy-Item cmdlet.
$File_path = "C:\TEMP\export\liste\Text_Utf8"
$destinations = "C:\TEMP\export\C7E001", "C:\TEMP\export\C7F001", "C:\TEMP\export\C7S001"
$fileToCopy = Get-ChildItem -Path $File_path -Filter "Ges*.txt" -File |
    Sort-Object -Property LastWriteTime |
    Select-Object -Last 1
foreach ($dest in $destinations) {
    Copy-Item -Path $fileToCopy.FullName -Destination (Join-Path -Path $dest -ChildPath "FRER3000CCFETES01_IN.DEV")
}
You can use a ForEach-Object loop:
$File_path = "C:\TEMP\export\liste\Text_Utf8\"
$Destination = "C:\TEMP\export\C7E001", "C:\TEMP\export\C7P001", "C:\TEMP\export\C7F001", "C:\TEMP\export\C7S001"
$Files = Get-ChildItem -Path $File_path -Filter "Ges?*.txt" |
    Where-Object { -not $_.PSIsContainer } |
    Sort-Object -Property CreationTime |
    Select-Object -Last 1
$Destination | ForEach-Object { Copy-Item $Files.FullName -Destination (Join-Path $_ "FRER3000CCFETES01_IN.DEV") }

Delete files older than 30 days and save 1

I need a clean-up script that removes all files older than 30 days, but it should keep the newest of those old files. Possible? :)
I have tried a couple of parameters but cannot really get it to work. I guess I need an if/else clause?
I would appreciate any guidance and help with this, thanks.
$Daysback = "-30"
$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.AddDays($Daysback)
$path = "C:\Data\*"
$save1 = Get-ChildItem -Path $path | Where-Object {($_.Name -like "Test*.zip")} | sort LastWriteTime -Descending | select -First 1
Get-ChildItem $path -Recurse
{($_.CreationTime -le $(Get-Date).AddDays($Daysback))}
{
Remove-Item -Recurse -Force
}
elseif ($save1)
{
Remove-Item -Recurse -Force
}
}
Something like this should work.
$Daysback = -30
$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.AddDays($Daysback)
$path = "C:\Data\*"
$Items = Get-ChildItem -Path $path -Recurse |
    Where-Object { ($_.Name -like "Test*.zip") -and ($_.LastWriteTime -le $DatetoDelete) } |
    Sort-Object LastWriteTime -Descending
$Items | Select-Object -Skip 1 | Remove-Item -Recurse -Force
Get-ChildItem gets only the items whose name starts with Test and ends with .zip and that were last written more than 30 days ago, then sorts them by LastWriteTime, newest first.
In the delete line, Select-Object -Skip 1 skips the first (newest) item in the sorted list, and Remove-Item removes the rest via pipeline input.
This can be simplified. The block below grabs all files in C:\Data that match the -Filter (significantly faster than Where-Object), further reduces those based on their CreationTime, skips the newest one, and deletes the rest.
Get-ChildItem -Path 'C:\Data' -Filter 'Test*.zip' -Recurse |
    Where-Object { -not $_.PSIsContainer -and
        $_.CreationTime -le (Get-Date).AddDays(-30) } |
    Sort-Object -Property 'LastWriteTime' -Descending |
    Select-Object -Skip 1 |
    Remove-Item -Force -WhatIf