Trying to simplify one of my archiving tasks, but I'm stumped on how to go about it. Basically, I just want Powershell to search a folder for files, and move all but the most recently modified (by LastWriteTime) to a backup folder.
I've searched around for solutions to this but every answer I've come across looks for the oldest file or depends on a specific file-naming convention to work.
Basically I want it to look at these files:
E:\ProjectFolder1\EDLs\File1.prproj (modified six days ago)
E:\ProjectFolder1\EDLs\File2.prproj (modified six hours ago)
E:\ProjectFolder1\EDLs\File3.prproj (modified six seconds ago)
Identify File3.prproj as the one that's most up-to-date, and move all the other files in the directory to another folder:
E:\Deep Storage\ProjectFolder1\EDLs\File1.prproj
E:\Deep Storage\ProjectFolder1\EDLs\File2.prproj
I know how to do everything except get it to compare the LastWriteTimes. Is there a way to get PS to do this?
EDIT with code sample
Get-ChildItem $sourceDir -Include $search -Recurse |
    Sort-Object LastWriteTime -Descending |
    Select-Object -Skip 1 |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        New-Item -ItemType File -Path $targetFile -Force
        Move-Item $_.FullName -Destination $targetFile -Force
    }
EDIT with functional code:
$sourceDir = "E:\Test1\EDLs\"
$targetDir = "E:\Deep Storage\Test1\EDLs\"
$search = "*.prproj"
Get-ChildItem $sourceDir -Recurse -Directory | ForEach-Object {
    $files = $_ | Get-ChildItem -File -Filter $search
    if ($files.Count -lt 2) {
        return
    }
    $newPath = Join-Path $targetDir -ChildPath $_.FullName.Substring($sourceDir.Length)
    $null = New-Item $newPath -ItemType Directory -Force
    $files | Sort-Object LastWriteTime -Descending | Select-Object -Skip 1 |
        Move-Item -Destination $newPath -Verbose -WhatIf
}
EDIT to show actual syntax for operating environment:
$sourceDir = "E:\Projects\Current\EDLs"
$targetDir = "E:\Deep Storage\Projects\Current\EDLs"
$search = "*.prproj"
Get-ChildItem $sourceDir -Directory | ForEach-Object {
    # search for files only 1 level under this folder
    $files = Get-ChildItem $_.FullName -Filter $search
    # if there are at least 2 files here
    if ($files.Count -ge 2) {
        # we don't need to create a new folder here since these will go directly under
        # the destination folder, so we can just sort and skip the first as in the previous logic
        $files | Sort-Object LastWriteTime -Descending | Select-Object -Skip 1 |
            # then move them
            Move-Item -Destination $targetDir
    }
}
Ultimately the answer was a lot simpler than I thought it would be:
$sourceDir="E:\Test1\Test2"
# Where your files are
$targetDir="E:\Deep Storage\Test1\Test2"
# Where you want to send them
$search="*.ext"
# If applicable, what type of file you want to look for
Get-ChildItem -Path $sourceDir -Filter $search | Sort-Object |
Select-Object -SkipLast 1 | Move-Item -Destination $targetDir -Verbose -WhatIf
I use environment variables for my workflow so mine looks a little different, but this should be useful for anyone in the same situation.
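For illustration, an environment-variable version might look something like this (the variable names here are made up for the example, not the ones from my actual setup):
# Hypothetical sketch: pull the paths from environment variables instead of hardcoding them.
# ARCHIVE_SOURCE and ARCHIVE_TARGET are invented names; substitute your own.
$sourceDir = $env:ARCHIVE_SOURCE
$targetDir = $env:ARCHIVE_TARGET
$search = "*.ext"
Get-ChildItem -Path $sourceDir -Filter $search | Sort-Object LastWriteTime |
    Select-Object -SkipLast 1 | Move-Item -Destination $targetDir -Verbose -WhatIf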
Here is my current script, and it works fine. It's not efficient to run the same code twice, but I don't know how to combine the wildcards... anyway, on to the bigger issue.
The code below searches through my $sourceDir, excludes the files listed in $ExclusionFiles, copies all folders and structure as well as any .jpg or .csv files, then puts them into the $targetDir.
$sourceDir = 'c:\sectionOne\Graphics\Data'
$targetDir = 'C:\Test\'
$ExclusionFiles = @("InProgress.jpg", "input.csv", "PCMCSV2.csv")
# Get .jpg files
Get-ChildItem $sourceDir -Filter "*.jpg" -Recurse -Exclude $ExclusionFiles |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        New-Item -ItemType File -Path $targetFile -Force
        Copy-Item $_.FullName -Destination $targetFile
    }
# Get .csv files
Get-ChildItem $sourceDir -Filter "*.csv" -Recurse -Exclude $ExclusionFiles |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        New-Item -ItemType File -Path $targetFile -Force
        Copy-Item $_.FullName -Destination $targetFile
    }
My list of files in the main $sourceDir that I need to exclude is getting longer, and there are folders I want to exclude as well. Can someone tell me how to:
Copy only a list of specific files in the $sourceDir
Exclude certain folders in $sourceDir from copying
Combine the wildcard search for .jpg and .csv into one statement
I'm still learning so any help would be greatly appreciated!
This is a case where a little bit of Regex will go a long way:
You can filter multiple extensions by using a pretty basic match:
$extensions = 'jpg', 'csv'
$endsWithExtension = "\.(?>$($extensions -join '|'))$"
Get-ChildItem -Recurse |
Where-Object Name -Match $endsWithExtension
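With those two extensions, $endsWithExtension evaluates to \.(?>jpg|csv)$: a literal dot followed by an atomic group that matches either extension, anchored to the end of the file name.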
You can exclude a list of specific files with one more Where-Object and the -In parameter:
$extensions = 'jpg', 'csv'
$endsWithExtension = "\.(?>$($extensions -join '|'))$"
$ExcludeFileNames = @("InProgress.jpg", "input.csv", "PCMCSV2.csv")
Get-ChildItem -Recurse |
Where-Object Name -Match $endsWithExtension |
Where-Object Name -NotIn $ExcludeFileNames
From there on in, your Foreach-Object is basically correct (nice touch making sure the file exists by using New-Item, though I'd personally assign its output to null and -PassThru the Copy-Item).
Get-ChildItem $sourceDir -Recurse |
    Where-Object Name -Match $endsWithExtension |
    Where-Object Name -NotIn $ExcludeFileNames |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        New-Item -ItemType File -Path $targetFile -Force
        Copy-Item $_.FullName -Destination $targetFile
    }
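For what it's worth, here's a sketch of the variant I was describing, with New-Item's output silenced and Copy-Item passing the copied file through:
Get-ChildItem $sourceDir -Recurse |
    Where-Object Name -Match $endsWithExtension |
    Where-Object Name -NotIn $ExcludeFileNames |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        # Create the target file (and any missing folders) without cluttering the output
        $null = New-Item -ItemType File -Path $targetFile -Force
        # -PassThru makes Copy-Item emit the copied file object instead of nothing
        Copy-Item $_.FullName -Destination $targetFile -PassThru
    }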
I have a folder (C:\Users\user\Desktop\TEST) that contains a bunch of backups, separated by folder. I want a script that goes through that directory, which can hold any number of folders with any number of files in them, keeps only the latest file in every folder, and deletes the rest.
I have this, but it only does one folder at a time and the path has to be hardcoded.
$path = "C:\Users\user\Desktop\TEST\Folder1"
$FileNumber = (get-childitem $path).count - 1
get-childitem -path $path | sort CreationTime -Descending | select -last $FileNumber | Remove-Item -Force -WhatIf
Is there any way to automate this?
Thanks,
You can try this:
$Path = "C:\Users\user\Desktop\TEST"
$Folders = Get-ChildItem $Path
foreach ($Folder in $Folders)
{
$FolderName = $Folder.FullName
$Files = Get-ChildItem -Path $FolderName
$FileNumber = $Files.Count - 1
$Files | sort CreationTime -Descending | select -last $FileNumber | Remove-Item -Force -WhatIf
}
You would need a loop of your choice to accomplish this; this example uses ForEach-Object. Instead of using Select-Object -Last N you could just use Select-Object -Skip 1 to skip the newest file. Also note the use of -Directory and -File with Get-ChildItem to filter only directories or only files.
$path = "C:\Users\user\Desktop\TEST\Folder1"
# Get all Directories in `$path`
Get-ChildItem $path -Directory | ForEach-Object {
# For each Directory, get the Files
Get-ChildItem $_.FullName -File |
# Sort them from Newest to Oldest
Sort-Object CreationTime -Descending |
# Skip the first 1 (the newest)
Select-Object -Skip 1 |
# Remove the rest
Remove-Item -Force -WhatIf
}
I am trying to copy the latest file from every folder/sub-folder into the same file structure on a different drive.
The latest file from each source folder should be copied to the correspondingly named destination folder.
The destination folder hierarchy already exists and cannot be copied over or recreated. This version, and others I've tried, are not behaving as expected. Can anyone help?
$sourceDir = 'test F Drive\Shares\SSRSFileExtract\'
$destDir = 'test X Drive\SSRSFileExtract\'
$date = Get-Date
$list = Get-ChildItem -Path $sourceDir | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1
foreach ($item in $list)
{
Copy-Item -Verbose -LiteralPath $item.FullName -Destination $destDir -Force |
Get-Acl -Path $item.FullName | Set-Acl -Path $destDir\$(Split-Path -Path $item.FullName -Leaf)
}
Get-ChildItem -Path $destDir -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-5))} | Remove-Item -Verbose -Recurse -Force
I found a solution for this which copies/moves all the files from all sub folders into all corresponding sub folders:
Powershell: Move all files from folders and subfolders into single folder
The way your code retrieves the list of files will only return one single object because of Select-Object -First 1. Also, because you don't specify the -File switch, Get-ChildItem will also return DirectoryInfo objects, not just FileInfo objects.
What you could do is get an array of FileInfo objects recursively from the source folder and group them by the DirectoryName property.
Then loop over these groups of files and from each of these groups, select the most recent file and copy that over to the destination folder.
Try:
$sourceDir = 'F:\Shares\SSRSFileExtract'
$destDir = 'X:\SSRSFileExtract'
Get-ChildItem -Path $sourceDir -File -Recurse | Group-Object DirectoryName | ForEach-Object {
    # the $_ automatic variable represents one group at a time inside the loop
    $newestFile = $_.Group | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1
    # construct the target sub directory
    # you could also use $_.Name (the name of this group) instead of $newestFile.DirectoryName here, because
    # we grouped on that DirectoryName file property.
    $targetDir = Join-Path -Path $destDir -ChildPath $newestFile.DirectoryName.Substring($sourceDir.Length)
    # if you're not sure the target path exists, uncomment the next line to have it created first
    # if (!(Test-Path -Path $targetDir -PathType Container)) { $null = New-Item -Path $targetDir -ItemType Directory }
    # copy the file
    $newestFile | Copy-Item -Destination $targetDir -Force -Verbose
    # copy the file's ACL
    $targetFile = Join-Path -Path $targetDir -ChildPath $newestFile.Name
    $newestFile | Get-Acl | Set-Acl -Path $targetFile
}
Apparently you would also like to clean up older files in the destination folder:
Get-ChildItem -Path $destDir -File -Recurse |
Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-5).Date} |
Remove-Item -Verbose -Recurse -Force
Be aware that the final code to remove older files could potentially remove all files from a subfolder if all of them happen to be older than 5 days.
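If that's a concern, a variant along these lines (an untested sketch) always keeps the newest file in each destination subfolder and only then applies the age filter:
Get-ChildItem -Path $destDir -File -Recurse |
    Group-Object DirectoryName | ForEach-Object {
        # Within each folder, keep the newest file regardless of its age
        $_.Group | Sort-Object LastWriteTime -Descending |
            Select-Object -Skip 1 |
            Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-5).Date } |
            Remove-Item -Verbose -WhatIf  # drop -WhatIf once the preview looks right
    }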
I want to copy a file to multiple destinations using a script that filters through a directory, selects the newest file in $File_path, changes its name, and copies it to $destination. The script I'm using is this:
$File_path = "C:\TEMP\export\liste\Text_Utf8\"
$destination = "C:\TEMP\export\C7E001"
Get-ChildItem -Path $File_path -Filter "Ges?*.txt" |
    Where-Object { -not $_.PSIsContainer } |
    Sort-Object -Property CreationTime |
    Select-Object -Last 1 | Copy-Item -Destination (Join-Path $destination "FRER3000CCFETES01_IN.DEV")
This only copies it to one location; is there a way to improve it to copy the same file to multiple locations? I have seen this thread but it seems different.
The other locations are as follows:
C:\TEMP\export\C7P001
C:\TEMP\export\C7F001
C:\TEMP\export\C7S001
and so on.
Thank you.
Although my answer isn't very different from Peter's, this uses the LastWriteTime property to get the latest file, and passes the file's FullName property to the Copy-Item cmdlet.
$File_path = "C:\TEMP\export\liste\Text_Utf8"
$destinations = "C:\TEMP\export\C7E001", "C:\TEMP\export\C7F001", "C:\TEMP\export\C7S001"
$fileToCopy = Get-ChildItem -Path $File_path -Filter "Ges*.txt" -File |
    Sort-Object -Property LastWriteTime |
    Select-Object -Last 1

foreach ($dest in $destinations) {
    Copy-Item -Path $fileToCopy.FullName -Destination (Join-Path -Path $dest -ChildPath "FRER3000CCFETES01_IN.DEV")
}
You can use a ForEach-Object loop:
$File_path = "C:\TEMP\export\liste\Text_Utf8\"
$destination = "C:\TEMP\export\C7E001", "C:\TEMP\export\C7P001", "C:\TEMP\export\C7F001", "C:\TEMP\export\C7S001"
$Files = Get-ChildItem -Path $File_path -Filter "Ges?*.txt" |
    Where-Object { -not $_.PSIsContainer } |
    Sort-Object -Property CreationTime |
    Select-Object -Last 1
$destination | ForEach-Object { Copy-Item $Files -Destination (Join-Path $_ "FRER3000CCFETES01_IN.DEV") }
Hello All,
I wish to replace only the old files with newer ones.
I tried:
Set-Location 'C:\contains_newfolder_contents\Old Folder'
Get-ChildItem | ForEach-Object {
    if ((Test-Path "C:\contains_newfolder_contents\Sample Folder\$_") -and
        ($_.LastWriteTime -gt (Get-Item "C:\contains_newfolder_contents\Sample Folder\$_").LastWriteTime)) {
        Copy-Item ".\$_" -Destination 'C:\contains_newfolder_contents\Sample Folder'
    }
}
Kindly correct me!
Here's a one-line solution. I used different folder names to make the example easier to read.
Get-ChildItem C:\temp\destination | ForEach-Object { $sourceItem = Get-Item "C:\temp\source\$($_.Name)" -ErrorAction Ignore; if ($sourceItem -and $sourceItem.LastWriteTime -gt $_.LastWriteTime) { Copy-Item -Path $sourceItem -Destination $_.FullName -Verbose } }
For each existing file, it finds the matching file in the source folder. $sourceItem will be null if there is no matching source item. It then compares the dates and copies if the source date is newer.
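If you prefer it spread over multiple lines, the same logic reads like this (functionally identical to the one-liner above):
Get-ChildItem C:\temp\destination | ForEach-Object {
    # Look for a file with the same name in the source folder
    $sourceItem = Get-Item "C:\temp\source\$($_.Name)" -ErrorAction Ignore
    # Copy only when the source file exists and is newer than the destination file
    if ($sourceItem -and $sourceItem.LastWriteTime -gt $_.LastWriteTime) {
        Copy-Item -Path $sourceItem -Destination $_.FullName -Verbose
    }
}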
You can also do it like this:
Get-ChildItem "C:\contains_newfolder_contents\Old Folder" -file | sort LastWriteTime -Descending | select -First 1 | Copy-Item -Destination 'C:\contains_newfolder_contents\Sample Folder'
Instead of making several reads of the source, I propose you build a lookup table; then these simple commands will achieve the desired result:
$source = 'C:\temp\Source'
$destination = 'C:\temp\Destination'
$lookup = Get-ChildItem $destination | Group-Object -Property Name -AsHashTable
Get-ChildItem -Path $source |
    Where-Object { $_.LastWriteTime -gt $lookup[$_.Name].LastWriteTime } |
    Copy-Item -Destination $destination
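Worth noting: files that don't exist in the destination yet still get copied. For those, $lookup[$_.Name] returns $null, and in PowerShell a DateTime always compares greater than $null:
(Get-Date) -gt $null   # True, so brand-new source files pass the Where-Object filter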