Copy files from one directory to another - powershell

I need to copy files from one directory to another location, based on the age of the file. I also need to keep the directory structure.
This code works as far as copying only the files that meet the criteria, but it is NOT keeping the directory structure:
$ListDate = Get-Date "12/6/2013 11:08 AM"
$ActiveDate = $ListDate.AddYears(-7)
Get-ChildItem -Path "T:\ProductionServices" -Recurse |
    Where-Object { $_.LastWriteTime -le $ActiveDate -and -not $_.PSIsContainer } |
    Copy-Item -Destination "T:\TECH\CopyOfDeleteFile"
I've been struggling with this for over a week, and I've tried all the suggestions I have seen here and on the internet. I just need a little push to figure out what I am doing wrong.

This is one of those situations that is better suited to robocopy than PowerShell (or, use robocopy from within PowerShell). One caveat: with robocopy you can't get resolution down to the minute, I believe.
robocopy t:\Productionservices t:\tech\copyofdeletefile /E /MINAGE:20061206
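For reference, /MINAGE:20061206 tells robocopy to exclude files newer than 6 December 2006, so only files last modified on or before that date are copied, matching the seven-years-before cutoff in the question; /E copies the subdirectory tree, including empty folders.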
Or, if you want to use it with PowerShell to do the date math:
$ListDate = Get-Date "12/6/2013 11:08 AM"
$ActiveDate = Get-Date $ListDate.AddYears(-7) -Format "yyyyMMdd"
robocopy t:\Productionservices t:\tech\copyofdeletefile /E /MINAGE:$ActiveDate
You can use the /COPY:DAT /DCOPY:T switches to preserve all attributes & timestamps as well.
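If you'd rather stay in pure PowerShell, here is a minimal sketch of the usual workaround: compute each file's path relative to the source root and recreate it under the destination before copying. The $Source and $Destination variables are just illustrative names for the question's paths, and $ActiveDate is the cutoff computed above:
$Source = "T:\ProductionServices"
$Destination = "T:\TECH\CopyOfDeleteFile"
Get-ChildItem -Path $Source -Recurse |
    Where-Object { $_.LastWriteTime -le $ActiveDate -and -not $_.PSIsContainer } |
    ForEach-Object {
        # Rebuild the file's path relative to the source root
        $relative = $_.FullName.Substring($Source.Length).TrimStart('\')
        $target = Join-Path $Destination $relative
        # Create the destination subfolder before copying into it
        $targetDir = Split-Path $target -Parent
        if (-not (Test-Path $targetDir)) {
            New-Item -ItemType Directory -Path $targetDir -Force | Out-Null
        }
        Copy-Item -Path $_.FullName -Destination $target
    }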

Related

using 7zip to zip in powershell 5.0

I have customized a PowerShell script to zip files older than 7 days from a source folder into a subfolder, and then delete the original files from the source once zipping is complete. The script works fine with the built-in Compress-Archive and Remove-Item cmdlets for a small number of files, but takes much more time and system memory for a large volume of files. So I'm working on a solution using 7-Zip instead, as it's faster.
The script below zips correctly, but it does not follow the condition of only files older than 7 days: it deletes all the files from the source folder. It should zip and delete only files older than 7 days.
I have tried all possible ways to troubleshoot, but no luck. Can anybody suggest a possible solution?
if (-not (Test-Path "$env:ProgramFiles\7-Zip\7z.exe")) { throw "$env:ProgramFiles\7-Zip\7z.exe needed" }
Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"
$Date = Get-Date -Format yyyy-MM-dd_HH-mm
$Source = "C:\Users\529817\New folder1\New folder_2\"
$Target = "C:\Users\529817\New folder1\New folder_2\ARCHIVE\"
Get-ChildItem -Path $Source | sz a -mx=9 -sdel $Target\$Date.7z $Source
There are several problems here. The first is that 7-Zip doesn't accept a list of files from the pipeline; furthermore, even if it did, your Get-ChildItem is selecting every file rather than filtering by date. The only reason it works at all is that you are passing the source folder as a parameter to 7-Zip.
7-Zip accepts the list of files to zip as a command line argument:
Usage: 7z <command> [<switches>...] <archive_name> [<file_names>...] [@listfile]
And you can select the files you want by filtering the output from GCI on LastWriteTime.
Try changing your last line to this:
sz a -mx=9 -sdel $Target\$Date.7z (gci -Path $Source |? LastWriteTime -lt (Get-Date).AddDays(-7) | select -expandproperty FullName)
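Note that the simplified Where-Object syntax used here (? LastWriteTime -lt ...) requires PowerShell 3.0 or later; on older versions spell it out as ? { $_.LastWriteTime -lt (Get-Date).AddDays(-7) }.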
If you have hundreds of files and long paths then you may run into problems with the length of the command line in which case you might do this instead:
gci -Path $Source |? LastWriteTime -lt (Get-Date).AddDays(-7) |% { sz a -mx=9 -sdel $Target\$Date.7z $_.FullName }
Consider a temporary file with a list of those files which need to be compressed:
$tmp = "$($(New-Guid).guid).tmp"
Set-Content $tmp (gci -Path $Source |? LastWriteTime -lt (Get-Date).AddDays(-7)).FullName
# Quote the listfile argument: 7-Zip expects @listfile, and an unquoted
# @$tmp (or #$tmp) would be parsed by PowerShell as splatting (or a comment)
sz a -mmt=8 out.7z "@$tmp"
Remove-Item $tmp
Also, looking at the parameters to 7-Zip: -mx=9 will be slowest for a potentially small size gain. Perhaps leave that parameter out, take the default, and consider adding -mmt=8 to use multiple threads.
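One more hedged suggestion: when -sdel is in play, 7-Zip itself deletes the source files, so it's worth checking the exit code before assuming the run was clean. 7-Zip's documented convention is 0 = no error, 1 = warning, 2 or higher = real failure. A sketch, reusing the sz alias and $tmp list file from above:
sz a -mmt=8 -sdel out.7z "@$tmp"
if ($LASTEXITCODE -gt 1) {
    # 2+ covers fatal errors, bad command lines and out-of-memory conditions
    throw "7-Zip failed with exit code $LASTEXITCODE"
}
Remove-Item $tmp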

Zipping using powershell

I've written code to zip files older than 7 days from a source folder into a subfolder, and then delete the original files. My code works fine with the Compress-Archive and Remove-Item cmdlets for a smaller number of files, but takes more time and system memory for a large volume of files.
So I'm working on a solution using 7-Zip instead, as it's faster.
The code below zips correctly but does not limit itself to files older than 7 days: it deletes all the files from the source folder. It should zip and delete only files older than 7 days.
Is there anything wrong with the code?
if (-not (test-path "$env:ProgramFiles\7-Zip\7z.exe")) {throw "$env:ProgramFiles\7-Zip\7z.exe needed"}
set-alias 7z "$env:ProgramFiles\7-Zip\7z.exe"
$Days = "7"
$Date = Get-Date -format yyyy-MM-dd_HH-mm
$limit = (Get-Date).AddDays(-$Days)
$filePath = "C:\Users\529817\New folder1\New folder_2"
Where LastWriteTime -lt $limit | 7z a -t7z -sdel "C:\Users\529817\New folder1\New folder_2\ARCHIVE\$Date.7z" "$filePath"
I don't think you are running the 7-Zip command correctly. You are simply telling it to add all the files from the directory $filePath to the archive, and then delete all the files. That, and I have serious doubts that 7-Zip can take pipeline input the way your sample suggests.
Look at the examples from the 7-Zip command-line help:
7z a archive1.zip subdir\
Adds all files and subfolders from folder subdir to archive archive1.zip. The filenames in archive will contain subdir\ prefix.
7z a archive2.zip .\subdir\*
Adds all files and subfolders from folder subdir to archive archive2.zip. The filenames in archive will not contain subdir\ prefix.
I'd have to download 7Zip to test but I think you need a loop to process the files you isolated with the Where clause. It might look something like:
if (-not (Test-Path "$env:ProgramFiles\7-Zip\7z.exe")) { throw "$env:ProgramFiles\7-Zip\7z.exe needed" }
Set-Alias 7z "$env:ProgramFiles\7-Zip\7z.exe"
$Days = 7
$Date = Get-Date -Format yyyy-MM-dd_HH-mm
$limit = (Get-Date).AddDays(-$Days)
$filePath = "C:\Users\529817\New folder1\New folder_2"
Get-ChildItem $filePath |
    Where-Object { $_.LastWriteTime -lt $limit } |
    ForEach-Object {
        7z a -t7z -sdel "C:\Users\529817\New folder1\New folder_2\ARCHIVE\$Date.7z" $_.FullName
    }
Note: at least in your sample you are missing the Get-ChildItem command. Also, you don't need to reference the .Date property of the [DateTime] returned by the .AddDays() method unless you want the boundary to be midnight of that date; otherwise .AddDays() naturally returns a [DateTime].
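To illustrate that last point with a quick sketch (the date is arbitrary): .Date simply truncates the time component, which moves the cutoff back to midnight.
$now = Get-Date "2021-01-21 15:30"
$now.AddDays(-7)       # 2021-01-14 15:30 - cutoff keeps the time of day
$now.AddDays(-7).Date  # 2021-01-14 00:00 - cutoff snaps back to midnight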

Powershell: Get-ChildItem performance to deal with bulk files

The scenario: on a remote server, a folder is shared for people to access the log files.
The log files are kept for around 30 days before they age out, and around 1000 log files are generated each day.
For problem analysis, I need to copy log files to my own machine, according to the file timestamp.
My previous strategy is:
Use the dir /OD command to get the list of files, ordered by date, into a file on my local PC
Open the file, find the timestamp, and get the list of files I need to copy
Use the copy command to copy the actual log files
It works but needs some manual work, i.e. in step 2 I use Notepad++ and regular expressions to filter on the timestamp
I tried to use powershell as:
Get-ChildItem -Path $remotedir |
    Where-Object { $_.LastWriteTime -gt $starttime -and $_.LastWriteTime -lt $endtime } |
    ForEach-Object { Copy-Item $_.FullName -Destination .\ }
However, using this approach it took hours and hours and no file was copied. By comparison, with the dir solution it took around 7-8 minutes to generate the list of files, and the copy itself took some time, but not hours.
I guess most of the time is spent filtering the files. I'm not quite sure why Get-ChildItem's performance is so poor.
Can you please advise if there's anything I can change?
Thanks
For directories with a lot of files, Get-ChildItem is too slow. It looks like most of the time is spent enumerating the directory, then filtering through Where-Object, then copying each file.
Use .NET directly, particularly [IO.DirectoryInfo] with the GetFileSystemInfos() method.
e.g.
$remotedir = [IO.DirectoryInfo]'\\server\share'
$destination = '.\'
$filemask = '*.*'
$starttime = [datetime]'jan-21-2021 1:23pm'
$endtime = [datetime]'jan-21-2021 4:56pm'
$remotedir.GetFileSystemInfos($filemask, [System.IO.SearchOption]::TopDirectoryOnly) | % {
    if ($_.LastWriteTime -gt $starttime -and $_.LastWriteTime -lt $endtime) {
        Copy-Item -Path $_.FullName -Destination $destination
    }
}
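If memory is also a concern, a hedged variation of the same idea: the enumerating methods stream entries as they are read instead of building the whole array first, so copying can start before the listing finishes. This reuses the $remotedir, $filemask, $starttime, $endtime and $destination variables from above:
foreach ($item in $remotedir.EnumerateFiles($filemask, [System.IO.SearchOption]::TopDirectoryOnly)) {
    # EnumerateFiles yields each FileInfo as it is read from the directory
    if ($item.LastWriteTime -gt $starttime -and $item.LastWriteTime -lt $endtime) {
        Copy-Item -Path $item.FullName -Destination $destination
    }
}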

Batch - Find and Move Folders Based on Folder Created Date

I'm trying to create a batch (or PowerShell) script that does the following:
Gets the current date (i.e. 06/21/2018)
Looks at all sub-folders in a specific folder (not recursively, just the immediate sub-folders) and finds all folders with a created date in the previous year, up to the current date in the previous year (i.e. 01/01/2017 - 06/21/2017).
Moves all of those folders to a '2017 Jobs' folder.
So I've been searching around for an answer to this question, but everything seems to be focused on file dates, not folder dates, so here we are. I know how to use Robocopy to move the folders once found, but all of its switches are based on moving files older than X date, not folders. Any ideas on how I can achieve this folder-based, created-date lookup?
Without building the entire script for you, here are your pieces:
$date = Get-Date
$dateAYearAgo = $date.AddYears(-1)
$items = Get-ChildItem "C:\base\folder" | Where-Object { $_.CreationTime -gt $dateAYearAgo -and $_.CreationTime -lt $date }
$items | Move-Item -Destination "C:\base\folder\2017 Jobs"
As for filtering out just folders, check whether the version of PowerShell you are on allows Get-ChildItem C:\ -Directory to pull only directories, or use Get-ChildItem C:\ | Where-Object { $_.PSIsContainer }
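Assembled into one piece, a sketch that matches the question's window (January 1 of the previous year through today's date in the previous year); the folder paths are the question's examples, and -Directory assumes PowerShell 3.0 or later:
$now = Get-Date
$start = Get-Date -Year ($now.Year - 1) -Month 1 -Day 1 -Hour 0 -Minute 0 -Second 0
$end = $now.AddYears(-1)
Get-ChildItem "C:\base\folder" -Directory |
    Where-Object { $_.CreationTime -ge $start -and $_.CreationTime -le $end } |
    Move-Item -Destination "C:\base\folder\2017 Jobs"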

Copy files in PowerShell too slow

Good day, all. New member here and relatively new to PowerShell so I'm having trouble figuring this one out. I have searched for 2 days now but haven't found anything that quite suits my needs.
I need to copy folders created on the current date to another location using mapped drives. These folders live under 5 other folders, based on language.
Folder1\Folder2\Folder3\Folder4\chs, enu, jpn, kor, tha
The folders to be copied all start with the same letters followed by numbers - abc123456789_111. With the following script, I don't need to worry about folder names because only the folder I need will have the current date.
The folders that the abc* folders live in have about 35k files and over 1500 folders each.
I have gotten all of this to work using Get-ChildItem but it is so slow that the developer could manually copy the files by the time the script completes. Here is my script:
GCI -Path $SrcPath -Recurse |
    Where {$_.LastWriteTime -ge (Get-Date).Date} |
    Copy -Destination {
        if ($_.PSIsContainer) {
            Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.length)
        } else {
            Join-Path $DestPath $_.FullName.Substring($SrcPath.length)
        }
    } -Force -Recurse
(This only copies to one destination folder at the moment.)
I have also been looking into using cmd /c dir and cmd /c forfiles but haven't been able to work it out. Dir will list the folders but not by date. Forfiles has turned out to be pretty slow, too.
I'm not a developer but I'm trying to learn as much as possible. Any help/suggestions are greatly appreciated.
@BaconBits is right: you have -Recurse on your Copy-Item as well as your Get-ChildItem. This causes a lot of extra, pointless copies which are just overwrites due to your -Force parameter. Change your script to use a ForEach-Object loop and drop the -Recurse parameter from Copy-Item:
Get-ChildItem -Path $SrcPath -Recurse |
    Where-Object { $_.LastWriteTime -ge (Get-Date).Date } |
    ForEach-Object {
        # Compute the destination explicitly: a delay-bind scriptblock only
        # works when Copy-Item itself receives the pipeline input
        if ($_.PSIsContainer) {
            $dest = Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.Length)
        } else {
            $dest = Join-Path $DestPath $_.FullName.Substring($SrcPath.Length)
        }
        Copy-Item -Path $_.FullName -Destination $dest -Force
    }
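A hedged alternative worth trying, since the filter is simply "modified today": robocopy does the date selection and rebuilds the tree natively, and it's usually much faster over mapped drives. $SrcPath and $DestPath are the question's placeholders; /MAXAGE with a yyyyMMdd date excludes files last modified before that date:
# /E copies subfolders (including empty ones); /MAXAGE keeps only today's files
$today = Get-Date -Format yyyyMMdd
robocopy $SrcPath $DestPath /E /MAXAGE:$today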