I need a script that only copies files once they are at least 5 minutes old, based on the modification date. Does anyone have a solution for this?
I couldn't find any script online.
The answer from jdweng is a good way to identify the files in scope.
You could structure your script like this to make it easy to reuse with other paths or file ages:
# Customizable variables
$Source = 'C:\Temp\Input'
$Destination = 'C:\Temp\Output'
[int32]$FileAgeInMinutes = 5

# Script execution: copy files whose last write time is older than the cutoff
Get-ChildItem -Path $Source |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddMinutes(-$FileAgeInMinutes) } |
    Copy-Item -Destination $Destination
You could then run this script as a scheduled task, scheduled to run periodically depending on your needs.
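For example, here is a rough sketch of registering such a task with the ScheduledTasks cmdlets; the script path, task name, and 5-minute interval are placeholders, not part of the original answer, and on older Windows versions you may also need to pass -RepetitionDuration:

$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Copy-AgedFiles.ps1"'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Minutes 5)
# Registers a task that re-runs the copy script every 5 minutes
Register-ScheduledTask -TaskName 'Copy-AgedFiles' -Action $action -Trigger $trigger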
The scenario: on a remote server, a folder is shared so people can access the log files.
The log files are kept for around 30 days before they are aged out, and around 1000 log files are generated each day.
For problem analysis, I need to copy log files to my own machine, based on the file timestamps.
My previous strategy was:
1. Use the dir /OD command to dump the list of files, sorted by modification time, into a file on my local PC
2. Open the file, find the timestamps, and build the list of files I need to copy
3. Use the copy command to copy the actual log files
It works but needs some manual work; e.g. for step 2 I use Notepad++ and regular expressions to filter on the timestamp.
I tried to use PowerShell like this:
Get-ChildItem -Path $remotedir |
    Where-Object { $_.LastWriteTime -gt $starttime -and $_.LastWriteTime -lt $endtime } |
    ForEach-Object { Copy-Item $_.FullName -Destination .\ }
However, using this approach it ran for hours and no files were copied. By comparison, the dir solution took around 7-8 minutes to generate the list of files; the copy itself took some time, but not hours.
I guess most of the time is spent filtering the files. I'm not quite sure why Get-ChildItem's performance is so poor.
Can you please advise if there's anything I can change?
Thanks
For directories with a lot of files, Get-ChildItem is too slow. It looks like most of the time is spent enumerating the directory, then filtering through Where-Object, then copying each file.
Use .NET directly, in particular [IO.DirectoryInfo] and its GetFileSystemInfos() method.
e.g.
$remotedir = [IO.DirectoryInfo]'\\server\share'
$destination = '.\'
$filemask = '*.*'
$starttime = [datetime]'jan-21-2021 1:23pm'
$endtime = [datetime]'jan-21-2021 4:56pm'

$remotedir.GetFileSystemInfos($filemask, [System.IO.SearchOption]::TopDirectoryOnly) | % {
    if ($_.LastWriteTime -gt $starttime -and $_.LastWriteTime -lt $endtime) {
        Copy-Item -Path $_.FullName -Destination $destination
    }
}
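If the share is very large, a further variation (untested sketch, reusing the same variables as above) is EnumerateFileSystemInfos(), which streams entries as they are found instead of returning the whole array first, so copying can start before enumeration finishes:

# Streams matches one at a time rather than materializing the full array
$remotedir.EnumerateFileSystemInfos($filemask, [System.IO.SearchOption]::TopDirectoryOnly) |
    ForEach-Object {
        if ($_.LastWriteTime -gt $starttime -and $_.LastWriteTime -lt $endtime) {
            Copy-Item -LiteralPath $_.FullName -Destination $destination
        }
    }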
I am working out a scenario using PowerShell.
I have a batch job which, upon successful execution, first creates a date folder for that particular day and then creates a .CSV file under it. The folder structure then looks like this:
\\Server\SourceA\Processed\20200120\TestA.CSV
When the job runs the next day, it creates another folder and file like this:
\\Server\SourceA\Processed\20200121\TestB.CSV
Many such folders have already been created in the past.
My PS script is fine to run daily after the batch job has completed: I read a date, append it to the path, and it copies the file from the source to the destination folder. But I want to enable my PS script to read all the previous date folders created under
\\Server\SourceA\Processed\
Another tricky part is that under each date folder there are a few other subfolders, i.e.
\\Server\SourceA\Processed\20191010\Log
\\Server\SourceA\Processed\20191010\Charlie
\\Server\SourceA\Processed\20191010\Alpha
\\Server\SourceA\Processed\20191010\Delta
Among them, I only need to read files from the Log folder.
Hence, my actual source path becomes something like
\\Server\SourceA\Processed\20191010\Log\TestA.CSV
Here is my script (which is static right now and unable to read the previously created date folders):
$fullSourceFileName = "\\Server\SourceA\Processed\"
$date = Get-Date -format "yyyyMMdd"
$fullSourceFileName = "$($fullSourceFileName)$($date)\Log"
$destination = "\\Server\DestA\Processed\"
$destination = "$($destination)$($date)\"
Get-ChildItem -Path $fullSourceFileName -Recurse | Copy-Item -Destination $destination
Your help is highly appreciated.
I did not know I could use a foreach loop in PowerShell.
So, here is the answer that reads all the dynamic date folders under my given path.
I hope this helps the community.
$fullSourceFileName = "\\Server\SourceA\Processed\"
$DirToRead = "\Log\"
$dates = Get-ChildItem -Path $fullSourceFileName -Directory
$destination = "\\Server\DestA\Processed\"

foreach ($date in $dates) {
    $ActualPath = "$($fullSourceFileName)$($date)$($DirToRead)"
    if (!(Test-Path $ActualPath)) {
        Write-Output "$ActualPath source path does not exist"
    }
    else {
        Get-ChildItem -Path $ActualPath -Recurse | Copy-Item -Destination "$($destination)$($date)\"
    }
    $ActualPath = ""
}
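As a possible refinement (only a sketch, reusing the variables above), you could create each per-date destination folder up front so Copy-Item never has to target a folder that does not exist yet:

foreach ($date in $dates) {
    $ActualPath = "$($fullSourceFileName)$($date)$($DirToRead)"
    $TargetPath = "$($destination)$($date)\"
    if (Test-Path $ActualPath) {
        # -Force lets New-Item succeed whether or not the folder already exists
        New-Item -Path $TargetPath -ItemType Directory -Force | Out-Null
        Get-ChildItem -Path $ActualPath -Recurse | Copy-Item -Destination $TargetPath
    }
    else {
        Write-Output "$ActualPath source path does not exist"
    }
}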
AD Manager Plus generates reports hourly to a time-stamped file path, and I would like to copy these files to another location, overwriting the existing file. I will then schedule the script to run hourly after the files have been generated. Unfortunately, the location the reports are extracted to cannot be modified, and it creates date- and time-stamped folders.
Example:
C:\ADManager Plus\audit-data\reports\16042019\DailyTrue-Up01-55-07\Real Last Logon.xls
C:\ADManager Plus\audit-data\reports\ddmmyyyy\DailyTrue-Uphh-mm-ss\Real Last Logon.xls
I thought the easiest approach would be to:
1. Get the last modified folder in the reports folder, e.g. 16042019
2. Get the last modified folder in the 16042019 folder, e.g. DailyTrue-Up01-55-07
3. Filter for the Real Last Logon.xls spreadsheet in the DailyTrue-Up01-55-07 folder
$Path = "C:\ADManager Plus\audit-data\reports"
$DestinationPath = "\\domain\networkshare\Reports\"
Get-ChildItem -Path $Path -Directory | ForEach-Object {
    Get-ChildItem -Path "$Path\$_" -File -Filter "Real Last Logon.xlsx" |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 1 |
        Copy-Item -Force -Destination (New-Item -Force -Type Directory -Path (Join-Path $DestinationPath ($_.FullName.Replace("$Path\", ''))))
}
The code we have seems to copy all the folders to the destination and can't look in multiple directories.
I have a feeling we are approaching this wrong. Can anyone suggest the best way to achieve this? There are few posts online that explain how to retrieve files from time-stamped folders.
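One way to sketch the three-step approach described in the question (newest date folder, then newest DailyTrue-Up folder, then the single report file). This is untested and based only on the paths given above; the filter may need to be .xls or .xlsx depending on what AD Manager Plus actually produces:

$Path = "C:\ADManager Plus\audit-data\reports"
$DestinationPath = "\\domain\networkshare\Reports\"

# Step 1: newest date folder, e.g. 16042019
$latestDate = Get-ChildItem -Path $Path -Directory |
    Sort-Object LastWriteTime -Descending | Select-Object -First 1

# Step 2: newest run folder inside it, e.g. DailyTrue-Up01-55-07
$latestRun = Get-ChildItem -Path $latestDate.FullName -Directory |
    Sort-Object LastWriteTime -Descending | Select-Object -First 1

# Step 3: copy the single report, overwriting any existing copy
Get-ChildItem -Path $latestRun.FullName -File -Filter "Real Last Logon.xls" |
    Copy-Item -Destination $DestinationPath -Force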
I have a script that references a .csv document of filenames and then runs Get-ChildItem over a large directory to find each file and pull the 'owner'. Finally, the info is output to another .csv document. We use this to find who created files. Additionally, I have it create .txt files with the filename and a timestamp to see how fast the script is finding the data. The code is as follows:
Get-ChildItem -Path $serverPath -Filter $properFilename -Recurse -ErrorAction 'SilentlyContinue' |
    Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(30) -and
        $_.Extension -eq ".jpg"} |
    Select-Object -Property @{
        Name='Owner'
        Expression={(Get-Acl -Path $_.FullName).Owner}
    },'*' |
    Export-Csv -Path "$desktopPath\Owner_Reports\Owners.csv" -NoTypeInformation -Append

$time = (Get-Date -f 'hhmm')
Out-File "$desktopPath\Owner_Reports\${fileName}_$time.txt"
}
This script serves its purpose but is extremely slow because of the size of the parent directory. Currently it takes 12 minutes per filename. We query approximately 150 files at a time, and this long wait time is hindering production.
Does anyone have better logic that could increase the speed? I assume that each time the script runs Get-ChildItem it rebuilds its view of the parent directory, but I am not sure. Is there a way we can create the index one time instead of once per filename?
I am open to any and all suggestions! If more data is required (such as the variable naming etc.), I will provide it upon request.
Thanks!
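Not a full answer, but one way to sketch the "create the index one time" idea from the question: walk the directory tree once into a hashtable keyed by file name, then resolve every CSV entry from memory. The CSV layout (a FileName column), the input file name, and the *.jpg filter are assumptions, not taken from the original script:

# Sketch only: $serverPath, $desktopPath and the CSV column name are assumed
$index = @{}
Get-ChildItem -Path $serverPath -Filter '*.jpg' -Recurse -File -ErrorAction SilentlyContinue |
    ForEach-Object { $index[$_.Name] = $_ }   # one directory walk for all lookups

Import-Csv -Path "$desktopPath\filenames.csv" | ForEach-Object {
    $file = $index[$_.FileName]
    if ($file) {
        [pscustomobject]@{
            FullName = $file.FullName
            Owner    = (Get-Acl -Path $file.FullName).Owner
        }
    }
} | Export-Csv -Path "$desktopPath\Owner_Reports\Owners.csv" -NoTypeInformation -Append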
Good day, all. I'm a new member here and relatively new to PowerShell, so I'm having trouble figuring this one out. I have searched for 2 days now but haven't found anything that quite suits my needs.
I need to copy folders created on the current date to another location using mapped drives. These folders live under 5 other folders, based on language:
Folder1\Folder2\Folder3\Folder4\chs, enu, jpn, kor, tha
The folders to be copied all start with the same letters followed by numbers - abc123456789_111. With the following script, I don't need to worry about folder names because only the folder I need will have the current date.
The folders that the abc* folders live in have about 35k files and over 1500 folders each.
I have gotten all of this to work using Get-ChildItem but it is so slow that the developer could manually copy the files by the time the script completes. Here is my script:
GCI -Path $SrcPath -Recurse |
    Where {$_.LastWriteTime -ge (Get-Date).Date} |
    Copy -Destination {
        if ($_.PSIsContainer) {
            Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.length)
        } else {
            Join-Path $DestPath $_.FullName.Substring($SrcPath.length)
        }
    } -Force -Recurse
(This only copies to one destination folder at the moment.)
I have also been looking into using cmd /c dir and cmd /c forfiles but haven't been able to work it out. Dir will list the folders but not by date. Forfiles has turned out to be pretty slow, too.
I'm not a developer but I'm trying to learn as much as possible. Any help/suggestions are greatly appreciated.
@BaconBits is right: you have -Recurse on your Copy-Item as well as your Get-ChildItem. This will cause a lot of extra, pointless copies which are just overwrites due to your -Force parameter. Change your script to use a foreach loop and drop the -Recurse parameter from Copy-Item:
GCI -Path $SrcPath -Recurse |
    Where {$_.LastWriteTime -ge (Get-Date).Date} | % {
        # Compute the destination inside the loop; a script block passed to
        # -Destination is only delay-bound for pipeline input, not inside ForEach-Object
        $target = if ($_.PSIsContainer) {
            Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.length)
        } else {
            Join-Path $DestPath $_.FullName.Substring($SrcPath.length)
        }
        Copy -Path $_.FullName -Destination $target -Force
    }