Copy and rename a file every minute - PowerShell

So I am trying to copy files from one folder to another. The files in this folder are overwritten every minute by another program. I want to get a copy of each file every minute, before it gets overwritten, and save it somewhere else. See the example structure below:
Folder 1  # gets overwritten every minute
    a.txt
    a_backup.txt
Folder 2
    a1.txt
    a1_backup.txt
    a2.txt
    a2_backup.txt
    a3.txt
    a3_backup.txt
    a4.txt
    a4_backup.txt
It would be even better if the files in Folder 2 would contain the date and time of when they were copied in their names.
I came up with the following:
$Source = 'C:\Users\Source'
$Destination = 'C:\Users\Target'
Do {
    Copy-Item $Source\* -Destination $Destination
    Start-Sleep -Seconds 59
} While ($true)
However, this does not do the job completely: every pass writes to the same file names, so each copy overwrites the previous one in the destination and I only ever keep the most recent version of each file...
Any help is warmly welcome!

New to giving answers, but here's my proposal: read the file with Get-Content and write it out to another file named with the current time. Of course, wrap your loop around it:
Get-Content C:\log.txt | Out-File "Log.$([System.Math]::Round((Get-Date -UFormat %s),0)).txt"

Get-ChildItem -Path $Source | ForEach-Object { Copy-Item $_.FullName -Destination "$Destination\$((Get-Date).ToString("MMddyyyy-HHmmss"))$($_.Name)" }
This statement will take care of it, but what it will not do is hide the exceptions and failures you will get while the file is being written to.
$_.FullName includes the full path, so it can be used as the source.
$_.Name gives you the file name only (without the path).
(Get-Date).ToString("MMddyyyy-HHmmss") gives you the date in the format specified in ToString(). Since the file is updated every minute, you need to include the minutes and seconds in the file name as well; use HH (24-hour format) rather than hh so morning and evening copies don't collide.
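Tying this together with the loop from the question, here is a minimal sketch: the paths come from the question, the timestamped names from the line above, and the try/catch for files that are locked mid-write is my assumption about how failures should be handled:
$Source      = 'C:\Users\Source'
$Destination = 'C:\Users\Target'

while ($true) {
    Get-ChildItem -Path $Source -File | ForEach-Object {
        $file   = $_
        # Timestamp each copy so successive runs don't overwrite each other
        $stamp  = (Get-Date).ToString('MMddyyyy-HHmmss')
        $target = Join-Path $Destination "$stamp$($file.Name)"
        try {
            Copy-Item -Path $file.FullName -Destination $target -ErrorAction Stop
        }
        catch {
            # The source may be locked while the other program rewrites it;
            # skip it this cycle and pick it up again on the next pass
            Write-Warning "Skipped $($file.Name): $_"
        }
    }
    Start-Sleep -Seconds 60
}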

Related

Powershell - Copy directory and files from source folder to destination folder

I am working on a scenario using PowerShell.
The scenario is this: a batch job, upon successful execution, first creates a date folder for that particular day and then creates a .CSV file under it. The folder structure then looks like below:
\\Server\SourceA\Processed\20200120\TestA.CSV
When the job runs the next day, it creates another folder and file, like below:
\\Server\SourceA\Processed\20200121\TestB.CSV
Over time, many such folders have already been created this way.
My PS script is fine to run daily after the batch job completes: I read a date, append it to the path, and it copies files from the source to the destination folder. But I want to enable my PS script to read all the previous date folders created under
\\Server\SourceA\Processed\
Another tricky part: under each date folder there are a few other subfolders, i.e.
\\Server\SourceA\Processed\20191010\Log
\\Server\SourceA\Processed\20191010\Charlie
\\Server\SourceA\Processed\20191010\Alpha
\\Server\SourceA\Processed\20191010\Delta
Among them, I only need to read files from the Log folder.
Hence, my actual source path becomes:
\\Server\SourceA\Processed\20191010\Log\TestA.CSV
Here is my script (which is static right now and unable to read past existing date folders).
$fullSourceFileName = "\\Server\SourceA\Processed\"
$date = Get-Date -Format "yyyyMMdd"
$fullSourceFileName = "$($fullSourceFileName)$($date)\Log"
$destination = "\\Server\DestA\Processed\"
$destination = "$($destination)$($date)\"
Get-ChildItem -Path $fullSourceFileName -Recurse | Copy-Item -Destination $destination
Your help is highly appreciated.
I did not know I could use a foreach loop in PowerShell.
So here is the answer that reads all the dynamic date folders under the given path.
I hope this helps the community.
$fullSourceFileName = "\\Server\SourceA\Processed\"
$DirToRead = "\Log\"
$dates = Get-ChildItem -Path $fullSourceFileName -Directory
$destination = "\\Server\DestA\Processed\"
foreach ($date in $dates) {
    $ActualPath = "$($fullSourceFileName)$($date)$($DirToRead)"
    if (!(Test-Path $ActualPath)) {
        Write-Output "$ActualPath source path does not exist"
    }
    else {
        Get-ChildItem -Path $ActualPath -Recurse | Copy-Item -Destination "$($destination)$($date)\"
    }
    $ActualPath = ""
}
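One caveat worth flagging (my observation, not from the thread): Copy-Item will not create the per-date destination folder when it does not already exist, so the copy in the else branch can fail against a fresh destination. A small guard inside the loop, sketched here, handles that:
$targetDir = "$($destination)$($date)\"
if (!(Test-Path $targetDir)) {
    # Create the dated destination folder before copying into it
    New-Item -ItemType Directory -Path $targetDir | Out-Null
}
Get-ChildItem -Path $ActualPath -Recurse | Copy-Item -Destination $targetDir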

Checking if at least one file in folder has a newer modified date

I am trying to write a batch file/PowerShell script that compares the last modified date between a file in folder A and a bunch of files in folder B. If the script finds a more recent file in folder B, then stop comparing and run XYZ. Otherwise, do nothing.
So far, after going over a bunch of related posts, I came up with the following PowerShell script.
$pdf = "T:\Sample_Folder\PDF\*.pdf"
$pdfTime = (Get-Item $pdf).LastWriteTime
$source = "T:\Sample_Folder\Sources"
Get-ChildItem $source | ForEach-Object {
    if ($_.LastWriteTime -gt $pdfTime -and $_.Mode -ne "d----") {
        echo "Hello!"
    }
}
The code seems to correctly check the last modified date of $pdf and of $source folder children. But what I need is to stop the loop once a more recent file is found in $source and go to the action (echo in this example).
How do I stop the loop once the first more recent file is found in $source?
Thank you in advance!
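One way to stop at the first hit (a sketch, not an answer taken from the thread): use a foreach statement rather than ForEach-Object, because break inside a ForEach-Object script block does not just stop that block and can terminate more than you intend, whereas it cleanly exits a foreach loop. Here -File replaces the fragile $_.Mode check for skipping directories, and Select-Object -First 1 covers the case where the wildcard matches more than one PDF:
$pdfTime = (Get-Item "T:\Sample_Folder\PDF\*.pdf" | Select-Object -First 1).LastWriteTime
foreach ($file in Get-ChildItem "T:\Sample_Folder\Sources" -File) {
    if ($file.LastWriteTime -gt $pdfTime) {
        echo "Hello!"   # run XYZ here instead
        break           # stop scanning at the first newer file
    }
}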

Copy-Items: Copying the same folder more than once

I am trying to copy the files and folders that were last accessed within the past year, and this is my script:
$Source = "\\UNC\Path\Folder\"
$Dest = "\\UNC\Path2\Folder\1"
$Get = Get-ChildItem $Source -Recurse | Where-Object {
    $_.LastAccessTime -ge (Get-Date).AddMonths(-12).ToString("yyyy-MM-dd")
}
$Get | Copy-Item -Destination $Dest -Recurse
The script works, except that it copies the files and folders more than once.
For example, it will copy \\UNC\Path\Folder\a\b\File1.txt to both:
\\UNC\Path2\Folder\1\a\b\File1.txt
\\UNC\Path2\Folder\a\b\File1.txt
Note it skips the folder called 1 and puts it directly under Folder.
Now File1.txt has been copied twice; it's the same file, just in two destination locations.
I have Googled and searched this forum, but I haven't found anything. Any idea what causes this?
I used RoboCopy as @TessellatingHeckler suggested and it worked great!
Robocopy.exe \\Source\folder\Folder1 \\dest\folder\folder\1 /MAXLAD:365 /e /copyall /log:C:\Logs\log.txt
Robocopy source: https://social.technet.microsoft.com/wiki/contents/articles/1073.robocopy-and-a-few-examples.aspx#Copy_all_content_including_empty_directory
/maxlad: Specifies the maximum last access date (excludes files unused since N).
/minlad: Specifies the minimum last access date (excludes files used since N).
If N is less than 1900, N specifies a number of days; otherwise, N specifies a date in the format YYYYMMDD.
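For completeness, a PowerShell-only sketch that avoids the double copy (my suggestion, not from the thread): pipe only files, and rebuild each file's path relative to the source root, instead of letting Copy-Item -Recurse handle a mix of folders and files:
$Source = "\\UNC\Path\Folder\"
$Dest = "\\UNC\Path2\Folder\1"
$cutoff = (Get-Date).AddMonths(-12)
Get-ChildItem $Source -Recurse -File |
    Where-Object { $_.LastAccessTime -ge $cutoff } |
    ForEach-Object {
        # Rebuild the file's path relative to the source root
        $relative = $_.FullName.Substring($Source.Length)
        $target = Join-Path $Dest $relative
        # Ensure the target subfolder exists before copying into it
        New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
        Copy-Item $_.FullName -Destination $target
    }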

Copy Script Trouble

I've put together a script, based on examples from other forums, to copy files from one server to another. A bit of background: we have large PDF files being generated nightly and saved on a share at one site that need to be copied over to our corporate share. These files are pretty sizable (anywhere from 25 MB up to 65 MB), so I can only copy them off hours. The source repository holds all the original files from the year, so I only want to copy the most recent files. I created a script (or tried to, at least) that copies only the most recent files from the SourceHost location to the CorpHost share, and set up a Task Scheduler job to run at 7:30 PM.
The script kicks off and runs as scheduled, but nothing gets copied over. I don't see any errors from the scheduled task, and the script appears to run normally, since it returns its "not copying" message for each .pdf. Originally, I thought it was bypassing all the files because their generation dates were outside the $Max_days range (-1), so I increased it to -2. No luck. Increased it again to -5 - no luck. -10... nothing.
Here's the code sample:
$RemotePath = "\\<CorpHost>\Shared\Packing_Slips\<Site>"
$SourcePath = "\\<SourceHost>\<Site>_packingslips"
$Max_days = "-1"
$Curr_date = Get-Date

# Check the date, then copy files from SourcePath to RemotePath
foreach ($file in (Get-ChildItem $SourcePath))
{
    if ($file.LastWriteTime -gt ($Curr_date).AddDays($Max_days))
    {
        Copy-Item -Path $file.FullName -Destination $RemotePath
    }
    else
    {
        "not copying $file"
    }
}
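Since every file is taking the else branch, the first thing to verify is what the comparison actually sees. A diagnostic sketch (my suggestion; the log path is a hypothetical location the task account can write to) that also surfaces the common case where the scheduled task runs under an account that cannot reach the source share at all:
$log = "C:\Temp\copylog.txt"   # hypothetical log location
$cutoff = (Get-Date).AddDays(-1)
"Run at $(Get-Date); cutoff is $cutoff" | Add-Content $log
foreach ($file in Get-ChildItem $SourcePath) {
    "$($file.Name) LastWriteTime=$($file.LastWriteTime)" | Add-Content $log
}
# If the log lists no files at all, the task account likely lacks access
# to the \\<SourceHost> share, rather than the dates being out of range.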

Powershell Scripting - Zipping, Moving BAK files to a new folder named for the date of BAK file creation

Firstly, Hello and apologies for the stupidly long title...
Secondly, I hope someone out there can help with what should be a simple task that has annoyed me for the past 4 days. I will elaborate:
I have 4 BAK files that are created between 22:30 and 23:00 each night.
Each BAK file is named differently, and we append the date in the format "yyyy_MM_dd".
I need to 7z each BAK file into a separate archive and move them to a new directory, named either for the date they were created or for the date appended to the file, keeping the format "yyyy_MM_dd". (Both will be the same, obviously, but the code will be different, so whichever is easiest.)
I believe I have the separate lines for some of the script I need...
For Creating the 7z
dir *.bak | ForEach-Object { & "C:\Program Files\7-Zip\7z.exe" a -t7z -mx3 ($_.Name + ".7z") $_.Name }
For Creating the folder
$Folder = New-Item -ItemType Directory -Path "DRIVE2:\Folder1\Folder2\$((Get-Date).ToString('yyyy-MM-dd'))"
For Moving the files
Get-ChildItem 'DRIVE1:\Folder1\Folder2*.7z' | Copy-Item -Destination $Folder
Can someone point out where I am being a complete nugget with this?
Thanks in advance
RobD
If the files to copy are in DRIVE1:\Folder1\Folder2, then you are missing a slash between the folder and the wildcard.
Get-ChildItem 'DRIVE1:\Folder1\Folder2\*.7z' | Copy-Item -Destination $Folder
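Putting the corrected pieces together, a sketch of the whole nightly job (the drive and folder names are the placeholders from the question; using Move-Item rather than Copy-Item, and the "yyyy_MM_dd" folder format the poster asked for, are my assumptions):
$sevenZip = "C:\Program Files\7-Zip\7z.exe"
$dateName = (Get-Date).ToString('yyyy_MM_dd')   # the format requested above
$Folder = New-Item -ItemType Directory -Path "DRIVE2:\Folder1\Folder2\$dateName" -Force

# Archive each .bak into its own .7z next to the original
Get-ChildItem 'DRIVE1:\Folder1\Folder2\*.bak' | ForEach-Object {
    & $sevenZip a -t7z -mx3 "$($_.FullName).7z" $_.FullName
}

# Move (not copy) the archives into the dated folder
Get-ChildItem 'DRIVE1:\Folder1\Folder2\*.7z' | Move-Item -Destination $Folder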