I've created a script, based on examples I found in other forums, to copy files from one server to another. A bit of background: we have large PDF files generated nightly and saved on a share at one site that need to be copied over to our corporate share. These files are sizable (anywhere from 25MB to 65MB), so I can only copy them off hours. The source repository holds all the original files from the year, so I only want to copy the most recent ones. I created a script (or tried to, at least) that copies only the most recent files from the SourceHost location to the CorpHost share, and set up a Task Scheduler job to run at 7:30pm.
The script kicks off and runs as scheduled, but nothing gets copied over. I don't see any errors from Task Scheduler, and the run looks normal except that the script just returns "not copying .pdf" messages. Originally, I thought it might be skipping all the files because their generation date falls outside the $Max_days range (-1), so I increased it to -2. No luck. Increased it again to -5 - no luck. -10... nothing.
Here's the code sample:
$RemotePath = "\\<CorpHost>\Shared\Packing_Slips\<Site>"
$SourcePath = "\\<SourceHost>\<Site>_packingslips"
$Max_days = -1
$Curr_date = Get-Date

# Check each file's date, then copy it from SourcePath to RemotePath
foreach ($file in (Get-ChildItem $SourcePath))
{
    if ($file.LastWriteTime -gt ($Curr_date).AddDays($Max_days))
    {
        Copy-Item -Path $file.FullName -Destination $RemotePath
    }
    else
    {
        "not copying $file"
    }
}
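For debugging, a sketch like the following might show what the scheduled task actually sees at 7:30pm, since the account and environment can differ from an interactive session (the log path here is just a hypothetical example, not part of the original script):

# Hypothetical diagnostic sketch: log what the scheduled task actually sees
Start-Transcript -Path "C:\Temp\copy_packing_slips.log" -Append
$cutoff = (Get-Date).AddDays(-1)
"Cutoff: $cutoff"
foreach ($file in (Get-ChildItem $SourcePath)) {
    "{0}  LastWriteTime={1}  copy={2}" -f $file.Name, $file.LastWriteTime, ($file.LastWriteTime -gt $cutoff)
}
Stop-Transcript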
So I am trying to copy files from one folder to another. The files in the source folder are overwritten every minute by another program. I want to grab a copy of each file every minute, before it gets overwritten, and save it somewhere else. See the example structure below:
Folder 1 # gets overwritten every minute
a.txt
a_backup.txt
Folder 2
a1.txt
a1_backup.txt
a2.txt
a2_backup.txt
a3.txt
a3_backup.txt
a4.txt
a4_backup.txt
It would be even better if the file names in Folder 2 contained the date and time of when they were copied.
I came up with the following:
$Source = 'C:\Users\Source'
$Destination = 'C:\Users\Target'

do {
    Copy-Item $Source\* -Destination $Destination
    Start-Sleep -Seconds 59
} while ($true)
However, this does not do the whole job: each pass just copies the same file again and overwrites the previous copy in the destination, so I never accumulate the history I'm after...
Any help is warmly welcome!
I'm new to giving answers, but here's my proposal.
Get-Content, then Out-File to another file with the current time in its name, maybe? Of course, include your loop around it:
Get-Content C:\log.txt | Out-File "Log.$([System.Math]::Round((date -UFormat %s),0)).txt"
Get-ChildItem -Path $Source | ForEach-Object { Copy-Item $_.FullName -Destination "$Destination\$((Get-Date).ToString("MMddyyyy-HHmmss"))$($_.Name)" }
This statement will take care of it, but it will not suppress the exceptions and failures you will get if a file happens to be mid-write when you copy it.
$_.FullName includes the full path, so it can be used as the source.
$_.Name gives you the file name only (without the path).
(Get-Date).ToString("MMddyyyy-HHmmss") gives you the date in the format specified in ToString(). Since the file is updated every minute, you will need to include the minutes and seconds in the file name as well.
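Putting the pieces together, a sketch of the copy inside the minute loop might look like this (based on the snippets above; the try/catch is my addition so mid-write failures are surfaced rather than hidden):

$Source = 'C:\Users\Source'
$Destination = 'C:\Users\Target'

do {
    $stamp = (Get-Date).ToString("MMddyyyy-HHmmss")
    Get-ChildItem -Path $Source -File | ForEach-Object {
        try {
            # Prefix each copy with the timestamp so earlier copies are never overwritten
            Copy-Item -Path $_.FullName -Destination "$Destination\${stamp}_$($_.Name)" -ErrorAction Stop
        }
        catch {
            Write-Warning "Copy failed: $($_.Exception.Message)"
        }
    }
    Start-Sleep -Seconds 59
} while ($true)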
I am working out a scenario using PowerShell.
As of now, I am trying to work out a scenario where I have a batch job which, upon successful execution, first creates a date folder for that particular day and then creates a .CSV file under it. The folder structure then looks like below.
\\Server\SourceA\Processed\20200120\TestA.CSV
When the job runs the next day, it creates another folder and file, like below.
\\Server\SourceA\Processed\20200121\TestB.CSV
This is how many folders have already been created over time.
My PS script is fine to run daily after the batch job completes: I read a date, append it to the path, and it copies files from the source to the destination folder. But I want to enable my PS script to read all the previous date folders created under
\\Server\SourceA\Processed\
Another tricky part: under each date folder there are a few other subfolders, i.e.
\\Server\SourceA\Processed\20191010\Log
\\Server\SourceA\Processed\20191010\Charlie
\\Server\SourceA\Processed\20191010\Alpha
\\Server\SourceA\Processed\20191010\Delta
Among them, I only need to read files from the Log folder.
Hence, my actual source path becomes:
\\Server\SourceA\Processed\20191010\Log\TestA.CSV
Here is my script (which is static right now and unable to read the previously created date folders).
$fullSourceFileName = "\\Server\SourceA\Processed\"
$date = Get-Date -Format "yyyyMMdd"
$fullSourceFileName = "$($fullSourceFileName)$($date)\Log"
$destination = "\\Server\DestA\Processed\"
$destination = "$($destination)$($date)\"
Get-ChildItem -Path $fullSourceFileName -Recurse | Copy-Item -Destination $destination
Your help is highly appreciated.
I did not know I could use a foreach loop in PowerShell.
So, here is the answer that reads all the dynamic date folders under my given path.
I hope this helps the community.
$fullSourceFileName = "\\Server\SourceA\Processed\"
$DirToRead = "\Log\"
$dates = Get-ChildItem -Path $fullSourceFileName -Directory
$destination = "\\Server\DestA\Processed\"

foreach ($date in $dates) {
    $ActualPath = "$($fullSourceFileName)$($date)$($DirToRead)"
    if (!(Test-Path $ActualPath))
    {
        Write-Output "$ActualPath source path does not exist"
    }
    else
    {
        Get-ChildItem -Path $ActualPath -Recurse | Copy-Item -Destination "$($destination)$($date)\"
    }
    $ActualPath = ""
}
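One caveat (my assumption, not part of the original answer): Copy-Item will fail if the matching date folder does not yet exist under the destination share, so it may be worth creating it first. A sketch of the loop with that added:

foreach ($date in $dates) {
    $ActualPath = "$($fullSourceFileName)$($date)$($DirToRead)"
    $targetPath = "$($destination)$($date)\"
    if (Test-Path $ActualPath) {
        # Create the destination date folder if it is missing, then copy the Log files into it
        New-Item -Path $targetPath -ItemType Directory -Force | Out-Null
        Get-ChildItem -Path $ActualPath -Recurse | Copy-Item -Destination $targetPath
    }
}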
I have a PowerShell script issuing the Move-Item command to move some 3D CAD directories and subdirectories from one network share to another, about 300 parent folders a day. I am having the strangest problem. All parent folders copy to the destination just fine, but about 10 of the 300 objects are not removed from the source. It seems that any subdirectory containing a certain type of PNG scan file will not be removed from the source; it copies fine, it just isn't removed. No error is reported in the -Verbose log output. The file sizes are not large, < 50MB each. Doing some testing, I also noticed that if I run the script once, the problem occurs, but if I run it again with no changes, it moves the remaining objects that it did not move on the initial run. Another observation: if I run the script with the source and destination on the same share, the problem does not occur. It only happens when the source and destination are on different shares. The problem is the same with PowerShell versions 5.1 and 4.0.
I also tried some simple PNG files I created myself just by saving a JPG screen print as a PNG, and those copy and remove fine. So something is special about these 3D CAD PNG files. I even tried copying the parent folder off to another share first, to make sure no one and no program was locking the original files, and the problem still happens.
I have been successful at replacing the Move-Item command with a robocopy command, and that works perfectly, so I could use robocopy to solve my problem, but I want to figure out why Move-Item is behaving this way. My only theory is that within the same share no data is actually moved, only the reference to the file (the directory entry) is changed, whereas when crossing shares the actual file data has to be copied and the source then deleted, and something about these files prevents that from completing. But as stated above, if I run the script a second time, the remaining files copy and remove fine, so that leaves a little unexplained.
Just wondering if anyone has any ideas.
See the code samples below, and also some screen prints of the source folder structure before and after.
$sLogPath = 'C:\PowerShell_scripts\test\robocopy_test.log'
$StagingFolder = '\\3ipbg-fs-p01\PSP_Inbound\Test\3Shape_Auto_Create\Suspense\test_case_01'
$FinalFolder_M = '\\3ipbg-fs-p01\patient specific mfg\3Shape_AutoCreate\Auto_Job_Create_Test\Inbox_Manual_Jobs'
$FinalFolder_M_robo = '\\3ipbg-fs-p01\patient specific mfg\3Shape_AutoCreate\Auto_Job_Create_Test\Inbox_Manual_Jobs\test_case_01'
# Move-Item approach (leaves some source folders behind)
Move-Item -LiteralPath $StagingFolder -Destination $FinalFolder_M -Verbose -Force

# robocopy equivalent that moves everything reliably
robocopy $StagingFolder $FinalFolder_M_robo /R:3 /W:3 /E /B /J /MIR /MOVE /LOG+:$sLogPath
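For what it's worth, a sketch like this might capture non-terminating errors that the -Verbose output alone can miss, and show why the source folders are left behind (the error-log path is just a hypothetical example):

# Capture non-terminating errors from Move-Item instead of relying on -Verbose output alone
Move-Item -LiteralPath $StagingFolder -Destination $FinalFolder_M -Force -ErrorAction Continue -ErrorVariable moveErrors
if ($moveErrors) {
    # Hypothetical log path for the captured errors
    $moveErrors | Out-File -FilePath 'C:\PowerShell_scripts\test\move_errors.log' -Append
}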
You are using the /R:3 parameter with robocopy, meaning it retries up to 3 times if an error occurs.
I am not aware of such functionality in PowerShell, but you could easily write a small utility function for this.
Assumption: inside the loop, the function checks whether the source folder still exists; if it does, it retries the move, up to $retries times, sleeping 2 seconds between attempts.
function Custom-Move {
    param( [string]$source, [string]$destination, [int]$retries )

    $i = 0
    while ($i -ne $retries) {
        Move-Item -LiteralPath $source -Destination $destination -Verbose -Force -ErrorAction SilentlyContinue
        # Test whether the source folder still exists; if not, the move succeeded.
        if (-not (Test-Path $source)) { break }
        $i++
        Start-Sleep 2
    }
}
Usage:
Custom-Move $StagingFolder $FinalFolder_M 5
I want to copy folders with their contents to a remote computer using a PSSession and Copy-Item. When the script runs for the first time it has to create the destination folder; it does so correctly and is then supposed to place the folders, with their contents, inside that destination folder. Instead, it places two of the folders correctly and then dumps only the contents of the third folder, not the folder itself. When I run it a second time, without deleting the destination folder, everything works fine.
I have tried various parameters, including -Container, but nothing seems to help. Here is where I use the function in my code. I use a lot of environment variables, and variables in general, because this needs to be a script that can be put anywhere and work.
if (Test-Path -Path "$env:TEMP\VMlogs") {
    Write-Host "I'M GONNA SEND IT!"; Pause
    Copy-Item -Path "$env:TEMP\VMlogs\*" -ToSession $Targetsession -Destination "$($Destination)_$($Source)" -Force -Recurse
    Write-Host "Logs copied successfully!"
    Remove-Item "$env:TEMP\VMlogs" -Recurse
} else {
    Write-Host "There was an issue copying logs!"
    Pause
    Exit
}
What I expect is that the folders are placed into the destination folder with their structure intact, but instead this only happens on the second run of the script, after the destination folder has already been created.
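A sketch of one possible workaround, assuming the problem is simply that the destination folder needs to exist before the copy (untested here; variable names follow the snippet above):

# Make sure the destination folder exists on the remote machine first,
# then copy the folders into it so their structure is preserved.
Invoke-Command -Session $Targetsession -ScriptBlock {
    param($path)
    New-Item -Path $path -ItemType Directory -Force | Out-Null
} -ArgumentList "$($Destination)_$($Source)"

Copy-Item -Path "$env:TEMP\VMlogs\*" -ToSession $Targetsession -Destination "$($Destination)_$($Source)" -Force -Recurse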
Quick need: Within a SQL Agent job step, I am looking for a way to copy files newer than 60 minutes "ago" to another server. I don't want to re-copy any files older than that. So, copy, xcopy, robocopy are all possibilities as this is a Windows 2008 or higher server.
Background: I'm wiring up a process where an ERP application dumps flat text files into a folder on serverA. I need to copy the "newest files" once per hour to serverB so that another application (an SSIS package that kicks off every 60 minutes) can process each file and save the data into SQL Server. To copy only "new" files and not re-copy anything I've already copied (an "if exists" check won't work because I remove the copied file after SSIS processes it), I basically need to copy files that are 60 minutes old or newer and exclude everything else.
For what it's worth, the method used will be a SQL Agent job step, so CmdExec and PowerShell are both allowed (I am new to PowerShell, so I am leaning toward robocopy).
My solution (and my first PowerShell script) was to pipe the output of 'Get-ChildItem' through the 'Where-Object' cmdlet to select only the files that are 60 minutes old or newer, and then use 'Copy-Item' inside a foreach, as below:
$srcPath = '\\serverA\ERP\Outbox\'
$destPath = 'C:\ERP\FromERP\Inbox\'
# target files where LastWriteTime >= 60 minutes "ago"
$age = (Get-Date).AddMinutes(-60)
#Write-Output "$age"
$newFiles = Get-ChildItem $srcPath | Where-Object { $_.LastWriteTime -ge $age }
#Write-Output "$newFiles"
foreach ($newFile in $newFiles) {
#Write-Output "Copying $newFile to $destPath"
Copy-Item $newFile.FullName -Destination "$($destPath)$($newFile)"
}
Some of the lines that are commented were simply for debugging purposes but I left them in to help in understanding what is going on.
Note: I experimented with copy, xcopy and robocopy, and I think that with some crafty batch syntax I saw elsewhere on SO I could have gotten them to work (robocopy has a /MAXAGE argument, but its lowest relative value is 1 day, which is not granular enough for minutes), but PowerShell felt simpler and more elegant.
Hopefully someone else can make use of the technique.
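If it helps anyone wiring this into SQL Agent: a CmdExec job step could call the script with something like the line below (the script path is just a hypothetical example):

powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Copy-NewErpFiles.ps1"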