I am trying to sync two folders with PowerShell.
Comparing and copying any new files works just fine, but I want to additionally copy all files that were modified in the reference folder.
The following code works and copies all new files that were created in the reference folder:
$folderReference = 'C:\Users\Administrator\Desktop\TestA'
$folderToSync = 'C:\Users\Administrator\Desktop\TestB'
$referenceFiles = Get-ChildItem -Recurse -Path $folderReference
$folderSyncFiles = Get-ChildItem -Recurse -Path $folderToSync
$fileDiffs = Compare-Object -ReferenceObject $referenceFiles -DifferenceObject $folderSyncFiles
foreach ($file in $fileDiffs) {
    try {
        if ($file.SideIndicator -eq "<=") {
            $fullSourceObject = $file.InputObject.FullName
            $fullTargetObject = $file.InputObject.FullName.Replace($folderReference, $folderToSync)
            Write-Host "copy File: " $fullSourceObject
            Copy-Item -Path $fullSourceObject -Destination $fullTargetObject
        }
    }
    catch {
        Write-Error -Message "Something went wrong!" -ErrorAction Stop
    }
}
Now I also want to copy the modified files.
I tried -Property LastWriteTime after the Compare-Object, but I get a WriteErrorException when running the code.
Do you guys have some tips on how to get this code to run properly?
Thanks in advance
I'd just use robocopy; it's built specifically for this type of task and is included in most modern versions of Windows by default:
robocopy C:\Source C:\Destination /Z /XA:H /W:5
/Z - resumes copy if interrupted
/XA:H - ignores hidden files
/W:5 - shortens wait for failures to 5 sec (default 30)
It's worth taking a look through the documentation, as there are many different options for practically every situation you can think of.
For example, add /MIR and it will remove any files from the destination when they are deleted from the source.
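If you want to stay in PowerShell instead, here is a minimal sketch of the Compare-Object approach from the question. The key assumption is that comparing on Name and LastWriteTime together with -PassThru keeps the original FileInfo objects, so FullName is still usable afterwards (losing FullName to the bare property bags that -Property produces on its own is the likely cause of the WriteErrorException):
$referenceFiles = Get-ChildItem -Recurse -File -Path $folderReference
$folderSyncFiles = Get-ChildItem -Recurse -File -Path $folderToSync
# Compare on name and timestamp so modified files show up as differences too;
# -PassThru emits the original FileInfo objects with SideIndicator attached
$fileDiffs = Compare-Object -ReferenceObject $referenceFiles -DifferenceObject $folderSyncFiles -Property Name, LastWriteTime -PassThru
foreach ($file in $fileDiffs) {
    if ($file.SideIndicator -eq "<=") {
        $target = $file.FullName.Replace($folderReference, $folderToSync)
        # Create the target directory first in case the file sits in a new subfolder
        New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
        Copy-Item -Path $file.FullName -Destination $target -Force
    }
}
One caveat: comparing on Name alone can mis-pair files that share a name in different subfolders, so for deep trees you may want to compare on a relative path instead.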
Right at the beginning I should note that I am a bloody beginner, because I can't attend IT classes in my grade.
I want to create a PowerShell script which will copy everything from
C:\Users\Robert\Desktop\test(lots of folders)
to
C:\Users\Robert\Desktop\neu(lots of folders with the exact same names as above)\price
As an absolute beginner, I thought it would be OK to replace the variable folder name with $_Name, because it is the same name in both, but I am obviously wrong and don't know why.
Here is my attempt
Copy-Item "C:\Users\Robert\Desktop\test\$_name\*" -Destination "C:\Users\Robert\Desktop\neu\$_Name\price" -Recurse
It is copying something, but everything ends up lumped together in one new folder in "neu".
I can't avoid creating this script, because doing it by hand would take me at least two or three days.
I am also sorry for my poor English skills.
Thank you
the $_ represents the current pipeline item. i don't see a pipeline in there ... [grin]
the following works by grabbing every file in the source dir & its subdirs, and copying that structure to the destination dir. it uses Splatting to structure the parameters neatly.
$SourceDir = "$env:TEMP\Apps - Copy"
$DestDir = "$env:TEMP\Apps - Copy - Two"
$CI_Params = @{
    LiteralPath = $SourceDir
    Destination = $DestDir
    Force       = $True
    Recurse     = $True
}
Copy-Item @CI_Params
If my understanding is correct:
$src = 'C:\Users\Robert\Desktop\test'
$dst = 'C:\Users\Robert\Desktop\neu\{0}\price'
Get-ChildItem $src -Directory | ForEach-Object {
    Copy-Item -Path "$($_.FullName)\*" -Destination ($dst -f $_.BaseName) -Recurse -Force -WhatIf
}
Remove -WhatIf to actually do it.
Good day, all. New member here and relatively new to PowerShell, so I'm having trouble figuring this one out. I have searched for two days now but haven't found anything that quite suits my needs.
I need to copy folders created on the current date to another location using mapped drives. These folders live under 5 other folders, based on language.
Folder1\Folder2\Folder3\Folder4\chs, enu, jpn, kor, tha
The folders to be copied all start with the same letters followed by numbers - abc123456789_111. With the following script, I don't need to worry about folder names because only the folder I need will have the current date.
The folders that the abc* folders live in have about 35k files and over 1500 folders each.
I have gotten all of this to work using Get-ChildItem, but it is so slow that the developer could manually copy the files by the time the script completes. Here is my script:
GCI -Path $SrcPath -Recurse |
    Where {$_.LastWriteTime -ge (Get-Date).Date} |
    Copy -Destination {
        if ($_.PSIsContainer) {
            Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.length)
        } else {
            Join-Path $DestPath $_.FullName.Substring($SrcPath.length)
        }
    } -Force -Recurse
(This only copies to one destination folder at the moment.)
I have also been looking into using cmd /c dir and cmd /c forfiles but haven't been able to work it out. Dir will list the folders but not by date. Forfiles has turned out to be pretty slow, too.
I'm not a developer but I'm trying to learn as much as possible. Any help/suggestions are greatly appreciated.
@BaconBits is right: you have -Recurse on your Copy-Item as well as your Get-ChildItem. This will cause a lot of extra, pointless copies which are just overwrites due to your -Force parameter. Change your script to do a foreach loop and drop the -Recurse parameter from Copy-Item:
GCI -Path $SrcPath -Recurse |
    Where {$_.LastWriteTime -ge (Get-Date).Date} | % {
        # Build the destination path first; a script block for -Destination
        # only delay-binds when Copy-Item itself receives pipeline input
        if ($_.PSIsContainer) {
            $dest = Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.Length)
        } else {
            $dest = Join-Path $DestPath $_.FullName.Substring($SrcPath.Length)
        }
        Copy -Path $_.FullName -Destination $dest -Force
    }
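If it is still too slow, a possible alternative (just a sketch, assuming $SrcPath points at the Folder4 level and the abc* folders sit directly under each language folder) is to filter directories first instead of recursing through all 35k files:
$langs = 'chs', 'enu', 'jpn', 'kor', 'tha'
foreach ($lang in $langs) {
    # Look only one level deep for today's abc* folders instead of a full recurse
    Get-ChildItem -Path (Join-Path $SrcPath $lang) -Directory -Filter 'abc*' |
        Where-Object { $_.LastWriteTime -ge (Get-Date).Date } |
        ForEach-Object {
            $dest = Join-Path $DestPath $lang
            # Make sure the language folder exists at the destination,
            # then copy the matched folder and its contents in one call
            New-Item -ItemType Directory -Path $dest -Force | Out-Null
            Copy-Item -Path $_.FullName -Destination $dest -Recurse -Force
        }
}
If the folders' LastWriteTime can change after creation in your environment, filter on CreationTime instead.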
We got hit by a virus that changed all common file extensions to .kmybamf (.txt >> .txt.kmybamf), and if I just delete the .kmybamf, the file stays damaged...
So I made a list of all files that got damaged. Now I'm trying to overwrite them with their previous versions. Does anyone know how to do that in PowerShell?
I can do it in cmd with something similar to this:
subst X: \\localhost\D$\#GMT-2011.09.20-06.00.04_Data
robocopy X: D:\Folder\ /E /COPYALL
But I want to do it in one shot in PowerShell. It has to be "if .kmybamf found, then restore previous version", and PowerShell seems to have no cmdlet for restoring previous versions of files or folders.
$fileList = Get-Content -Path "\\localhost\D$\#GMT-2011.09.20-06.00.04_Data"
$destinationFolder = "D:\Folder\"
foreach ($file in $fileList)
{
    Copy-Item -Path $file -Destination $destinationFolder -Force
}
This will also work, but I find it less readable:
Get-Content -Path "\\localhost\D$\#GMT-2011.09.20-06.00.04_Data" | ForEach-Object { Copy-Item -Path $_ -Destination "D:\Folder" -Force }
Get-Content is for reading the text inside files; to read the files from a folder you would have to use Get-ChildItem.
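If you want to scan the disk for damaged files directly instead of reading a prepared list, here is a rough sketch, assuming the snapshot at \\localhost\D$\#GMT-2011.09.20-06.00.04_Data mirrors the layout of D: and that stripping the .kmybamf suffix yields the original path:
$shadow = '\\localhost\D$\#GMT-2011.09.20-06.00.04_Data'
# Find every damaged file, then pull the clean copy out of the snapshot
Get-ChildItem -Path 'D:\Folder' -Recurse -Filter '*.kmybamf' | ForEach-Object {
    $original = $_.FullName -replace '\.kmybamf$', ''   # path the clean file should have
    $previous = $shadow + $original.Substring(2)        # same relative path inside the snapshot
    if (Test-Path -LiteralPath $previous) {
        Copy-Item -LiteralPath $previous -Destination $original -Force
    }
}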
I have a program which generates reports by reading an .XML file, and I have to generate reports for multiple files.
The problem I am facing is that the program reads only one file per run, so I have to run it once for each file.
Is there any way I can generate reports for multiple files in one click?
So far I have tried the code below:
$a = Get-ChildItem "D:\Directory1\Files\*.xml"
foreach ($i in $a)
{
    Move-Item $i "D:\Directory1\"
    if ($a) {
        D:\Directory1\Program1.exe /run /exit /SilentMode
    }
}
With the above code I am trying to read the files from "D:\Directory1\Files\", move one file at a time (not all files) to "D:\Directory1\", start Program1.exe to generate the report, and repeat until no .xml files are left in "D:\Directory1\Files\".
Is your goal to copy all files from D:\Directory1\Files\ to D:\Directory1\ in one step and then run D:\Directory1\Program1.exe /run /exit /SilentMode?
EDIT:
Does this work for you?
0. Set the location where your program works
1. Get all files
2. For each file:
3. Move the file to the new location
4. Start your program
5. Remove the moved file
Set-Location -Path "D:\Directory1\"
$arrFiles = Get-ChildItem -Path "D:\Directory1\Files\*.xml"
Foreach ($objFile in $arrFiles) {
    Move-Item -Path $objFile.FullName -Destination "D:\Directory1\$($objFile.Name)"
    Start-Process -FilePath "D:\Directory1\Program1.exe" -ArgumentList "/run /exit /SilentMode" -Wait
    Remove-Item -Path "D:\Directory1\$($objFile.Name)"
}
Your logic here was sound; however, one issue is that the script continues processing even while Program1.exe is running, making it possible for it to seemingly skip files. Also, your if statement just checks whether $a contains data, which it always will in your example, so the condition check is moot.
What you can do is something like this:
$moveLocation = "D:\Directory1\"
Get-ChildItem "D:\Directory1\Files\*.xml" | ForEach-Object{
# Move the file to its new location
Move-Item -Path $_.FullName -Destination $moveLocation
Start-Process -FilePath "D:\Directory1\Program1.exe" -ArgumentList "/run /exit /SilentMode" -Wait
}
I wrote a PowerShell script which does the following steps:
It robocopies files from a server to an external hard drive (incremental backup).
The next time it runs, it checks whether any files were deleted on the server; if so, it moves those files from the backup folder to a folder called _DeletedFiles on the backup hard drive.
It runs RoboCopy with /MIR (which will also delete on the backup the files which were deleted on the server, and that's okay because I have already saved them in the _DeletedFiles folder).
The point of this _DeletedFiles folder is that even if someone deletes a file, we want to keep it somewhere for at least 60 days.
The real script is a little more complex, including writing to a log file, testing paths, a first-run if statement, etc.
All seems to work except the step where I want to copy the files which were deleted on the server from the backup to a new folder.
This step looks similar to this:
$testfolder = Test-Path "$UsbDisk\$backupFolder"
# If a backup folder already exists, compare files and see if any changes have been made
if ( $testfolder -eq $true ) { #IF_2
    # Copy deleted files to backup folder
    MyLog "Check for Deleted Files on E:\" 0
    $source = "E:\"
    $sourcelist = Get-ChildItem $source -Recurse
    $destination = "$UsbDisk\$backupFolder\Data_01\_DeletedFiles"
    foreach ($file in $sourcelist) {
        $result = Test-Path -Path "E:\*" -Include $file
        if ($result -like "False") {
            Copy-Item $file -Destination "$destination"
        }
    }
    # Start synchronizing E:\
    MyLog "Start Synchronizing E:\" 0
    Robocopy "E:\" "$UsbDisk\$backupFolder\Data_01" /mir /r:2 /w:3 /M /XD VM_*
    MyLog "E:\ is up to Date" 0
    # Copy deleted files to backup folder
    MyLog "Check for Deleted Files on F:\" 0
    $source = "F:\"
    $sourcelist = Get-ChildItem $source -Recurse
    $destination = "$UsbDisk\$backupFolder\Data_02\_DeletedFiles"
    foreach ($file in $sourcelist) {
        $result = Test-Path -Path "F:\*" -Include $file
        if ($result -like "False") {
            Copy-Item $file -Destination "$destination"
            # Then delete it from the backup
        }
    }
    # Start synchronizing F:\
    MyLog "Start Synchronizing F:\" 0
    Robocopy "F:\" "$UsbDisk\$backupFolder\Data_02" /mir /r:2 /w:3 /M /XD VM_*
    MyLog "F:\ is up to Date" 0
}
The error I get is that files can't be copied because they do not exist at the destination; moreover, it tries to copy files which shouldn't be copied in the first place.
I wonder if anyone has an idea to solve this more elegantly, or how to fix my code snippet.
I think the problem may be in the Test-Path commands. I would replace -Include $file with -Include $file.Name, because the -Include parameter expects a string, not a FileInfo object.
And in the interests of code maintainability, I would also replace ($result -like "False") with (-not $result), because $result contains a boolean value ($true or $false), not a string.
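Applied to the E:\ block above, the corrected loop would look something like this sketch (the F:\ block gets the same treatment):
foreach ($file in $sourcelist) {
    # -Include expects a string pattern (the file name), not a FileInfo object
    $result = Test-Path -Path "E:\*" -Include $file.Name
    if (-not $result) {
        Copy-Item -Path $file.FullName -Destination $destination
    }
}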