Piping output from get-childitem to robocopy - powershell

I have an array of files that need to be moved to a backup location. I am collecting the desired items with a Get-ChildItem command, and once the list of collected items is ready I want to use robocopy to move them.
$paths = @()
$srcitems = get-childitem $paths
robocopy $srcitems $dest /move
Does this work?
If not, what is the best way to pipe each individual item to robocopy?
Thanks
Steeluser

Usage :: ROBOCOPY source destination [file [file]...] [options]
source :: Source Directory (drive:\path or \\server\share\path).
destination :: Destination Dir (drive:\path or \\server\share\path).
file :: File(s) to copy (names/wildcards: default is "*.*").
Robocopy expects a source directory, a destination directory, and a file spec as arguments. It's difficult to give a definitive answer without knowing what your "list of collected items" looks like. If it's source directories, you can run that list through a foreach loop, invoking robocopy once per directory with a hardcoded wildcard spec for the file names. If you've got a list of files, you'll need to split each one into directory and file name (I'd use Split-Path) and invoke robocopy once per source directory, specifying the list of files in that directory.
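For example, if the collected items are files, a rough sketch of the second approach (using $srcitems and $dest from the question; grouping by parent directory is my assumption about how to batch the calls) could be:
# Assumes $srcitems (from the question) holds files and $dest is the destination directory
$srcitems | Group-Object { Split-Path $_.FullName -Parent } | ForEach-Object {
    $srcDir = $_.Name                          # parent directory shared by this group of files
    $files  = $_.Group | ForEach-Object Name   # just the file names in that directory
    robocopy $srcDir $dest $files /MOVE
}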

Similar scenario, posting in case it helps:
I needed to copy (move) some folders all beginning with "Friday", and the items within them, from a source to a destination, and came up with this, which seems to be working:
Get-ChildItem T:\ParentFolder -Filter "Friday*" -Name | ForEach-Object { robocopy "T:\ParentFolder\$_" "E:\$_" /z /s /MOVE }
The "Get-ChildItem" portion lists the folder names (-Name) starting
with "Friday" (-Filter "Friday*").
This gets piped to the ForEach-Object where robocopy will execute for every instance found.
The robocopy /MOVE argument obviously moves the folders/files.
I'm fairly new to Powershell; not sure if there is a better way. The script is still running, but so far so good.
@walid2mi uses Move-Item, which I'm sure works; I just like robocopy because it has a restartable mode (/Z).

Syntax that worked for me:
$srcPath = "C:\somefolder"
$dstPath = "C:\someOtherFolder"
$srcitems = get-childitem $srcPath #whatever condition
$srcitems | Select Name | ForEach-Object {robocopy $srcPath $dstPath $_.name}
(which follows from the source/destination/file argument order in the robocopy documentation)

Get-ChildItem $paths | Move-Item -Destination $dest -WhatIf

Related

Copying all Files in Subdirectories to a Single Folder with Robocopy Updating Capabilities

I am trying to copy all the files in a directory that contains many subfolders into a single separate folder. When the code is run again, rather than replacing each file in the destination folder, it should skip files that have the same timestamp and only replace those that are older.
I have used robocopy to skip copying files that are of the current version or older in the destination folder. However, robocopy only copies the entire directory along with its folder structure, so I am unable to obtain the desired single folder containing all the files from the source.
I have also used Get-ChildItem and then Copy-Item. However, although this gets rid of the folder structure, it overwrites every file on each run and is thus time-consuming.
So what I want is to combine the capabilities of robocopy and Copy-Item. Note that there is no specific pattern to the files I am copying. It is simply to COPY each file in the subdirectories that is EITHER of a NEWER version or NON-existing into a single folder.
#For copying and ease of updating destination folder
robocopy 'source' 'destination' /S /XO /PURGE /NP
#To copy items into the destination folder without keeping folder structure
Get-ChildItem -Path 'source' -Recurse -File | Copy-Item -Destination 'destination'
I was unable to combine the two, so I am stuck with the Copy-Item code, which is quite time-consuming when copying/updating large numbers of files.
The purpose of robocopy is to preserve the folder structure. If you want to mangle subfolders robocopy is not the right tool. Use the Get-ChildItem approach, group the results by file name, sort each group by date, pick the most recent file from each group, and copy it if the corresponding destination file either doesn't exist or is older.
Something like this should do what you want:
Get-ChildItem -Path 'C:\source' -Recurse -File |
    Group-Object Name |
    ForEach-Object {
        # pick the most recently written file among same-named files from different subfolders
        $src = $_.Group | Sort-Object LastWriteTime | Select-Object -Last 1
        $dst = Join-Path 'C:\destination' $src.Name
        if (-not (Test-Path $dst) -or ($src.LastWriteTime -gt (Get-Item $dst).LastWriteTime)) {
            $src | Copy-Item -Destination $dst
        }
    }

Using Robocopy in Powershell

I am having a challenge while trying to move a large number of files from one location to another. I have a list of folders in a .csv that I want to move to a new network location. When I use robocopy, only the contents of the folders are moved, not the top-level folder itself, which leaves the files unorganized. I have tried using Move-Item, but there is no good logging feature that I can get to work. So here I am.
I am trying to use robocopy but I need to create a destination folder based on the last part of the source path for each item in a list. I have been working on this for hours, please help.
GC -Path C:\Test_1\Test_List.csv |
ForEach-Object
{
$Destination = new-item NAME BASED ON PART OF FILE -itemtype directory |
Robocopy $_ $Destination /e /move /LOG+:C:\Test_2\Test_Copy_Log.txt /NP
}
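One way to build the destination name from the last part of each source path is Split-Path -Leaf. A sketch of that idea (assuming Test_List.csv is simply a list of folder paths, one per line, and with \\server\share\NewLocation standing in for your actual target root):
# Assumes Test_List.csv is a plain list of folder paths, one per line
Get-Content -Path C:\Test_1\Test_List.csv | ForEach-Object {
    $name        = Split-Path $_ -Leaf                            # last part of the source path
    $destination = Join-Path '\\server\share\NewLocation' $name   # hypothetical target root
    # robocopy creates the destination folder itself, so no New-Item is needed
    robocopy $_ $destination /e /move /LOG+:C:\Test_2\Test_Copy_Log.txt /NP
}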

Move-Item from sub folder to second subfolder using variable destination

I'd like to move all the files in multiple folders into a subfolder within those folders.
So from C:\Desktop\Folder to C:\Desktop\Folder\cv.
I'm having difficulty targeting the correct destination; so far I've tried:
Get-ChildItem *cv -Recurse -Exclude "cv" | Move-Item -Destination {"C:\Desktop\$($_.Name)\cv"}
I'm using Windows PowerShell.
EDIT: For those interested I've found a way to do this using CMD:
FOR /D %G IN ("*cv") DO robocopy /move "%~G" "%~G\cv" "*"
I'd still like to know how to do this using PowerShell, though.
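For anyone after the same thing in PowerShell, a rough equivalent of that CMD loop (assuming, as the CMD version does, that the folders matching *cv sit under the current directory) would be:
# For each directory matching *cv, move its top-level files into a "cv" subfolder of itself
Get-ChildItem -Directory -Filter '*cv' | ForEach-Object {
    robocopy $_.FullName (Join-Path $_.FullName 'cv') * /MOVE
}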

Powershell Copy files and folders

I have a PS script which zips up the previous month's logs and names the zip file FILENAME-YYYY-MM.zip.
This works.
What I now want to do is copy these zip files off to a network share while keeping some of the folder structure. I currently have a folder structure similar to the following:
C:\Folder1\
C:\Folder1\Folder2\
C:\Folder1\Folder3\
C:\Folder1\Folder4\Folder5\
There are .zip files in every folder below c:\Folder1
What I want is for the script to copy files from c:\folder1 to \\networkshare but keeping the folder structure, so I should have 3 folders and another subfolder in folder4.
Currently I can only get it to copy the whole structure so I get c:\folder1\... in my \\networkshare
I keep running into issues, such as the new folder structure not existing, not being able to use the -Recurse switch within the Get-ChildItem command, etc.
The script I have so far is;
#This returns the date and formats it for you set value after AddMonths to set archive date -1 = last month
$LastWriteMonth = (Get-Date).AddMonths(-3).ToString('MM')
#Set destination for Zip Files
$DestinationLoc = "\\networkshare\LogArchive\$env:computername"
#Source files
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | where-object {$_.lastwritetime.month -le $LastWriteMonth}
Copy-Item $SourceFiles -Destination $DestinationLoc\ZipFiles\
Remove-Item $SourceFiles
Sometimes, you just can't (easily) use a "pure PowerShell" solution. This is one of those times, and that's OK.
Robocopy will mirror directory structures, including any empty directories, and select your files (likely faster than a filter with get-childitem will). You can copy anything older than 90 days (about 3 months) like this:
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" /E /IS /MINAGE:90 *.zip
You can specify an actual date with /MINAGE too, if you have to be that precise.
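For instance, robocopy treats a /MINAGE value of 1900 or more as a YYYYMMDD date rather than a number of days, so a date-based version of the same command (with an illustrative cutoff date) would look like:
# copy only zips last written before 1 Jan 2015 (illustrative date)
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" /E /IS /MINAGE:20150101 *.zip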
How about Copy-Item "C:\SourceFiles\" -Destination $DestinationLoc\ZipFiles -Container -Recurse? I have tested this and found that it copies the folder structure intact. If you only need *.zip files, first get them, then for each one call Resolve-Path with the -Relative flag set and append the resulting path to the -Destination parameter.
$oldloc=get-location
Set-Location "C:\SourceFiles\" # required for relative
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | where-object {$_.lastwritetime.month -le $LastWriteMonth}
$SourceFiles | % {
    $p = Resolve-Path $_.fullname -relative
    copy-item $_ -destination "$DestinationLoc\ZipFiles\$p"
}
set-location $oldloc # return back

Powershell moving files but backup the files before replacing

I have three directories:
1. RFC
2. Source
3. Backup
RFC contains files and folders (that contain files) that I need to replace in the Source folder, but before I replace/move the files I need to back up the files I'm replacing from Source to the Backup folder.
I have written the following code, which compares RFC and Source and copies the files to Backup, but it doesn't copy subdirectories. I want it to move files within the subdirectories as well, keeping the same folder structure as Source. Once the copy of the files is done, I want to move the files from RFC to Source.
Any help would be highly appreciated.
$source = "C:\Scripts\Source\"
$backup = "C:\Scripts\Destination\"
$rfc_dir = "C:\Scripts\RFC000001234\"
$folder1Files= dir $source
$folder2Files= dir $rfc_dir
compare-object $folder1Files $folder2Files -property name -includeEqual -excludeDifferent | ForEach-object {
    copy-item "$source\$($_.name)" -Destination "$backup" -Force -recurse
}
UPDATE
TL;DR -- the script
$RFC_Folder = 'c:\scripts\rfc'
$SOURCE_Folder = 'c:\scripts\source'
$BACKUP_Folder = 'c:\scripts\backup'
$rfc = get-ChildItem -File -Recurse -Path $RFC_Folder
$source = get-ChildItem -File -Recurse -Path $SOURCE_Folder
compare-Object -DifferenceObject $rfc -ReferenceObject $source -ExcludeDifferent -IncludeEqual -Property Name -PassThru | foreach-Object {
    # copy SOURCE to BACKUP
    $backup_dest = $_.DirectoryName -replace [regex]::Escape($SOURCE_Folder),$BACKUP_Folder
    # create directory, including intermediate paths, if necessary
    if ((test-Path -Path $backup_dest) -eq $false) { new-Item -ItemType Directory -Path $backup_dest | out-Null }
    copy-Item -Force -Path $_.FullName -Destination $backup_dest
    # copy RFC to SOURCE
    $rfc_path = $_.FullName -replace [regex]::Escape($SOURCE_Folder),$RFC_Folder
    copy-Item -Force -Path $rfc_path -Destination $_.FullName
}
The explanation:
Given the OP's comment below that ROBOCOPY is not preferable, I've updated the answer.
Your posted script is basically on the right track, however instead of using just $backup you have to get a little fancy with the -Destination parameter. You don't want the -Destination to be the static path c:\scripts\backup you want it to update based on where the source file actually was. For example if the file was in c:\scripts\source\subdir1\subdir2, you'd want -Destination to be c:\scripts\backup\subdir1\subdir2.
$_.FullName will be the path to the $SOURCE_Folder as it was used as the -ReferenceObject. It's getting manipulated stringwise to create the desired RFC and Backup paths. The [regex]::Escape static method is being used because the -replace operator does a regular expression operation on the strings, and several characters in paths need to be escaped (the slash, mainly). All it's doing is turning c:\scripts\source into a regular expression escaped version: c:\\scripts\\source
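As a quick illustration of that replace (using the example paths from above):
# 'c:\scripts\source\subdir1\subdir2' becomes 'c:\scripts\backup\subdir1\subdir2'
'c:\scripts\source\subdir1\subdir2' -replace [regex]::Escape('c:\scripts\source'), 'c:\scripts\backup'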
The if construct is used because copy-Item doesn't create intermediate directories, but new-Item does.
Depending on your specifics this might work as is, but you may have to alter it. For example if somehow a directory can end up in RFC that wasn't in SOURCE, this wouldn't catch that. It also won't catch any empty directories that are in SOURCE and RFC, if that's important.
It may be that PowerShell isn't the best tool for this job, depending on some other factors. As I understand it you have Source, RFC and Backup folders. The RFC folder will contain changes that need to be committed to Source, but before you do that you want to backup Source to Backup. If the folder structures are all similar between them, then perhaps the command line tool ROBOCOPY could do what you need?
If some of my assumptions are correct, you'd first mirror the Source folder to the Backup folder; this would contain the pre-change files. Then you would mirror the RFC folder to Source, which would commit any changed files/folders from RFC to the Source folder. An example (this is a batch file):
REM Mirror the Source folder to Backup
ROBOCOPY C:\Scripts\Source C:\Scripts\Backup /MIR
REM Mirror the RFC folder to Source
ROBOCOPY C:\Scripts\RFC C:\Scripts\Source /MIR
At the end of all this your Source folder would be an exact replica of whatever the RFC folder looked like. If RFC isn't a full copy of Source, but rather a partial copy, then you wouldn't want to use the mirror switch, /MIR, as it would destroy anything in Source that wasn't in RFC.
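In that partial-copy case, a plain /E copy (shown here as it would be run from PowerShell, and deliberately without /MIR so nothing in Source gets deleted) is the safer shape:
# Copy RFC into Source, adding/overwriting files but never deleting anything Source already has
ROBOCOPY C:\Scripts\RFC C:\Scripts\Source /E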
Browse around ROBOCOPY /? for some of its other switches, it's got a few interesting ones for logging if you want to build in some auditing capability. Also, make extra sure to test this in a test environment. Misuse of ROBOCOPY with a /MIR switch might make you a very sad camper.