PowerShell: move files and folders older than x days

I am new to PowerShell and am trying to learn a basic file move from one directory to another. My goal is to move files and folders that are more than 18 months old to a cold-storage folder, run as a scheduled task. I need to be able to easily modify its directories to fit our needs. It needs to preserve the folder structure and only move files that fit the above parameters. I also need it to log everything it did, so if something is off I know where.
If I run this, it just copies everything. If I comment out the %{Copy-Item ...} step, it runs, lists only the items that match my parameters, and logs them. Where am I going wrong, or am I way off base?
Yes, it would be easy to use robocopy for this, but I want to use PowerShell and learn from it.
#Remove-Variable * -ErrorAction SilentlyContinue; Remove-Module *; $error.Clear();
#Clear-Host
#Days older than
$Days = "-485"
#Path Variables
$Sourcepath = "C:\Temp1"
$DestinationPath = "C:\Temp2"
#Logging
$Logfile = "c:\temp3\file_$((Get-Date).ToString('MM-dd-yyyy_hh-mm-ss')).log"
#transcript logs all outputs to txt file
Start-Transcript -Path $Logfile -Append
Get-ChildItem $Sourcepath -Force -Recurse |
    Where-Object { $_.LastWriteTime -le (Get-Date).AddDays($Days) } |
    %{ Copy-Item -Path $Sourcepath -Destination $DestinationPath -Recurse -Force }
Stop-Transcript

Problem
Copy-Item -Path $Sourcepath -Destination $DestinationPath -Recurse -Force
You always pass the same, fixed paths for source and destination. With the -Recurse parameter you copy the whole directory $SourcePath once for each matching file.
Solution
You need to feed the output of the previous pipeline steps to Copy-Item through the $_ (a.k.a. $PSItem) variable, essentially using Copy-Item in single-item mode.
Try this (it requires the [IO.Path]::GetRelativePath() method, which is available in .NET Core / .NET 5+, i.e. PowerShell 7, but not in Windows PowerShell 5.1):
Get-ChildItem $Sourcepath -File -Force -Recurse |
    Where-Object { $_.LastWriteTime -le (Get-Date).AddDays($Days) } |
    ForEach-Object {
        $relativeSourceFilePath = [IO.Path]::GetRelativePath( $Sourcepath, $_.FullName )
        $destinationFilePath    = Join-Path $DestinationPath $relativeSourceFilePath
        $destinationSubDirPath  = Split-Path $destinationFilePath -Parent
        # Need to create the sub directory when using Copy-Item in single-item mode
        $null = New-Item $destinationSubDirPath -ItemType Directory -Force
        # Copy one file
        Copy-Item -Path $_.FullName -Destination $destinationFilePath -Force
    }
Alternative implementation without GetRelativePath, which also works in Windows PowerShell 5.1 and older .NET versions:
Push-Location $Sourcepath   # base path for Get-ChildItem and Resolve-Path
try {
    Get-ChildItem . -File -Force -Recurse |
        Where-Object { $_.LastWriteTime -le (Get-Date).AddDays($Days) } |
        ForEach-Object {
            $relativeSourceFilePath = Resolve-Path $_.FullName -Relative
            $destinationFilePath    = Join-Path $DestinationPath $relativeSourceFilePath
            $destinationSubDirPath  = Split-Path $destinationFilePath -Parent
            # Need to create the sub directory when using Copy-Item in single-item mode
            $null = New-Item $destinationSubDirPath -ItemType Directory -Force
            # Copy one file
            Copy-Item -Path $_.FullName -Destination $destinationFilePath -Force
        }
}
finally {
    Pop-Location   # restore the previous location
}
On a side note, $Days = "-485" should be replaced by $Days = -485.
You currently create a string instead of a number and rely on PowerShell's ability to convert the string to a number automatically when "necessary". That doesn't always work as expected, so it's better to create the variable with the appropriate data type in the first place.
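A quick illustration of how a string that merely looks like a number can surprise you (the values below are just examples):
$DaysString = "-485"   # string
$DaysInt    = -485     # number

(Get-Date).AddDays($DaysString)   # happens to work: the string is converted for AddDays([double])

$DaysString + 15   # "-48515" - string concatenation, because the left operand is a string
$DaysInt    + 15   # -470     - numeric addition

$DaysString -gt -100   # True  - lexical string comparison ("-485" vs "-100")
$DaysInt    -gt -100   # False - numeric comparison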

Related

CSV - Piping to Copy-Item

When I try to import a CSV and take a source filename/path and a destination folder reference from it, Copy-Item does not seem to copy the file in question.
I have a folder full of files in C:\Dir1\Test\Files\ and I need to copy them to individual folders in C:\Dir1\Test, based on what is in the csv.
$SourceDir = 'C:\Dir1\Test\Files\'
$DestDir = 'C:\Dir1\Test\'

Import-Csv C:\Dir1\Test\FileList.csv | ForEach-Object {
    $Source = $SourceDir + $($_.'FilePath')
    $Dest = $DestDir + "$($_.'Folder Ref')\"
    Copy-Item $Source -Destination $Dest
}
If I switch out Copy-Item for Write-Host, it reads back to me correctly, so am I doing something wrong?
Nothing happens; it just returns me to the prompt with no output.
Constructing file paths with string concatenation, as you are doing, is never a good idea.
Better to use PowerShell's Join-Path cmdlet or the .NET [System.IO.Path]::Combine() method for that.
As mklement0 already commented, Copy-Item by default does not produce any visual output unless you add -Verbose.
You can also append the -PassThru switch; in that case the cmdlet returns an object that represents the copied item.
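For example, a minimal sketch (the file and folder names are made up) that surfaces what was copied:
Copy-Item -Path 'C:\Dir1\Test\Files\report.txt' -Destination 'C:\Dir1\Test\Folder1' -Verbose -PassThru |
    ForEach-Object { Write-Host "Copied to $($_.FullName)" }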
In your case, why not add an informative message yourself? Something like:
$SourceDir = 'C:\Dir1\Test\Files'
$DestDir = 'C:\Dir1\Test'

Import-Csv -Path 'C:\Dir1\Test\FileList.csv' | ForEach-Object {
    # construct the source path
    $Source = Join-Path -Path $SourceDir -ChildPath $_.FilePath
    if (Test-Path -Path $Source -PathType Leaf) {
        # construct the destination path
        $Dest = Join-Path -Path $DestDir -ChildPath $_.'Folder Ref'
        # make sure the target path exists before trying to copy to it
        $null = New-Item -Path $Dest -ItemType Directory -Force
        # now copy the file
        Write-Host "Copying file '$Source' to '$Dest'"
        Copy-Item -Path $Source -Destination $Dest
    }
    else {
        Write-Warning "File '$Source' could not be found"
    }
}

How can I pipe multiple files into Copy-Item and keep the directory structure?

I have a very large directory located at D:\Stuff and I want to create a copy of it at D:\CopyStuff, but I only want to take files with a certain extension as well as keep the folder structure.
Getting the files I want seems simple enough:
$from = "D:\stuff"
$to = "D:\CopyStuff"
$files = Get-ChildItem -Recurse -Path $from -Include *.config, *.txt, *.ini
However, copying the files while keeping the structure is a bit more challenging. I could use a for loop, but that seems to go against the very nature of PowerShell. The answer at https://stackoverflow.com/a/25780745/782880 suggests doing it this way:
Get-ChildItem -Path $sourceDir | Copy-Item -Destination $targetDir -Recurse -Container
But that copies files to D:\CopyStuff with no folders, much less my original structure. What am I doing wrong? I'm using Powershell 5.
Try this:
$Source="C:\temp\test1"
$Dest="C:\temp\test2"
$EnableExt=".config", ".txt" , ".ini"
Get-ChildItem $Source -Recurse | % {
$NewPath=$_.FullName.Replace($Source, $Dest)
if ($_.psiscontainer)
{
New-Item -Path $NewPath -ItemType Directory -Force
}
elseif ($_.Extension -in $EnableExt)
{
Copy-Item $_.FullName $NewPath -Force
}
}
First of all, Copy-Item can do it on its own like:
$fromFolder = "C:\Temp\Source"
$toFolder = "C:\Temp\Dest"
Copy-Item -Path $fromFolder -Destination $toFolder -Recurse -Filter *.txt
But you may not like the result: it will create a "Source" folder inside the "Dest" folder and then copy the structure into it. I reckon you want the same files/folders from inside the "Source" folder to be copied into the "Dest" folder. Well, it's a bit more complex, but here it is:
$fromFolder = "C:\Temp\Source"
$toFolder = "C:\Temp\Dest"
Get-ChildItem -Path $fromFolder -Directory -Recurse |
    Select-Object FullName, @{N="NewPath"; E={$_.FullName.Replace($fromFolder, $toFolder)}} |
    ForEach-Object { New-Item -Path $_.NewPath -ItemType "Directory" }

Get-ChildItem -Path $fromFolder -Include "*.txt" -Recurse |
    Select-Object FullName, @{N="NewPath"; E={$_.FullName.Replace($fromFolder, $toFolder)}} |
    ForEach-Object { Copy-Item -Path $_.FullName -Destination $_.NewPath }
It copies folder structure first, then files.
NB! I strongly recommend using absolute paths only. Otherwise, the Replace method may give unexpected results.
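As a hedged illustration of that pitfall (the paths are made up): with a relative source path, FullName is absolute, so Replace() finds nothing to replace and the file would land in the wrong place.
$fromFolder = '.\Source'                                # relative source path
$item = Get-Item '.\Source\readme.txt'                  # FullName is e.g. C:\Temp\Source\readme.txt
$item.FullName.Replace($fromFolder, 'C:\Temp\Dest')     # still C:\Temp\Source\readme.txt - no match

# Resolving to an absolute path first avoids the problem:
$fromFolder = (Resolve-Path '.\Source').Path            # C:\Temp\Source
$item.FullName.Replace($fromFolder, 'C:\Temp\Dest')     # C:\Temp\Dest\readme.txt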
Note: The solution below creates analogous target folders only for those source folders that contain files matching the -Include filter, not for all source folders.
You can get away with a single-pipeline solution by combining Get-ChildItem -Name with delay-bind script blocks:
$from = 'D:\stuff'
$to = 'D:\CopyStuff'
Get-ChildItem -Name -Recurse -LiteralPath $from -Include *.config, *.txt, *.ini |
    Copy-Item `
        -LiteralPath { Join-Path $from $_ } `
        -Destination { New-Item -Type Directory -Force (Split-Path (Join-Path $to $_)) }
-Name emits paths relative to the input directory as strings.
Delay-bind script block { Join-Path $from $_ } builds the full input file name from each relative input path.
Delay-bind script block { New-Item -Type Directory -Force (Split-Path (Join-Path $to $_)) } builds the full path of the target directory from the target root path and the relative input path and creates that directory on demand, using a preexisting one if present (-Force).
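Delay-bind script blocks are a general pipeline feature, not something specific to Copy-Item. As a small illustration with made-up paths, this renames piped files by evaluating the script block once per input object ($_):
Get-ChildItem D:\stuff -Filter *.log |
    Rename-Item -NewName { $_.BaseName + '_old' + $_.Extension }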

Copy-Item files in folders and subfolders using the same directory structure as the source server with PowerShell

I am struggling really hard to get the script below to copy files in folders and subfolders with the proper structure (as on the source server).
Let's say there are the folders and files mentioned below:
Main Folder: File aaa, File bbb
Sub Folder a: File 1, File 2, File 3
Sub Folder b: File 4, File 5, File 6
Script used:
Get-ChildItem -Path \\Server1\Test -Recurse | ForEach-Object {
    Copy-Item -LiteralPath $_.FullName -Destination \\server2\test |
        Get-Acl -Path $_.FullName | Set-Acl -Path "\\server2\test\$(Split-Path -Path $_.FullName -Leaf)"
}
Output:
File aaa, File bbb
Sub Folder a (Empty Folder)
Sub Folder b (Empty Folder)
File 1, File 2, File 3, File 4, File 5, File 6.
I want the files to be copied to their respective folders (like the source folders). Any further help is highly appreciated.
This can be done using just Copy-Item; there is no need for Get-ChildItem. I think you are just overthinking it.
Copy-Item -Path C:\MyFolder -Destination \\Server\MyFolder -recurse -Force
I just tested it and it worked for me.
edit: included suggestion from the comments
# Add wildcard to source folder to ensure consistent behavior
Copy-Item -Path $sourceFolder\* -Destination $targetFolder -Recurse
If you want to mirror the same content from source to destination, try the following:
function CopyFilesToFolder ($fromFolder, $toFolder) {
    $childItems = Get-ChildItem $fromFolder
    $childItems | ForEach-Object {
        Copy-Item -Path $_.FullName -Destination $toFolder -Recurse -Force
    }
}
Test:
CopyFilesToFolder "C:\temp\q" "c:\temp\w"
Some time ago I found this script; it copies folders and files and keeps the same structure of the source in the destination. You can experiment with it:
# Find the source files
$sourceDir = "X:\sourceFolder"
# Set the target file
$targetDir = "Y:\Destfolder\"

Get-ChildItem $sourceDir -Include *.* -Recurse | foreach {
    # Remove the original root folder
    $split = $_.FullName -split '\\'
    $DestFile = $split[1..($split.Length - 1)] -join '\'
    # Build the new destination file path
    $DestFile = $targetDir + $DestFile
    # Move-Item won't create the folder structure so we have to
    # create a blank file and then overwrite it
    $null = New-Item -Path $DestFile -Type File -Force
    Move-Item -Path $_.FullName -Destination $DestFile -Force
}
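A hedged variant of the same idea that creates only the parent directory instead of pre-creating the file (same made-up paths as above; -File requires PowerShell 3.0 or later):
Get-ChildItem $sourceDir -File -Recurse | ForEach-Object {
    $split    = $_.FullName -split '\\'
    $DestFile = $targetDir + ($split[1..($split.Length - 1)] -join '\')
    # Create just the parent directory; -Force also succeeds if it already exists
    $null = New-Item -Path (Split-Path $DestFile -Parent) -ItemType Directory -Force
    Move-Item -Path $_.FullName -Destination $DestFile -Force
}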
I had trouble with the most popular answer ("overthinking"). It put AFolder in \\Server\MyFolder\AFolder, and I wanted the contents of AFolder and below in MyFolder. This didn't work:
Copy-Item -Verbose -Path C:\MyFolder\AFolder -Destination \\Server\MyFolder -recurse -Force
Plus, I needed to filter and only copy *.config files.
This didn't work with "\*" either, because it did not recurse:
Copy-Item -Verbose -Path C:\MyFolder\AFolder\* -Filter *.config -Destination \\Server\MyFolder -recurse -Force
I ended up lopping off the beginning of the path string to get the child path relative to where I was recursing from. This works for the use case in question and goes down many subdirectories, which some other solutions do not:
Get-ChildItem -Path "$($sourcePath)/**/*.config" -Recurse |
    ForEach-Object {
        $childPath = "$_".Substring($sourcePath.Length + 1)
        $dest = "$($destPath)\$($childPath)"   # this puts a \ between dest and child path
        Copy-Item -Verbose -Path $_ -Destination $dest -Force
    }
Here you go.
Function Backup-Files {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory)]
        [System.IO.FileInfo[]]$Source,
        [Parameter(Mandatory)]
        [String]$Destination
    )
    if (!(Test-Path $Destination)) { [void][System.IO.Directory]::CreateDirectory($Destination) }
    ForEach ($File in $Source) {
        $SourceRoot = $(Convert-Path $File.PSParentPath).Split('\')[0]
        $NewFile = $($File.FullName).Replace($SourceRoot, $Destination)
        $NewDir = $($File.DirectoryName).Replace($SourceRoot, $Destination)
        [void][System.IO.Directory]::CreateDirectory($NewDir)
        Copy-Item -Path $File.FullName -Destination $NewFile -Force
    }
}
Examples
<#
.SYNOPSIS
    Copy a FileInfo object or array to a new destination while retaining the original directory structure.
.PARAMETER Source
    FileInfo object or array. (Get-Item/Get-ChildItem)
.PARAMETER Destination
    Path to back up the source data to.
.NOTES
    Version (Date): 1.0 (2023-02-04)
    Author: Joshua Biddle (thebiddler@gmail.com)
    Purpose/Change: Initial script development.
    Known Bugs:
.EXAMPLE
    Backup-Files -Source $(Get-ChildItem -Path 'C:\Users\*\Documents' -Recurse -Force -Exclude 'My Music','My Pictures','My Videos','desktop.ini' -ErrorAction SilentlyContinue) -Destination "C:\Temp\UserBackup"
.EXAMPLE
    Backup-Files -Source $(Get-ChildItem -Path 'C:\Users\*\Desktop' -Exclude "*.lnk","desktop.ini" -Recurse -Force -ErrorAction SilentlyContinue) -Destination "C:\Temp\UserBackup"
#>
I wanted a solution that copies files modified after a certain date and time, which meant I needed to pipe Get-ChildItem through a filter. Below is what I came up with:
$SourceFolder = "C:\Users\RCoode\Documents\Visual Studio 2010\Projects\MyProject"
$ArchiveFolder = "J:\Temp\Robin\Deploy\MyProject"
$ChangesStarted = New-Object System.DateTime(2013,10,16,11,0,0)
$IncludeFiles = ("*.vb","*.cs","*.aspx","*.js","*.css")
Get-ChildItem $SourceFolder -Recurse -Include $IncludeFiles | Where-Object {$_.LastWriteTime -gt $ChangesStarted} | ForEach-Object {
$PathArray = $_.FullName.Replace($SourceFolder,"").ToString().Split('\')
$Folder = $ArchiveFolder
for ($i=1; $i -lt $PathArray.length-1; $i++) {
$Folder += "\" + $PathArray[$i]
if (!(Test-Path $Folder)) {
New-Item -ItemType directory -Path $Folder
}
}
$NewPath = Join-Path $ArchiveFolder $_.FullName.Replace($SourceFolder,"")
Copy-Item $_.FullName -Destination $NewPath
}
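A possible simplification of the inner loop, relying on New-Item -ItemType Directory -Force creating intermediate directories in one call (same variables as above; this is a sketch, not a drop-in tested replacement):
Get-ChildItem $SourceFolder -Recurse -Include $IncludeFiles |
    Where-Object { $_.LastWriteTime -gt $ChangesStarted } |
    ForEach-Object {
        $NewPath = Join-Path $ArchiveFolder $_.FullName.Substring($SourceFolder.Length)
        $null = New-Item -ItemType Directory -Path (Split-Path $NewPath -Parent) -Force
        Copy-Item $_.FullName -Destination $NewPath
    }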

Copy a file including its relative path

I need to copy a large number of files to a backup folder but I want to maintain their relative paths. I only need specific files; i.e.
C:\scripts\folder\File.ext1
C:\scripts\folder2\file2.ext2
C:\scripts\file3.ext1
But I only need to copy the ext1 files like so:
C:\backup\folder\File.ext1.bak
C:\backup\file3.ext1.bak
The source paths are of multiple depths.
This is what I have to copy the files:
$files = gci -path C:\scripts\ -recurse -include *.ext1
$files | % { Copy-Item $_ "$($_).bak"; move-item $_ -destination C:\backup\ }
This just dumps all the files into C:\backup\ and does not preserve any of the paths. I'm not sure how that part would be done.
Something like this could work:
gci -path C:\scripts\ -recurse -include *.ext1 |
    % { Copy-Item $_.FullName "$($_.FullName).bak"
        move-item $_.FullName -destination ($_.FullName -replace 'C:\\scripts\\','C:\backup\') }
It is not clever, but it's quick & dirty and works without a lot of effort.
get-childitem returns absolute paths, but you can make them relative to the current working directory as follows:
resolve-path -relative
So to copy a filtered set of files from the current directory recursively to a destination directory:
$dest = "c:\dest"
$filter = "*.txt"
get-childitem -recurse -include $filter |
    where-object { !$_.PSIsContainer } |
    resolve-path -relative |
    % { $destFile = join-path $dest $_
        new-item -type f $destFile -force | out-null
        copy-item $_ $destFile
        get-item $destFile }
new-item is needed to create the parent directories
get-item provides a display of all the new files it created
Of course robocopy does all this, but there will be times when you want to do more special filtering or filename mangling...
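For instance, a hedged sketch of such filename mangling in the same pipeline, appending a .bak suffix while copying (it assumes the same $dest and current-directory setup as above):
get-childitem -recurse -include *.ext1 |
    where-object { !$_.PSIsContainer } |
    resolve-path -relative |
    % { $destFile = join-path $dest "$($_).bak"          # mangle the name: add a .bak suffix
        new-item -type file $destFile -force | out-null  # creates any missing parent directories
        copy-item $_ $destFile }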
Use robocopy.
robocopy c:\scripts c:\backup *.ext1 /s
Oops. I failed to notice you wanted to add the .bak extension too. I still think it is a good idea to use robocopy to copy the files then:
dir c:\backup -recurse -include *.ext1 | % { ren $_ "$_.bak" }
You can try this
Clear-Host
$from = "C:\scripts\"
$to = "C:\backup\"
$inc = @('*.ext1', '*.extx')

$files = Get-ChildItem -Path $from -Include $inc -Recurse

$files | % {
    $dest = (Join-Path $to $($_.FullName + ".bak").SubString($from.Length))
    $dum = New-Item -ItemType File $dest -Force
    Copy-Item -Path $_ -Destination $dest -Recurse -Force
}
the new-item is there in order to force path creation.

Exclude list in PowerShell Copy-Item does not appear to be working

I have the following snippet of PowerShell script:
$source = 'd:\t1\*'
$dest = 'd:\t2'
$exclude = @('*.pdb','*.config')

Copy-Item $source $dest -Recurse -Force -Exclude $exclude
Which works to copy all files and folders from t1 to t2, but it only excludes the exclude list in the "root"/"first-level" folder and not in sub-folders.
How do I make it exclude the exclude list in all folders?
I think the best way is to use Get-ChildItem and pipe in the Copy-Item command.
I found that this worked:
$source = 'd:\t1'
$dest = 'd:\t2'
$exclude = @('*.pdb','*.config')

Get-ChildItem $source -Recurse -Exclude $exclude |
    Copy-Item -Destination { Join-Path $dest $_.FullName.Substring($source.Length) }
Basically, what is happening here is that you're going through the valid files one by one and copying them to the new path. The Join-Path expression at the end is there so that the directories are also kept when copying the files over: it takes the destination directory and joins it with the part of each item's path that comes after the source path.
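To make the path arithmetic concrete, here is a small hypothetical walk-through of what that expression computes for one file:
$source = 'd:\t1'
$dest   = 'd:\t2'
$full   = 'd:\t1\sub\app.dll'                  # an example FullName

$relative = $full.Substring($source.Length)    # '\sub\app.dll'
Join-Path $dest $relative                      # 'd:\t2\sub\app.dll'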
I got the idea from here, and then modified it a bit to make it work for this example.
I hope it works!
I had this problem too, and spent 20 minutes applying the solutions here, but kept having problems. So I chose to use robocopy - OK, it's not PowerShell, but it should be available everywhere PowerShell runs.
And it worked right out of the box:
robocopy $source $dest /S /XF <file patterns to exclude> /XD <directory patterns to exclude>
e.g.
robocopy $source $dest /S /XF *.csproj /XD obj Properties Controllers Models
Plus, it has tons of features, like resumable copy.
Docs here.
Since comments format code badly, I'll post this as an answer, but it's just an addition to @landyman's answer.
The proposed script has a drawback - it will create double-nested folders. For example, for 'd:\t1\sub1' it will create an empty directory 'd:\t2\sub1\sub1'. That's because Copy-Item for directories expects the parent directory name in the -Destination parameter, not the directory name itself.
Here's a workaround I found:
Get-ChildItem -Path $from -Recurse -Exclude $exclude |
    Copy-Item -Force -Destination {
        if ($_.GetType() -eq [System.IO.FileInfo]) {
            Join-Path $to $_.FullName.Substring($from.Length)
        } else {
            Join-Path $to $_.Parent.FullName.Substring($from.Length)
        }
    }
Note that the syntax spec calls for a STRING ARRAY, i.e. String[]:
SYNTAX
Copy-Item [[-Destination] <String>] [-Confirm] [-Container] [-Credential <PSCredential>] [-Exclude <String[]>]
    [-Filter <String>] [-Force] [-FromSession <PSSession>] [-Include <String[]>] -LiteralPath <String[]>
    [-PassThru] [-Recurse] [-ToSession <PSSession>] [-UseTransaction] [-WhatIf] [<CommonParameters>]
If you're not explicit in your array generation, you end up with an Object[] - and that is ignored in many cases, leaving the appearance of "buggy behavior" because of type safety. Since PowerShell can process script blocks, evaluating anything other than a type-specific variable (so that a valid string can be determined) would leave an opening for a potential injection attack on any system whose execution policy is lax.
So this is unreliable:
PS > $omissions = @("*.iso","*.pdf","*.zip","*.msi")
PS > $omissions.GetType()
Note the result....
IsPublic IsSerial Name     BaseType
-------- -------- ----     --------
True     True     Object[] System.Array
And this works.... for example:
PS > $omissions = [string[]]@("*.iso","*.pdf","*.zip","*.msi")
or
PS > [string[]]$omissions = ("*.iso,*.pdf,*.zip,*.msi").Split(',')
PS > $omissions.GetType()
IsPublic IsSerial Name     BaseType
-------- -------- ----     --------
True     True     String[] System.Array
Note that even a "single" element would still require the same cast, so as to create a 1-element array.
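For instance (a small illustrative snippet):
PS > [string[]]$omissions = '*.iso'    # the cast turns the single string into a 1-element String[]
PS > $omissions.GetType().Name         # String[]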
If you're trying this at home, be sure to use Remove-Variable on "omissions" (or clear it) so the old value doesn't linger before recasting it in the examples shown above.
As for a pipeline that works reliably, here is one that I've tested:
cd $sourcelocation
ls | ?{$_ -ne $null} | ?{$_.BaseName -notmatch "^\.$"} | %{$_.Name} |
    cp -Destination $targetDir -Exclude $omissions -Recurse -ErrorAction SilentlyContinue
The above does a directory listing of the source files in the base (selected "current") directory, filters out potential problem items, converts each file to its base name, and forces cp (the Copy-Item alias) to re-access the file "by name" in the "current directory" - thus reacquiring the file object - and copies it. This will create empty directories, including those that may contain only excluded files (less the exclusions, of course). Note also that "ls" (Get-ChildItem) does NOT -Recurse - that is left to cp. Finally, if you're having problems and need to debug, remove the -ErrorAction SilentlyContinue switch, which hides a lot of nuisances that might otherwise interrupt the script.
For those whose comments concerned "\" in paths: keep in mind that you're working over the .NET sub-layer via an interpreter (i.e. PowerShell). In C#, for example, a lone "\" inside a string literal (or several of them) makes the compiler demand that you either escape each backslash as "\\" or prefix the string with @ to make it a verbatim string, as in @"\"; this is because of escape sequences such as "\n". In PowerShell string literals the backslash is not an escape character, but it is special in regex contexts such as -match and -replace, which is why patterns like 'C:\\scripts\\' appear above.
The latter is a much bigger subject, so I'll leave you with that consideration.
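As a small, hedged illustration of how this plays out in PowerShell's own regex operators (the paths are placeholders):
$source = 'C:\scripts'
# -replace treats its pattern as a regex, so backslashes must be escaped by hand...
'C:\scripts\folder\File.ext1' -replace 'C:\\scripts\\', 'C:\backup\'
# ...or escaped for you by [regex]::Escape:
'C:\scripts\folder\File.ext1' -replace ([regex]::Escape($source + '\')), 'C:\backup\'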
The -Exclude parameter won't work with directories. A variant of Bo's script does the trick:
$source = 'c:\tmp\foo'
$dest = 'c:\temp\foo'
$exclude = '\.bak'
Get-ChildItem $source -Recurse |
    Where-Object { $_.FullName -notmatch $exclude } |
    Copy-Item -Destination { Join-Path $dest $_.FullName.Substring($source.Length) }
I was looking for a way to copy files modified after a certain date/timestamp so as to archive them. This way I could save off exactly what files I worked on (assuming I know when I started). (Yes, I know this is what SCM is for, but there are times when I just want to snapshot my work without checking it in.)
Using landyman's tip, and stuff I found elsewhere, I found that this worked:
$source = 'c:\tmp\foo'
$dest = 'c:\temp\foo'
$exclude = @('*.pdb', '*.config')

Get-ChildItem $source -Recurse -Exclude $exclude |
    Where-Object { $_.LastWriteTime -gt "8/24/2011 10:26 pm" } |
    Copy-Item -Destination { Join-Path $dest $_.FullName.Substring($source.Length) }
Get-ChildItem with Join-Path was working mostly for me, but I realized it was copying root directories inside the other root directories, which was bad.
For example
c:\SomeFolder
c:\SomeFolder\CopyInHere
c:\SomeFolder\CopyInHere\Thing.txt
c:\SomeFolder\CopyInHere\SubFolder
c:\SomeFolder\CopyInHere\SubFolder\Thin2.txt
Source Directory: c:\SomeFolder\CopyInHere
Destination Directory: d:\PutItInHere
Goal:
Copy every childitem Inside c:\SomeFolder\CopyInHere to the root of d:\PutItInHere, but not including c:\SomeFolder\CopyInHere itself.
- E.g. take all the children of CopyInHere and make them children of PutItInHere.
The examples above do this most of the way, but what happens is that they create a folder called SubFolder, and then create a folder called SubFolder inside that folder.
That's because Join-Path calculates a destination path of d:\PutItInHere\SubFolder for the SubFolder child item, so SubFolder gets created inside a folder called SubFolder.
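To spell the problem out with the tree above (a hypothetical walk-through):
$sourceDirectory      = 'c:\SomeFolder\CopyInHere'
$destinationDirectory = 'd:\PutItInHere'
$item                 = 'c:\SomeFolder\CopyInHere\SubFolder'    # a direct child directory

$subPath = $item.Substring($sourceDirectory.Length)             # '\SubFolder'
Join-Path $destinationDirectory $subPath                        # 'd:\PutItInHere\SubFolder'
# If that destination directory already exists (e.g. it was created earlier in the run),
# Copy-Item copies the SubFolder item into it, producing d:\PutItInHere\SubFolder\SubFolder.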
I got around this by using Get-ChildItem to bring back a collection of the items, then using a loop to go through it.
Param(
    [Parameter(Mandatory=$True,Position=1)][string]$sourceDirectory,
    [Parameter(Mandatory=$True,Position=2)][string]$destinationDirectory
)

$sourceDI = [System.IO.DirectoryInfo]$sourceDirectory
$destinationDI = [System.IO.DirectoryInfo]$destinationDirectory

$itemsToCopy = Get-ChildItem $sourceDirectory -Recurse -Exclude @('*.cs', 'Views\Mimicry\*')

foreach ($item in $itemsToCopy) {
    $subPath = $item.FullName.Substring($sourceDI.FullName.Length)
    $destination = Join-Path $destinationDirectory $subPath

    if ($item -is [System.IO.DirectoryInfo]) {
        $itemDI = [System.IO.DirectoryInfo]$item
        if ($itemDI.Parent.FullName.TrimEnd("\") -eq $sourceDI.FullName.TrimEnd("\")) {
            $destination = $destinationDI.FullName
        }
    }

    $itemOutput = New-Object PSObject
    $itemOutput | Add-Member -Type NoteProperty -Name Source -Value $item.FullName
    $itemOutput | Add-Member -Type NoteProperty -Name Destination -Value $destination
    $itemOutput | Format-List

    Copy-Item -Path $item.FullName -Destination $destination -Force
}
What this does, in short, is use the current item's full name for the destination calculation. It then checks whether the item is a DirectoryInfo object. If it is, it checks whether its parent folder is the source directory, which means the folder currently being iterated is a direct child of the source directory; in that case we should not append its name to the destination directory, because we want that folder to be created in the destination directory itself, not in a folder of its own inside the destination directory.
After that, every other folder works fine.
$sourcePath="I:\MSSQL\Backup\Full"
$excludedFiles=#("MASTER", "DBA", "MODEL", "MSDB")
$sourceFiles=(ls $sourcePath -recurse -file) | where-object { $_.directory.name -notin $excludedFiles }
This is what I did: I needed to copy a bunch of backup files out to a separate location on the network for client pickup, and we didn't want the client to have the system DB backups listed above.
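The snippet above only selects the files; a hedged sketch of the copy step that keeps the folder structure (the destination $destPath is a made-up placeholder) might look like this:
$destPath = "\\fileshare\ClientPickup"    # hypothetical destination share

$sourceFiles | ForEach-Object {
    $target = Join-Path $destPath $_.FullName.Substring($sourcePath.Length)
    $null = New-Item -ItemType Directory -Path (Split-Path $target -Parent) -Force
    Copy-Item -Path $_.FullName -Destination $target
}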
I had a similar problem, extending this a bit. I wanted a solution that also works for sources like
$source = "D:\scripts\*.sql"
I found this solution:
function Copy-ToCreateFolder
{
    param(
        [string]$src,
        [string]$dest,
        $exclude,
        [switch]$Recurse
    )
    # The problem with Copy-Item -Rec -Exclude is that -Exclude affects only top-level files
    # Copy-Item $src $dest -Exclude $exclude -EA silentlycontinue -Recurse:$recurse
    # http://stackoverflow.com/questions/731752/exclude-list-in-powershell-copy-item-does-not-appear-to-be-working
    if (Test-Path($src))
    {
        # Nonstandard: I create destination directories on the fly
        [void](New-Item $dest -ItemType Directory -EA silentlycontinue)
        Get-ChildItem -Path $src -Force -Exclude $exclude | % {
            if ($_.PSIsContainer)
            {
                if ($Recurse) # Non-standard: I don't want to copy empty directories
                {
                    $sub = $_
                    $p = Split-Path $sub
                    $currentfolder = Split-Path $sub -Leaf
                    #Get-ChildItem $_ -rec -name -exclude $exclude -Force | % { "{0} {1}" -f $p, "$currentfolder\$_" }
                    [void](New-Item $dest\$currentfolder -Type Directory -EA silentlycontinue)
                    Get-ChildItem $_ -Recurse:$Recurse -Name -Exclude $exclude -Force | % { Copy-Item $sub\$_ $dest\$currentfolder\$_ }
                }
            }
            else
            {
                #"{0} {1}" -f (split-path $_.fullname), (split-path $_.fullname -leaf)
                Copy-Item $_ $dest
            }
        }
    }
}
The below snippet will copy all files and folders from $source to $dest, excluding .pdb and .config files from the root folder and sub-folders:
Get-ChildItem -Path $source | Copy-Item -Destination $dest -Recurse -Container -Exclude @('*.pdb','*.config')
One way of copying items from one folder to another using regular expressions for exclusion:
$source = '.\source'
$destination = '.\destination'
$exclude = '.*\.pdf$|.*\.mp4$|\\folder1(\\|$)|\\folder2(\\|$)'
$itemsToCopy = Get-ChildItem $source -Recurse |
    Where-Object FullName -notmatch $exclude |
    Select-Object -Expand FullName

$sourceFullNameLength = (Get-Item $source).FullName.Length

foreach ($item in $itemsToCopy) {
    $relativeName = $item.Substring($sourceFullNameLength + 1)
    Copy-Item -Path $item -Destination "$destination\$relativeName"
}
I wrote this for daily use and packaged it in a script module. It maintains the full directory structure and supports wildcards:
function Copy-Folder {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [String]$FromPath,
        [Parameter(Mandatory)]
        [String]$ToPath,
        [string[]]$Exclude
    )

    if (Test-Path $FromPath -PathType Container) {
        New-Item $ToPath -ItemType Directory -ErrorAction SilentlyContinue | Out-Null
        Get-ChildItem $FromPath -Force | ForEach-Object {
            # avoid the nested pipeline variable
            $item = $_
            $target_path = Join-Path $ToPath $item.Name
            if (($Exclude | ForEach-Object { $item.Name -like $_ }) -notcontains $true) {
                if (Test-Path $target_path) { Remove-Item $target_path -Recurse -Force }
                Copy-Item $item.FullName $target_path
                Copy-Folder -FromPath $item.FullName $target_path $Exclude
            }
        }
    }
}
Just call Copy-Folder -FromPath 'fromDir' -ToPath 'destDir' -Exclude *.pdb,*.config
The -FromPath and -ToPath parameter names can be omitted:
Copy-Folder 'fromDir' 'destDir' -Exclude *.pdb,*.config