I need to add a safety net to my script. I'm trying to do a copy job based on a list of users provided through a txt file: copy the files from each user's home directory to a new location, and once the files are copied, check whether each file exists in the new location. If yes, then Remove-Item.
Can someone help me? I just don't know how to implement the "if file exists" logic.
$username = Get-Content '.\users.txt'
foreach ($un in $username)
{
$dest = "\\server\homedirs\$un\redirectedfolders"
$source = "\\server\homedirs\$un"
New-Item -ItemType Directory -Path $dest\documents, $dest\desktop
Get-ChildItem $source\documents -Recurse -Exclude '*.msg' | Copy-Item -Destination $dest\documents
Get-ChildItem $source\desktop -Recurse -Exclude '*.msg' | Copy-Item -Destination $dest\desktop
Get-ChildItem $source\mydocuments, $source\desktop -Recurse -Exclude '*.msg' | Remove-Item -Recurse
}
The shortest way to delete a file only if it exists, without an error when it doesn't, is NOT to use Test-Path but:
rm my_file.zip -ea ig
This is the short version of
rm my_file.zip -ErrorAction Ignore
which is much more readable and more DRY than
if (Test-Path my_file.zip) { rm my_file.zip }
To answer your question per se, you can do it like this:
Get-ChildItem $source\mydocuments, $source\desktop -Recurse -Exclude '*.msg' | %{
if (Test-Path ($_ -replace "^$([regex]::escape($source))","$dest")) {
Remove-Item $_ -Recurse
}
}
Test-Path returns $true if the file at the given path exists, otherwise $false.
$_ -replace "^$([regex]::escape($source))","$dest" converts the path of each source item you're enumerating to the corresponding destination path, by replacing $source at the beginning of the path with $dest.
The basic regex for the first argument to the -replace operator is ^$source (which means "match the value of $source at the beginning of the string"). However, you need to use [regex]::escape in case $source contains any regex special characters, which is in fact extremely likely with Windows paths, since they contain backslashes. For example, the value you've given here for $source contains \s, which in a regex means "any whitespace character". $([regex]::escape($source)) will interpolate the value of $source with any regex special characters properly escaped, so that you're matching the explicit value.
That said, if your purpose is to copy each item to a new location, and remove the original only if the copy to the new location is successful, it seems like you're reinventing the wheel. Why not just use Move-Item instead of Copy-Item?
Not directly related to the question, but rather than repeating the same command for each subdirectory, you can use a foreach loop:
foreach ($subdir in (echo documents desktop)) {
# Whatever command you end up using to copy or move the items,
# using "$source\$subdir" and "$dest\$subdir" as the paths
}
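Putting both suggestions together, here's a minimal sketch of what the loop could look like (it assumes the same flat $source/$dest layout as the question and is untested):
foreach ($subdir in 'documents', 'desktop') {
    # Ensure the destination subfolder exists, then move; a move only removes the
    # original once it has landed at the destination
    New-Item -ItemType Directory -Path "$dest\$subdir" -Force | Out-Null
    Get-ChildItem "$source\$subdir" -Recurse -Exclude '*.msg' |
        Move-Item -Destination "$dest\$subdir"
}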
The Test-Path cmdlet will help you check if the file exists:
http://technet.microsoft.com/en-us/library/ee177015.aspx
@Adi Inbar, I need an approach like this because I need to move files to a remote session, and Move-Item does not work with -ToSession when I tried it... only Copy-Item supports it.
The key issue is that if the power or internet connection goes down, the script will delete the files even if they weren't copied.
$username = "name"
$password = ConvertTo-SecureString "password" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential -ArgumentList ($username, $password)
$Session = New-PSSession -ComputerName "IPAdress" -Credential $credential
Copy-Item -Path C:\userPC_1\csv-output\*.csv -Destination C:\userPC_2\Documents\Test_Scripts -ToSession $Session -Verbose
Get-PSSession | Remove-PSSession
Get-ChildItem -Path C:\userPC_1\csv-output\*.csv | Remove-Item -Force
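A hedged sketch of the safety net (my own addition, not part of the original post): verify each file on the remote side before deleting the local original, and do it while the session is still open, i.e. in place of the last two lines above:
$localFiles = Get-ChildItem -Path C:\userPC_1\csv-output\*.csv
foreach ($file in $localFiles) {
    # Ask the remote end whether this particular file arrived before deleting it locally
    $remotePath = Join-Path "C:\userPC_2\Documents\Test_Scripts" $file.Name
    $copied = Invoke-Command -Session $Session -ScriptBlock { param($p) Test-Path -LiteralPath $p } -ArgumentList $remotePath
    if ($copied) { Remove-Item -LiteralPath $file.FullName -Force }
}
Get-PSSession | Remove-PSSession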
I realise this question seems to have been asked a thousand times before, and I've read most if not all the suggestions (most common being new ps-drive or using alphaFS). I just never see that suggestion actually put into code.
My program works fine the way I've coded it for now, with one exception: the too-long-path exception I run into once in a while.
While I start from a path that is not too long, the recursion takes me down into paths that are, while I find all files with a specific extension and move them around.
I'm just wondering how I can test within Get-ChildItem and create a new temporary PSDrive before the exception is already thrown.
The code (part of it):
Get-ChildItem -Path $loc -File -Filter $extension -Recurse |
Where-Object { $_.FullName -inotmatch $folder_name } |
ForEach-Object {
if((Test-Path (Join-Path $Destination $_)) -eq $true) {
$tmp_name = "$($_.BaseName)$(Get-Date -Format "dd_MM_yyyy_hh_mm_ss")$($_.Extension)"
Rename-Item $_.FullName $tmp_name
Move-Item ((Split-Path -Path $_.FullName)+"\"+$tmp_name) -Destination $Destination -Verbose 4>>$Destination\$filename -Force;
}
else {
    $_ | Move-Item -Destination $Destination -Verbose >>$Destination\$filename -Force
}
}
This basically tests whether the file exists at the location I want to move it to, and if so renames it with the current time in the name.
Now I want to create the new PSDrive once the Get-ChildItem path reaches about 200 characters in length. But as far as I can tell, you can't filter on path length before querying the path, by which point you've already run into the exception.
Or am I wrong?
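The thread doesn't answer this, but here is a hedged sketch for at least surfacing the offending paths without aborting the run; it assumes the long-path failures show up as non-terminating errors, which is worth verifying on your system:
# Collect errors instead of throwing, then report which paths were skipped;
# $loc and $extension are the same variables as in the snippet above.
Get-ChildItem -Path $loc -File -Filter $extension -Recurse -ErrorAction SilentlyContinue -ErrorVariable gciErrors
$gciErrors | ForEach-Object { Write-Warning "Skipped: $($_.TargetObject)" }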
I'm thinning out my backup files with a PowerShell script, and I know I have the correct filenames, but for some reason when I use Remove-Item, the item doesn't get removed and no exception is thrown. This is what it looks like:
try{
$Drive = "E:\temp\"
$deleteTime = -42
$limit = (Get-Date).AddDays($deleteTime) #files older than 6 weeks
#get files in folder older than deleteTime and with signature of *junk.vhd* (to be changed later)
$temp1 = Get-ChildItem -Path $Drive -filter "*junk.vhd*" | Where-Object {$_.LastWriteTime -lt $limit} | Select -Expand Name #this has 5 files in list
#will delete every other file
for($i=$temp1.GetLowerBound(0);$i -le $temp1.GetUpperBound(0);$i+=2) {
$name = $temp1[$i]
Write-Host "removing $name" #prints correct file names to screen
Get-ChildItem -Path $Drive -include $name | Remove-Item -recurse -force #this is handling correct files but they aren't deleted for some reason
}
}#try
Catch [Exception] {
#nothing is caught
Write-Host "here"
}
Does anyone have any ideas why it's finding and Write-Host the correct filenames to remove, but the Remove-Item isn't removing them?
I was looking at a slightly different removal example, but everything else looks good.
Try replacing this line:
Get-ChildItem -Path $Drive -include $name | Remove-Item -recurse -force #this is handling correct files but they aren't deleted for some reason
with this:
Remove-Item -Path (Join-Path $Drive $temp1[$i]) -Force -Recurse
Since you already have the file name from the first Get-ChildItem call, I don't feel it's necessary to call Get-ChildItem again and pass it through the pipeline; instead you can feed Remove-Item the full path directly, built from the drive and the name you already selected.
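As an alternative sketch (my own suggestion, not from the answer above): keep the FileInfo objects instead of expanding just the names, so the full path is always available without rebuilding it. It reuses $Drive and $limit from the question:
$old = Get-ChildItem -Path $Drive -Filter "*junk.vhd*" |
       Where-Object { $_.LastWriteTime -lt $limit }
# Delete every other file, feeding Remove-Item the full path of each object
for ($i = 0; $i -lt $old.Count; $i += 2) {
    Write-Host "removing $($old[$i].FullName)"
    Remove-Item -LiteralPath $old[$i].FullName -Force
}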
Using PowerShell, is it possible to remove a directory that contains files without being prompted to confirm the action?
Remove-Item -LiteralPath "foldertodelete" -Force -Recurse
or, the shorter version:
rm /path -r -force
From the PowerShell remove force answer:
help Remove-Item says:
The Recurse parameter in this cmdlet does not work properly
The workaround command is
Get-ChildItem -Path $Destination -Recurse | Remove-Item -force -recurse
And then delete the folder itself
Remove-Item $Destination -Force
This worked for me:
Remove-Item $folderPath -Force -Recurse -ErrorAction SilentlyContinue
This removes the folder with all the files in it, and it does not produce an error if the folder path doesn't exist.
2018 Update
In the current version of PowerShell (tested with v5.1 on Windows 10 and Windows 11 in 2023) one can use the simpler Unix-style syntax rm -R .\DirName to silently delete the directory .\DirName with all subdirectories and files it may contain. In fact, many common Unix command names exist in PowerShell as aliases for cmdlets (rm is an alias for Remove-Item), so they behave much as they do on a Linux command line, even though the parameters are PowerShell's own.
One can also clean up a folder, but not the folder itself, using rm -R .\DirName\* (noted by Jeff in the comments).
In short, we can use rm -r -fo {folderName} to remove the folder recursively (removing all the files and folders inside) and forcefully.
To delete the contents without deleting the folder itself, you can use the following:
Remove-Item "foldertodelete\*" -Force -Recurse
rm -Force -Recurse -Confirm:$false $directory2Delete didn't work in the PowerShell ISE, but it worked through the regular PowerShell CLI.
I hope this helps. It was driving me bananas.
This worked for me:
Remove-Item C:\folder_name -Force -Recurse
PowerShell works with relative paths. Remove-Item has a couple of useful aliases that align with Unix commands. Some examples:
rm -R -Force ./directory
del -R -Force ./directory/*
Below is a copy-pasteable implementation of Michael Freidgeim's answer
function Delete-FolderAndContents {
# http://stackoverflow.com/a/9012108
param(
[Parameter(Mandatory=$true, Position=1)] [string] $folder_path
)
process {
$child_items = ([array] (Get-ChildItem -Path $folder_path -Recurse -Force))
if ($child_items) {
$null = $child_items | Remove-Item -Force -Recurse
}
$null = Remove-Item $folder_path -Force
}
}
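A usage example (the path here is hypothetical):
Delete-FolderAndContents -folder_path 'C:\temp\old-build'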
$LogPath = "E:\" # Your local of directories
$Folders = Get-Childitem $LogPath -dir -r | Where-Object {$_.name -like "*temp*"}
foreach ($Folder in $Folders)
{
$Item = $Folder.FullName
Write-Output $Item
Remove-Item $Item -Force -Recurse
}
Since my directory was in C:\Users, I had to run PowerShell as administrator.
del ./[your Folder name] -Force -Recurse
This command worked for me.
If you have your folder as an object - say you created it in the same script using the following command:
$folder = New-Item -ItemType Directory -Force -Path "c:\tmp" -Name "myFolder"
Then you can just remove it like this in the same script
$folder.Delete($true)
$true stands for recursive removal
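If the folder wasn't created in the same script, a sketch of the same idea is to fetch the DirectoryInfo object first (reusing the example path created above):
$folder = Get-Item -LiteralPath "c:\tmp\myFolder"
$folder.Delete($true)   # $true = delete contents recursively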
$LogPath = "E:\" # Your local of directories
$Folders = Get-Childitem $LogPath -dir -r | Where-Object {$_.name -like "*grav*"} # Your keyword name directories
foreach ($Folder in $Folders)
{
$Item = $Folder.FullName
Write-Output $Item
Remove-Item $Item -Force -Recurse -ErrorAction SilentlyContinue
}
Some multi-level directories need to be deleted twice, which troubled me for a long time. Here is my final code; it works for me and cleans up nicely. Hope it helps.
function ForceDelete {
[CmdletBinding()]
param(
[string] $path
)
rm -r -fo $path
if (Test-Path -Path $path){
Start-Sleep -Seconds 1
Write-Host "Force delete retrying..." -ForegroundColor white -BackgroundColor red
rm -r -fo $path
}
}
ForceDelete '.\your-folder-name'
ForceDelete '.\your-file-name.php'
If you want to build the path to remove by concatenating a variable holding a fixed path with a string for the dynamic part, you may need the following command:
$fixPath = "C:\Users\myUserName\Desktop"
Remove-Item ("$fixPath" + "\Folder\SubFolder") -Recurse
The concatenated path is now: "C:\Users\myUserName\Desktop\Folder\SubFolder"
So you can remove several directories from the starting point ("C:\Users\myUserName\Desktop"), which is already defined and fixed in the variable $fixPath.
$fixPath = "C:\Users\myUserName\Desktop"
Remove-Item ("$fixPath" + "\Folder\SubFolder") -Recurse
Remove-Item ("$fixPath" + "\Folder\SubFolder1") -Recurse
Remove-Item ("$fixPath" + "\Folder\SubFolder2") -Recurse
I've written a script that will be used for archiving log files from a server. I'm in pretty good shape with everything but the recursiveness or not of Get-ChildItem...
The issue I seem to be having is that when Get-ChildItem is not recursive and -Include is present with only one filter, it is ignored! Or, I'm doing something wrong (likely).
I've cleaned up the output a little...
PS C:\foo> Get-childitem -path "c:\foo"
Name
----
bar1.doc
bar2.doc
bar3.doc
foo1.txt
foo2.txt
foo3.txt
PS C:\foo> Get-childitem -path "c:\foo" -Include *.txt
PS C:\foo> Get-childitem -path "c:\foo" -Include *.txt -recurse
Name
----
foo1.txt
foo2.txt
foo3.txt
Sooo??? I had a fantasy where all I had to do was branch to a path of the script that did not have the recurse switch. (By the way, is it possible to variably apply parameters so as to avoid duplicated code paths where the only variability is the parameters to a cmdlet?)
Anyway, here is my script for completeness, in addition to my issue with Get-ChildItem.
function MoveFiles()
{
Get-ChildItem -Path $source -Recurse -Include $ext | where { $_.LastWriteTime -lt (Get-Date).AddDays(-$days) } | foreach {
$SourceDirectory = $_.DirectoryName;
$SourceFile = $_.FullName;
$DestinationDirectory = $SourceDirectory -replace [regex]::Escape($source), $dest;
$DestionationFile = $SourceFile -replace [regex]::Escape($source), $dest;
if ($WhatIf){
#Write-Host $SourceDirectory;
#Write-Host $DestinationDirectory;
Write-Host $SourceFile -NoNewline
Write-Host " moved to " -NoNewline
Write-Host $DestionationFile;
}
else{
if ($DestinationDirectory)
{
if ( -not [System.IO.Directory]::Exists($DestinationDirectory)) {
[void](New-Item $DestinationDirectory -ItemType directory -Force);
}
Move-Item -Path $SourceFile -Destination $DestionationFile -Force;
}
}
}
}
The answer is in the full description of the command (get-help get-childitem -full):
The Include parameter is effective only when the command includes the Recurse parameter or the path leads to the contents of a directory, such as C:\Windows\*, where the wildcard character specifies the contents of the C:\Windows directory.
So the following would work without -Recurse:
PS C:\foo> Get-childitem -path "c:\foo\*" -Include *.txt
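As for the aside in the question about variably applying parameters to avoid duplicated code paths: parameter splatting covers that. A minimal sketch, reusing $source and $ext from the script above ($useRecurse is a hypothetical flag):
# Build the parameter set once, add -Recurse only when wanted, then splat it
$gciParams = @{ Path = "$source\*"; Include = $ext }
if ($useRecurse) { $gciParams['Recurse'] = $true }
Get-ChildItem @gciParams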
This is expected behaviour, but admittedly confusing. From the Get-ChildItem help file:
-Include <string[]>
Retrieves only the specified items. The value of this parameter qualifies the Path parameter. Enter a path element or pattern, such as "*.txt". Wildcards are permitted.
The Include parameter is effective only when the command includes the Recurse parameter or the path leads to the contents of a directory, such as C:\Windows\*, where the wildcard character specifies the contents of the C:\Windows directory.
ps> help dir -full | more
Hope this helps,
-Oisin
I can't tell you the exact why of it (but I will keep looking), but the behavior is documented in the Get-Help for Get-ChildItem:
-Include <string[]>
Retrieves only the specified items. The value of this parameter qualifies the Path parameter. Enter a path element or pattern, such as "*.txt". Wildcards are permitted.
The Include parameter is effective only when the command includes the Recurse parameter or the path leads to the contents of a directory, such as C:\Windows\*, where the wildcard character specifies the contents of the C:\Windows directory.
I have the following snippet of PowerShell script:
$source = 'd:\t1\*'
$dest = 'd:\t2'
$exclude = @('*.pdb','*.config')
Copy-Item $source $dest -Recurse -Force -Exclude $exclude
Which works to copy all files and folders from t1 to t2, but it only excludes the exclude list in the "root"/"first-level" folder and not in sub-folders.
How do I make it exclude the exclude list in all folders?
I think the best way is to use Get-ChildItem and pipe it into the Copy-Item command.
I found that this worked:
$source = 'd:\t1'
$dest = 'd:\t2'
$exclude = @('*.pdb','*.config')
Get-ChildItem $source -Recurse -Exclude $exclude | Copy-Item -Destination {Join-Path $dest $_.FullName.Substring($source.length)}
Basically, what is happening here is that you're going through the valid files one by one, then copying them to the new path. The Join-Path statement at the end is so that the directory structure is also kept when copying over the files. That part takes the destination directory and joins it with the portion of each item's path below the source path.
I got the idea from here, and then modified it a bit to make it work for this example.
I hope it works!
I had this problem, too, and spent 20 minutes applying the solutions here, but kept having problems. So I chose to use robocopy - OK, it's not PowerShell, but it should be available everywhere PowerShell runs.
And it worked right out of the box:
robocopy $source $dest /S /XF <file patterns to exclude> /XD <directory patterns to exclude>
e.g.
robocopy $source $dest /S /XF *.csproj /XD obj Properties Controllers Models
Plus, it has tons of features, like resumable copy.
Docs here.
As comments format code badly, I'll post this as an answer, but it's just an addition to @landyman's answer.
The proposed script has a drawback - it will create double-nested folders. For example, for 'd:\t1\sub1' it will create an empty directory 'd:\t2\sub1\sub1'. That's because, for directories, Copy-Item expects the parent directory name in the -Destination parameter, not the directory name itself.
Here's a workaround I found:
Get-ChildItem -Path $from -Recurse -Exclude $exclude | Copy-Item -Force -Destination {
if ($_.GetType() -eq [System.IO.FileInfo]) {
Join-Path $to $_.FullName.Substring($from.length)
} else {
Join-Path $to $_.Parent.FullName.Substring($from.length)
}
}
Note that the syntax spec calls for a STRING ARRAY, i.e. String[]:
SYNTAX
Copy-Item [[-Destination] <String>] [-Confirm] [-Container] [-Credential <PSCredential>] [-Exclude <String[]>] [-Filter <String>] [-Force] [-FromSession <PSSession>] [-Include
<String[]>] -LiteralPath <String[]> [-PassThru] [-Recurse] [-ToSession <PSSession>] [-UseTransaction] [-WhatIf] [<CommonParameters>]
If you're not explicit in your array generation, you end up with an Object[] - and that is ignored in many cases, leaving the appearance of "buggy behavior" because of type-safety. Since PowerShell can process script-blocks, evaluation of other than a type-specific variable (so that a valid string could be determined) would leave an opening for the potential of an injection mode attack on any system whose execution policy were lax.
So this is unreliable:
PS > $omissions = @("*.iso","*.pdf","*.zip","*.msi")
PS > $omissions.GetType()
Note the result....
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True Object[] System.Array
And this works.... for example:
PS > $omissions = [string[]]@("*.iso","*.pdf","*.zip","*.msi")
or
PS > [string[]]$omissions = ("*.iso,*.pdf,*.zip,*.msi").split(',')
PS > $omissions.GetType()
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True String[] System.Array
Note that even a "single" element would still require the same cast, so as to create a 1-element array.
If you're trying this at home, be sure to use Remove-Variable omissions to clear out $omissions before recasting it in the examples shown above.
And here is a pipeline that works reliably, as far as I've tested:
cd $sourcelocation
ls | ?{$_ -ne $null} | ?{$_.BaseName -notmatch "^\.$"} | %{$_.Name} | cp -Destination $targetDir -Exclude $omissions -recurse -ErrorAction silentlycontinue
The above does a directory listing of the source files in the base (selected "current") directory, filters out potential problem items, converts the file to the basename and forces cp (copy-item alias) to re-access the file "by name" in the "current directory" - thus reacquiring the file object, and copies it. This will create empty directories, including those that may even contain excluded files (less the exclusions of course). Note also that "ls" (get-childitem) does NOT -recurse - that is left to cp. Finally - if you're having problems and need to debug, remove the -ErrorAction silentlycontinue switch and argument, which hides a lot of nuisances that might interrupt the script otherwise.
For those whose comments were related to "\" inclusions, keep in mind that you're working over the .NET sub-layer via an interpreter (i.e. PowerShell). In C#, for example, the inclusion of a single "\" (or multiple singles in a string) results in the compiler demanding you correct the condition by using either "\\" to escape the backslash, or by preceding the string with an @ as in @"\"; the other remaining option is enclosing the string in single quotes, as '\'. All of this is because of the interpretation of escape sequences like "\n" etc.
The latter is a much bigger subject, so I'll leave you with that consideration.
The exclude parameter won't work with dirs. A variant of Bo's script does the trick:
$source = 'c:\tmp\foo'
$dest = 'c:\temp\foo'
$exclude = '\.bak'
Get-ChildItem $source -Recurse | where {$_.FullName -notmatch $exclude} |
Copy-Item -Destination {Join-Path $dest $_.FullName.Substring($source.length)}
I was looking for a way to copy files modified after a certain date/timestamp so as to archive them. This way I could save off exactly what files I worked on (assuming I know when I started). (Yes, I know this is what SCM is for, but there are times when I just want to snapshot my work without checking it in.)
Using landyman's tip, and stuff I found elsewhere, I found that this worked:
$source = 'c:\tmp\foo'
$dest = 'c:\temp\foo'
$exclude = @('*.pdb', '*.config')
Get-ChildItem $source -Recurse -Exclude $exclude |
where-object {$_.lastwritetime -gt "8/24/2011 10:26 pm"} |
Copy-Item -Destination {Join-Path $dest $_.FullName.Substring($source.length)}
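A sketch of the same filter with an explicit [datetime] value, which avoids relying on implicit string-to-date conversion (the cutoff is just the example's):
$cutoff = [datetime]'2011-08-24T22:26:00'
Get-ChildItem $source -Recurse -Exclude $exclude |
    Where-Object { $_.LastWriteTime -gt $cutoff } |
    Copy-Item -Destination { Join-Path $dest $_.FullName.Substring($source.Length) }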
Get-ChildItem with Join-Path was working mostly for me, but I realized it was copying root directories inside the other root directories, which was bad.
For example
c:\SomeFolder
c:\SomeFolder\CopyInHere
c:\SomeFolder\CopyInHere\Thing.txt
c:\SomeFolder\CopyInHere\SubFolder
c:\SomeFolder\CopyInHere\SubFolder\Thin2.txt
Source Directory: c:\SomeFolder\CopyInHere
Destination Directory: d:\PutItInHere
Goal:
Copy every child item inside c:\SomeFolder\CopyInHere to the root of d:\PutItInHere, but not c:\SomeFolder\CopyInHere itself.
- E.g. take all the children of CopyInHere and make them children of PutItInHere.
The above examples do this most of the way, but what happens is it creates a folder called SubFolder, and then creates another folder inside it also called SubFolder.
That's because Join-Path calculates a destination path of d:\PutItInHere\SubFolder for the SubFolder child item, so SubFolder gets created inside a folder called SubFolder.
I got around this by using Get-ChildItem to bring back a collection of the items, then using a loop to go through it.
Param(
[Parameter(Mandatory=$True,Position=1)][string]$sourceDirectory,
[Parameter(Mandatory=$True,Position=2)][string]$destinationDirectory
)
$sourceDI = [System.IO.DirectoryInfo]$sourceDirectory
$destinationDI = [System.IO.DirectoryInfo]$destinationDirectory
$itemsToCopy = Get-ChildItem $sourceDirectory -Recurse -Exclude @('*.cs', 'Views\Mimicry\*')
foreach ($item in $itemsToCopy){
$subPath = $item.FullName.Substring($sourceDI.FullName.Length)
$destination = Join-Path $destinationDirectory $subPath
if ($item -is [System.IO.DirectoryInfo]){
$itemDI = [System.IO.DirectoryInfo]$item
if ($itemDI.Parent.FullName.TrimEnd("\") -eq $sourceDI.FullName.TrimEnd("\")){
$destination = $destinationDI.FullName
}
}
$itemOutput = New-Object PSObject
$itemOutput | Add-Member -Type NoteProperty -Name Source -Value $item.FullName
$itemOutput | Add-Member -Type NoteProperty -Name Destination -Value $destination
$itemOutput | Format-List
Copy-Item -Path $item.FullName -Destination $destination -Force
}
What this does, in short, is use the current item's full name for the destination calculation. It then checks whether the item is a DirectoryInfo object. If it is, it checks whether its parent folder is the source directory; if so, the current folder being iterated is a direct child of the source directory, so we should not append its name to the destination directory, because we want that folder to be created in the destination directory itself, not inside a folder of the same name within the destination directory.
Following that, every other folder will work fine.
$sourcePath="I:\MSSQL\Backup\Full"
$excludedFiles=@("MASTER", "DBA", "MODEL", "MSDB")
$sourceFiles=(ls $sourcePath -recurse -file) | where-object { $_.directory.name -notin $excludedFiles }
This is what I did; I needed to copy out a bunch of backup files to a separate location on the network for client pickup. We didn't want them to have the system DB backups listed above.
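The snippet only builds $sourceFiles; a hedged sketch of the copy step it feeds into (the share path is hypothetical):
$clientPickup = '\\fileserver\clientpickup'   # hypothetical destination share
$sourceFiles | Copy-Item -Destination $clientPickup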
I had a similar problem extending this a bit. I want a solution working for sources like
$source = "D:\scripts\*.sql"
too. I found this solution:
function Copy-ToCreateFolder
{
param(
[string]$src,
[string]$dest,
$exclude,
[switch]$Recurse
)
# The problem with Copy-Item -Rec -Exclude is that -exclude effects only top-level files
# Copy-Item $src $dest -Exclude $exclude -EA silentlycontinue -Recurse:$recurse
# http://stackoverflow.com/questions/731752/exclude-list-in-powershell-copy-item-does-not-appear-to-be-working
if (Test-Path($src))
{
# Nonstandard: I create destination directories on the fly
[void](New-Item $dest -itemtype directory -EA silentlycontinue )
Get-ChildItem -Path $src -Force -exclude $exclude | % {
if ($_.psIsContainer)
{
if ($Recurse) # Non-standard: I don't want to copy empty directories
{
$sub = $_
$p = Split-path $sub
$currentfolder = Split-Path $sub -leaf
#Get-ChildItem $_ -rec -name -exclude $exclude -Force | % { "{0} {1}" -f $p, "$currentfolder\$_" }
[void](New-item $dest\$currentfolder -type directory -ea silentlycontinue)
Get-ChildItem $_ -Recurse:$Recurse -name -exclude $exclude -Force | % { Copy-item $sub\$_ $dest\$currentfolder\$_ }
}
}
else
{
#"{0} {1}" -f (split-path $_.fullname), (split-path $_.fullname -leaf)
Copy-Item $_ $dest
}
}
}
}
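A hedged usage example, echoing the wildcard source mentioned above (the destination path and exclude pattern are hypothetical):
Copy-ToCreateFolder -src 'D:\scripts\*.sql' -dest 'D:\archive\scripts' -exclude '*.bak' -Recurse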
The below snippet will copy all files and folders from $source to $dest, excluding .pdb and .config files from the root folder and sub-folders:
Get-ChildItem -Path $source | Copy-Item -Destination $dest -Recurse -Container -Exclude @('*.pdb','*.config')
One way of copying items from one folder to another using regular expressions for exclusion:
$source = '.\source'
$destination = '.\destination'
$exclude = '.*\.pdf$|.*\.mp4$|\\folder1(\\|$)|\\folder2(\\|$)'
$itemsToCopy = Get-ChildItem $source -Recurse |
Where-Object FullName -notmatch $exclude | Select-Object -Expand FullName
$sourceFullNameLength = (Get-Item $source).FullName.Length
foreach ($item in $itemsToCopy) {
$relativeName = $item.Substring($sourceFullNameLength + 1)
Copy-Item -Path $item -Destination "$destination\$relativeName"
}
I wrote this for daily use and packaged it in a script module; it maintains the whole directory structure and supports wildcards:
function Copy-Folder {
[CmdletBinding()]
param(
[Parameter(Mandatory)]
[String]$FromPath,
[Parameter(Mandatory)]
[String]$ToPath,
[string[]] $Exclude
)
if (Test-Path $FromPath -PathType Container) {
New-Item $ToPath -ItemType Directory -ErrorAction SilentlyContinue | Out-Null
Get-ChildItem $FromPath -Force | ForEach-Object {
# avoid the nested pipeline variable
$item = $_
$target_path = Join-Path $ToPath $item.Name
if (($Exclude | ForEach-Object { $item.Name -like $_ }) -notcontains $true) {
if (Test-Path $target_path) { Remove-Item $target_path -Recurse -Force }
Copy-Item $item.FullName $target_path
Copy-Folder -FromPath $item.FullName $target_path $Exclude
}
}
}
}
Just call it like Copy-Folder -FromPath 'fromDir' -ToPath 'destDir' -Exclude *.pdb,*.config
The -FromPath and -ToPath parameter names can be omitted:
Copy-Folder 'fromDir' 'destDir' -Exclude *.pdb,*.config