I am trying to include the directory name as part of a filename. The problem is that I want to do this for each file in a different, separate folder, and that does not seem to work.
What I have so far: I am able to get a list of subfolders ($GetAllActionTargetSubFolders) that each contain one or more files. In each of the subfolders, the files are combined into one file named 'temp'. What does not work is the next step: I want to rename this combined 'temp' file in each of the subfolders and include the subfolder name, like this: FoldernameA_Consolidated202209161304.rpt (instead of temp.rpt)
I thought that the '$_.Directory.Name' would give me the directory name, but then I get this error message:
Rename-Item : Could not find a part of the path.
At line:5 char:125
+ ... de *temp* | Rename-Item -NewName {$_.Directory.Name + "_Consolidated ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (\\networkdrive\R...ctions\temp.rpt:String) [Rename-Item], DirectoryNotFoundException
+ FullyQualifiedErrorId : RenameItemIOError,Microsoft.PowerShell.Commands.RenameItemCommand
This is the script that I have thus far:
#get a list of all sub directories:
$ActionTargetFolder = "\\networkdrive\2_ActionData_Prep\"
$GetAllActionTargetSubFolders = Get-ChildItem -Path $ActionTargetFolder -Recurse | Where-Object { $_.PSIsContainer -eq $true}
#for each sub directory, make 1 file that contains all lines from the files that are in that specific sub directory:
ForEach ($foldername in $GetAllActionTargetSubFolders.FullName) {Get-ChildItem $foldername -Recurse -File -Include *.rpt | get-content | sort | get-unique | Out-File -FilePath "$($foldername)\temp.rpt" -Force }
#rename the 'temp' file that is created and include the sub-directory name, text and date/time:
ForEach ($foldername in $GetAllActionTargetSubFolders.FullName) {Get-ChildItem $foldername -Recurse -File -Include *temp* | Rename-Item -NewName {$_.Directory.Name + "_Consolidated" + $((Get-Date).ToString('yyyyMMddhhmm')) + ".rpt"}}
I hope someone can help me with this.
As Cid already commented, there is no need to create a file named temp.rpt first and rename it afterwards.
By naming the file the way you want it straight away, you don't need that second loop.
Also, when you use Get-ChildItem and want to filter for just one extension, use -Filter instead of -Include, because it is a lot faster.
Try:
# get a list of all sub directories:
$ActionTargetFolder = "\\networkdrive\2_ActionData_Prep\"
$GetAllActionTargetSubFolders = Get-ChildItem -Path $ActionTargetFolder -Directory -Recurse
# for each sub directory, make 1 file that contains all lines from the files that are in that specific sub directory:
foreach ($folder in $GetAllActionTargetSubFolders) {
    # create the full path and filename for the output
    $outFile = Join-Path -Path $folder.FullName -ChildPath ('{0}_Consolidated_{1:yyyyMMddHHmm}.rpt' -f $folder.Name, (Get-Date))
    $content = $folder | Get-ChildItem -File -Filter '*.rpt' | Get-Content
    $content | Sort-Object -Unique | Set-Content -Path $outFile
}
I have contents from GitHub in a variable and I want to export them to a file created automatically on my local machine.
I have tried to use
$FileContent | Out-File ('C:\Devjobs\clonefolder' + '\' + $repo.name + '\' + $srccontent.name)
It gives the error
Out-File : Could not find a part of the path 'C:\Devjobs\clonefolder\bct-common-devcomm-codegen-messages\BCT.Common.DevComm.CodeGen.Messages.sln'.
At line:1 char:18
+ ... lnContent | Out-File ('C:\Devjobs\clonefolder' + '\' + $repo.name + ' ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OpenError: (:) [Out-File], DirectoryNotFoundException
+ FullyQualifiedErrorId : FileOpenFailure,Microsoft.PowerShell.Commands.OutFileCommand
As stackprotector already commented, the error shows DirectoryNotFoundException, which means you are trying to create a file in a directory that does not yet exist.
To avoid that, first create the path for the output file, then create the file.
$pathOut = Join-Path -Path 'C:\Devjobs\clonefolder' -ChildPath $repo.name
# create the folder path if it does not exist already
$null = New-Item -Path $pathOut -ItemType Directory -Force
# now write the file
$FileContent | Set-Content -Path (Join-Path -Path $pathOut -ChildPath $srccontent.name)
By using the -Force switch on New-Item you will either create the directory, OR have a DirectoryInfo object returned if the folder already existed.
In this case, we have no further need for that object, so we discard it with $null =.
Beware that this only works like that on the file system; if you do the same on a registry key, you will lose all content of the existing key!
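If you ever need the same pattern on a registry path, a Test-Path guard avoids that risk; a minimal sketch, using a made-up HKCU:\Software\MyApp key:
# Sketch: only create the registry key when it does not exist yet,
# so -Force never gets the chance to recreate (and empty) an existing key
$key = 'HKCU:\Software\MyApp'    # made-up example key
if (-not (Test-Path -Path $key)) {
    $null = New-Item -Path $key
}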
Note: I use Set-Content rather than Out-File because on PowerShell versions up to and including 5.1, Out-File without using the -Encoding parameter will write the file in Unicode (UTF16-LE) encoding which may or may not be what you expect.
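If you prefer to stay with Out-File, passing -Encoding explicitly gives the same predictability; a minimal sketch reusing the variables from above:
# Sketch: Out-File with an explicit encoding (reuses $pathOut and $srccontent from above)
# Note: on Windows PowerShell 5.1, 'utf8' here means UTF-8 with a BOM
$FileContent | Out-File -FilePath (Join-Path -Path $pathOut -ChildPath $srccontent.name) -Encoding utf8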
Following your comment:
foreach ($srccontent in $srccontents) {
    if (<cond>) {
        $slnContent = <rest>
        $NewslnContent = "content"
        $pathOut = Join-Path -Path 'C:\Devjobs\clonefolder' -ChildPath $repo.name
        # first create the folder path if it does not exist already
        $null = New-Item -Path $pathOut -ItemType Directory -Force
        # now write the file
        $NewslnContent | Set-Content -Path (Join-Path -Path $pathOut -ChildPath $srccontent.name)
    }
}
Instead of string concatenation you may want to try Join-Path for cross-platform compatibility. That being said, if you are on a Windows machine this is not likely to be your issue.
You may want to use Test-Path to verify if the path and the file exists already.
$path = 'C:' |
    Join-Path -ChildPath 'Devjobs' |
    Join-Path -ChildPath 'clonefolder' |
    Join-Path -ChildPath $repo.name
$filepath = $path | Join-Path -ChildPath $srccontent.name
# create the directory if it does not already exist
If (-Not (Test-Path $path)) {
    New-Item -Type Directory -Path $path
}
# if a file with that name already exists, remove it before writing the new one
If (Test-Path $filepath) {
    Remove-Item -Path $filepath
}
$FileContent | Out-File $filepath
I have to create a script that searches for files, takes part of the folder name, and moves each file to a new location with that new name.
I am planning to use PowerShell for this but would be willing to look at other options. This will be used for millions of files.
Example 1
sourcefolder\a\b\test_123456\example.txt -> \destinationfolder\example_123456.txt
The problem is I don't know how many folders deep the file is, and the folder name varies; I need everything after the last _
Example 2
sourcefolder\a\b\c\test_test_1234\example.txt -> \destinationfolder\example_1234.txt
I am researching how to script this and will update the question when I have some progress.
FileInfo objects include many properties. One of these is the .Directory property which returns the directory (as DirectoryInfo object) that represents the parent folder the file is in. This Directory also has properties, like .Name.
You can use this like below:
$sourceFolder = 'D:\Test' # the root folder to search through
$destinationFolder = 'X:\Archive' # the destinationpath for the moved files
# make sure the destination folder exists
$null = New-Item -Path $destinationFolder -ItemType Directory -Force
# get a collection of FileInfo objects
# if you need more file extensions like both .txt and .log files, replace -Filter '*.txt' with -Include '*.txt', '*.log'
# this will be slower than using -Filter though..
$filesToMove = Get-ChildItem -Path $sourceFolder -File -Filter '*.txt' -Recurse | Where-Object {$_.Directory.Name -like '*_*'}
# using a foreach(..) is a bit faster than 'ForEach-Object'
foreach ($file in $filesToMove) {
    # get the last part after '_' of the parent directory name
    $suffix = ($file.Directory.Name -split '_')[-1]
    # combine to create the new path and filename
    $target = Join-Path -Path $destinationFolder -ChildPath ('{0}_{1}{2}' -f $file.BaseName, $suffix, $file.Extension)
    $file | Move-Item -Destination $target -Force -WhatIf
}
Take off the -WhatIf safety switch once you are satisfied that what is displayed on screen about what would be moved is correct.
You don't even need the foreach loop, because Move-Item can take a script block as the -Destination parameter, like this:
$sourceFolder = 'D:\Test' # the root folder to search through
$destinationFolder = 'X:\Archive' # the destinationpath for the moved files
# make sure the destination folder exists
$null = New-Item -Path $destinationFolder -ItemType Directory -Force
# get a collection of FileInfo objects
# if you need more file extensions like both .txt and .log files, replace -Filter '*.txt' with -Include '*.txt', '*.log'
# this will be slower than using -Filter though..
Get-ChildItem -Path $sourceFolder -File -Filter '*.txt' -Recurse |
    Where-Object {$_.Directory.Name -like '*_*'} |
    Move-Item -Destination {
        $suffix = ($_.Directory.Name -split '_')[-1]
        Join-Path -Path $destinationFolder -ChildPath ('{0}_{1}{2}' -f $_.BaseName, $suffix, $_.Extension)
    } -Force
Here, the $_ Automatic variable is used instead of a variable you define in a foreach loop.
P.S. If you only need files from subfolders with a name ending in _ followed by numbers only as opposed to names like sub_folder, change the Where-Object {...} clause in the code to
Where-Object {$_.Directory.Name -match '_\d+$'}
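To see what that pattern matches before committing to the move, a quick sketch with made-up folder names:
# Quick check of the regex against sample folder names (made-up examples)
'test_123456', 'test_test_1234', 'sub_folder' | ForEach-Object {
    '{0} -> {1}' -f $_, ($_ -match '_\d+$')    # True, True, False
}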
I am trying to write a PowerShell script that creates a backup folder on the same path where the application exists, and copies the folders & files into that backup folder before deploying. Below is the command I was using, but I am getting an error.
$Source = "C:\XYZ"
$BackupFolder = New-Item -ItemType Directory -Force -Path $source_$(Get-Date)
Copy-Item -Path $Source\* $BackupFolder -Force
Error: Cannot copy item C:\XYZ\Backup_18-02-2017 on to itself
Try:
Copy-Item $Source\* $BackupFolder -Exclude $BackupFolder
That will eliminate the folder that you are copying into as a source that is being copied from.
Variables can contain underscores. The following works and displays the string "asdf"
$a_ = "adsf"; $a_
Your New-Item cmdlet call should have failed, since $source_ is not a defined variable and would expand to nothing. This is default behavior for PowerShell. When I run your code as-is I get the following:
New-Item : Cannot find drive. A drive with the name '02/18/2017 22' does not exist.
At line:1 char:1
+ New-Item -ItemType Directory -Force -Path "$source_$(Get-Date)" -what ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (02/18/2017 22:String) [New-Item], DriveNotFoundException
+ FullyQualifiedErrorId : DriveNotFound,Microsoft.PowerShell.Commands.NewItemCommand
So I would have expected your folder variable to be null. wOxxOm brings this up in a comment as well.
Here are several options to address what I am sure is at least part of the source of your issue:
$BackupFolder = New-Item -ItemType Directory -Force -Path "$source`_$(Get-Date)"
$BackupFolder = New-Item -ItemType Directory -Force -Path "$($source)_$(Get-Date)"
$BackupFolder = New-Item -ItemType Directory -Force -Path ("{0}_{1}" -f $source, (Get-Date))
You will still have to exclude this folder from the copy as well, as Keith Hill's answer is telling you:
Copy-Item $Source\* $BackupFolder -Exclude $BackupFolder
Try something like this:
$Source = "C:\XYZ"
$Destination="{0}{1:yyyyMMdd}" -f $source, (Get-Date)
New-Item -ItemType Directory -Force -Path $Destination
Copy-Item -Path $Source\* $Destination -Recurse -Force
If I understand the question correctly, you want to take "C:\XYZ" and back it up into the same directory, as "C:\XYZ\backup_$DATE". Without excluding the backup folder, what you will actually do is create a loop that only breaks once it reaches the 248-character path limit. If we use the -Exclude option, then we can exclude the backup directory "C:\XYZ\backup_$DATE".
This function will do the trick and also gives you error handling.
Function Get-CopyDirectory {
    #####################
    # Dynamic Variables #
    #####################
    $Date = Get-Date -format ddMM-yyyy
    $Exclude = "Backup*"
    ####################
    # Static Variables #
    ####################
    $AppPath = "F:\Test\"
    $BackupPath = "$AppPath\BACKUP_$Date\"
    if (Test-Path $BackupPath) {
        Write-Host "Backup Exist" -f Cyan
    }
    else {
        Copy-Item "$AppPath\*" $BackupPath -Exclude $Exclude -Recurse -Verbose
    }
}
CLS
Get-CopyDirectory
I have the following snippet of PowerShell script:
$source = 'd:\t1\*'
$dest = 'd:\t2'
$exclude = @('*.pdb','*.config')
Copy-Item $source $dest -Recurse -Force -Exclude $exclude
Which works to copy all files and folders from t1 to t2, but it only excludes the exclude list in the "root"/"first-level" folder and not in sub-folders.
How do I make it exclude the exclude list in all folders?
I think the best way is to use Get-ChildItem and pipe in the Copy-Item command.
I found that this worked:
$source = 'd:\t1'
$dest = 'd:\t2'
$exclude = @('*.pdb','*.config')
Get-ChildItem $source -Recurse -Exclude $exclude | Copy-Item -Destination {Join-Path $dest $_.FullName.Substring($source.length)}
Basically, what is happening here is that you're going through the valid files one by one, then copying them to the new path. The 'Join-Path' statement at the end is so that the directories are also kept when copying over the files. That part takes the destination directory and joins it with the directory after the source path.
I got the idea from here, and then modified it a bit to make it work for this example.
I hope it works!
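To make the path arithmetic of that Substring/Join-Path combination concrete, here is a small sketch with made-up paths:
# Sketch with made-up paths: how the destination is computed for one item
$source = 'd:\t1'
$dest   = 'd:\t2'
$full   = 'd:\t1\sub\app.dll'                        # FullName of one item found by Get-ChildItem
$full.Substring($source.Length)                      # -> \sub\app.dll
Join-Path $dest $full.Substring($source.Length)      # -> d:\t2\sub\app.dll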
I had this problem too, and spent 20 minutes applying the solutions here, but kept having problems. So I chose to use robocopy - OK, it's not PowerShell, but it should be available everywhere PowerShell runs.
And it worked right out of the box:
robocopy $source $dest /S /XF <file patterns to exclude> /XD <directory patterns to exclude>
e.g.
robocopy $source $dest /S /XF *.csproj /XD obj Properties Controllers Models
Plus, it has tons of features, like resumable copy.
Docs here.
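When calling it from a script, it can help to check robocopy's exit code: values below 8 mean success (with varying detail), 8 or higher means something failed. A hedged sketch reusing the example above:
# Sketch: run robocopy and check its exit code (below 8 = success, 8+ = failure)
robocopy $source $dest /S /XF *.csproj /XD obj Properties Controllers Models
if ($LASTEXITCODE -ge 8) {
    Write-Warning "robocopy reported errors (exit code $LASTEXITCODE)"
}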
As comments format code badly, I'll post this as an answer, but it's just an addition to landyman's answer.
The proposed script has a drawback - it will create double-nested folders. For example, for 'd:\t1\sub1' it will create an empty directory 'd:\t2\sub1\sub1'. That is because Copy-Item for directories expects the parent directory name in the -Destination parameter, not the directory name itself.
Here's a workaround I found:
Get-ChildItem -Path $from -Recurse -Exclude $exclude | Copy-Item -Force -Destination {
    if ($_.GetType() -eq [System.IO.FileInfo]) {
        Join-Path $to $_.FullName.Substring($from.length)
    } else {
        Join-Path $to $_.Parent.FullName.Substring($from.length)
    }
}
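For completeness, a hedged example of the variables the snippet expects (the paths are placeholders):
# Placeholder setup for the workaround above
$from    = 'd:\t1'
$to      = 'd:\t2'
$exclude = @('*.pdb', '*.config')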
Note that the syntax spec calls for a STRING ARRAY, i.e. String[]:
SYNTAX
Copy-Item [[-Destination] <String>] [-Confirm] [-Container] [-Credential <PSCredential>] [-Exclude <String[]>] [-Filter <String>] [-Force] [-FromSession <PSSession>] [-Include
<String[]>] -LiteralPath <String[]> [-PassThru] [-Recurse] [-ToSession <PSSession>] [-UseTransaction] [-WhatIf] [<CommonParameters>]
If you're not explicit in your array generation, you end up with an Object[] - and that is ignored in many cases, which looks like buggy behavior but is really type-safety at work. Since PowerShell can process script blocks, evaluating anything other than a type-specific variable (so that a valid string could be determined) would leave an opening for a potential injection attack on any system whose execution policy was lax.
So this is unreliable:
PS > $omissions = @("*.iso","*.pdf","*.zip","*.msi")
PS > $omissions.GetType()
Note the result....
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True Object[] System.Array
And this works.... for example:
PS > $omissions = [string[]]@("*.iso","*.pdf","*.zip","*.msi")
or
PS > [string[]]$omissions = ("*.iso,*.pdf,*.zip,*.msi").split(',')
PS > $omissions.GetType()
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True String[] System.Array
Note that even a "single" element would still require the same cast, so as to create a 1-element array.
If you're trying this at home, be sure to use Remove-Variable omissions to clean out the existing $omissions variable before recasting it in the examples shown above.
And as for a pipeline that works reliably, this is what I've tested:
cd $sourcelocation
ls | ?{$_ -ne $null} | ?{$_.BaseName -notmatch "^\.$"} | %{$_.Name} | cp -Destination $targetDir -Exclude $omissions -recurse -ErrorAction silentlycontinue
The above does a directory listing of the source files in the base (selected "current") directory, filters out potential problem items, converts each item to its name, and forces cp (the Copy-Item alias) to re-access the file "by name" in the "current directory" - thus reacquiring the file object - and copies it. This will create empty directories, including those that may even contain excluded files (less the exclusions, of course). Note also that "ls" (Get-ChildItem) does NOT -Recurse - that is left to cp. Finally, if you're having problems and need to debug, remove the -ErrorAction silentlycontinue switch and argument, which otherwise hides a lot of nuisances that might interrupt the script.
For those whose comments were related to "\" inclusions, keep in mind that you're working over the .NET sub-layer via an interpreter (i.e. PowerShell). In C#, for example, including a single "\" (or multiple single backslashes in a string) results in the compiler demanding you correct it, either by using "\\" to escape the backslash or by preceding the string with an @ as in @"\"; the other remaining option is to enclose the string in single quotes, as '\'. All of this is because of the interpretation of escape sequences like "\n" etc.
The latter is a much bigger subject, so I'll leave you with that consideration.
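On the PowerShell side the quoting rules are simpler; a short sketch for comparison:
# In PowerShell the backslash is not an escape character; the backtick (`) is.
# Both of these therefore contain literal backslashes:
'c:\tmp\foo'     # single quotes: everything literal
"c:\tmp\foo"     # double quotes: variables and backticks expand, backslashes stay literal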
The exclude parameter won't work with dirs. A variant of Bo's script does the trick:
$source = 'c:\tmp\foo'
$dest = 'c:\temp\foo'
$exclude = '\.bak'
Get-ChildItem $source -Recurse | where {$_.FullName -notmatch $exclude} |
Copy-Item -Destination {Join-Path $dest $_.FullName.Substring($source.length)}
I was looking for a way to copy files modified after a certain date/timestamp so as to archive them. This way I could save off exactly what files I worked on (assuming I know when I started). (Yes, I know this is what SCM is for, but there are times when I just want to snapshot my work without checking it in.)
Using landyman's tip, and stuff I found elsewhere, I found that this worked:
$source = 'c:\tmp\foo'
$dest = 'c:\temp\foo'
$exclude = @('*.pdb', '*.config')
Get-ChildItem $source -Recurse -Exclude $exclude |
where-object {$_.lastwritetime -gt "8/24/2011 10:26 pm"} |
Copy-Item -Destination {Join-Path $dest $_.FullName.Substring($source.length)}
Get-ChildItem with Join-Path was working mostly for me, but I realized it was copying root directories inside the other root directories, which was bad.
For example
c:\SomeFolder
c:\SomeFolder\CopyInHere
c:\SomeFolder\CopyInHere\Thing.txt
c:\SomeFolder\CopyInHere\SubFolder
c:\SomeFolder\CopyInHere\SubFolder\Thin2.txt
Source Directory: c:\SomeFolder\CopyInHere
Destination Directory: d:\PutItInHere
Goal:
Copy every childitem Inside c:\SomeFolder\CopyInHere to the root of d:\PutItInHere, but not including c:\SomeFolder\CopyInHere itself.
- E.g. Take all the children of CopyInHere and make them Children of PutItInHere
The above examples do this most of the way, but what happens is they create a folder called SubFolder, and then create a folder called SubFolder inside it.
That's because Join-Path calculates a destination path of d:\PutItInHere\SubFolder for the SubFolder child item, so SubFolder gets created in a folder called SubFolder.
I got around this by using Get-ChildItem to bring back a collection of the items, then using a loop to go through it.
Param(
    [Parameter(Mandatory=$True,Position=1)][string]$sourceDirectory,
    [Parameter(Mandatory=$True,Position=2)][string]$destinationDirectory
)
$sourceDI = [System.IO.DirectoryInfo]$sourceDirectory
$destinationDI = [System.IO.DirectoryInfo]$destinationDirectory
$itemsToCopy = Get-ChildItem $sourceDirectory -Recurse -Exclude @('*.cs', 'Views\Mimicry\*')
foreach ($item in $itemsToCopy){
    $subPath = $item.FullName.Substring($sourceDI.FullName.Length)
    $destination = Join-Path $destinationDirectory $subPath
    if ($item -is [System.IO.DirectoryInfo]){
        $itemDI = [System.IO.DirectoryInfo]$item
        if ($itemDI.Parent.FullName.TrimEnd("\") -eq $sourceDI.FullName.TrimEnd("\")){
            $destination = $destinationDI.FullName
        }
    }
    $itemOutput = New-Object PSObject
    $itemOutput | Add-Member -Type NoteProperty -Name Source -Value $item.FullName
    $itemOutput | Add-Member -Type NoteProperty -Name Destination -Value $destination
    $itemOutput | Format-List
    Copy-Item -Path $item.FullName -Destination $destination -Force
}
What this does, in short, is use the current item's full name for the destination calculation. It then checks whether the item is a DirectoryInfo object. If it is, it checks whether its parent folder is the source directory; that would mean the folder being iterated is a direct child of the source directory, so we should not append its name to the destination directory, because we want that folder to be created in the destination directory itself, not inside a folder of the same name in the destination directory.
Following that, every other folder will work fine.
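Assuming the block above is saved as a script file (the name Copy-Children.ps1 is just an example), calling it would look like this:
# Hypothetical invocation, using the example folders from above
.\Copy-Children.ps1 -sourceDirectory 'c:\SomeFolder\CopyInHere' -destinationDirectory 'd:\PutItInHere'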
$sourcePath="I:\MSSQL\Backup\Full"
$excludedFiles=@("MASTER", "DBA", "MODEL", "MSDB")
$sourceFiles=(ls $sourcePath -recurse -file) | where-object { $_.directory.name -notin $excludedFiles }
This is what I did: I needed to copy out a bunch of backup files to a separate location on the network for client pickup, and we didn't want them to have the system DB backups listed above.
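To complete the picture, the filtered files can then be copied to the pickup location; a hedged sketch (the share path is made up):
# Sketch: copy the filtered backup files to the client pickup share (made-up path)
$pickupPath = '\\fileserver\ClientPickup'
$sourceFiles | Copy-Item -Destination $pickupPath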
I had a similar problem and extended this a bit. I wanted a solution that also works for sources like
$source = "D:\scripts\*.sql"
I found this solution:
function Copy-ToCreateFolder
{
    param(
        [string]$src,
        [string]$dest,
        $exclude,
        [switch]$Recurse
    )
    # The problem with Copy-Item -Rec -Exclude is that -Exclude affects only top-level files
    # Copy-Item $src $dest -Exclude $exclude -EA silentlycontinue -Recurse:$recurse
    # http://stackoverflow.com/questions/731752/exclude-list-in-powershell-copy-item-does-not-appear-to-be-working
    if (Test-Path($src))
    {
        # Nonstandard: I create destination directories on the fly
        [void](New-Item $dest -itemtype directory -EA silentlycontinue)
        Get-ChildItem -Path $src -Force -exclude $exclude | % {
            if ($_.psIsContainer)
            {
                if ($Recurse) # Non-standard: I don't want to copy empty directories
                {
                    $sub = $_
                    $p = Split-Path $sub
                    $currentfolder = Split-Path $sub -leaf
                    #Get-ChildItem $_ -rec -name -exclude $exclude -Force | % { "{0} {1}" -f $p, "$currentfolder\$_" }
                    [void](New-Item $dest\$currentfolder -type directory -ea silentlycontinue)
                    Get-ChildItem $_ -Recurse:$Recurse -name -exclude $exclude -Force | % { Copy-Item $sub\$_ $dest\$currentfolder\$_ }
                }
            }
            else
            {
                #"{0} {1}" -f (split-path $_.fullname), (split-path $_.fullname -leaf)
                Copy-Item $_ $dest
            }
        }
    }
}
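A hedged usage example, matching the wildcard source mentioned above (the destination and exclusion pattern are made up):
# Example call; paths and exclusion are illustrative only
Copy-ToCreateFolder -src 'D:\scripts\*.sql' -dest 'D:\backup\scripts' -exclude '*.tmp' -Recurse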
The below snippet will copy all files and folders from $source to $dest, excluding .pdb and .config files from the root folder and sub-folders:
Get-ChildItem -Path $source | Copy-Item -Destination $dest -Recurse -Container -Exclude @('*.pdb','*.config')
One way of copying items from one folder to another using regular expressions for exclusion:
$source = '.\source'
$destination = '.\destination'
$exclude = '.*\.pdf$|.*\.mp4$|\\folder1(\\|$)|\\folder2(\\|$)'
$itemsToCopy = Get-ChildItem $source -Recurse |
Where-Object FullName -notmatch $exclude | Select-Object -Expand FullName
$sourceFullNameLength = (Get-Item $source).FullName.Length
foreach ($item in $itemsToCopy) {
    $relativeName = $item.Substring($sourceFullNameLength + 1)
    Copy-Item -Path $item -Destination "$destination\$relativeName"
}
I wrote this for daily use and packaged it in a script module; it maintains the whole directory structure and supports wildcards:
function Copy-Folder {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [String]$FromPath,
        [Parameter(Mandatory)]
        [String]$ToPath,
        [string[]]$Exclude
    )
    if (Test-Path $FromPath -PathType Container) {
        New-Item $ToPath -ItemType Directory -ErrorAction SilentlyContinue | Out-Null
        Get-ChildItem $FromPath -Force | ForEach-Object {
            # avoid the nested pipeline variable
            $item = $_
            $target_path = Join-Path $ToPath $item.Name
            if (($Exclude | ForEach-Object { $item.Name -like $_ }) -notcontains $true) {
                if (Test-Path $target_path) { Remove-Item $target_path -Recurse -Force }
                Copy-Item $item.FullName $target_path
                Copy-Folder -FromPath $item.FullName $target_path $Exclude
            }
        }
    }
}
Just call Copy-Folder -FromPath 'fromDir' -ToPath 'destDir' -Exclude *.pdb,*.config
The -FromPath and -ToPath parameter names can also be omitted:
Copy-Folder 'fromDir' 'destDir' -Exclude *.pdb,*.config