I am new to PowerShell and trying to learn a basic file move from one directory to another. My goal is to move files and folders that are over 18 months old to a cold-storage folder, run as a scheduled task. I need to be able to easily modify its directories to fit our needs. It needs to preserve the folder structure and only move files that fit the above parameters. I also need it to log everything it did, so if something is off I know where.
If I run this it just copies everything. If I comment out the %{Copy-Item... part, then it runs, lists only the items matching my parameters, and logs them. Where am I going wrong, or am I way off base?
Yes, it would be easy to use robocopy for this, but I want to use PowerShell and learn from it.
#Remove-Variable * -ErrorAction SilentlyContinue; Remove-Module *; $error.Clear();
#Clear-Host
#Days older than
$Days = "-485"
#Path Variables
$Sourcepath = "C:\Temp1"
$DestinationPath = "C:\Temp2"
#Logging
$Logfile = "c:\temp3\file_$((Get-Date).ToString('MM-dd-yyyy_hh-mm-ss')).log"
#transcript logs all outputs to txt file
Start-Transcript -Path $Logfile -Append
Get-ChildItem $Sourcepath -Force -Recurse |
Where-Object {$_.LastwriteTime -le (Get-Date).AddDays($Days)} |
% {Copy-Item -Path $Sourcepath -Destination $DestinationPath -Recurse -Force}
Stop-Transcript
Problem
Copy-Item -Path $Sourcepath -Destination $DestinationPath -Recurse -Force
You always specify the same path for source and destination. With the -Recurse parameter you will copy the whole directory $SourcePath once for each matching file.
Solution
You need to feed the output of the previous pipeline steps to Copy-Item by using the $_ (aka $PSItem) variable, basically using Copy-Item in single-item mode.
Try this (requires .NET >= 5.0 for GetRelativePath method):
Get-ChildItem $Sourcepath -File -Force -Recurse |
Where-Object {$_.LastWriteTime -le (Get-Date).AddDays($Days)} |
ForEach-Object {
$relativeSourceFilePath = [IO.Path]::GetRelativePath( $sourcePath, $_.Fullname )
$destinationFilePath = Join-Path $destinationPath $relativeSourceFilePath
$destinationSubDirPath = Split-Path $destinationFilePath -Parent
# Need to create sub directory when using Copy-Item in single-item mode
$null = New-Item $destinationSubDirPath -ItemType Directory -Force
# Copy one file
Copy-Item -Path $_ -Destination $destinationFilePath -Force
}
Alternative implementation without GetRelativePath (for .NET < 5.0):
Push-Location $Sourcepath # Base path to use for Get-ChildItem and Resolve-Path
try {
Get-ChildItem . -File -Force -Recurse |
Where-Object {$_.LastWriteTime -le (Get-Date).AddDays($Days)} |
ForEach-Object {
$relativeSourceFilePath = Resolve-Path $_.Fullname -Relative
$destinationFilePath = Join-Path $destinationPath $relativeSourceFilePath
$destinationSubDirPath = Split-Path $destinationFilePath -Parent
# Need to create sub directory when using Copy-Item in single-item mode
$null = New-Item $destinationSubDirPath -ItemType Directory -Force
# Copy one file
Copy-Item -Path $_ -Destination $destinationFilePath -Force
}
}
finally {
Pop-Location # restore previous location
}
On a side note, $Days = "-485" should be replaced by $Days = -485.
You currently create a string instead of a number and rely on PowerShell's ability to automagically convert string to number when "necessary". This doesn't always work though, so it's better to create a variable with the appropriate datatype in the first place.
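A quick sketch of the difference (reusing the $Days variable from the script above):

```powershell
$Days = "-485"             # a [string], because of the quotes
$Days.GetType().Name       # String
(Get-Date).AddDays($Days)  # happens to work, but only because PowerShell coerces the string

$Days = -485               # an [int]; no conversion needed anywhere it is used
$Days.GetType().Name       # Int32
```

The coercion succeeds for AddDays, but string-typed "numbers" behave surprisingly in comparisons ("-485" -gt "-9" compares character by character, not numerically), which is why declaring the right type up front is safer.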
I currently try to move some old cmd/batch scripts to PS, finally! Currently I have a script that finds all .bak-files and creates a 7ziped file along with it if it does not exist.
My current script looks like this:
for /R "E:\Backup" %%f in (*.bak) do (
echo %%f
if not exist "%%f.7z" (
7Z a "%%f.7z" "%%f"
)
)
So I’m trying to rewrite in PS.
This will give me the list of all .bak files, but how do I add .7z to the filename and test whether it exists (Test-Path)?
$path = "E:\Backup"
Get-ChildItem -Path $path -Recurse -Filter *.bak | Test-Path -Path %{$_.FullName} # Add .7z here?
And in the next step I’ll run the command to create the 7z-file:
7z.exe a "file.bak.7z" "file.bak"
Should I store all found files in an array and iterate the array, or should I use the ‘|’ to chain the commands?
Thanks for advice!
Couldn't test, but something like this should work:
Get-ChildItem -Path $path -Filter '*.bak' -File -Recurse |
Foreach-Object {
$zip = $_.FullName + '.7z'
If (!(Test-Path -Path $zip -PathType Leaf)) {
# create the zip file
7z.exe a $zip $_.FullName
}
}
Thanks @Theo,
I learned a lot here!
My final PS looks like this:
$path = "E:\Backup"
$7z = "C:\Program Files\7-Zip\7z.exe"
$filter = "*.bak"
# Compress files not already compressed.
Get-ChildItem -Path $path -Filter $filter -File -Recurse |
ForEach-Object {
$archive = $_.FullName + '.7z'
If (!(Test-Path -Path $archive -PathType Leaf)){
& $7z a "$archive" "$($_.FullName)"
}
}
I am trying to move all folders/subfolders/subfiles from one folder on a server to another using a powershell script, while keeping the same file structure throughout. To do this I am using Get-childItem | copy-item. All folders, subfolders, and subfiles are moving properly EXCEPT for the first folder in the directory. It instead outputs all of these subfiles/subfolders from this folder into the destination. It does however keep their structure, just does not include their parent directory. My code is as follows:
Get-ChildItem $sourceFilePath -Force | Copy-Item -Destination ("*file path*" -f $destinationServer, $destinationClient) -Recurse -Force
filepath is paraphrased to improve readability
$sourceFilePath is the source folder that I am trying to copy
$destination server / $destinationClient are variables used in the paraphrased "file path"
I cannot figure out why this code works for all other folders, subfolders, and files EXCEPT for this one single folder and its items. Any help would be greatly appreciated, and if there's any other information that would help please let me know.
Answer thanks to @Christian Müller:
New-Item $destinationFilePath -name "MissingFolder" -type directory -Force | Out-Null
Get-ChildItem $sourceFilePath -Force | Copy-Item -Destination ("*filepath*" -f $destinationServer, $destinationClient) -Recurse -Force
This is a really strange bug and it astonished me too! :)
It seems that when the destination base directory does not exist, it is created, but without its content.
So with this construction, you can even debug this behaviour in ISE:
$sourceFilePath = "c:\temp\Test1"
$destPath = "c:\temp\Test2"
$a=Get-ChildItem $sourceFilePath -Force
rm -Force $destPath
foreach($x in $a) {
$x | Copy-Item -Destination "$destPath" -Recurse -Force
}
dir $destPath
Creating the target directory first, resolving the issue, with New-item:
$sourceFilePath = "c:\temp\Test1"
$destPath = "c:\temp\Test2"
$a=Get-ChildItem $sourceFilePath -Force
rm -Force $destPath
New-Item -ItemType Directory -Force -Path $destPath
foreach($x in $a) {
$x | Copy-Item -Destination "$destPath" -Recurse -Force
}
dir $destPath
But for my example it would work, to not using "Get-ChildItem" at all but
Copy-Item c:\temp\test1 -Destination c:\temp\test2 -Recurse -Force
Would this also work for you?
Copy-Item $sourceFilePath -Destination ("*file path*" -f $destinationServer, $destinationClient) -Recurse -Force
I've got the following code snippet which currently removes everything in my temp directory and re-adds a new temp directory.
if($serverVersion.name -like "*2003*"){
$dir = "\\$server" + '\C$\WINDOWS\Temp\*'
remove-item $dir -force -recurse
if($?){new-item -path "\\$server\admin$\Temp" -Type Directory}
}
elseif($serverVersion.name -like "*2008*"){
$dir = "\\$server" + '\C$\Windows\Temp\*'
remove-item $dir -force -recurse
if($?){New-Item -Path "\\$server\admin$\Temp" -Type Directory}
}
I'm trying to slightly alter the code to where it will no longer delete the temp directory and instead simply remove all of the contents inside of temp. I added \* at the end of my $dir variable so that it tries to get all of the items inside of temp rather than deleting Temp itself. When I run this however I'm not deleting anything. What is wrong with my code?
This works for me, so long as you meet the pre-reqs and have full control over all files/folders under Temp
# Prerequisites
# Must have the PowerShell ActiveDirectory Module installed
# Must be an admin on the target servers
#
# however if you have no permissions to some folders inside the Temp,
# then you would need to take ownership first.
#
$Server = "server Name"
$dir = "\\$Server\admin$\Temp\*"
$OS = (Get-ADComputer $Server -Properties operatingsystem).operatingSystem
IF (($os -like "*2003*") -or ($os -like "*2008*"))
{
remove-item $dir -Recurse -force
}
According to the PowerShell help file for remove-item, the -recurse parameter is faulty. It recommends that you get-childitem and pipe to remove-item. See example from the help file below.
-------------------------- EXAMPLE 4 --------------------------
C:\PS>get-childitem * -include *.csv -recurse | remove-item
Figured out how to do this and figure it may be useful for someone in the future.
if($serverVersion.name -like "*2003*"){
$dir = "\\$server" + '\C$\WINDOWS\Temp'
Get-ChildItem -path $dir -Recurse | %{Remove-Item -Path $_.FullName -Force}
if($?){new-item -path "\\$server\admin$\Temp" -Type Directory}
}
elseif($serverVersion.name -like "*2008*"){
$dir = "\\$server" + '\C$\Windows\Temp'
Get-ChildItem -path $dir -Recurse | %{Remove-Item -Path $_.FullName -Force}
write-host "success?"
if($?){New-Item -Path "\\$server\admin$\Temp" -Type Directory}
}
Using get-childitem it will look at everything inside of Temp without deleting Temp itself.
I have the following snippet of PowerShell script:
$source = 'd:\t1\*'
$dest = 'd:\t2'
$exclude = @('*.pdb','*.config')
Copy-Item $source $dest -Recurse -Force -Exclude $exclude
Which works to copy all files and folders from t1 to t2, but it only excludes the exclude list in the "root"/"first-level" folder and not in sub-folders.
How do I make it exclude the exclude list in all folders?
I think the best way is to use Get-ChildItem and pipe in the Copy-Item command.
I found that this worked:
$source = 'd:\t1'
$dest = 'd:\t2'
$exclude = @('*.pdb','*.config')
Get-ChildItem $source -Recurse -Exclude $exclude | Copy-Item -Destination {Join-Path $dest $_.FullName.Substring($source.length)}
Basically, what is happening here is that you're going through the valid files one by one, then copying them to the new path. The 'Join-Path' statement at the end is so that the directories are also kept when copying over the files. That part takes the destination directory and joins it with the directory after the source path.
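To make the path arithmetic concrete, here is a small walk-through with a made-up file path (the names are hypothetical, purely for illustration):

```powershell
$source = 'd:\t1'
$dest   = 'd:\t2'
$full   = 'd:\t1\sub\a.txt'    # a file Get-ChildItem might return

# Strip the source prefix to get the relative part
$relative = $full.Substring($source.Length)   # '\sub\a.txt'

# Re-root it under the destination
Join-Path $dest $relative                     # 'd:\t2\sub\a.txt'
```

One caveat: $source must not end with a trailing backslash, or the Substring math is off by one character and the joined path comes out wrong.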
I got the idea from here, and then modified it a bit to make it work for this example.
I hope it works!
I had this problem too, and spent 20 minutes applying the solutions here, but kept having problems. So I chose to use robocopy - OK, it's not PowerShell, but it should be available everywhere PowerShell runs.
And it worked right out of the box:
robocopy $source $dest /S /XF <file patterns to exclude> /XD <directory patterns to exclude>
e.g.
robocopy $source $dest /S /XF *.csproj /XD obj Properties Controllers Models
Plus, it has tons of features, like resumable copy.
Docs here.
As comments format code badly I'll post as answer, but it's just an addition to @landyman's answer.
The proposed script has a drawback - it will create double-nested folders. For example for 'd:\t1\sub1' it will create empty directory 'd:\t2\sub1\sub1'. That's due to the fact that Copy-Item for directories expects parent directory name in -Destination property not directory name itself.
Here's a workaround I found:
Get-ChildItem -Path $from -Recurse -Exclude $exclude | Copy-Item -Force -Destination {
if ($_.GetType() -eq [System.IO.FileInfo]) {
Join-Path $to $_.FullName.Substring($from.length)
} else {
Join-Path $to $_.Parent.FullName.Substring($from.length)
}
}
Note that the syntax spec calls for a STRING ARRAY, a la String[]:
SYNTAX
Copy-Item [[-Destination] <String>] [-Confirm] [-Container] [-Credential <PSCredential>] [-Exclude <String[]>] [-Filter <String>] [-Force] [-FromSession <PSSession>] [-Include
<String[]>] -LiteralPath <String[]> [-PassThru] [-Recurse] [-ToSession <PSSession>] [-UseTransaction] [-WhatIf] [<CommonParameters>]
If you're not explicit in your array generation, you end up with an Object[] - and that is ignored in many cases, leaving the appearance of "buggy behavior" because of type-safety. Since PowerShell can process script-blocks, evaluation of other than a type-specific variable (so that a valid string could be determined) would leave an opening for the potential of an injection mode attack on any system whose execution policy were lax.
So this is unreliable:
PS > $omissions = @("*.iso","*.pdf","*.zip","*.msi")
PS > $omissions.GetType()
Note the result....
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True Object[] System.Array
And this works.... for example:
PS > $omissions = [string[]]@("*.iso","*.pdf","*.zip","*.msi")
or
PS > [string[]]$omissions = ("*.iso,*.pdf,*.zip,*.msi").split(',')
PS > $omissions.GetType()
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True String[] System.Array
Note that even a "single" element would still require the same cast, so as to create a 1-element array.
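For example, casting a single pattern still produces a one-element String[] rather than a bare string:

```powershell
PS > $omissions = [string[]]'*.iso'
PS > $omissions.GetType()

IsPublic IsSerial Name                                     BaseType
-------- -------- ----                                     --------
True     True     String[]                                 System.Array

PS > $omissions.Count
1
```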
If you're trying this at home, be sure to use Remove-Variable "omissions" to clean out the existing $omissions before recasting it in the examples shown above.
And as far as a pipeline that works reliably that I've tested....
cd $sourcelocation
ls | ?{$_ -ne $null} | ?{$_.BaseName -notmatch "^\.$"} | %{$_.Name} | cp -Destination $targetDir -Exclude $omissions -recurse -ErrorAction silentlycontinue
The above does a directory listing of the source files in the base (selected "current") directory, filters out potential problem items, converts the file to the basename and forces cp (copy-item alias) to re-access the file "by name" in the "current directory" - thus reacquiring the file object, and copies it. This will create empty directories, including those that may even contain excluded files (less the exclusions of course). Note also that "ls" (get-childitem) does NOT -recurse - that is left to cp. Finally - if you're having problems and need to debug, remove the -ErrorAction silentlycontinue switch and argument, which hides a lot of nuisances that might interrupt the script otherwise.
For those whose comments were related to "\" inclusions, keep in mind that you're working over the .NET sub-layer via an interpreter (i.e. PowerShell), and in c# for example, the inclusion of a single "\" (or multiple singles in a string), results in the compiler demanding you correct the condition by using either "\\" to escape the backslash, or precede the string with an # as in #"\"; with the other remaining option being the enclosure of the string in single quotes, as '\'. All of this is because of ASCII interpolation of character combinations like "\n" etc.
The latter is a much bigger subject, so I'll leave you with that consideration.
The exclude parameter won't work with dirs. A variant of Bo's script does the trick:
$source = 'c:\tmp\foo'
$dest = 'c:\temp\foo'
$exclude = '\.bak'
Get-ChildItem $source -Recurse | where {$_.FullName -notmatch $exclude} |
Copy-Item -Destination {Join-Path $dest $_.FullName.Substring($source.length)}
I was looking for a way to copy files modified after a certain date/timestamp so as to archive them. This way I could save off exactly what files I worked on (assuming I know when I started). (Yes, I know this is what SCM is for, but there are times when I just want to snapshot my work without checking it in.)
Using landyman's tip, and stuff I found elsewhere, I found that this worked:
$source = 'c:\tmp\foo'
$dest = 'c:\temp\foo'
$exclude = @('*.pdb', '*.config')
Get-ChildItem $source -Recurse -Exclude $exclude |
where-object {$_.lastwritetime -gt "8/24/2011 10:26 pm"} |
Copy-Item -Destination {Join-Path $dest $_.FullName.Substring($source.length)}
Get-ChildItem with Join-Path was working mostly for me, but I realized it was copying root directories inside the other root directories, which was bad.
For example
c:\SomeFolder
c:\SomeFolder\CopyInHere
c:\SomeFolder\CopyInHere\Thing.txt
c:\SomeFolder\CopyInHere\SubFolder
c:\SomeFolder\CopyInHere\SubFolder\Thin2.txt
Source Directory: c:\SomeFolder\CopyInHere
Destination Directory: d:\PutItInHere
Goal:
Copy every childitem Inside c:\SomeFolder\CopyInHere to the root of d:\PutItInHere, but not including c:\SomeFolder\CopyInHere itself.
- E.g. Take all the children of CopyInHere and make them Children of PutItInHere
The above examples do most of this, but what happens is that a folder called SubFolder gets created, with another folder called SubFolder inside it.
That's because Join-Path calculates a destination path of d:\PutItInHere\SubFolder for the SubFolder child item, so SubFolder gets created in a folder called SubFolder.
I got around this by using Get-ChildItem to bring back a collection of the items, then using a loop to go through it.
Param(
[Parameter(Mandatory=$True,Position=1)][string]$sourceDirectory,
[Parameter(Mandatory=$True,Position=2)][string]$destinationDirectory
)
$sourceDI = [System.IO.DirectoryInfo]$sourceDirectory
$destinationDI = [System.IO.DirectoryInfo]$destinationDirectory
$itemsToCopy = Get-ChildItem $sourceDirectory -Recurse -Exclude @('*.cs', 'Views\Mimicry\*')
foreach ($item in $itemsToCopy){
$subPath = $item.FullName.Substring($sourceDI.FullName.Length)
$destination = Join-Path $destinationDirectory $subPath
if ($item -is [System.IO.DirectoryInfo]){
$itemDI = [System.IO.DirectoryInfo]$item
if ($itemDI.Parent.FullName.TrimEnd("\") -eq $sourceDI.FullName.TrimEnd("\")){
$destination = $destinationDI.FullName
}
}
$itemOutput = New-Object PSObject
$itemOutput | Add-Member -Type NoteProperty -Name Source -Value $item.FullName
$itemOutput | Add-Member -Type NoteProperty -Name Destination -Value $destination
$itemOutput | Format-List
Copy-Item -Path $item.FullName -Destination $destination -Force
}
What this does, in short, is use the current item's full name for the destination calculation. It then checks whether the item is a DirectoryInfo object. If it is, it checks whether its parent folder is the source directory; that means the folder being iterated is a direct child of the source directory, so we should not append its name to the destination path, because we want that folder created in the destination directory itself, not inside a folder of its own name in the destination directory.
Following that, every other folder will work fine.
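Since the code above starts with a Param block, it runs as a standalone script. Assuming it were saved as, say, Copy-Children.ps1 (the file name is made up here), invoking it would look like:

```powershell
.\Copy-Children.ps1 -sourceDirectory 'c:\SomeFolder\CopyInHere' -destinationDirectory 'd:\PutItInHere'
```

Both parameters are mandatory and positional, so the names could also be dropped.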
$sourcePath="I:\MSSQL\Backup\Full"
$excludedFiles=@("MASTER", "DBA", "MODEL", "MSDB")
$sourceFiles=(ls $sourcePath -recurse -file) | where-object { $_.directory.name -notin $excludedFiles }
This is what I did: I needed to copy out a bunch of backup files to a separate location on the network for client pickup, and we didn't want the clients to have the above system DB backups.
I had a similar problem extending this a bit. I want a solution working for sources like
$source = "D:\scripts\*.sql"
too. I found this solution:
function Copy-ToCreateFolder
{
param(
[string]$src,
[string]$dest,
$exclude,
[switch]$Recurse
)
# The problem with Copy-Item -Rec -Exclude is that -Exclude affects only top-level files
# Copy-Item $src $dest -Exclude $exclude -EA silentlycontinue -Recurse:$recurse
# http://stackoverflow.com/questions/731752/exclude-list-in-powershell-copy-item-does-not-appear-to-be-working
if (Test-Path($src))
{
# Nonstandard: I create destination directories on the fly
[void](New-Item $dest -itemtype directory -EA silentlycontinue )
Get-ChildItem -Path $src -Force -exclude $exclude | % {
if ($_.psIsContainer)
{
if ($Recurse) # Non-standard: I don't want to copy empty directories
{
$sub = $_
$p = Split-path $sub
$currentfolder = Split-Path $sub -leaf
#Get-ChildItem $_ -rec -name -exclude $exclude -Force | % { "{0} {1}" -f $p, "$currentfolder\$_" }
[void](New-item $dest\$currentfolder -type directory -ea silentlycontinue)
Get-ChildItem $_ -Recurse:$Recurse -name -exclude $exclude -Force | % { Copy-item $sub\$_ $dest\$currentfolder\$_ }
}
}
else
{
#"{0} {1}" -f (split-path $_.fullname), (split-path $_.fullname -leaf)
Copy-Item $_ $dest
}
}
}
}
The below snippet will copy all files and folders from $source to $dest, excluding .pdb and .config files from the root folder and sub-folders:
Get-ChildItem -Path $source | Copy-Item -Destination $dest -Recurse -Container -Exclude #('*.pdb','*.config')
One way of copying items from one folder to another using regular expressions for exclusion:
$source = '.\source'
$destination = '.\destination'
$exclude = '.*\.pdf$|.*\.mp4$|\\folder1(\\|$)|\\folder2(\\|$)'
$itemsToCopy = Get-ChildItem $source -Recurse |
Where-Object FullName -notmatch $exclude | Select-Object -Expand FullName
$sourceFullNameLength = (Get-Item $source).FullName.Length
foreach ($item in $itemsToCopy) {
$relativeName = $item.Substring($sourceFullNameLength + 1)
Copy-Item -Path $item -Destination "$destination\$relativeName"
}
I wrote this for daily use and packaged it in a script module; it maintains the whole directory structure and supports wildcards:
function Copy-Folder {
[CmdletBinding()]
param(
[Parameter(Mandatory)]
[String]$FromPath,
[Parameter(Mandatory)]
[String]$ToPath,
[string[]] $Exclude
)
if (Test-Path $FromPath -PathType Container) {
New-Item $ToPath -ItemType Directory -ErrorAction SilentlyContinue | Out-Null
Get-ChildItem $FromPath -Force | ForEach-Object {
# avoid the nested pipeline variable
$item = $_
$target_path = Join-Path $ToPath $item.Name
if (($Exclude | ForEach-Object { $item.Name -like $_ }) -notcontains $true) {
if (Test-Path $target_path) { Remove-Item $target_path -Recurse -Force }
Copy-Item $item.FullName $target_path
Copy-Folder -FromPath $item.FullName $target_path $Exclude
}
}
}
}
Just call Copy-Folder -FromPath 'fromDir' -ToPath 'destDir' -Exclude *.pdb,*.config
The -FromPath and -ToPath parameter names can be omitted:
Copy-Folder 'fromDir' 'destDir' -Exclude *.pdb,*.config