I have a pretty basic PowerShell copy script that copies items from a source folder to a destination folder. However, this is moving far too much data, and I'd like to check whether the filename already exists in the destination so that file can be skipped. I don't need anything as complex as verifying created date, checksum, etc.
Currently it's along the lines of:
Copy-Item source destination -recurse
Copy-Item source2 destination2 -recurse
I'd imagine I need to add the Test-Path cmdlet, but I'm uncertain how to implement it.
You could always call ROBOCOPY from PowerShell for this.
Use the /xc (exclude changed), /xn (exclude newer), and /xo (exclude older) flags:
robocopy /xc /xn /xo source destination
This will ONLY copy those files that are not in the destination folder.
For more options, type robocopy /?
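For example, here is a minimal sketch of calling it from a PowerShell script; the folder paths are placeholders, and /e is added here to mirror the -Recurse behaviour from the question:
# Placeholder paths; substitute your own source and destination folders.
$source      = 'C:\SourceFolder'
$destination = 'C:\DestinationFolder'
# /e copies subdirectories (including empty ones); /xc /xn /xo skip files that
# already exist in the destination, whether changed, newer, or older.
robocopy $source $destination /e /xc /xn /xo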
$exclude = Get-ChildItem -recurse $dest
Copy-Item -Recurse $file $dest -Verbose -Exclude $exclude
While I agree that Robocopy is the best tool for something like this, I'm all for giving the customer what they asked for and it was an interesting PowerShell exercise.
This script should do just what you asked for: copy a file from the source to the destination only if it does not already exist in the destination, with a minimum of frills. Since you had the -Recurse option in your example, that took a bit more coding than simply testing for the filename in the destination folder.
$Source = "C:\SourceFolder"
$Destination = "C:\DestinationFolder"
Get-ChildItem $Source -Recurse | ForEach {
$ModifiedDestination = $($_.FullName).Replace("$Source","$Destination")
If ((Test-Path $ModifiedDestination) -eq $False) {
Copy-Item $_.FullName $ModifiedDestination
}
}
Building off of Wai Ha Lee's post, here's an example that worked for me:
$Source = "<your path here>"
$Dest = "<your path here>"
$Exclude = Get-ChildItem -recurse $Dest
Get-ChildItem $Source -Recurse -Filter "*.pdf" | Copy-Item -Destination $Dest -Verbose -Exclude $Exclude
This builds a list of existing files to exclude, then copies any PDF in the source directory and its sub-directories to a single destination folder, excluding the existing files. Again, this is an example from my needs, but similar to yours. It should be easy enough to tweak to your heart's content.
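For instance, if you want every file rather than just PDFs (closer to the original question), one possible tweak, assuming the same $Source, $Dest and $Exclude variables from above, is simply to drop the filter:
# Copy every file (not just PDFs) whose name is not already present in the destination
Get-ChildItem $Source -Recurse -File | Copy-Item -Destination $Dest -Verbose -Exclude $Exclude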
The Copy-IfNotPresent function accepts one file at a time, but it's easy to loop over all the files you want to copy. Here's an example:
gci c:\temp\1\*.* -Recurse -File | % { Copy-IfNotPresent -FilePath $_ -Destination "C:\temp\2\$(Resolve-Path $_ -relative)" -Verbose }
Here's the function. It will generate the folder tree if necessary. Here's the gist link: https://gist.github.com/pollusb/cd47b4afeda8edbf8943a8808c880eb8
Function Copy-IfNotPresent {
<#
    Copy a file only if it is not present at the destination.
    This is a one-file-at-a-time call. It's not meant to replace a complex tool like ROBOCOPY.
    Destination can be a file or a folder. If it's a folder, you can use -Container to force
    creation of the folder when it does not exist.
#>
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory)]
        $FilePath,
        [Parameter(Mandatory)]
        [string]$Destination,
        [switch]$Container,
        [switch]$WhatIf
    )

    #region validations
    if ($FilePath -isnot [System.IO.FileInfo]) {
        $File = Get-ChildItem $FilePath -File
    } else {
        $File = $FilePath
    }
    if (!$File.Count) {
        Write-Warning "$FilePath no file found."
        return
    } elseif ($File.Count -gt 1) {
        Write-Warning "$FilePath must resolve to one file only."
        return
    }
    #endregion

    # Destination is a folder
    if ($Container -or (Test-Path -Path $Destination -PathType Container)) {
        if (!(Test-Path $Destination)) {
            New-Item -Path $Destination -ItemType Container | Out-Null
        }
        $Destination += "\$($File.Name)"
    }

    # Destination is a file
    if (!(Test-Path $Destination)) {
        if ($WhatIf) {
            Write-Host "WhatIf:Copy-IfNotPresent $FilePath -> $Destination"
        } else {
            # Force creation of the parent folder
            $Parent = Split-Path $Destination -Parent
            if (!(Test-Path $Parent)) {
                New-Item $Parent -ItemType Container | Out-Null
            }
            Copy-Item -Path $FilePath -Destination $Destination
            Write-Verbose "Copy-IfNotPresent $FilePath -> $Destination (is absent) copying"
        }
    } else {
        Write-Verbose "Copy-IfNotPresent $Destination (is present) not copying"
    }
}
$source = "c:\source"
$destination = "c:\destination"
Create a list of files to exclude, i.e. files already existing in the destination.
$exclude = Get-Childitem -Recurse $destination | ForEach-Object { $_.FullName -replace [Regex]::Escape($destination), "" }
Recursively copy all contents from the source to the destination excluding the previously collected files.
Copy-Item -Recurse -Path (Join-Path $source "*") -Destination $destination -Exclude $exclude -Force -Verbose
(Join-Path $source "*") add a wildcard at end ensuring that you get the children of the source folder instead of the source folder itself.
Force is used because I don't mind that there are already existing folders (results in error messages). Use with caution.
ForEach-Object { $_.FullName -replace [Regex]::Escape($destination ), "" } transforms the existing file full names into values which can be used as Exclude parameter
Here is a recursive script that synchronizes two folders, ignoring existing files:
function Copy-FilesAndFolders([string]$folderFrom, [string]$folderTo) {
    $itemsFrom = Get-ChildItem $folderFrom
    foreach ($i in $itemsFrom)
    {
        if ($i.PSIsContainer)
        {
            # Recurse into sub-folders
            $subFolderFrom = $folderFrom + "\" + $i.BaseName
            $subFolderTo = $folderTo + "\" + $i.BaseName
            Copy-FilesAndFolders $subFolderFrom $subFolderTo | Out-Null
        }
        else
        {
            $from = $folderFrom + "\" + $i.Name
            $to = $folderTo + "\" + $i.Name
            if (!(Test-Path $to)) # only copy files that don't already exist in the destination
            {
                if (!(Test-Path $folderTo)) # if the destination folder doesn't exist, create it
                {
                    New-Item -ItemType "directory" -Path $folderTo
                }
                Copy-Item $from $folderTo
            }
        }
    }
}
To call it:
Copy-FilesAndFolders "C:\FromFolder" "C:\ToFolder"
Lots of great answers in here; here's my contribution, which relates to keeping an MP3 player in sync with a music library.
#Tom Hubbard, 10-19-2021
#Copy only new music to mp3 player, saves time by only copying items that don't exist on the destination.
#Leaving the hardcoded directories and paths in here, sometimes too much variable substitution is confusing for newer PS users.
#Gets all of the albums in the source directory such as your music library
$albumsInLibrary = gci -Directory -path "C:\users\tom\OneDrive\Music" | select -ExpandProperty Name
#Gets all of the albums of your destination folder, such as your mp3 player
$albumsOnPlayer = gci -Directory -Path "e:\" | select -ExpandProperty name
#For illustration, it will list the differences between the music library and the music player.
Compare-Object -DifferenceObject $albumsInLibrary -ReferenceObject $albumsOnPlayer
#Loop through each album in the library
foreach ($album in $albumsInLibrary)
{
    #Check to see if the music player contains this directory from the music library
    if ($albumsOnPlayer -notcontains $album)
    {
        #If the album doesn't exist on the music player, copy it and its child items from the library to the player
        Write-Host "$album is not on music player, copying to music player" -ForegroundColor Cyan
        Copy-Item -Path "C:\users\Tom\OneDrive\music\$album" -Recurse -Destination e:\$album
    }
}
Related
When I try to import a CSV and take a source filename/path and a destination folder ref, Copy-Item seems not to copy the file in question.
I have a folder full of files in C:\Dir1\Test\Files\ and I need to copy them to individual folders in C:\Dir1\Test, based on what is in the csv.
$SourceDir = 'C:\Dir1\Test\Files\'
$DestDir = 'C:\Dir1\Test\'
Import-Csv C:\Dir1\Test\FileList.csv | ForEach-Object {
    $Source = $SourceDir + $($_.'FilePath')
    $Dest = $DestDir + "$($_.'Folder Ref')\"
    Copy-Item $Source -Destination $Dest
}
If I switch out the Copy-Item for Write-Host, it reads correctly to me. Am I doing something wrong?
Nothing happens; it returns me to the prompt with no output.
Constructing file paths using string concatenation, as you are doing, is never a good idea.
Better to use PowerShell's Join-Path cmdlet for that, or the .NET [System.IO.Path]::Combine() method.
As mklement0 already commented, Copy-Item by default does not produce any visual output unless you add -Verbose.
You can also append the -PassThru switch, in which case the cmdlet returns an object that represents the copied item.
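For instance, a quick illustration of those two switches (the file and folder names here are made up for the example):
# -Verbose prints a message describing the copy operation;
# -PassThru makes Copy-Item return the copied file object so you can inspect or log it.
$copied = Copy-Item -Path 'C:\Dir1\Test\Files\example.txt' -Destination 'C:\Dir1\Test\Ref001' -Verbose -PassThru
$copied.FullName   # full path of the file that was just copied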
In your case, why not add an informative message yourself, something like:
$SourceDir = 'C:\Dir1\Test\Files'
$DestDir = 'C:\Dir1\Test'

Import-Csv -Path 'C:\Dir1\Test\FileList.csv' | ForEach-Object {
    # construct the source path
    $Source = Join-Path -Path $SourceDir -ChildPath $_.FilePath
    if (Test-Path -Path $Source -PathType Leaf) {
        # construct the destination path
        $Dest = Join-Path -Path $DestDir -ChildPath $_.'Folder Ref'
        # make sure the target path exists before trying to copy to it
        $null = New-Item -Path $Dest -ItemType Directory -Force
        # now copy the file
        Write-Host "Copying file '$Source' to '$Dest'"
        Copy-Item -Path $Source -Destination $Dest
    }
    else {
        Write-Warning "File '$Source' could not be found"
    }
}
I have a directory of information that is separated by document number, so each folder that contains documents starts with DOC-######-NameOfDocument. I am trying to create a PowerShell script that will search a directory for any folders with a specified document number, take the contents of each matching folder, move it up one level, and then delete the original folder (which should now be empty).
Below is the closest I have gotten to my intended result.
$Path = "filepath"
$Folders = Get-ChildItem -Filter "DOC-#####*" -Recurse -Name -Path $Path
$companyID = "######"
foreach ($Folder in $Folders){
    $filepath = $Path + $Folder
    $Files = Get-ChildItem -Path $filepath
    $imagesourc = $filepath + $companyID
    $imageDest = $filepath.Substring(0, $filepath.LastIndexOf('\'))
    if (Test-Path -Path $imagesourc){
        Copy-Item -Path $imagesourc -Destination $imageDest -Recurse
    }
    foreach ($File in $Files){
        $Parent_Directory = Split-Path -Path $File.FullName
        $Destination_Path = $filepath.Substring(0, $filepath.LastIndexOf('\'))
        Copy-Item -Path $File.FullName -Destination $Destination_Path -Recurse
        if ($null -eq (Get-ChildItem -Path $Parent_Directory)) {
        }
    }
    Remove-Item $filepath -Recurse
}
This does what I need, but for some reason I can't divine, it will not work on .HTM files. Most of the files I am moving are .html and .htm files, so I need it to work with .htm as well. The files with .HTM will not move, and the folder won't be deleted either, which is good at least.
Try using this:
$ErrorActionPreference = 'Stop'

$fileNumber = '1234'
$initialFolder = 'X:\path\to\folders'
$folders = Get-ChildItem -Path $initialFolder -Filter DOC-$fileNumber* -Force -Directory -Recurse

foreach($folder in $folders)
{
    try
    {
        Move-Item $folder\* -Destination $folder.Parent.FullName
        Remove-Item $folder
    }
    catch [System.IO.IOException]
    {
        @(
            "$_".Trim()
            "File FullName: {0}" -f $_.TargetObject
            "Destination Folder: {0}" -f $folder.Parent.FullName
        ) | Out-String | Write-Warning
    }
    catch
    {
        Write-Warning $_
    }
}
Important Notes:
Move-Item $folder\* will move all folder contents recursively. If there are folders inside $folder, those will be moved too; if you want to target only folders that contain nothing but files, an if condition should be added before this cmdlet (see the sketch after these notes).
Try {...} Catch {...} is there mainly to handle file collisions: if a file with the same name already exists in the parent folder, it will let you know, and that file will not be moved nor will its folder be deleted.
-Filter DOC-$fileNumber* will capture all the folders named with the numbers in $fileNumber; however, be careful, because it may capture folders you do not intend to remove.
Example: if you want to get all folders containing the number 1234 (DOC-12345-NameOfDocument, DOC-12346-NameOfDocument, ...) but you don't want to capture DOC-12347-NameOfDocument, then you should fine-tune the filter, or you could add the -Exclude parameter.
-Force & -Directory to get hidden folders and to target only folders.
I have this PowerShell script, "copFiles.ps1", that reads a list of files from a text file, "Filestocopy.txt", and copies them to a destination:
$source = "C:\Data\Filestocopy.txt"
$destination = "C:\Data\Models"
Get-Content $source | ForEach-Object {copy-item $_ $destination}
It will only copy the files if they're in the same folder as the .ps1 file, and it ignores subfolders. How can I get it to look in subfolders of the folder it's in? I gather I need to use the -Recurse option, but I don't know how to rewrite the script so it works.
The .ps1 file is fired by a bat file.
Many thanks
I don't know how fast this will be, but you can give an array as the argument to the -Path parameter of Get-ChildItem, add the -Recurse switch to dig out the files in subdirectories, and simply pipe them along to Copy-Item. Something like:
Get-ChildItem (Get-Content $Source) -Recurse |
Copy-Item -Destination $destination
You may also want to add the -File switch.
Update
Based on your comment I played around with this a little more:
$source = "C:\Data\Filestocopy.txt"
$Destination = "C:\data\Models"
# Get-ChildItem (Get-Content $Source) -Recurse |
Get-ChildItem (Get-Content $Source) -Recurse -File |
ForEach-Object{
If( $_.Directory.FullName -eq $Destination )
{ # Don't work on files already present in the destination
# when the destination is under the current directory...
Continue
}
$FileNum = $null
$NewName = Join-Path -Path $Destination -ChildPath $_.Name
While( (Test-Path $NewName) )
{
++$FileNum
$NewName = Join-Path -Path $Destination -ChildPath ($_.BaseName + "_" + $FileNum + $_.Extension)
}
Copy-Item $_.FullName -Destination $NewName
}
This will increment the destination file name when a file by that name already exists in the destination. If the destination is under the current directory, it avoids re-analyzing those files by comparing each file's path to the destination. Files must have unique names in a given folder, so I'm not sure how else it could be handled.
Our Git repo blew up and we ended up losing the repo, so now all of our users' code is only on local workstations. For temporary storage we are going to have all of them put their local repos on a network share. I am currently trying to write a PowerShell script to allow users to select all their repos with GridView and then copy them to the network share. This will cause a lot of overlap, so I only want files that have the latest modified date (commit) to overwrite when there are duplicate files.
For example,
User 1 has repo\file.txt, last modified 8/10, and uploads it to the network share.
User 2 also has repo\file.txt, last modified 8/12. When User 2 copies to the share, it should overwrite User 1's file because it is the newer file.
I am new to PowerShell so I am not sure which direction to take.
As of right now I figured out how to copy over all files, but can't figure out the last modified piece. Any help would be greatly appreciated.
$destination = '\\remote\IT\server'
$filesToMove = get-childitem -Recurse | Out-GridView -OutputMode Multiple
$filesToMove | % { copy-item $_.FullName $destination -Recurse }
If your users have permission to write/delete files in the remote destination path, this should do it:
$destination = '\\remote\IT\server\folder'

# create the destination folder if it does not already exist
if (!(Test-Path -Path $destination -PathType Container)) {
    Write-Verbose "Creating folder '$destination'"
    New-Item -Path $destination -ItemType Directory | Out-Null
}

Get-ChildItem -Path 'D:\test' -File -Recurse |
    Out-GridView -OutputMode Multiple -Title 'Select one or more files to copy' | ForEach-Object {
        # since we're piping the results of the Get-ChildItem into the GridView,
        # every '$_' is a FileInfo object you can pipe through to the Copy-Item cmdlet.
        $skipFile = $false
        # create the filename for a possible duplicate in the destination
        $dupeFile = Join-Path -Path $destination -ChildPath $_.Name
        if (Test-Path -Path $dupeFile) {
            # if a file already exists AND is newer than the selected file, do not copy
            if ((Get-Item -Path $dupeFile).LastWriteTime -gt $_.LastWriteTime) {
                Write-Host "Destination file '$dupeFile' is newer. Skipping."
                $skipFile = $true
            }
        }
        if (!$skipFile) {
            $_ | Copy-Item -Destination $destination -Force
        }
    }
This is my first post here, so please be forgiving. I'm browsing Reddit/Stack Overflow looking for cases to practice my PowerShell skills. I tried creating a script like the one you asked for on my local home PC; let me know if it helps you:
$selectedFiles = Get-ChildItem -Path "C:\Users\steven\Desktop" -Recurse | Out-GridView -OutputMode Multiple
$destPath = "D:\"

foreach ($selectedFile in $selectedFiles) {
    $destFileCheck = $destPath + $selectedFile
    if (Test-Path -Path $destFileCheck) {
        $destFileCheck = Get-ChildItem -Path $destFileCheck
        if ((Get-Date $selectedFile.LastWriteTime) -gt (Get-Date $destFileCheck.LastWriteTime)) {
            Copy-Item -Path $selectedFile.FullName -Destination $destFileCheck.FullName
        }
        else {
            Write-Host "Source file is older than destination file, skipping copy."
        }
    }
}
I am trying to write PowerShell code that will copy files from one location (let's say A) to location B. Location B has two subfolders (let's say X and Y). I need to copy files from A to B, but before copying I need to make sure that the files I am copying are not already in X or Y, in order to avoid file duplication. If a file already exists, the script should not copy that particular file.
$PathS = Get-ChildItem -Path "\\sc-y-ap-swt-1\AutoClientFiles\reception\*.txt" |
    Where-Object { $_.CreationTime -gt (Get-Date).AddDays(-1) }
$PathD = "C:\OCM\data\EverestSwift\inbound\"
$pathtest = Get-ChildItem -Path "C:\OCM\data\EverestSwift\inbound\" -Recurse -File

If((Test-Path -Path "\\sc-y-ap-swt-1\AutoClientFiles\reception\*.txt") -eq $false) {
    Exit
} Else {
    Try {
        Foreach ($File in $Pathtest){
            if ($File -eq $PathS){
                Write-Host "Duplicate Files"
                exit 1
            }
            Copy-Item -Path $PathS -Destination $PathD -Force
            Exit 0
        }
    } catch [Exception]{
        Write-Host $_.Exception.Message
        Exit 1
    }
}
You can do this, but why? As Cory said, this is why robocopy exists.
What do you mean by same?
The filename can be the same, but the timestamps can be different, thus making it a different file, even if the name is the same. So, you should be looking at name and timestamp or file hashes.
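If you would rather do that comparison in PowerShell, here is a rough sketch using Get-FileHash; the paths are placeholders, and it only checks a single pair of files:
# Compare one source file against its existing copy in the destination by content hash,
# and only overwrite when the content differs and the source is the newer file.
$src = Get-Item 'C:\locationA\file.txt'     # placeholder source path
$dst = Get-Item 'C:\locationB\X\file.txt'   # placeholder destination path

$srcHash = (Get-FileHash -Path $src.FullName -Algorithm SHA256).Hash
$dstHash = (Get-FileHash -Path $dst.FullName -Algorithm SHA256).Hash

if (($srcHash -ne $dstHash) -and ($src.LastWriteTime -gt $dst.LastWriteTime)) {
    Copy-Item -Path $src.FullName -Destination $dst.FullName -Force
}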
So, see these Q&A about such a use case.
Does Robocopy SKIP copying existing files by default?
How to skip existing and/or same size files when using robocopy
RoboCopy "%%F" %destination% *.srt *.pdf *.mp4 *.jpg /COPYALL /XO /R:0
Still, if you want to do this with PowerShell, your post could be a duplicate of this one:
Copy items from Source to Destination if they don't already exist
Examples from the above:
$Source = 'C:\SourceFolder'
$Destination = 'C:\DestinationFolder'
Get-ChildItem $Source -Recurse | ForEach {
    $ModifiedDestination = $($_.FullName).Replace("$Source","$Destination")
    If ((Test-Path $ModifiedDestination) -eq $False) {
        Copy-Item $_.FullName $ModifiedDestination
    }
}
# Or
$Source = '<your path here>'
$Dest = '<your path here>'
$Exclude = Get-ChildItem -recurse $Dest
Get-ChildItem $Source -Recurse -Filter '*' |
Copy-Item -Destination $Dest -Verbose -Exclude $Exclude