Powershell loop through folder, search by file names and location then replace

I would like my script to loop through a folder and its subfolders and grab the names of only the .dll files. It should then search a specified location/path for a .dll with the same file name and, if one exists, replace it. So far my script copies all files from one location to the other; once they are copied, I need the behaviour described above.
My biggest issue is: how do you loop through file names, search for each one at a specified location, and replace it if it exists? The code below uses local test paths for now, before I put in the correct server and drive paths.
#sets source user can edit path to be more precise
$source = "C:\Users\Public\Music\Sample Music\*"
#sets destination
$1stdest = "C:\Users\User\Music\Sample Music Location"
#copies source to destination
Get-ChildItem $source -recurse | Copy-Item -destination $1stdest
#takes 1stdest and finds only the .dlls into a variable
#not sure if this is right, but it takes only the .dlls; can you do that in the foreach()?
Get-ChildItem $1stdest -recurse -include "*.dll"

Here you go; you'll need to edit your paths back in. Note that $1stdest was changed to enumerate the list of files in the destination folder.
The logic goes through all of the files in $source and looks for a match in $1stdest. If it finds any, it stores them in $overwriteMe. The code then steps through each file to be overwritten and copies over it.
As written, it uses -WhatIf, so you get a preview of what would happen before anything is changed. If you like what you see, remove the -WhatIf from the Copy-Item line.
$source = "c:\temp\stack\source\"
#sets destination
$1stdest = get-childitem C:\temp\stack\Dest -Recurse
#copies source to destination
ForEach ($file in (Get-ChildItem $source -recurse) ){
If ($file.BaseName -in $1stdest.BaseName){
$overwriteMe = $1stdest | Where BaseName -eq $file.BaseName
Write-Output "$($file.baseName) already exists # $($overwriteMe.FullName)"
$overwriteMe | ForEach-Object {
copy-item $file.FullName -Destination $overwriteMe.FullName -WhatIf
#End of ForEach $overwriteme
}
#End Of ForEach $file in ...
}
}
Output:
1 already exists @ C:\temp\stack\Dest\1.txt
What if: Performing the operation "Copy File" on target "Item: C:\temp\stack\source\1.txt Destination: C:\temp\stack\Dest\1.txt".
5 already exists @ C:\temp\stack\Dest\5.txt
What if: Performing the operation "Copy File" on target "Item: C:\temp\stack\source\5.txt Destination: C:\temp\stack\Dest\5.txt".
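If you only want to handle the .dll files, as the original question asks, the same pattern works with -Include "*.dll" on both Get-ChildItem calls. A rough sketch using the placeholder paths from the question (swap in your real server/drive paths):
#enumerate only the .dlls at the destination
$source  = "C:\Users\Public\Music\Sample Music\"
$1stdest = Get-ChildItem "C:\Users\User\Music\Sample Music Location" -Recurse -Include "*.dll"
ForEach ($file in (Get-ChildItem $source -Recurse -Include "*.dll")) {
    If ($file.BaseName -in $1stdest.BaseName) {
        $1stdest | Where-Object BaseName -eq $file.BaseName | ForEach-Object {
            #remove -WhatIf once the preview looks right
            Copy-Item $file.FullName -Destination $_.FullName -WhatIf
        }
    }
}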

Related

Powershell: Find Folders with (Name) and Foreach Copy to Location Preserve Directory Structure

Got another multi-step process I'm looking to streamline. Basically, I'm looking to build a PowerShell script to do three things:
1. Use Get-ChildItem to look for folders with a specific name (we'll call it NAME1 as a placeholder).
2. For each folder it finds with that name, output the full path to a TXT file, so that in the end I have a text file listing every result with its full path, beginning with the drive letter and ending with NAME1.
3. Take the list from the TXT file and copy each path to another drive, preserving the directory structure.
So basically, if it searches and finds this:
D:\TEST1\NAME1
D:\TEST7\NAME1
D:\TEST8\NAME1\
That's what I want to appear in the text file.
Then what I want it to do is to go through each line in the text file and plug the value into a Copy-Item (I'm thinking the source directory would get assigned to a variable), so that when it's all said and done, on the second drive I wind up with this:
E:\BACKUP\TEST1\NAME1
E:\BACKUP\TEST7\NAME1
E:\BACKUP\TEST8\NAME1\
So in short, I'm looking for a Get-Childitem that can define a series of paths, which Copy-Item can then use to back them up elsewhere.
I already have one way to do this, but the problem is it seems to copy everything every time, and since one of these drives is an SSD I only want to copy what's new/changed each time (not to mention that would save time when I need to run a backup):
$source = "C:\"
$target = "E:\BACKUP\"
$search = "NAME1"
$source_regex = [regex]::escape($source)
(gci $source -recurse | where {-not ($_.psiscontainer)} | select -expand fullname) -match "\\$search\\" |
    foreach {
        $file_dest = ($_ | split-path -parent) -replace $source_regex,$target
        if (-not (test-path $file_dest)) { mkdir $file_dest }
        copy-item $_ -Destination $file_dest -force -verbose
    }
If there's a way to do this that wouldn't require writing out a TXT file each time I'd be all for that, but I don't know a way to do this the way I'm looking for except a Copy-Item.
I'd be very grateful for any help I can get with this. Thanks all!
If I understand correctly, you want to copy all folders with a certain name, keeping the original folder structure in the destination path, and copy only files that are newer than what is already in the destination.
Try
$source = 'C:\'
$target = 'E:\BACKUP\'
$search = 'NAME1'
# -ErrorAction SilentlyContinue because in the C:\ disk you are bound to get Access Denied on some paths
Get-ChildItem -Path $source -Directory -Recurse -Filter $search -ErrorAction SilentlyContinue | ForEach-Object {
    # construct the destination folder path
    $dest = Join-Path -Path $target -ChildPath $_.FullName.Substring($source.Length)
    # copy the folder including its files and subfolders (but not empty subfolders)
    # for more switches see https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
    robocopy $_.FullName $dest /XO /S /R:0
}
If you don't want the console output of robocopy, you can silence it by appending *> $null to the robocopy line, so neither stdout nor stderr is echoed.
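For example, with the same loop variables as above:
# Same robocopy call, with every output stream discarded
robocopy $_.FullName $dest /XO /S /R:0 *> $null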
If you want to keep a file after this with both the source paths and the destinations, I'd suggest doing this:
$source = 'C:\'
$target = 'E:\BACKUP\'
$search = 'NAME1'
$output = [System.Collections.Generic.List[object]]::new()
# -ErrorAction SilentlyContinue because in the C:\ disk you are bound to get Access Denied on some paths
Get-ChildItem -Path $source -Directory -Recurse -Filter $search -ErrorAction SilentlyContinue | ForEach-Object {
    # construct the destination folder path
    $dest = Join-Path -Path $target -ChildPath $_.FullName.Substring($source.Length)
    # add an object to the output list
    $output.Add([PsCustomObject]@{ Source = $_.FullName; Destination = $dest })
    # copy the folder including its files and subfolders (but not empty subfolders)
    # for more switches see https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
    robocopy $_.FullName $dest /XO /S /R:0
}
# write the output to csv file
$output | Export-Csv -Path 'E:\backup.csv' -NoTypeInformation
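If you later want to work from that list (for example, to review what was copied), Import-Csv reads it back in as objects with the same Source and Destination properties:
# Re-read the exported list and show each source -> destination pair
Import-Csv -Path 'E:\backup.csv' | ForEach-Object {
    '{0}  ->  {1}' -f $_.Source, $_.Destination
}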

Powershell: Move Subfolders but Ignore Files in Root?

Got what I thought was a fairly straightforward question, but after several hours of Googling I can't find an answer. I'm pretty sure PowerShell is versatile enough to do this; I'm just not sure how to code it up.
So basically, I have this:
W:\ (root)
    Main Folder 1
        Subfolder 1
            Sub-Subfolder 1
            Sub-Subfolder 2
            Sub-Subfolder 3
            File1.fil
            File2.fil
            File3.fil
All I'm trying to do is get PowerShell to look in Subfolder 1 and move Sub-Subfolders 1-3 (and their contents) up into Main Folder 1, but ignore Files 1-3.
The syntax that I've worked up looks like this:
$source = "W:\Main Folder 1\Subfolder 1\"
$destination = "W:\Main Folder\"
Get-ChildItem $source -attributes D -Recurse | Move-Item -Destination "$destination" -Force -Verbose
When I run it with -WhatIf it looks like it should work, but when I actually try it I get the dreaded "cannot create a file when that file already exists" error (basically it's saying it can't create folders in Main Folder 1 with the same names as the ones from Subfolder 1). I've got a -Force flag there and I'd have thought that would do the trick, but... here we are.
Any idea what to do to force it to move them? (I don't want it to delete the existing folder to move the new one though.)
If I understand what you're trying to do, this code works:
$source = "G:\Test\A\FolderTwo"
$destination = "G:\Test\A"
Get-ChildItem -Path $source -Directory |
Move-Item -Destination $destination -Force -Verbose
Verbose Output:
VERBOSE: Performing the operation "Move Directory" on target "Item: G:\Test\A\FolderTwo\Folder2A Destination: G:\Test\A\Folder2A".
VERBOSE: Performing the operation "Move Directory" on target "Item: G:\Test\A\FolderTwo\Folder2B Destination: G:\Test\A\Folder2B".
The 7 regular files in the source folder remained there. The directories in the source folder, and all their contents including subfolders, were moved.
Just make sure you have the source and destination directories defined correctly.
Note there is no need for the -Recurse switch, as moving whole directories also moves their contents.
This should get you further. Change the "$($folder.name)-2" to whatever you want to append the new folder names with.
$destination = "W:\Main Folder\"
$source = Get-ChildItem "W:\Main Folder 1\Subfolder 1\" -Directory
foreach ($folder in $source) {
    $folder | Move-Item -Destination (Join-Path $destination "$($folder.Name)-2")
}

Verify file copied using copy-item

I'm trying to copy files from a source to a destination and verify whether they copied or not.
The problem is that if I make changes to a source file that was copied earlier, the destination file does not get overwritten when I execute the code again. I also want a log file each time files are copied.
Files in the folder: .csv, .log, .png, .html
$source="C:\52DWM93"
$destination="C:\Temp\"
Copy-Item -Path $source -Destination $destination -Force
$ver=(Get-ChildItem -file -path $destination -Recurse).FullName | foreach {get-filehash $_ -Algorithm md5}
If($ver)
{Write-Host "ALL file copied"}
else
{Write-Host "ALL file not copied"}
If you copy directories like this you need the -Recurse switch for Copy-Item. Without it you're not going to copy anything except the directory itself.
You can of course also use Get-ChildItem with whatever filter or Recurse flag you care about and pipe that into Copy-Item.
Use the *-FileCatalog cmdlets for verification.
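As a rough sketch of both points, using the paths from the question (the catalog file name and location are my own choice here):
$source = "C:\52DWM93"
$destination = "C:\Temp\"
# Create a catalog recording a hash for every file under the source (version 2 uses SHA256)
New-FileCatalog -Path $source -CatalogFilePath "C:\Temp\52DWM93.cat" -CatalogVersion 2
# Copy the folder; -Recurse is needed so the contents are copied, not just the folder itself
Copy-Item -Path $source -Destination $destination -Recurse -Force
# Validate the copied files against the catalog; returns Valid or ValidationFailed
Test-FileCatalog -Path "C:\Temp\52DWM93" -CatalogFilePath "C:\Temp\52DWM93.cat"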

Copy-Item with overwrite?

Here is a section of code from a larger script. The goal is to recurse through a source directory, then copy all the files it finds into a destination directory, sorted into subdirectories by file extension. It works great the first time I run it. If I run it again, instead of overwriting existing files, it fails with this error on each file that already exists in the destination:
Copy-Item : Cannot overwrite the item with itself
I try, whenever possible, to write scripts that are idempotent, but I haven't been able to figure this one out. I would prefer not to add a timestamp to the destination file's name; I'd hate to end up with thirty versions of the exact same file. Is there a way to do this without extra logic to check for a file's existence and delete it if it's already there?
## Parameters for source and destination directories.
$Source = "C:\Temp"
$Destination = "C:\Temp\Sorted"
# Build list of files to sort.
$Files = Get-ChildItem -Path $Source -Recurse | Where-Object { !$_.PSIsContainer }
# Copy the files in the list to destination folder, sorted in subfolders by extension.
foreach ($File in $Files) {
    $Extension = $File.Extension.Replace(".","")
    $ExtDestDir = "$Destination\$Extension"
    # Check to see if the folder exists, if not create it
    $Exists = Test-Path $ExtDestDir
    if (!$Exists) {
        # Create the directory because it doesn't exist
        New-Item -Path $ExtDestDir -ItemType "Directory" | Out-Null
    }
    # Copy the file
    Write-Host "Copying $File to $ExtDestDir"
    Copy-Item -Path $File.FullName -Destination $ExtDestDir -Force
}
$Source = "C:\Temp"
$Destination = "C:\Temp\Sorted"
You are trying to copy files from a source directory to a sub directory of that source directory. The first time it works because that directory is empty. The second time it doesn't because you are enumerating files of that sub directory too and thus attempt to copy files over themselves.
If you really need to copy the files into a sub directory of the source directory, you have to exclude the destination directory from enumeration like this:
$Files = Get-ChildItem -Path $Source -Directory |
Where-Object { $_.FullName -ne $Destination } |
Get-ChildItem -File -Recurse
Using a second Get-ChildItem call at the beginning, which only enumerates first-level directories, is much faster than filtering the output of the Get-ChildItem -Recurse call, which would needlessly process each file of the destination directory.

Move files that have the same name but different extension

I'm trying to use the script from here to move the .mp4 files I've converted from .avi back into their original folders. All seems to work OK: the script tries to move the files to the correct location (see below), but the .mp4 files aren't moved.
What if: Performing the operation "Move File" on target "Item: Z:\AVI\MVI_4965.mp4 Destination: Z:\Pictures\2011\04_21_11_Bergen\MVI_4965.mp4".
All files are located on my Z: NAS drive and I've modified the script to account for this (see below) starting from PS Z:\>.
# Create a hashtable with key = file basename and value = containing directory
$mediaFiles = @{}
Get-ChildItem -Recurse .\Pictures | ?{ !$_.PsIsContainer } | ForEach-Object {
    $mediaFiles[$_.BaseName] = $_.DirectoryName
}
# Look through lost files and if the lost file exists in the hash, then move it
Get-ChildItem -Recurse .\AVI | ?{ !$_.PsIsContainer } | ForEach-Object {
    if ($mediaFiles.ContainsKey($_.BaseName)) {
        Move-Item -WhatIf $_.FullName $mediaFiles[$_.BaseName]
    }
}
Any ideas what's stopping the files from being moved, or how to correct it?
That's the point of -WhatIf. It simply says what it would do without actually doing it so you can verify the script first. Remove it from the Move-Item command.
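In other words, once the preview output looks correct, the line inside the loop becomes:
# Same move, now performed for real
Move-Item $_.FullName $mediaFiles[$_.BaseName]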