So I asked here before for help with a script to copy files from one folder to another.
However, after I was done, I found that some of the files went missing. I had 600,393 files, but when I checked my new folder it only had 600,361.
I think some files may have been overwritten by duplicates, even though the naming convention was supposed to prevent those kinds of collisions.
Here's the script:
$destfolder = '.\destfolder\'

Get-ChildItem -Recurse -File .\srcfolder\ |
    Invoke-Parallel {
        $_ | Copy-Item -Destination (
            Join-Path $using:destfolder ($_.Directory.Parent.Name, $_.Directory.Name, $_.Name -join '-')
        ) -Verbose -WhatIf
    }
(Thanks to the great dudes on r/software, r/Techsupport, and mklement0)
So is there a way to add a suffix that appends a 0 to the name of any file that has the same name as a file already in the folder?
like directory-subdirectory-0-filename.ext
EDIT: The files are all read-only, not hidden, and I don't want any hidden files copied.
Note that Get-ChildItem doesn't include hidden items by default, which may explain at least part of the discrepancy.
Use the -Force switch to include hidden items.
Separately / additionally, you can deal with name collisions as follows:
$destfolder = '.\destfolder\'

Get-ChildItem -Force -Recurse -File .\srcfolder\ |
    Invoke-Parallel {
        $newName = Join-Path $using:destfolder ($_.Directory.Parent.Name, $_.Directory.Name, $_.Name -join '-')
        # Try to create the file, but only if it doesn't already exist.
        if ($null -eq (New-Item $newName -ErrorAction SilentlyContinue)) {
            # File already exists -> create a duplicate name by appending a GUID.
            $newName += '-' + (New-Guid)
        }
        $_ | Copy-Item -Destination $newName -Verbose
    }
Note:
With multi-threaded execution, assigning sequence numbers to duplicates would be a nontrivial undertaking, as each thread would have to "reserve" a sequence number and ensure that no other thread claims it before copying to a file incorporating this number is complete.
To avoid such challenges, the above approach simply appends a - plus a GUID to the target file name.
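That said, if numeric suffixes in the style of directory-subdirectory-0-filename.ext are a hard requirement, the simplest way to sidestep the threading problem is to copy sequentially, so no counter needs to be reserved across threads. A minimal sketch (untested, and noticeably slower than the parallel version):

$destfolder = '.\destfolder\'

Get-ChildItem -Force -Recurse -File .\srcfolder\ | ForEach-Object {
    $newName = Join-Path $destfolder (($_.Directory.Parent.Name, $_.Directory.Name, $_.Name) -join '-')
    $i = 0
    # On a collision, insert a counter before the file name:
    # directory-subdirectory-0-filename.ext, then 1, 2, ...
    while (Test-Path -LiteralPath $newName) {
        $newName = Join-Path $destfolder (($_.Directory.Parent.Name, $_.Directory.Name, $i, $_.Name) -join '-')
        $i++
    }
    $_ | Copy-Item -Destination $newName -Verbose
}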
Related
I want to merge a few hundred CSV files into one, removing the header row of each added CSV.
As the files sit in several subfolders, I need to start from the root, traverse all the subfolders, and process all the CSVs in there. Before merging, I want to archive them in a ZIP and delete the old CSVs. The merged CSV file and the ZIP archive should be named after their parent folder.
In case the script is started again for the same folder, none of the already processed files should be damaged or removed accidentally.
I am not a PowerShell guy, so I have been copy-pasting from several resources on the web and came up with the following solution. (Sorry, I don't remember the sources; feel free to put references in the comments if you know them.)
This patchwork code does the job, but it doesn't feel very bulletproof. For now it processes the CSV files in the subfolders only. Processing the files within the given $targDir as well would also be nice.
I am wondering if it could be more compact. Suggestions for improvement are appreciated.
$targDir = "\\Servername\folder\" #path

Get-ChildItem "$targDir" -Recurse -Directory |
    ForEach-Object { #walk through all subfolder paths
        Set-Location -Path $_.FullName

        #remove an existing AllInOne.csv (target name for the merged file) in case it has been left over from a previous execution
        $FileName = ".\AllInOne.csv"
        if (Test-Path $FileName) {
            Remove-Item $FileName
        }

        #remove an existing AllInOne.zip (target name for the archive) in case it has been left over from a previous execution
        $FileName = ".\AllInOne.zip"
        if (Test-Path $FileName) {
            Remove-Item $FileName
        }

        #compress all CSV files in the current path into an archive temporarily named AllInOne.zip,
        #adding them one at a time (with -Update). I wonder if there is a more efficient way to do that.
        dir $_.FullName | where { $_.Extension -eq ".csv" } | foreach { Compress-Archive $_.FullName -DestinationPath "AllInOne.zip" -Update }

        ##########################################################
        # This code merges all the CSV files,
        # skipping the header row of every file after the first.
        ##########################################################
        $getFirstLine = $true
        Get-ChildItem ".\*.csv" | foreach {
            $filePath = $_
            $lines = Get-Content $filePath
            $linesToWrite = switch ($getFirstLine) {
                $true  { $lines }
                $false { $lines | Select -Skip 1 }
            }
            $getFirstLine = $false
            Add-Content ".\AllInOne.csv" $linesToWrite
            # The output file is named AllInOne.csv temporarily - this is not a requirement.
            # It was simply easier for me to come up with a temp file in the first place (symptomatic for copy&paste).
        }
        #########################################################

        #delete the old CSV files
        dir $_.FullName | where { $_.Extension -eq ".csv" -and $_.Name -ne "AllInOne.csv" } | foreach { Remove-Item $_.FullName }

        #rename the AllInOne files with the parent folder name
        Get-ChildItem -Path $_.FullName -Filter *.csv | Rename-Item -NewName { $_.BaseName.Replace("AllInOne", $_.Directory.Name) + $_.Extension }
        Get-ChildItem -Path $_.FullName -Filter *.zip | Rename-Item -NewName { $_.BaseName.Replace("AllInOne", $_.Directory.Name) + $_.Extension }
    }
I have been executing it in the PowerShell ISE. The script is for housekeeping only, executed casually and not on a regular basis, so performance doesn't matter much.
I prefer to stick with a script that doesn't depend on additional libraries if possible (e.g. for ZIP).
It may not be bulletproof, but I have seen worse cobbled-together scripts. It'll definitely do the job you want it to, but here are some small changes that will make it a bit shorter and harder to break.
Since all your files are CSVs with the same headers, you can use Import-Csv to compile all of the files into an array. You won't have to worry about stripping the headers or accidentally removing a row.
Get-ChildItem "*.csv" | Foreach-Object {
$csvArray += Import-CSV $_
}
Then you can just use Export-Csv -Path ... -NoTypeInformation to output it all into a new CSV file.
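For example, a one-liner that writes the array out, keeping the AllInOne.csv name from your original script:

$csvArray | Export-Csv -Path ".\AllInOne.csv" -NoTypeInformation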
To have it check the root folder and all the subfolders, I would throw all of the lines in the main ForEach loop into a function and then call it once for the root folder and keep the existing loop for all the subfolders.
function CompileCompressCSV {
    param (
        [string] $Path
    )
    # Code from inside the ForEach-Object loop, using $Path instead of $_.FullName
}

# Main script
CompileCompressCSV -Path $targDir

Get-ChildItem -Path $targDir -Recurse -Directory | ForEach-Object {
    CompileCompressCSV -Path $_.FullName
}
This is more of a stylistic choice, but I would do the steps of this script in a slightly different order:
1. Get the parent folder name
2. Remove old compiled CSVs and ZIPs
3. Compile the CSVs into an array and output it with the parent folder name
4. ZIP the CSVs into a file with the parent folder name
5. Remove all the original CSV files
Personally, I'd rather name the created files properly the first time instead of having to go back and rename them, unless there is absolutely no way around it. That doesn't seem to be the case here, so you should be able to create them with the right names on the first go.
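Putting those steps together, a rough sketch of what the function body could look like (untested; it assumes all CSVs in a folder share the same headers and skips folders without any):

function CompileCompressCSV {
    param (
        [string] $Path
    )
    # Steps 1 and 2: derive the names from the folder and clear leftovers.
    $folderName = Split-Path $Path -Leaf
    $mergedCsv  = Join-Path $Path "$folderName.csv"
    $zipFile    = Join-Path $Path "$folderName.zip"
    Remove-Item $mergedCsv, $zipFile -ErrorAction SilentlyContinue

    $csvFiles = Get-ChildItem -Path $Path -Filter *.csv -File
    if (-not $csvFiles) { return }

    # Step 3: merge; Import-Csv/Export-Csv handle the headers for you.
    $csvFiles | ForEach-Object { Import-Csv $_.FullName } |
        Export-Csv -Path $mergedCsv -NoTypeInformation

    # Steps 4 and 5: archive the originals, then delete them.
    Compress-Archive -Path $csvFiles.FullName -DestinationPath $zipFile
    $csvFiles | Remove-Item
}

Because the list of originals is captured before the merged file is written, the merged CSV is neither zipped nor deleted, which also keeps a re-run from damaging it.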
So I have danced with this off and on throughout the day, and the timeless phrase "there's more than one way to skin a cat" keeps coming to mind, so I decided to take it to the community.
Scenario:
Source folder "C:\Updates" has 100 files of various extensions. All of them need to be copied to the sub-folders of "C:\Prod" only (not to its root), overwriting any duplicates found there.
The Caveats:
The sub-folder names (destinations) in "C:\Prod" are quite dynamic and change frequently.
A naming convention is used to determine which sub-folders in the destination need to be excluded when the source files are copied (to retain the original versions). For ease of explanation, let's say any folder name starting with "!stop" should be excluded from the copy process (!stop* if wildcards are considered).
So, here I am wanting the input of those greater than I to tackle this in PS, if I'm lucky. I've tinkered with Copy-Item and xcopy today, so I'm excited to hear others' input.
Thanks!
-Chris
Give this a shot:
Get-ChildItem -Path C:\Prod -Exclude !stop* -Directory |
    ForEach-Object { Copy-Item -Path C:\Updates\* -Destination $_ -Force }
This grabs each folder (the -Directory switch ensures we only grab folders) in C:\Prod that does not match the filter and pipes it to the ForEach-Object command where we are running the Copy-Item command to copy the files to the directory.
The -Directory switch is not available in every version of PowerShell; it was introduced in PowerShell 3.0. If you have an older version that does not support -Directory, you can use this script instead:
Get-ChildItem -Path C:\Prod -Exclude !stop* |
    Where-Object { $_.PSIsContainer } |
    ForEach-Object { Copy-Item -Path C:\Updates\* -Destination $_ -Force }
To select only the sub-folders whose names do not begin with "!stop", do this:
$Source = "C:\Updates\*"
$Dest = "C:\Prod"
$Stop = "^!stop"

$Destinations = GCI -Path $Dest | ?{ $_.PSIsContainer -and $_.Name -notmatch $Stop }

ForEach ($Destination in $Destinations) {
    Copy-Item -Path $Source -Destination $Destination.FullName -Force -WhatIf
}
Edited: Now copies all files from C:\Updates to the sub-folders of C:\Prod not beginning with "!stop". The -WhatIf switch shows what would happen; to arm the script, remove the -WhatIf.
Edit 2: Streamlined the script. If sub-sub-folders of C:\Prod should also receive copies, add a -Recurse switch to the GCI call just in front of the pipe.
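For example, the recursive variant of that line, reusing the variables above, would be:

$Destinations = GCI -Path $Dest -Recurse | ?{ $_.PSIsContainer -and $_.Name -notmatch $Stop }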
I'm using the following command to copy a directory tree from one folder to another.
Copy-Item $SOURCE $DEST -Filter {PSIsContainer} -Recurse -Force -Verbose
The verbose option is correctly showing each folder that is copied. However, I would like the verbose output to show only the first level of the subfolders that are copied, so that subfolders of subfolders etc. wouldn't appear.
Is it possible?
Instead of using the -Verbose option, you could use the -PassThru option to process the successfully copied items via the pipeline. In the following example, I am assuming that $DEST is the existing directory in which the newly copied directory will appear. (You cannot call Get-Item on non-existent objects.)
$SOURCE = Get-Item "foo"
$DEST = Get-Item "bar"

Copy-Item $SOURCE $DEST -Filter {PSIsContainer} -Recurse -Force -PassThru | Where-Object {
    # Get the parent object. The required member is different between
    # files and directories, which makes this a bit more complex than it
    # might have been.
    if ($_.GetType().Name -eq "DirectoryInfo") {
        $directory = $_.Parent
    } else {
        $directory = $_.Directory
    }

    # Select objects as required; in this case, only allow through
    # objects whose second-level parent is the pre-existing target
    # directory.
    $directory.Parent.FullName -eq $DEST.FullName
}
Count the number of backslashes in the path and add logic to select the first level only. Something like this, perhaps?

$Dirs = get-childitem $Source -Recurse | ?{ $_.PSIsContainer }
$baseDepth = ([regex]::Matches($Source, '\\')).Count

Foreach ($Dir in $Dirs) {
    # Depth relative to $Source: 1 means a first-level subfolder.
    $Level = ([regex]::Matches($Dir.FullName, '\\')).Count - $baseDepth
    if ($Level -eq 1) { Copy-Item $Dir $DEST -Force -Verbose }
    else { Copy-Item $Dir $DEST -Force }
}
Edited to include looping and logic per requirements.
I would suggest using robocopy instead of Copy-Item. Its /LEV:n switch sounds like exactly what you're looking for. Example (you'll need to test and tweak to meet your requirements):
robocopy $source $dest /LEV:2
robocopy has approximately 7 gazillion options you can specify to get some very useful and interesting behavior out of it.
I need to copy only certain parts of a folder using Powershell, specifically this list:
$files = @("MyProgram.exe",
           "MyProgram.exe.config",
           "MyProgram.pdb",
           ".\XmlConfig\*.xml")
In human-readable form: the three specific MyProgram.* files go under the root of the target folder, and all XML files under the XmlConfig folder, which itself is under the root of the source path (..\bin\Release\ in my case). The XmlConfig folder must be created in the destination if it does not exist.
What I have tried:
(1) I tried the following, but it did not work, i.e. no folder or files were created at the destination path:
Copy-Item -Recurse -Path "..\bin\Release\" -Destination ".\Test\" -Include $files
(2) When -Include is removed, whole folder structure is successfully created, including subfolders and files:
Copy-Item -Recurse -Path "..\bin\Release\" -Destination ".\Test\"
There must be something wrong with my understanding of how the -Include filter works:
(3) I tested an assumption that -Include needs an array of wildcards, but this did not work either:
$files = @("*MyProgram.exe*",
           "*MyProgram.exe.config*",
           "*MyProgram.pdb*",
           "*.\XmlConfig\*.xml*")
Please advise on how to properly do Copy-Item in my case.
UPDATE (based on below answers):
I am looking for a generic implementation that takes an array of strings. That opens up the possibility of putting all the necessary files/paths in one place for easy editing, so that a non-PowerShell-knowledgeable person can understand and modify them as required. In the end it would be a single script that performs XCOPY-style deployments for any project, with the input file being the only variable part. For the above example, the input would look like this (saved as input.txt and passed as an argument to the main script):
MyProgram.exe
MyProgram.exe.config
MyProgram.pdb
.\XmlConfig\*.xml
I would prefer the wildcard approach, since not many people know regex.
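For what it's worth, here's a rough sketch of such a driver script (untested; the parameter names, and the assumption that each line of input.txt is a wildcard path relative to the source root, are mine):

param (
    [string] $ListFile = '.\input.txt',
    [string] $Source   = '..\bin\Release',
    [string] $Dest     = '.\Test'
)

# Read one wildcard path per line and copy the matching files,
# recreating their relative subfolders under $Dest.
Get-Content $ListFile | Where-Object { $_.Trim() } | ForEach-Object {
    Get-ChildItem -Path (Join-Path $Source $_.Trim()) -File -ErrorAction SilentlyContinue |
        ForEach-Object {
            # Rebuild the relative path so XmlConfig\*.xml lands in $Dest\XmlConfig\.
            $relative = $_.FullName.Substring((Get-Item $Source).FullName.Length).TrimStart('\')
            $target = Join-Path $Dest $relative
            New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
            Copy-Item $_.FullName -Destination $target
        }
}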
I don't know what is wrong with the filter, but you can still do:
$files | % { copy-item ..\bin\release\$_ -Destination .\test }
If you want to preserve the directory structure, you'll have to tweak this a little, like:
$sourcedir = "c:\temp\test"
$f = @("existing.txt", "hf.csv", "..\dir2\*.txt")

$f | %{
    $source = ls (Join-Path $sourcedir $_) | select -expand DirectoryName
    if ("$source" -like "$sourcedir*") {
        # Rebuild the path relative to $sourcedir, e.g. ".\subdir"
        $destination = "." + $source.Substring($sourcedir.Length)
    }
    else {
        $destination = $_
    }
    copy-item $sourcedir\$_ -Destination $destination -WhatIf
}
AFAICT, -Include works only with file names or directory names, not combinations (i.e. paths). You can try something like this:
$files = 'MyProgram\.exe|MyProgram\.exe\.config|MyProgram\.pdb|XmlConfig\\.*?\.xml'

Get-ChildItem ..\bin\release -r |
    Where {!$_.PSIsContainer -and ($_.FullName -match $files)} |
    Copy-Item -Dest .\test
With wildcards you could do it this way:
$files = @('*MyProgram.exe', '*MyProgram.exe.config', '*MyProgram.pdb', '*\XmlConfig\*.xml')

Get-ChildItem ..\bin\release -r |
    Foreach {$fn = $_.Fullname; $_} |
    Where {!$_.PSIsContainer -and ($files | Where {$fn -like $_})} |
    Copy-Item -Dest .\test
I'm trying to create a script to delete cabinet files on virtual servers. For some reason, the code I've created ends up not deleting any cabinet files and instead tries to delete the entire WINDOWS directory, and I have no idea why this is occurring. I was curious if anyone might have ideas about what the issue may be, since I can't find anything:
$dir = "\\$server" + '\C$\windows'
$cabinetArray = @()

foreach ($item in get-childitem -path $dir) {
    if ($item.name -like "*.cab") {
        $cabinetArray = $cabinetArray + $item
    }
}

for ($i = 0; $i -le $cabinetArray.length; $i++) {
    $removal = $dir + "\" + $cabinetArray[$i]
    remove-item $removal -force -recurse
}
I did some testing, and it seems that the array I'm using to gather all the cabinet files isn't even getting filled, for some reason. I'm not sure if there's a specific way to gather only the .cab files, since right now, whenever I run this on my test server, it tries deleting everything.
I don't know if deleting all the cab files in that folder is a good idea or not, but I'll answer your question. You're doing a lot of math and building your own collection of objects when PowerShell will do it all for you. Try something like this:
$dir = "\\" + $server + '\C$\windows'
$cabinetFiles = Get-ChildItem -Path $dir -Filter "*.cab" -Recurse

$cabinetFiles | %{
    Remove-Item -Path $_.FullName -Force
}
Or, as a one liner:
Get-ChildItem -Path ("\\" + $server + '\C$\windows') -Filter "*.cab" -Recurse | %{Remove-Item -Path $_.FullName -Force}
Use the pipeline; here's a simplified version of your code (remove -WhatIf to delete the files). The code gets all *.cab files from the Windows directory of the remote box (recursively, via the admin$ share), makes sure that only file objects pass on, and then deletes them.
Get-ChildItem "\\$server\admin$" -Filter *.cab -Recurse |
Where-Object {!$_.PSIsContainer} |
Remove-Item -Force -WhatIf
For some reason, the code that I've created ends up not deleting any cabinet files and instead tries to delete the entire WINDOWS Directory, and I have no idea why this is occurring.
It is occurring because your for loop is entered even though $cabinetArray is empty: its length is zero, and the -le comparison makes 0 -le 0 evaluate to true, so the body runs once with $cabinetArray[0] evaluating to $null. The $removal variable is then assigned the value of $dir plus a trailing backslash, and you are calling remove-item -recurse on the windows directory itself. Changing -le to -lt removes the extra iteration, and filtering for *.cab as shown above fixes the empty array.
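A minimal repro of the off-by-one:

$empty = @()
for ($i = 0; $i -le $empty.Length; $i++) {
    # 0 -le 0 is true, so the body runs once even though the array is empty;
    # $empty[0] is $null, which stringifies to an empty string.
    "iteration $i, element: '$($empty[$i])'"
}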