how to make this script run faster for a huge fileshare? - powershell

New here and still learning PowerShell, so forgive me for any mistakes.
A senior staff member left abruptly, and I was tasked with finding every folder in DFS that the employee had access to (for security reasons).
I couldn't find an existing script that does that (scan 14 TB of DFS shares for folders the user, or his group memberships, may have access to), so I wrote my own.
It works fine, but it's too slow for my liking, and I'm wondering if it can be tuned to run faster.
It runs in two parts: first saving the folder list, then using each folder path to get the ACL permissions and filter against the username into a CSV (with ~ as the delimiter to avoid problems with commas).
Using PowerShell 5.1:
$ErrorActionPreference = "Continue"
#$rootDirectory = 'C:\temp'
$rootDirectory = '\\?\UNC\myServer\myShare'
$scriptName = 'myACL'
$version = 1.0
$dateStamp = (Get-Date).ToString('yyyyMMddHHmm')
$scriptDirectory = $PSScriptRoot
$log = $scriptDirectory + "\" + $scriptName + "_dirList_v" + $version + "_"+$dateStamp+".log"
"Path" | Out-File $log
function getSubfolders ([String]$arg_directory, [string]$arg_log)
{
    $subFolders = Get-ChildItem -LiteralPath $arg_directory -Directory -Force -ErrorAction SilentlyContinue | Select-Object -ExpandProperty FullName
    $subFolders | Out-File $arg_log -Append
    #"just before loop" | Out-File $arg_log -append
    foreach ($folder in $subFolders)
    {
        #"working on $folder" | Out-File $arg_log -append
        getSubfolders $folder $arg_log
    }
    #"returning from function" | Out-File $arg_log -append
}
#part1
getSubfolders $rootDirectory $log
#part2
$dirListSourceFile = $log
$log2 = $scriptDirectory + "\" + $scriptName + "_permissionList_v" + $version + "_"+$dateStamp+".csv"
$i=0
"Sr~Path~User/Group~Rights~isInherited?" | Out-File $log2
Start-Sleep -s 2
Import-CSV $dirListSourceFile | ForEach-Object {
    $i++
    $path = $_.path.Trim()
    $Acl = get-acl $path | Select *
    ForEach ($Access in $Acl.Access)
    {
        if ($Access.IdentityReference.value -eq "mydomain\user1" -or $Access.IdentityReference.value -eq "mydomain\sg1" -or $Access.IdentityReference.value -eq "mydomain\sg2" -or $Access.IdentityReference.value -eq "mydomain\sg3" -or $Access.IdentityReference.value -eq "mydomain\sg4")
        {
            "$i~$path~$($Access.IdentityReference.value)~$($Access.FileSystemRights)~$($Access.IsInherited)" | Out-File $log2 -append
        }
    }
}

As you can read in the comments, if you have the possibility to run the code locally, do it. You can use the same technique as you did for the UNC path with local paths too, e.g. \\?\C:\directory
see:
https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=registry
Furthermore, you wrote a recursive function, but that's not necessary in this case, as Get-ChildItem has this feature built in. Currently you call Get-ChildItem again for each subfolder, and you also write a log entry to disk each time. It's faster to collect the data and write it to disk one time, e.g.:
#Get paths locally and add \\?\ in combination with -PSPath to overcome the 260-character MAX_PATH limit; add -Recurse for recursive enumeration
$folders = Get-ChildItem -PSPath "\\?\[localpath]" -Directory -Recurse -ErrorAction:SilentlyContinue
#Write to logfile
$folders.FullName | Set-Content -Path $arg_log
Also, if you want to optimize performance, avoid unnecessary operations like this:
$Acl = get-acl $path | Select *
Get-Acl already gives you a complete object, yet you send it over the pipeline and select all (*) properties from it. Why? This is enough: $Acl = Get-Acl $path
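In the same vein, the long -eq/-or chain in part 2 can be collapsed into a containment test. A minimal sketch, using the same placeholder identities as in the question:
# Collect the identities once, then test membership with -in (case-insensitive)
$targets = 'mydomain\user1','mydomain\sg1','mydomain\sg2','mydomain\sg3','mydomain\sg4'
$Acl = Get-Acl -LiteralPath $path
foreach ($Access in $Acl.Access) {
    if ($Access.IdentityReference.Value -in $targets) {
        "$i~$path~$($Access.IdentityReference.Value)~$($Access.FileSystemRights)~$($Access.IsInherited)" | Out-File $log2 -Append
    }
}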
Finally, you may use the IO classes directly instead of Get-ChildItem - see:
How to speed up Powershell Get-Childitem over UNC
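For illustration, a rough sketch of that idea (my own approximation, not the linked answer verbatim): a manual stack-based walk over [System.IO.Directory]::EnumerateDirectories avoids the recursive function calls, and it also avoids the problem that a single access-denied folder aborts a one-shot AllDirectories enumeration. $rootDirectory and $log are the question's own variables.
# Sketch: iterative walk with System.IO; one unreadable folder doesn't stop the scan
$stack = [System.Collections.Generic.Stack[string]]::new()
$stack.Push($rootDirectory)
$all = [System.Collections.Generic.List[string]]::new()
while ($stack.Count -gt 0) {
    $current = $stack.Pop()
    try {
        foreach ($dir in [System.IO.Directory]::EnumerateDirectories($current)) {
            $all.Add($dir)
            $stack.Push($dir)
        }
    } catch { }  # skip folders we cannot read
}
$all | Set-Content -Path $log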

Related

Get-childitem is not working with foreach loop

I'm trying to delete files from multiple paths,
so I have created a CSV file like path,days,filter.
I'm importing this file in the shell and looping over each object to delete contents, but Get-ChildItem is failing again and again,
and I'm unable to understand the reason behind that.
Below is the code for what I'm trying to achieve:
Start-Transcript -Path "D:\delete.log"
$pathlist = Import-Csv -LiteralPath "D:\diskpath.csv"
$count = 0
foreach ($p in $pathlist) {
    Write-Host $p.path " | " $p.days -ForegroundColor DarkCyan
    $path = $p.path
    $days = $p.days
    $filter = $p.filter
    Get-ChildItem -Path $path -Filter $filter | Where-Object {$_.LastWriteTime -lt [datetime]::Now.AddDays(-$days)} | Remove-Item -Force -Verbose -Recurse -ErrorAction SilentlyContinue -Confirm $false
}
Stop-Transcript
Without the foreach loop the script executes properly, but with the loop it fails.
Please let me know if any further information is needed on this query;
I will gladly provide it.
I have already googled and read multiple questions here at SO, but was unable to find the reason behind the failure.
@T-Me and @Theo
Thanks for highlighting the error. I hadn't spotted the typo in the script (when typing manually in PowerShell I wrote it correctly, but in the script I made the mistake). It's working now:
Start-Transcript -Path "D:\delete.log"
$pathlist = Import-Csv -LiteralPath "D:\diskpath.csv"
$count = 0
foreach ($p in $pathlist) {
    Write-Host $p.path " | " $p.days -ForegroundColor DarkCyan
    $path = $p.path
    $days = $p.days
    $filter = $p.filter
    Get-ChildItem -Path $path -Filter $filter | Where-Object {$_.LastWriteTime -lt [datetime]::Now.AddDays(-$days)} | Remove-Item -Force -Verbose -Recurse
}
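For anyone who lands here later, the typo was the space in -Confirm $false: a switch parameter takes its value with a colon, and a bare $false after a space binds positionally (to -Path here) and breaks the call. The intended form:
Get-ChildItem -Path $path -Filter $filter | Where-Object {$_.LastWriteTime -lt [datetime]::Now.AddDays(-$days)} | Remove-Item -Force -Verbose -Recurse -Confirm:$false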

Powershell: Script to search all user profiles and copy the most recent to all user profiles

I am looking to write a powershell script to search all user profiles on a server for a specific file, compare the files by the lastmodifieddate, and then copy the newest file to all user profiles. The script will also create a backup of the last three versions of the file.
I previously wrote this script for our pilot environment where only two people were accessing the app (this is for a XenApp), but the user base has now expanded and I would like to create the prod version of the script to cover future growth.
Any help would be very much appreciated. Thanks! Script below...
$SRC1 = "\\Server\c$\Users\XXXX1\AppData\Roaming\EMIESiteListManager\sitelist.xml"
$SRC2 = "\\Server\c$\Users\XXXX2\AppData\Roaming\EMIESiteListManager\sitelist.xml"
$SRC3 = "\\Server\c$\Users\XXXX3\AppData\Roaming\EMIESiteListManager\sitelist.xml"
$BKU = "\\storage\IT\EMSLM\Backup"
if ( (get-item $SRC1).LastWriteTime -gt (get-item $SRC2).LastWriteTime ) {Copy-Item $SRC1 $SRC2}
else {Copy-Item $SRC2 $SRC1}
if ( (get-item $SRC1).LastWriteTime -gt (get-item $SRC3).LastWriteTime ) {Copy-Item $SRC1 $SRC3}
else {Copy-Item $SRC3 $SRC1}
if ( (get-item $SRC1).LastWriteTime -gt (get-item $SRC2).LastWriteTime ) {Copy-Item $SRC1 $SRC2}
Remove-Item $BKU\sitelist_old_2.xml
Rename-Item $BKU\sitelist_old_1.xml $BKU\sitelist_old_2.xml
Rename-Item $BKU\sitelist.xml $BKU\sitelist_old_1.xml
Copy-Item $SRC1 $BKU
& 'C:\Program Files (x86)\Enterprise Mode Site List Manager\EMIESiteListManager.exe'
Exit
this isn't everything, but it should be a good place to start
$users = dir "\\Server\c$\Users" -Directory | select -ExpandProperty fullname
$newest = dir "\\Server\c$\Users\*\AppData\Roaming\EMIESiteListManager\sitelist.xml" | sort lastwritetime -Descending | select -First 1 -ExpandProperty fullname
$files = @()
$users | % {
    $files += $newest -replace [regex]::Escape($_)
}
$newestEnd = $files | sort {$_.length} | select -f 1
$users | % {
    $dest = Join-Path $_ $newestEnd
    copy $newest $dest -force
}
Working off of Anthony Stringer's response, I was able to build a script that meets my exact needs. Anthony's script would have worked but was missing a couple of things I wanted:
1.) Identify all profiles with an existing sitelist.xml file and place them in an array or hash table.
2.) Copy only to those user profiles where the sitelist.xml file existed (my fault; I never requested this in my original question).
Thank you, Anthony, for the starting point. Updated script below:
$Users = dir "\\server\c$\Users" -Directory -Exclude Public, Default, Administrator* | select -ExpandProperty fullname
$FilePath = "AppData\Roaming\EMIESiteListManager\sitelist.xml"
$UserPath = Join-Path -path $Users $filePath
$NewestFile = dir "\\server\c$\Users\*\AppData\Roaming\EMIESiteListManager\sitelist.xml" | sort lastwritetime -Descending | select -First 1 -ExpandProperty fullname
$BackUp = "\\storage\ctxvol01\appdata\IT\EMSLM\Backup"
$BackUpFile = "\\storage\ctxvol01\appdata\IT\EMSLM\Backup\sitelist.xml"
$EMSLM_Users = @()
$UserPath | ForEach {
    If ((Test-Path -Path $_) -eq $true)
    {$EMSLM_Users += $_}
}
$EMSLM_Users | ForEach-Object {
    Copy-Item $NewestFile $_ -Force -ErrorAction SilentlyContinue
}
If ((Get-Item $NewestFile).LastWriteTime -gt (Get-Item $BackUpFile).LastWriteTime)
{
    Remove-Item $BackUp\sitelist_old_2.xml
    Rename-Item $BackUp\sitelist_old_1.xml $BackUp\sitelist_old_2.xml
    Rename-Item $BackUp\sitelist.xml $BackUp\sitelist_old_1.xml
    Copy-Item $NewestFile $BackUp
}
& 'C:\Program Files (x86)\Enterprise Mode Site List Manager\EMIESiteListManager.exe'
Exit

How to recursively remove all empty folders in PowerShell?

I need to recursively remove all empty folders for a specific folder in PowerShell (checking folder and sub-folder at any level).
At the moment I am using this script with no success.
Could you please tell me how to fix it?
$tdc='C:\a\c\d\'
$a = Get-ChildItem $tdc -recurse | Where-Object {$_.PSIsContainer -eq $True}
$a | Where-Object {$_.GetFiles().Count -eq 0} | Select-Object FullName
I am using PowerShell on Windows 8.1 version.
You need to keep a few key things in mind when looking at a problem like this:
Get-ChildItem -Recurse performs head recursion, meaning it returns folders as soon as it finds them when walking through a tree. Since you want to remove empty folders, and also remove their parent if they are empty after you remove the empty folders, you need to use tail recursion instead, which processes the folders from the deepest child up to the root. By using tail recursion, there will be no need for repeated calls to the code that removes the empty folders -- one call will do it all for you.
Get-ChildItem does not return hidden files or folders by default. As a result you need to take extra steps to ensure that you don't remove folders that appear empty but that contain hidden files or folders. Get-Item and Get-ChildItem both have a -Force parameter which can be used to retrieve hidden files or folders as well as visible files or folders.
With those points in mind, here is a solution that uses tail recursion and that properly tracks hidden files or folders, making sure to remove hidden folders if they are empty and also making sure to keep folders that may contain one or more hidden files.
First this is the script block (anonymous function) that does the job:
# A script block (anonymous function) that will remove empty folders
# under a root folder, using tail-recursion to ensure that it only
# walks the folder tree once. -Force is used to be able to process
# hidden files/folders as well.
$tailRecursion = {
    param(
        $Path
    )
    foreach ($childDirectory in Get-ChildItem -Force -LiteralPath $Path -Directory) {
        & $tailRecursion -Path $childDirectory.FullName
    }
    $currentChildren = Get-ChildItem -Force -LiteralPath $Path
    $isEmpty = $currentChildren -eq $null
    if ($isEmpty) {
        Write-Verbose "Removing empty folder at path '${Path}'." -Verbose
        Remove-Item -Force -LiteralPath $Path
    }
}
If you want to test it here's code that will create interesting test data (make sure you don't already have a folder c:\a because it will be deleted):
# This creates some test data under C:\a (make sure this is not
# a directory you care about, because this will remove it if it
# exists). This test data contains a directory that is hidden
# that should be removed as well as a file that is hidden in a
# directory that should not be removed.
Remove-Item -Force -Path C:\a -Recurse
New-Item -Force -Path C:\a\b\c\d -ItemType Directory > $null
$hiddenFolder = Get-Item -Force -LiteralPath C:\a\b\c
$hiddenFolder.Attributes = $hiddenFolder.Attributes -bor [System.IO.FileAttributes]::Hidden
New-Item -Force -Path C:\a\b\e -ItemType Directory > $null
New-Item -Force -Path C:\a\f -ItemType Directory > $null
New-Item -Force -Path C:\a\f\g -ItemType Directory > $null
New-Item -Force -Path C:\a\f\h -ItemType Directory > $null
Out-File -Force -FilePath C:\a\f\test.txt -InputObject 'Dummy file'
Out-File -Force -FilePath C:\a\f\h\hidden.txt -InputObject 'Hidden file'
$hiddenFile = Get-Item -Force -LiteralPath C:\a\f\h\hidden.txt
$hiddenFile.Attributes = $hiddenFile.Attributes -bor [System.IO.FileAttributes]::Hidden
Here's how you use it. Note that this will remove the top folder (the C:\a folder in this example, which gets created if you generated the test data using the script above) if that folder winds up being empty after deleting all empty folders under it.
& $tailRecursion -Path 'C:\a'
You can use this:
$tdc="C:\a\c\d"
$dirs = gci $tdc -directory -recurse | Where { (gci $_.fullName).count -eq 0 } | select -expandproperty FullName
$dirs | Foreach-Object { Remove-Item $_ }
$dirs will be an array of empty directories returned from the Get-ChildItem command after filtering. You can then loop over it to remove the items.
Update
If you want to remove directories that contain empty directories, you just need to keep running the script until they're all gone. You can loop until $dirs is empty:
$tdc="C:\a\c\d"
do {
    $dirs = gci $tdc -directory -recurse | Where { (gci $_.fullName).count -eq 0 } | select -expandproperty FullName
    $dirs | Foreach-Object { Remove-Item $_ }
} while ($dirs.count -gt 0)
If you want to ensure that hidden files and folders will also be removed, include the -Force flag:
do {
    $dirs = gci $tdc -directory -recurse | Where { (gci $_.fullName -Force).count -eq 0 } | select -expandproperty FullName
    $dirs | Foreach-Object { Remove-Item $_ }
} while ($dirs.count -gt 0)
Get-ChildItem $tdc -Recurse -Force -Directory |
Sort-Object -Property FullName -Descending |
Where-Object { $($_ | Get-ChildItem -Force | Select-Object -First 1).Count -eq 0 } |
Remove-Item -Verbose
The only novel contribution here is using Sort-Object to reverse sort by the directory's FullName. This will ensure that we always process children before we process parents (i.e., "tail recursion" as described by Kirk Munro's answer). That makes it recursively remove empty folders.
Off hand, I'm not sure if the Select-Object -First 1 will meaningfully improve performance or not, but it may.
Just figured I would contribute to the already long list of answers here.
Many of the answers have quirks to them, like needing to run more than once. Others are overly complex for the average user (like using tail recursion to prevent duplicate scans, etc).
Here is a very simple one-liner that I've been using for years, and works great...
It does not account for hidden files/folders, but you can fix that by adding -Force to the Get-ChildItem command
This is the long, fully qualified cmdlet name version:
Get-ChildItem -Recurse -Directory | ? { -Not ($_.EnumerateFiles('*',1) | Select-Object -First 1) } | Remove-Item -Recurse
So basically...here's how it goes:
Get-ChildItem -Recurse -Directory - Start scanning recursively looking for directories
$_.EnumerateFiles('*',1) - For each directory...Enumerate the files
EnumerateFiles yields its findings as it goes, while GetFiles returns only once it has built the complete list... at least, that's how it is supposed to work in .NET; for some reason, in PowerShell, GetFiles appears to start spitting out results immediately too. But I still use EnumerateFiles because in testing it was reliably faster.
('*',1) means find ALL files recursively.
| Select-Object -First 1 - Stop at the first file found
It was difficult to test how much this helped. In some cases it helped tremendously, other times it didn't help at all, and in some cases it slowed things down by a small amount. So I really don't know; I guess this part is optional.
| Remove-Item -Recurse - Remove the directory, recursively (ensures directories that contain empty sub directories gets removed)
If you're counting characters, this could be shortened to:
ls -s -ad | ? { -Not ($_.EnumerateFiles('*',1) | select -First 1) } | rm -Recurse
-s - alias for -Recurse
-ad - alias for -Directory
If you really don't care about performance because you don't have that many files, you can shorten it even further:
ls -s -ad | ? {!($_.GetFiles('*',1))} | rm -Recurse
Side note:
While playing around with this, I started testing various versions with Measure-Command against a server with millions of files and thousands of directories.
This is faster than the command I've been using (above):
(gi .).EnumerateDirectories('*',1) | ? {-Not $_.EnumerateFiles('*',1) } | rm -Recurse
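If you want to repeat the comparison yourself, a minimal timing harness along these lines works (the Remove-Item stage is left off on purpose so nothing gets deleted while timing):
# Each Measure-Command call returns a TimeSpan for one variant
$t1 = Measure-Command { ls -s -ad | ? { -Not ($_.EnumerateFiles('*',1) | select -First 1) } }
$t2 = Measure-Command { (gi .).EnumerateDirectories('*',1) | ? { -Not $_.EnumerateFiles('*',1) } }
"pipeline: $($t1.TotalSeconds)s vs direct .NET: $($t2.TotalSeconds)s"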
ls c:\temp -rec |%{ if ($_.PSIsContainer -eq $True) {if ( (ls $_.fullname -rec | measure |select -expand count ) -eq "0" ){ ri $_.fullname -whatif} } }
Assuming you're inside the parent folder of interest
gci . -Recurse -Directory | % { if(!(gci -Path $_.FullName)) {ri -Force -Recurse $_.FullName} }
For your case with $tdc it'll be
gci $tdc -Recurse -Directory | % { if(!(gci -Path $_.FullName)) {ri -Force -Recurse $_.FullName} }
If you just want to make sure that you delete only folders that may contain subfolders but no files, within themselves or their subfolders, this may be an easier and quicker way:
$Empty = Get-ChildItem $Folder -Directory -Recurse |
    Where-Object {(Get-ChildItem $_.FullName -File -Recurse -Force).Count -eq 0}
Foreach ($Dir in $Empty)
{
    if (Test-Path $Dir.FullName)
    {Remove-Item -LiteralPath $Dir.FullName -Recurse -Force}
}
Recursively removing empty subdirectories can also be accomplished using a "For Loop".
Before we start, let's make some subdirectories & text files to work with in $HOME\Desktop\Test
MD $HOME\Desktop\Test\0\1\2\3\4\5
MD $HOME\Desktop\Test\A\B\C\D\E\F
MD $HOME\Desktop\Test\A\B\C\DD\EE\FF
MD $HOME\Desktop\Test\Q\W\E\R\T\Y
MD $HOME\Desktop\Test\Q\W\E\RR
"Hello World" > $HOME\Desktop\Test\0\1\Text1.txt
"Hello World" > $HOME\Desktop\Test\A\B\C\D\E\Text2.txt
"Hello World" > $HOME\Desktop\Test\A\B\C\DD\Text3.txt
"Hello World" > $HOME\Desktop\Test\Q\W\E\RR\Text4.txt
First, store the following script block in the variable $SB. The block can be invoked later with & $SB, which will output a list of the empty subdirectories contained in $HOME\Desktop\Test.
$SB = {
    Get-ChildItem $HOME\Desktop\Test -Directory -Recurse |
    Where-Object {(Get-ChildItem $_.FullName -Force).Count -eq 0}
}
NOTE: The -Force parameter is very important. It makes sure that directories which contain hidden files and subdirectories, but are otherwise empty, are not deleted in the "For Loop".
Now use a "For Loop" to recursively remove empty subdirectories in $HOME\Desktop\Test
For ($Empty = &$SB ; $Empty -ne $null ; $Empty = &$SB) {Remove-Item (&$SB).FullName}
Tested as working on PowerShell 4.0
I have adapted the script from RichardHowells.
Unlike the original, it also removes Thumbs.db files, so a folder whose only content is a Thumbs.db is still treated as empty and deleted.
##############
# Parameters #
##############
param(
    $Chemin = "", # Path to clean
    $log = ""     # Log file path
)
###########
# Process #
###########
if (($Chemin -eq "") -or ($log -eq "")){
    Write-Error 'Parameters not supplied - use the syntax: -Chemin "argument" -log "argument 2"' -Verbose
    Exit
}
#logging
$date = get-date -format g
Write-Output "Beginning of cleaning folder: $chemin at $date" >> $log
Write-Output "------------------------------------------------------------------------------------------------------------" >> $log
<########################################################################
define a script block that will remove empty folders under a root folder,
using tail-recursion to ensure that it only walks the folder tree once.
-Force is used to be able to process hidden files/folders as well.
########################################################################>
$tailRecursion = {
    param(
        $Path
    )
    foreach ($childDirectory in Get-ChildItem -Force -LiteralPath $Path -Directory) {
        & $tailRecursion -Path $childDirectory.FullName
    }
    $currentChildren = Get-ChildItem -Force -LiteralPath $Path
    <# Remove Thumbs.db files; stop at the first other file,
       since the folder cannot be empty in that case anyway #>
    Foreach ($file in $currentChildren)
    {
        if ($file.name -notmatch "Thumbs.db"){break}
        if ($file.name -match "Thumbs.db"){
            Remove-Item -Force -LiteralPath $file.FullName}
    }
    $currentChildren = Get-ChildItem -Force -LiteralPath $Path
    $isEmpty = $currentChildren -eq $null
    if ($isEmpty) {
        $date = get-date -format g
        Write-Output "Removing empty folder at path '${Path}'. $date" >> $log
        Remove-Item -Force -LiteralPath $Path
    }
}
# Invocation of the script block
& $tailRecursion -Path $Chemin
#logging
$date = get-date -format g
Write-Output "End of cleaning folder: $chemin at $date" >> $log
Write-Output "------------------------------------------------------------------------------------------------------------" >> $log
Something like this works for me. The script deletes empty folders and folders that contain only folders (no files, not even hidden ones).
$items = gci -LiteralPath E:\ -Directory -Recurse
$dirs = [System.Collections.Generic.HashSet[string]]::new([string[]]($items | % FullName))
for (;;) {
    $remove = $dirs | ? { (gci -LiteralPath $_ -Force).Count -eq 0 }
    if ($remove) {
        $remove | rm
        $dirs.ExceptWith([string[]]$remove)
    }
    else {
        break
    }
}
I wouldn't take the comments/first post to heart unless you also want to delete files that are nested more than one folder deep: checking files alone means you can end up deleting directories that contain directories that contain files. This is better:
$FP= "C:\Temp\"
$dirs= Get-Childitem -LiteralPath $FP -directory -recurse
$Empty= $dirs | Where-Object {$_.GetFiles().Count -eq 0 **-and** $_.GetDirectories().Count -eq 0} |
Select-Object FullName
The above checks that the directory is in fact empty, whereas the OP only checks that there are no files; that in turn would result in files nested a few folders deep being deleted as well.
You may need to run the above a few times, as it won't delete dirs that have nested dirs; it only deletes the deepest level. So loop it until they're all gone (see the sketch below).
Something else I do not do is use the -Force parameter. That is by design: if Remove-Item hits a dir that is not empty, you want to be prompted as an additional safety.
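A minimal sketch of that outer loop, reusing $FP and the filter from above:
do {
    $Empty = Get-ChildItem -LiteralPath $FP -Directory -Recurse |
        Where-Object { $_.GetFiles().Count -eq 0 -and $_.GetDirectories().Count -eq 0 }
    $Empty | ForEach-Object { Remove-Item -LiteralPath $_.FullName }  # deliberately no -Force
} while ($Empty)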
$files = Get-ChildItem -Path c:\temp -Recurse -Force | where psiscontainer ; [array]::reverse($files)
[Array]::Reverse($files) reverses the array in place, so you get the deepest items in the hierarchy first.
I use this to manipulate file names whose file paths are too long, before I delete them.
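For completeness, a possible delete step over that reversed list (my sketch, not part of the original answer): because children come first, a folder that is empty by the time it is reached can be removed safely.
foreach ($dir in $files) {
    if (-not (Get-ChildItem -Force -LiteralPath $dir.FullName)) {
        Remove-Item -Force -LiteralPath $dir.FullName
    }
}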
This is a simple approach
dir -Directory | ? { (dir $_).Count -eq 0 } | Remove-Item
This will remove all empty folders under the specified directory $tdc.
It is also a lot faster, since there's no need for multiple runs.
$tdc = "x:\myfolder" # Specify the root folder
gci $tdc -Directory -Recurse `
    | Sort-Object { $_.FullName.Length } -Descending `
    | ? { $_.GetFiles().Count -eq 0 } `
    | % {
        if ($_.GetDirectories().Count -eq 0) {
            Write-Host " Removing $($_.FullName)"
            $_.Delete()
        }
    }
#By Mike Mike Costa Rica
$CarpetasVacias = Get-ChildItem -Path $CarpetaVer -Recurse -Force -Directory | Where {(gci $_.fullName).count -eq 0} | select Fullname,Name,LastWriteTime
$TotalCarpetas = $CarpetasVacias.Count
$CountSu = 1
ForEach ($UnaCarpeta in $CarpetasVacias){
    $RutaCarp = $UnaCarpeta.Fullname
    Remove-Item -Path $RutaCarp -Force -Confirm:$False -ErrorAction Ignore
    $testCar = Test-Path $RutaCarp
    if ($testCar -eq $true){
        $Datem = (Get-Date).ToString("MM-dd-yyyy HH:mm:ss")
        Write-Host "$Datem ---> $CountSu of $TotalCarpetas folders: error deleting directory: $RutaCarp" -ForegroundColor "Red"
    } else {
        $Datem = (Get-Date).ToString("MM-dd-yyyy HH:mm:ss")
        Write-Host "$Datem ---> $CountSu of $TotalCarpetas folders: deleted directory: $RutaCarp" -ForegroundColor "Green"
    }
    $CountSu += 1
}

powershell backup script with error logging per file

I really need help creating a backup script that reports each error together with the file that did not copy.
Here is what I tried:
Creating lists of file paths to pass to Copy-Item, in hopes of later catching errors per file and logging them.
By using $list2X I would be able to cycle through each file, but Copy-Item loses the directory structure and shoots it all out into a single folder.
So for now I am using $list2, and later I do Copy-Item -Recurse to copy the folders:
#create list to copy
$list = Get-ChildItem -path $source | Select-Object Fullname
$list2 = $list -replace ("}"),("")
$list2 = $list2 -replace ("#{Fullname=") , ("")
out-file -FilePath g:\backuplog\DirList.txt -InputObject $list2
#create list crosscheck later
$listX = Get-ChildItem -path $source -recurse | Select-Object Fullname
$list2X = $listX -replace ("}"),("")
$list2X = $list2X -replace ("#{Fullname=") , ("")
out-file -FilePath g:\backuplog\FileDirList.txt -InputObject $list2X
And here I would pass the list:
$error.clear()
Foreach ($item in $list2){
    Copy-Item -Path $item -Destination $destination -Recurse -Force -ErrorAction Continue
}
out-file -FilePath g:\backuplog\errorsBackup.txt -InputObject $error
Any help with this is greatly appreciated!!!
The answer to complex file-copying or backup scripts is almost always: "Use robocopy."
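For example, a minimal sketch (the switch choices are my assumption; run robocopy /? for the full list): /E copies subfolders including empty ones, /LOG writes a per-file log that lists the files that failed, /TEE echoes to the console as well, and /R and /W keep retry waits short.
robocopy $source $destination /E /R:1 /W:1 /LOG:g:\backuplog\robocopy.log /TEE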
Bill
"Want to copy all the items in C:\Scripts (including subfolders) to C:\Test? Then simply use a wildcard character..."
Next make it easier on yourself and do something like this:
$files = (Get-ChildItem $path).FullName #Requires PS 3.0
#or
$files = Get-ChildItem $path | % {$_.Fullname}
$files | Out-File $outpath
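On the original complaint that Copy-Item flattens the tree: one way to keep the structure while still copying file by file (so each failure can be logged individually) is to rebuild every destination path from the source-relative part. A rough sketch, assuming $source and $destination as defined in the question:
Get-ChildItem -Path $source -Recurse -File | ForEach-Object {
    # re-root the file under $destination, keeping its subfolder path
    $target = Join-Path $destination $_.FullName.Substring($source.Length).TrimStart('\')
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    Copy-Item -LiteralPath $_.FullName -Destination $target -Force -ErrorAction Continue
}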
Well, it took me a long time considering my response time, but here is my copy function, which logs most errors (network drops, failed copies, etc.) along with the target object:
Function backUP{ Param ([string]$destination1, $list1)
    $destination2 = $destination1
    #derive the backup-log folder name from the destination
    $index = $destination2.LastIndexOf("\")
    $count = $destination2.length - $index
    $source1 = $destination2.Substring($index, $count)
    $finalstr2 = $logdrive + $source1
    Foreach ($item in $list1){
        Copy-Item -Container:$true -Recurse -Force -Path $item -Destination $destination1 -ErrorAction Continue
        if (-not $?)
        {
            write-output "Copy error: " $error | format-list | out-file -Append "$finalstr2\GCI-ERRORS-backup.txt"
            Foreach ($erritem in $error){
                write-output "Error Data:" $erritem.TargetObject | out-file -Append "$finalstr2\GCI-ERRORS-backup.txt"
            }
            $error.Clear()
        }
    }
}

foreach copy-item not working

My Copy-Item doesn't work when included in a foreach loop.
Much like Powershell: Copy-Item not working when in ForEach loop
Only my destination folder is not set the same as the originating folder which seemed to be the problem there.
This is the very basic function. My objective is to grab the latest log files from a directory containing log files for lots of stuff; I'm only interested in a few, defined in $servers. A line in Servers.txt looks like this: \\clientname\d$\logdirectory\processlog\
When I Set-Location to a path in servers.txt and run Get-ChildItem it works as expected.
I also need to generate a new folder for each object in \Logs\ but one thing at a time.
$servers = @()
$servers = Get-Content c:\Test\Servers.txt
$destServer = @()
$destServer = ("clientname")
$destinationFolder = "\\" + $destServer + "\d$\Logs\"
foreach ($serverpath in $servers) {
    Write-Host " Copying from $serverpath "
    Set-Location -literalpath $serverpath |
    Get-ChildItem |
    Sort-Object -Descending LastWriteTime |
    Select -First 2 |
    Copy-Item -Destination $destinationFolder -Recurse -Force
}
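One observation that may explain the failure: Set-Location produces no pipeline output (unless -PassThru is given), so the Get-ChildItem after the pipe never receives anything inside the loop. A minimal sketch of the likely fix is to drop Set-Location and hand the path to Get-ChildItem directly:
foreach ($serverpath in $servers) {
    Write-Host " Copying from $serverpath "
    # enumerate the remote folder directly instead of piping Set-Location
    Get-ChildItem -LiteralPath $serverpath |
        Sort-Object -Descending LastWriteTime |
        Select-Object -First 2 |
        Copy-Item -Destination $destinationFolder -Recurse -Force
}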