Really need help creating a script that backs up files and writes out each error along with the file that did not copy.
Here is what I tried:
I am creating lists of file paths to pass to Copy-Item, hoping to catch errors per file and log them later:
By using $list2X I would be able to cycle through each file, but Copy-Item then loses the directory structure and dumps everything into a single folder.
So for now I am using $list2 and do Copy-Item -Recurse afterwards to copy the folders:
#create list of top-level items to copy
$list = Get-ChildItem -Path $source | Select-Object FullName
$list2 = $list -replace "}", ""
$list2 = $list2 -replace "@{FullName=", ""
Out-File -FilePath g:\backuplog\DirList.txt -InputObject $list2
#create list of all files to crosscheck later
$listX = Get-ChildItem -Path $source -Recurse | Select-Object FullName
$list2X = $listX -replace "}", ""
$list2X = $list2X -replace "@{FullName=", ""
Out-File -FilePath g:\backuplog\FileDirList.txt -InputObject $list2X
And here I would pass the list:
$error.Clear()
Foreach ($item in $list2) {
    Copy-Item -Path $item -Destination $destination -Recurse -Force -ErrorAction Continue
}
Out-File -FilePath g:\backuplog\errorsBackup.txt -InputObject $error
Any help with this is greatly appreciated!!!
The answer to complex file-copying or backup scripts is almost always: "Use robocopy."
Bill
"Want to copy all the items in C:\Scripts (including subfolders) to C:\Test? Then simply use a wildcard character..."
Next make it easier on yourself and do something like this:
$files = (Get-ChildItem $path).FullName #Requires PS 3.0
#or
$files = Get-ChildItem $path | % {$_.Fullname}
$files | Out-File $outpath
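If you do want to stay with Copy-Item and log errors per file, a minimal sketch could look like this (assuming $source, $destination and the g:\backuplog folder from the question; -File needs PowerShell 3.0+). Rebuilding each file's relative path is what keeps the directory structure intact:
$files = Get-ChildItem -Path $source -Recurse -File | ForEach-Object { $_.FullName }
$failed = foreach ($file in $files) {
    # rebuild the relative path under $destination so the folder structure is preserved
    $target = Join-Path $destination $file.Substring($source.Length)
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    try {
        Copy-Item -Path $file -Destination $target -Force -ErrorAction Stop
    }
    catch {
        # record the file that failed plus the reason
        "{0} : {1}" -f $file, $_.Exception.Message
    }
}
$failed | Out-File g:\backuplog\errorsBackup.txt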
Well, it took me a long time (considering my response time), but here is my copy function, which logs most errors (network drops, failed copies, etc.) together with each error's TargetObject.
Function backUP {
    Param ([string]$destination1, $list1)
    $destination2 = $destination1
    #extract the last path segment to build the backup log folder
    $index = $destination2.LastIndexOf("\")
    $count = $destination2.Length - $index
    $source1 = $destination2.Substring($index, $count)
    $finalstr2 = $logdrive + $source1    # $logdrive must be defined in the calling scope
    Foreach ($item in $list1) {
        Copy-Item -Container:$true -Recurse -Force -Path $item -Destination $destination1 -ErrorAction Continue
        if (-not $?)
        {
            Write-Output "Copy error : " $error | Format-List | Out-File -Append "$finalstr2\GCI-ERRORS-backup.txt"
            Foreach ($erritem in $error) {
                Write-Output "Error Data:" $erritem.TargetObject | Out-File -Append "$finalstr2\GCI-ERRORS-backup.txt"
            }
            $error.Clear()
        }
    }
}
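Calling the function might then look like this (a sketch; the destination and log paths are just examples, and $logdrive is read from the calling scope, so the matching log folder has to exist already):
$logdrive    = "g:\backuplog"            # example log folder
$destination = "e:\backup\projects"      # example destination; "g:\backuplog\projects" must already exist
$list        = (Get-ChildItem -Path $source).FullName
backUP -destination1 $destination -list1 $list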
New here and getting to learn PowerShell, so forgive me for mistakes.
A senior staff member had left abruptly and I was tasked with finding every folder in DFS that the employee had access to (for security reasons).
I couldn't find a script that does that (scanning 14 TB of DFS shares for folders the user, or any of his group memberships, may have access to), so I just wrote my own.
It's working fine but too slow for my liking; I am wondering if it can be tuned to run faster.
It runs in two parts: first it saves the folder list, then it uses each folder path to get the ACL permissions and filters them against the usernames into a CSV (with ~ as the delimiter to avoid messing with commas).
Using PowerShell 5.1.
$ErrorActionPreference = "Continue"
#$rootDirectory = 'C:\temp'
$rootDirectory = '\\?\UNC\myServer\myShare'
$scriptName = 'myACL'
$version = 1.0
$dateStamp = (Get-Date).ToString('yyyyMMddHHmm')
$scriptDirectory = $PSScriptRoot
$log = $scriptDirectory + "\" + $scriptName + "_dirList_v" + $version + "_"+$dateStamp+".log"
"Path" | Out-File $log
function getSubfolders ([String]$arg_directory, [string]$arg_log)
{
    $subFolders = Get-ChildItem -LiteralPath $arg_directory -Directory -Force -ErrorAction SilentlyContinue | Select-Object -ExpandProperty FullName
    $subFolders | Out-File $arg_log -Append
    #"just before loop" | Out-File $arg_log -append
    foreach ($folder in $subFolders)
    {
        #"working on $folder" | Out-File $arg_log -append
        getSubfolders $folder $arg_log
    }
    #"returning from function" | Out-File $arg_log -append
}
#part1
getSubfolders $rootDirectory $log
#part2
$dirListSourceFile = $log
$log2 = $scriptDirectory + "\" + $scriptName + "_permissionList_v" + $version + "_"+$dateStamp+".csv"
$i=0
"Sr~Path~User/Group~Rights~isInherited?" | Out-File $log2
Start-Sleep -s 2
Import-CSV $dirListSourceFile | ForEach-Object {
    $i++
    $path = $_.path.Trim()
    $Acl = Get-Acl $path | Select *
    ForEach ($Access in $Acl.Access)
    {
        if ($Access.IdentityReference.value -eq "mydomain\user1" -or $Access.IdentityReference.value -eq "mydomain\sg1" -or $Access.IdentityReference.value -eq "mydomain\sg2" -or $Access.IdentityReference.value -eq "mydomain\sg3" -or $Access.IdentityReference.value -eq "mydomain\sg4")
        {
            "$i~$path~$($Access.IdentityReference.value)~$($Access.FileSystemRights)~$($Access.IsInherited)" | Out-File $log2 -append
        }
    }
}
As you can read in the comments, if you have the possibility to run the code locally, do it. You can use the same technique as you did for the UNC path for local paths too - e.g. \\?\C:\directory
see:
https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=registry
Furthermore, you wrote a recursive function, but that's not necessary in this case as Get-ChildItem has this feature built in. Currently you call Get-ChildItem again for each subfolder, and you also write a log entry to disk each time. It's faster to collect the data and write it to disk once, e.g.:
#Get paths locally and add \\?\ in combination with -PSPath to overcome the 256-character path length limit; add -Recurse for recursive enumeration
$folders = Get-ChildItem -PSPath "\\?\[localpath]" -Directory -Recurse -ErrorAction:SilentlyContinue
#Write to logfile
$folders.FullName | Set-Content -Path $arg_log
Also, if you want to optimize performance, avoid unnecessary operations like this:
$Acl = get-acl $path | Select *
Get-Acl already gives you a complete object, yet you send it over the pipeline and select all (*) properties from it. Why? $Acl = Get-Acl $path is enough.
Finally, you may use the .NET IO classes directly instead of Get-ChildItem - see:
How to speed up Powershell Get-Childitem over UNC
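A rough sketch of that approach, using .NET's lazy enumeration directly (assuming the same \\?\UNC\... root and that the .NET Framework build in use accepts the long-path prefix):
$root = '\\?\UNC\myServer\myShare'
try {
    [System.IO.Directory]::EnumerateDirectories($root, '*', [System.IO.SearchOption]::AllDirectories) |
        Set-Content -Path $log
}
catch [System.UnauthorizedAccessException] {
    # one inaccessible folder aborts the whole enumeration here;
    # for per-folder error handling you still need your own recursion
    Write-Warning $_.Exception.Message
}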
Trying to delete files from multiple paths, so I have created a CSV file with the columns path,days,filter.
I am importing this file in the shell and looping over each object to delete contents, but Get-ChildItem is failing again and again, and I am unable to understand the reason behind that.
Below is the code for what I'm trying to achieve:
Start-Transcript -Path "D:\delete.log"
$pathlist = Import-Csv -LiteralPath "D:\diskpath.csv"
$count = 0
foreach ($p in $pathlist) {
    Write-Host $p.path " | " $p.days -ForegroundColor DarkCyan
    $path = $p.path
    $days = $p.days
    $filter = $p.filter
    Get-ChildItem -Path $path -Filter $filter | Where-Object { $_.LastWriteTime -lt [datetime]::Now.AddDays(-$days) } | Remove-Item -Force -Verbose -Recurse -ErrorAction SilentlyContinue -Confirm $false
}
Stop-Transcript
Without the foreach loop the script executes properly, but with the loop it fails.
Please let me know if any further information is needed on this query; I will gladly provide it.
I have already googled and read multiple questions here on SO, but I am unable to find the reason behind the failure.
@T-Me and @Theo
Thanks for highlighting the error - I hadn't spotted the typo in the script, my bad. While typing it manually in PowerShell I was writing it correctly, but in the script I made the mistake (-Confirm $false instead of -Confirm:$false). Now it's working:
Start-Transcript -Path "D:\delete.log"
$pathlist = Import-Csv -LiteralPath "D:\diskpath.csv"
$count = 0
foreach ($p in $pathlist) {
    Write-Host $p.path " | " $p.days -ForegroundColor DarkCyan
    $path = $p.path
    $days = $p.days
    $filter = $p.filter
    Get-ChildItem -Path $path -Filter $filter | Where-Object { $_.LastWriteTime -lt [datetime]::Now.AddDays(-$days) } | Remove-Item -Force -Verbose -Recurse
}
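For reference, the diskpath.csv that gets imported would look something like this (hypothetical paths and values, matching the path, days and filter columns used above):
path,days,filter
D:\Logs\App1,30,*.log
D:\Temp\Exports,7,*.tmp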
Hi, this is the first time I have created a program using PowerShell. I wrote a script to move old files that are not in use to a NAS. The code works as I want, but I need a Log.txt file to find out which files have been moved. Can someone please help me?
$Date = Get-Date -UFormat %d-%m-%Y
$Source = 'C:\Source\'
$Temp = 'C:\Backup-Temp\'
$Nas = 'D:\Destination\'
$Ext = "*.zip","*.rar"
$SetTime = '-5'
New-Item -Path $Temp -Name "Backup-$Date" -ItemType "directory"
Foreach ($Ext in $Ext) {
    Get-ChildItem -Path "$Source" -Recurse |
        Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays($SetTime) -and $_.Name -like "$Ext"} |
        Move-Item -Destination "$Temp\Backup-$Date" |
        Compress-Archive -Path "$Temp\Backup-$Date" -DestinationPath "$Nas\Backup-$date.Zip"
}
$Date = Get-Date -UFormat %d-%m-%Y
$Source = 'C:\Source\'
$Temp = 'C:\Backup-Temp\'
$Nas = 'D:\Destination\'
$Ext = "*.zip","*.rar"
$SetTime = '-5'
$LogFileFullName = 'c:\tmp\log.txt'
function Write-Log([string]$msg){
    Out-File -FilePath $LogFileFullName -InputObject "$([DateTime]::Now): $msg" -Append
}
New-Item -Path $Temp -Name "Backup-$Date" -ItemType "directory"
Foreach ($Ext in $Ext) {
    Get-ChildItem -Path "$Source" -Recurse |
        Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays($SetTime) -and $_.Name -like "$Ext"} |
        ForEach-Object {
            Move-Item $_.FullName -Destination "$Temp\Backup-$Date" |
                Compress-Archive -Path "$Temp\Backup-$Date" -DestinationPath "$Nas\Backup-$date.Zip"
            Write-Log $_.FullName
        }
}
You could just add the following to your chain of piped commands:
Add-Content $logfile "$($_.Name)`n"
where $logfile is set to a static filename beforehand.
I may be old-fashioned, but having so many commands in one chain is prone to failure. It would be more resilient to break up the chain so that you can include some error checking along the way.
Another, less desirable, option would be to put your chain within a try/catch block.
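For example, the move step pulled out of the chain with its own try/catch might look like this (a sketch of the body of the extension loop, reusing $Source, $Temp, $Date and $SetTime from the question; $logfile is the static log path mentioned above):
foreach ($file in Get-ChildItem -Path $Source -Recurse |
         Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays($SetTime) -and $_.Name -like $Ext }) {
    try {
        Move-Item -Path $file.FullName -Destination "$Temp\Backup-$Date" -ErrorAction Stop
        Add-Content $logfile "$($file.FullName) moved"
    }
    catch {
        Add-Content $logfile "FAILED to move $($file.FullName): $($_.Exception.Message)"
    }
}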
Best of luck.
The logging only captures the IDs from my text file (Get-Content); it does not print the file name that gets copied.
I've tried using the log option with robocopy, however it only logs the last entry from my Get-Content text file.
$Source = "F:\Temp\"
$Test = "F:\Output\"
$Daily_Results="F:\Output\Test\"
foreach ($ID in Get-Content F:\Files\files.txt) {
$ID
Get-ChildItem -Path $Source | foreach {
if($_ -match $ID) {
$Path=$Source+"$_\"
$Path
robocopy $path $test
Write-Host $Path
"File copied"
Write-Output $ID "File copied" | Out-File $Daily_Results\$(get-date -f yyyy-MM-dd)_CopyMove_Results.txt -append
Write-Output $_ | Out-File $Daily_Results\$(get-date -f yyyy-MM-dd)_CopyMove_Results.txt -append
}
}
}
What happens is that $_ gives you the full object; you need to explicitly say you want the Name:
Write-Output $_.Name | Out-File $Daily_Results\$(get-date -f yyyy-MM-dd)_CopyMove_Results.txt -append
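As an aside: if you keep robocopy, the reason only the last item shows up in its log is most likely that /LOG:<file> overwrites the log on every invocation inside the loop, while /LOG+:<file> appends, e.g. (the log file name here is just an example):
robocopy $Path $Test /LOG+:"F:\Output\Test\robocopy.log"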
As you are copying the files one by one, I see no real advantage in using robocopy here; rather, use PowerShell's own Copy-Item cmdlet.
Because you didn't say what the $ID from the text file could be, from the code you gave I gather that it is some string that must be part of the name of the file to copy.
This should work for you then:
$Source = 'F:\Temp'
$Destination = 'F:\Output'
$Daily_Results = Join-Path -Path 'F:\Output\Test' -ChildPath ('{0:yyyy-MM-dd}_CopyMove_Results.txt' -f (Get-Date))
foreach ($ID in Get-Content F:\Files\files.txt) {
    Get-ChildItem -Path $Source -File -Recurse | Where-Object { $_.Name -like "*$ID*" } | ForEach-Object {
        $_ | Copy-Item -Destination $Destination -Force
        Write-Host "File '$($_.Name)' copied"
        # output to the log file
        "$ID File copied: '$($_.Name)'" | Out-File -FilePath $Daily_Results -Append
    }
}
I have the following PowerShell code in which a backup of the (original) files in folder1 is taken into folder2, and the files in folder1 are then updated with the folder3 files - the concept of a hotfix!
cls
[xml] $XML = Get-content -Path <path of xml>
$main = $XML.File[0].Path.key
$hotfix = $XML.File[1].Path.key
$backup = $XML.File[2].Path.key
Get-ChildItem -Path $main | Where-Object {
    Test-Path (Join-Path $hotfix $_.Name)
} | ForEach-Object {
    Copy-Item $_.FullName -Destination $backup -Recurse -Container
}
write-host "`nBack up done !"
Get-ChildItem -Path $hotfix | ForEach-Object {Copy-Item $_.FullName -Destination $main -force}
write-host "`nFiles replaced !"
Now, as the backup of the files is taken into folder2, I need to create a log file which contains: the name of each file whose backup is taken, the date and time, and the location where the backup is taken.
Can anyone please help me with this?
I tried the following code, but it's of no use, as I cannot sync the two.
cls
$path = "C:\Log\Nlog.log"
$numberLines = 25
For ($i=0; $i -le $numberLines; $i++)
{
    $SampleString = "Added sample {0} at {1}" -f $i, (Get-Date).ToString("h:m:s")
    Add-Content -Path $path -Value $SampleString -Force
}
Any help or a different approach is appreciated !!
You can use the -PassThru switch parameter to have Copy-Item return the new items it just copied - then do the logging immediately after that, inside the ForEach-Object scriptblock:
| ForEach-Object {
$BackupFiles = Copy-Item $_.FullName -Destination $backup -Recurse -Container -PassThru
$BackupFiles |ForEach-Object {
$LogMessage = "[{0:dd-MM-yyyy hh:mm:ss.fff}]File copied: {1}" -f $(Get-Date),$_.FullName
$LogMessage | Out-File ".\backups.log" -Append
}
}
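Plugged back into the original backup step, the whole thing would then read roughly like this (same $main, $hotfix and $backup variables as in the question; the log file name is just an example):
Get-ChildItem -Path $main | Where-Object {
    Test-Path (Join-Path $hotfix $_.Name)
} | ForEach-Object {
    $BackupFiles = Copy-Item $_.FullName -Destination $backup -Recurse -Container -PassThru
    $BackupFiles | ForEach-Object {
        # log file name, timestamp and backup location for each copied item
        $LogMessage = "[{0:dd-MM-yyyy hh:mm:ss.fff}] File copied: {1} (backup location: {2})" -f (Get-Date), $_.FullName, $backup
        $LogMessage | Out-File ".\backups.log" -Append
    }
}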