The logging only captures the IDs from my text file (Get-Content); it does not print the name of the file that gets copied.
I've tried using the log option with robocopy, but it only logs the last entry from my Get-Content text file.
$Source = "F:\Temp\"
$Test = "F:\Output\"
$Daily_Results="F:\Output\Test\"
foreach ($ID in Get-Content F:\Files\files.txt) {
$ID
Get-ChildItem -Path $Source | foreach {
if($_ -match $ID) {
$Path=$Source+"$_\"
$Path
robocopy $path $test
Write-Host $Path
"File copied"
Write-Output $ID "File copied" | Out-File $Daily_Results\$(get-date -f yyyy-MM-dd)_CopyMove_Results.txt -append
Write-Output $_ | Out-File $Daily_Results\$(get-date -f yyyy-MM-dd)_CopyMove_Results.txt -append
}
}
}
What happens is that $_ gives you the full object; you need to explicitly say you want the Name.
Write-Output $_.Name | Out-File $Daily_Results\$(get-date -f yyyy-MM-dd)_CopyMove_Results.txt -append
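As for the robocopy log only keeping the last entry: robocopy's /LOG:file switch overwrites the log file on every call, while /LOG+:file appends to it. If you do want robocopy to keep its own log, a sketch using your existing variables (the log file name here is just an example) could look like this:
# /LOG+ appends to the existing log instead of overwriting it on each robocopy call
robocopy $Path $Test /LOG+:"$Daily_Results$(Get-Date -Format yyyy-MM-dd)_Robocopy.log"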
As you are copying the files one by one, I see no real advantage in using robocopy here; rather, use PowerShell's own Copy-Item cmdlet.
Because you didn't say what the $ID values from the text file could be, I gather from the code you gave that each one is some string that must be part of the name of the file to copy.
This should work for you then:
$Source = 'F:\Temp'
$Destination = 'F:\Output'
$Daily_Results = Join-Path -Path 'F:\Output\Test' -ChildPath ('{0:yyyy-MM-dd}_CopyMove_Results.txt' -f (Get-Date))

foreach ($ID in Get-Content F:\Files\files.txt) {
    Get-ChildItem -Path $Source -File -Recurse | Where-Object { $_.Name -like "*$ID*" } | ForEach-Object {
        $_ | Copy-Item -Destination $Destination -Force
        Write-Host "File '$($_.Name)' copied"
        # output to the log file
        "$ID File copied: '$($_.Name)'" | Out-File -FilePath $Daily_Results -Append
    }
}
I have the following code:
$readPath = "C:\FirstFolder"
$writePath = "C:\SecondFolder"

Function Recurse($folder, $lvl) {
    Get-ChildItem -Path $folder -File | ForEach-Object {
        Write-Host "$(" "*$lvl)> $($_.Name) - $((Get-Acl -Path $_.FullName).Owner)"
    }
    Get-ChildItem -Path $folder -Directory | ForEach-Object {
        Write-Host "$(" "*$lvl)> $($_.Name)"
        Recurse -folder $_.FullName -lvl $($lvl+1)
    }
}

$root = Get-Item -Path $readPath
Recurse -folder $root.FullName -lvl 0
which gives an output like this:
> File0.xlsx - OwnerB
> Directory1
 > File1.1.txt - OwnerB
 > File1.2.ppt - OwnerA
 > Directory2
  > File2.1 - OwnerA
  > File2.2 - OwnerA
When I add the code $log | Out-File $writePath\OwnerTree.txt -Encoding UTF8, my output file is blank.
Anyone know how to get an output file with the same layout as what appears in PowerShell?
Just a few things:
Make your function output the strings instead of using Write-Host, whose sole purpose is to write to the console screen for display
Capture the result of your function in a variable and save that to a file
If you want to write both to a file AND to the console, use Set-Content instead of Out-File, because it also has a -PassThru switch
function Get-OwnerTree ([string]$folder, [int]$lvl) {
    Get-ChildItem -Path $folder -File | ForEach-Object {
        "$(" "*$lvl)> $($_.Name) - $((Get-Acl -Path $_.FullName).Owner)"
    }
    Get-ChildItem -Path $folder -Directory | ForEach-Object {
        "$(" "*$lvl)> $($_.Name)"
        # use ($lvl + 1) rather than (++$lvl) so sibling directories all recurse at the same depth
        Get-OwnerTree -folder $_.FullName -lvl ($lvl + 1)
    }
}

$root = Get-Item -Path $readPath
$log = Get-OwnerTree -folder $root.FullName -lvl 0
$log | Set-Content -Path "$writePath\OwnerTree.txt" -Encoding UTF8 -PassThru
I have also changed the function name to comply with PowerShell's Verb-Noun naming convention
I am trying to construct a script that moves through specific folders and the log files in them, filters out the error codes, and then writes them to a new file.
I'm not really sure how to do that with for loops, so I'll leave my code below.
If someone could tell me what I'm doing wrong, that would be greatly appreciated.
$file_name = Read-Host -Prompt 'Name of the new file: '
$path = 'C:\Users\user\Power\log_script\logs'

Add-Type -AssemblyName System.IO.Compression.FileSystem

function Unzip
{
    param([string]$zipfile, [string]$outpath)
    [System.IO.Compression.ZipFile]::ExtractToDirectory($zipfile, $outpath)
}

if ([System.IO.File]::Exists($path)) {
    Remove-Item $path
    Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
} else {
    Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
}

$folder = Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles'

$files = foreach($logfolder in $folder) {
    $content = foreach($line in $files) {
        if ($line -match '([ ][4-5][0-5][0-9][ ])') {
            echo $line
        }
    }
}

$content | Out-File $file_name -Force -Encoding ascii
Inside the LogFiles folder are three more folders each containing log files.
Thanks
Expanding on a comment above about recursing the folder structure, and then actually retrieving the content of the files, you could try something like this:
# recurse the folder structure, returning files only
$allFiles = Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' -File -Recurse

# iterate the files
$allFiles | ForEach-Object {
    # iterate the content of each file, line by line
    Get-Content $_.FullName | ForEach-Object {
        if ($_ -match '([ ][4-5][0-5][0-9][ ])') {
            echo $_
        }
    }
}
It looks like your inner loop iterates over a collection ($files) that doesn't exist yet. You assign $files to the output of a foreach(...) loop, then try to nest another loop over $files inside it. Of course, at this point $files isn't available to be looped over.
Regardless, the issue is you are never reading the content of your log files. Even if you managed to loop through the output of Get-ChildItem, you need to look at each line to perform the match.
Obviously I cannot completely test this, but I see a few issues and have rewritten as below:
$file_name = Read-Host -Prompt 'Name of the new file'
$path = 'C:\Users\user\Power\log_script\logs'
$Pattern = '([ ][4-5][0-5][0-9][ ])'
if ( [System.IO.File]::Exists( $path ) ) { Remove-Item $path }
Expand-Archive 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
Select-String -Path 'C:\Users\user\Power\log_script\logs\LogFiles\*' -Pattern $Pattern |
Select-Object -ExpandProperty line |
Out-File $file_name -Force -Encoding ascii
Note: Select-String cannot recurse on its own.
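If your log files sit in subfolders, one way around that limitation (a sketch reusing $Pattern and $file_name from above) is to let Get-ChildItem do the recursion and pipe the files into Select-String:
# Get-ChildItem handles the recursion; Select-String searches the content of each file
Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' -File -Recurse |
    Select-String -Pattern $Pattern |
    Select-Object -ExpandProperty Line |
    Out-File $file_name -Force -Encoding ascii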
I'm not sure you need to write your own UnZip function. PowerShell has the Expand-Archive cmdlet which can at least match the functionality thus far:
Expand-Archive -Path <SourceZipPath> -DestinationPath <DestinationFolder>
Note: The -Force parameter allows it to overwrite the destination files if they are already present, which may be a substitute for testing whether the file exists and deleting it if it does.
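For example (a sketch reusing the paths from the question):
# -Force overwrites any files that already exist in the destination
Expand-Archive -Path 'C:\Users\user\Power\log_script\logs.zip' -DestinationPath 'C:\Users\user\Power\log_script' -Force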
If you are going to test for the file, that section of code can be simplified as:
if ( [System.IO.File]::Exists( $path ) ) { Remove-Item $path }
Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
This is because you were going to run the UnZip command regardless...
Note: You could also use Test-Path for this.
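For instance, something along these lines (a minimal sketch using the $path variable from your script):
# Test-Path is the PowerShell-native way to check whether a file or folder exists
if (Test-Path -LiteralPath $path) { Remove-Item $path }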
Also, there are innumerable ways to get the matching lines; here are a couple of extra samples:
Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' |
    ForEach-Object{
        ( Get-Content $_.FullName ) -match $Pattern
        # Using -match in this way will echo the lines that matched from each run of
        # Get-Content. If nothing matched, nothing will output on that iteration.
    } |
    Out-File $file_name -Force -Encoding ascii
This approach will read the entire file into an array before running the match on it. For large files that may pose a memory issue; however, it enables the clever use of -match.
OR:
Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' |
    Get-Content |
    ForEach-Object{ If( $_ -match $Pattern ) { $_ } } |
    Out-File $file_name -Force -Encoding ascii
Note: You don't need the alias echo or its real cmdlet Write-Output
UPDATE: After futzing around a bit and trying different things, I finally got it to work.
I'll include the code below just for demonstration purposes.
Thanks everyone
$start = Get-Date
"`n$start`n"

$file_name = Read-Host -Prompt 'Name of the new file: '
Out-File $file_name -Force -Encoding ascii

Expand-Archive -Path 'C:\Users\User\Power\log_script\logs.zip' -Force

$i = 1
$folders = Get-ChildItem -Path 'C:\Users\User\Power\log_script\logs\logs\LogFiles' -Name -Recurse -Include *.log

foreach($item in $folders) {
    $files = 'C:\Users\User\Power\log_script\logs\logs\LogFiles\' + $item
    foreach($file in $files){
        $content = Get-Content $file
        Write-Progress -Activity "Filtering..." -Status "File $i of $($folders.Count)" -PercentComplete (($i / $folders.Count) * 100)
        $i++
        $output = foreach($line in $content) {
            if ($line -match '([ ][4-5][0-5][0-9][ ])') {
                Add-Content -Path $file_name -Value $line
            }
        }
    }
}

$end = Get-Date
$time = [int]($end - $start).TotalSeconds
Write-Output ("Runtime: " + $time + " Seconds" -join ' ')
I am running a PowerShell script in which a list of IDs in a text file is matched against files on a server, and Robocopy then copies those files to other servers. Recently, a file that is present on the server doesn't get copied to 1 out of 2 destinations. The weird thing is that it only happens to approximately 10 files out of 500.
I am seeing weird errors in Event Viewer. Would this cause the issue?
Event ID 400: Engine state is changed from None to Available.
Details: NewEngineState=Available PreviousEngineState=None
Event ID 403: Engine state is changed from Available to Stopped.
Details: NewEngineState=Stopped PreviousEngineState=Available
Here is the script after changing Copy-Item to Robocopy:
$CC = "S:\CC 2019"
$Daily_Results = "S:\~Copy_Move Daily Results\"
$Missing_Files = "S:\~Missing Files\Results\"
$Lab = "\\********\E-Drive\Logs\Files"
$Source = "S:\"

foreach ($ID in Get-Content 'S:\~Master CSC Location Files\CC_Listings.txt') {
    $ID
    $Count = 0
    Get-ChildItem -Path $Source -ErrorAction SilentlyContinue | where {$_.name -match $ID} | foreach {
        $Path = "S:\" + $_.name
        Robocopy $Source $Lab $_.name /COPY:DAT /IS
        Robocopy $Source $CC $_.name /MOV /IS
        Write-Host $_.name
        "File copied"
        Write-Output $ID "File copied" | Out-File $Daily_Results\$(get-date -f yyyy-MM-dd)_CC_CopyMove_Results.txt -append
        Write-Output $_.name | Out-File $Daily_Results\$(get-date -f yyyy-MM-dd)_CC_CopyMove_Results.txt -append
        $Count++
    }
    If ($Count -eq 0) {
        Write-Host $ID "No File Found"
        Write-Output $ID "No File Found at $(get-date -f hh:mm:ss)" | Out-File $Missing_Files\$(get-date -f yyyy-MM-dd)_CC_Missing_Files_Results.txt -append
    }
}
I have the following PowerShell code, in which a backup of the (original) files in folder1 is taken into folder2, and the files in folder1 are then updated with the files from folder3.
The concept of a hotfix!
cls

[xml] $XML = Get-Content -Path <path of xml>

$main = $XML.File[0].Path.key
$hotfix = $XML.File[1].Path.key
$backup = $XML.File[2].Path.key

Get-ChildItem -Path $main | Where-Object {
    Test-Path (Join-Path $hotfix $_.Name)
} | ForEach-Object {
    Copy-Item $_.FullName -Destination $backup -Recurse -Container
}
Write-Host "`nBack up done !"

Get-ChildItem -Path $hotfix | ForEach-Object { Copy-Item $_.FullName -Destination $main -Force }
Write-Host "`nFiles replaced !"
Now, as the backup of the files is taken in folder2, I need to create a log file which contains the name of each file whose backup is taken, the date and time, and the location where the backup is taken.
Can anyone please help me with this?
I wrote the following code, but it's of no use, as I cannot sync the two.
cls
$path = "C:\Log\Nlog.log"
$numberLines = 25

For ($i = 0; $i -le $numberLines; $i++)
{
    $SampleString = "Added sample {0} at {1}" -f $i, (Get-Date).ToString("h:m:s")
    Add-Content -Path $path -Value $SampleString -Force
}
Any help or a different approach is appreciated !!
You can use the -PassThru switch parameter to have Copy-Item return the new items it just copied - then do the logging immediately after that, inside the ForEach-Object scriptblock:
| ForEach-Object {
    $BackupFiles = Copy-Item $_.FullName -Destination $backup -Recurse -Container -PassThru
    $BackupFiles | ForEach-Object {
        $LogMessage = "[{0:dd-MM-yyyy hh:mm:ss.fff}]File copied: {1}" -f $(Get-Date), $_.FullName
        $LogMessage | Out-File ".\backups.log" -Append
    }
}
I really need help creating a script that backs up files and reports the error along with the file that did not copy.
Here is what I tried:
Creating lists of file paths to pass on to Copy-Item, in the hope of later catching errors per file and logging them:
By using $list2X I would be able to cycle through each file, but Copy-Item loses the directory structure and shoots it all out to a single folder.
So for now I am using $list2, and later I do Copy-Item -Recurse to copy the folders:
#create list to copy
$list = Get-ChildItem -path $source | Select-Object Fullname
$list2 = $list -replace ("}"),("")
$list2 = $list2 -replace ("#{Fullname=") , ("")
out-file -FilePath g:\backuplog\DirList.txt -InputObject $list2
#create list crosscheck later
$listX = Get-ChildItem -path $source -recurse | Select-Object Fullname
$list2X = $listX -replace ("}"),("")
$list2X = $list2X -replace ("#{Fullname=") , ("")
out-file -FilePath g:\backuplog\FileDirList.txt -InputObject $list2X
And here I would pass the list:
$error.clear()
Foreach($item in $list2){
    Copy-Item -Path $item -Destination $destination -recurse -force -erroraction Continue
}
out-file -FilePath g:\backuplog\errorsBackup.txt -InputObject $error
Any help with this is greatly appreciated!!!
The answer to complex file-copying or backup scripts is almost always: "Use robocopy."
Bill
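To illustrate that suggestion, here is a minimal robocopy backup sketch (the source, destination, and log paths are placeholders, not taken from the question):
# /E          copy subfolders, including empty ones
# /COPY:DAT   copy data, attributes, and timestamps
# /R:2 /W:5   retry a failed copy twice, waiting 5 seconds between tries
# /LOG+:file  append to the log instead of overwriting it
# /TEE        write output to the console as well as to the log file
robocopy "D:\Data" "E:\Backup\Data" /E /COPY:DAT /R:2 /W:5 /LOG+:"E:\Backup\backup.log" /TEE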
"Want to copy all the items in C:\Scripts (including subfolders) to C:\Test? Then simply use a wildcard character..."
Next make it easier on yourself and do something like this:
$files = (Get-ChildItem $path).FullName #Requires PS 3.0
#or
$files = Get-ChildItem $path | % {$_.Fullname}
$files | Out-File $outpath
Well, it took me a long time, considering my response time. Here is my copy function, which logs most errors (network drops, failed copies, etc.) along with the TargetObject of each error.
Function backUP {
    Param ([string]$destination1, $list1)
    $destination2 = $destination1

    # extract newly made string for the backup log
    $index = $destination2.LastIndexOf("\")
    $count = $destination2.length - $index
    $source1 = $destination2.Substring($index, $count)
    $finalstr2 = $logdrive + $source1

    Foreach ($item in $list1) {
        Copy-Item -Container: $true -Recurse -Force -Path $item -Destination $destination1 -ErrorAction Continue
        if (-not $?)
        {
            write-output "ERROR de copiado : " $error | format-list | out-file -Append "$finalstr2\GCI-ERRORS-backup.txt"
            Foreach ($erritem in $error) {
                write-output "Error Data:" $erritem.TargetObject | out-file -Append "$finalstr2\GCI-ERRORS-backup.txt"
            }
            $error.Clear()
        }
    }
}