I would like the PowerShell script to create a new log file with the same name, and to copy the old log file to a new file with a different name, for example with a log.1 extension, whenever the log file exceeds a certain size.
I implemented basic Add-Content command with file directory as a variable:
$logmsg = $date + " " + $status.ToString() + " " + $item + " " + $perf + " " + $output
Add-Content D:\...\..._log.txt -Value $logmsg
I don't actually need the script to create more log files such as log.2, log.3 and so on. I just need to keep the old logs in this single log.1 file, and if the original log file exceeds the size limit again, the log.1 file can simply be overwritten.
I couldn't find a method for doing this in PowerShell scripting specifically.
If I understand your question correctly, you would like to keep one current log file and if it exceeds a certain size, the content should be stored away in another file and the current log file should then be emptied.
You can do this like so:
$logfile = 'D:\...\..._log.txt' # this is your current log file
$oldLogs = 'D:\...\old_logs.txt' # the 'overflow' file where old log content is written
$maxLogSize = 1MB # the maximum size in bytes you want
$logmsg = $date + " " + $status.ToString() + " " + $item + " " + $perf + " " + $output
# check if the log file exists
if (Test-Path -Path $logfile -PathType Leaf) {
    # check if the logfile is at its maximum size
    if ((Get-Item -Path $logfile).Length -ge $maxLogSize) {
        # append the contents of the log file to another file 'old_logs.txt'
        Add-Content -Path $oldLogs -Value (Get-Content -Path $logfile)
        # clear the content of the current log file
        # or delete it completely with Remove-Item -Path $logfile -Force
        Clear-Content -Path $logfile
    }
}
# keep adding info to the current log file
Add-Content -Path $logfile -Value $logmsg
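If you want exactly the log.1 behaviour you described (a single archive file that is overwritten each time the size limit is reached), a minimal variation of the same check could look like this; the $archive name is just an illustration:
$logfile = 'D:\...\..._log.txt'
$archive = $logfile + '.1'   # illustrative archive name, e.g. ..._log.txt.1
$maxLogSize = 1MB
if ((Test-Path -Path $logfile -PathType Leaf) -and ((Get-Item -Path $logfile).Length -ge $maxLogSize)) {
    Copy-Item -Path $logfile -Destination $archive -Force   # overwrite the previous archive
    Clear-Content -Path $logfile                             # start the current log fresh
}
Add-Content -Path $logfile -Value $logmsg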
Hope this helps
I am needing to write a script to automate moving all of the IIS log files from a system that has been running for months. This system consists of many web servers.
I have successfully moved the log directory to the new location, and my script then moves the other log files from the old directory to the new location, with the exception of the current log file. It cannot be moved because there is a file in the destination with the same name (the current log file). I want to rename it, but I do not want to rename all of the files in the directory. I know I could use a wildcard, but I would prefer to rename only that one file.
What command can I use to find the name of the file, not the directory or path? I have pieced this together from other smaller requests I have found on here and on the web.
Import-Module WebAdministration
$LogPath = "e:\Logs\IISlogs"
foreach ($WebSite in $(Get-Website))
{
    $logFile = "$($WebSite.logFile.directory)\w3svc$($WebSite.id)".Replace("%SystemDrive%", $env:SystemDrive)
}
New-Item $LogPath -ItemType Directory
Set-ItemProperty "IIS:\Sites\Default Web Site" -Name logFile.directory -Value $LogPath
$path = $LogPath + "\W3SVC1"
$Timeout = 60
$timer = [Diagnostics.Stopwatch]::StartNew()
while (($timer.Elapsed.TotalSeconds -lt $Timeout) -and (-not (Test-Path -Path $path)))
{
    Start-Sleep -Seconds 1
    $tot = $timer.Elapsed.Seconds
    Write-Output ("Still waiting for action to complete after " + $tot + " seconds up to " + $Timeout)
}
$timer.Stop()
Move-Item "$($logFile)\*.*" $path
All of the files have a date and time stamp. Filter on the file name, sort by the time stamp in descending order, and select the first one.
For example:
$dir = "e:\Logs\IISlogs"
($latest = Get-ChildItem -Path $dir |
    Sort-Object LastAccessTime -Descending |
    Select-Object -First 1)
"Current log file details: $($latest.name)"
Get-Content $latest
If you are saying the logs can be of a different name, then you need to specify the wildcard match for that in the path specification.
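If the folder also contains files you don't care about, a -Filter can narrow the search; the u_ex*.log pattern below is only an assumption based on the default IIS log naming, so adjust it to whatever your logs are actually called:
$dir = "e:\Logs\IISlogs\W3SVC1"
$latest = Get-ChildItem -Path $dir -Filter 'u_ex*.log' |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
"Current log file details: $($latest.Name)"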
I have an old piece of PowerShell script that uses Add-Content in a for loop to write data to a file and it works fine when run locally on a PC's C: drive.
I've now been asked to relocate the script and files to a QNAP folder share (not sure if this has anything to do with the problem).
Now, when the script runs from the share, the for loop runs indefinitely: you can see the file size increasing, and you can check the row count once you break out of the program.
It doesn't seem to matter whether I use a UNC path or a mapped drive; the infinite looping still occurs.
Here is the script block:
####################
# Define Variables #
####################
$Source = 'G:\Label_Spreadsheets\' + $textbox1.Text + '.csv'
$Target = 'G:\Label_Spreadsheets\labels.csv'
$EndNum = ($LastUsedNumber + $numericupdown1.Value)
######################
# Create Source File #
######################
#######################
# Add CSV Header rows #
#######################
Add-Content $Source "Stock Code,Filter#,ProductionOrder#,SalesOrder#,Compatability";
#####################
# Add specific Rows #
#####################
for ($i = $StartNum; $i -le $EndNum; $i++) {
    $line = $combobox1.SelectedItem + ',' + $i + ',' + $textbox1.Text + ',' + $textbox2.Text + ','
    Add-Content $Source $line
}
I wondered if it was a known provider problem (as described at this URL), but trying those suggestions did not resolve the issue.
And yes I know it's writing a CSV - like I said: old script.
There are multiple text files in a folder, and each text file contains multiple affected IDs. I need to replace them with new IDs. I also want to generate a text log file listing filename,oldID,newID for each replacement; this log file is required for cross-checking and validation.
I have a CSV file from which I am creating an array, IdList. The part of the code that replaces the strings is listed below.
foreach ($f in gci -r -include "*.txt")
{
    Set-Variable -Name filename -value ($f.name)
    for ($i = 0; $i -le $elementCount; $i++)
    {
        Write-Host $i
        $oldString = $IdList[$i].OLD_ID + ',' + $IdList[$i].OLD_ID_TYPE
        $newString = $IdList[$i].NEW_ID + ',' + $IdList[$i].NEW_ID_TYPE
        gc $f.fullname|ForEach-Object {if($_.contains($oldString)){ "$filename,$oldString,$newString" >> $logFile; $_ -replace $oldString,$newString}}| sc $f.fullname
    }
}
I am getting this error:
Set-Content : The process cannot access the file 'U:\testScript\rapg6785.txt' because it is being used by another process.
At line:22 char:152
+ gc $f.fullname|ForEach-Object {if($_.contains($oldString)){ "$filename,$oldString,$newString" >> $logFile; $_ -replace $oldString,$newString}}| sc <<<< $f.fullname
+ CategoryInfo : NotSpecified: (:) [Set-Content], IOException
+ FullyQualifiedErrorId : System.IO.IOException,Microsoft.PowerShell.Commands.SetContentCommand
Try putting gc $f.fullname in parentheses: (gc $f.fullname).
That way the pipeline only starts after gc has finished reading the file and has released the file handle, so the file is free to be written to by Set-Content.
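Applied to the line from your script, that would be (the only change is the added parentheses):
(gc $f.fullname)|ForEach-Object {if($_.contains($oldString)){ "$filename,$oldString,$newString" >> $logFile; $_ -replace $oldString,$newString}}| sc $f.fullname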
Okay - I am brand new to PowerShell. I only started using it two weeks ago. I've scoured the web to create some scripts and now I'm trying something that seems a bit advanced and I'm uncertain how I should solve this.
I'm creating an audit script to determine what files are different between two backup repositories to ensure they've been properly synchronized (the synchronization scripts use robocopy and they've failed more than once without producing an error). The folders are quite extensive and upon occasion, I'm finding that the script just hangs on certain folders (always on the largest of them) and it will never complete due to this.
At first, I was using Get-ChildItem on the full source path, but that created a memory problem and the script would never complete. So, I thought I'd enumerate the child directories and perform a compare on each child directory... but depending on the folder, that goes bad as well.
Here is the script (using PowerShell 2):
$serverArray = @("Server1","Server2","Server3")
for ($i=0; $i -lt 8; $i++) {
    $server = $serverArray[$i]
    $source = "\\$server\Share\"
    $destination = "D:\BackupRepository\$server"
    # Copy to removable drive
    $remoteDestination = "T:\BackupRepository\" + $server
    $log = $server + "ShareBackup.log"
    $remoteLog = "Remote_" + $server + "ShareBackup.log"
    $logDestination = $localLogPath + $log
    $logUNCDestination = $uncLogPath + $log
    $logRemoteDestination = $localLogPath + $remoteLog
    $logUNCRemoteDestination = $uncLogPath + $remoteLog
    ## This file is used for the process of checking
    ## whether or not the backup was successful
    $backupReport = $localReportPath + $server + "ShareBackupReport.txt"
    $remoteBackupReport = $localReportPath + "Remote_" + $server + "ShareBackupReport.txt"
    ## Variables for the failure emails
    $failEmailSubject = "AUDIT REPORT for " + $server
    $failRemoteEmailSubject = "AUDIT REPORT for " + $server
    $failEmailBody = "The Audit for " + $server + " has found a file mismatch. Please consult the attached Backup Report."
    $failRemoteEmailBody = "The Audit of the Remote Backups for " + $server + " has found a file mismatch. Please consult the attached Backup Report."
    $sourceFolderArray = Get-ChildItem $source | ?{ $_.PSIsContainer }
    $sourceFolderCount = $sourceFolderArray.Count
    $mismatchCount = 0
    $remoteMismatchCount = 0
    for ($s1=0; $s1 -lt $sourceFolderCount; $s1++) {
        $sourceFolder = $sourceFolderArray[$s1].FullName
        $sourceFolderName = $sourceFolderArray[$s1].Name
        $destFolder = $destination + "\" + $sourceFolderName
        $remoteDestFolder = $remoteDestination + "\" + $sourceFolderName
        Write-Host "Currently working on: " $sourceFolderName
        $shot1 = Get-ChildItem -Recurse -Path $sourceFolder
        $shot2 = Get-ChildItem -Recurse -Path $destFolder
        $shot3 = Get-ChildItem -Recurse -Path $remoteDestFolder
        $auditReportDest = "C:\BackupReports\Audits\"
        $auditReportOutput = $auditReportDest + $server + "_" + $sourceFolderName + ".txt"
        $auditReportRemoteOutput = $auditReportDest + $server + "_Remote_" + $sourceFolderName + ".txt"
        $auditMismatchReport = $auditReportDest + "MismatchReport_" + $numericDate + ".txt"
        Compare-Object $shot1 $shot2 -PassThru > $auditReportOutput
        Compare-Object $shot2 $shot3 -PassThru > $auditReportRemoteOutput
        $auditCompare = Get-ChildItem $auditReportOutput
        $auditRemoteCompare = Get-ChildItem $auditReportRemoteOutput
        if ($auditCompare.Length -gt 0) {
            $content = Get-ChildItem -Recurse $auditReportOutput
            Add-Content $auditMismatchReport $content
            Write-Host "Mismatch FOUND: " $sourceFolderName
            $mismatchCount = $mismatchCount + 1
        }
        if ($auditRemoteCompare.Length -gt 0) {
            $remoteContent = Get-ChildItem -Recurse $auditReportRemoteOutput
            Add-Content $auditMismatchReport $remoteContent
            Write-Host "Remote Mismatch FOUND: " $sourceFolderName
            $remoteMismatchCount = $remoteMismatchCount + 1
        }
    }
    send-mailmessage -from $emailFrom -to $emailTo -subject "AUDIT REPORT: Backups" -body "The full mismatch report is attached. There were $mismatchCount mismatched folders found and $remoteMismatchCount remote mismatched folders found. Please review to ensure backups are current." -Attachments "$auditMismatchReport" -priority High -dno onSuccess, onFailure -smtpServer $emailServer
}
What I've discovered when run interactively is that I'll get a "Currently working on FolderName" and if that object is "too large" (whatever that is), the script will just sit there at that point giving no indication of any error, but it will not continue (I've waited hours). Sometimes I can hit Ctrl-C interactively and rather than quitting the script, it takes the interrupt as a cancel for the current process and moves to the next item.
The rub is, I need to schedule this to happen daily to ensure the backups remain synchronized. Any help or insight is appreciated. And, yes, this is probably raw and inelegant, but right now I'm just trying to solve how I can get around the script hanging on me.
Not sure what version of PS you're using, but Get-ChildItem has known problems scaling to large directories:
http://blogs.msdn.com/b/powershell/archive/2009/11/04/why-is-get-childitem-so-slow.aspx
If you're just comparing file names, you can get much better results in large directory structures by using the legacy dir command. The /b (bare) switch returns just the full path strings, which can readily be used with PowerShell's comparison operators.
$sourcedir = 'c:\testfiles'
$source_regex = [regex]::escape($sourcedir)
(cmd /c dir $Sourcedir /b /s) -replace "$source_regex\\(.+)$",'$1'
This uses a regular expression and the -replace operator to trim the source directory off the full names returned by dir. The -replace operator works on arrays, so you can process all of them in one operation without a foreach loop.
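To turn that into a comparison of two trees, one sketch (the destination path here is just an example, substitute your own) is to build the relative-path list for both sides the same way and hand the two arrays to Compare-Object:
$sourcedir = 'c:\testfiles'
$destdir = 'd:\backup\testfiles'   # example destination path
$source_regex = [regex]::escape($sourcedir)
$dest_regex = [regex]::escape($destdir)
# relative paths on each side
$sourceList = (cmd /c dir $sourcedir /b /s) -replace "$source_regex\\(.+)$",'$1'
$destList = (cmd /c dir $destdir /b /s) -replace "$dest_regex\\(.+)$",'$1'
# anything that appears on only one side is a mismatch
Compare-Object -ReferenceObject $sourceList -DifferenceObject $destList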
I want to avoid moving files that are currently open by another process. Is there any way the move-item PowerShell command can move, or even worse copy, a currently open file?
We currently have a situation where we have two processes that need data files transferred from process A's output folder to process B's input folder. The idea is that process A writes a file, and then a PowerShell script moves the files to the folder that process B reads.
We sometimes have an issue where the same file is transferred twice, and it is not a partial file either time.
The below code is executed at 00, 10, 20, 30, 40, 50 minutes past the hour. Process B on the Samba server runs at 05, 15, 25, 35, 45, 55 minutes past the hour and moves the files out of the folder the PowerShell script puts them in, once process B has finished processing the files. There are only ever up to about a dozen 1 KB files being moved at a time.
Process A is not controlled by us and can write files to that location at any time. It seems there is some race condition happening with Process A creating a file just before the PowerShell script moves the file, where the script copies the file and then moves it 10 minutes later when the script runs again.
With the code below, if two entries with "Moved File" are logged for the same file, is the only possible explanation that Process A created the file twice?
$source = "C:\folder\*.txt"
$target_dir = "\\samba-server\share\"
$bad_dir = "C:\folder\bad_files\"
$log = "C:\SystemFiles\Logs\transfer.log"
$files = Get-ChildItem $source
foreach ($file in $files){
    if ($file.name -eq $null) {
        # Nothing to do. Added this in since for some reason it executes the conditions below
    }
    elseif (test-path ($target_dir + $file.name)) {
        # If there is a duplicate file, write to the log file, then copy it to the bad dir with
        # the datetime stamp in front of the file name
        $log_string = ((Get-Date -format G) + ",Duplicate File," + "'" + $file.name + "', " + $file.LastWriteTime)
        write-output ($log_string) >> $log
        $new_file = ($bad_dir + (get-date -format yyyy.MM.dd.HHmmss) + "_" + $file.name)
        move-item $file.fullname $new_file
    }
    else {
        # The file doesn't exist on the remote source, so we are good to move it.
        move-item $file.fullname $target_dir
        if ($?) { # If the last command completed successfully
            $log_string = ((Get-Date -format G) + ",Moved File," + "'" + $file.name + "', " + $file.LastWriteTime)
        } else {
            $log_string = ((Get-Date -format G) + ",Failed to Move File," + "'" + $file.name + "', " + $file.LastWriteTime)
        }
        write-output ($log_string) >> $log
    }
}
This is the classic producer-consumer problem, which is a well-researched topic.
One solution you might try is checking the file's last write time: if it is far enough in the past, the file can be moved without issues. Another is trying to open the file with exclusive access: if that fails, the file is still being used by the producer process; otherwise, close the file and move it.
Some examples:
# List files that were modified at least five minutes ago
gci | ? { $_.lastwritetime -le (get-date).addminutes(-5) }
# Try to open a file with exclusive mode
try {
    $f1 = [IO.File]::Open("c:\temp\foo.txt", [IO.FileMode]::Open, [IO.FileAccess]::Read, [IO.FileShare]::None)
    # If that didn't fail, close and move the file to new location
    $f1.Close()
    $f1.Dispose()
    Move-Item "c:\temp\foo.txt" $newLocation
} catch [System.IO.IOException] {
    "File is already open" # Catch the file is locked exception, try again later
}