PowerShell Send-MailMessage attachment different from source file after calling function

#robocopy backup scripts on each server copy their log files to one of two locations: \\backup_server\backup\logs\success if the backup is
#successful, or \\backup_server\backup\logs\fail if the backup fails. This script looks at those two locations to check for failures or
#missing log files, then reports them to the helpdesk.
Function test-servers {
    param ($serverlist)
    foreach ($server in $serverlist) {
        if (test-path \\backup_server\backup\logs\success\$server) {
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $server -force
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $success -force
        }
        elseif (test-path \\backup_server\backup\logs\fail\$server) {
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $server -force
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $failure -force
        }
        else {
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $server -force
            add-content -path \\backup_server\backup\logs\errorlog.txt -value $missing -force
        }
    }
}
$date = get-date
$loglocation = "The log file archive can be found in \\backup_server\backup\logs\archive\"
$filename = (get-date).tostring("MM-dd-yyyy")
$success = "Error log found, backup was successful."
$failure = "Error log found, backup was completed with errors."
$missing = "Error log not found, check server."
[System.Collections.ArrayList]$Serverlist = Get-Content \\backup_server\backup\logs\serverloglist.txt
remove-item \\backup_server\backup\logs\errorlog.txt
new-item -path \\backup_server\backup\logs\ -name "errorlog.txt" -itemtype "file"
add-content -path \\backup_server\backup\logs\errorlog.txt -value $date -force
test-servers -serverlist $serverlist
add-content -path \\backup_server\backup\logs\errorlog.txt -value $loglocation -force
#email out log file and clean up file system
send-mailmessage -from "no-reply@example.com" -to "helpdesk@example.com" -smtpserver "mail.example.com" -port 25 -subject "Nightly Robocopy Log Files" -Attachments "\\backup_server\backup\logs\errorlog.txt" -body "Nightly server backup completed, see attachment for details"
exit
Background: I work for a small company IT department and we're managing around two dozen generic file servers that perform simple robocopy batch backups every night. No problems there. In an effort to limit the number of "Server_1 has completed its nightly backup" tickets to our helpdesk, I've written a quick-and-dirty script that we use to confirm the completion of our nightly backups. All of the log files from our nightly backups are copied to a "success" or "fail" folder, and this script checks each location, notes which files are in which folder, and emails a single ticket. Then the backup logs that failed are copied to an archive folder for the current date, and a copy of the primary log file is copied along with them. This system works fantastically; the attached log file has all of the proper information about which server failed or succeeded.
So here is my dilemma. The log file that is attached to the Send-MailMessage email isn't the same log file as the source file! The log file left over in the \\backup_server\backup\logs\ directory has the current date and time on line 1 (although a few seconds later than the log file sent via email), as does the copy sent by Send-MailMessage, but it says that every server's nightly log is missing (see image). So somewhere along the line, two log files are being created with the exact same file name and file path: the correct log first, and the incorrect log seconds later. The only way I can see this happening is if there's a problem with my function. But it's only called once, so I don't see how it could write two different files. The script is called by the Task Scheduler on the server and runs only once each day according to the task history, reinforced by the fact that we only receive one email from the script each day.
For the sake of brevity, I've removed the end of the script where the old logs are copied and cleaned up, which consists of a few lines of Copy-Item and Remove-Item. I've gone over the man pages for each cmdlet in the script and delved into nearly every post on SE involving Send-MailMessage and the use of functions.
How can a script that only calls the new-item cmdlet once to create a single file instead create two separate instances of the file? Is it an issue with my function, or does send-mailmessage have unknown-to-me effects on its attachments?
As this is my first post on the SE network, I understand it may not conform to all of the rules. I’ve done my best to go over the FAQs and keep the post on topic. If the question can be improved in any way, your advice is welcome.
Log File Example: on the left is the file we receive as the email attachment; on the right is the file left at \\backup_server\backup\logs\errorlog.txt
Edit: I updated the script, removing the function at the beginning and simply placing the foreach loop where the function was being called to confirm it was not an issue with the function call within the script.

I adjusted the script a bit, and the task ran in the morning as scheduled; the issue persists. But after running the script manually for a few tests, both the log in the archive and the emailed log were the same! So it appears the error was not with the script itself, but with the way I am calling it in the Task Scheduler. As scheduling a task to run a PowerShell script is a completely different question, I'll consider this one resolved. I'll leave the question up for a week and then delete it, in case any of the three people who have seen it are interested in the solution.
Edit: So it appears it was entirely user error! And the worst kind of user error. I had the task set up on two different servers, one running a slightly older version of the script, both running on the same schedule. So the tasks ran, one overwrote the logs from the other, and I was left with the aforementioned headache. Deleted the second task and now everything works properly! Hooray for stupid mistakes.
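For anyone hitting the same symptom, a quick way to catch a duplicate job like this is to list the matching scheduled tasks on every server that might be running the script. This is only a minimal sketch, assuming the servers allow PowerShell remoting; the host names and the "backup" name filter below are placeholders, not values from the post.

$hosts = 'backup_server', 'file_server_1'    # hypothetical host names
Invoke-Command -ComputerName $hosts -ScriptBlock {
    # list any task whose name mentions "backup" so duplicates stand out
    Get-ScheduledTask |
        Where-Object { $_.TaskName -like '*backup*' } |
        Select-Object @{n='Server';e={$env:COMPUTERNAME}}, TaskName, State
}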

Related

Powershell: copy file without locking

I created a simple Nagios plugin, check_log.ps1, to check a log file on a Windows machine. It works by making a copy of the log and, on the next run, looking for a specified string in the difference between the copy and the original log file.
The problem is that at random moments check_log.ps1 locks the log file, which stops the application that writes the log.
In general, the plugin uses the original log file in two places:
# compare content of $Logfile and $Oldlog, save diff to $tempdiff
Compare-Object -ReferenceObject (Get-Content -Path $Logfile) -DifferenceObject (Get-Content -Path $Oldlog) | Select-Object -Property InputObject > $tempdiff
# overwrite file $Oldlog using content of $Logfile
Copy-Item $Logfile $Oldlog
I ran a test: in one PS session I run while($true) { [string]"test" >> C:\test\test.log }, and in a second session I run the plugin: C:\test\check_log.ps1 C:\test\test.log C:\test\Old_log.log test
I'm not fully sure my test is correct, but I think the Copy-Item command causes the problem. When I comment out this line in the script, I don't see any errors in either terminal. I tested some custom file-copy functions that I found on the internet, but I didn't find a solution to my problem.
Do you have any idea how to make it work reliably?
If you think Copy-Item is locking the file, try reading the content and then saving it to another location. Something like this:
Get-Content $Logfile | Set-Content $Oldlog
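If Get-Content still collides with the writer, another option (a sketch of my own, not from the original answer) is to open the log through a .NET stream that explicitly allows the writing process to keep read/write access; $Logfile and $Oldlog are the same variables used above.

$stream = [System.IO.File]::Open($Logfile, 'Open', 'Read', 'ReadWrite')
try {
    $reader = New-Object System.IO.StreamReader($stream)
    # copy the current contents without taking an exclusive lock
    $reader.ReadToEnd() | Set-Content -Path $Oldlog
}
finally {
    if ($reader) { $reader.Dispose() }
    $stream.Dispose()
}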

Notepad opens stale file after Set-Content

I have the following code in a script:
Set-Content -Path $myFile -Value $myContent
Later in the script, I wish to show the end result by opening the file in Notepad:
notepad $myFile
I find that the file Notepad opens has the original content, not the content I set in the script. The actual file has the correct content, so it's just a matter of Notepad getting stale file content.
I've only noticed this behavior when I started working with a large file, so I think I need to flush or wait after setting the content.
One workaround I found is to use the -PassThru parameter:
$dummy = Set-Content -Path $myFile -Value $myContent -PassThru
However, I find that this takes quite a long time. Is this the trade-off I have to deal with?
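One alternative worth trying (my own sketch, not a documented fix for this behaviour) is to write the file through a .NET StreamWriter and flush and dispose it explicitly before launching Notepad, so the content is committed before the editor reads it; $myFile is assumed to be an absolute path here.

$writer = New-Object System.IO.StreamWriter($myFile, $false)   # $false = overwrite rather than append
try {
    $writer.Write($myContent)
    $writer.Flush()                                            # push the buffered content to disk
}
finally {
    $writer.Dispose()
}
notepad $myFile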

Copy Script Trouble

I've created a script, based on what I found in other forums, to copy a file from one server to another. A bit of background here: we have large PDF files being generated nightly and saved on a share at one site that need to be copied over to our corporate share. These files are pretty sizable (anywhere from 25 MB to as high as 65 MB), so I can only copy them off hours. The source repository holds all the original files from the year, so I only want to copy the most recent files. I created a script (or tried to at least) that copies only the most recent files from the SourceHost location to the CorpHost share, and set up a scheduled task to run at 7:30 PM.
The script kicks off and runs as scheduled, but nothing gets copied over. I don't see any errors being generated from the scheduled task, and it appears to run as normal, as the script returns a "not copying .pdf". Originally, I thought that maybe it was bypassing all the files because the generation date is outside the $Max_days range (-1), so I increased it to -2. No luck. Increased it again to -5 - no luck. -10... nothing.
Here's the code sample:
$RemotePath = "\\<CorpHost>\Shared\Packing_Slips\<Site>"
$SourcePath = "\\<SourceHost>\<Site>_packingslips"
$Max_days = "-1"
$Curr_date = Get-Date
#Checking date and then copying file from RemotePath to LocalPath
foreach ($file in (Get-ChildItem $SourcePath))
{
    if ($file.LastWriteTime -gt ($Curr_date).AddDays($Max_days))
    {
        Copy-Item -Path $file.FullName -Destination $RemotePath
    }
    else
    {
        "not copying $file"
    }
}
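For comparison, the same age filter can be written as a single pipeline. This is only a sketch of an equivalent approach (with $Max_days cast to an integer, which is an assumption about the intent), not a confirmed fix for the behaviour described above.

$cutoff = (Get-Date).AddDays([int]$Max_days)
# -File skips folders; -Verbose shows each copy so the task history has something to log
Get-ChildItem -Path $SourcePath -File |
    Where-Object { $_.LastWriteTime -gt $cutoff } |
    Copy-Item -Destination $RemotePath -Verbose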

Powershell script to detect log file failures

I run perfmon on a server and the log files go into a c:\perfmon folder. The perfmon task restarts each week and the log files just collect in there over time, so there can be a range of CSV files in that folder with different dates.
I would like to write a PowerShell script that will check that folder to make sure there is a file there with today's modified date on it. If there isn't one for today, I would like a FAILED email so that I know to look at perfmon for issues.
Has anyone written anything like this? I have tried several of the scripts on here and none do it exactly how I would like it to work. This is what I have so far, based on other scripts.
It sort of works, but it is checking every file and responding for every file as well. If I had 3 files over the last three days and none for today, I would get 3 emails saying FAIL. If I have one for today and two older ones, I get 3 OK emails. If I have just one file for today, I get one OK email. How do I restrict this to just one email for a fail or success? There could be 50-100 files in that folder after two years, and I just want a FAIL if none of them were modified today.
Hope that all makes sense. I'm afraid my PowerShell skills are very weak.
$EmailTech = @{
    To = 'a@a.com'
    SmtpServer = 'relayserver'
    From = 'a@a.com'
}
$CompareDate = (Get-Date).AddDays(-1)
Get-ChildItem -Path c:\perflogs\logs | ForEach-Object {
    $Files = (Get-ChildItem -Path c:\perflogs\logs\*.csv | Where-Object {$_.LastWritetime -gt $CompareDate} | Measure-Object).Count
    if ($Files -eq 0)
    {
        $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
        $EmailTech.Body = 'A performance monitor log file was not found on ServerA for today'
        Send-MailMessage @EmailTech
    }
    Else
    {
        # If we found files it's ok and we don't report it
        $EmailTech.Subject = 'Perfmon File ServerA - OK'
        $EmailTech.Body = 'A performance monitor log file was found on ServerA for today'
        Send-MailMessage @EmailTech
    }
}
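The multiple emails come from the outer Get-ChildItem | ForEach-Object loop, which runs the whole check once per item in the folder. One way to get a single message is to count today's files once, outside any per-file loop, and send exactly one email. This is a sketch reusing the $EmailTech splat above; the midnight cutoff is my assumption about what "modified today" should mean.

$CompareDate = (Get-Date).Date   # midnight today
$todayCount = @(Get-ChildItem -Path c:\perflogs\logs\*.csv |
    Where-Object { $_.LastWriteTime -ge $CompareDate }).Count

if ($todayCount -eq 0) {
    $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
    $EmailTech.Body = 'No performance monitor log file was found on ServerA for today'
}
else {
    $EmailTech.Subject = 'Perfmon File ServerA - OK'
    $EmailTech.Body = 'A performance monitor log file was found on ServerA for today'
}
Send-MailMessage @EmailTech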

Remove-Item error: Cannot remove item [item path & name]: Access to the path '[item path & name]' is denied

I'm new to PowerShell.
I'm trying to automate the deployment of dll components from a folder on a source server to multiple folders on the destination server. This seems like it should be simple enough: copy components from source (deployment) folder on source server to folders on destination server, verify copies and finally delete components from deployment folder on source server.
The copying of the files from the source server to the destination server is working without issue.
However, when the script moves on to delete the components from the source server, I'm intermittently confronted with the error:
"Remove-Item error: Cannot remove item [item path & name]: Access to the path '[item path & name]' is denied."
I've run this script several times; sometimes it completes with no problems, sometimes with the error. The error does not occur for every file to be deleted and seems to occur on different components each time it presents.
Below is the function I have written to remove components and verify deletions:
function DeleteSourceFiles($srcPath) {
    # Announce delete
    OutputToHostAndLog ("Files will be removed from "+$srcPath+"...")
    OutputToHostAndLog "Removing files..."
    # Deletes all file items (i.e. all except folders) in source folder
    $filesToDelete = Get-ChildItem $srcPath | Where-Object {$_ -is [IO.FileInfo]}
    ForEach ($item in $filesToDelete) {
        Remove-Item $srcPath\$item -force
        # Verify deletions
        if (Test-Path($srcPath+"\"+$item)) {
            OutputToHostAndLog ("Delete failed: "+$item.Name)
            $fail++
        }
        else {
            OutputToHostAndLog ($item.Name+" deleted successfully...")
        }
    }
}
The use of the -Force parameter with the Remove-Item cmdlet does not appear to have any effect on the issue. The files (again, different files with each failure) don't appear to be read-only (IsReadOnly) anyway.
Similarly, running PowerShell as Administrator seems to have no effect, although Get-Acl for the source folder indicates that Administrator should have FullControl.
Is this a permissions issue that I am missing?
Any suggestions very much appreciated...
EDIT:
I updated my script thus:
function DeleteSourceFiles($srcPath) {
    # Announce delete
    OutputToHostAndLog ("Files will be removed from "+$srcPath+"...")
    OutputToHostAndLog "Removing files..."
    OutputToHostAndLog $gap
    # Delete all file items (i.e. all except folders) in source folder
    $filesToDelete = Get-ChildItem $srcPath | Where-Object {$_ -is [IO.FileInfo]} | ForEach {
        Remove-Item $_.FullName -Force
        # Verify deletions
        if (Test-Path($srcPath+"\"+$_)) {
            OutputToHostAndLog ("Delete failed: "+$_.Name)
            $fail++
        }
        else {
            OutputToHostAndLog ($_.Name+" deleted successfully...")
        }
    }
}
This seems to work OK although I'm still not sure why this arrangement should produce different results. In the interest of learning, any insights would be greatly appreciated...
Intermittent access denied errors likely indicate that one or more of the files you're trying to remove has been locked by another application. It's a really common problem when you're trying to clean up log directories.
The only thing I'd recommend to do is wait for the application with the lock to release the file.
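If waiting manually isn't practical, a retry wrapper is one common way to ride out transient locks (from antivirus, indexing, and the like). This is a sketch with made-up defaults, not something from the original answer.

function Remove-ItemWithRetry {
    param ($Path, $Attempts = 5, $DelaySeconds = 2)
    for ($i = 1; $i -le $Attempts; $i++) {
        try {
            Remove-Item -Path $Path -Force -ErrorAction Stop
            return $true    # delete succeeded
        }
        catch {
            Start-Sleep -Seconds $DelaySeconds    # give the locking process time to let go
        }
    }
    return $false           # still locked after all attempts
}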
I fixed this problem by setting the security permissions so that users have 'Modify' and 'Full Control' on the folder I was trying to delete from.
I had this issue while running PowerShell in a Scheduled Task. I did a few things and it worked on the last one:
Gave the user Full Control over the files in the folder.
Made my group a principal, e.g. New-ScheduledTaskPrincipal -GroupId "BUILTIN\Administrators" -RunLevel Highest
Set the Task to use Highest Privileges:
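For reference, registering a task that runs with the highest available privileges looks roughly like the sketch below; the script path and task name are hypothetical, not the poster's values.

$action    = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-File C:\scripts\deploy.ps1'   # hypothetical script path
$principal = New-ScheduledTaskPrincipal -GroupId "BUILTIN\Administrators" -RunLevel Highest
Register-ScheduledTask -TaskName 'DeployComponents' -Action $action -Principal $principal               # hypothetical task name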