I have a script which moves some files from a folder to the temp folder, archives them and cleans the temp folder at the end.
I want my script to also write information about this to the Windows event log.
Here is my script:
Get-ChildItem C:\Users\Administrator\Desktop\test1\ | Where-Object {$_.LastWriteTime -lt "09/24/2018 09:00 PM"} | Move-Item -Destination C:\Users\Administrator\Desktop\data\
Compress-Archive -path C:\Users\Administrator\Desktop\data\ -CompressionLevel Optimal -DestinationPath C:\Users\Administrator\Desktop\data1\test.zip
Remove-Item C:\Users\Administrator\Desktop\data\*
I want to add code that writes an event to the Windows event log for any error that occurs.
Per the comments, you can use Write-EventLog to write to the Windows Event Logs. If you want to write any errors that occur during those commands, then you probably want to use a Try..Catch to catch any errors and handle them:
Try {
    $PrevEAP = $ErrorActionPreference
    $ErrorActionPreference = 'Stop'

    Get-ChildItem C:\Users\Administrator\Desktop\test1\ | Where-Object {$_.LastWriteTime -lt "09/24/2018 09:00 PM"} | Move-Item -Destination C:\Users\Administrator\Desktop\data\
    Compress-Archive -Path C:\Users\Administrator\Desktop\data\ -CompressionLevel Optimal -DestinationPath C:\Users\Administrator\Desktop\data1\test.zip
    Remove-Item C:\Users\Administrator\Desktop\data\*
}
Catch {
    Write-Error $_

    $ErrorEvent = @{
        LogName   = 'Application'
        Source    = 'YourScript'
        EventID   = 123
        EntryType = 'Information'
        Message   = $_
    }
    Write-EventLog @ErrorEvent
}
Finally {
    $ErrorActionPreference = $PrevEAP
}
In order for an exception (error) to trigger a Try..Catch, the exception needs to be terminating (vs non-terminating). You can force cmdlets to throw terminating errors by setting the cmdlet's -ErrorAction parameter to Stop, or you can do this globally via the $ErrorActionPreference variable.
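For example, a minimal sketch of the per-cmdlet variant (the paths here are placeholders):

Try {
    # -ErrorAction Stop turns this cmdlet's non-terminating errors into terminating ones
    Move-Item -Path 'C:\placeholder\source\*' -Destination 'C:\placeholder\target\' -ErrorAction Stop
}
Catch {
    Write-Warning "Move failed: $_"
}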
In the catch block, the error is held in the special variable: $_. So we can use Write-Error to still write it out to the console (if you want to) and then we're using Write-EventLog to write it into the Event Log.
Customise LogName, Source, EventID, EntryType etc. as per your needs. Note that LogName needs to be one of the existing logs and EntryType needs to be one of the valid entry types (Information, Warning, Error).
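Also note that Write-EventLog fails if the Source isn't registered yet; if needed, you can register it once from an elevated prompt with New-EventLog (using the example source name from above):

# One-time, elevated setup: register the event source the script writes with
if (-not [System.Diagnostics.EventLog]::SourceExists('YourScript')) {
    New-EventLog -LogName 'Application' -Source 'YourScript'
}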
Related
Is there a way in PowerShell to use Expand-Archive so that files are written where they don't exist, but are not overwritten when they do exist? I can achieve this with -ErrorAction SilentlyContinue, but that ignores things that might be actual errors.
To silence only "file already exists" error messages of Expand-Archive, you can redirect the error stream to the success stream and process error records using ForEach-Object:
Expand-Archive -Path Test.zip -DestinationPath . -EA Continue 2>&1 | ForEach-Object {
    if( $_ -is [System.Management.Automation.ErrorRecord] ) {
        if( $_.FullyQualifiedErrorId -split ',' -notcontains 'ExpandArchiveFileExists' ) {
            Write-Error $_ # output error that is not "file exists"
        }
    }
    else {
        $_ # pass success stream through
    }
}
-EA Continue (-ErrorAction) overrides the preference variable $ErrorActionPreference to make sure errors are not turned into exceptions (in which case the first error would interrupt the extraction).
2>&1 redirects (merges) the error stream (#2) to the success stream (#1), so both can be processed using ForEach-Object.
$_ -is [System.Management.Automation.ErrorRecord] tests if the current pipeline element is an error record.
When this is the case, we test what kind of error we have by checking the FullyQualifiedErrorId property of the ErrorRecord (the exception type System.IO.IOException would be too general to test for).
Otherwise it is a message from the success stream, which will be simply passed through.
In case you are wondering how I came up with that FullyQualifiedErrorId value: I just ran Expand-Archive without redirection and called Get-Error afterwards. This outputs all information of the last error record, so I could look up what I needed to detect the error condition.
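For example (Get-Error requires PowerShell 7+; on Windows PowerShell you can inspect $Error[0] directly):

# Reproduce the error once without redirection, then inspect the last error record
Expand-Archive -Path Test.zip -DestinationPath . -ErrorAction SilentlyContinue
Get-Error                               # PowerShell 7+: full details of the most recent error
$Error[0].FullyQualifiedErrorId         # contains 'ExpandArchiveFileExists' for this case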
An alternative solution, similar to the one suggested by Abraham Zinala, is to unconditionally silence all errors and use -ErrorVariable to collect the errors and show the relevant ones after the call to Expand-Archive has returned:
$oldErrorActionPreference = $ErrorActionPreference
$ErrorActionPreference = 'SilentlyContinue'
$archiveErrors = $null
Expand-Archive -Path Test.zip -DestinationPath . -ErrorVariable archiveErrors
$ErrorActionPreference = $oldErrorActionPreference
$archiveErrors | Sort-Object { $_ | Out-String } -Unique | ForEach-Object {
    if( $_ -is [System.Management.Automation.ErrorRecord] ) {
        if( $_.FullyQualifiedErrorId -split ',' -notcontains 'ExpandArchiveFileExists' ) {
            $_ # output error that is not "file exists"
        }
    }
}
The errors of Expand-Archive cannot be completely silenced through the -ErrorAction parameter, because some errors (like input file doesn't exist) are detected as part of parameter validation. To really silence all errors, the $ErrorActionPreference variable must be used.
It is important to set the error variable to $null before calling Expand-Archive, because the command doesn't reset the variable when there is no error.
The name of the variable passed to -ErrorVariable must be specified without $.
The Sort-Object -Unique command makes sure we don't show duplicate errors.
My PowerShell script just checks multiple servers to make sure the input* and output* directories are clear of any files.
I'm simply trying to output to console the results of a GCI call prior to throwing an error message. However, when I uncomment the "throw" line, the $inputFiles and $outputFiles no longer output to the console. Below is the code:
$allServers = @(
    "server1.com",
    "server2.com")

foreach ($server in $allServers) {
    $inputFiles = Get-ChildItem -Path "\\$server\C$\jobs\statements\input*\" -Recurse | Where-Object {! $_.PSIsContainer } | Select FullName
    $outputFiles = Get-ChildItem -Path "\\$server\C$\jobs\statements\output*\" -Recurse | Where-Object {! $_.PSIsContainer } | Select FullName

    if ($inputFiles -eq $NULL -and $outputFiles -eq $NULL) {
        Write-Host "Environment is ready for statement processing."
    }
    else {
        Write-Host "Environment is NOT ready for statement processing."
        Write-Host "The following files exist in input/output: `n"
        $inputFiles
        $outputFiles
        #Throw "Files exist in input/output. See above for details."
    }
}
Below is the console output:
Environment is NOT ready for statement processing.
The following files exist in input/output:
Environment is NOT ready for statement processing.
The following files exist in input/output:
FullName
--------
\\server1.com\C$\jobs\statements\input\asdasd.txt
\\server1.com\C$\jobs\statements\input_254\asdasd.txt
\\server1.com\C$\jobs\statements\input_test\asdasd.txt
\\server2.com\C$\jobs\statements\input\CUSSTAT10302021.245
\\server2.com\C$\jobs\statements\input\CUSSTAT11312021
\\server2.com\C$\jobs\statements\input\CUSSTAT11312021.zip
And below is the console output when I uncomment the "throw" line:
Environment is NOT ready for statement processing.
The following files exist in input/output:
Files exist in input/output. See above for details.
At C:\jobs\statements\bin\Statements-EnvironmentCheck.ps1:47 char:9
+ Throw "Files exist in input/output. See above for details."
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (Files exist in ...ve for details.:String) [], RuntimeException
+ FullyQualifiedErrorId : Files exist in input/output. See above for details.
I know I have some error output cleanup to perform in order to include all the servers that might have files present, but please ignore that for now.
What you're experiencing is explained in this answer and this answer; basically, you need to pipe the output explicitly to Out-Host (or Out-Default):
$inputFiles, $outputFiles | Out-Host # Should fix the problem

# `throw` writes to the error stream, so it displays correctly on its own
throw "Files exist in input/output. See above for details."
However, I feel it's worth showing you a better way to approach your code: returning a unified array of objects which you can filter, sort and export.
$allServers = @(
    "server1.com"
    "server2.com"
)

$result = foreach ($server in $allServers) {
    # use `-File` instead of `! $_.PSIsContainer`
    $out = @{
        in  = Get-ChildItem "\\$server\C$\jobs\statements\input*\" -Recurse -File
        out = Get-ChildItem "\\$server\C$\jobs\statements\output*\" -Recurse -File
    }

    # if $out['in'] and $out['out'] are both `$null`, Ready is `$true`
    [pscustomobject]@{
        Ready  = -not ($out['in'] -or $out['out'])
        Server = $server
        Files  = $out
    }
}
Now, if you want to see which servers are Ready (no files in input and output):
$result.where{ $_.Ready }
And if you want to see which servers are not Ready, and have a list of the files:
$result.where{ -not $_.Ready }.foreach{
    foreach ($file in $_.Files.PSBase.Values.FullName) {
        [pscustomobject]@{
            Server = $_.Server
            Files  = $file
        }
    }
}
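Since $result is just an array of objects, that same not-ready listing can also be captured and exported, for instance (the CSV path here is only an example):

# Capture the per-file objects from the snippet above and export them to CSV
$notReady = $result.where{ -not $_.Ready }.foreach{
    foreach ($file in $_.Files.PSBase.Values.FullName) {
        [pscustomobject]@{
            Server = $_.Server
            Files  = $file
        }
    }
}
$notReady | Export-Csv -Path .\NotReadyFiles.csv -NoTypeInformation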
I've tried to make a PowerShell script that writes an event when the text files in a folder are older than 15 minutes.
This is the code I'm trying to use:
$targetfile=get-childitem C:\Users\user\Desktop\bizin\*.txt
if ($targetfile.AddMinutes -15 )
{
write-eventlog -logname Application -source ESENT -eventID 9999 -entrytype Error -message "No new files in 15min gate is down!" -category 3
'create event log to say bad things have happened'
}
I always get the event when I run the script. I would like to only get an event when the file is older than 15 minutes. It's going to be a scheduled task to run every 15 minutes.
You need to access the LastWriteTime property of the file and compare it with a DateTime object:
$targetFiles = Get-ChildItem -Path 'C:\Users\user\Desktop\bizin\' -Filter *.txt

foreach ($targetFile in $targetFiles)
{
    if ($targetFile.LastWriteTime -lt [DateTime]::Now.AddMinutes(-15))
    {
        # Write the eventlog....
    }
}
Note: You can use the -Filter parameter of the Get-ChildItem cmdlet to retrieve only .txt files.
Here is the same example with use of aliases and the pipeline in just one line:
gci -Path 'C:\Users\user\Desktop\bizin\' -Filter *.txt | ? { $_.LastWriteTime -lt [DateTime]::Now.AddMinutes(-15) } | % { <# Write the eventlog... #> }
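To tie this back to the original goal, a rough sketch combining the check with the asker's Write-EventLog call could look like this (LogName, Source, EventId and the message are taken from the question; in practice you'd normally register your own event source rather than reuse ESENT):

$staleFiles = Get-ChildItem -Path 'C:\Users\user\Desktop\bizin\' -Filter *.txt |
    Where-Object { $_.LastWriteTime -lt [DateTime]::Now.AddMinutes(-15) }

if ($staleFiles) {
    # at least one file is older than 15 minutes -> raise the event
    Write-EventLog -LogName Application -Source ESENT -EventId 9999 -EntryType Error -Category 3 -Message "No new files in 15min gate is down!"
}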
I'm using New-ADUser and Add-ADGroupMember
If the user already exists or is already in the group then the functions throw exceptions (which are expected and not a problem).
How do I log the exceptions to file and keep going?
Redirection is not working - the exceptions always go to the console.
-ErrorAction is not working - exceptions still go to the console.
Try / Catch works, but execution stops and the rest of the commands don't run.
I can do a Try / Catch for every single statement, but that seems ridiculous.
You can combine -ErrorAction SilentlyContinue with -ErrorVariable:
$e = $null
New-ADUser iExist -ErrorAction SilentlyContinue -ErrorVariable e
$e # contains the error
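If the goal is to log to a file and keep going, you can then append the captured error, e.g. (the log path here is just an example):

if ($e) {
    # append the captured error record(s) to a log file, then carry on with the script
    $e | Out-String | Out-File -FilePath 'C:\logs\ad-errors.txt' -Append
}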
You can also use the built in $Error variable, which is a circular buffer holding all the errors.
$ErrorActionPreference = 'SilentlyContinue' # I don't like this personally
New-ADUser iExist
Add-ADGroupMember iExist iForgotTheParameters
$Error[0] # The Add-ADGroupMember error
$Error[1] # The New-ADUser error
So you could set your $ErrorActionPreference, do a bunch of commands, and at the end of it all, do something like $Error | Out-File -FilePath errors.txt.
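A minimal sketch of that pattern (the file name is just an example; $Error is cleared first so only this run's errors are captured):

$Error.Clear()                                  # start this run with an empty error buffer
$ErrorActionPreference = 'SilentlyContinue'
New-ADUser iExist
Add-ADGroupMember iExist iForgotTheParameters
$Error | Out-File -FilePath .\errors.txt        # write everything that went wrong to a file
# (restore $ErrorActionPreference afterwards if the script continues)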
Have a look at PowerShell Error Handling and Why You Should Care for more ideas.
The simplest way to accomplish this is probably using the trap construct:
function Test-Trap {
    trap {
        $_ | Out-String | Out-File C:\path\to\errors.txt -Append
    }

    Get-ADUser -NoSuchParam "argument"
    Write-Host "Show must go on"
    nonexistingcommand
    Write-Host "Still executing"
}
When you call Test-Trap, you'll see that after the error has been written to the console, the trap is executed, and the rest of the execution flow is resumed.
The error record output, as it would normally appear on screen (courtesy of Out-String), is saved to the file.
You could add cool features like timestamps and stack traces to your trap:
function Test-Trap {
    trap {
        $LogPath = "C:\path\to\errors.txt"
        $ErrorCount = $ErrorCount + 1
        $("[Error {0} trapped {1}]:" -f $ErrorCount, (Get-Date -Format "dd/MM/yyyy HH:mm:ss.fff")) | Out-File $LogPath -Append
        $_ | Out-String | Out-File $LogPath -Append
        if (($st = $_.Exception.Stacktrace)) { $st | Out-File $LogPath -Append }
        $("[Error {0} logged]" -f $ErrorCount) | Out-File $LogPath -Append
    }

    Provoke-Error -NoSuchParam muhahaha
}
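Calling the function and then reading the log back (same example path as in the trap above):

Test-Trap
Get-Content -Path 'C:\path\to\errors.txt'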
I have a PowerShell workflow which is generating the error:
"The operation did not complete within the allotted timeout of 00:00:30. The time allotted to this operation may have been a portion of a longer timeout"
The workflow script is:
Workflow Test-Me (){
    Param
    (
        $Path = "c:\temp"
    )

    $Files = InlineScript{
        Get-ChildItem -Path $using:Path -File -Recurse -Force | Where-Object {$_.LastWriteTime -lt ((get-date).AddDays(-$using:Days))} | Select FullName
    }

    Foreach -parallel ($File in $Files){
        sequence{
            InlineScript{
                Remove-Item -Path $using:File.FullName -force -ErrorAction:SilentlyContinue
            } -PSActionRunningTimeoutSec 300
        }
    }
}
The line that generates the error is the InlineScript that handles the Remove-Item operation. It runs for about 30 seconds after it reaches that operation, then quits with the error referenced above. I've added the -PSActionRunningTimeoutSec parameter to the InlineScript and that didn't affect the error. I've also set the workflow common parameters as follows:
-PSRunningTimeoutSec = 300
-PSElapsedTimeoutSec = 0
I call the workflow cmdlet with the following process:
PS C:\> . "c:\path\to\Test-Me.ps1"
PS C:\> Test-Me -PSRunningTimeoutSec 300 -PSElapsedTimeoutSec 0
There's obviously a timeout somewhere that I can't find or don't know about, but PowerShell isn't being specific. What timeout did I miss and how do I change it?
References:
about_WorkflowCommonParameters
PowerShell Workflows: Using Parameters
Syntactic Differences Between Script Workflows and Scripts
about_InlineScript