Need my PowerShell logfile in a table format

I recently finished my script with the help of someone on this site (Matt), thanks again!
I now need to somehow get the logfile into a table format, and I'm not sure how to implement that with the current setup of the script. Any ideas?
Write-Host Report generated at (Get-date)
write-host("Lower Environments Status Check");
# Preprocessing Items
$msg = ""
$array = get-content C:\LowerEnvChecklist\appurls.txt
$log = "C:\LowerEnvChecklist\lowerenvironmentslog.txt"
$errorTexts = "error has occurred","Oops","Unable to display widget data","unexpected error occurred","temporarily unavailable","there was a problem"
$regex = ($errorTexts | ForEach-Object{[regex]::Escape($_)}) -join "|"
write-host("Checking appurls.txt...One moment please.");
("`n---------------------------------------------------------------------------") | out-file $log -Append
Get-Date | Out-File $log -Append
("`n***Checking Links***") | out-file $log -Append
("`n") | out-file $log -Append
# Loop through each element of the array.
ForEach($target in $array){
# Erase results for the next pass in case of error.
$result, $response, $stream, $page = $null
# Navigate to site urls
$result = [System.Net.WebRequest]::Create($target)
$response = $result.GetResponse()
$stream = [System.IO.StreamReader]$response.GetResponseStream()
$page = $stream.ReadToEnd()
# To ensure login/authentication pages that give a 403 response still show as online.
If($response.StatusCode -eq 403){
$msg = " $target -----> is ONLINE!"
# Determine if the status code 200 pages are truly up based on the information above.
} ElseIf($response.StatusCode -eq 200){
# While the page might have rendered need to determine there are no errors present.
If($page -notmatch $regex){
$msg = " $target -----> is ONLINE!"
} else {
$msg = " $target -----> may be DOWN, please check!"
}
} else {
$msg = " $target -----> may be DOWN, please check!"
}
# Log Results.
$msg | Out-File $log -Append -width 120
write-host $msg
# Close the response.
$response.Close()
}
# Write completion to logfile.
("`n") | out-file $log -Append
("`n***Lower Environments Checklist Completed***") | out-file $log -Append
# Write completion to host.
write-host("Lower Environments Checklist Completed");
# Open logfile once script is complete.
Invoke-Item C:\LowerEnvChecklist\lowerenvironmentslog.txt

If you just want to view it in-script, you could use Out-GridView on your log file. This will open a new window with a table-like view of the data in the log file. Depending on your formatting, you may have to add extra items such as human-readable headers.
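For example, a minimal sketch using the log path from your script (this shows one column of raw lines unless you convert them to objects first):

Get-Content C:\LowerEnvChecklist\lowerenvironmentslog.txt | Out-GridView -Title "Lower Environments Log"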

To whet your whistle with structured output, I opted to show you a CSV-based solution. Either way, all avenues require objects. What we do here is create a custom object that we populate as the script progresses. Each pass sends the details down the pipe. Using the pipeline, we can have Export-CSV collect all of the data in a nice file. Even filtering is possible now (see the example after the sample output).
write-host("Lower Environments Status Check");
# Preprocessing Items
$array = Get-Content C:\LowerEnvChecklist\appurls.txt
$log = "C:\LowerEnvChecklist\lowerenvironmentslog.csv"
$errorTexts = "error has occurred","Oops","Unable to display widget data","unexpected error occurred","temporarily unavailable","there was a problem"
$regex = ($errorTexts | ForEach-Object{[regex]::Escape($_)}) -join "|"
# Loop through each element of the array. Use the pipeline to make output easier
$array | ForEach-Object{
# Keep the variable $target so it is not lost in scopes. Build the object to be completed as we go.
$target = [pscustomobject][ordered]@{
URL = $_
Status = ""
Detail = "N/A"
Timestamp = Get-Date
}
# Erase results for the next pass in case of error.
$result, $response, $stream, $page = $null
# Navigate to the site URLs. If we can't access the site, set a flag to mark it as down.
$result = [System.Net.WebRequest]::Create($target.URL)
$response = try{$result.GetResponse()}catch{$null}
switch([int]$response.StatusCode){
403{
$target.Status = "OK"
$target.Detail = "403"
}
200{
# Get page content to confirm up status
$stream = [System.IO.StreamReader]$response.GetResponseStream()
$page = $stream.ReadToEnd()
# While the page might have rendered need to determine there are no errors present.
If($page -notmatch $regex){
$target.Status = "OK"
} else {
$target.Status = "DOWN"
$target.Detail = "Pattern"
}
}
default{
$target.Status = "DOWN"
}
}
# Send the object down the pipeline
$target
# Close the response. The object might not exist so check before we call the methods.
if($response){$response.Close()}
if($stream){$stream.Close()}
} | Export-CSV -Path $log -NoTypeInformation
# Write completion to host.
write-host("Lower Environments Checklist Completed");
# Open logfile once script is complete.
Invoke-Item $log
I took the liberty of adding another column to your request, called Detail; it could add context. Not sure what you wanted from the date, but if you have plenty of URLs and processing time then I suppose it could be of use. Also, to reduce the if logic, I added a switch statement. This will become more useful if you react to other status codes down the road. Still, a good thing to know.
Sample Output
URL                      Status Detail  Timestamp
---                      ------ ------  ---------
https://7fweb            DOWN   N/A     1/11/2016 12:18:16 PM
http://www.google.ca     OK     N/A     1/11/2016 12:18:16 PM
http://www.microsoft.com DOWN   Pattern 1/11/2016 12:18:16 PM
I added "windows" to $errorTexts to trigger a pattern match for microsoft.com


I have created a custom compliance policy in Endpoint manager but the device statuses are 'Error' instead of 'non-compliant' or 'compliant'

This is the basic picture I am working with. It doesn't show the device as 'compliant' or 'non-compliant', but rather as 'error' or 'pending'. I noticed backslashes used as escape characters in the logs. Could the path be a problem? Please advise.
$filePath = "C:\ProgramData\Autodesk\_PennCompliance"
$currentFileName = Get-ChildItem -Path $filePath -Name companyCompliance*.txt
$hash = @{
FileName = $currentFileName
}
Write-Output $hash
return $hash | ConvertTo-Json -Compress
My JSON file has the matching key from the hashtable, 'FileName':
{
"Rules":[
{
"SettingName":"FileName",
"Operator":"IsEquals",
"DataType":"String",
"Operand":"PennCompliance_2021-0921.txt",
"MoreInfoUrl":"https://call4cloud.nl/2021/11/the-last-days-of-custom-compliance/#part1",
"RemediationStrings":[
{
"Language":"en_US",
"Title":"Must update text file suffix.",
"Description": "Must update the suffix containing the date (PennCompliance_yyyy-mmdd.txt) of the PennCompliance text file."
}
]
}
]
}
Now I am looking for a status of 'Compliant' or 'Not compliant', but instead see 'error' or 'pending'. I have deployed a custom compliance policy from a sample and it worked fine, so it's something with this code. I also looked at the IntuneManagementExtension logs at the time it failed.
{"PolicyId":"0cb83122-b322-45f5-9ab1-8e75c28ce7f5","UserId":"3dc325b8-6b7b-4b95-9e56-df64471366e0","PolicyHash":null,"Result":3,"ResultDetails":null,"InternalVersion":2,"ErrorCode":0,"ResultType":3,"PreRemediationDetectScriptOutput":"{\"FileName\":{\"value\":\"PennoniCompliance_2021-0921.txt\",\"PSPath\":\"Microsoft.PowerShell.Core\\\\FileSystem::C:\\\\ProgramData\\\\Autodesk\\\\_PennCompliance\\\\PennoniCompliance_2021-0921.txt\",\"PSParentPath\":\"Microsoft.PowerShell.Core\\\\FileSystem::C:\\\\ProgramData\\\\Autodesk\\\\_PennCompliance\",\"PSChildName\":\"PennoniCompliance_2021-0921.txt\",\"PSDrive\":{\"CurrentLocation\":\"WINDOWS\\\\system32\",\"Name\":\"C\",\"Provider\":\"Microsoft.PowerShell.Core\\\\FileSystem\",\"Root\":\"C:\\\\\",\"Description\":\"Windows\",\"MaximumSize\":null,\"Credential\":\"System.Management.Automation.PSCredential\",\"DisplayRoot\":null},\"PSProvider\":{\"ImplementingType\":\"Microsoft.PowerShell.Commands.FileSystemProvider\",\"HelpFile\":\"System.Management.Automation.dll-Help.xml\",\"Name\":\"FileSystem\",\"PSSnapIn\":\"Microsoft.PowerShell.Core\",\"ModuleName\":\"Microsoft.PowerShell.Core\",\"Module\":null,\"Description\":\"\",\"Capabilities\":52,\"Home\":\"C:\\\\WINDOWS\\\\system32\\\\config\\\\systemprofile\",\"Drives\":\"C\"},\"PSIsContainer\":false}}","PreRemediationDetectScriptError":null,"RemediationScriptErrorDetails":null,"PostRemediationDetectScriptOutput":null,"PostRemediationDetectScriptError":null,"RemediationStatus":4,"Info":{"RemediationExitCode":null,"FirstDetectExitCode":0,"LastDetectExitCode":null,"ErrorDetails":null},"TargetType":1,"RunAsAccount":1,"AssignmentFilterIds":null,"BiosMetadata":null}
I just don't know what to make of it. It looks like the backslash is escaping the absolute path, but I'm not sure. What I do know is that any help would be much appreciated.
You're getting an object back and passing the whole thing to the output. $currentFileName could possibly return multiple files, and each of those files has multiple properties.
You can inspect this on your box by passing $currentFileName to Get-Member to see what properties are available:
$currentFileName | Get-Member
To get back just the file name of the first result, you can do the following. (Note that with -Name, Get-ChildItem already returns name strings, so wrap the call in @() and index it rather than asking for a .Name property.)
$currentFileName = @(Get-ChildItem -Path $filePath -Name companyCompliance*.txt)[0]
Just be warned, if Get-ChildItem doesn't return anything, the [0] will throw an error. You can use an if block to check the count.
I also removed the Write-Output, because emitting the hashtable in addition to the returned JSON means the script produces two outputs, and the detection script is expected to return a single compressed JSON string.
Here's the full solution:
$filePath = "C:\ProgramData\Autodesk\_PennoniCompliance"
$currentFile = Get-ChildItem -Path $filePath -Name companyCompliance*.txt
if($currentFile.Count -ge 1){
$currentFileName = $currentFile[0].Name
}
else {
$currentFileName = 'None'
}
$hash = #{
FileName = $currentFileName
}
return $hash | ConvertTo-Json -Compress
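With those changes the script emits a single compressed JSON string for the compliance engine to compare against the rule, along the lines of:

{"FileName":"PennCompliance_2021-0921.txt"}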

Read Log File for Monitoring using regex

I am trying to monitor a log file using PowerShell, but I am not able to figure out what regex I should use to get my required monitoring output.
Get-Content $file -wait | where {$_ -match "some regex"} |
foreach { send_email($_) }
Here is a sample of my Log File.
19:43:06.5230 Info {"message":"YourCode_Prod execution started","level":"Information","timeStamp":"2019-01-15T19:43:06.5132404+00:00","fingerprint":"588aeb19-76e5-415a-88ff-a69797eb414f","windowsIdentity":"AD\\gaurav.m","machineName":"EDEV-3","processName":"YourCode_Prod","processVersion":"1.0.6800.16654","fileName":"YourCode_Work","jobId":"22a537ae-35e6-4abd-a57d-9dd0c273e81a","robotName":"Gaurav"}
19:50:48.8014 Info {"message":"YourCode_Prod execution ended","level":"Information","timeStamp":"2019-01-15T19:50:48.8005228+00:00","fingerprint":"b12b7d6f-cf3a-4e24-b1e6-1cf4413c12e2","windowsIdentity":"AD\\gaurav.m","machineName":"EDEV-3","processName":"YourCode_Prod","processVersion":"1.0.6800.16654","fileName":"YourCode_Work","jobId":"22a537ae-35e6-4abd-a57d-9dd0c273e81a","robotName":"Gaurav","totalExecutionTimeInSeconds":462,"totalExecutionTime":"00:07:42"}
I tried to generate a regex, but I could not find the right logic for this.
$searchpattern = [regex]"(?:(?:started))|(?:(?:ended))"
Get-Content -Path C:\Execution.log | foreach {
if ($PSItem -match $searchpattern) {
# does the entry have a valid date/time
$time, $type, $data = $PSItem -split " ", 3
$time
$type
$data
}
}
I need to monitor this file, which will have many lines, but I want to take action only for "execution started" and "execution ended".
I could separate the message type, the time, and the data, but I need to drill down further into the data.
This script will run every 5 minutes, so I will compare against the time 5 minutes ago and start reading the log from there; as soon as I find "ended" or "started" I will take the necessary action.
Your log data is in JSON format, so you could simply process the extracted data as such.
Get-Content -Path C:\Execution.log | ForEach-Object {
$time, $type, $json = $_ -split " ", 3
$data = ConvertFrom-Json $json
if ($data.message -match 'execution (started|ended)') {
# do stuff
}
}
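Since you mention running every 5 minutes and only acting on recent entries, you could also filter on the embedded timeStamp field. A rough sketch building on the same parsing (the cutoff logic is illustrative, not tested against your full log):

$cutoff = (Get-Date).AddMinutes(-5)
Get-Content -Path C:\Execution.log | ForEach-Object {
    $time, $type, $json = $_ -split " ", 3
    $data = ConvertFrom-Json $json
    # act only on start/end events newer than the cutoff
    if ($data.message -match 'execution (started|ended)' -and [datetime]$data.timeStamp -ge $cutoff) {
        # do stuff, e.g. send_email $data.message
    }
}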

Issue creating a script with dual logic in deeming a website online

I am currently trying to create a script that allows me to check multiple web URLs in order to see if they are online and active. My company has multiple servers with different environments active (Production, Staging, Development, etc.). I need a script that can check all the environments' URLs and tell me whether or not they are online each and every morning, so I can be ahead of the game in addressing any servers or websites being down.
My issue, however, is that I can't base the logic strictly on an HTTP code to deem a site online or not; some of our websites may be online from an HTTP standpoint but have components or web parts that are down, displaying an error message on the page.
I am having trouble coming up with a script that can not only check the HTTP status but also scan the page, parse out any error messages, and then write to the host, based on both pieces of logic, whether the site is "Online" or "Down".
Here is what I have so far; you will notice it does not include anything for parsing out keywords, as I don't know how to implement that...
#Lower Environments Checklist Automated Script
Write-Host Report generated at (Get-date)
write-host("Lower Environments Status Check");
$msg = ""
$array = get-content C:\LowerEnvChecklist\appurls.txt
$log = "C:\LowerEnvChecklist\lowerenvironmentslog.txt"
write-host("Checking appurls.txt...One moment please.");
("`n--------------------------------------------------------------------------- ") | out-file $log -Append
Get-Date | Out-File $log -Append
("`n***Checking Links***") | out-file $log -Append
("`n") | out-file $log -Append
for ($i=0; $i -lt $array.length; $i++) {
$HTTP_Status = -1
$HTTP_Request = [System.Net.WebRequest]::Create($array[$i])
$HTTP_Request.Timeout =60000
$HTTP_Response = $HTTP_Request.GetResponse()
$HTTP_Status = [int]$HTTP_Response.StatusCode
If ($HTTP_Status -eq 200) {
$msg = $array[$i] + " is ONLINE!"
}
Else {
$msg = $array[$i] + " may be DOWN, please check!"
}
$HTTP_Response.Close()
$msg | Out-File $log -Append -width 120
write-host $msg
}
("`n") | out-file $log -Append
("`n***Lower Environments Checklist Completed***") | out-file $log -Append
write-host("Lower Environments Checklist Completed");
appurls.txt just contains the internal URLs I need checked FYI.
Any help would be much appreciated! Thanks.
Here is something to at least give you an idea of what to do. We need to capture the website data in order to parse it. Then we run a regex query against it, built from an array of strings. Those strings are texts that might be seen on a page that is not working.
# build a regex query of error strings to match against.
$errorTexts = "error has occurred","Oops","Unable to display widget data","unexpected error occurred","temporarily unavailable"
$regex = ($errorTexts | ForEach-Object{[regex]::Escape($_)}) -join "|"
# Other preprocessing would go here
# Loop through each element of the array
ForEach($target in $array){
# Erase results for the next pass in case of error.
$result, $response, $stream, $page = $null
# Navigate to the website.
$result = [System.Net.WebRequest]::Create($target)
$response = $result.GetResponse()
$stream = [System.IO.StreamReader]$response.GetResponseStream()
$page = $stream.ReadToEnd()
# Determine if the page is truly up based on the information above.
If($response.StatusCode -eq 200){
# While the page might have rendered need to determine there are no errors present
if($page -notmatch $regex){
$msg = "$target is online!"
} else {
$msg = "$target may be DOWN, please check!"
}
} else {
$msg = "$target may be DOWN, please check!"
}
# Log Result
$msg | Out-File $log -Append -width 120
# Close the connection
$response.Close()
}
# Other postprocessing would go here
I wanted to show what a here-string looks like, to replace some of your Out-File repetition. Your log file header used to take several calls; I have reduced it to one.
#"
---------------------------------------------------------------------------
$(Get-Date)
***Checking Links***
"# | Out-File $log -Append
Also consider CodeReview.SE for critiquing working code. There are other areas which could in theory be improved but are out of scope for this question.

How to parse and delete archived event logs in Powershell

I'm trying to parse archived Security logs to track down an issue with changing permissions. This script greps through .evtx files that are 10+ days old. It currently outputs what I want, but when it goes to clean up the old logs (about 50 GB daily, uncompressed, each of which is archived into its own daily folder via another script that runs at midnight) it begins complaining that the logs are in use and cannot be deleted. The process that seems to be using them when I try to delete the files through Explorer is alternately DHCP Client or Event Viewer; stopping both of these services works, but clearly I can't run without eventvwr. DHCP Client is used for networking niceness but is not needed.
The only thing that touches the .evtx files is this script, they're not backed up, they're not monitored by anything else, they're not automatically parsed by the Event Log service, they're just stored on disk waiting.
The script originally deleted things as it went, but since that failed, all the deletions were moved to the end, and then into the KillLogWithFire() function. Even the timer doesn't seem to help. I've also tried moving the files to a Processed subfolder, but that doesn't work, for the same reason.
I assume there's some way to release any handles that this script opens on any files, but attempting to .Close() or .Dispose() of the EventLog variable in the loop doesn't work.
$XPath = @'
*[System[Provider/@Name='Microsoft-Windows-Security-Auditing']]
and
*[System/EventID=4670]
'@
$DeletableLogs = @()
$logfile = "L:\PermChanges.txt"
$AdminUsers = ("List","of","Admin","Users")
$today = Get-Date
$marker = "
-------------
$today
-------------
"
write-output $marker >> $logfile
Function KillLogWithFire($log){
Try {
remove-item $log
}
Catch [System.IO.IOException]{
$Timer += 1
sleep $timer
write-output "Killing log $log in $timer seconds"
KillLogWithFire($log)
}
}
Function LogPermissionChange($PermChanges){
ForEach($PermChange in $PermChanges){
$Change = @{}
$Change.ChangedBy = $PermChange.properties[1].value.tostring()
#Filter out normal non-admin users
if ($AdminUsers -notcontains $Change.ChangedBy){continue}
$Change.FileChanged = $PermChange.properties[6].value.tostring()
#Ignore temporary files
if ($Change.FileChanged.EndsWith(".tmp")){continue}
elseif ($Change.FileChanged.EndsWith(".partial")){continue}
$Change.MadeOn = $PermChange.TimeCreated.tostring()
$Change.OriginalPermissions = $PermChange.properties[8].value.tostring()
$Change.NewPermissions = $PermChange.properties[9].value.tostring()
write-output "{" >> $logfile
write-output ("Changed By : "+ $Change.ChangedBy) >> $logfile
write-output ("File Changed : "+ $Change.FileChanged) >> $logfile
write-output ("Change Made : "+ $Change.MadeOn) >> $logfile
write-output ("Original Permissions :
"+ $Change.OriginalPermissions) >> $logfile
write-output ("New Permissions :
"+ $Change.NewPermissions) >> $logfile
"}
" >> $logfile
}
}
GCI -include Archive-Security*.evtx -path "L:\Security\$($Today.AddDays(-10))" -recurse | ForEach-Object{
Try{
$PermChanges = Get-WinEvent -Path $_ -FilterXPath $XPath -ErrorAction Stop
}
Catch [Exception]{
if ($_.Exception -match "No events were found that match the specified selection criteria."){
}
else {
Throw $_
}
}
LogPermissionChange($PermChanges)
$PermChanges = $Null
$DeletableLogs += $_
}
foreach ($log in $DeletableLogs){
$Timer = 0
Try{
remove-item $log
}
Catch [IOException]{
KillLogWithFire($log)
}
}
UPDATE
Rather than editing the original code, as I've been told not to do, I wanted to post the full code that's now in use as a separate answer. The initial part, which parses the logs and is run every 30 minutes, is mostly the same as above:
$XPath = @'
*[System[Provider/@Name='Microsoft-Windows-Security-Auditing']]
and
*[System/EventID=4670]
'@
$DeletableLogs = @()
$logfile = "L:\PermChanges.txt"
$DeleteList = "L:\DeletableLogs.txt"
$AdminUsers = ("List","Of","Admins")
$today = Get-Date
$marker = "
-------------
$today
-------------
"
write-output $marker >> $logfile
Function LogPermissionChange($PermChanges){
ForEach($PermChange in $PermChanges){
$Change = @{}
$Change.ChangedBy = $PermChange.properties[1].value.tostring()
#Filter out normal non-admin users
if ($AdminUsers -notcontains $Change.ChangedBy){continue}
$Change.FileChanged = $PermChange.properties[6].value.tostring()
#Ignore temporary files
if ($Change.FileChanged.EndsWith(".tmp")){continue}
elseif ($Change.FileChanged.EndsWith(".partial")){continue}
$Change.MadeOn = $PermChange.TimeCreated.tostring()
$Change.OriginalPermissions = $PermChange.properties[8].value.tostring()
$Change.NewPermissions = $PermChange.properties[9].value.tostring()
write-output "{" >> $logfile
write-output ("Changed By : "+ $Change.ChangedBy) >> $logfile
write-output ("File Changed : "+ $Change.FileChanged) >> $logfile
write-output ("Change Made : "+ $Change.MadeOn) >> $logfile
write-output ("Original Permissions :
"+ $Change.OriginalPermissions) >> $logfile
write-output ("New Permissions :
"+ $Change.NewPermissions) >> $logfile
"}
" >> $logfile
}
}
GCI -include Archive-Security*.evtx -path L:\Security\ -recurse | ForEach-Object{
Try{
$PermChanges = Get-WinEvent -Path $_ -FilterXPath $XPath -ErrorAction Stop
}
Catch [Exception]{
if ($_.Exception -match "No events were found that match the specified selection criteria."){
}
else {
Throw $_
}
}
LogPermissionChange($PermChanges)
$PermChanges = $Null
$DeletableLogs += $_
}
foreach ($log in $DeletableLogs){
write-output $log.FullName >> $DeleteList
}
The second portion does the deletion, including the helper function above graciously provided by TheMadTechnician. The code still loops, as a straight delete is faster than the function but not always successful, even long after the files were last touched:
# Log Cleanup script. Works around open log issues caused by PS parsing of
# saved logs in EventLogParser.ps1
$DeleteList = "L:\DeletableLogs.txt"
$DeletableLogs = get-content $DeleteList
Function Close-LockedFile{
Param(
[Parameter(Mandatory=$true,ValueFromPipeline=$true)][String[]]$Filename
)
Begin{
$HandleApp = 'C:\sysinternals\Handle.exe'
If(!(Test-Path $HandleApp)){Write-Host "Handle.exe not found at $HandleApp`nPlease download it from www.sysinternals.com and save it in the aforementioned location.";break}
}
Process{
$HandleOut = Invoke-Expression ($HandleApp+' '+$Filename)
$Locks = $HandleOut |?{$_ -match "(.+?)\s+pid: (\d+?)\s+type: File\s+(\w+?): (.+)\s*$"}|%{
[PSCustomObject]@{
'AppName' = $Matches[1]
'PID' = $Matches[2]
'FileHandle' = $Matches[3]
'FilePath' = $Matches[4]
}
}
ForEach($Lock in $Locks){
Invoke-Expression ($HandleApp + " -p " + $Lock.PID + " -c " + $Lock.FileHandle + " -y") | Out-Null
If ( ! $LastexitCode ) { "Successfully closed " + $Lock.AppName + "'s lock on " + $Lock.FilePath}
}
}
}
Function KillLogWithFire($log){
Try {
Close-LockedFile $Log
}
Catch [System.IO.IOException]{
$Timer += 1
sleep $timer
write-host "Killing $Log in $Timer seconds with fire."
KillLogWithFire($Log)
}
}
foreach ($log in $DeletableLogs){
Try {
remove-item $log -ErrorAction Stop
}
Catch [System.IO.IOException]{
$Timer = 0
KillLogWithFire($Log)
}
}
remove-item $DeleteList
One solution would be to get HANDLE.EXE and use it to close any open handles. Here's a function that I use roughly based off of this script. It uses handle.exe, finds what has a file locked, and then closes handles locking that file open.
Function Close-LockedFile{
Param(
[Parameter(Mandatory=$true,ValueFromPipeline=$true)][String[]]$Filename
)
Begin{
$HandleApp = 'C:\sysinternals\Handle.exe'
If(!(Test-Path $HandleApp)){Write-Host "Handle.exe not found at $HandleApp`nPlease download it from www.sysinternals.com and save it in the aforementioned location.";break}
}
Process{
$HandleOut = Invoke-Expression ($HandleApp+' '+$Filename)
$Locks = $HandleOut |?{$_ -match "(.+?)\s+pid: (\d+?)\s+type: File\s+(\w+?): (.+)\s*$"}|%{
[PSCustomObject]@{
'AppName' = $Matches[1]
'PID' = $Matches[2]
'FileHandle' = $Matches[3]
'FilePath' = $Matches[4]
}
}
ForEach($Lock in $Locks){
Invoke-Expression ($HandleApp + " -p " + $Lock.PID + " -c " + $Lock.FileHandle + " -y") | Out-Null
If ( ! $LastexitCode ) { "Successfully closed " + $Lock.AppName + "'s lock on " + $Lock.FilePath}
}
}
}
I have handle.exe saved in C:\Sysinternals, you may want to adjust the path in the function, or save the executable there.
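Usage is then just a matter of passing (or piping) the locked file's path; the path below is only an example:

Close-LockedFile -Filename "L:\Security\Archive-Security-2016-01-11.evtx"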
I was having a very similar problem, and after lots of searching I found this article. Whilst handle.exe worked when I first tried it, I did note that -c carries a warning: "Closing handles can cause application or system instability".
I am also using Get-WinEvent and it seems to (sometimes) lock the .evtx file being processed. I have written a loop to wait 5 seconds and retry (sketched below). Sometimes it takes up to 2 minutes for the file to be released; I have had one run overnight and it had hundreds of retries.
When I used handle the first time it worked perfectly. I then implemented it in the script and later found it looping on an "unexplained error". I ended up having to reboot the server to get things working again, so I removed handle.exe from the script and went back to waiting for the file to be closed.
I can reliably release the file by stopping the script and closing down the PowerShell ISE. As soon as the ISE is closed, the file can be deleted without a problem.
Unfortunately I need this script to keep running and not be held up by the file remaining open. I am surprised that I have to resort to Sysinternals to release the file and that PowerShell does not offer an easy way to close it.
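The wait-and-retry loop is not shown in the post; a rough sketch of the idea (not the poster's exact code), reusing the DeletableLogs list from earlier:

$DeletableLogs = Get-Content "L:\DeletableLogs.txt"
foreach ($log in $DeletableLogs) {
    # retry the delete every 5 seconds until the handle is released
    while (Test-Path $log) {
        try {
            Remove-Item $log -ErrorAction Stop
        }
        catch [System.IO.IOException] {
            Start-Sleep -Seconds 5
        }
    }
}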
I had the same issue as GTEM, where closing the handles would cause corruption when processing hundreds of event log files. Eventually Get-WinEvent would not work properly; it would either freeze or give me the same "unexplained error".
So I opened a Premier case with MS. They led me to discover that the variable I was storing the Get-WinEvent events in was what was locking the file; I guess it doesn't actually release the file if you are still using that variable. So to resolve this, I added some code to my script after transferring the events to a new variable. You can see the code I added in the third region listed below.
#***************************************************************************
#region *** Get the log entries.
# clear the log entry for each pass
$LogEntry = @()
# Get the events from the log file into the LogEntry variable and output them to the screen
Get-WinEvent -Path $NewPath -FilterXPath $XPathFilter -ErrorAction SilentlyContinue | Tee-Object -Variable LogEntry
#endregion *** End get the log entries
#***************************************************************************
#***************************************************************************
#region *** This is where I copy it to the new variable for later output.
# if there are any log entries
if ($LogEntry.Count -gt 0) {
# Add the log entries to the log file
$LogEntries += $LogEntry
} # if there are any log entries
#endregion *** End were I copy to the new variable.
#***************************************************************************
#***************************************************************************
#region *** This is where I added code to allow me to remove the file lock.
# Remove the variable to release the evtx file lock
Remove-Variable -Name LogEntry
# Garbage collect to remove any additional memory tied to the file lock.
[GC]::Collect()
# sleep for 1 seconds
Sleep -Seconds 1
#endregion **** Code to remove the file lock.
#***************************************************************************
After this was done, I no longer have to use Handle.exe to close the file anymore.

Powershell: Search data in *.txt files to export into *.csv

First of all, this is my first question here. I often come here to browse existing topics, but now I'm stuck on my own problem, and I haven't found a helpful resource so far. My biggest concern is that it won't work in PowerShell at all... At the moment I'm trying to build a small PowerShell tool to save me a lot of time. For those who don't know cw-sysinfo, it is a tool that collects information about any host system (e.g. hardware ID, product key, and stuff like that) and generates *.txt files.
My point is, if you have 20, 30, or 80 servers in a project, it takes a huge amount of time to browse all the files, look for just the lines you need, and put them together in a *.csv file.
What I have working is more like the basics of the tool: it browses all *.txt files in a specific path and checks for my keywords. And here is the problem: I can only search for the words prior to those I really need, as follows:
Operating System: Windows XP
Product Type: Professional
Service Pack: Service Pack 3
...
I don't know how I can tell PowerShell to search for the "Product Type:" line and pick out the following "Professional" instead. Later on, with keys or serial numbers, it will be the same problem; that is why I can't just search for "Standard" or "Professional".
I placed my keywords ($controls) in an extra file that I can attach to the project folders, so I don't need to edit the PowerShell each time. The code looks like this:
Function getStringMatch
{
# Loop through the project directory
Foreach ($file In $files)
{
# Check all keywords
ForEach ($control In $controls)
{
$result = Get-Content $file.FullName | Select-String $control -quiet -casesensitive
If ($result -eq $True)
{
$match = $file.FullName
# Write the filename according to the entry
"Found : $control in: $match" | Out-File $output -Append
}
}
}
}
getStringMatch
I think this is the kind of thing you need. I've changed Select-String to not use the -Quiet option; this returns a match object, one of whose properties is the line. I then split the line on the ':' and trim any spaces. These results are placed into a new PSObject, which in turn is added to an array. The array is put back on the pipeline at the end.
I also moved the call to Get-Content to avoid reading each file more than once.
# Create an array for results
$results = @()
# Loop through the project directory
Foreach ($file In $files)
{
# load the content once
$content = Get-Content $file.FullName
# Check all keywords
ForEach ($control In $controls)
{
# find the line containing the control string
$result = $content | Select-String $control -casesensitive
If ($result)
{
# tidy up the results and add to the array
$line = $result.Line -split ":"
$results += New-Object PSObject -Property @{
FileName = $file.FullName
Control = $line[0].Trim()
Value = $line[1].Trim()
}
}
}
}
# return the results
$results
Adding the results to a csv is just a case of piping the results to Export-Csv
$results | Export-Csv -Path "results.csv" -NoTypeInformation
If I understand your question correctly, you want some way to parse each line of your report files and extract values for certain "keys". Here are a few lines to give you an idea of how you could proceed. The example is for one file, but can be generalized very easily (see the sketch after the code).
$config = Get-Content ".\config.txt"
# The stuff you are searching for
$keys = @(
"Operating System",
"Product Type",
"Service Pack"
)
foreach ($line in $config)
{
$keys | %{
$regex = "\s*?$($_)\:\s*(?<value>.*?)\s*$"
if ($line -match $regex)
{
$value = $matches.value
Write-Host "Key: $_`t`tValue: $value"
}
}
}
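To generalize to many files and land back at the CSV goal from the question, you could emit an object per match instead of writing to the host. A sketch along the same lines, reusing $keys from above (the reports folder path is illustrative):

$results = foreach ($file in Get-ChildItem -Path ".\reports" -Filter *.txt) {
    foreach ($line in Get-Content $file.FullName) {
        foreach ($key in $keys) {
            if ($line -match "\s*?$([regex]::Escape($key))\:\s*(?<value>.*?)\s*$") {
                # one record per key found in this file
                [PSCustomObject]@{
                    FileName = $file.Name
                    Key      = $key
                    Value    = $Matches.value
                }
            }
        }
    }
}
$results | Export-Csv -Path "results.csv" -NoTypeInformation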