Issue creating a script with dual logic for deeming a website online - PowerShell

I am currently trying to create a script that checks multiple web URLs to see if they are online and active. My company has multiple servers with different environments (Production, Staging, Development, etc.). I need a script that checks all the environments' URLs each morning and tells me whether they are online, so I can be ahead of the game in addressing any servers or websites being down.
My issue, however, is that I can't base the logic solely on an HTTP status code to deem a site online or not: some of our websites may be online from an HTTP standpoint but have components or web parts that are down, displaying an error message on the page.
I am having trouble coming up with a script that not only checks the HTTP status but also scans the page, parses out any error messages, and then writes to the host, based on both pieces of logic, whether the site is "Online" or "Down".
Here is what I have so far; you will notice it does not include anything regarding parsing for keywords, as I don't know how to implement that...
#Lower Environments Checklist Automated Script
Write-Host "Report generated at $(Get-Date)"
Write-Host "Lower Environments Status Check"
$msg = ""
$array = Get-Content C:\LowerEnvChecklist\appurls.txt
$log = "C:\LowerEnvChecklist\lowerenvironmentslog.txt"
Write-Host "Checking appurls.txt...One moment please."
"`n---------------------------------------------------------------------------" | Out-File $log -Append
Get-Date | Out-File $log -Append
"`n***Checking Links***" | Out-File $log -Append
"`n" | Out-File $log -Append
for ($i = 0; $i -lt $array.Length; $i++) {
    $HTTP_Status = -1
    $HTTP_Request = [System.Net.WebRequest]::Create($array[$i])
    $HTTP_Request.Timeout = 60000
    $HTTP_Response = $HTTP_Request.GetResponse()
    $HTTP_Status = [int]$HTTP_Response.StatusCode
    If ($HTTP_Status -eq 200) {
        $msg = $array[$i] + " is ONLINE!"
    }
    Else {
        $msg = $array[$i] + " may be DOWN, please check!"
    }
    $HTTP_Response.Close()
    $msg | Out-File $log -Append -Width 120
    Write-Host $msg
}
"`n" | Out-File $log -Append
"`n***Lower Environments Checklist Completed***" | Out-File $log -Append
Write-Host "Lower Environments Checklist Completed"
appurls.txt just contains the internal URLs I need checked, FYI.
Any help would be much appreciated! Thanks.

Here is something to at least give you an idea of what to do. We need to capture the website data in order to parse it. Then we run a regex query against it, built from an array of strings. Those strings are texts that might be seen on a page that is not working.
# Build a regex query of error strings to match against.
$errorTexts = "error has occurred","Oops","Unable to display widget data","unexpected error occurred","temporarily unavailable"
$regex = ($errorTexts | ForEach-Object{ [regex]::Escape($_) }) -join "|"
# Other preprocessing would go here.
# Loop through each element of the array.
ForEach ($target in $array) {
    # Erase results from the previous pass in case of error.
    $result, $response, $stream, $page = $null
    # Navigate to the website.
    $result = [System.Net.WebRequest]::Create($target)
    $response = $result.GetResponse()
    $stream = [System.IO.StreamReader]$response.GetResponseStream()
    $page = $stream.ReadToEnd()
    # Determine if the page is truly up based on the information above.
    If ($response.StatusCode -eq 200) {
        # While the page might have rendered, we need to determine there are no errors present.
        if ($page -notmatch $regex) {
            $msg = "$target is online!"
        } else {
            $msg = "$target may be DOWN, please check!"
        }
    } else {
        $msg = "$target may be DOWN, please check!"
    }
    # Log result.
    $msg | Out-File $log -Append -Width 120
    # Close the connection.
    $response.Close()
}
# Other postprocessing would go here.
I wanted to show you what a here-string looks like, to replace some of your Out-File repetition. Your log file header used to be several lines of this; I have reduced it to one.
@"
---------------------------------------------------------------------------
$(Get-Date)
***Checking Links***
"@ | Out-File $log -Append
Also consider CodeReview.SE for critiquing working code. There are other areas which could in theory be improved, but they are out of scope for this question.

Related

Read Log File for Monitoring using regex

I am trying to monitor a log file using PowerShell, but I am not able to figure out what regex I should use to get my required monitoring output.
Get-Content $file -Wait | Where-Object { $_ -match "some regex" } |
    ForEach-Object { send_email $_ }
Here is a sample of my Log File.
19:43:06.5230 Info {"message":"YourCode_Prod execution started","level":"Information","timeStamp":"2019-01-15T19:43:06.5132404+00:00","fingerprint":"588aeb19-76e5-415a-88ff-a69797eb414f","windowsIdentity":"AD\\gaurav.m","machineName":"EDEV-3","processName":"YourCode_Prod","processVersion":"1.0.6800.16654","fileName":"YourCode_Work","jobId":"22a537ae-35e6-4abd-a57d-9dd0c273e81a","robotName":"Gaurav"}
19:50:48.8014 Info {"message":"YourCode_Prod execution ended","level":"Information","timeStamp":"2019-01-15T19:50:48.8005228+00:00","fingerprint":"b12b7d6f-cf3a-4e24-b1e6-1cf4413c12e2","windowsIdentity":"AD\\gaurav.m","machineName":"EDEV-3","processName":"YourCode_Prod","processVersion":"1.0.6800.16654","fileName":"YourCode_Work","jobId":"22a537ae-35e6-4abd-a57d-9dd0c273e81a","robotName":"Gaurav","totalExecutionTimeInSeconds":462,"totalExecutionTime":"00:07:42"}
I tried to generate a regex, but I could not find the right logic for it.
$searchpattern = [regex]"(?:(?:started))|(?:(?:ended))"
Get-Content -Path C:\Execution.log | ForEach-Object {
    if ($PSItem -match $searchpattern) {
        # Does the entry have a valid date/time?
        $time, $type, $data = $PSItem -split " ", 3
        $time
        $type
        $data
    }
}
I need to monitor this file, which will have many lines, but I want to take action only for "execution started" and "execution ended".
I can separate the message type, time, and data, but I need to drill down further into the data.
This script will run every 5 minutes, so I will compare the time minus 5 minutes and start reading the log from there; as soon as I find "ended" or "started" I will take the necessary action.
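A rough sketch of that 5-minute window (the log path is the one from the question; the HH:mm:ss.ffff format string and the assumption that the leading stamps are time-of-day only are mine):

```powershell
# Sketch: act only on entries stamped within the last 5 minutes.
# The log's leading stamp is time-of-day only, so ParseExact
# combines it with today's date.
$cutoff = (Get-Date).AddMinutes(-5)

Get-Content -Path C:\Execution.log | ForEach-Object {
    $time, $type, $data = $_ -split " ", 3
    $entryTime = [datetime]::ParseExact($time, "HH:mm:ss.ffff", $null)
    if ($entryTime -ge $cutoff -and $data -match 'execution (started|ended)') {
        # Take the started/ended action here.
        "$time -> $($Matches[1])"
    }
}
```

One caveat with this approach: a run shortly after midnight would need extra handling, since "now minus 5 minutes" can cross the date boundary while the stamps carry no date.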
Your log data is in JSON format, so you could simply process the extracted data as such.
Get-Content -Path C:\Execution.log | ForEach-Object {
    $time, $type, $json = $_ -split " ", 3
    $data = ConvertFrom-Json $json
    if ($data.message -match 'execution (started|ended)') {
        # do stuff
    }
}

Need my PowerShell logfile in a table format

I recently finished my script with the help of someone on this site (Matt). Thanks again!
I now need to somehow get the logfile into a table format, and I'm not sure how to implement that with the current setup of the script. Any ideas?
Write-Host "Report generated at $(Get-Date)"
Write-Host "Lower Environments Status Check"
# Preprocessing items
$msg = ""
$array = Get-Content C:\LowerEnvChecklist\appurls.txt
$log = "C:\LowerEnvChecklist\lowerenvironmentslog.txt"
$errorTexts = "error has occurred","Oops","Unable to display widget data","unexpected error occurred","temporarily unavailable","there was a problem"
$regex = ($errorTexts | ForEach-Object{ [regex]::Escape($_) }) -join "|"
Write-Host "Checking appurls.txt...One moment please."
"`n---------------------------------------------------------------------------" | Out-File $log -Append
Get-Date | Out-File $log -Append
"`n***Checking Links***" | Out-File $log -Append
"`n" | Out-File $log -Append
# Loop through each element of the array.
ForEach ($target in $array) {
    # Erase results from the previous pass in case of error.
    $result, $response, $stream, $page = $null
    # Navigate to the site URLs.
    $result = [System.Net.WebRequest]::Create($target)
    $response = $result.GetResponse()
    $stream = [System.IO.StreamReader]$response.GetResponseStream()
    $page = $stream.ReadToEnd()
    # Ensure login/authentication pages that give a 403 response still show as online.
    If ($response.StatusCode -eq 403) {
        $msg = " $target -----> is ONLINE!"
    # Determine if status code 200 pages are truly up based on the information above.
    } ElseIf ($response.StatusCode -eq 200) {
        # While the page might have rendered, we need to determine there are no errors present.
        If ($page -notmatch $regex) {
            $msg = " $target -----> is ONLINE!"
        } else {
            $msg = " $target -----> may be DOWN, please check!"
        }
    } else {
        $msg = " $target -----> may be DOWN, please check!"
    }
    # Log results.
    $msg | Out-File $log -Append -Width 120
    Write-Host $msg
    # Close the response.
    $response.Close()
}
# Write completion to logfile.
"`n" | Out-File $log -Append
"`n***Lower Environments Checklist Completed***" | Out-File $log -Append
# Write completion to host.
Write-Host "Lower Environments Checklist Completed"
# Open logfile once the script is complete.
Invoke-Item C:\LowerEnvChecklist\lowerenvironmentslog.txt
If you just want to view it in-script, you could run Out-GridView on your log file. This will open a new window showing the log data in a table-like view. Depending on your formatting, you may have to add extra items, like human-readable headers.
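For instance (a one-line sketch; the log path is the one used elsewhere in the script, and Out-GridView requires a Windows desktop session):

```powershell
# Show the existing text log in a grid window; each line becomes a row.
Get-Content C:\LowerEnvChecklist\lowerenvironmentslog.txt |
    Out-GridView -Title "Lower Environments Log"
```

The grid is sortable and filterable, but on a plain text log each row is a single unstructured string, which is why structured output (below) pays off.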
To whet your whistle with structured output, I opted to show you a CSV-based solution. Either way, all avenues require objects. What we do here is create a custom object that we populate as the script progresses. Each pass sends the details down the pipe. Using the pipeline, we can use Export-CSV to collect all of the data in a nice file. Even filtering is possible now.
Write-Host "Lower Environments Status Check"
# Preprocessing items
$array = Get-Content C:\LowerEnvChecklist\appurls.txt
$log = "C:\LowerEnvChecklist\lowerenvironmentslog.csv"
$errorTexts = "error has occurred","Oops","Unable to display widget data","unexpected error occurred","temporarily unavailable","there was a problem"
$regex = ($errorTexts | ForEach-Object{ [regex]::Escape($_) }) -join "|"
# Loop through each element of the array. Use the pipeline to make output easier.
$array | ForEach-Object{
    # Keep the variable $target so it is not lost in scopes. Build the object to be completed as we go.
    $target = [pscustomobject][ordered]@{
        URL = $_
        Status = ""
        Detail = "N/A"
        Timestamp = Get-Date
    }
    # Erase results from the previous pass in case of error.
    $result, $response, $stream, $page = $null
    # Navigate to the site URLs. If we can't access the site, mark it as down.
    $result = [System.Net.WebRequest]::Create($target.URL)
    $response = try{ $result.GetResponse() }catch{ $null }
    switch([int]$response.StatusCode){
        403{
            $target.Status = "OK"
            $target.Detail = "403"
        }
        200{
            # Get page content to confirm up status.
            $stream = [System.IO.StreamReader]$response.GetResponseStream()
            $page = $stream.ReadToEnd()
            # While the page might have rendered, we need to determine there are no errors present.
            If($page -notmatch $regex){
                $target.Status = "OK"
            } else {
                $target.Status = "DOWN"
                $target.Detail = "Pattern"
            }
        }
        default{
            $target.Status = "DOWN"
        }
    }
    # Send the object down the pipeline.
    $target
    # Close the response and stream. The objects might not exist, so check before calling the methods.
    if($response){ $response.Close() }
    if($stream){ $stream.Close() }
} | Export-CSV -Path $log -NoTypeInformation
# Write completion to host.
Write-Host "Lower Environments Checklist Completed"
# Open logfile once the script is complete.
Invoke-Item $log
I took the liberty of adding another column, called Detail, to give context. I'm not sure what you wanted from the date, but if you have plenty of URLs and processing time then I suppose it could be of use. Also, to reduce the if logic, I added a switch statement. This will be more useful if you react to other status codes down the road. Still, a good thing to know.
Sample Output
URL                      Status Detail  Timestamp
---                      ------ ------  ---------
https://7fweb            DOWN   N/A     1/11/2016 12:18:16 PM
http://www.google.ca     OK     N/A     1/11/2016 12:18:16 PM
http://www.microsoft.com DOWN   Pattern 1/11/2016 12:18:16 PM
I added "windows" to $errorTexts to trigger a pattern match for microsoft.com.

Local Groups and Members

I have a requirement to report the local groups and their members from a specific list of servers. I have the following script, which I pieced together from other scripts. When I run it, it writes the name of the server it is querying, the server's local group names, and the members of those groups. I would like to output the text to a file, but wherever I add the | Out-File command I get the error "An empty pipe element is not allowed". My secondary concern is whether the method I've chosen to report the server being queried will still work when outputting to a file. Will you please help correct this newbie's script errors?
$server = Get-Content "C:\Powershell\Local Groups\Test.txt"
Foreach ($server in $server)
{
    $computer = [ADSI]"WinNT://$server,computer"
    "
    "
    write-host "==========================="
    write-host "Server: $server"
    write-host "==========================="
    "
    "
    $computer.psbase.children | where { $_.psbase.schemaClassName -eq 'group' } | foreach {
        write-host $_.name
        write-host "------"
        $group = [ADSI]$_.psbase.Path
        $group.psbase.Invoke("Members") | foreach { $_.GetType().InvokeMember("Name", 'GetProperty', $null, $_, $null) }
        write-host **
        write-host
    }
}
Thanks,
Kevin
You say that you are using Out-File and getting that error, but you don't show where in your code it is being called from.
Given the code you have, my best guess is that you were trying something like this:
Foreach ($server in $server){
    # All the code in this block
} | Out-File c:\pathto.txt
I wish I had a technical reference for this interpretation, but alas, I have not found one (I think it has to do with older PowerShell versions). In my experience, that construct passes nothing to standard output. As an aside, ($server in $server) is misleading even if it works. Might I suggest this small change; let me know if it works:
$servers = Get-Content "C:\Powershell\Local Groups\Test.txt"
$servers | ForEach-Object{
    $server = $_
    # Rest of code inside the block stays the same
} | Out-File c:\pathto.txt
If that is not your speed, I would also consider building an empty array outside the block and populating it on each loop pass.
# Declare an empty array to hold the results
$results = @()
Foreach ($server in $server){
    # Code before this line
    $results += $group.psbase.Invoke("Members") | foreach { $_.GetType().InvokeMember("Name", 'GetProperty', $null, $_, $null) }
    # Code after this line
}
$results | Set-Content c:\pathto.txt
Worthy Note
You are mixing console output with standard output. Depending on what you want to do with the script, you will not get the output you expect. If you want lines like write-host "Server: $server" to be in the output file, then you need to use Write-Output instead.
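A minimal illustration of the difference (the function name and file path are placeholders of mine):

```powershell
function Get-Report {
    Write-Host   "painted straight onto the console"   # bypasses the pipeline
    Write-Output "sent to standard output"             # travels down the pipeline
}

# Only the Write-Output line ends up in the file;
# the Write-Host line appears on screen and nowhere else.
Get-Report | Out-File c:\pathto.txt
```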

Powershell: Search data in *.txt files to export into *.csv

First of all, this is my first question here. I often come here to browse existing topics, but now I'm stuck on my own problem, and I haven't found a helpful resource so far. My biggest concern is that it might not work in PowerShell at all... At the moment I am trying to build a small PowerShell tool to save myself a lot of time. For those who don't know cw-sysinfo, it is a tool that collects information about any host system (e.g. Hardware ID, Product Key, and the like) and generates *.txt files.
My point is, if you have 20, 30, or 80 servers in a project, it takes a huge amount of time to browse all the files, pick out just the lines you need, and put them together in a *.csv file.
What I have working is more the basis of the tool: it browses all *.txt files in a specific path and checks for my keywords. And here is the problem: I can only search for the words prior to those I really need, as seen below:
Operating System: Windows XP
Product Type: Professional
Service Pack: Service Pack 3
...
I don't know how to tell PowerShell to search for the "Product Type:" line and pick out the "Professional" that follows it. Later on, with keys or serial numbers, it will be the same problem, which is why I can't just search for "Standard" or "Professional".
I placed my keywords ($controls) in an extra file that I can attach to the project folders, so I don't need to edit the script each time. The code looks like this:
Function getStringMatch
{
    # Loop through the project directory
    Foreach ($file In $files)
    {
        # Check all keywords
        ForEach ($control In $controls)
        {
            $result = Get-Content $file.FullName | Select-String $control -Quiet -CaseSensitive
            If ($result -eq $True)
            {
                $match = $file.FullName
                # Write the filename according to the entry
                "Found : $control in: $match" | Out-File $output -Append
            }
        }
    }
}

getStringMatch
I think this is the kind of thing you need. I've changed Select-String to not use the -quiet option; this returns a matches object, one of whose properties is the line. I then split the line on the ':' and trim any spaces. These results are placed into a new PSObject, which in turn is added to an array. The array is put back on the pipeline at the end.
I also moved the call to Get-Content so that each file is read only once.
# Create an array for results
$results = @()
# Loop through the project directory
Foreach ($file In $files)
{
    # Load the content once
    $content = Get-Content $file.FullName
    # Check all keywords
    ForEach ($control In $controls)
    {
        # Find the line containing the control string
        $result = $content | Select-String $control -CaseSensitive
        If ($result)
        {
            # Tidy up the results and add them to the array
            $line = $result.Line -split ":"
            $results += New-Object PSObject -Property @{
                FileName = $file.FullName
                Control = $line[0].Trim()
                Value = $line[1].Trim()
            }
        }
    }
}
# Return the results
$results
Adding the results to a CSV is just a case of piping them to Export-Csv:
$results | Export-Csv -Path "results.csv" -NoTypeInformation
If I understand your question correctly, you want some way to parse each line of your report files and extract values for some "keys". Here are a few lines to give you an idea of how you could proceed. The example is for one file but can be generalized very easily.
$config = Get-Content ".\config.txt"
# The stuff you are searching for
$keys = @(
    "Operating System",
    "Product Type",
    "Service Pack"
)

foreach ($line in $config)
{
    $keys | ForEach-Object{
        $regex = "\s*?$($_)\:\s*(?<value>.*?)\s*$"
        if ($line -match $regex)
        {
            $value = $matches.value
            Write-Host "Key: $_`t`tValue: $value"
        }
    }
}

Sharepoint & PowerShell - StringBuilder.Replace

I'm using StringBuilder.Replace in a PowerShell script to strip out line breaks in text fields before outputting them to a log file. Below is an example of what I'm using, and it works perfectly in our development environment. However, in the live environment, no line breaks are stripped out at all. Does anyone know what could cause it to differ from environment to environment? There is a lot more content on the live server, but since the actual system is identical to dev, the text fields themselves are the same.
$log = "C:\mylogfile.csv"
$newline = [System.Environment]::NewLine
$sb2 = New-Object System.Text.StringBuilder
$sb2.Append("Text fields")
$sb2.Replace($newline,".")
$sb2.ToString() | Out-File $log -Append
OK, sod's law that I'd find a solution shortly after posting here..!
The following works for me. I'd experimented with `r and `n with no luck, but by doing both of them together with NewLine, all line breaks are now being stripped out:
$log = "C:\mylogfile.csv"
$newline = [System.Environment]::NewLine
$charsToStrip = "`r", "`n", $newline
$sb2 = New-Object System.Text.StringBuilder
$null = $sb2.Append("Text fields")
foreach ($char in $charsToStrip)
{
    # Replace returns the StringBuilder itself; discard it to keep the output clean.
    $null = $sb2.Replace($char, ".")
}
$sb2.ToString() | Out-File $log -Append
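As an aside (my own variation, not part of the answer above): because PowerShell's -replace operator takes a regular expression, the same stripping can be done in a single pass without a StringBuilder. The pattern [\r\n]+ matches any run of CR and/or LF characters, so `r, `n, and `r`n are all covered at once:

```powershell
$log = "C:\mylogfile.csv"
# One regex pass: any run of CR and/or LF characters becomes a single dot.
"Text fields" -replace '[\r\n]+', '.' | Out-File $log -Append
```

This also sidesteps the environment difference entirely, since it no longer matters whether a given server's fields use `r`n or bare `n line endings.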