Get HTTP response code when response code does not equal 200 - PowerShell

I'm trying to write a script that loops through a CSV file with a bunch of links in it. I then want to check with Invoke-WebRequest whether each page is reachable. This works fine when the HTTP response code is 200, but I also want the script to write out the response code when it is something like 4xx, 3xx, 5xx, etc. So far the response code variable is just empty.
The problem is that when Invoke-WebRequest gets anything other than a 200 (which is the whole point of the script) it generates an error instead of giving me a clean output with the HTTP response code.
My script so far:
$Links = Import-Csv -Path 'Path\to\file.csv' -Delimiter ";"
$user = Get-Credential

function Check-Link {
    $name_de = $Link.title_de
    $link_de = $Link.url_de
    Write-Host $link_de
    $HTTP_Response = (Invoke-WebRequest $Link.url_de -Credential $user -Method Head).StatusCode
    if ($HTTP_Response -ne 200) {
        Write-Host "The page $name_de is not reachable! Response Code: $HTTP_Response"
    }
    else {
        Write-Host "The page $name_de is reachable!"
    }
}

$DeadLinks
foreach ($Link in $Links) {
    Check-Link
    Start-Sleep 10
}
Now when the response code does NOT equal 200 the output is:
www.example.com
The page example.com is not reachable! Response Code:
The expected output would be:
www.example.com
The page example.com is not reachable! Response Code: 401
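A minimal sketch of the usual fix (not part of the original question; it assumes Windows PowerShell, where any non-success status makes Invoke-WebRequest throw a WebException): run the request with -ErrorAction Stop inside try/catch and, in the catch block, read the code off the exception's response:
try {
    $HTTP_Response = (Invoke-WebRequest $Link.url_de -Credential $user -Method Head -ErrorAction Stop).StatusCode
}
catch {
    # 4xx/5xx answers land here; the server's reply rides on the exception,
    # and the [int] cast turns e.g. Unauthorized into 401 (0 means no reply at all)
    $HTTP_Response = [int]$_.Exception.Response.StatusCode
}
Dropping this into Check-Link in place of the single Invoke-WebRequest line fills $HTTP_Response for error pages as well. The related answers below cover the same pattern in more detail.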

Related

URL health-check PowerShell script correctly gets HTTP 200 on most sites, but incorrect '0' status code on some...API timeout issue?

I have a URL health-checking PowerShell script which correctly gets an HTTP 200 status code on most of my intranet sites, but a '0' status code is returned on a small minority of them. According to my research of questions from others who have written similar URL-checking scripts, the '0' code is an API return rather than something from the web site itself. Thinking this must be a timeout issue, where the API returns '0' before the slowly-responding web site returns its 200, I researched yet more questions on this subject on SO and implemented a suggestion to insert a timeout in the script. The timeout setting, though, doesn't help, no matter how high I set the value. I still get the same '0' "response" code from the same web sites, even though those sites are up and running when checked from any regular web browser. Any thoughts on how I could tweak the timeout setting in the script below in order to get the correct 200 response code?
The Script:
$URLListFile = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$URLList = Get-Content $URLListFile -ErrorAction SilentlyContinue

#if((test-path $reportpath) -like $false)
#{
#new-item $reportpath -type file
#}

# For every URL in the list
$result = foreach ($Uri in $URLList) {
    try {
        # For proxy systems
        [System.Net.WebRequest]::DefaultWebProxy = [System.Net.WebRequest]::GetSystemWebProxy()
        [System.Net.WebRequest]::DefaultWebProxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
        # Web request
        $req = [System.Net.WebRequest]::Create($uri)
        $req.Timeout = 5000
        $res = $req.GetResponse()
    }
    catch {
        # Err handling
        $res = $_.Exception.Response
    }
    $req = $null
    # Getting HTTP status code
    $int = [int]$res.StatusCode
    # Output a formatted string to capture in variable $result
    "$int - $uri"
    # Disposing response if available
    if ($res) {
        $res.Dispose()
    }
}
# Output on screen
$result
# Output to log file
$result | Set-Content -Path "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt" -Force
Current output:
200 - http://192.168.1.1/
200 - http://192.168.1.2/
200 - http://192.168.1.250/config/authentication_page.htm
0 - https://192.168.1.50/
200 - http://app1-vip-http.dev.local/
0 - https://CA/certsrv/Default.asp
Perhaps the PowerShell cmdlet Invoke-WebRequest works better for you. It has many more parameters and switches to play around with, like ProxyUseDefaultCredentials and DisableKeepAlive.
$pathIn = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$pathOut = "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt"
$URLList = Get-Content -Path $pathIn
$result = foreach ($uri in $URLList) {
    try {
        $res = Invoke-WebRequest -Uri $uri -UseDefaultCredentials -UseBasicParsing -Method Head -TimeoutSec 5 -ErrorAction Stop
        $status = [int]$res.StatusCode
    }
    catch {
        $status = [int]$_.Exception.Response.StatusCode.value__
    }
    # output a formatted string to capture in variable $result
    "$status - $uri"
}
# output on screen
$result
#output to log file
$result | Set-Content -Path $pathOut -Force
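One caveat (an addition to this answer, not from it): a '0' usually means the exception carried no HTTP response at all - for instance a TLS or certificate failure on the https:// entries in the list - so $_.Exception.Response is $null and the [int] cast yields 0. A sketch of a catch block that makes this case visible, using the same loop variables as above:
try {
    $res = Invoke-WebRequest -Uri $uri -UseDefaultCredentials -UseBasicParsing -Method Head -TimeoutSec 5 -ErrorAction Stop
    $status = [int]$res.StatusCode
}
catch {
    if ($_.Exception.Response) {
        # the server answered, just not with a success code (4xx/5xx)
        $status = [int]$_.Exception.Response.StatusCode.value__
    }
    else {
        # no HTTP response at all (DNS, TLS/certificate, timeout) - the source of the '0'
        $status = $_.Exception.Message
    }
}
"$status - $uri"
If the failures are TLS-related, no timeout value will change the result; the underlying connection error is what has to be addressed.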

PowerShell - Test Multiple URLs from TXT/CSV file and record HTTP Code into a CSV file for reporting

Hoping someone can help with this.
I built a script based on this link
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$urlStr = Read-Host "Please enter URL to check"
[uri]$urlStr
# First we create the request.
$HTTPS_Request = [System.Net.WebRequest]::Create("$urlStr")
# We then get a response from the site.
$HTTPS_Response = $HTTPS_Request.GetResponse()
# We then get the HTTP code as an integer.
$HTTPS_Status = [int]$HTTPS_Response.StatusCode
$HTTPS_StatusDesc = [string]$HTTPS_Response.StatusDescription
#Write-Host "HTTP CODE: $HTTPS_Status"
do {
    If ($HTTPS_Status -eq 301) {
        Write-Host "HTTP CODE: $HTTPS_Status"
        Write-Host "HTTP CODE DESCRIPTION: $HTTPS_StatusDesc"
        Write-Host "Landing page moved permanently and redirects to another URL."
        Write-Host "Please update Landing page to new URL"
    }
    ElseIf ($HTTPS_Status -eq 302) {
        Write-Host "HTTP CODE: $HTTPS_Status"
        Write-Host "HTTP CODE DESCRIPTION: $HTTPS_StatusDesc"
        Write-Host "If this occurs once, then no issues"
        Write-Host "If this occurs more than once, please update Landing page to new URL"
    }
} while ($HTTPS_Status -ge 300 -and $HTTPS_Status -lt 400)
If ($HTTPS_Status -eq 200) {
    Write-Host "HTTP CODE: $HTTPS_Status"
    Write-Host "HTTP CODE DESCRIPTION: $HTTPS_StatusDesc"
    Write-Host "Landed on page"
}
ElseIf ($HTTPS_Status -gt 400) {
    Write-Host "HTTP CODE: $HTTPS_Status"
    Write-Host "HTTP CODE DESCRIPTION: $HTTPS_StatusDesc"
    Write-Host "Error - issue with Landing page. Please investigate."
}
# Finally, we clean up the http request by closing it.
$HTTPS_Response.Close()
$HTTPS_Response.Dispose()
#Read-Host -Prompt "Press Enter to exit"
Currently, the above is built to handle one URL at a time, which was fine for me as the number of links and the overall usage of the script were low, so I left it as is.
But as the usage and the number of URLs increase, I am hoping to have the code modified to run through multiple URLs in one go.
My idea is to save the URLs into a TXT or CSV file, have the script read it line by line, run the check per line, and then record the resulting HTTP code (e.g. 200, 404, etc...) into a CSV (or the original CSV file) for reporting.
If possible, I'd like to record the output from "$HTTP_Response" and add that in as well, but this would be a secondary objective.
Any help would be much appreciated.
Thanks.
Rajiv.
First thing you wanna do is turn your script into a function, with the URL as a parameter!
function Get-HTTPResponseCode
{
    param(
        [Parameter(Mandatory = $true, Position = 0)]
        [uri]$Url,
        [switch]$Quiet
    )

    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
    try {
        # First we create the request.
        $HTTPS_Request = [System.Net.WebRequest]::Create("$Url")
        # We then get a response from the site.
        $HTTPS_Response = $HTTPS_Request.GetResponse()
        # We then get the HTTP code as an integer.
        $HTTPS_Status = [int]$HTTPS_Response.StatusCode
        $HTTPS_StatusDesc = [string]$HTTPS_Response.StatusDescription
    }
    finally {
        # Finally, we clean up the http request by closing it
        # ($HTTPS_Response stays null when GetResponse() throws, hence the guard).
        if ($HTTPS_Response) {
            $HTTPS_Response.Close()
            $HTTPS_Response.Dispose()
        }
    }
    if (-not $Quiet) {
        if ($HTTPS_Status -eq 301) {
            Write-Host "HTTP CODE: $HTTPS_Status"
            Write-Host "HTTP CODE DESCRIPTION: $HTTPS_StatusDesc"
            Write-Host "Landing page moved permanently and redirects to another URL."
            Write-Host "Please update Landing page to new URL"
        }
        elseif ($HTTPS_Status -eq 302) {
            Write-Host "HTTP CODE: $HTTPS_Status"
            Write-Host "HTTP CODE DESCRIPTION: $HTTPS_StatusDesc"
            Write-Host "If this occurs once, then no issues"
            Write-Host "If this occurs more than once, please update Landing page to new URL"
        }
        elseif ($HTTPS_Status -eq 200) {
            Write-Host "HTTP CODE: $HTTPS_Status"
            Write-Host "HTTP CODE DESCRIPTION: $HTTPS_StatusDesc"
            Write-Host "Landed on page"
        }
        elseif ($HTTPS_Status -gt 400) {
            Write-Host "HTTP CODE: $HTTPS_Status"
            Write-Host "HTTP CODE DESCRIPTION: $HTTPS_StatusDesc"
            Write-Host "Error - issue with Landing page. Please investigate."
        }
    }
    # return the response code
    return $HTTPS_Status
}
Now that we have an easily reusable function, we can do interesting things, like using it in a calculated property:
$URLs = @(
    'https://www.stackoverflow.com'
    'https://www.stackexchange.com'
)
$URLs | Select-Object @{Name='URL';Expression={$_}}, @{Name='Status'; Expression={Get-HTTPResponseCode -Url $_}} | Export-Csv .\path\to\result.csv -NoTypeInformation
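One detail to watch (my note, not the answerer's): called without -Quiet, the function also emits all its Write-Host output while the CSV is being built. For a silent export, pass the switch inside the calculated property:
$URLs | Select-Object @{Name='URL';Expression={$_}}, @{Name='Status'; Expression={Get-HTTPResponseCode -Url $_ -Quiet}} | Export-Csv .\path\to\result.csv -NoTypeInformation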
If you want to store the input URLs separately, simply put them in a file, one per line, and then use Get-Content to read them from disk:
$URLs = Get-Content .\path\to\file\with\urls.txt

Trying to Use PowerShell Script to test availability of website

I am trying to use a PowerShell script to test the availability of certain websites. I have a script here that writes "Site is OK" if the site returns an HTTP 200 code. It should write "The Site may be down, please check!" if it returns any other code. I put in 'https://www.google.com/cas76', which should obviously return a 404 error, yet the script returns "Site is OK". How should I go about fixing my code so it returns "The Site may be down, please check!"?
Tried putting in websites that would obviously not return a 200 code.
# First we create the request.
$HTTP_Request = [System.Net.WebRequest]::Create('https://www.google.com/cas76')
# We then get a response from the site.
$HTTP_Response = $HTTP_Request.GetResponse()
# We then get the HTTP code as an integer.
$HTTP_Status = [int]$HTTP_Response.StatusCode
If ($HTTP_Status -eq 200) {
    Write-Host "Site is OK!"
}
Else {
    Write-Host "The Site may be down, please check!"
}
# Finally, we clean up the http request by closing it.
$HTTP_Response.Close()
Code acknowledges that there is a 404 error
Exception calling "GetResponse" with "0" argument(s): "The remote server returned an error: (404) Not Found."
At C:\Users\TX394UT\Desktop\Web_Bot_Project\WebsiteMonitoring.ps1:6 char:1
+ $HTTP_Response = $HTTP_Request.GetResponse()
However, "Site is OK!" prints on the console.
the way that the webclient handles returned errors is ... odd. [grin]
that message is a non-terminating error, so it has to be promoted to a terminating one before Try/Catch can see it - hence the -ErrorAction Stop below. when capturing the exception, the 404 StatusCode is converted to the text value for it - NotFound.
$TestUrl = 'https://www.google.com/cas76'
try
{
    $Response = (Invoke-WebRequest -Uri $TestUrl -ErrorAction Stop).StatusCode
}
catch
{
    $Response = $_.Exception.Response.StatusCode
}
$Response
output for the bad url = NotFound
output for a good url = 200
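if you'd rather log the numeric code in both cases, cast the enum to [int] in the catch block - a small variation on the code above, not part of the original answer:
$TestUrl = 'https://www.google.com/cas76'
try
{
    $Response = (Invoke-WebRequest -Uri $TestUrl -ErrorAction Stop).StatusCode
}
catch
{
    # StatusCode is a [System.Net.HttpStatusCode] enum; [int] turns NotFound into 404
    $Response = [int]$_.Exception.Response.StatusCode
}
$Response    # 404 for the bad url, 200 for a good one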

How to check status of list of Websites / URLs? (Using Power-Shell script)

I want to take the HTTP status of multiple URLs at once to prepare a report. How to achieve it using PowerShell?
I have seen questions on monitoring multiple websites through a Windows machine. My friend wanted to check the status of 200 URLs, which he used to do manually. I wrote a PowerShell script to overcome this. Posting it for the benefit of all users.
Save the below code as "AnyName.ps1" file in "D:\MonitorWeb\"
# Place URL list file in the below path
$URLListFile = "D:\MonitorWeb\URLList.txt"
$URLList = Get-Content $URLListFile -ErrorAction SilentlyContinue
# For every URL in the list
Foreach ($Uri in $URLList) {
    try {
        # For proxy systems
        [System.Net.WebRequest]::DefaultWebProxy = [System.Net.WebRequest]::GetSystemWebProxy()
        [System.Net.WebRequest]::DefaultWebProxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
        # Web request
        $req = [System.Net.WebRequest]::Create($uri)
        $res = $req.GetResponse()
    }
    catch {
        # Err handling
        $res = $_.Exception.Response
    }
    $req = $null
    # Getting HTTP status code
    $int = [int]$res.StatusCode
    # Writing on the screen
    Write-Host "$int - $uri"
    # Disposing response if available
    if ($res) {
        $res.Dispose()
    }
}
Place a file "URLList.txt" with list of URLs in the same directory "D:\MonitorWeb\"
e.g:
http://www.google.com
http://google.com
http://flexboxfroggy.com
http://www.lazyquestion.com/interview-questions-and-answer/es6
Now open a normal command prompt, navigate to "D:\MonitorWeb\", and type the command below:
powershell -executionpolicy bypass -File .\AnyName.ps1
And for each URL you'll get an output line like "200 - http://www.google.com" (HTTP status 200 = OK).
Note:
This works in PowerShell version 2 as well.
Works even if you are behind a proxy (it uses the default proxy settings).

PowerShell Command: HttpWebRequest gets stuck after fetching two requests

I have a list of URLs in a text file and I want to test whether all of them are reachable or not. I fired the following command in Windows PowerShell, but somehow, after displaying the status of the first two requests, the command gets stuck and never returns. Am I missing something?
cat .\Test.txt | % { [system.Net.WebRequest]::Create("$_").GetResponse().StatusCode }
Text File
http://www.google.com
http://www.yahoo.com
http://www.bing.com
Output:
OK
OK
----> after that it gets stuck.
use Invoke-WebRequest instead:
$sites = 'http://www.google.com', 'http://www.yahoo.com', 'http://www.bing.com'
foreach ($site in $sites) {
    Invoke-WebRequest $site
    $site
}
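(Presumably this avoids the hang because Invoke-WebRequest reads and disposes the underlying response for you, so no connection is left open - see the note about closing the response below.)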
From memory: You have to explicitly close the Response stream:
$req = [System.Net.HttpWebRequest]::Create($aRequestUrl);
$response = $null
try
{
    $response = $req.GetResponse()
    # do something with the response
}
finally
{
    # Clear the response, otherwise the next HttpWebRequest may fail... (don't know why)
    if ($response -ne $null) { $response.Close() }
}