PowerShell Command: HttpWebRequest gets stuck after fetching two requests

I have a list of URLs in a text file and I want to test whether all of them are reachable. I ran the following command in Windows PowerShell, but after displaying the status of the first two requests, the command gets stuck somewhere and never returns. Am I missing something?
cat .\Test.txt | % { [system.Net.WebRequest]::Create("$_").GetResponse().StatusCode }
Text File
http://www.google.com
http://www.yahoo.com
http://www.bing.com
Output:
OK
OK
----> after that it gets stuck.

Use Invoke-WebRequest instead:
$sites = 'http://www.google.com','http://www.yahoo.com','http://www.bing.com'
foreach ($site in $sites) {
    Invoke-WebRequest $site
    $site
}

From memory: You have to explicitly close the Response stream:
$req = [System.Net.HttpWebRequest]::Create($aRequestUrl)
$response = $null
try
{
    $response = $req.GetResponse()
    # do something with the response
}
finally
{
    # Close the response, otherwise the next HttpWebRequest may hang:
    # the pooled connection is only freed once the response is closed
    if ($response -ne $null) { $response.Close() }
}
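Applied to the original one-liner, the fix looks like this (a minimal sketch that keeps the pipeline shape of the question):
cat .\Test.txt | % {
    $res = [System.Net.WebRequest]::Create($_).GetResponse()
    $res.StatusCode
    $res.Close()
}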

Related

URL health-check PowerShell script correctly gets HTTP 200 on most sites, but incorrect '0' status code on some...API timeout issue?

I have a URL health-checking PowerShell script which correctly gets an HTTP 200 status code on most of my intranet sites, but returns a '0' status code on a small minority of them. According to my research of questions from others who have written similar URL-checking scripts, the '0' code is an API return rather than coming from the web site itself.

Thinking this must be a timeout issue, where the API returns '0' before the slowly-responding web site returns its 200, I researched more questions on this subject and implemented a suggestion to insert a timeout into the script. The timeout setting doesn't help, though: no matter how high I set the value, I still get the same '0' "response" code from the same web sites, even though those sites are up and running when checked from any regular web browser. Any thoughts on how I could tweak the timeout setting in the script below to get the correct 200 response code?
The Script:
$URLListFile = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$URLList = Get-Content $URLListFile -ErrorAction SilentlyContinue
#if((test-path $reportpath) -like $false)
#{
#new-item $reportpath -type file
#}
#For every URL in the list
$result = foreach($Uri in $URLList) {
    try{
        #For proxy systems
        [System.Net.WebRequest]::DefaultWebProxy = [System.Net.WebRequest]::GetSystemWebProxy()
        [System.Net.WebRequest]::DefaultWebProxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
        #Web request
        $req = [system.Net.WebRequest]::Create($uri)
        $req.Timeout = 5000
        $res = $req.GetResponse()
    }
    catch {
        #Err handling
        $res = $_.Exception.Response
    }
    $req = $null
    #Getting HTTP status code
    $int = [int]$res.StatusCode
    # output a formatted string to capture in variable $result
    "$int - $uri"
    #Disposing response if available
    if($res){
        $res.Dispose()
    }
}
# output on screen
$result
#output to log file
$result | Set-Content -Path "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt" -Force
Current output:
200 - http://192.168.1.1/
200 - http://192.168.1.2/
200 - http://192.168.1.250/config/authentication_page.htm
0 - https://192.168.1.50/
200 - http://app1-vip-http.dev.local/
0 - https://CA/certsrv/Default.asp
Perhaps using the PowerShell cmdlet Invoke-WebRequest works better for you. It has many more parameters and switches to play around with, like ProxyUseDefaultCredentials and DisableKeepAlive:
$pathIn = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$pathOut = "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt"
$URLList = Get-Content -Path $pathIn
$result = foreach ($uri in $URLList) {
    try{
        $res = Invoke-WebRequest -Uri $uri -UseDefaultCredentials -UseBasicParsing -Method Head -TimeoutSec 5 -ErrorAction Stop
        $status = [int]$res.StatusCode
    }
    catch {
        $status = [int]$_.Exception.Response.StatusCode.value__
    }
    # output a formatted string to capture in variable $result
    "$status - $uri"
}
# output on screen
$result
#output to log file
$result | Set-Content -Path $pathOut -Force
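As for the '0' results on the two https URLs: a 0 typically means no HTTP response was received at all (for example, the TLS certificate isn't trusted), so $_.Exception.Response is $null and casting the missing status code to [int] produces 0. A sketch of the same loop that surfaces the underlying error instead (the ERR prefix is just an illustrative convention):
$result = foreach ($uri in $URLList) {
    try {
        $res = Invoke-WebRequest -Uri $uri -UseDefaultCredentials -UseBasicParsing -Method Head -TimeoutSec 5 -ErrorAction Stop
        "$([int]$res.StatusCode) - $uri"
    }
    catch {
        if ($_.Exception.Response) {
            # the server did answer, just with an error status (404, 503, ...)
            "$([int]$_.Exception.Response.StatusCode.value__) - $uri"
        }
        else {
            # no HTTP response at all: DNS, TLS or connection failure - this is what casts to 0
            "ERR - $uri ($($_.Exception.Message))"
        }
    }
}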

HTTP status codes via PowerShell

Somehow the code below works fine for the first few test URLs in C:\testurl.txt, then hangs forever when it reaches the 4th URL. Any idea why it hangs?
It already works fine for up to 3 URLs but gets stuck from the 4th onward.
CLS
$urllist = Get-Content "C:\testurl.txt" # URLs to test one in each line
foreach ($url in $urllist) {
    Write-Host $url
    $req = [System.Net.WebRequest]::Create($url)
    try {
        $res = $req.GetResponse()
    } catch [System.Net.WebException] {
        $res = $_.Exception.Response
    }
    $res.StatusCode
    #Print OK or whatever
    [int]$res.StatusCode
    #Print 200 or whatever
}
It works fine for up to 3 URLs but the script hangs on the 4th URL without any output or error message. Here is an example c:\testurl.txt:
http://www.google.com
http://www.google.com
http://www.google.com
http://www.google.com
http://www.hotmail.com
http://www.gmail.com
http://www.yahoo.com
http://www.msn.com
Please note each URL is on its own line. You will see that the script stops at the 4th one; you may try with your own URLs, etc. too.
then it hung up forever
No - it's hung until the underlying TCP connections of the previous requests time out.
The .NET CLR will internally pool all WebRequest dispatches so that only a finite number of external requests will be initiated concurrently, and as long as you have a number of un-closed WebResponse objects in memory, your requests will start queuing up.
You can avoid this by closing them (as you should):
foreach ($url in $urllist) {
    Write-Host $url
    $req = [System.Net.WebRequest]::Create($url)
    try {
        $res = $req.GetResponse()
    }
    catch [System.Net.WebException] {
        $res = $_.Exception.Response
    }
    finally {
        $res.StatusCode
        #Print OK or whatever
        [int]$res.StatusCode
        #Print 200 or whatever
        $res.Dispose()
        # close connection, dispose of response stream
    }
}
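The size of that internal pool is governed by System.Net.ServicePointManager. Raising the per-host limit can mask the symptom while you hunt down undisposed responses (a sketch; closing the responses, as above, remains the real fix):
# allow more concurrent connections per host (the default for client apps is 2)
[System.Net.ServicePointManager]::DefaultConnectionLimit = 16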

Measure response time using Invoke-WebRequest similar to curl

I have a curl command which reports response time, broken down by each action involved in invoking a service:
curl -w "@sample.txt" -o /dev/null someservice-call
I want to measure the response time in a similar way using PowerShell's built-in Invoke-WebRequest call. So far I am able to get the total response time using Measure-Command. Can someone please help me with this?
Content of sample.txt used in curl:
time_namelookup: %{time_namelookup}\n
time_connect: %{time_connect}\n
time_appconnect: %{time_appconnect}\n
time_pretransfer: %{time_pretransfer}\n
time_redirect: %{time_redirect}\n
time_starttransfer: %{time_starttransfer}\n
----------\n
time_total: %{time_total}\n
Time in milliseconds:
$url = "google.com"
(Measure-Command -Expression { $site = Invoke-WebRequest -Uri $url -UseBasicParsing }).TotalMilliseconds
(Use TotalMilliseconds rather than Milliseconds: the latter is only the milliseconds component of the elapsed time, not the total.)
This seems to do it without any noticeable overhead:
$StartTime = $(get-date)
Invoke-WebRequest -Uri "google.com" -UseBasicParsing
Write-Output ("{0}" -f ($(get-date)-$StartTime))
As the other solutions point out, there is a performance catch when using PowerShell alone.
The most efficient solution would probably be to write some C# with the measurement built in. But when it isn't compiled beforehand, the loading time increases dramatically while the C# is compiled.
But there is another way.
Since you can use almost all .NET constructs within PowerShell, you can write the same request and measurement logic in PowerShell itself.
I have written a small function which should do the trick:
# Windows PowerShell may first need: Add-Type -AssemblyName System.Net.Http
function Measure-PostRequest {
    param(
        [string] $Url,
        [byte[]] $Bytes,
        [switch] $Block
    )
    $content = [Net.Http.ByteArrayContent]::new($Bytes)
    $client = [Net.Http.HttpClient]::new()
    $stopwatch = [Diagnostics.Stopwatch]::new()
    $result = $null
    if ($Block) {
        # will block and thus not allow ctrl+c to kill the process
        $stopwatch.Start()
        $result = $client.PostAsync($Url, $content).GetAwaiter().GetResult()
        $stopwatch.Stop()
    } else {
        $stopwatch.Start()
        $task = $client.PostAsync($Url, $content)
        # Task implements IAsyncResult explicitly, so cast to reach AsyncWaitHandle;
        # polling keeps the pipeline responsive to ctrl+c
        while (-not ([IAsyncResult]$task).AsyncWaitHandle.WaitOne(200)) { }
        $result = $task.GetAwaiter().GetResult()
        $stopwatch.Stop()
    }
    [PSCustomObject]@{
        Response     = $result
        Milliseconds = $stopwatch.ElapsedMilliseconds
    }
}
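A hypothetical invocation (the URL and JSON payload are placeholders):
$payload = [Text.Encoding]::UTF8.GetBytes('{"ping":true}')
Measure-PostRequest -Url 'http://example.com/api' -Bytes $payload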

Restart application pool based on http response code

I am trying to write a PowerShell script that will restart an application pool in IIS if a 503 response code is received.
So far I have managed to retrieve the response code for every CRM application under the default website in IIS. However, I am unsure how to find the application pool name. I've tried the below, but it returns the same application pool for each site. Can anyone help?
$getSite = (Get-WebApplication -Site 'Default Web Site')
$SiteURL = ForEach ($site in $getSite.path) {("http://localhost")+$site}
ForEach ($crm in $SiteURL){
    $req = [system.Net.WebRequest]::Create($crm)
    try {
        $res = $req.GetResponse()
    } catch [System.Net.WebException] {
        $res = $_.Exception.Response
    }
    $ApplicationPool = ForEach ($app in $getSite.applicationpool) {$app}
    if([int]$res.StatusCode -eq 503) {write-host ($crm + ' ' + [int]$res.StatusCode) + $app}
}
I think you need to access $_.Exception.InnerException for the Response property.
Your $ApplicationPool assignment doesn't make much sense, as you only need one application pool name per $crm app you test:
foreach($App in @(Get-WebApplication -Site 'Default Web Site')){
    # Uri for the application
    $TestUri = 'http://localhost{0}' -f $App.path
    # Create WebRequest
    $Request = [system.Net.WebRequest]::Create($TestUri)
    try {
        # Get the response
        $Response = $Request.GetResponse()
    } catch [System.Net.WebException] {
        # If it fails, get Response from the Exception
        $Response = $_.Exception.InnerException.Response
    }
    # The numerical value of the StatusCode value is the HTTP status code, ie. 503
    if(503 -eq ($Response.StatusCode -as [int])){
        # Restart the app pool
        Restart-WebAppPool -Name $App.applicationPool
    }
}
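Note that Get-WebApplication and Restart-WebAppPool ship with IIS's WebAdministration module, so the script may need to load it first:
Import-Module WebAdministration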

Test app link with PowerShell

I am using PowerShell to do some monitoring and I want to check whether an application's jnlp exists on a website and is available for downloading.
I have the link to the .jnlp and so far I'm downloading the file with .navigate():
$ie = new-object -com "InternetExplorer.Application"
Try {
    $ie.navigate("http://bla.com/testApp.jnlp")
} Catch {
    #$_
    $ErrorMessage = $_.Exception.Message
}
I tried to catch an exception by giving an invalid filename, but it doesn't work. I also thought of downloading the app and trying to delete the file afterwards to check that it actually exists, but that would be too slow since I have many jnlps to check.
Is there a simpler, more elegant way to do this? I want to avoid downloading each file I want to test.
How about using the WebClient class from .NET? Getting data is simple enough. Like so:
$webclient = new-object System.Net.WebClient
try {
    # Download data as string and store the result into $data
    $data = $webclient.DownloadString("http://www.google.com/")
} catch [Net.WebException] {
    # A 404 or some other error occurred, process the exception here
    $ex = $_
    $ex.Exception
}
If you're using PowerShell 3.0 or higher, you can use Invoke-WebRequest to see if a page exists by issuing an HTTP HEAD request and checking the status code.
$Result = Invoke-WebRequest -Uri 'http://bla.com/testApp.jnlp' -Method Head
if ($Result.StatusCode -ne 200){
    # Something other than "OK" was returned.
}
This is doable with System.Net.WebClient as well but it's a bit more effort.
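For instance, a HEAD check with HttpWebRequest (a sketch; a plain WebClient would need a subclass to override the request method):
$req = [System.Net.WebRequest]::Create('http://bla.com/testApp.jnlp')
$req.Method = 'HEAD'
try {
    $res = $req.GetResponse()
    [int]$res.StatusCode    # 200 when the file is reachable
    $res.Close()            # close so later requests don't queue up
} catch [System.Net.WebException] {
    # 404 or another failure ends up here
    $_.Exception.Message
}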