Why doesn't this PowerShell HTTP server code work?

I copied and pasted this code https://community.idera.com/database-tools/powershell/powertips/b/tips/posts/creating-powershell-web-server into a PowerShell console, in a directory that contains an index.html file.
When I browse to http://localhost:8080/index.html I get the error "Oops, the page is not available!"
Is there something wrong with the code? I can't see what.
# enter this URL to reach PowerShell’s web server
$url = 'http://localhost:8080/'
# HTML content for some URLs entered by the user
$htmlcontents = @{
'GET /' = '<html><body>Here is PowerShell</body></html>'
'GET /services' = Get-Service | ConvertTo-Html
}
# start web server
$listener = New-Object System.Net.HttpListener
$listener.Prefixes.Add($url)
$listener.Start()
try
{
while ($listener.IsListening) {
# process received request
$context = $listener.GetContext()
$Request = $context.Request
$Response = $context.Response
$received = '{0} {1}' -f $Request.httpmethod, $Request.url.localpath
# is there HTML content for this URL?
$html = $htmlcontents[$received]
if ($html -eq $null) {
$Response.statuscode = 404
$html = 'Oops, the page is not available!'
}
# return the HTML to the caller
$buffer = [Text.Encoding]::UTF8.GetBytes($html)
$Response.ContentLength64 = $buffer.length
$Response.OutputStream.Write($buffer, 0, $buffer.length)
$Response.Close()
}
}
finally
{
$listener.Stop()
}

It does work if you go to just http://localhost:8080/; to browse to /index.html as well, you would also need an entry for it.
Just modify the $htmlContents section like so:
# HTML content for some URLs entered by the user
$htmlcontents = @{
'GET /' = '<html><body>Here is PowerShell</body></html>'
'GET /services' = Get-Service | ConvertTo-Html
'GET /index.html' = '<html><body>this is my index page</body></html>'
}
You could also pull the content from a file, like this:
$htmlcontents = @{
'GET /' = '<html><body>Here is PowerShell</body></html>'
'GET /services' = Get-Service | ConvertTo-Html
'GET /index.html' = '<html><body>this is my index page</body></html>'
'GET /fromPage.html' = Get-content "C:\temp\fence.txt"
}
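If the goal is for /index.html to return the real file on disk (as in the original question), here is a minimal variation, assuming index.html sits in the directory you start the listener from:
$htmlcontents = @{
'GET /' = '<html><body>Here is PowerShell</body></html>'
'GET /services' = Get-Service | ConvertTo-Html
# -Raw returns the whole file as one string instead of an array of lines
'GET /index.html' = Get-Content -Path (Join-Path $PWD 'index.html') -Raw
}
Note that Get-Content (and Get-Service) run once, when the hashtable is built, not on every request.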

Related

URL health-check PowerShell script correctly gets HTTP 200 on most sites, but incorrect '0' status code on some...API timeout issue?

I have a URL health-checking PowerShell script which correctly gets an HTTP 200 status code on most of my intranet sites, but a '0' status code is returned on a small minority of them. The '0' code is an API return rather than coming from the web site itself, according to my research of questions from others who have written similar URL-checking PowerShell scripts.
Thinking this must be a timeout issue, where the API returns '0' before the slowly responding web site returns its 200, I researched more questions on this subject on SO and implemented a suggestion to insert a timeout in the script. The timeout setting, though, doesn't help, no matter how high I set the value. I still get the same '0' "response" code from the same web sites, even though those sites are up and running when checked from any regular web browser.
Any thoughts on how I could tweak the timeout setting in the script below in order to get the correct 200 response code?
The Script:
$URLListFile = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$URLList = Get-Content $URLListFile -ErrorAction SilentlyContinue
#if((test-path $reportpath) -like $false)
#{
#new-item $reportpath -type file
#}
#For every URL in the list
$result = foreach($Uri in $URLList) {
try{
#For proxy systems
[System.Net.WebRequest]::DefaultWebProxy = [System.Net.WebRequest]::GetSystemWebProxy()
[System.Net.WebRequest]::DefaultWebProxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
#Web request
$req = [system.Net.WebRequest]::Create($uri)
$req.Timeout=5000
$res = $req.GetResponse()
}
catch {
#Err handling
$res = $_.Exception.Response
}
$req = $null
#Getting HTTP status code
$int = [int]$res.StatusCode
# output a formatted string to capture in variable $result
"$int - $uri"
#Disposing response if available
if($res){
$res.Dispose()
}
}
# output on screen
$result
#output to log file
$result | Set-Content -Path "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt" -Force
Current output:
200 - http://192.168.1.1/
200 - http://192.168.1.2/
200 - http://192.168.1.250/config/authentication_page.htm
0 - https://192.168.1.50/
200 - http://app1-vip-http.dev.local/
0 - https://CA/certsrv/Default.asp
Perhaps using the PowerShell cmdlet Invoke-WebRequest works better for you. It has many more parameters and switches to play around with, like ProxyUseDefaultCredentials and DisableKeepAlive.
$pathIn = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$pathOut = "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt"
$URLList = Get-Content -Path $pathIn
$result = foreach ($uri in $URLList) {
try{
$res = Invoke-WebRequest -Uri $uri -UseDefaultCredentials -UseBasicParsing -Method Head -TimeoutSec 5 -ErrorAction Stop
$status = [int]$res.StatusCode
}
catch {
$status = [int]$_.Exception.Response.StatusCode.value__
}
# output a formatted string to capture in variable $result
"$status - $uri"
}
# output on screen
$result
#output to log file
$result | Set-Content -Path $pathOut -Force
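If proxying or connection reuse turns out to be part of the problem, you could also experiment with the switches mentioned above; the proxy address below is only a placeholder:
# hypothetical proxy address; drop the proxy switches if no proxy is involved
$res = Invoke-WebRequest -Uri $uri -UseBasicParsing -Method Head -TimeoutSec 5 `
    -Proxy 'http://proxy.local:8080' -ProxyUseDefaultCredentials -DisableKeepAlive -ErrorAction Stop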

Get response link from HTTP request in PowerShell

The following PowerShell script runs a Google search on an image stored on my hard drive.
How can I get the link that is followed to reach the results page? Is it possible to navigate to the different web pages displayed on it?
I've tried $request.Links | Select href to get a list of the links, but it didn't work. I've also tried adding Write-Output $respStream to the code, but then it doesn't run.
Set-ExecutionPolicy Bypass -scope Process -Force
function Get-GoogleImageSearchUrl
{
param(
[Parameter(Mandatory = $true)]
[ValidateScript({ Test-Path $_ })]
[string] $ImagePath
)
# extract the image file name, without path
$fileName = Split-Path $imagePath -Leaf
# the request body has some boilerplate before the raw image bytes (part1) and some after (part2)
# note that $filename is included in part1
$part1 = @"
-----------------------------7dd2db3297c2202
Content-Disposition: form-data; name="encoded_image"; filename="$fileName"
Content-Type: image/jpeg

"@
$part2 = @"
-----------------------------7dd2db3297c2202
Content-Disposition: form-data; name="image_content"

-----------------------------7dd2db3297c2202--
"@
# grab the raw bytes composing the image file
$imageBytes = [Io.File]::ReadAllBytes($imagePath)
# the request body should sandwich the image bytes between the 2 boilerplate blocks
$encoding = New-Object Text.ASCIIEncoding
$data = $encoding.GetBytes($part1) + $imageBytes + $encoding.GetBytes($part2)
# create the HTTP request, populate headers
$request = [Net.HttpWebRequest] ([Net.HttpWebRequest]::Create('http://images.google.com/searchbyimage/upload'))
$request.Method = "POST"
$request.ContentType = 'multipart/form-data; boundary=---------------------------7dd2db3297c2202' # must match the delimiter in the body, above
$request.ContentLength = $data.Length
# don't automatically redirect to the results page, just take the response which points to it
$request.AllowAutoredirect = $false
# populate the request body
$stream = $request.GetRequestStream()
$stream.Write($data, 0, $data.Length)
$stream.Close()
# get response stream, which should contain a 302 redirect to the results page
$respStream = $request.GetResponse().GetResponseStream()
# pluck out the results page link that you would otherwise be redirected to
(New-Object Io.StreamReader $respStream).ReadToEnd() -match 'HREF\="([^"]+)"' | Out-Null
$matches[1]
}
$url = Get-GoogleImageSearchUrl "C:\Users\Path\filename.jpeg"
Start-Process $url
As mentioned by @soc, you shouldn't need the stream to pull the URL; the location you are redirected to is in the response header:
$request = [Net.HttpWebRequest] ([Net.HttpWebRequest]::Create('http://images.google.com/searchbyimage/upload'))
$request.AllowAutoredirect = $false
...
$response = $request.GetResponse()
if ($response.StatusCode -eq 302) {
$redirect_url = $response.Headers["Location"]
write-host $redirect_url
}

How to get GET parameters?

I was successfully able to open a port on my computer (using only PowerShell) and to know when HTTP requests are made to that port. I came up with this simple code:
$listener = [System.Net.Sockets.TcpListener]5566;
$listener.Start();
while ($true) {
$client = $Listener.AcceptTcpClient();
Write-Host "Connected!";
$client.Close();
}
If I open my browser and go to http://localhost:5566, the PowerShell console shows a message that a user connected.
What I need is to get the GET parameters of the HTTP request. For example, suppose I had instead opened my browser and typed http://localhost:5566/test.html?parameter1=xxx&parameter2=yyy.
How can I grab the GET parameter names and values (parameter1 and parameter2) using my simplified code above?
If you are comfortable using HttpListener instead of TcpListener, it's easier to do the job.
The script below will produce this output in a browser:
Path is /test.html
parameter2 is equal to yyy
parameter1 is equal to xxx
Quick and dirty script
$listener = New-Object System.Net.HttpListener
$listener.Prefixes.Add("http://localhost:5566/")
try {
$listener.Start();
while ($true) {
$context = $listener.GetContext()
$request = $context.Request
# Output the request details to the host
Write-Host ($request | Format-List * | Out-String)
# Parse Parameters from url
$rawUrl = $request.RawUrl
$Parameters = @{}
$rawUrl = $rawUrl.Split("?")
$Path = $rawUrl[0]
$rawParameters = $rawUrl[1]
if ($rawParameters) {
$rawParameters = $rawParameters.Split("&")
foreach ($rawParameter in $rawParameters) {
$Parameter = $rawParameter.Split("=")
$Parameters.Add($Parameter[0], $Parameter[1])
}
}
# Create output string (dirty html)
$output = "<html><body><p>"
$output = $output + "Path is $Path" + "<br />"
foreach ($Parameter in $Parameters.GetEnumerator()) {
$output = $output + "$($Parameter.Name) is equal to $($Parameter.Value)" + "<br />"
}
$output = $output + "</p></body></html>"
# Send response
$statusCode = 200
$response = $context.Response
$response.StatusCode = $statusCode
$buffer = [System.Text.Encoding]::UTF8.GetBytes($output)
$response.ContentLength64 = $buffer.Length
$output = $response.OutputStream
$output.Write($buffer,0,$buffer.Length)
$output.Close()
}
} finally {
$listener.Stop()
}
Cheers
Glenn

PowerShell HttpListener: handle more than one request at the same time

I'm using a normal PowerShell HttpListener script.
The script listens on port 80 and returns a response.
Now I'm trying to handle more than one request at the same time. The problem is that the second response has to wait until the first response has been finished by the script.
I tried to start a separate job for every HTTP request, but I can't send a response back to the listener from the PS job.
Does anyone know how to handle parallel HTTP requests in PS?
Here is the script I'm using:
$url = 'http://localhost/'
$listener = New-Object System.Net.HttpListener
$listener.Prefixes.Add($url)
$listener.Start()
Write-Host "Listening at $url..."
while ($listener.IsListening)
{
$context = $listener.GetContext()
$requestUrl = $context.Request.Url
$response = $context.Response
Write-Host ''
Write-Host "> $requestUrl"
$localPath = $requestUrl.LocalPath
$route = $routes.Get_Item($requestUrl.LocalPath)
if ($route -eq $null)
{
$response.StatusCode = 404
}
else
{
$content = & $route
$buffer = [System.Text.Encoding]::UTF8.GetBytes($content)
$response.ContentLength64 = $buffer.Length
$response.OutputStream.Write($buffer, 0, $buffer.Length)
}
$response.Close()
$responseStatus = $response.StatusCode
Write-Host "< $responseStatus"
}
What is the right way to handle more than one request at the same time?
Thanks @all!
You can use multiple runspaces (even the runspace pool) within the same PowerShell process. See this blog post for details on how to do that.
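A rough sketch of that idea (not the blog post's code, just the shape of it): the main loop only accepts requests and hands each context to a runspace from a pool, which writes the response itself. The pool size and the $handler name are illustrative.
$listener = New-Object System.Net.HttpListener
$listener.Prefixes.Add('http://localhost/')
$listener.Start()
$pool = [RunspaceFactory]::CreateRunspacePool(1, 4)   # up to 4 concurrent handlers
$pool.Open()
# script block each runspace runs for a single request
$handler = {
    param($context)
    $buffer = [System.Text.Encoding]::UTF8.GetBytes("Handled at $(Get-Date -Format o)")
    $context.Response.ContentLength64 = $buffer.Length
    $context.Response.OutputStream.Write($buffer, 0, $buffer.Length)
    $context.Response.Close()
}
while ($listener.IsListening) {
    $context = $listener.GetContext()     # only the accept is serialized now
    $ps = [PowerShell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript($handler).AddArgument($context)
    [void]$ps.BeginInvoke()               # the response is written from the runspace
}
In a real script you would also keep the [PowerShell] instances and call EndInvoke()/Dispose() on them once they finish.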
Check $request.Url.LocalPath, or another of the URL properties, depending on how much context you are looking for.
Provide a different response depending on the request.
$htmlout = "<html><link rel=""stylesheet"" href=""MyStleSheet.css""><body class=""body"">Hello World</body></html>"
$css = ".body {background-color: white;font-size: 20px;font-family: calibri;}"
while($Listener.IsListening -and $noBreak){
$Context = $listener.GetContext()
$request = $context.request
if($request.url.LocalPath -match '.css')
{
$buffer = [System.Text.Encoding]::utf8.getbytes($css)
}
if($request.url.LocalPath -match '.html')
{
$buffer = [System.Text.Encoding]::utf8.getbytes($htmlout)
}
$response = $context.response
$response.contentlength64 = $buffer.length
$response.OutputStream.Write($buffer,0,$buffer.length)
$response.Close()
}

How to catch POST/GET variables with PowerShell HttpListener?

I want to know how to catch URL variables with PowerShell's System.Net.HttpListener.
Thanks
$listener = New-Object system.net.HttpListener
$listener.Prefixes.Add('http://127.0.0.1:8080/')
$listener.Start()
$context = $listener.GetContext() # block
$request = $context.Request
$response = $context.Response
# $var = read post/get var
$page = Get-Content -Path C:\play.html -Raw
$page = $page.Replace('%VAR%',$var)
$buffer = [System.Text.Encoding]::UTF8.GetBytes($page)
$response.ContentLength64 = $buffer.Length
$output = $response.OutputStream
$output.Write($buffer,0,$buffer.Length)
$output.Close()
$listener.Stop()
If the request method is GET, use the QueryString property to get the query parameters. If the method is POST, check the HasEntityBody property and, if it is true, read the POST data from the body using the InputStream property.
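For example, something like this could replace the '# $var = read post/get var' line in the question (a sketch assuming Windows PowerShell; the parameter name 'var' is just an example):
Add-Type -AssemblyName System.Web   # for [System.Web.HttpUtility]
if ($request.HttpMethod -eq 'GET') {
    # QueryString is a NameValueCollection: /play.html?var=hello -> 'hello'
    $var = $request.QueryString['var']
}
elseif ($request.HttpMethod -eq 'POST' -and $request.HasEntityBody) {
    # read the raw body and parse it as form-encoded data
    $reader = New-Object System.IO.StreamReader($request.InputStream, $request.ContentEncoding)
    $body = $reader.ReadToEnd()
    $reader.Close()
    $var = [System.Web.HttpUtility]::ParseQueryString($body)['var']
}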