I am currently logged in to a site in my browser and want to retrieve the session cookie for that session. However, when I use the code below, it creates a new session ID for that request only, not the session ID of the session I am logged into in my browser.
$url = "http://example.website/"
$cookiejar = New-Object System.Net.CookieContainer
$webrequest = [System.Net.HttpWebRequest]::Create($url)
$webrequest.CookieContainer = $cookiejar
$response = $webrequest.GetResponse()
$cookies = $cookiejar.GetCookies($url)
foreach ($cookie in $cookies) {
    Write-Host "$($cookie.Name) = $($cookie.Value)"
}
I want the script to output the same session ID cookie that my browser is using.
As Lee_Dailey suggests, you can use the IE COM object interface; however, the cookie details are limited. The details are returned as a single string, which can be split into a hashtable of key/value pairs, but extended information such as domain or expiration is not available.
This only works for Internet Explorer, and I am not certain how complete the information is. Because this reads the page's document.cookie, HttpOnly cookies (which session cookies often are) will not appear, and I am not sure whether Secure cookies can be retrieved this way, so you will need to test.
Depending on your requirements, this may or may not be sufficient.
#URL we want to retrieve cookie information from
$URL = 'https://stackoverflow.com'
#Hashtable to store the cookie details
$CookieDetails = @{}
#Create a shell object
$Shell = New-Object -ComObject Shell.Application
#Find the web browser tab that starts with our URL
$IE = $Shell.Windows() | Where-Object { $_.Type -eq "HTML Document" -and $_.LocationURL -like "$URL*"}
#Split the cookie string and parse each line into key/value pairs
foreach($Line in ($IE.Document.cookie -split "; "))
{
    $Line = $Line -split "="
    #rejoin everything after the first "=" so values that contain "=" stay intact
    $CookieDetails.($Line[0]) = $Line[1..($Line.Count - 1)] -join "="
}
#Output the hashtable result
$CookieDetails
I have a URL health-checking PowerShell script that correctly gets an HTTP 200 status code from most of my intranet sites, but a '0' status code is returned for a small minority of them. According to my research of questions from others who have written similar URL-checking PowerShell scripts, the '0' code is an API return rather than something sent by the web site itself.

Thinking this must be a timeout issue, where the API returns '0' before the slowly responding web site returns its 200, I researched yet more questions in this area on SO and implemented a suggestion to insert a timeout in the script. The timeout setting doesn't help, though: no matter how high I set the timeout value, I still get the same '0' "response" code from the same web sites, even though those sites are up and running when checked from any regular web browser. Any thoughts on how I could tweak the timeout setting in the script below in order to get the correct 200 response code?
The Script:
$URLListFile = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$URLList = Get-Content $URLListFile -ErrorAction SilentlyContinue
#if((test-path $reportpath) -like $false)
#{
#new-item $reportpath -type file
#}
#For every URL in the list
$result = foreach ($uri in $URLList) {
    try {
        #For proxy systems
        [System.Net.WebRequest]::DefaultWebProxy = [System.Net.WebRequest]::GetSystemWebProxy()
        [System.Net.WebRequest]::DefaultWebProxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
        #Web request
        $req = [System.Net.WebRequest]::Create($uri)
        $req.Timeout = 5000
        $res = $req.GetResponse()
    }
    catch {
        #Err handling
        $res = $_.Exception.Response
    }
    $req = $null
    #Getting HTTP status code
    $int = [int]$res.StatusCode
    # output a formatted string to capture in variable $result
    "$int - $uri"
    #Disposing response if available
    if ($res) {
        $res.Dispose()
    }
}
# output on screen
$result
#output to log file
$result | Set-Content -Path "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt" -Force
Current output:
200 - http://192.168.1.1/
200 - http://192.168.1.2/
200 - http://192.168.1.250/config/authentication_page.htm
0 - https://192.168.1.50/
200 - http://app1-vip-http.dev.local/
0 - https://CA/certsrv/Default.asp
The '0' appears when the request fails before any HTTP response is received; for your https URLs that is most likely a TLS/certificate validation failure rather than a timeout. In that case $_.Exception.Response is $null, so [int]$res.StatusCode yields 0, and no timeout value will change that. Perhaps using the PowerShell cmdlet Invoke-WebRequest works better for you. It has many more parameters and switches to play around with, like ProxyUseDefaultCredentials and DisableKeepAlive.
$pathIn = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$pathOut = "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt"
$URLList = Get-Content -Path $pathIn
$result = foreach ($uri in $URLList) {
    try {
        $res = Invoke-WebRequest -Uri $uri -UseDefaultCredentials -UseBasicParsing -Method Head -TimeoutSec 5 -ErrorAction Stop
        $status = [int]$res.StatusCode
    }
    catch {
        $status = [int]$_.Exception.Response.StatusCode.value__
    }
    # output a formatted string to capture in variable $result
    "$status - $uri"
}
# output on screen
$result
#output to log file
$result | Set-Content -Path $pathOut -Force
I wrote the function below to pop up an IE window that handles the user-authentication part of the OAuth 2.0 authorization code flow in PowerShell. It works, but when I call it as a function, it doesn't stay in the while loop to wait for the URL of the IE window to change, filter out the OAuth 2.0 authorization code, and then close the window.
Is there a way to keep the function "open" for longer and to make sure it waits for the URL of the IE window to change?
All remarks regarding the function are welcome...
function Show-OAuth2AuthCodeWindow {
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $true, Position = 0, HelpMessage = "The OAuth2 authorization code URL pointing towards the oauth2/v2.0/authorize endpoint as documented here: https://learn.microsoft.com/en-us/azure/active-directory/develop/v2-oauth2-auth-code-flow")]
        [System.Uri] $URL
    )
    try {
        # create an Internet Explorer object to display the OAuth2 authorization code browser window to authenticate
        $InternetExplorer = New-Object -ComObject InternetExplorer.Application
        $InternetExplorer.Width = 600
        $InternetExplorer.Height = 500
        $InternetExplorer.AddressBar = $false # disable the address bar
        $InternetExplorer.ToolBar = $false    # disable the tool bar
        $InternetExplorer.StatusBar = $false  # disable the status bar
        # store the window handle (HWND) of the created Internet Explorer object
        $InternetExplorerHWND = $InternetExplorer.HWND
        # make the browser window visible and navigate to the OAuth2 authorization code URL supplied in the $URL parameter
        $InternetExplorer.Visible = $true
        $InternetExplorer.Navigate($URL)
        # give Internet Explorer some time to start up
        Start-Sleep -Seconds 1
        # wait for the URL of the Internet Explorer window to hold the OAuth2 authorization code after a successful authentication, then close the window
        while (($InternetExplorerWindow = (New-Object -ComObject Shell.Application).Windows() | Where-Object { ($_.LocationURL -match "(^https?://.+)") -and ($_.HWND -eq $InternetExplorerHWND) })) {
            Write-Host $InternetExplorerWindow.LocationURL
            if (($InternetExplorerWindow.LocationURL).StartsWith($RedirectURI.ToString() + "?code=")) {
                $OAuth2AuthCode = $InternetExplorerWindow.LocationURL
                $OAuth2AuthCode = $OAuth2AuthCode -replace (".*code=") -replace ("&.*")
                $InternetExplorerWindow.Quit()
            }
        }
        # return the OAuth2 authorization code
        return $OAuth2AuthCode
    }
    catch {
        Write-Host -ForegroundColor Red "Could not create a browser window for the OAuth2 authentication"
    }
}
The following example does what you want with a WebBrowser control, which allows you to register a Navigating event handler to catch the authorization code obtained from your authorization server.
PowerShell OAuth2 client
Answer from this blog post
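The blog post's code is linked rather than shown; as a rough, untested sketch of the same idea (all URLs here are placeholders, and this needs an STA session, which is the default in modern PowerShell consoles):
Add-Type -AssemblyName System.Windows.Forms

# placeholders - substitute your own authorize URL and registered redirect URI
$AuthorizeUrl = 'https://login.microsoftonline.com/common/oauth2/v2.0/authorize?client_id=...&response_type=code&redirect_uri=...'
$RedirectUri  = 'https://localhost/callback'

$form = New-Object System.Windows.Forms.Form
$form.Width = 600
$form.Height = 500

$browser = New-Object System.Windows.Forms.WebBrowser
$browser.Dock = 'Fill'

# Navigating fires before each navigation; capture the code and close the form
# once the browser is redirected back to the redirect URI
$browser.Add_Navigating({
    param($sender, $e)
    if ($e.Url.AbsoluteUri.StartsWith($RedirectUri + '?code=')) {
        $script:OAuth2AuthCode = $e.Url.AbsoluteUri -replace '.*code=' -replace '&.*'
        $form.Close()
    }
})

$form.Controls.Add($browser)
$browser.Navigate($AuthorizeUrl)
$null = $form.ShowDialog() # blocks here until the handler closes the form
$OAuth2AuthCode
Unlike the Shell.Application polling loop, the event handler is called synchronously on every navigation, so the redirect cannot slip past between polls.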
I managed to get the auth code flow working using headless Chrome/Edge. All you need are these two components:
Chrome/Edge driver
Selenium WebDriver library
Once you have these set up, use the PowerShell commands below to generate a token using the auth code flow.
$SeleniumWebDriverFullPath = ".\WebDriver.dll" # Full path to the Selenium WebDriver.dll
$ClientId = ""
$Scopes = ""
$RedirectUri = ""
$AuthorizeEndpoint = "" # your oauth2/v2.0/authorize endpoint
$AuthTokenEndpoint = "" # your oauth2/v2.0/token endpoint
$authCodeUri = "$($AuthorizeEndpoint.TrimEnd("/"))?client_id=$ClientId&scope=$Scopes&redirect_uri=$RedirectUri&response_type=code"
Write-Host $authCodeUri
Import-Module $SeleniumWebDriverFullPath
$ChromeOptions = New-Object OpenQA.Selenium.Edge.EdgeOptions
$ChromeOptions.AddArgument('headless')
$ChromeOptions.AcceptInsecureCertificates = $True
$ChromeDriver = New-Object OpenQA.Selenium.Edge.EdgeDriver($ChromeOptions);
$ChromeDriver.Navigate().GoToUrl($authCodeUri);
while (!$ChromeDriver.Url.Contains("code")) { Start-Sleep 1 }
Write-Host $ChromeDriver.Url
Add-Type -AssemblyName System.Web # provides [System.Web.HttpUtility]
$ParsedQueryString = [System.Web.HttpUtility]::ParseQueryString(([System.Uri]$ChromeDriver.Url).Query)
$Code = $ParsedQueryString['code']
Write-Host "Received code: $Code"
Write-Host "Exchanging code for a token"
$tokenrequest = @{ "client_id" = $ClientId; "grant_type" = "authorization_code"; "redirect_uri" = $RedirectUri; "code" = $Code }
$token = Invoke-RestMethod -Method Post -Uri $AuthTokenEndpoint -Body $tokenrequest
$tokenString = $token | ConvertTo-Json
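Assuming the exchange succeeds, the token comes back in the standard access_token field of the response; a hypothetical usage example (the Graph URI is only an illustration):
# dispose of the headless browser once the code has been captured
$ChromeDriver.Quit()
# use the bearer token against your protected API
$headers = @{ Authorization = "Bearer $($token.access_token)" }
Invoke-RestMethod -Method Get -Uri "https://graph.microsoft.com/v1.0/me" -Headers $headers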
My guess is that the function has no idea what $RedirectURI is.
You should make that a second parameter of the function, or it should be (at least) script-scoped.
I'd prefer a second parameter, but if you go the scoping route, you should be able to use it inside the function as $script:RedirectURI.
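For example, a sketch of the parameter approach; only the param block changes, the function body stays as posted:
function Show-OAuth2AuthCodeWindow {
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $true, Position = 0)]
        [System.Uri] $URL,

        # the redirect URI registered for your app, compared against the IE window's URL
        [Parameter(Mandatory = $true, Position = 1)]
        [System.Uri] $RedirectURI
    )
    # ... function body unchanged from the question ...
}

# hypothetical call; both arguments are placeholders
$code = Show-OAuth2AuthCodeWindow -URL $authorizeUrl -RedirectURI 'https://localhost/callback'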
The site itself has to be logged into, which I do as a "ghost user" using the COM object. There is a page with a table that I'm able to copy by sending keystrokes, collecting a list of URLs/IDs/other info, which is saved to the clipboard/a text file. With the COM object, I'm able to open each URL from that file (as long as I'm logged in) and get the page text from the document request, but only after the page has loaded (too soon returns a partial page).
This is extremely slow over 15,000 pages/URLs, and I was hoping to improve the speed without having to render each URL (I even have to put each one in a loop with checks, in case it fails to load).
Is there a way to grab the various label:LabelValue properties from the URL of a webpage without loading it? Edit: I have the code:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$ie = New-Object -ComObject 'internetExplorer.Application'
$user = "me"; $pw = "pw"
$ie.Visible = $true
$ie.Navigate("https://Loginscreen.com")
$usernamefield = $ie.document.getElementByID('login')
$usernamefield.value = "$user"
$passwordfield = $ie.document.getElementByID('password')
$passwordfield.value = "$pw"
$Link = $ie.document.getElementByID('SubmitLogin')
$Link.click()
Add-Type -AssemblyName Microsoft.VisualBasic # for [Microsoft.VisualBasic.Interaction]
$ieProc = Get-Process | ? { $_.MainWindowHandle -eq $ie.HWND }
[Microsoft.VisualBasic.Interaction]::AppActivate($ieProc.Id)
$fc = gc "C:\TempProject.txt"
foreach ($f in $fc)
{
    $url = $f.split("`t")[-1]
    $HTML = Invoke-WebRequest $url
    $body = $HTML.ParsedHtml.body.innerText
    $body
}
However, it seems to be returning generic information about the login page. I couldn't find any reference to anything on the webpage that I wanted, either the HTML tag info or the text itself.
If you mean "Is there a way to get the text of a URL without using a browser?" the answer is an overwhelming and resounding Yes! Try Invoke-WebRequest and see if the Content property has what you are looking for. Scraping HTML out of text is still something you will need to do manually or use a different package to scrape your juicy bits from the fruit.
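A minimal sketch of that suggestion, with a placeholder URL; note that for a login-protected page you would also need an authenticated -WebSession (a session-based approach is shown further below), since Invoke-WebRequest does not share IE's cookies:
# placeholder URL - substitute one of the URLs from your text file
$url = 'https://webpage.example.com/some/record'
$response = Invoke-WebRequest -Uri $url -UseBasicParsing

# raw HTML of the page; your label:LabelValue pairs should be in here somewhere
$response.Content

# quick-and-dirty scrape: keep only lines that look like "Label: value"
$response.Content -split "`n" | Where-Object { $_ -match '^\s*[\w ]+:\s*\S' }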
My company wants me to grab data from their internal website, organize it, and send it to a database. The data is displayed in tables that you navigate to within the site. I want to pull the fields into a file or into memory for further processing.
So far, I can log into the site in PowerShell by getting the submit login button's ID and passing my username/password. I'm able to use the Navigate method to change to the appropriate page within the site. However, running Invoke-WebRequest on the new page, as well as using Net.WebClient on it, returns the information found on the original site's login screen (I know, because nothing from the table makes it into the returned values, regardless of the commands I use). The commented code is what I've tried previously.
Here is the code, minus the values of my ID/password/site link:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$ie = New-Object -ComObject 'internetExplorer.Application'
$ie.Visible= $true # Make it visible
$username="myid"
$password="mypw"
$ie.Navigate("https://webpage.com/index.jsp")
While ($ie.Busy -eq $true) {Start-Sleep -Seconds 3;}
$usernamefield = $ie.document.getElementByID('login')
$usernamefield.value = "$username"
$passwordfield = $ie.document.getElementByID('password')
$passwordfield.value = "$password"
$Link = $ie.document.getElementByID('SubmitLogin')
$Link.click()
$url = "https://webpage.com/home.pa#%5BT1%2CM181%5D"
$ie.Navigate($url)
While ($ie.Busy -eq $true) {Start-Sleep -Seconds 3;}
$doc = $ie.document
$web = New-Object Net.WebClient
$web.DownloadString($url)
#$r = Invoke-WebRequest $url
#$r.Forms.fields | get-member
#$InnerText = $r.AllElements |
# Where-Object {$_.tagName -ne "TD" -and $_.innerText -ne $null} |
# Select -ExpandProperty innerText
#write-host $InnerText
#$r.AllElements|Where-Object {$_.InnerHtml -like "*=*"}
#$doc = $ie.Document
#$doc.getElementByID("ext-element-7") | % {
# if ($_.id -ne $null){
# write-host $_.id
# }
#}
$ie.Quit()
I obviously don't have your page and can't ensure that the body of the POST from signing in contains the fields login and password, so that will require some trial and error from you. As a mini-example: if you open up your browser dev tools' network tab and filter by POST, you can observe how your login page signs you in. When I open reddit to sign in, it sends a POST to https://www.reddit.com/login with a body containing a username and password key/value pair (both plaintext). This action sets up my browser session to persist my login.
Here's a code example that uses the HtmlAgilityPack library to interact with the resulting page as if it were XML.
Enabling TLS1.2:
[System.Net.ServicePointManager]::SecurityProtocol =
[System.Net.ServicePointManager]::SecurityProtocol -bor [System.Net.SecurityProtocolType]::Tls12
Setting up your web session:
$iwrParams = @{
    'Uri'             = 'https://webpage.com/index.jsp'
    'Method'          = 'POST'
    'Body'            = @{
        'login'    = $username
        'password' = $password
    }
    'SessionVariable' = 'session'
    # avoids cases where IE has not been opened
    'UseBasicParsing' = $true
}
# don't care about the response - only here to initialize the session
$null = Invoke-WebRequest @iwrParams
Getting the protected page content:
$iwrParams = @{
    'Uri'             = 'https://webpage.com/home.pa#%5BT1%2CM181%5D'
    'WebSession'      = $session
    'UseBasicParsing' = $true
}
$output = (Invoke-WebRequest @iwrParams).Content
Downloading/adding HtmlAgility:
if (-not (Test-Path -Path "$PSScriptRoot\HtmlAgilityPack.dll" -PathType Leaf))
{
    Invoke-WebRequest -Uri https://www.nuget.org/api/v2/package/HtmlAgilityPack -OutFile "$PSScriptRoot\html.zip"
    Expand-Archive -Path "$PSScriptRoot\html.zip" -DestinationPath "$PSScriptRoot\html" -Force
    Copy-Item -Path "$PSScriptRoot\html\lib\netstandard2.0\HtmlAgilityPack.dll" -Destination "$PSScriptRoot\"
    Remove-Item -Path "$PSScriptRoot\html", "$PSScriptRoot\html.zip" -Recurse -Force
}
Add-Type -Path "$PSScriptRoot\HtmlAgilityPack.dll"
$html = [HtmlAgilityPack.HtmlDocument]::new()
Loading/parsing your page content:
$html.LoadHtml($output)
# do stuff with output.
$html.DocumentNode.SelectNodes('//*/text()').Text.Where{$PSItem -like '*=*'}
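If the data you're after sits in an HTML table (as in the question), a hypothetical extension using the same HtmlAgilityPack document object:
# hypothetical: flatten every row of the first <table> into pipe-separated text
$rows = $html.DocumentNode.SelectNodes('//table[1]//tr')
foreach ($row in $rows) {
    ($row.SelectNodes('.//td|.//th') | ForEach-Object { $_.InnerText.Trim() }) -join ' | '
}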
Footnote
I made the assumption in the code that you're executing from a script, where $PSScriptRoot will be populated. If it's being run interactively, you can use the $pwd automatic variable instead (a carry-over from *nix: print working directory). This code requires PSv5+.
After some serious effort, I managed to get the pages to work correctly. It turns out I wasn't waiting for everything to load, but once I had that, I eventually found the correct tag/name to make everything work.
Assuming the code in the original post is correct up to $ie.Navigate($url):
$ie.Navigate($url)
While ($ie.Busy -eq $true) {Start-Sleep -Seconds 3;}
$r = Invoke-WebRequest $url
$doc = $ie.document
$j = ($doc.getElementsByTagName("body") | Where {$_.className -eq 'thefullclassname found in the quotes of <body class="" of the area you wanted'}).innerText
write-host $j
This gave me the output of a very annoyingly done table that isn't a real "table", and has the first row/column on its own, so formatting the output into an easy-to-use version will be the new hassle. At least I got everything on the page that had the text I needed... so progress!
Wonder if anyone can help. I'm trying to use PowerShell to import a list of domains and generate an HTTP request for each of them. Currently I've got this far:
$csvFilename = "C:\Sites\sites.csv"
$csv = Import-Csv $csvFilename -Header @("url") |
foreach {
    $HTTP_Request = [System.Net.WebRequest]::Create('$csv')
    $HTTP_Response = $HTTP_Request.GetResponse()
    $HTTP_Status = [int]$HTTP_Response.StatusCode
    If ($HTTP_Status -eq 200) {
        Write-Host "Site is OK!"
    }
    Else {
        Write-Host "The Site may be down, please check!"
    }
    $HTTP_Response.Close()
}
I'm not sure why but the 'Create()' statement won't pick up the URL. Any ideas?
Many thanks
Inside the foreach loop, you can access the current element with the automatic variable $_.
So replace this line: $HTTP_Request = [System.Net.WebRequest]::Create('$csv')
with
$HTTP_Request = [System.Net.WebRequest]::Create("$_")
But first, since the Create method expects a string, you will have to expand your url property (otherwise you get a PSCustomObject, not a string):
Import-Csv c:\temp\urls.csv -Header "url" |select -expand url |%{ [System.Net.Webrequest]::Create("$_")}
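Putting it together, a minimal sketch of the corrected loop (the path and messages come from the question; the try/catch is an addition so one dead site doesn't abort the run, since GetResponse throws on 4xx/5xx and connection failures):
$csvFilename = "C:\Sites\sites.csv"
Import-Csv $csvFilename -Header @("url") | Select-Object -ExpandProperty url | ForEach-Object {
    try {
        $HTTP_Request = [System.Net.WebRequest]::Create($_)
        $HTTP_Response = $HTTP_Request.GetResponse()
        if ([int]$HTTP_Response.StatusCode -eq 200) {
            Write-Host "$_ - Site is OK!"
        }
        else {
            Write-Host "$_ - The site may be down, please check!"
        }
        $HTTP_Response.Close()
    }
    catch {
        # no usable HTTP response (DNS/TLS failure or an error status)
        Write-Host "$_ - The site may be down, please check!"
    }
}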