I'm trying to do something like this to parse a homepage with a login page but Invoke-WebRequest doesn't return anything.
The page I'm trying to access is https://www.suidoapp.waterworks.metro.tokyo.lg.jp/#/login and the code I'm running is this:
$TopURI = "https://www.suidoapp.waterworks.metro.tokyo.lg.jp/#/login"
$TopPage = Invoke-WebRequest -Method Get -Uri $TopURI -SessionVariable MySession -UseBasicParsing
When I look at the Content or RawContent of $TopPage I can see that it just says "please enable JavaScript" (I've tried both with and without -UseBasicParsing). If I open the page in the developer tools in my browser, I can see that the response for the initial document is the same:
But the interesting thing is that even though the initial response says "please enable JavaScript", the page actually loads:
Has anyone seen this before, where Invoke-WebRequest fails because the response is "please enable JavaScript" yet the page loads fine in a browser? Is there another way for me to parse a homepage and submit login forms when Invoke-WebRequest fails like this?
I am having the same issue. The simple answer is: Invoke-WebRequest does not execute JavaScript (it only fetches the raw response), so a single-page app returns nothing but its bootstrap HTML. In my case, I needed to run a Vue.js app via Task Scheduler (on a Windows machine in conjunction with IIS). I eventually let PowerShell open a browser, finish the work, and then close it:
Start-Process -FilePath iexplore -ArgumentList 'http://localhost:8080/' -PassThru  # or any URL
Start-Sleep -Seconds 10
(Get-Process -Name iexplore).Kill()
If you would like to run Firefox instead:
Start-Process -FilePath 'C:\Program Files\Mozilla Firefox\firefox.exe' -ArgumentList 'http://localhost:8080/' -PassThru  # adjust to your Firefox location / URL
Start-Sleep -Seconds 10
(Get-Process -Name firefox).Kill()
Going back to your question, the loaded page is probably not functional if it invokes any JavaScript functions. As for form logins, you can find resources easily, such as this: https://community.auth0.com/t/forms-login-via-curl-or-powershell/17456/3
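As a rough sketch of that form-login pattern with Invoke-WebRequest (the URL and the username/password field names here are hypothetical placeholders; inspect the actual form to find the real ones, and note this only works when the form is present in the server-rendered HTML, not on a JavaScript-rendered page like the one in the question):

```powershell
# Sketch only: URL and field names are placeholders
$loginPage = Invoke-WebRequest -Uri 'https://example.com/login' -SessionVariable session
$form = $loginPage.Forms[0]             # Forms requires IE-based parsing (no -UseBasicParsing)
$form.Fields['username'] = 'myuser'
$form.Fields['password'] = 'mypassword'

# Post the filled-in fields back, reusing the cookies captured in $session
$loggedIn = Invoke-WebRequest -Uri 'https://example.com/login' -Method Post -Body $form.Fields -WebSession $session
```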
I've been looking for ways to read HTML from a custom Chrome profile opened like this, but with no luck.
& "C:\Program Files\Google\Chrome\Application\chrome.exe" --profile-directory="Profile 1" $url;
I tried using Invoke-WebRequest but couldn't figure out how to get it to work through a custom profile.
The point is to check whether an item was bought on $url, which "Profile 1" is logged into.
Alternatively I thought about logging into $url with a POST request and then returning the HTML as a variable, but I couldn't figure out a way to do that either.
This won't work. This can't work.
You can:
Launch Chrome on a specific profile
Use Invoke-RestMethod and/or Invoke-WebRequest
The two are not correlated, though.
The PowerShell cmdlets do not depend on Chrome for their operation.
If you want to do stuff through Chrome, you need to do browser automation.
That's a different thing, and it requires specialized tools, such as Selenium.
You can use the .NET interface of Selenium directly, or use the Selenium PowerShell module, which helps you interface with Selenium (and your browser / browser profile of choice) more easily.
To install:
Install-Module -Name Selenium -AllowPrerelease
Note that this tool is a lot more complex to learn and use than plain Invoke-WebRequest.
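A minimal sketch with the module (cmdlet names follow the module's v3 API, e.g. Start-SeChrome and Find-SeElement; the v4 prerelease renames several of them, so check Get-Command -Module Selenium):

```powershell
Import-Module Selenium

# Selenium drives a real browser, so JavaScript runs normally
$driver = Start-SeChrome
Enter-SeUrl -Driver $driver -Url 'https://example.com'

# Query the DOM as rendered after scripts executed ('status' is a placeholder id)
$element = Find-SeElement -Driver $driver -Id 'status'
$element.Text

# Close the browser when done
$driver.Quit()
```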
References
GitHub - Selenium-PowerShell
PowerShell Gallery - Selenium
While navigating to the URL, I have different buttons, and I want to automate clicking all of them, fetch some data into a file, and send an email with an attachment if something fails, using a PowerShell script. I have just started writing the script but am unable to proceed further. Please advise.
Code:
$ie=Start-Process -FilePath "C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" -ArgumentList "abc.aspx"
$ie.Document.getElementsByName('').click( )
Error:
You cannot call a method on a null-valued expression.
At line:2 char:2
You are combining PowerShell with JavaScript.
The variable $ie is empty since you didn't use -PassThru, but even if you did, it would not hold the content of the website, because chrome.exe does not return that.
You can, however, use Invoke-WebRequest to get the content of a website:
$response = Invoke-WebRequest -Uri "https://google.com"
Then you can use the $response object:
$response.ParsedHtml.getElementsByName('test').item(0).click()
I have a URL to a CSV file which, in a browser, I can download and open without issue.
I'm trying to download this file using PowerShell without success. I tried using Invoke-WebRequest, Start-BitsTransfer and using a webrequest object but no luck there.
Invoke-WebRequest comes with a parameter to store its result in a file: -OutFile
Invoke-WebRequest $myDownloadUrl -OutFile c:\file.ext
If you need authorization first, you can log in and then reuse the session:
Invoke-WebRequest $myAuthUrl <# whatever is necessary to log in #> -SessionVariable MySession
Invoke-WebRequest $myDownloadUrl -WebSession $MySession
To determine the layout of the form where the login happens, you can use the object Invoke-WebRequest returns: it collects information about the forms and fields in the HTML (this might be Windows-only, since it relies on Internet Explorer's parser). Your mileage when logging in may vary with things like two-factor auth being active or not. Alternatively, you may be able to create a secret link to your file which does not need auth, or Google may allow you to create a private access token of some sort, which can be sent in an Authorization header alongside your request.
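For example, to see what Invoke-WebRequest found (the Forms property is only populated with IE-based parsing on Windows PowerShell, i.e. without -UseBasicParsing; the URL is a placeholder):

```powershell
$page = Invoke-WebRequest -Uri 'https://example.com/login' -SessionVariable MySession
$page.Forms | Format-List Id, Method, Action   # one entry per form in the HTML
$page.Forms[0].Fields                          # names and default values of its inputs
```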
TLDR answers*:
Method 1, by default synchronous**
Invoke-WebRequest $url -OutFile $path_to_file
(if you get error "...Could not create SSL/TLS secure channel." see Powershell Invoke-WebRequest Fails with SSL/TLS Secure Channel)
Method 2, by default synchronous**
(New-Object System.Net.WebClient).DownloadFile($url, $path_to_file)
Method 3, may be much slower than the other two but is very gentle on bandwidth usage (it uses the BITS service; synchronous by default, add -Asynchronous to queue the job in the background)
Import-Module BitsTransfer
Start-BitsTransfer -Source $url -Destination $path_to_file
Notes:
*: This answer is for those that google for "how to download a file with PowerShell".
**: Read the help pages if you want asynchronous downloading
For a while now I've been using a PS script to download Power BI bi-monthly using BITS. It's been pretty solid, and it's much more reliable now since I removed the -Asynchronous at the end of Start-BitsTransfer:
$url = "https://download.microsoft.com/download/8/8/0/880BCA75-79DD-466A-927D-1ABF1F5454B0/PBIDesktopSetup.exe"
$output = "%RandomPath%\PowerBI Pro\PBIDesktopSetup.exe"
$start_time = Get-Date
Import-Module BitsTransfer
Start-BitsTransfer -Source $url -Destination $output
#Commented out below because it kept creating "Tmp files"
#Start-BitsTransfer -Source $url -Destination $output -Asynchronous
Afternoon all,
Trying to create a script that goes to a webpage and logs whether it is redirected to an expansion server. For example, the site is site.school.net, and under high load it is supposed to forward to site2.school.net or site3.school.net. I was looking into PowerShell just because I want to learn it, but I am open to anything that works. I have tried
Start-Process -FilePath FireFox -ArgumentList site.school.net
and it opens a page and all that, but I need to run it every 10-15 seconds and log the URL it was sent to. Any help is appreciated; thanks in advance!
Don't use a web browser for this, as it's extra overhead and makes running the script from an unattended system (like in a scheduled job) pretty difficult or impossible.
Use PowerShell's Invoke-WebRequest cmdlet.
$redirecttest = Invoke-WebRequest -Uri https://jigsaw.w3.org/HTTP/300/302.html -Method Get -MaximumRedirection 0
Write-Output "StatusCode: $($redirecttest.StatusCode)"
Write-Output "Redirecting to: $($redirecttest.Headers.Location)"
This will attempt to reach the page at the specified URL and not follow a redirect if one is given. It then captures the HTTP headers and outputs the HTTP status code (302 Found, in this case) and the location that we're being redirected to.
It will also spit out an error, because the server has instructed the client to redirect more times than the client is allowed to follow. Handle that error however you prefer/need.
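One way to handle it, as a sketch for Windows PowerShell 5.1, where the exceeded-redirection error can be statement-terminating and the response is then only reachable through the thrown exception:

```powershell
try {
    $redirecttest = Invoke-WebRequest -Uri 'https://jigsaw.w3.org/HTTP/300/302.html' -MaximumRedirection 0 -ErrorAction Stop
    $location = $redirecttest.Headers.Location
} catch {
    # The underlying HttpWebResponse hangs off the exception; read Location from it
    $location = $_.Exception.Response.Headers['Location']
}
Write-Output "Redirecting to: $location"
```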
You can then test the redirect location that you're sent to validate that the redirect is working properly:
if ($redirecttest.Headers.Location -eq "http://site2.school.net" -or $redirecttest.Headers.Location -eq "http://site3.school.net") {
    Write-Output "Success"
} else {
    Write-Output "Redirect broken"
}
Or, you could just invoke the web request and allow the redirection, and inspect the resulting URL.
$redirecttest = Invoke-WebRequest -Uri https://jigsaw.w3.org/HTTP/300/302.html -Method Get
$redirecttest.BaseResponse.ResponseUri.AbsoluteUri
Note that this will only work if the HTTP server sends true HTTP 30X status codes, as opposed to depending upon a client-side redirect via a META tag or JavaScript.
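If you suspect a client-side redirect, a rough check is to scan the body for a META refresh (the URL is a placeholder, and this regex is a simplification that will not catch JavaScript redirects):

```powershell
$page = Invoke-WebRequest -Uri 'https://example.com' -UseBasicParsing
if ($page.Content -match '(?i)<meta[^>]+http-equiv=[''"]refresh[''"][^>]*url=([^''">]+)') {
    Write-Output "Client-side redirect to: $($Matches[1])"
}
```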
Here is an example but with Internet Explorer (11). I'm not sure you can automate Firefox with PowerShell. With this example, I get a different output than the original URL.
#Instantiate an IE COM object to interact with IE
$ie = New-Object -ComObject "InternetExplorer.Application"
#store original URL
$requestUri = "http://google.com"
#open the original URL in IE
$ie.navigate($requestUri)
#while IE is busy, wait 250 milliseconds and check again
while ($ie.Busy) {
Start-Sleep -Milliseconds 250
}
#retrieve the actual/final URL from the browser
$ie.Document.url
I am attempting to get a script working that does not use Invoke-WebRequest. The problem is that when I run the script, a popup prompt appears with the following message:
"Windows Security Warning
To allow this website to provide information personalized for you, will you allow it to put a small file (called a cookie) on your computer?"
with a yes/no response expected from the user.
The code I am executing is the following:
$ParsedHTML = New-Object -ComObject "HTMLFILE"
$webresponse = New-Object System.Net.WebClient
$webresponse.Headers.Add("Cookie", $CookieContainer.GetCookieHeader($url))
$result = $webresponse.DownloadString($url)
# Write the downloaded string (not the WebClient object) into the DOM
$ParsedHTML.IHTMLDocument2_write($result)
$ParsedHTML.body.innerText
The main problem with this code is that the $url I am using points at a page that checks whether cookies are enabled, and this code makes it report cookies as disabled.
My question: is there a way to handle the cookie prompt without changing the response from the test URL site?
Note: this script will be automating a process over hundreds of remote computers, so an unhandled popup will simply prevent the script from running.
I found the answer in another SO question: Using Invoke-WebRequest in PowerShell 3.0 spawns a Windows Security Warning.
Add the parameter -UseBasicParsing.
TechNet (https://technet.microsoft.com/en-us/library/hh849901.aspx) notes that the parameter stops DOM processing, and adds the caveat: "This parameter is required when Internet Explorer is not installed on the computers, such as on a Server Core installation of a Windows Server operating system."
So, your mileage may vary.
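Applied to the WebClient snippet above, a rough Invoke-WebRequest equivalent would be ($url as in the question):

```powershell
# -UseBasicParsing skips the IE DOM (the source of the cookie prompt);
# cookies are still tracked automatically in the session variable
$result = Invoke-WebRequest -Uri $url -SessionVariable MySession -UseBasicParsing
$result.Content   # raw HTML; note there is no ParsedHtml property with basic parsing
```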