PowerShell: save download from website without user input

I need to download a file from a website every hour. Right now I have...
$url = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$path = "C:\MISO.csv"
# param([string]$url, [string]$path)
if(!(Split-Path -parent $path) -or !(Test-Path -pathType Container (Split-Path -parent $path))) {
$path = Join-Path $pwd (Split-Path -leaf $path)
}
"Downloading [$url]`nSaving at [$path]"
$client = new-object System.Net.WebClient
$client.DownloadFile($url, $path)
#$client.DownloadData($url, $path)
$path
PAUSE
This gets the response from the website but prompts me to open or save the file. I just want it to save the file. Thank you for any help.

Try this:
$url = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$data = Invoke-WebRequest -Uri $url
$path = "C:\MISO.csv"
$data.Content | Out-File $path
Of course there is no error handling or checking here. It assumes that Invoke-WebRequest completes successfully and that your endpoint returns raw CSV data.
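If you want the missing checks, here is a minimal sketch; the try/catch structure and the -UseBasicParsing switch are my additions, not part of the answer above:
$url  = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$path = "C:\MISO.csv"
try {
    # -OutFile streams the response body straight to disk, avoiding Out-File encoding quirks
    Invoke-WebRequest -Uri $url -OutFile $path -UseBasicParsing -ErrorAction Stop
    "Saved to $path"
}
catch {
    Write-Warning "Download failed: $($_.Exception.Message)"
}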

Use wget in my opinion (in Windows PowerShell, wget is an alias for Invoke-WebRequest).
The following works for me.
$url = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$path = "d:\my_cs_folder\MISO.csv"
wget -OutFile $path -Uri $url
I removed the checks for readability, and changed the destination file path too. Be advised that you may get a permission-denied error when writing to the system drive's root.

I've been attempting something similar for a while now, and it's very annoying: most approaches will not work correctly, and sending keystrokes to the Save button requires the window to be active, at least as far as I can see.
Because of that, it will not work while the computer is in use. I still need to figure out a way past this, but for now I've gotten it to work like this.
Add-Type -AssemblyName System.Windows.Forms  # needed for [Windows.Forms.SendKeys]
$IE = New-Object -ComObject InternetExplorer.Application
$IE.Visible = $true
$IE.Navigate($url)
$IEProc = Get-Process | Where-Object { $_.MainWindowHandle -eq $IE.HWND }
$WS = New-Object -ComObject WScript.Shell
$WS.AppActivate($IEProc.Id)
Start-Sleep 2
[Windows.Forms.SendKeys]::SendWait('%{s}')
I'm aware it's very ugly, but it does the job :(
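For the "every hour" part of the question, a scheduled task is the usual approach. A minimal sketch, assuming the chosen download code is saved as C:\Scripts\Get-MisoCsv.ps1 (a hypothetical path; the ScheduledTasks cmdlets exist on Windows 8/Server 2012 and later):
$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Get-MisoCsv.ps1'
# start now and repeat every hour
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 1)
Register-ScheduledTask -TaskName 'Download MISO CSV' -Action $action -Trigger $trigger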

Related

PowerShell DownloadFile for File name Changing every day

The code below downloads a file from this website:
https://www.mcafee.com/enterprise/en-us/downloads/security-updates.html
However, the file name changes every day: for example, 'mediumepo4981dat.zip' today and 'mediumepo4980dat.zip' yesterday.
How can I create a script that I can run daily and that picks up the new file name dynamically?
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
$uri = "https://download.nai.com/products/datfiles/med/mediumepo4981dat.zip"
$filename = "C:\DownloadTest\mediumepo4981dat.zip"
$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
$wc.DownloadFile($uri, $filename)
Please let me know if you need any further clarification.
Note: Looking for a solution without Invoke-WebRequest as that does not work in my current environment.
It seems we can assume that the last file listed is always the newest; I can't confirm this will always be true, and it may change at some point. For the time being, this is how you can get the last file:
$uri = [uri] "https://download.nai.com/products/datfiles/med"
$wr = Invoke-WebRequest $uri
$file = $wr.ParsedHtml.getElementsByTagName('a') |
    Select-Object -Last 1 -ExpandProperty TextContent
$toDownload = [uri]::new($uri, $file)
Now you can combine this with the rest of your script:
$filename = Join-Path 'C:\DownloadTest\' -ChildPath $file
$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
$wc.DownloadFile($toDownload.AbsoluteUri, $filename)
Note that this will work only in Windows PowerShell; the ParsedHtml property relies on the Internet Explorer COM object, which PowerShell Core does not have.
$uri = [uri] "https://download.nai.com/products/datfiles/med"
$wr = Invoke-WebRequest $uri
$file = $wr.ParsedHtml.getElementsByTagName('a') |
Select-Object -Last 1 -ExpandProperty TextContent
$toDownload = [uri]::new($uri, $file)
is missing trailing '/' in $uri.
It should be
$uri = [uri] "https://download.nai.com/products/datfiles/med/"

How do you get table data from a website after you login using powershell?

My company wants me to grab data from their internal website, organize it, and send it to a database. The data is displayed in tables that you navigate to within the site. I want to pull the fields into a file or into memory for further processing.
So far, I can log into the site in PowerShell by getting the submit login button's ID and passing my username/password. I'm able to use the Navigate method to change to the appropriate page within the site. However, running Invoke-WebRequest on the new page, as well as using Net.WebClient on it, returns the information found on the original site's login screen (I know, because nothing from the table makes it into the returned values, regardless of the commands I use). The commented code is what I've tried previously.
Here is the code, minus the values of my ID/password/site link:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$ie = New-Object -ComObject 'internetExplorer.Application'
$ie.Visible= $true # Make it visible
$username="myid"
$password="mypw"
$ie.Navigate("https://webpage.com/index.jsp")
While ($ie.Busy -eq $true) {Start-Sleep -Seconds 3;}
$usernamefield = $ie.document.getElementByID('login')
$usernamefield.value = "$username"
$passwordfield = $ie.document.getElementByID('password')
$passwordfield.value = "$password"
$Link = $ie.document.getElementByID('SubmitLogin')
$Link.click()
$url = "https://webpage.com/home.pa#%5BT1%2CM181%5D"
$ie.Navigate($url)
While ($ie.Busy -eq $true) {Start-Sleep -Seconds 3;}
$doc = $ie.document
$web = New-Object Net.WebClient
$web.DownloadString($url)
#$r = Invoke-WebRequest $url
#$r.Forms.fields | get-member
#$InnerText = $r.AllElements |
# Where-Object {$_.tagName -ne "TD" -and $_.innerText -ne $null} |
# Select -ExpandProperty innerText
#write-host $InnerText
#$r.AllElements|Where-Object {$_.InnerHtml -like "*=*"}
#$doc = $ie.Document
#$doc.getElementByID("ext-element-7") | % {
# if ($_.id -ne $null){
# write-host $_.id
# }
#}
$ie.Quit()
I obviously don't have your page and can't ensure that the body of the POST from signing in contains the fields login and password, so that will require some trial and error from you. As a mini-example: if you open up your browser's dev-tools network tab and filter by POST, you can observe how your login page signs you in. When I open reddit to sign in, it sends a POST to https://www.reddit.com/login with a body containing username and password keys/values (both plaintext). This action sets up my browser session to persist my login.
Here's a code example that uses the HtmlAgilityPack library to interact with the resulting page as if it were XML.
Enabling TLS1.2:
[System.Net.ServicePointManager]::SecurityProtocol =
[System.Net.ServicePointManager]::SecurityProtocol -bor [System.Net.SecurityProtocolType]::Tls12
Setting up your web session:
$iwrParams = @{
    'Uri'             = 'https://webpage.com/index.jsp'
    'Method'          = 'POST'
    'Body'            = @{
        'login'    = $username
        'password' = $password
    }
    'SessionVariable' = 'session'
    # avoids cases where IE has not been opened
    'UseBasicParsing' = $true
}
# don't care about the response - only here to initialize the session
$null = Invoke-WebRequest @iwrParams
Getting the protected page content:
$iwrParams = @{
    'Uri'             = 'https://webpage.com/home.pa#%5BT1%2CM181%5D'
    'WebSession'      = $session
    'UseBasicParsing' = $true
}
$output = (Invoke-WebRequest @iwrParams).Content
Downloading/adding HtmlAgility:
if (-not (Test-Path -Path "$PSScriptRoot\HtmlAgilityPack.dll" -PathType Leaf))
{
    Invoke-WebRequest -Uri https://www.nuget.org/api/v2/package/HtmlAgilityPack -OutFile "$PSScriptRoot\html.zip"
    Expand-Archive -Path "$PSScriptRoot\html.zip" -DestinationPath "$PSScriptRoot\html" -Force
    Copy-Item -Path "$PSScriptRoot\html\lib\netstandard2.0\HtmlAgilityPack.dll" -Destination "$PSScriptRoot\"
    Remove-Item -Path "$PSScriptRoot\html", "$PSScriptRoot\html.zip" -Recurse -Force
}
Add-Type -Path "$PSScriptRoot\HtmlAgilityPack.dll"
$html = [HtmlAgilityPack.HtmlDocument]::new()
Loading/parsing your page content:
$html.LoadHtml($output)
# do stuff with output.
$html.DocumentNode.SelectNodes('//*/text()').Text.Where{$PSItem -like '*=*'}
Footnote
I assumed in the code that you're executing from a script, where $PSScriptRoot will be populated. If it's being run interactively, you can use the $pwd automatic variable instead (a carry-over from *nix, print working directory). This code requires PSv5+.
After some serious effort, I managed to get the pages to work correctly. It turns out I wasn't waiting for everything to load; once I had that, I eventually found the correct tag/name to make everything work.
Assuming the code in the original post is correct up to $ie.Navigate($url):
$ie.Navigate($url)
While ($ie.Busy -eq $true) {Start-Sleep -Seconds 3;}
$r = Invoke-WebRequest $url
$doc = $ie.document
$j = ($doc.getElementsByTagName("body") | Where {$_.className -eq 'thefullclassname found in the quotes of <body class="" of the area you wanted'}).innerText
write-host $j
This gave me the output of a very annoyingly built table that isn't a real "table", and has the first row/column on its own, so formatting the output into an easy-to-use version will be the next hassle. At least I got everything on the page that had the text I needed... so, progress!
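If the text comes back newline-delimited, something like this can start structuring it. A sketch only; the two-plus-space column delimiter and the output path are assumptions about your data:
# split the body text into non-empty lines, then into columns on runs of 2+ spaces
$rows = $j -split "`r?`n" | Where-Object { $_.Trim() } | ForEach-Object {
    ($_ -split '\s{2,}') -join ','
}
$rows | Set-Content 'C:\temp\table.csv'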

FTPS Upload in Powershell

I'm in the process of learning PowerShell and am working on a little script that will upload a group of files to an FTPS server nightly. The files are located on a network share, in a sub-directory containing the date in its name. The files themselves all begin with the same string, let's say "JONES_". I have this script working for FTP, but I don't quite get what I need to do to make it work for FTPS:
# Set yesterday's date (since uploads will happen at 2am)
$YDate = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')
# Create log file
$Logfile = "C:\powershell\$YDate.log"
Function LogWrite
{
    Param ([string]$logstring)
    Add-Content $Logfile -Value $logstring
}
# Find directory w/ yesterday's date in name
$YesterdayFolder = Get-ChildItem -Path "\\network\storage\location" | Where-Object {$_.FullName.Contains($YDate)}
If ($YesterdayFolder) {
    # we specify the directory where all files that we want to upload are contained
    $Dir = $YesterdayFolder
    # ftp server (note the trailing slash, so the file name appends cleanly)
    $ftp = "ftp://ftps.site.com/"
    $user = "USERNAME"
    $pass = "PASSWORD"
    $webclient = New-Object System.Net.WebClient
    $webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
    $FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") | Where-Object {$_.Name.StartsWith("JONES","CurrentCultureIgnoreCase")}
    foreach ($item in ($FilesToUpload))
    {
        LogWrite "Uploading file: $YesterdayFolder\Report\$item"
        $uri = New-Object System.Uri($ftp + $item.Name)
        $webclient.UploadFile($uri, $item.FullName)
    }
} Else {
    LogWrite "No files to upload"
}
I'd rather not have to deal with a 3rd party software solution, if at all possible.
Using psftp didn't work for me. I couldn't get it to connect to the FTP over SSL. I ended up (reluctantly?) using WinSCP with this code:
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USER:PASS#ftps.hostname.com:21/directory/ -explicitssl" "put """"' + $Item.FullName + '""""" "exit"'
Invoke-Expression $PutCommand
In the foreach loop.
I'm not sure if you would consider this as "3rd party software" or not, but you can run PSFTP from within Powershell. Here is an example of how you could do that (source):
$outfile = "$($YesterdayFolder.FullName)\Report\$($item.Name)"
"rm $outfile`nput $outfile`nbye" | Out-File batch.psftp -Force -Encoding ASCII
$user = "USERNAME"
$pass = "PASSWORD"
& .\psftp.exe -l $user -pw $pass $ftp -b batch.psftp -be

Script which choose latest file and sends it via FTP

I want this script to choose the latest file from a folder and then send it via ftp to the server.
I think it is choosing the file correctly, because there is a new file on the FTP server after running the script. However, it crashes constantly, showing:
uploading .....
uploading .....
uploading .....
$Dir="C:/log1"
$ftp = "ftpftpftp"
$user = "useruseruser"
$pass = "passpasspass"
$latest = Get-ChildItem -Path $Dir | Sort-Object LastAccessTime -Descending | Select-Object -First 1
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
for($latest){
"Uploading $latest..."
$uri = New-Object System.Uri($ftp+$latest.Name)
$webclient.UploadFile($uri, $latest.FullName)
}
I think you are using the wrong code block by accident. Currently you have created an infinite loop, as you have no condition for how the for will exit.
A simple example of such a loop would be
for(){"Hello? Is it me you are looking for?"}
It should be structured like this
for (initialization; condition; repeat){code block}
an example would be
for($index =1; $index -lt 6;$index++){$index}
There is no need for that code block at all, as long as $Dir is not empty. What you can do for a little error prevention is if($latest){ }, which will only run the block if $latest contains a file (in this code structure).
if($latest){
"Uploading $latest..."
$uri = New-Object System.Uri($ftp+$latest.Name)
$webclient.UploadFile($uri, $latest.FullName)
}
Your sample output does not have a file name in it, so I suspect your $Dir contains no files?

Upload files with FTP using PowerShell

I want to use PowerShell to transfer files with FTP to an anonymous FTP server. I do not want to use any extra packages. How?
I am not sure you can make the script 100% bullet-proof against hanging or crashing, as there are things outside your control (what if the server loses power mid-upload?), but this should provide a solid foundation to get you started:
# create the FtpWebRequest and configure it
$ftp = [System.Net.FtpWebRequest]::Create("ftp://localhost/me.png")
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = New-Object System.Net.NetworkCredential("anonymous","anonymous@localhost")
$ftp.UseBinary = $true
$ftp.UsePassive = $true
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("C:\me.png")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# be sure to clean up after ourselves
$rs.Close()
$rs.Dispose()
There are some other ways too. I have used the following script:
$File = "D:\Dev\somefilename.zip";
$ftp = "ftp://username:password#example.com/pub/incoming/somefilename.zip";
Write-Host -Object "ftp url: $ftp";
$webclient = New-Object -TypeName System.Net.WebClient;
$uri = New-Object -TypeName System.Uri -ArgumentList $ftp;
Write-Host -Object "Uploading $File...";
$webclient.UploadFile($uri, $File);
And you could run a script against the Windows FTP command-line utility using the following command:
ftp -s:script.txt
(Check out this article)
The following question on SO also answers this: How to script FTP upload and download?
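For reference, the script passed to ftp -s: is just the commands you would otherwise type interactively. A minimal sketch (all values are placeholders; -n suppresses auto-login so the user line can supply credentials):
# write the command script, then replay it non-interactively
@"
open ftp.example.com
user username password
binary
put C:\local\path\file.zip
quit
"@ | Set-Content -Path script.txt -Encoding ASCII
ftp -n -s:script.txt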
I'm not gonna claim that this is more elegant than the highest-voted solution...but this is cool (well, at least in my mind LOL) in its own way:
$server = "ftp.lolcats.com"
$filelist = "file1.txt file2.txt"
"open $server
user $user $password
binary
cd $dir
" +
($filelist.split(' ') | %{ "put ""$_""`n" }) | ftp -i -in
As you can see, it uses that dinky built-in windows FTP client. Much shorter and straightforward, too. Yes, I've actually used this and it works!
Easiest way
The most trivial way to upload a binary file to an FTP server using PowerShell is using WebClient.UploadFile:
$client = New-Object System.Net.WebClient
$client.Credentials =
    New-Object System.Net.NetworkCredential("username", "password")
$client.UploadFile(
    "ftp://ftp.example.com/remote/path/file.zip", "C:\local\path\file.zip")
Advanced options
If you need greater control than WebClient offers (like TLS/SSL encryption, etc.), use FtpWebRequest. An easy way is to just copy a FileStream to the FTP stream using Stream.CopyTo:
$request = [Net.WebRequest]::Create("ftp://ftp.example.com/remote/path/file.zip")
$request.Credentials =
    New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$fileStream = [System.IO.File]::OpenRead("C:\local\path\file.zip")
$ftpStream = $request.GetRequestStream()
$fileStream.CopyTo($ftpStream)
$ftpStream.Dispose()
$fileStream.Dispose()
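For the TLS/SSL case mentioned above, FtpWebRequest exposes an EnableSsl property; in the snippet above it would have to be set before GetRequestStream is called (a one-line sketch):
# upgrade the connection to explicit FTPS; set this before calling GetRequestStream()
$request.EnableSsl = $true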
Progress monitoring
If you need to monitor upload progress, you have to copy the contents in chunks yourself:
$request = [Net.WebRequest]::Create("ftp://ftp.example.com/remote/path/file.zip")
$request.Credentials =
    New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$fileStream = [System.IO.File]::OpenRead("C:\local\path\file.zip")
$ftpStream = $request.GetRequestStream()
$buffer = New-Object Byte[] 10240
while (($read = $fileStream.Read($buffer, 0, $buffer.Length)) -gt 0)
{
    $ftpStream.Write($buffer, 0, $read)
    $pct = ($fileStream.Position / $fileStream.Length)
    Write-Progress `
        -Activity "Uploading" -Status ("{0:P0} complete:" -f $pct) `
        -PercentComplete ($pct * 100)
}
$ftpStream.Dispose()
$fileStream.Dispose()
Uploading folder
If you want to upload all files from a folder, see
PowerShell Script to upload an entire folder to FTP
I recently wrote several PowerShell functions for communicating with FTP; see https://github.com/AstralisSomnium/PowerShell-No-Library-Just-Functions/blob/master/FTPModule.ps1. With the second function below, you can send a whole local folder to FTP. The module even has functions for removing/adding/reading folders and files recursively.
#Add-FtpFile -ftpFilePath "ftp://myHost.com/folder/somewhere/uploaded.txt" -localFile "C:\temp\file.txt" -userName "User" -password "pw"
function Add-FtpFile($ftpFilePath, $localFile, $username, $password) {
    $ftprequest = New-FtpRequest -sourceUri $ftpFilePath -method ([System.Net.WebRequestMethods+Ftp]::UploadFile) -username $username -password $password
    Write-Host "$($ftpRequest.Method) for '$($ftpRequest.RequestUri)' complete"
    $content = [System.IO.File]::ReadAllBytes($localFile)
    $ftprequest.ContentLength = $content.Length
    $requestStream = $ftprequest.GetRequestStream()
    $requestStream.Write($content, 0, $content.Length)
    $requestStream.Close()
    $requestStream.Dispose()
}
#Add-FtpFolderWithFiles -sourceFolder "C:\temp\" -destinationFolder "ftp://myHost.com/folder/somewhere/" -userName "User" -password "pw"
function Add-FtpFolderWithFiles($sourceFolder, $destinationFolder, $userName, $password) {
    Add-FtpDirectory $destinationFolder $userName $password
    $files = Get-ChildItem $sourceFolder -File
    foreach ($file in $files) {
        $uploadUrl = "$destinationFolder/$($file.Name)"
        Add-FtpFile -ftpFilePath $uploadUrl -localFile $file.FullName -username $userName -password $password
    }
}
#Add-FtpFolderWithFilesRecursive -sourceFolder "C:\temp\" -destinationFolder "ftp://myHost.com/folder/" -userName "User" -password "pw"
function Add-FtpFolderWithFilesRecursive($sourceFolder, $destinationFolder, $userName, $password) {
    Add-FtpFolderWithFiles -sourceFolder $sourceFolder -destinationFolder $destinationFolder -userName $userName -password $password
    $subDirectories = Get-ChildItem $sourceFolder -Directory
    $fromUri = New-Object System.Uri($sourceFolder)
    foreach ($subDirectory in $subDirectories) {
        $toUri = New-Object System.Uri($subDirectory.FullName)
        $relativeUrl = $fromUri.MakeRelativeUri($toUri)
        $relativePath = [System.Uri]::UnescapeDataString($relativeUrl.ToString())
        $lastFolder = $relativePath.Substring($relativePath.LastIndexOf("/") + 1)
        Add-FtpFolderWithFilesRecursive -sourceFolder $subDirectory.FullName -destinationFolder "$destinationFolder/$lastFolder" -userName $userName -password $password
    }
}
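Note that these functions call New-FtpRequest and Add-FtpDirectory, which live in the linked module. For context, a minimal sketch of what New-FtpRequest presumably does, inferred from how it is used above (not the module's actual code):
function New-FtpRequest($sourceUri, $method, $username, $password) {
    # build a plain FtpWebRequest with the given method and credentials
    $request = [System.Net.FtpWebRequest]::Create($sourceUri)
    $request.Method = $method
    $request.Credentials = New-Object System.Net.NetworkCredential($username, $password)
    $request.UseBinary = $true
    return $request
}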
Here's my super cool version BECAUSE IT HAS A PROGRESS BAR :-)
Which is a completely useless feature, I know, but it still looks cool \m/ \m/
$webclient = New-Object System.Net.WebClient
Register-ObjectEvent -InputObject $webclient -EventName "UploadProgressChanged" -Action { Write-Progress -Activity "Upload progress..." -Status "Uploading" -PercentComplete $EventArgs.ProgressPercentage } > $null
$File = "filename.zip"
$ftp = "ftp://user:password#server/filename.zip"
$uri = New-Object System.Uri($ftp)
try {
    $webclient.UploadFileAsync($uri, $File)
}
catch [Net.WebException] {
    Write-Host $_.Exception.ToString() -ForegroundColor red
}
while ($webclient.IsBusy) { continue }
PS: it helps a lot when I'm wondering "did it stop working, or is it just my slow ADSL connection?"
You can simply handle file uploads through PowerShell, like this. The complete project is available on GitHub: https://github.com/edouardkombo/PowerShellFtp
#Directory where to find pictures to upload
$Dir= 'c:\fff\medias\'
#Directory where to save uploaded pictures
$saveDir = 'c:\fff\save\'
#ftp server params
$ftp = 'ftp://10.0.1.11:21/'
$user = 'user'
$pass = 'pass'
#Connect to ftp webclient
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
#Initialize var for infinite loop
$i = 0
#Infinite loop
while ($i -eq 0) {
    #Pause 1 second before continuing
    Start-Sleep -sec 1
    #Search for pictures in directory
    foreach ($item in (dir $Dir "*.jpg")) {
        #Set default network status to 1
        $onNetwork = "1"
        #Get picture creation dateTime...
        $pictureDateTime = (Get-ChildItem $item.FullName).CreationTime
        #...and compute its age in seconds (the original FileTime subtraction yielded
        #100-nanosecond ticks, so comparing it to 2 did not give a 2-second delay)
        $pictureLifeTime = ((Get-Date) - $pictureDateTime).TotalSeconds
        #We only treat pictures that are fully written to disk,
        #so we allow a 2-second delay to ensure even big pictures have been fully written
        if ($pictureLifeTime -gt 2) {
            #If upload fails, we set network status to 0
            try {
                $uri = New-Object System.Uri($ftp + $item.Name)
                $webclient.UploadFile($uri, $item.FullName)
            } catch [Exception] {
                $onNetwork = "0"
                Write-Host $_.Exception.Message
            }
            #If upload succeeded, we do further actions
            if ($onNetwork -eq "1") {
                "Copying $item..."
                Copy-Item -Path $item.FullName -Destination $saveDir$item
                "Deleting $item..."
                Remove-Item $item.FullName
            }
        }
    }
}
You can use this function :
function SendByFTP {
    param (
        $userFTP = "anonymous",
        $passFTP = "anonymous",
        [Parameter(Mandatory=$True)]$serverFTP,
        [Parameter(Mandatory=$True)]$localFile,
        [Parameter(Mandatory=$True)]$remotePath
    )
    if (Test-Path $localFile) {
        $remoteFile = $localFile.Split("\")[-1]
        # build the remote path with '/' (Join-Path would insert a backslash, which FTP URLs don't use)
        $remotePath = "$($remotePath.TrimEnd('/'))/$remoteFile"
        $ftpAddr = "ftp://${userFTP}:${passFTP}@${serverFTP}/$remotePath"
        $browser = New-Object System.Net.WebClient
        $url = New-Object System.Uri($ftpAddr)
        $browser.UploadFile($url, $localFile)
    }
    else {
        Return "Unable to find $localFile"
    }
}
This function sends the specified file by FTP.
You must call the function with these parameters:
userFTP = "anonymous" by default or your username
passFTP = "anonymous" by default or your password
serverFTP = IP address of the FTP server
localFile = File to send
remotePath = the path on the FTP server
For example :
SendByFTP -userFTP "USERNAME" -passFTP "PASSWORD" -serverFTP "MYSERVER" -localFile "toto.zip" -remotePath "path/on/the/FTP/"
Goyuix's solution works great, but as presented it gives me this error: "The requested FTP command is not supported when using HTTP proxy."
Adding this line after $ftp.UsePassive = $true fixed the problem for me:
$ftp.Proxy = $null;
Simple solution if you can install curl.
curl.exe -p --insecure "ftp://<ftp_server>" --user "user:password" -T "local_file_full_path"
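If the server requires explicit FTPS, curl's --ssl-reqd flag demands TLS on the connection (the other values are placeholders, as above):
curl.exe --ssl-reqd --insecure "ftp://<ftp_server>" --user "user:password" -T "local_file_full_path"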