How to download a file from a non-static link using a .bat script - powershell

I am trying to download the "Total counts by date for all King County" Excel file using a script that will later be run via Task Scheduler. I am stuck because the link is not static and its naming convention will most likely change in the coming months.
Here's the code that I've written so far:
@echo off
:: Script to download COVID-19 data
echo "Downloading Total counts by date for all King County"
powershell -Command "Invoke-WebRequest https://www.kingcounty.gov/depts/health/covid-19/data/~/media/depts/health/communicable-diseases/documents/C19/data/covid-data-daily-counts-sept-14.ashx -OutFile CovidData.xlsx"
echo "Download has been successful!"
cls
pause
I was wondering if there's a way to add a wildcard like "*" in the Invoke-WebRequest call to ignore the "sept-14" part of the link.
Link: https://www.kingcounty.gov/depts/health/covid-19/data/daily-summary.aspx
Link that needs a script to auto download with task manager (Total counts by date for all King County): https://www.kingcounty.gov/depts/health/covid-19/data/~/media/depts/health/communicable-diseases/documents/C19/data/covid-data-daily-counts-sept-14.ashx

I created and tested this PowerShell script on my side with Windows PowerShell ISE and it works 5/5; I hope it works for you too!
$start_time = Get-Date
$url = "https://www.kingcounty.gov/depts/health/covid-19/data/daily-summary.aspx"
$xlFile = "E:\Test\CovidData.xlsx"
# Fetch the daily-summary page HTML via the XMLHTTP COM object
$http_request = New-Object -ComObject Microsoft.XMLHTTP
$http_request.open('GET', $url, $false)
# Send the request synchronously
$http_request.send()
$Contents = $http_request.ResponseText
# Match every URL on the page that ends in .ashx
$pattern = "([\w\-\.,@?^=%&/~\+#]*[\w\-\@?^=%&/~\+#])(\.ashx)"
$Links = $Contents | Select-String $pattern -AllMatches | ForEach-Object {$_.Matches.Value} | Sort-Object -Unique
# Keep only the daily-counts link, whatever date suffix it carries
$Filter = $Links -match "data-daily"
$MyUrlFile = "https://www.kingcounty.gov/depts/health/covid-19/data/" + $Filter
$MyUrlFile
Invoke-WebRequest "$MyUrlFile" -OutFile $xlFile
if (Test-Path $xlFile) { Start-Process $xlFile }
Write-Output "Script running time: $((Get-Date).Subtract($start_time).Seconds) second(s)"
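If you would rather not drive a COM object and a hand-rolled regex, Invoke-WebRequest already parses the page's anchors into a Links collection you can filter. A minimal alternative sketch of the same idea (the href filter pattern and the relative-link handling are assumptions about the page's markup, so verify them against the live page):
$page = Invoke-WebRequest "https://www.kingcounty.gov/depts/health/covid-19/data/daily-summary.aspx"
# Assumed href pattern for the daily-counts workbook; adjust if the markup differs
$href = ($page.Links | Where-Object { $_.href -match 'covid-data-daily-counts.*\.ashx$' } | Select-Object -First 1).href
# The href may be site-relative, so anchor it to the host if needed
if ($href -notmatch '^https?://') { $href = "https://www.kingcounty.gov/" + $href.TrimStart('/') }
Invoke-WebRequest $href -OutFile "CovidData.xlsx"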

Related

Script to download latest Adobe Reader DC Update

I wrote a script to download the latest version of Adobe MUI DC, but I am not really happy with the parsing. The script starts at https://supportdownloads.adobe.com/new.jsp, does some parsing to get a link to a new site, parses that, and finally extracts the download link.
I am not really sure whether this is the best way of doing it.
$webclient = New-Object System.Net.WebClient
$download_folder = 'E:\Adobe_Acrobat_Reader_DC_MUI\'
$url = 'https://supportdownloads.adobe.com/support/downloads/'
Write-Host "Downloading ...AdobeDC Update"
try {
    If(!(Test-Path $download_folder)){
        New-Item -ItemType Directory -Force -Path "$download_folder"
    }
    # Parse the listing page for the MUI Continuous-track link
    $download_url = $url + ((Invoke-WebRequest $url'new.jsp').Links | where outertext -like '*MUI*Continuous*' | select href).href
    Write-Host $download_url
    # Follow the 'proceed to download' page (strip the HTML-encoded &amp; from the href)
    $download_url = $url + ((Invoke-WebRequest $download_url).Links | where outertext -like '*proceed to download*' | select outertext, href).href.replace("amp;","")
    Write-Host $download_url
    # Grab the final 'download now' link
    $download_url = ((Invoke-WebRequest $download_url).Links | where outertext -like '*download now*' | select outertext, href).href
    Write-Host $download_url
    # Only download if we don't already have this file
    if(!(Test-Path ($download_folder + $download_url.Split('/')[-1]))){
        $webclient.DownloadFile($download_url, $download_folder + $download_url.Split('/')[-1])
    }
} catch {
    Throw($_.Exception)
}
Adobe has an Enterprise Administration Guide that is intended for businesses deploying software to multiple machines (rather than the end user updating their own computer).
For Acrobat DC there is a section for Enterprise installers:
Adobe provides enterprise IT with a download site that contains all available installers. Most admins download the product, updates, and patches from ftp://ftp.adobe.com/pub/adobe/reader/ (or Acrobat).
That FTP link is a much easier way to get the latest version than scraping multiple websites.
You would just need to open the ftp site ftp://ftp.adobe.com/pub/adobe/reader/win/AcrobatDC/, get the directory listing, pick the latest folder, and then download the *MUI installer.
So currently you would be downloading:
ftp://ftp.adobe.com/pub/adobe/reader/win/AcrobatDC/1801120036/AcroRdrDCUpd1801120036_MUI.msp
This technique can be used for pretty much any Adobe product as they are all available: ftp://ftp.adobe.com/pub/adobe/
Out of curiosity on this I wrote a basic script to get the latest file from the ftp site:
$DownloadFolder = "E:\Adobe_Acrobat_Reader_DC_MUI\"
$FTPFolderUrl = "ftp://ftp.adobe.com/pub/adobe/reader/win/AcrobatDC/"
#connect to ftp, and get directory listing
$FTPRequest = [System.Net.FtpWebRequest]::Create("$FTPFolderUrl")
$FTPRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$FTPResponse = $FTPRequest.GetResponse()
$ResponseStream = $FTPResponse.GetResponseStream()
$FTPReader = New-Object System.IO.Streamreader -ArgumentList $ResponseStream
$DirList = $FTPReader.ReadToEnd()
#from Directory Listing get last entry in list, but skip one to avoid the 'misc' dir
$LatestUpdate = $DirList -split '[\r\n]' | Where {$_} | Select -Last 1 -Skip 1
#build file name
$LatestFile = "AcroRdrDCUpd" + $LatestUpdate + "_MUI.msp"
#build download url for latest file
$DownloadURL = "$FTPFolderUrl$LatestUpdate/$LatestFile"
#download file
(New-Object System.Net.WebClient).DownloadFile($DownloadURL, "$DownloadFolder$LatestFile")
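One caveat: ListDirectory returns entries in whatever order the server sends them, so Select -Last 1 -Skip 1 only works while the 'misc' directory happens to land at the end. A slightly more defensive pick (a sketch reusing $DirList from above) filters to the all-digit version folders and sorts them explicitly:
# Keep only all-digit folder names (the version folders) and sort them numerically,
# so the result no longer depends on the server's listing order
$LatestUpdate = $DirList -split '[\r\n]+' | Where-Object { $_ -match '^\d+$' } | Sort-Object { [long]$_ } | Select-Object -Last 1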

powershell save download from website without user input

I need to download a file from a website every hour. Right now I have...
$url = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$path = "C:\MISO.csv"
# param([string]$url, [string]$path)
if(!(Split-Path -parent $path) -or !(Test-Path -pathType Container (Split-Path -parent $path))) {
    $path = Join-Path $pwd (Split-Path -leaf $path)
}
"Downloading [$url]`nSaving at [$path]"
$client = new-object System.Net.WebClient
$client.DownloadFile($url, $path)
#$client.DownloadData($url, $path)
$path
PAUSE
This is getting the response from the website and prompting me to open or save the file. I just want it to save the file. Thank you for any help.
Try this:
$url = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$data = Invoke-WebRequest -Uri $url
$path = "C:\MISO.csv"
$data.content | Out-File $path
Of course there is no error handling or checking; it assumes that Invoke-WebRequest completed successfully and that your endpoint returns raw CSV data.
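If you do want basic checking, here is a minimal sketch: -OutFile streams the response body straight to disk, and try/catch traps a failed request (it still cannot guarantee the endpoint really returned CSV):
try {
    # -ErrorAction Stop turns a failed request into a terminating error the catch block will see
    Invoke-WebRequest -Uri $url -OutFile $path -ErrorAction Stop
    Write-Host "Saved to $path"
} catch {
    Write-Error "Download failed: $_"
}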
In my opinion, use wget.
The following works for me.
$url = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$path = "d:\my_cs_folder\MISO.csv"
wget -OutFile $path -Uri $url
I removed the checks for readability. The destination file path is changed too. Be advised that you may get a permission-denied error when placing your file in the system drive's root. (In Windows PowerShell, wget is simply an alias for Invoke-WebRequest.)
I've been attempting a similar thing for a while now, and it's very annoying: most approaches do not work correctly, and sending keystrokes to the Save button requires the window to be active, at least as far as I can see.
Because of this it will not work while the computer is in use. I still need to figure out a way past that, but for now I've gotten it to work like this:
# SendKeys lives in System.Windows.Forms, which is not loaded by default
Add-Type -AssemblyName System.Windows.Forms
$IE = New-Object -ComObject InternetExplorer.Application
$IE.Visible = $true
$IE.Navigate($url)
# Find the IE process so its window can be brought to the foreground
$IEProc = Get-Process | Where-Object {$_.MainWindowHandle -eq $IE.HWND}
$WS = New-Object -ComObject WScript.Shell
$WS.AppActivate($IEProc.id)
Start-Sleep 2
# Alt+S activates the Save button on the download prompt
[Windows.Forms.SendKeys]::SendWait('%{s}')
I'm aware that it's very ugly, but it does the job :(
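If the goal is just an unattended download, BITS needs no browser window or keystrokes at all. A minimal sketch, assuming the BITS service is available and $url/$path are set as above:
Import-Module BitsTransfer
# The transfer runs in the background service; no window, no user input
Start-BitsTransfer -Source $url -Destination $path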

FTPS Upload in Powershell

I'm in the process of learning Powershell, and am working on a little script that will upload a group of files to an FTPS server nightly. The files are located on a network share in a sub-directory containing the date in the name. The files themselves will all begin with the same string, let's say "JONES_". I have this script working for FTP, but I don't quite get what I need to do to get it to work for FTPS:
# Set yesterday's date (since uploads will happen at 2am)
$YDate = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')
#Create Log File
$Logfile = "C:\powershell\$YDate.log"
Function LogWrite
{
    Param ([string]$logstring)
    Add-Content $Logfile -Value $logstring
}
# Find the directory with yesterday's date in its name
$YesterdayFolder = Get-ChildItem -Path "\\network\storage\location" | Where-Object {$_.FullName.Contains($YDate)}
If ($YesterdayFolder) {
    # The directory that contains all files we want to upload
    $Dir = $YesterdayFolder
    # FTP server
    $ftp = "ftp://ftps.site.com"
    $user = "USERNAME"
    $pass = "PASSWORD"
    $webclient = New-Object System.Net.WebClient
    $webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
    $FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") | Where-Object {$_.Name.StartsWith("JONES","CurrentCultureIgnoreCase")}
    foreach ($item in $FilesToUpload)
    {
        LogWrite "Uploading file: $YesterdayFolder\Report\$item"
        # Note the added '/' between the host and the file name
        $uri = New-Object System.Uri($ftp + "/" + $item.Name)
        $webclient.UploadFile($uri, $item.FullName)
    }
} Else {
    LogWrite "No files to upload"
}
I'd rather not have to deal with a 3rd party software solution, if at all possible.
Using psftp didn't work for me. I couldn't get it to connect to the FTP over SSL. I ended up (reluctantly?) using WinSCP with this code:
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USER:PASS@ftps.hostname.com:21/directory/ -explicitssl" "put """"' + $Item.FullName + '""""" "exit"'
Invoke-Expression $PutCommand
In the foreach loop.
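Rather than shelling the command through Invoke-Expression, WinSCP also ships a .NET assembly that can be driven directly from PowerShell. A sketch, assuming the default install path for WinSCPnet.dll and the same host/credentials as above:
# Load the WinSCP .NET assembly (default install location; adjust the path if needed)
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol  = [WinSCP.Protocol]::Ftp
    FtpSecure = [WinSCP.FtpSecure]::Explicit   # explicit FTPS on port 21
    HostName  = "ftps.hostname.com"
    UserName  = "USER"
    Password  = "PASS"
}
$session = New-Object WinSCP.Session
try {
    $session.Open($sessionOptions)
    # Check() throws if any file in the transfer failed
    $session.PutFiles($Item.FullName, "/directory/").Check()
} finally {
    $session.Dispose()
}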
I'm not sure if you would consider this as "3rd party software" or not, but you can run PSFTP from within Powershell. Here is an example of how you could do that (source):
$outfile = "$($YesterdayFolder.FullName)\Report\$($item.Name)"
"rm $outfile`nput $outfile`nbye" | Out-File batch.psftp -Force -Encoding ASCII
$user = "USERNAME"
$pass = "PASSWORD"
& .\psftp.exe -l $user -pw $pass $ftp -b batch.psftp -be
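If even psftp and WinSCP count as third-party software for your purposes, .NET's FtpWebRequest supports explicit FTPS natively through its EnableSsl property. A minimal sketch using the $ftp, $user, $pass, and $item variables from the question:
# Read the local file and upload it over explicit FTPS
$bytes = [System.IO.File]::ReadAllBytes($item.FullName)
$request = [System.Net.FtpWebRequest]::Create("$ftp/$($item.Name)")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$request.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
$request.EnableSsl = $true   # this is the switch that turns plain FTP into explicit FTPS
$stream = $request.GetRequestStream()
$stream.Write($bytes, 0, $bytes.Length)
$stream.Close()
$request.GetResponse().Close()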

Powershell Script triggered by Task Scheduler not executing Conditional Logic

I have a basic PowerShell (v4) script running on Server 2012. I decided to have Task Scheduler run it. The task runs as a domain and machine administrator account, runs whether the user is logged on or not, and runs with highest privileges. It's configured for Windows Server 2012 R2.
It starts the program "Powershell" with this argument: "-ExecutionPolicy Bypass -file F:\AdMgmt\scripts\newcheckandcopy.ps1"
The problem: it executes some of the script just fine; however, I have an If/Else statement that it ignores. I've tried this with two scripts now, and Task Scheduler always runs them fine but doesn't handle the If/Else logic; it just skips over it and runs everything else.
This is the text of the script (it runs perfectly from the PowerShell console when logged in as the same account that runs the task):
$path = 'Q:\'
$stats = 0
$msg = ''
$days = 1
$hours = 0
$mins = 0
$logtime = Get-Date -uFormat "%y%m%d%H%M"
$files = @(Get-ChildItem -Recurse -Path $path -Include '*.*' | ?{ $_.LastWriteTime -lt (Get-Date).AddDays(-$days).AddHours(-$hours).AddMinutes(-$mins) -and $_.psIsContainer -eq $false })
if ($files -ne $null) {
    $f_names = [System.String]::Join('|',$files)
    $msg = 'Message: ' + $f_names
    $stats = $files.Count
    Send-MailMessage -To "domain@test.com" -From "domain@test.com" -Subject "FileMaker Files older than 1 Day" -Body $msg -SmtpServer "smtp-relay.test.com"
} else {
    $msg = 'Message: 0 files exceed defined age'
    Send-MailMessage -To "domain@test.com" -From "domain@test.com" -Subject "FileMaker Files OK" -Body $msg -SmtpServer "smtp-relay.test.com"
}
Copy-Item Q:\* F:\admgmt\ -Recurse
Add-Content f:\admgmt\logs\checkandcopy.txt $logtime
Write-Host $msg
Write-Host "Statistic: $stats"
I'm guessing your issue is that you are running the script outside of the user context, so the Q:\ drive is not available to it. PowerShell supports UNC paths, so you might be able to substitute the actual path that Q:\ points to.
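One way around that (a sketch; the UNC root below is a placeholder for whatever Q: actually maps to) is to recreate the mapping inside the script, or to skip the drive letter entirely:
# Option 1: map Q: for this session ('\\server\share' is a placeholder)
New-PSDrive -Name Q -PSProvider FileSystem -Root '\\server\share' | Out-Null
# Option 2: use the UNC path directly instead of a drive letter
$path = '\\server\share\'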

How to read a log file that is updated all the time

I'm trying to create a script that will check the contents of a log file using Windows PowerShell. The PowerShell script is checking a log file that is created by another Windows application.
The script is the following:
$smtpServer = "MailServer"
$fdate = Get-Date -Format yyyyMMdd
$fname = "C:\tmp\"+$fdate+".log"
$content = Get-Content $fname -Wait | Where-Object { $_ -match ": exception" } |
ForEach-Object {
    $line = $_
    $msg = New-Object Net.Mail.MailMessage
    $smtp = New-Object Net.Mail.SmtpClient($smtpServer)
    $msg.From = "admin@xyz.com"
    $msg.ReplyTo = "logs@xyz.com"
    $msg.To.Add("logs@xyz.com")
    $msg.Subject = "Exception"
    $msg.Body = $line
    $smtp.Send($msg)
    Write-Host $line
}
The script above is not able to read new additions to the log file. If, for example, I have 10 lines in the log file and I start the script now, it will check only the current 10 lines. If any new lines are added by the application while the script is running, the script is not able to check the newly added lines!
Any recommendations for a proper solution to this issue? Has anyone tried to do something similar using C/C++ or Java?
Thank you
You can also look at these: Multithreading with Jobs in PowerShell and PowerShell Multithreading.
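A sketch of the jobs approach applied to this script: run the Get-Content -Wait tail inside a background job and poll it, so the main script stays responsive (the log path and match pattern are taken from the question):
# Tail the log in a background job; -Wait keeps reading as new lines are appended
$fname = "C:\tmp\$(Get-Date -Format yyyyMMdd).log"
$job = Start-Job -ScriptBlock {
    param($logPath)
    Get-Content $logPath -Wait | Where-Object { $_ -match ": exception" }
} -ArgumentList $fname
while ($true) {
    # Receive-Job drains whatever matching lines the job has produced so far
    Receive-Job $job | ForEach-Object {
        Write-Host $_   # send the mail here, as in the original script
    }
    Start-Sleep -Seconds 5
}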