I am using PowerShell to do some monitoring, and I want to check whether an application's .jnlp exists on a website and is available for download.
I have the link to the .jnlp, and so far I'm downloading the file with .navigate():
$ie = new-object -com "InternetExplorer.Application"
Try {
$ie.navigate("http://bla.com/testApp.jnlp")
} Catch {
#$_
$ErrorMessage = $_.Exception.Message
}
I tried to catch an exception by giving an invalid filename, but it doesn't work.
I also thought of downloading the app and then deleting the file afterwards to check that it actually exists, but that would be too slow since I have many .jnlp files to check.
Is there a simpler, more elegant way to do this? I want to avoid downloading each file I want to test.
How about using the WebClient class from .NET? Getting data is simple enough, like so:
$webclient = new-object System.Net.WebClient
try {
# Download data as string and store the result into $data
$data = $webclient.DownloadString("http://www.google.com/")
} catch [Net.WebException] {
# A 404 or some other error occurred, process the exception here
$ex = $_
$ex.Exception
}
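If you want the actual HTTP status code rather than the whole exception, it is available on the exception's response object. A small sketch (note that Response is $null for DNS or connection-level failures, so guard before reading it):
$status = $null
if ($ex.Exception.Response -ne $null) {
    # HTTP-level errors (404, 500, ...) carry the server's response
    $status = [int]$ex.Exception.Response.StatusCode
}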
If you're using PowerShell 3.0 or higher, you can use Invoke-WebRequest to see if a page exists by issuing an HTTP HEAD request and checking the status code.
$Result = Invoke-WebRequest -Uri 'http://bla.com/testApp.jnlp' -Method Head
if ($Result.StatusCode -ne 200){
# Something other than "OK" was returned.
}
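One caveat: on Windows PowerShell, Invoke-WebRequest throws a WebException for 4xx/5xx responses instead of returning a result, so the check above only ever sees successful requests. To treat a 404 as data, wrap the call in try/catch, something like:
try {
    $Result = Invoke-WebRequest -Uri 'http://bla.com/testApp.jnlp' -Method Head
    $StatusCode = $Result.StatusCode
} catch [System.Net.WebException] {
    # Response is $null for connection failures, so guard before reading it
    $StatusCode = if ($_.Exception.Response) { [int]$_.Exception.Response.StatusCode }
}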
This is doable with System.Net.WebClient as well but it's a bit more effort.
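For instance, here is a minimal sketch with System.Net.HttpWebRequest (the class WebClient uses under the hood) that issues the same HEAD request and never downloads the body:
$request = [System.Net.WebRequest]::Create('http://bla.com/testApp.jnlp')
$request.Method = 'HEAD'
try {
    $response = $request.GetResponse()
    $StatusCode = [int]$response.StatusCode  # 200 means the .jnlp is there
    $response.Close()                        # close it, or later requests may stall
} catch [System.Net.WebException] {
    $StatusCode = if ($_.Exception.Response) { [int]$_.Exception.Response.StatusCode }
}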
I have been looking for a way to know the status of my FTP files. A few log files are uploaded to my FTP server every 15 minutes, but sometimes an upload fails, and I just want an alert whenever a file fails to upload.
I have tried the following code:
function update {
$ftprequest = [System.Net.FtpWebRequest]::Create("ftp://ftpsite.com/Script_Apps/install_firefox.exe")
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $ftprequest.GetResponse().StatusDescription
$tokens = $response.Split(" ")
$code = $tokens[0]
# The timestamp reply is "213 yyyyMMddHHmmss"; parse it so it can be compared to a DateTime
$remotetime = [DateTime]::ParseExact($tokens[1], "yyyyMMddHHmmss", $null)
$localfile = (Get-Item "$dir\Apps\install_firefox.exe").LastWriteTimeUtc
if ($remotetime -gt $localfile) {
write-host "Updating Firefox Installer..."
$File = "$dir\Apps\install_firefox.exe"
$ftp = "ftp://ftpsite.com/Script_Apps/install_firefox.exe"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
$webclient.DownloadFile($uri, $File)
"Updated Firefox" >> $global:logfile
mainmenu
}
else {
Write-Host "Local Copy is Newer."
sleep 3
mainmenu
}
}
I'm going to assume these lines deal with downloading/uploading files for your FTP site.
$ftp = "ftp://ftpsite.com/Script_Apps/install_firefox.exe"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
$webclient.DownloadFile($uri, $File)
If so, then we can look out for errors and do something when one occurs with try{}catch{} blocks.
You can put any code inside a try block; when it hits a terminating error, the error is captured as $Error[0], which we can reference in a catch block.
try{
$ftp = "ftp://ftpsite.com/Script_Apps/install_firefox.exe"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
$webclient.DownloadFile($uri, $File)
}
catch{
#Write out an error to the screen
"failed to download file $($ftp). `n--Ran into error $($Error[0].Exception.Message)"
#Todo - send a notification
}
This will give a result like this when it fails to download:
Failed to download file ftp://ftpsite.com/Script_Apps/install_firefox.exe.
--Ran into error Exception calling "DownloadFile" with "2" argument(s):
"Unable to connect to the remote server"
Then you can determine the best way to send a notification to suit your needs. There are ways to send e-mail from PowerShell, or tweets, or even a push message to the Pushbullet app on your phone!
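For example, a minimal e-mail alert for the catch block using the built-in Send-MailMessage cmdlet; the SMTP server and addresses below are placeholders to swap for your own:
Send-MailMessage -SmtpServer 'smtp.example.com' `
    -From 'alerts@example.com' -To 'you@example.com' `
    -Subject "FTP download failed: $ftp" `
    -Body "Ran into error: $($Error[0].Exception.Message)"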
Cheers everyone,
I am getting the weirdest problem, and I need your ideas on how to approach the issue.
So, I have a download script that pulls content off a company intranet using WebClient objects. It requires credentials, and it is working on about 80% of the computers. The script pulls a listing using .DownloadString and then parses it and gets some files using .DownloadFile.
On the machines that won't work, the initial .DownloadString hangs until it appears to run into a timeout, and it returns $null.
User credentials are irrelevant on these machines, meaning a user that works on another machine fails on this one.
The addresses, if entered into a browser, return content.
In code, I try it this way:
$wc = new-object System.Net.WebClient
$wc.Credentials = new-object System.Net.NetworkCredential($user, $pass, $domain)
$old_eap = $ErrorActionPreference
$ErrorActionPreference = "Stop"
try
{
$tmp = $wc.DownloadString($url)
if ([String]::IsNullOrEmpty($tmp))
{
throw "Intranet server did not return directory listing"
}
Return $tmp #the code is actually part of a function...
}
catch
{
write-error $_.Exception.Message
Return $null
}
finally
{
$ErrorActionPreference = $old_eap
}
I have no idea other than looking for changed settings between the different machines. But which settings could be relevant to WebClient behaving like this? Any ideas? I am seriously stuck...
I forgot... to make things a little easier, I am stuck with version 2.0 and we can't update yet. Bummer...
Thanks in advance,
Alex
Maybe try using XMLHTTP as a client. Below is a usage example.
$url = "https://example.com/"
$http = New-Object -ComObject Msxml2.XMLHTTP
$user = "Domain\username"
$pwd = "password"
$utf = [System.Text.Encoding]::UTF8
$http.open("GET", $url, $false, $user, $pwd)
$http.send()
$result = $utf.GetString($http.responseBody)
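Since send() here is synchronous (the third argument to open() is $false), the status is available immediately afterwards; a quick sanity check:
if ($http.status -ne 200) {
    # Anything other than 200 points at an auth or server problem
    Write-Error "Request failed with HTTP status $($http.status) $($http.statusText)"
}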
I have a list of URLs in a text file, and I want to test whether all of them are reachable. I fired the following command in Windows PowerShell, but after displaying the status of the first two requests, the command gets stuck somewhere and never returns. Am I missing something?
cat .\Test.txt | % { [system.Net.WebRequest]::Create("$_").GetResponse().StatusCode }
Text File
http://www.google.com
http://www.yahoo.com
http://www.bing.com
Output:
OK
OK
----> after that it gets stuck.
Use Invoke-WebRequest instead:
$sites = 'http://www.google.com','http://www.yahoo.com','http://www.bing.com'
foreach ($site in $sites) {
Invoke-WebRequest $site
$site
}
From memory: you have to explicitly close the response stream. By default, System.Net allows only two simultaneous connections per host, so if the first two responses are never closed, the third request blocks waiting for a free connection; that is exactly the hang after two URLs you're seeing:
$req = [System.Net.HttpWebRequest]::Create($aRequestUrl);
$response = $null
try
{
$response = $req.GetResponse()
# do something with the response
}
finally
{
# Close the response; otherwise its connection never goes back to
# the pool and the next HttpWebRequest may block
if ($response -ne $null) { $response.Close() }
}
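Applied to the original one-liner, that would look something like this sketch:
Get-Content .\Test.txt | ForEach-Object {
    $response = [System.Net.WebRequest]::Create($_).GetResponse()
    $response.StatusCode
    $response.Close()  # frees the connection so the next request can proceed
}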
I'm using a PowerShell script to automate sending a .txt file to an FTP site. When I execute it in PowerShell, nothing happens; the prompt just reappears, with no message that it was successful. How do I tell if it worked? Here is my script in case it helps.
$localfile = "D:\Export\TESTING.txt"
$remotefile = "/TESTING.txt"
$ftphost = "ftp://ftp.site.com"
$URI = $ftphost + $remotefile
$username="USERNAME"
$password="1234"
function Get-FTPFile ($URI, $localfile, $username, $password) {
    $credentials = New-Object System.Net.NetworkCredential($username, $password)
    $ftp = [System.Net.FtpWebRequest]::Create($URI)
    $ftp.Credentials = $credentials
    $ftp.UseBinary = $true
    $ftp.KeepAlive = $false
    $response = $ftp.GetResponse()
    $responseStream = $response.GetResponseStream()
    $file = New-Object IO.FileStream($localfile, [IO.FileMode]::Create)
    [byte[]]$buffer = New-Object byte[] 1024
    $read = 0
    do {
        $read = $responseStream.Read($buffer, 0, 1024)
        $file.Write($buffer, 0, $read)
    } while ($read -ne 0)
    $file.Close()
    $responseStream.Close()
    $response.Close()
}
Use WebRequestMethods.Ftp.GetFileSize following completion of the upload to confirm that the uploaded file size matches the local file size.
You could also use a try/catch block to check for exceptions during your read/write operations. A lack of exceptions would give you some confidence that the upload was successful (i.e. no news is good news).
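A sketch of that size check, reusing $URI and $credentials from your script; for GetFileSize the server reports the remote size in the response's ContentLength:
$sizeRequest = [System.Net.FtpWebRequest]::Create($URI)
$sizeRequest.Credentials = $credentials
$sizeRequest.Method = [System.Net.WebRequestMethods+Ftp]::GetFileSize
$sizeResponse = $sizeRequest.GetResponse()
$remoteSize = $sizeResponse.ContentLength  # bytes on the server
$sizeResponse.Close()
if ($remoteSize -eq (Get-Item $localfile).Length) { 'Transfer size matches.' }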
I am using the following code to upload a file using PowerShell 1.0. How can I tell if the upload completed successfully or if there was an error? I need to delete the file if the upload was successful.
What I have tried:
1. The trap statement. Can't seem to get this one to work.
2. Checking the return value of $webclient.UploadFile -- this seems to always be an empty string, success or not.
$File = "D:\Dev\somefilename.zip"
$ftp = "ftp://username:password#example.com/pub/incoming/somefilename.zip"
"ftp url: $ftp"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
"Uploading $File..."
$webclient.UploadFile($uri, $File)
Drop the trap down into a new scope so that you trap on the exception thrown by UploadFile, e.g.:
$succeeded = $true;
& {
trap { $script:succeeded = $false; continue }
$webclient.UploadFile($uri, $File)
}
if ($succeeded) { 'Yay!' } else { 'Doh!' }
You could also try to catch a specific exception like so:
trap [System.Net.WebException] { ... }
The UploadFile method is synchronous. If it completes without throwing an exception, you have had success. You should get a trappable WebException if it fails.
http://msdn.microsoft.com/en-us/library/36s52zhs.aspx
I'll leave out details about error trapping, as it appears you are familiar with it already.
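Tying it back to your goal of deleting on success: once the trap-based check above has run, remove the local file only when the flag is still set, e.g.:
if ($succeeded) {
    # UploadFile returned without throwing, so the transfer completed
    Remove-Item $File
}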