Through various posts on Stack Overflow and elsewhere I was able to put together a PowerShell script that uploads files over FTP, and it works great. However, I wanted to add a bit more verbosity to it. See the code below:
foreach ($file in $uploadfiles)
{
    # create the full path to the file on the remote server (odd but okay!)
    $ftp_command = $ftp + $file
    # for debugging
    #$ftp_command
    # create a new URI object for the full path of the file
    $uri = New-Object System.URI($ftp_command)
    # for debugging
    #$uri
    # finally do our upload to the remote server - URI object, full path to local file
    #$responseArray = $ftpclient.UploadFile($uri,$file.Fullname)
    $result = $ftpclient.UploadFile($uri,$file.Fullname)
    if ($result) {
        $file.Fullname + " uploaded successfully"
    } else {
        $file.Fullname + " not uploaded successfully"
    }
}
Basically, after the file is uploaded I wanted to check whether it was successful or not. UploadFile is supposed to return a byte array (http://msdn.microsoft.com/en-us/library/36s52zhs(v=vs.80).aspx; "A Byte array containing the body of the response from the resource."). I'm new to PowerShell, so that's probably where my problem is, but for the life of me I can't get anything to come out of $result to test against. Is it possible the server isn't returning anything, or am I just not accessing/setting the byte array correctly? I've tried a variety of different things, but haven't figured anything out yet.
Thanks,
I would not rely on $result in your case. Beginning with PowerShell 2.0 you can use try/catch blocks instead:
try
{
    $ftpclient.UploadFile($uri,$file.Fullname)
    $file.Fullname + " uploaded successfully"
}
catch [System.Net.WebException]
{
    # here $_ gives you the exception details
    $file.Fullname + " failed to upload: " + $_.Exception.Message
}
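If you still want to inspect what the server sends back: WebClient.UploadFile returns the response body as a byte array, and for most FTP servers that body is empty, which is likely why nothing comes out of $result. A minimal sketch (using the same $ftpclient, $uri, and $file as above) that decodes the response only when it is non-empty:

```powershell
$result = $ftpclient.UploadFile($uri, $file.Fullname)
# the return value is the response body as a byte array;
# most FTP servers send an empty body, so Length is usually 0
if ($result.Length -gt 0) {
    [System.Text.Encoding]::ASCII.GetString($result)
}
```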
I'm trying to keep a central list of log file locations so that my log file cleanup script can always grab the most up-to-date list.
$logpaths = (Invoke-WebRequest -UseBasicParsing -Uri 'http://10.7.58.99/logpaths.txt').Content
foreach ($logpath in $logpaths)
{
    "line"
    $logpath
}
My script was sort of working, but I was seeing some strange behavior, so when I broke it down I found that the foreach loop runs only once and dumps the entire contents in one go.
If I download the file to a text file on the local machine, I can then use [System.IO.File]::ReadLines and it steps through perfectly. However, I don't want to download the file each time I run the script, or store it on the local server at all for that matter. How can I step through the content of Invoke-WebRequest line by line?
Based on this example from the .NET docs, you could read the response stream line by line like this, which should also perform better on large files.
$url = 'http://10.7.58.99/logpaths.txt'
& {
    $myHttpWebRequest = [System.Net.WebRequest]::Create($url)
    $myHttpWebResponse = $myHttpWebRequest.GetResponse()
    $receiveStream = $myHttpWebResponse.GetResponseStream()
    $encode = [System.Text.Encoding]::GetEncoding("utf-8")
    $readStream = [System.IO.StreamReader]::new($receiveStream, $encode)
    while (-not $readStream.EndOfStream) {
        $readStream.ReadLine()
    }
    # close the reader before the response so the stream is released cleanly
    $readStream.Close()
    $myHttpWebResponse.Close()
} | foreach {
    $logPath = $_
    # process each line here
}
You might want to turn this into a nice little function. Let me know if you need help.
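Alternatively, if the list is small enough that streaming doesn't matter, you can keep Invoke-WebRequest and split the downloaded Content into lines yourself. A sketch, assuming the same URL as in the question:

```powershell
$content = (Invoke-WebRequest -UseBasicParsing -Uri 'http://10.7.58.99/logpaths.txt').Content
# Content is one big string; -split takes a regex, so this handles CRLF or LF,
# and Where-Object drops any empty lines
foreach ($logpath in ($content -split "`r?`n" | Where-Object { $_ })) {
    $logpath
}
```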
I'm using the following script to download files with PowerShell.
$folder = "c:\temp\"
$userAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:7.0.1) Gecko/20100101 Firefox/7.0.1"
$web = New-Object System.Net.WebClient
$web.Headers.Add("user-agent", $userAgent)
Get-Content "c:\temp\URL_List.txt" |
Foreach-Object {
    "Downloading " + $_
    try {
        $target = join-path $folder ([io.path]::getfilename($_))
        $web.DownloadFile($_, $target)
    } catch {
        $_.Exception.Message
    }
}
The URL_List.txt file has a list of URLs I'm trying to download files from. Here's a sample URL from the list: https://drive.google.com/uc?export=download&id=0B84LPHCa2YmdZmFMV0dsYl9FeTg
If you look at the URL, there's no absolute file name in it, so I'm not sure how to set the target parameter for the WebClient.DownloadFile() method.
Read the Download Files documentation. Also, since you're using PowerShell, the Google Drive REST API module for PowerShell might be worth checking out.
So the question, as I understand it, is how to extract the filename for a Google Drive file without downloading the file first.
This Chilkat page gave me the idea that it should be possible to access properties with a GET request. Chilkat is a paid API, so I thought I'd try to cobble together a method using direct PowerShell commands.
Invoke-WebRequest works but downloads the whole file. We only need the headers.
This site has the core code for that. From there, it's just parsing the Content-Disposition header to extract the "filename" (and not "filename*"):
$WebPath = "https://drive.google.com/uc?export=download&id=0B84LPHCa2YmdZmFMV0dsYl9FeTg"
$request = [System.Net.WebRequest]::Create($WebPath)
$response = $request.GetResponse()
$headers = $response.Headers
# Content-Disposition includes a name-value pair for filename:
$cd = $headers.GetValues("Content-Disposition")
$cd_array = $cd.Split(";")
foreach ($item in $cd_array) {
    # trim the leading space left over from "; " before comparing
    $item = $item.Trim()
    if ($item.StartsWith("filename=")) {
        # Get the string after the equals sign
        $filename = $item.Substring($item.IndexOf("=") + 1)
        # Remove quotation marks, if any
        $filename = $filename.Replace('"','')
    }
}
$response.Close()
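For comparison, here is a shorter sketch using Invoke-WebRequest with -Method Head, which likewise fetches only the headers (assuming the same $WebPath as above and a quoted filename= pair in the header):

```powershell
$head = Invoke-WebRequest -Uri $WebPath -Method Head -UseBasicParsing
$cd = $head.Headers['Content-Disposition']
# e.g. attachment;filename="books.pdf";filename*=UTF-8''books.pdf
# capture only the quoted filename= value, not filename*=
$filename = [regex]::Match($cd, 'filename="([^"]+)"').Groups[1].Value
```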
Got a WhatsApp message that Microsoft is giving away free ebooks from the URL below.
URL : Microsoft Ebook Giveaway
To download all the books in one go, I used the following PowerShell script, which is available at the same URL.
Now my problem is: if I run the PowerShell script as a whole, it doesn't throw any error, and all the books from the URL get downloaded to a single location on my computer.
But if I try to run the script line by line to understand what each statement does, it gives the following error when $bookList = Invoke-WebRequest $downLoadList is executed.
To resolve this error, there are many other posts on Stack Overflow that pass a username and password, but those scripts/solutions are not working at my end.
More than the error itself: why does the script run without any errors/issues when I execute it as a whole, but throw an error when I execute it line by line?
Any input on the nature of execution, or helpful tips on overcoming the error, would be appreciated. Thank you.
Error :
Invoke-WebRequest : (my ip number )
Credentials are missing.
Make sure to specify a domain with your username
This website has been blocked by a cyber security policy
and SecureWeb does not currently support web exceptions
If you have an exception, copy the link below into a new tab
http://ligman.me/2tk1D2V
At line:1 char:13
+ $bookList = Invoke-WebRequest $downLoadList
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest)
[Invoke-WebRequest], WebException
+ FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.
InvokeWebRequestCommand
PowerShell Script :
###############################################################
# Eric Ligmans Amazing Free Microsoft eBook Giveaway
# https://blogs.msdn.microsoft.com/mssmallbiz/2017/07/11/largest-free-microsoft-ebook-giveaway-im-giving-away-millions-of-free-microsoft-ebooks-again-including-windows-10-office-365-office-2016-power-bi-azure-windows-8-1-office-2013-sharepo/
# Link to download list of eBooks
# http://ligman.me/2tk1D2V
# Thanks David Crosby for the template (https://social.technet.microsoft.com/profile/david%20crosby/)
#
# Modified by Robert Cain (http://arcanecode.me)
# Added code to check to see if a book was already downloaded,
# and if so was it the correct file size. If so, the book
# download is skipped. This allows users to simply rerun the
# script if their download process is interrupted.
###############################################################
# Set the folder where you want to save the books to
$dest = "I:\new_microsoft\" # Make sure the file path ends in a \
# Download the source list of books
$downLoadList = "http://ligman.me/2tk1D2V"
$bookList = Invoke-WebRequest $downLoadList
# Convert the list to an array
$books = $bookList.Content.Split("`n")
# Remove the first line - it's not a book
$books = $books[1..($books.Length -1)]
$books # Here's the list
# Get the total number of books we need to download
$bookCount = $($books).Count
# Set a simple counter to let the user know what book
# number we're currently downloading
$currentBook = 0
# As an option, we can have it log progress to a file
$log = $true
if ($log -eq $true)
{
    # Construct a log file name based on the date that
    # we can save progress to
    $dlStart = Get-Date
    $dlStartDate = "$($dlStart.Year)-$($dlStart.Month)-$($dlStart.Day)"
    $dlStartTime = "$($dlStart.Hour)-$($dlStart.Minute)-$($dlStart.Second)"
    $logFile = "$($dest)BookDlLog-$dlStartDate-$dlStartTime.txt"
}
# Download the books
foreach ($book in $books)
{
    # Increment current book number
    $currentBook++
    try
    {
        # Grab the header with the book's full info
        $hdr = Invoke-WebRequest $book -Method Head
        # Get the title of the book from the header, then
        # make it a safe string (remove special characters)
        $title = $hdr.BaseResponse.ResponseUri.Segments[-1]
        $title = [uri]::UnescapeDataString($title)
        # Construct the path to save the file to
        $saveTo = $dest + $title
        # If the file doesn't exist, download it
        if ($(Test-Path $saveTo) -eq $false)
        {
            $msg = "Downloading book $currentBook of $bookCount - $title"
            $msg
            if ($log -eq $true) { "`n$($msg)" | Add-Content $logFile }
            Invoke-WebRequest $book -OutFile $saveTo
        }
        else
        {
            # If it does exist, we need to make sure it wasn't
            # a partial download. If the file size on the server
            # and the file size on local disk don't match,
            # redownload it.
            # Get the size of the file from the download site
            $dlSize = $hdr.BaseResponse.ContentLength
            # Get the size of the file on disk
            $fileSize = $(Get-ChildItem $saveTo).Length
            if ($dlSize -ne $fileSize)
            {
                # If not equal we need to download the book again
                $msg = "Redownloading book $currentBook of $bookCount - $title"
                $msg
                if ($log -eq $true) { "`n$($msg)" | Add-Content $logFile }
                Invoke-WebRequest $book -OutFile $saveTo
            }
            else
            {
                # Otherwise we have a good copy of the book, just
                # let the user know we're skipping it.
                $msg = "Book $currentBook of $bookCount ($title) already exists, skipping it"
                $msg
                if ($log -eq $true) { "`n$($msg)" | Add-Content $logFile }
            }
        }
    } # end try
    catch
    {
        $msg = "There was an error downloading $title. You may wish to try to download this book manually."
        Write-Host $msg -ForegroundColor Red
        if ($log -eq $true) { "`n$($msg)" | Add-Content $logFile }
    } # end catch
} # end foreach
# Let user know we're done, and give a happy little beep
# in case they aren't looking at the screen.
#"Done downloading all books"
#[Console]::Beep(500,300)
I have a script that, in a nutshell, does the following:
copies required files to a temporary folder
compresses the files in the temporary folder to a .zip file
FTPs the .zip file to our FTP server
tidies up and deletes the temporary folder and .zip file
I have pinched the FTP code from a previous post:
Upload files with FTP using PowerShell
and modified it where necessary (keeping the basics intact, I think).
The issue I have is that while the .zip file is being FTP'd, the script doesn't wait until the transfer is complete. It gets part way through, anywhere from 20 MB to 60 MB, before it continues executing, tidies up, and deletes the file it is transferring.
The temporary folder is always the same but the .zip filename varies depending on the date so I can't really reverse the order of operations.
Can anyone suggest how I might get the script to wait until the FTP process has completed, success or fail, before it moves on?
Cheers,
Andrew.
Edit: For those that asked....
function FTPtoServer ()
{
    <#
    What this function has to/should do:
    - accept the right number of parameters,
      minimum/mandatory: username, password, file
      optional: proxy server address/port, proxy username and password
    - check that the source file exists, then extract the filename.
    - if a proxy is specified, set the appropriate parameters
    - transmit the file
    - if any errors occur, throw and return
    #>
    param(
        [string]$sourcefile = $(throw 'A sourcefile is required, -sourcefile'), # fully qualified zip file name
        [string]$FTPUser    = $(throw 'An FTP username is required, -ftpuser'),
        [string]$FTPPass    = $(throw 'An FTP password is required, -ftppass'),
        #
        [string]$proxyServer, # proxy address and port
        [string]$proxyUser,
        [string]$proxyPass
    )

    # local variables
    $FTPserver = "ftp://ftp.servername.com.au"

    # check if the sourcefile exists, if not return/throw an error.
    # The sourcefile should contain the full path to the file.
    if (-not (Test-Path $sourcefile)) {
        throw "the source file could not be located: $sourcefile"
    }

    # extract the filename from the sourcefile.
    $filename = Split-Path -Path $sourcefile -Leaf

    # create the FtpWebRequest and configure it
    $ftp = [System.Net.FtpWebRequest]::Create("$FTPserver/$filename")
    $ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $ftp.Credentials = New-Object System.Net.NetworkCredential($FTPUser, $FTPPass)
    $ftp.UseBinary = $true
    $ftp.UsePassive = $false

    # proxy info
    # ******** DANGER Will Robinson - this proxy config has not been
    # tested and may not work.
    if ($proxyServer) {
        $proxy = New-Object System.Net.WebProxy $proxyServer
        if ($proxyUser -and $proxyPass) {
            $proxy.Credentials = New-Object System.Net.NetworkCredential($proxyUser, $proxyPass)
        }
        $ftp.Proxy = $proxy
        $ftp.UsePassive = $true # apparently, must use passive mode when using a proxy
    }

    # now we have checked and prepared everything, let's try and send the file.
    try {
        # read in the file to upload as a byte array and work out how much we are sending
        $content = [System.IO.File]::ReadAllBytes("$sourceFile")
        $ftp.ContentLength = $content.Length
        try {
            # get the request stream, and write the bytes into it
            $rs = $ftp.GetRequestStream()
            $rs.Write($content, 0, $content.Length)
            # be sure to clean up after ourselves
            $rs.Close()
            $rs.Dispose()
        }
        catch {
            $errorMessage = "FTP failed. " + $_.Exception.Message
            throw $errorMessage
        }
    }
    catch {
        $errorMessage = "Unable to transmit file " + $sourceFile + "`r`n" + $_.Exception.Message
        throw $errorMessage
    }
}
The above is in a separate file, but is called by the following:
try {
    FTPtoServer -sourcefile $sourcefile -ftpuser $FTPUser -ftppass $FTPPass
}
catch {
    $errorMessage = "FTPtoServer function failed with error: $_"
    finishFail -failmessage $errorMessage
}
Cheers.
Found it.
I executed the FTP code above in isolation using a large file (~140 MB) and it threw the error: "The underlying connection was closed: An unexpected error occurred on a receive."
I rebooted the FTP server, checked the user account, etc.
I also tested the Windows command-line FTP client with the same file, and it transferred completely and correctly.
Anyway, I found this article: https://www.codeproject.com/Questions/597175/FileplusUploadplustoplusFTPplusserver which describes the same error I received.
As it turns out, the default Timeout of FtpWebRequest is NOT -1 (infinite) as the documentation suggests, but 100 seconds.
I checked my FTP logs, and sure enough, the time between logon and logoff was about 100 seconds.
I added the line $ftp.Timeout = -1 to my code, and on the first attempt it transferred the entire file without error.
Previous transfers had worked because they fell below the 100-second timeout.
Many thanks for the posts and help.
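In other words, the fix is one extra property assignment when configuring the request. A sketch using the same variables as the function above (the ReadWriteTimeout line is an extra precaution for very slow links, not part of the original fix):

```powershell
$ftp = [System.Net.FtpWebRequest]::Create("$FTPserver/$filename")
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
# Timeout defaults to 100 seconds, not infinite;
# -1 ([System.Threading.Timeout]::Infinite) disables it for large uploads
$ftp.Timeout = -1
# ReadWriteTimeout governs stream reads/writes and may also need raising
$ftp.ReadWriteTimeout = [System.Threading.Timeout]::Infinite
```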
I use an alternative old-school method myself; it should work for you, and it doesn't need any extra components on the server.
$ftp_user = "username"
$ftp_password = "password"
$ftp_address = "ftp.someserver.com"
$ftp_commands = @"
open $ftp_address
$ftp_user
$ftp_password
lcd c:\jobs\output
put estate_data_current.xml
bye
"@
Set-Content -Encoding "ASCII" -Path ftp_commands.txt -Value $ftp_commands
ftp -s:ftp_commands.txt
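One caveat: the command file holds the password in plain text, so you may want to delete it once the transfer finishes. A small sketch of that cleanup around the same two lines:

```powershell
try {
    Set-Content -Encoding "ASCII" -Path ftp_commands.txt -Value $ftp_commands
    ftp -s:ftp_commands.txt
}
finally {
    # remove the script file so the plain-text password doesn't linger on disk
    Remove-Item ftp_commands.txt -ErrorAction SilentlyContinue
}
```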
I'm running into an issue where removing and reinstalling a particular application creates duplicate entries under Path in the System variables. So, for example, it's showing C:\Apps\folder;C:\Apps\folder; followed by the typical entries.
While this is not causing a problem with functionality of the app, I'd prefer to not have the entry in there twice (or more if it requires an additional removal/installation). I want to automate something so I don't have to go into each system and manually remove one of those entries.
Can this be done through either a batch file or a PowerShell script? I'm not able to find a way, but hopefully someone here will know a way. It's fine if the method removes both entries, as I can add something to the script to add one of them back. One important note, I need to make sure everything else under Path is left intact.
Here is a script (taken from a Microsoft repository) that I used to do the exact same thing.
$RegKey = ([Microsoft.Win32.Registry]::LocalMachine).OpenSubKey("SYSTEM\CurrentControlSet\Control\Session Manager\Environment", $True)
$PathValue = $RegKey.GetValue("Path", $Null, "DoNotExpandEnvironmentNames")
Write-Host "Original path: $PathValue"
$PathValues = $PathValue.Split(";", [System.StringSplitOptions]::RemoveEmptyEntries)
$IsDuplicate = $False
$NewValues = @()
ForEach ($Value in $PathValues)
{
    if ($NewValues -notcontains $Value)
    {
        $NewValues += $Value
    }
    else
    {
        $IsDuplicate = $True
    }
}
if ($IsDuplicate)
{
    $NewValue = $NewValues -join ";"
    $RegKey.SetValue("Path", $NewValue, [Microsoft.Win32.RegistryValueKind]::ExpandString)
    Write-Host "Duplicate PATH entry found and new PATH built removing all duplicates. New path: $NewValue"
}
else
{
    Write-Host "No duplicate PATH entries found. The PATH will remain the same."
}
$RegKey.Close()
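If the Path on your systems doesn't rely on unexpanded %variables%, a shorter alternative sketch uses the [Environment] class instead of the registry. Note that, unlike the DoNotExpandEnvironmentNames approach above, this reads the value with REG_EXPAND_SZ entries already expanded, so test it before rolling it out:

```powershell
$path = [Environment]::GetEnvironmentVariable('Path', 'Machine')
# split, drop empty entries, keep the first occurrence of each directory;
# note that Select-Object -Unique is case-sensitive, so entries differing
# only in case will not be collapsed
$unique = ($path -split ';' | Where-Object { $_ } | Select-Object -Unique) -join ';'
[Environment]::SetEnvironmentVariable('Path', $unique, 'Machine')
```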