FTP via PowerShell - how to indicate success

I'm using a PowerShell script to automate sending a .txt file to an FTP site. When I execute it in PowerShell, nothing happens: the prompt just returns, with no message that it was successful. How do I tell if it worked? Here is my script in case it helps.
$localfile = "D:\Export\TESTING.txt"
$remotefile = "/TESTING.txt"
$ftphost = "ftp://ftp.site.com"
$URI = $ftphost + $remotefile
$username = "USERNAME"
$password = "1234"
function Get-FTPFile ($URI, $localfile, $username, $password) {
$credentials = New-Object System.Net.NetworkCredential($username, $password)
$ftp = [System.Net.FtpWebRequest]::Create($URI)
$ftp.Credentials = $credentials
$ftp.UseBinary = 1
$ftp.KeepAlive = 0
$response = $ftp.GetResponse()
$responseStream = $response.GetResponseStream()
$file = New-Object IO.FileStream($localfile, [IO.FileMode]::Create)
[byte[]]$buffer = New-Object byte[] 1024
$read = 0
do {
$read = $responseStream.Read($buffer, 0, 1024)
$file.Write($buffer, 0, $read)
} while ($read -ne 0)
$file.Close()
}

Use WebRequestMethods+Ftp.GetFileSize after the upload completes to confirm that the uploaded file's size matches the local file's size.
You could also use a try/catch block to check for exceptions during your read/write operations. A lack of exceptions would give you some confidence that the upload was successful (i.e. no news is good news).
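For example, a minimal sketch of that size check, reusing $URI, $localfile, $username and $password from the script above (placeholder values as before):
# Hedged sketch: compare the server-reported size with the local file size.
$sizeRequest = [System.Net.FtpWebRequest]::Create($URI)
$sizeRequest.Credentials = New-Object System.Net.NetworkCredential($username, $password)
$sizeRequest.Method = [System.Net.WebRequestMethods+Ftp]::GetFileSize
$sizeResponse = $sizeRequest.GetResponse()
$remoteSize = $sizeResponse.ContentLength # size reported by the server
$sizeResponse.Close()
$localSize = (Get-Item $localfile).Length
if ($remoteSize -eq $localSize) { Write-Host "Upload verified: $remoteSize bytes." }
else { Write-Warning "Size mismatch: local $localSize bytes, remote $remoteSize bytes." }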

Related

Uploading a File via FTP using PowerShell

I am writing a PowerShell script that watches a directory; when one or more files are uploaded into the directory, it copies them to another folder, sends them to an FTP server, and then deletes them from the original directory.
I am having problems connecting to the FTP server. I am not sure whether the problem is how I am configuring the WebClient, or that the FTP URI has spaces in it and I am not escaping them properly.
Here is the code:
$source = "c:/testFtp"
$ftpdestination = "ftp://username:password@ftp.ftpsite.com/folder with space/folder with space"
$webclient = New-Object -TypeName System.Net.WebClient
$files = Get-ChildItem $source
foreach ($file in $files)
{
Write-Host "Uploading $file"
try {
$ftp = "$ftpdestination/$file"
$uri = New-Object -TypeName System.Uri -ArgumentList $ftp
$webclient.UploadFile($uri, "$source/$file")
} catch {
Add-content "$logs" -value "There was an error uploading to ftp"
}
}
$webclient.Dispose()
I have tried escaping the spaces in the folder names multiple ways, so I am beginning to think that is not the problem and that I am not configuring the WebClient properly.
It also rarely catches errors, so I don't believe WebClient throws an exception when an upload fails. Any help is appreciated!
ANSWER:
It turns out the WebClient was connecting properly, but SSL was blocking the files from being sent. I found this out by porting the script to C# and running it there, which gave me better error messages; I am new to PowerShell scripts and could not get good error handling out of them.
After researching, I could not find a way to enable SSL with WebClient, so I switched over to FtpWebRequest. Here is the successful code. (This try/catch block does not log errors as I would like, but the tool now successfully sends files to the FTP server.)
try {
$ftp = [System.Net.FtpWebRequest]::Create("$ftpsite/$filename")
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential("$username","$password")
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$ftp.EnableSSL = $true #<-----------------This was the line that made this work
$ftp.KeepAlive = $false
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("$source/$file")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# be sure to clean up after ourselves
$rs.Close()
$rs.Dispose()
Write-Host "Successfully uploaded: $source/$file"
Copy-Item "$source/$file" -Destination "$copydestination"
Remove-Item "$source/$file"
$logline = "$(Get-Date), Added File: $file to $copydestination"
Add-content "$logs" -value $logline
} catch {
# On failure the response comes from the WebException, not from $ftp.GetResponse()
$res = $_.Exception.InnerException.Response
Add-content "$logs" -value "There was an error: $($res.StatusCode)"
Write-Error -Message "There was an error." -ErrorAction Stop
}
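As a follow-up (a sketch, not part of the original answer): one way to confirm the transfer is to read the FtpWebResponse after closing the request stream, e.g. right after $rs.Dispose():
# Hedged sketch: ask the server for its final status after the upload.
$response = $ftp.GetResponse()
Write-Host "Server response: $($response.StatusCode) $($response.StatusDescription)"
$response.Close()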

PowerShell FTPS CSV file [duplicate]

The problem:
A client requires that we upload data extracted from our system to their box.com platform, rather than through our normal SFTP utility. I have box.com credentials, and am aware that they require FTPS (not SFTP) and passive mode. I've cribbed a fragment from ThomasMaurer's PowerShell FTP upload and download script. The PowerShell version on my server is 4.0.
Code fragment is:
#config
$Username = "username@host.com"
$Password = "redactedpassword"
$LocalFile = "C:\path\to\my\file.csv"
$RemoteFile = "ftp://ftp.box.com:990/file.csv"
#Create FTPWebRequest
$FTPRequest = [System.Net.FtpWebRequest]::Create($RemoteFile)
$FTPRequest = [System.Net.FtpWebRequest]$FTPRequest
$FTPRequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$FTPRequest.Credentials = New-Object System.Net.NetworkCredential($Username, $Password)
$FTPRequest.UseBinary = $true
$FTPRequest.UsePassive = $true
#read file for upload
$FileContent = gc -en byte $LocalFile
$FTPRequest.ContentLength = $FileContent.Length
#get stream request by bytes
$run = $FTPRequest.GetRequestStream()
$run.Write($FileContent,0,$FileContent.Length)
#cleanup
$run.Close()
$run.Dispose()
The error(s):
Exception calling "GetRequestStream" with "0" argument(s): "System error." At C:\path\to\my\powershellscript.ps1:28 char:1
+ $Run = $FTPRequest.GetRequestStream()
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: () [], MethodInvocationException
+ FullyQualifiedErrorId: WebException
I also get downstream errors on calling the $FileContent.Length property and on $run.Close() and $run.Dispose().
Has anyone successfully automated uploads to box.com (specifically), or to any passive, implicit-SSL server, using only PowerShell 4.0 commands? Do you have a solid pattern I could reuse? Many thanks.
I'm uploading files with a derived version of System.Net.WebClient, which supports FTP over TLS. This can easily be achieved by embedding C# code in PowerShell:
$typeDefinition = #"
using System;
using System.Net;
public class FtpClient : WebClient
{
protected override WebRequest GetWebRequest(Uri address)
{
FtpWebRequest ftpWebRequest = base.GetWebRequest(address) as FtpWebRequest;
ftpWebRequest.EnableSsl = true;
return ftpWebRequest;
}
}
"#
Add-Type -TypeDefinition $typeDefinition
$ftpClient = New-Object FtpClient
$ftpClient.UploadFile("ftp://your-ftp-server/yourfile.name", "STOR", "C:\YourLocalFile.name")
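If the server requires a login, credentials can be attached before the call; a small sketch with placeholder values:
# Hedged sketch: standard WebClient credentials, placeholder username/password.
$ftpClient.Credentials = New-Object System.Net.NetworkCredential("username", "password")
$ftpClient.UploadFile("ftp://your-ftp-server/yourfile.name", "STOR", "C:\YourLocalFile.name")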
The answer by @h0r41i0 solves the problem by using WebClient. But as WebClient internally uses (Ftp)WebRequest, it cannot be the solution on its own.
I'll assume the "System error" occurs either because the OP is trying to connect to a secure port (990) with an insecure connection, or because the file is too large and the OP's code tries to read it whole into memory:
$FileContent = gc -en byte $LocalFile
In either case, there's no reason to give up on FtpWebRequest. Just use a secure connection (FtpWebRequest.EnableSsl) and an efficient way to feed the file data to the FTP stream, for example Stream.CopyTo:
$request = [Net.WebRequest]::Create("ftp://ftp.example.com/remote/path/file.zip")
$request.Credentials = New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$request.EnableSsl = $True
$fileStream = [System.IO.File]::OpenRead("C:\local\path\file.zip")
$ftpStream = $request.GetRequestStream()
$fileStream.CopyTo($ftpStream)
$ftpStream.Dispose()
$fileStream.Dispose()
For other options, see Upload files with FTP using PowerShell.
Though note that the .NET Framework does not support implicit TLS (the typical use of port 990), only explicit TLS. But support for explicit TLS is more common anyway. See Does .NET FtpWebRequest Support both Implicit (FTPS) and explicit (FTPES)?
Probably too late to be useful to the original questioner, but I found this other answer did the trick for me: Cyril Gupta's answer to Upload files with FTP using PowerShell.
Here is my revised edition, including URL encoding (since box.com usernames are email addresses, which include an at sign):
## https://stackoverflow.com/a/2485696/537243
## A user comment there complains you can't turn off passive mode,
## but that is exactly what we want here!
[Reflection.Assembly]::LoadWithPartialName("System.Web") | Out-Null
# config
$Username = "foo@bar.com"
$Password = "s3cr3tpAssw0rd"
$Servername = "ftp.box.com"
# This is what we need the URI to look like:
# ftp://foo%40bar.com:s3cr3tpAssw0rd@ftp.box.com/
$baseURI = "ftp://$([System.Web.HttpUtility]::UrlEncode($Username)):$([System.Web.HttpUtility]::UrlEncode($Password))@$($Servername)"
$LocalFile = "C:\tmp\to_upload\data.csv"
$RemoteFile = "date.csv"
$ftpURI = "$($baseURI)/$($RemoteFile)"
Write-Output "ftp uri: $($ftpURI)"
$webclient = New-Object -TypeName System.Net.WebClient
$ftpURI = New-Object -TypeName System.Uri -ArgumentList $ftpURI # "convert" it
$webclient.UploadFile($ftpURI, $LocalFile)
Write-Output "Uploaded $($LocalFile) ..." # no try/catch or other error detection, so this is a bit presuming
Note also that this example uses plain FTP, not FTPS or SFTP.
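If FTPS is needed, one option (a sketch, assuming the derived FtpClient class from the earlier answer has been loaded with Add-Type) is to swap in that client:
# Hedged sketch: FtpClient sets EnableSsl in its GetWebRequest override.
$webclient = New-Object FtpClient
$webclient.UploadFile($ftpURI, $LocalFile)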

How to download a zip file with PowerShell from request stream

Currently, I make a POST request to an external website and am supposed to get a zip file in return. I can get the zip file, but it arrives in an XML response as just the name.zip, and nothing is downloaded. I have no idea why it is not downloading. My code is below, at the point where I make the actual request. I am not sure if I am overengineering this, or what else I would have to do to get the actual file to download.
$url = "https://thewebsite.net/v6_1?id=$messageID"
Write-Output($url)
$Body = [byte[]][char[]]$xmlMessage
Write-Output($Body)
$Request = [System.Net.HttpWebRequest]::CreateHttp($url);
$Request.Method="POST"
$Request.ContentType = 'text/xml;charset=utf-8'
$Request.ContentLength = $Body.Length
$Request.ClientCertificates.Add($Certificate)
Write-Output($Request.ClientCertificates)
$Stream = $Request.GetRequestStream();
$Stream.Write($Body, 0, $Body.Length);
$Stream.Close(); # close the request stream so the server knows the body is complete
$Response = $Request.GetResponse()
$totalLength = [System.Math]::Floor($Response.get_ContentLength()/1024)
$responseStream = $Response.GetResponseStream()
$targetStream = New-Object -TypeName System.IO.FileStream -ArgumentList "D:\path\to\save\test.txt", ([System.IO.FileMode]::Create)
$buffer = new-object byte[] 1GB
$count = $responseStream.Read($buffer,0,$buffer.length)
$downloadedBytes = $count
while ($count -gt 0)
{
[System.Console]::CursorLeft = 0
[System.Console]::Write("Downloaded {0}K of {1}K", [System.Math]::Floor($downloadedBytes/1024), $totalLength)
$targetStream.Write($buffer, 0, $count)
$count = $responseStream.Read($buffer,0,$buffer.length)
$downloadedBytes = $downloadedBytes + $count
Write-Output($count)
}
$targetStream.Flush()
$targetStream.Close()
$targetStream.Dispose()
$responseStream.Dispose()
Unfortunately, without the actual download URI it is hard to tell whether your case is nontrivial or you have simply picked a suboptimal way to get the remote file. The routine way to get a .zip (or any other octet-stream file) with PowerShell is to execute the following command:
Invoke-WebRequest -uri "https://thewebsite.net/v6_1?id=$messageID" -Method "GET" -Outfile (-join($messageID,".zip"))
A $messageID.zip file will then be created in the directory from which you ran PowerShell, and progress is shown in the console window automatically. I tested this example just before writing the answer; it works with both "POST" and "GET" methods whenever the remote host actually returns an octet-stream in the response. Maybe in your case the file is not returned directly after requesting
thewebsite.net/v6_1?id=$messageID
but that is beyond the point of your original question.
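Since the original request is a POST with an XML body and a client certificate, those can be passed to Invoke-WebRequest as well. A sketch reusing $url, $xmlMessage and $Certificate from the question (untested against that service):
# Hedged sketch: POST variant of the Invoke-WebRequest approach.
Invoke-WebRequest -Uri $url -Method Post -Body $xmlMessage -ContentType 'text/xml;charset=utf-8' -Certificate $Certificate -OutFile (-join($messageID, ".zip"))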
Have you tried using Invoke-WebRequest?
$path = [Environment]::GetFolderPath("MyDocuments")
Invoke-WebRequest "example.com" -OutFile "$path\ZippedFile.zip"
A variable does not have to be used, as the path can be completely defined in the Invoke-WebRequest line if desired.

Using PowerShell to download all files in a directory

I have a working PowerShell script that iterates through a directory accessed via FTP and prints its contents. The script looks like this:
$sourceuri = "<string>"
$targetpath = "<string>"
$username = "<string>"
$password = "<string>"
# Create a FTPWebRequest object to handle the connection to the ftp server
$ftprequest = [System.Net.FtpWebRequest]::create($sourceuri)
# set the request's network credentials for an authenticated connection
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
$ftprequest.UseBinary = $true
$ftprequest.KeepAlive = $false
# send the ftp request to the server
$ftpresponse = $ftprequest.GetResponse()
$stream = $ftpresponse.getresponsestream()
$buffer = new-object System.Byte[] 1024
$encoding = new-object System.Text.AsciiEncoding
$outputBuffer = ""
$foundMore = $false
## Read all the data available from the stream, writing it to the
## output buffer when done.
do
{
## Allow data to buffer for a bit
start-sleep -m 1000
## Read what data is available
$foundmore = $false
$stream.ReadTimeout = 1000
do
{
try
{
$read = $stream.Read($buffer, 0, 1024)
if($read -gt 0)
{
$foundmore = $true
$outputBuffer += ($encoding.GetString($buffer, 0, $read))
}
}
catch
{
$foundMore = $false; $read = 0
}
}
while($read -gt 0)
}
while($foundmore)
$outputBuffer
My actual goal is not to just list the files but download them to the machine running the script. I'm finding this a bit tricky since my loops only reference bytes and not files by name. How can I use this loop to download all the files in a directory instead of just listing them?
I prefer to use a real FTP library, like WinSCP's, but there are other examples out there: http://winscp.net/eng/docs/library_session_listdirectory#example
Once you have the list of directories and files, use Get-Content to read from the list, and then use a BITS transfer to download all the files. I'm not sure if this will download the directories as well... but it will definitely download the files...
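For reference, a minimal sketch with the WinSCP .NET assembly (host name, credentials and paths are placeholders; WinSCPnet.dll must be installed separately):
# Hedged sketch: download a whole remote directory with the WinSCP .NET assembly.
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Ftp
HostName = "ftp.example.com"
UserName = "username"
Password = "password"
}
$session = New-Object WinSCP.Session
try {
$session.Open($sessionOptions)
# GetFiles downloads everything matching the mask; Check() throws if any transfer failed.
$session.GetFiles("/remote/path/*", "C:\local\target\", $False).Check()
} finally {
$session.Dispose()
}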

FTP uploading fails after every two or three successful uploads

I get the following error message when uploading. The PowerShell script creates a zip file using 7za.exe and calls my FTP function to upload the file. What may be causing the problem? Would the Windows ftp.exe client be more stable?
Exception calling "GetRequestStream" with "0" argument(s): "The remote
server returned an error: (550) File unavailable (e.g., file not
found, no access)."
Update:
It seems the same files always fail in the loop. However, it works if I just run ftpFile file_name_with_full_path (where file_name_with_full_path is copied from the output of the loop script).
Update 2:
I tried to use WebClient ($webclient.UploadFile($uri, $File)) to FTP the files. Same error.
Update 3:
Found this question. It may be necessary to add $ftp.KeepAlive = $false. Why?
function ftpFile
{
Param (
[Parameter(Mandatory=$true, ValueFromPipeline=$true)]
[ValidateScript({Test-Path $_})]
[String]
$filePath
,
[Parameter(Mandatory=$false)]
[String]
$ftpUrl = "ftp://10.0.1.1/Data/"
,
[Parameter(Mandatory=$false)]
[String]
$Login = "username"
,
[Parameter(Mandatory=$false)]
[String]
$password = "password"
)
Process {
try {
$ftp = [System.Net.FtpWebRequest]::Create("$ftpUrl/$(Split-Path $filePath -Leaf)")
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential("$Login","$password")
$ftp.UseBinary = $true
$ftp.UsePassive = $true
# read in the file to upload as a byte array
$content = gc -en byte $filePath
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
$rs.Close()
$rs.Dispose()
echo "ftpFile: $filePath size: $($content.Length)"
}
catch {
throw "FTP: $_"
}
}
}
FTP Error 550 is Access Denied and tends to be a username/password conflict.
If you're using a loop and passing the same username and password each time this function is called
AND
it's working on some iterations of the loops and not others
THEN
you need to check the ftp auth/error logs on the server to get a grasp of why you're being denied.
As @AndyArismendi asked, does it always fail on the same file? Without more complete code and an understanding of your use, this is hard to narrow down to a simple solution.
Found this question. You need to add $ftp.KeepAlive = $false.
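My understanding (an assumption, not verified against the original setup) is that with KeepAlive enabled .NET caches and reuses the FTP control connection, and a stale reused connection can fail intermittently; disabling it forces a fresh connection per file. In the ftpFile function above, the line would sit next to the other connection settings:
# Hedged sketch: force a fresh control connection for every upload.
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$ftp.KeepAlive = $false # do not reuse the cached control connection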