I am trying to loop over a list of site codes and use each one in the URL when downloading password-protected files.
sites.txt:
BOS
HFD
LGA
NYC
PHI
WWD
PowerShell script:
$sites = Get-Content C:\Users\...\sites.txt
$time = (Get-Date).ToString("yyyyMMdd")
$Username = 'hello'
$Password = 'world'
$url = "http://my.website/" + $sites + "/some.csv"
$Path = "D:\...\...\" + $sites + "/some.csv"
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.Networkcredential($Username, $Password)
$WebClient.DownloadFile( $url, $path )
I am getting "unexpected token" errors, so I'm assuming that I am not using the $sites variable correctly in the URL string.
$sites = Get-Content C:\Users\...\sites.txt
ForEach ($Site in $sites){
$url = "http://my.website/" + $Site + "/some.csv"
$Path = "D:\...\...\" + $Site + "\some.csv"
$Path
$url
}
This will turn your list into individual URLs.
Edited to include change to '$Path' as well.
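For completeness, a minimal sketch (untested) that folds the original WebClient download into that loop; the truncated paths and the my.website host are the placeholders from the question:
$sites = Get-Content C:\Users\...\sites.txt
$Username = 'hello'
$Password = 'world'
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.NetworkCredential($Username, $Password)
foreach ($Site in $sites) {
    # Build one URL and one destination path per site code, then download it
    $url  = "http://my.website/$Site/some.csv"
    $Path = "D:\...\...\$Site\some.csv"
    $WebClient.DownloadFile($url, $Path)
}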
I've tried connecting to an FTP server with the following PowerShell script:
#FTP Server Information - SET VARIABLES
$ftp = "ftp://XXX.com/"
$user = 'UserName'
$pass = 'Password'
$folder = 'FTP_Folder'
$target = "C:\Folder\Folder1\"
#SET CREDENTIALS
$credentials = new-object System.Net.NetworkCredential($user, $pass)
function Get-FtpDir ($url,$credentials) {
$request = [Net.WebRequest]::Create($url)
$request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
if ($credentials) { $request.Credentials = $credentials }
$response = $request.GetResponse()
$reader = New-Object IO.StreamReader $response.GetResponseStream()
while(-not $reader.EndOfStream) {
$reader.ReadLine()
}
#$reader.ReadToEnd()
$reader.Close()
$response.Close()
}
#SET FOLDER PATH
$folderPath= $ftp + "/" + $folder + "/"
$files = Get-FTPDir -url $folderPath -credentials $credentials
$files
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
$counter = 0
foreach ($file in ($files | where {$_ -like "*.txt"})){
$source=$folderPath + $file
$destination = $target + $file
$webclient.DownloadFile($source, $target+$file)
#PRINT FILE NAME AND COUNTER
$counter++
$counter
$source
}
But I keep getting a 530 error (authentication error) when connecting. I know the username and password are correct because I've tested them in other FTP clients.
So I think the problem might be that the web app demands a certain protocol, which this script doesn't use.
I've been doing some research and found something called Posh-SSH, which might work. But is there a way to modify my script instead? When I connect with WinSCP, I use the FTP protocol with implicit TLS/SSL encryption on port 990.
UPDATE: WORKS
I made the following work:
#FTP Server Information - SET VARIABLES
$ftp = "ftp://waws-prod-xxx.ftp.azurewebsites.windows.net"
$user = 'xxxxx\xxxxx@email.com'
$pass = '$FRnqxxpxxxxxxx'
$folder = 'site/wwwroot/wwwroot/images/uploaded'
$target = "C:\Folder\Folder1\"
#SET CREDENTIALS
$credentials = new-object System.Net.NetworkCredential($user, $pass)
function Get-FtpDir ($url,$credentials) {
$request = [Net.WebRequest]::Create($url)
$request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
if ($credentials) { $request.Credentials = $credentials }
$response = $request.GetResponse()
$reader = New-Object IO.StreamReader $response.GetResponseStream()
while(-not $reader.EndOfStream) {
$reader.ReadLine()
}
#$reader.ReadToEnd()
$reader.Close()
$response.Close()
}
#SET FOLDER PATH
$folderPath= $ftp + "/" + $folder + "/"
$files = Get-FTPDir -url $folderPath -credentials $credentials
Write-Host($files)
$files
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
$counter = 0
foreach ($file in ($files)){
$source=$folderPath + $file
$destination = $target + $file
$webclient.DownloadFile($source, $target+$file)
#PRINT FILE NAME AND COUNTER
$counter++
$counter
$source
}
I ended up adding a new credential to the web app and changing "sftp" in the hostname to "ftp", and now it works. Using the credentials from the web publish file works too.
I also made WinSCP work, and I'm able to download the full folder with its children.
#FTP Server Information - SET VARIABLES
$ftp = "waws-prod-xxx-xxx.ftp.azurewebsites.windows.net"
$user = 'xxxxxx\xxxx@xxxxx.no'
$pass = '$xxxxxxxxx'
$folder = 'site/wwwroot/wwwroot/images/uploaded/*'
$target = "C:\Folder\Folder1\*"
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Ftp
HostName = $ftp
UserName = $user
Password = $pass
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Download files
$session.GetFiles($folder, $target).Check()
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
Neither FtpWebRequest nor any other built-in .NET FTP API supports implicit TLS/SSL encryption.
See Does .NET FtpWebRequest Support both Implicit (FTPS) and explicit (FTPES)?
Posh-SSH is an SSH/SFTP library. It has nothing to do with FTP.
And as you get an "authentication error", I believe your problem is actually different. Double-check your credentials; you are probably using the wrong ones in your code.
Also, if WinSCP works for you, you can use the WinSCP .NET assembly from PowerShell. The WinSCP GUI can even generate a code template for you, based on your working GUI session.
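For reference, a hedged sketch of what a WinSCP .NET assembly session might look like with implicit TLS/SSL on port 990 (the setup the asker uses in the WinSCP GUI); the host name, credentials and paths below are placeholders:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# FTP over implicit TLS/SSL on port 990
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol   = [WinSCP.Protocol]::Ftp
    FtpSecure  = [WinSCP.FtpSecure]::Implicit
    PortNumber = 990
    HostName   = "ftp.example.com"
    UserName   = "user"
    Password   = "password"
}
$session = New-Object WinSCP.Session
try
{
    $session.Open($sessionOptions)
    $session.GetFiles("/remote/folder/*", "C:\local\folder\*").Check()
}
finally
{
    $session.Dispose()
}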
I need to be able to download a folder with its contents from a blob storage account using the Azure Storage REST API only.
I have created a function (New-StorageAccountAuthorizationHeader) that builds the authentication header, and with it I can download a single file, but I cannot find any reference on how I might go about downloading a whole folder.
If I pass the folder as the $blob parameter, I get a BlobNotFound error.
The URL of the said folder is: https://mystorageaccount.blob.core.windows.net/acontainer/somefolder. The contents of "somefolder" looks like:
Folder1
FolderA
FileA.txt
FolderB
FileB.txt
FileC.txt
New-StorageAccountAuthorizationHeader:
function New-StorageAccountAuthorizationHeader
{
[cmdletbinding()]
param
(
[string]$StorageAccountName,
[string]$Container,
[string]$Blob,
[string]$accesskey ,
[string]$ResourceUri,
[string]$xmsversion = "2017-04-17"
)
$xmsdate = Get-Date
$xmsdate = $xmsdate.ToUniversalTime()
$xmsdate = $xmsdate.toString('r')
function GetRestApiParameters
{
[cmdletbinding()]
param
(
[Parameter(Mandatory=$true)]
[string]$Uri
)
if($Uri.Contains("?"))
{
Write-Verbose "URI to extract REST parameters: $uri"
return ($Uri.Split("?")[1]).Split("&")
}
}
Write-Verbose "Generating string for signature encryption..."
$partUrl = "/$StorageAccountName/"
if($Container)
{
$partUrl = $partUrl + "$Container/"
}
if($Blob)
{
$parturl = $partUrl + "$Blob"
}
######Don't change the line count or indentation of the here-string#####
$hereString = @"
GET
x-ms-date:$xmsdate
x-ms-version:$xmsversion
$partUrl
"#
$hereString =$hereString -replace "$([char]13)$([char]10)","$([char]10)" #Change `r`n to just `n
$empty = $oSignature = New-Object System.Text.StringBuilder
$empty = $oSignature.Append($hereString)
Write-Verbose "Appending parameters from URI into authorisation string..."
$restParameters = GetRestApiParameters -Uri $ResourceUri -Verbose
if ($restParameters -ne $null)
{
foreach ($param in $restParameters)
{
$empty = $oSignature.Append("$([char]10)$($param.Replace('=',':'))")
}
}
#$oSignature.toString()
Write-Verbose "Encrypting string..."
$hmacsha = New-Object System.Security.Cryptography.HMACSHA256
$hmacsha.key = [Convert]::FromBase64String($accesskey)
$signature = $hmacsha.ComputeHash([Text.Encoding]::UTF8.GetBytes($oSignature.ToString()))
$signature = [Convert]::ToBase64String($signature)
Write-Verbose "Building header..."
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("x-ms-version", $xmsversion)
$headers.Add("x-ms-date", $xmsdate)
$headers.Add("Authorization", "SharedKey " + $StorageAccountName + ":" + $signature)
#$headers.Add("x-ms-blob-type","BlockBlob")
#$headers.Add("Content-Type", "application\xml")
Write-Verbose ("Header: $($headers | Out-String)")
Return $headers
}
And I would call it:
$StorageAccountName = "mystorageaccount"
$container = "acontainer"
$blob = "somefile.txt"
$uriToDownloadBlobs = "https://" + $StorageAccountName + ".blob.core.windows.net/$container/$blob"
$header = $null
$header = New-StorageAccountAuthorizationHeader -StorageAccountName $StorageAccountName -ResourceUri $uriToDownloadBlobs -Verbose -Container $container -Blob $blob
$result = Invoke-WebRequest -Headers $header -Uri $uriToDownloadBlobs -OutFile C:\Temp\$blob -PassThru
$result
So this works, but as I said, I'm after any hints to help with downloading the whole folder.
It looks like this is not possible with a single request, though I'd be interested to see how it's done by the likes of Azure Storage Explorer.
My solution was to zip the files up and then use the above to download the single ZIP file. A few extra lines of code to compress and extract at either end, but it was the quickest way at the time and it works well with VSTS tasks.
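If a ZIP archive is not an option, one possible approach (an untested sketch) is to call the List Blobs REST operation with a prefix and then download each blob individually. It reuses the New-StorageAccountAuthorizationHeader helper from above, assumes that helper signs the query parameters correctly when they appear in the URI in lexicographic order (comp, prefix, restype), and uses $accessKey as a stand-in for the storage account key; you may also need to strip a leading BOM from the response before the [xml] cast:
$StorageAccountName = "mystorageaccount"
$container = "acontainer"
$prefix = "somefolder/"
# List Blobs under the prefix (query parameters in lexicographic order for the signature)
$listUri = "https://$StorageAccountName.blob.core.windows.net/$container" + "?comp=list&prefix=$prefix&restype=container"
$header = New-StorageAccountAuthorizationHeader -StorageAccountName $StorageAccountName -Container $container -accesskey $accessKey -ResourceUri $listUri
[xml]$listing = (Invoke-WebRequest -Headers $header -Uri $listUri).Content
# Download each listed blob, recreating the folder structure locally
foreach ($name in $listing.EnumerationResults.Blobs.Blob.Name)
{
    $blobUri = "https://$StorageAccountName.blob.core.windows.net/$container/$name"
    $header = New-StorageAccountAuthorizationHeader -StorageAccountName $StorageAccountName -Container $container -Blob $name -accesskey $accessKey -ResourceUri $blobUri
    $target = Join-Path "C:\Temp" ($name -replace '/', '\')
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    Invoke-WebRequest -Headers $header -Uri $blobUri -OutFile $target
}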
I want to run a PS script when I want to publish to an FTP server. I took this script as a starting structure: structure script.
I have a very simple folder structure:
C:\Uploadftp\Files\doc.txt
C:\Uploadftp\Files\Files2
C:\Uploadftp\Files\Files2\doc2.txt
nothing fancy there.
Here is my script:
cd C:\Uploadftp
$location = Get-Location
"We are here: $location"
$user = "test" # Change
$pass = "test" # Change
## Get files
$files = Get-ChildItem -recurse
## Get ftp object
$ftp_client = New-Object System.Net.WebClient
$ftp_client.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
$ftp_address = "ftp://test/TestFolder"
## Make uploads
foreach($file in $files)
{
$directory = "";
$source = $($file.DirectoryName + "/" + $file);
if ($file.DirectoryName.Length -gt 0)
{
$directory = $file.DirectoryName.Replace($location,"")
}
$directory = $directory.Replace("\","/")
$source = $source.Replace("\","/")
$directory += "/";
$ftp_command = $($ftp_address + $directory + $file)
# Write-Host $source
$uri = New-Object System.Uri($ftp_command)
"Command is " + $uri + " file is $source"
$ftp_client.UploadFile($uri, $source)
}
I keep getting this error:
Exception calling "UploadFile" with "2" argument(s): "An exception occurred during a WebClient request."
If I hardcode a specific folder for $uri and point the source at a specific folder on my computer, this script doesn't create a directory, it creates a file. What am I doing wrong?
P.S. Don't hit me too hard, it's my first time ever doing something in PowerShell.
Try the "Create-FtpDirectory" function from https://github.com/stej/PoshSupport/blob/master/Ftp.psm1
function Create-FtpDirectory {
param(
[Parameter(Mandatory=$true)]
[string]
$sourceuri,
[Parameter(Mandatory=$true)]
[string]
$username,
[Parameter(Mandatory=$true)]
[string]
$password
)
if ($sourceUri -match '\\$|\\\w+$') { throw 'sourceuri should end with a file name' }
$ftprequest = [System.Net.FtpWebRequest]::Create($sourceuri);
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
$ftprequest.UseBinary = $true
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$response = $ftprequest.GetResponse();
Write-Host "MakeDirectory complete, status: $($response.StatusDescription)"
$response.Close();
}
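A rough, untested sketch of how Create-FtpDirectory could be wired into the original loop, creating each remote directory before uploading the files inside it ($ftp_address, $user, $pass, $files, $location and $ftp_client are the variables from the question):
foreach ($file in ($files | Where-Object { -not $_.PSIsContainer }))
{
    # Mirror the local directory structure under the FTP root
    $directory = $file.DirectoryName.Replace($location, "").Replace("\", "/")
    $remoteDir = $ftp_address + $directory
    # MakeDirectory throws if the folder already exists, so swallow that and carry on
    try { Create-FtpDirectory -sourceuri $remoteDir -username $user -password $pass } catch { }
    $uri = New-Object System.Uri("$remoteDir/$($file.Name)")
    $ftp_client.UploadFile($uri, $file.FullName)
}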
I'm trying to make PowerShell send a web request to our SSRS server and capture the results. I've hit a wall using the rs:FORMAT=EXCEL parameter in the SSRS URL string. I have the following:
First, init the credentials:
$User = "MYDOMAIN\MyUser"
$PWord = ConvertTo-SecureString -String "WooHooStringP$W0rd" -AsPlainText -Force
$c = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $PWord
Now, request a report:
Invoke-WebRequest `
-UserAgent ([Microsoft.PowerShell.Commands.PSUserAgent]::InternetExplorer) `
-Credential $c `
-Uri "http://myserver/ReportServer_DEV/Pages/ReportViewer.aspx?/folder+path/report+name"
This works fine. I can even grab the results (by enclosing the request in parentheses and using (...).Content).
Then, specify a format instead of plain rendering:
Invoke-WebRequest `
-UserAgent ([Microsoft.PowerShell.Commands.PSUserAgent]::InternetExplorer) `
-Credential $c `
-Uri "http://myserver/ReportServer_DEV/Pages/ReportViewer.aspx?/folder+path/report+name&rs:format=HTML4.0"
Note the rs:Format specification? Works like a charm.
Then, for the grand finale, give me an Excel file:
Invoke-WebRequest `
-UserAgent ([Microsoft.PowerShell.Commands.PSUserAgent]::InternetExplorer) `
-Credential $c `
-Uri "http://myserver/ReportServer_DEV/Pages/ReportViewer.aspx?/folder+path/report+name&rs:format=EXCEL"
No can do, bud:
Invoke-WebRequest : The remote server returned an error: (401) Unauthorized.
At line:1 char:11
+ $bytez = (Invoke-WebRequest `
+ ~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-WebRequest], WebException
+ FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand
Why does the rs:format=EXCEL option throw an Unauthorized exception when all the other URLs are served by SSRS?
I've figured it out! I went about this the wrong way: SSRS offers access through a web service that PowerShell can consume without the need to hack the URL and capture a response. I found a script that did this and modified it to suit my purpose:
function GetRSConnection($server, $instance)
{
# Create a proxy to the SSRS server and give it the namespace of 'RS' to use for
# instantiating objects later. This class will also be used to create a report
# object.
$User = "DOMAIN\Username"
$PWord = ConvertTo-SecureString -String "Pa$$w0rd" -AsPlainText -Force
$c = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $PWord
$reportServerURI = "http://" + $server + "/" + $instance + "/ReportExecution2005.asmx?WSDL"
$RS = New-WebServiceProxy -Class 'RS' -NameSpace 'RS' -Uri $reportServerURI -Credential $c
$RS.Url = $reportServerURI
return $RS
}
function GetReport($RS, $reportPath)
{
# Next we need to load the report. Since Powershell cannot pass a null string
# (it instead just passes ""), we have to use GetMethod / Invoke to call the
# function that returns the report object. This will load the report in the
# report server object, as well as create a report object that can be used to
# discover information about the report. It's not used in this code, but it can
# be used to discover information about what parameters are needed to execute
# the report.
$reportPath = "/" + $reportPath
$Report = $RS.GetType().GetMethod("LoadReport").Invoke($RS, @($reportPath, $null))
# initialise empty parameter holder
$parameters = @()
$RS.SetExecutionParameters($parameters, "nl-nl") > $null
return $report
}
function AddParameter($params, $name, $val)
{
$par = New-Object RS.ParameterValue
$par.Name = $name
$par.Value = $val
$params += $par
return ,$params
}
function GetReportInFormat($RS, $report, $params, $outputpath, $format)
{
# Set up some variables to hold referenced results from Render
$deviceInfo = "<DeviceInfo><NoHeader>True</NoHeader></DeviceInfo>"
$extension = ""
$mimeType = ""
$encoding = ""
$warnings = $null
$streamIDs = $null
# Report parameters are handled by creating an array of ParameterValue objects.
# Add the parameter array to the service. Note that this returns some
# information about the report that is about to be executed.
# $RS.SetExecutionParameters($parameters, "en-us") > $null
$RS.SetExecutionParameters($params, "nl-nl") > $null
# Render the report to a byte array. The first argument is the report format.
# The formats I've tested are: PDF, XML, CSV, WORD (.doc), EXCEL (.xls),
# IMAGE (.tif), MHTML (.mhtml).
$RenderOutput = $RS.Render($format,
$deviceInfo,
[ref] $extension,
[ref] $mimeType,
[ref] $encoding,
[ref] $warnings,
[ref] $streamIDs
)
# Determine file name
$parts = $report.ReportPath.Split("/")
$filename = $parts[-1] + "."
switch($format)
{
"EXCEL" { $filename = $filename + "xls" }
"WORD" { $filename = $filename + "doc" }
"IMAGE" { $filename = $filename + "tif" }
default { $filename = $filename + $format }
}
if($outputpath.EndsWith("\"))
{
$filename = $outputpath + $filename
} else
{
$filename = $outputpath + "\" + $filename
}
$filename
# Convert array bytes to file and write
$Stream = New-Object System.IO.FileStream($filename, [System.IO.FileMode]::Create, [System.IO.FileAccess]::Write)
$Stream.Write($RenderOutput, 0, $RenderOutput.Length)
$Stream.Close()
}
$RS = GetRSConnection -server "DEVBOX" -instance "ReportServer_DEV"
$report = GetReport -RS $RS -reportPath "folder name/report name"
$params = @()
$params = AddParameter -params $params -name "Month" -val "201311"
GetReportInFormat -RS $RS -report $report -params $params -outputpath "i:\test" -format "EXCEL"
Using web request:
[string]$Domain = "DomainUsername"
[string]$Username = "Username"
[string]$Password = "Password"
[string]$ReportServer = "http://ssrsreportserver/ReportServer/ReportExecution2005.asmx" #Report Server
[string]$ReportLocation = "/Report Location/Report Name" #Report Location ON SSRS
$ReportLocation = $ReportLocation.Replace("/", "%2f")
$ReportLocation = $ReportLocation.Replace(" ", "+")
[string]$outputFile = $PSScriptRoot + '\Report.xlsx' #Save location for the file
#If the report has any parameters
[string]$ParamString = "";
$ParamString += "&param1=paramvalue"
$ParamString += "&param2=paramvalue"
[string]$URL = $ReportServer + "?" + $ReportLocation + "&rs:Command=Render&rs:Format=" + "EXCELOPENXML" + "&rs:ParameterLanguage=en-GB" + $ParamString
Write-Host $URL
$Req = [System.Net.WebRequest]::Create($URL);
$Req.Credentials = new-object System.Net.NetworkCredential($Username, $Password, $Domain)
$Req.Timeout = 30000;
$WebStream = $Req.GetResponse().GetResponseStream();
$MemStream = New-Object System.IO.MemoryStream
$WebStream.CopyTo($MemStream);
[long]$Len = $MemStream.Length;
[byte[]]$outBytes = [System.Byte[]]::CreateInstance([System.Byte], $Len)
$MemStream.Seek(0, [System.IO.SeekOrigin]::Begin);
$MemStream.Read($outBytes, 0, [int]$Len);
$WebStream.Close();
$MemStream.Close();
$MemStream.Dispose();
$Stream = New-Object System.IO.FileStream($outputFile, [System.IO.FileMode]::Create, [System.IO.FileAccess]::Write)
$Stream.Write($outBytes, 0, $outBytes.Length)
$Stream.Close()
Invoke-Item $outputFile
I want to use PowerShell to transfer files with FTP to an anonymous FTP server. I don't want to use any extra packages. How?
I am not sure you can 100% bullet-proof the script against hanging or crashing, as there are things outside your control (what if the server loses power mid-upload?), but this should provide a solid foundation for getting you started:
# create the FtpWebRequest and configure it
$ftp = [System.Net.FtpWebRequest]::Create("ftp://localhost/me.png")
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential("anonymous","anonymous@localhost")
$ftp.UseBinary = $true
$ftp.UsePassive = $true
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("C:\me.png")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# be sure to clean up after ourselves
$rs.Close()
$rs.Dispose()
There are some other ways too. I have used the following script:
$File = "D:\Dev\somefilename.zip";
$ftp = "ftp://username:password#example.com/pub/incoming/somefilename.zip";
Write-Host -Object "ftp url: $ftp";
$webclient = New-Object -TypeName System.Net.WebClient;
$uri = New-Object -TypeName System.Uri -ArgumentList $ftp;
Write-Host -Object "Uploading $File...";
$webclient.UploadFile($uri, $File);
And you could run a script against the Windows FTP command-line utility using the following command:
ftp -s:script.txt
(Check out this article)
The following question on SO also answers this: How to script FTP upload and download?
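As an illustration of the ftp -s:script.txt approach mentioned above, the script file is just a list of plain FTP commands; the host, credentials and file name below are placeholders (the two lines after open answer the login prompts):
open ftp.example.com
myuser
mypassword
binary
cd /pub/incoming
put C:\local\path\somefilename.zip
quit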
I'm not gonna claim that this is more elegant than the highest-voted solution...but this is cool (well, at least in my mind LOL) in its own way:
$server = "ftp.lolcats.com"
$filelist = "file1.txt file2.txt"
"open $server
user $user $password
binary
cd $dir
" +
($filelist.split(' ') | %{ "put ""$_""`n" }) | ftp -i -in
As you can see, it uses the dinky built-in Windows FTP client. Much shorter and more straightforward, too. Yes, I've actually used this and it works!
Easiest way
The most trivial way to upload a binary file to an FTP server using PowerShell is using WebClient.UploadFile:
$client = New-Object System.Net.WebClient
$client.Credentials =
New-Object System.Net.NetworkCredential("username", "password")
$client.UploadFile(
"ftp://ftp.example.com/remote/path/file.zip", "C:\local\path\file.zip")
Advanced options
If you need greater control than WebClient offers (like TLS/SSL encryption, etc.), use FtpWebRequest. An easy way is to just copy a FileStream to the FTP stream using Stream.CopyTo:
$request = [Net.WebRequest]::Create("ftp://ftp.example.com/remote/path/file.zip")
$request.Credentials =
New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$fileStream = [System.IO.File]::OpenRead("C:\local\path\file.zip")
$ftpStream = $request.GetRequestStream()
$fileStream.CopyTo($ftpStream)
$ftpStream.Dispose()
$fileStream.Dispose()
Progress monitoring
If you need to monitor upload progress, you have to copy the contents in chunks yourself:
$request = [Net.WebRequest]::Create("ftp://ftp.example.com/remote/path/file.zip")
$request.Credentials =
New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$fileStream = [System.IO.File]::OpenRead("C:\local\path\file.zip")
$ftpStream = $request.GetRequestStream()
$buffer = New-Object Byte[] 10240
while (($read = $fileStream.Read($buffer, 0, $buffer.Length)) -gt 0)
{
$ftpStream.Write($buffer, 0, $read)
$pct = ($fileStream.Position / $fileStream.Length)
Write-Progress `
-Activity "Uploading" -Status ("{0:P0} complete:" -f $pct) `
-PercentComplete ($pct * 100)
}
$ftpStream.Dispose()
$fileStream.Dispose()
Uploading folder
If you want to upload all files from a folder, see
PowerShell Script to upload an entire folder to FTP
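A minimal sketch of that idea using only WebClient (non-recursive; the host, credentials and paths are placeholders):
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("username", "password")
# Upload every file in the local folder to the same remote folder
Get-ChildItem "C:\local\path" -File | ForEach-Object {
    $client.UploadFile("ftp://ftp.example.com/remote/path/$($_.Name)", $_.FullName)
}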
I recently wrote several PowerShell functions for communicating with FTP; see https://github.com/AstralisSomnium/PowerShell-No-Library-Just-Functions/blob/master/FTPModule.ps1. With the second function below, you can send a whole local folder to FTP. The module even contains functions for removing/adding/reading folders and files recursively.
#Add-FtpFile -ftpFilePath "ftp://myHost.com/folder/somewhere/uploaded.txt" -localFile "C:\temp\file.txt" -userName "User" -password "pw"
function Add-FtpFile($ftpFilePath, $localFile, $username, $password) {
$ftprequest = New-FtpRequest -sourceUri $ftpFilePath -method ([System.Net.WebRequestMethods+Ftp]::UploadFile) -username $username -password $password
Write-Host "$($ftpRequest.Method) for '$($ftpRequest.RequestUri)' complete'"
$content = $content = [System.IO.File]::ReadAllBytes($localFile)
$ftprequest.ContentLength = $content.Length
$requestStream = $ftprequest.GetRequestStream()
$requestStream.Write($content, 0, $content.Length)
$requestStream.Close()
$requestStream.Dispose()
}
#Add-FtpFolderWithFiles -sourceFolder "C:\temp\" -destinationFolder "ftp://myHost.com/folder/somewhere/" -userName "User" -password "pw"
function Add-FtpFolderWithFiles($sourceFolder, $destinationFolder, $userName, $password) {
Add-FtpDirectory $destinationFolder $userName $password
$files = Get-ChildItem $sourceFolder -File
foreach($file in $files) {
$uploadUrl ="$destinationFolder/$($file.Name)"
Add-FtpFile -ftpFilePath $uploadUrl -localFile $file.FullName -username $userName -password $password
}
}
#Add-FtpFolderWithFilesRecursive -sourceFolder "C:\temp\" -destinationFolder "ftp://myHost.com/folder/" -userName "User" -password "pw"
function Add-FtpFolderWithFilesRecursive($sourceFolder, $destinationFolder, $userName, $password) {
Add-FtpFolderWithFiles -sourceFolder $sourceFolder -destinationFolder $destinationFolder -userName $userName -password $password
$subDirectories = Get-ChildItem $sourceFolder -Directory
$fromUri = new-object System.Uri($sourceFolder)
foreach($subDirectory in $subDirectories) {
$toUri = new-object System.Uri($subDirectory.FullName)
$relativeUrl = $fromUri.MakeRelativeUri($toUri)
$relativePath = [System.Uri]::UnescapeDataString($relativeUrl.ToString())
$lastFolder = $relativePath.Substring($relativePath.LastIndexOf("/")+1)
Add-FtpFolderWithFilesRecursive -sourceFolder $subDirectory.FullName -destinationFolder "$destinationFolder/$lastFolder" -userName $userName -password $password
}
}
Here's my super cool version BECAUSE IT HAS A PROGRESS BAR :-)
Which is a completely useless feature, I know, but it still looks cool \m/ \m/
$webclient = New-Object System.Net.WebClient
Register-ObjectEvent -InputObject $webclient -EventName "UploadProgressChanged" -Action { Write-Progress -Activity "Upload progress..." -Status "Uploading" -PercentComplete $EventArgs.ProgressPercentage } > $null
$File = "filename.zip"
$ftp = "ftp://user:password#server/filename.zip"
$uri = New-Object System.Uri($ftp)
try{
$webclient.UploadFileAsync($uri, $File)
}
catch [Net.WebException]
{
Write-Host $_.Exception.ToString() -foregroundcolor red
}
while ($webclient.IsBusy) { continue }
PS. It helps a lot when I'm wondering "did it stop working, or is it just my slow ADSL connection?"
You can simply handle file uploads through PowerShell, like this.
The complete project is available on GitHub: https://github.com/edouardkombo/PowerShellFtp
#Directory where to find pictures to upload
$Dir= 'c:\fff\medias\'
#Directory where to save uploaded pictures
$saveDir = 'c:\fff\save\'
#ftp server params
$ftp = 'ftp://10.0.1.11:21/'
$user = 'user'
$pass = 'pass'
#Connect to ftp webclient
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
#Initialize var for infinite loop
$i=0
#Infinite loop
while($i -eq 0){
#Pause 1 second before continuing
Start-Sleep -sec 1
#Search for pictures in directory
foreach($item in (dir $Dir "*.jpg"))
{
#Set default network status to 1
$onNetwork = "1"
#Get picture creation dateTime...
$pictureDateTime = (Get-ChildItem $item.fullName).CreationTime
#Convert dateTime to timeStamp
$pictureTimeStamp = (Get-Date $pictureDateTime).ToFileTime()
#Get actual timeStamp
$timeStamp = (Get-Date).ToFileTime()
#Get picture lifeTime
$pictureLifeTime = $timeStamp - $pictureTimeStamp
#We only treat pictures that are fully written to disk
#So, we use a 2-second delay to ensure even big pictures have been fully written to disk
if($pictureLifeTime -gt 2) {
#If upload fails, we set network status at 0
try{
$uri = New-Object System.Uri($ftp+$item.Name)
$webclient.UploadFile($uri, $item.FullName)
} catch [Exception] {
$onNetwork = "0"
write-host $_.Exception.Message;
}
#If upload succeeded, we do further actions
if($onNetwork -eq "1"){
"Copying $item..."
Copy-Item -path $item.fullName -destination $saveDir$item
"Deleting $item..."
Remove-Item $item.fullName
}
}
}
}
You can use this function:
function SendByFTP {
param (
$userFTP = "anonymous",
$passFTP = "anonymous",
[Parameter(Mandatory=$True)]$serverFTP,
[Parameter(Mandatory=$True)]$localFile,
[Parameter(Mandatory=$True)]$remotePath
)
if(Test-Path $localFile){
$remoteFile = $localFile.Split("\")[-1]
$remotePath = Join-Path -Path $remotePath -ChildPath $remoteFile
$ftpAddr = "ftp://${userFTP}:${passFTP}#${serverFTP}/$remotePath"
$browser = New-Object System.Net.WebClient
$url = New-Object System.Uri($ftpAddr)
$browser.UploadFile($url, $localFile)
}
else{
Return "Unable to find $localFile"
}
}
This function sends the specified file by FTP.
You must call the function with these parameters:
userFTP = "anonymous" by default or your username
passFTP = "anonymous" by default or your password
serverFTP = IP address of the FTP server
localFile = File to send
remotePath = the path on the FTP server
For example:
SendByFTP -userFTP "USERNAME" -passFTP "PASSWORD" -serverFTP "MYSERVER" -localFile "toto.zip" -remotePath "path/on/the/FTP/"
Goyuix's solution works great, but as presented it gives me this error: "The requested FTP command is not supported when using HTTP proxy."
Adding this line after $ftp.UsePassive = $true fixed the problem for me:
$ftp.Proxy = $null;
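In context, the relevant part of that script then reads (only the proxy line is new):
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$ftp.Proxy = $null  # avoids "The requested FTP command is not supported when using HTTP proxy"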
A simple solution if you can install curl:
curl.exe -p --insecure "ftp://<ftp_server>" --user "user:password" -T "local_file_full_path"