FTP Download Multiple Files using PowerShell

I'm new to PowerShell and am trying to convert a batch file that downloads multiple files, based on name and extension, from a directory on an FTP site. While I've found several examples that download a single file, I'm struggling to find one that shows how to download multiple files. In a batch file I can quite simply use ftp.exe and the mget command with wildcards.
Can someone please point me in the right direction?
Thanks in advance.
John

There are multiple ways to achieve this. One is to use the System.Net.FtpWebRequest as shown in this example:
http://www.systemcentercentral.com/BlogDetails/tabid/143/IndexID/81125/Default.aspx
Or there are /n Software NetCmdlets you can use:
http://www.nsoftware.com/powershell/tutorials/FTP.aspx

In a batch I can quite simply use the ftp.exe and the mget command with wildcards
You can do the same in PowerShell if you want to.
For a more PowerShell-native way, you can use FtpWebRequest. See here: http://msdn.microsoft.com/en-us/library/ms229711.aspx. You can build on that example to download multiple files in a loop.
But the bottom line is, you don't have to convert something you have in batch to PowerShell. You can, if you want, but what you have in batch, especially when it calls external programs, should work just as well.
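To illustrate that last point, the batch approach carries over almost verbatim: generate an ftp.exe command script and feed it in with -s. A minimal sketch, assuming placeholder host name, credentials, and file pattern:

```powershell
# ftp.exe requires an ASCII command script; PowerShell's default Out-File encoding is Unicode.
$ftpScript = @"
open ftp.example.com
user myuser mypassword
binary
mget *.csv
quit
"@
$scriptPath = Join-Path $env:TEMP 'ftpcmds.txt'
$ftpScript | Out-File -FilePath $scriptPath -Encoding ASCII

# -n suppresses auto-login (we log in via the script), -i disables per-file prompting for mget,
# -s feeds the command script.
ftp -n -i "-s:$scriptPath"

Remove-Item $scriptPath
```

Note that -i is used instead of a `prompt` line in the script, so interactive prompting is only toggled once.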

Another resource you might want to check: PowerShell FTP Client Module
http://gallery.technet.microsoft.com/scriptcenter/PowerShell-FTP-Client-db6fe0cb

Oddly enough there are no built-in cmdlets to deal with FTP. I'm not sure why the PowerShell team made that decision, but it means you'll have to rely on .NET code, a third-party script/module/snap-in, or a Win32 program such as ftp.exe, as others have already answered.
Here's an example of downloading multiple files (binary and text) using .NET code:
$files = "Firefox Setup 9.0.exe", "Firefox Setup 9.0.exe.asc"
$ftpFolder = 'ftp://ftp.mozilla.org/pub/firefox/releases/9.0/win32/en-US'
$outputFolder = (Resolve-Path "~\My Documents").Path
foreach ($file in $files) {
    try {
        # Reset so the finally block doesn't see a previous iteration's objects.
        $fs = $responseStream = $null
        $uri = $ftpFolder + '/' + $file
        $request = [Net.WebRequest]::Create($uri)
        $request.Method = [Net.WebRequestMethods+Ftp]::DownloadFile
        $responseStream = $request.GetResponse().GetResponseStream()
        $outFile = Join-Path $outputFolder -ChildPath $file
        $fs = New-Object System.IO.FileStream $outFile, "Create"
        [byte[]] $buffer = New-Object byte[] 4096
        do {
            $count = $responseStream.Read($buffer, 0, $buffer.Length)
            $fs.Write($buffer, 0, $count)
        } while ($count -gt 0)
    } catch {
        throw ("Failed to download file '{0}/{1}'. The error was {2}." -f $ftpFolder, $file, $_)
    } finally {
        if ($fs) { $fs.Flush(); $fs.Close() }
        if ($responseStream) { $responseStream.Close() }
    }
}

@Jacob: You need the ::ListDirectory method to get a listing. Then you output it to a text file with the Out-File cmdlet and read it back with Get-Content. With that text file you can process the entries in a foreach loop (don't forget to skip the empty trailing line, e.g. with a '-cne ""' condition).
Inside that loop you call your FTP download function with the loop variable as its parameter.
Understood? Not sure if my explanation is good.
Here's an example from one of my scripts:
Get-FtpList $ftpSource $ftpDirectory $ftpLogin $ftpPassword | Out-File -Encoding UTF8 -FilePath list.txt
$list = Get-Content -Encoding UTF8 -Path list.txt
foreach ($entry in $list -cne "")
{
    Get-FtpFile $ftpSource $ftpDirectory $entry $target $ftpLogin $ftpPassword
    Start-Sleep -Milliseconds 10
}
Hope it works for you now.
PS: Get-FtpList and Get-FtpFile are custom functions.
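The ListDirectory step described above can also be done without custom functions, using FtpWebRequest directly. A rough sketch, assuming placeholder host, credentials, and directory:

```powershell
# Request a directory listing from the FTP server.
$uri = 'ftp://ftp.example.com/somedir/'
$request = [System.Net.WebRequest]::Create($uri)
$request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$request.Credentials = New-Object System.Net.NetworkCredential('user', 'password')

# Read the whole listing and split it into one entry per line, dropping empty lines.
$reader = New-Object System.IO.StreamReader ($request.GetResponse().GetResponseStream())
$files = $reader.ReadToEnd() -split "`r?`n" | Where-Object { $_ }
$reader.Close()

# $files can now drive a download loop, as in the other answers.
$files
```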

This is what I did. As I needed to download files based on a pattern, I dynamically created a command file and then let ftp.exe do the rest.
I used basic PowerShell commands; I did not need to download any additional components.
I first check whether the requisite number of files exists. If they do, I invoke FTP a second time with an mget.
I run this from a Windows 2008 server, connecting to a Windows XP remote server.
function make_ftp_command_file($p_file_pattern, $mget_flag)
{
    # This function dynamically prepares the FTP command file
    # The file needs to be prepared daily because the pattern changes daily
    # PowerShell's default encoding is Unicode
    # Unicode command files are not compatible with FTP, so we need to make sure we create an ASCII file
    write-output "USER" | out-file -filepath C:\fc.txt -encoding ASCII
    write-output "ftpusername" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    write-output "password" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    write-output "ASCII" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    If ($mget_flag -eq "Y")
    {
        write-output "prompt" | out-file -filepath C:\fc.txt -encoding ASCII -Append
        write-output "mget $p_file_pattern" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    }
    else
    {
        write-output "ls $p_file_pattern" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    }
    write-output "quit" | out-file -filepath C:\fc.txt -encoding ASCII -Append
}
########################### Init Section ###############################
$yesterday = (get-date).AddDays(-1)
$yesterday_fmt = date $yesterday -format "yyyyMMdd"
$file_pattern = "BRAE_GE_*" + $yesterday_fmt + "*.csv"
$file_log = $yesterday_fmt + ".log"
echo $file_pattern
echo $file_log
############################## Main Section ############################
# Change location to the folder where the files need to be downloaded
cd c:\remotefiles
# Dynamically create the FTP command file to get a list of files from the remote server
echo "Call function that creates an FTP command file"
make_ftp_command_file $file_pattern N
# Connect to the remote server and get a file listing
ftp -n -v -s:C:\Clover\scripts\fc.txt 10.129.120.31 > C:\logs\$file_log
$matches = select-string -pattern "BRAE_GE_[A-Z][A-Z]*" C:\logs\$file_log
# Check if the required number of files is available for download
if ($matches.count -eq 36)
{
    # Create the ftp command file
    # This time the command file has an mget rather than an ls
    make_ftp_command_file $file_pattern Y
    # Change directory if not done so
    cd c:\remotefiles
    # Invoke ftp with the newly created command file
    ftp -n -v -s:C:\Clover\scripts\fc.txt 10.129.120.31 > C:\logs\$file_log
}
else
{
    echo "Full set of files not available"
}

It's not PowerShell-specific, but of the many solutions I've tried, the http://ncftp.com/ client works the best. It comes with ncftpls.exe for listing remote files and ncftpget.exe for getting files. Use them with Start-Process -Wait.
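For example, assuming the ncftp tools are on the PATH and using placeholder host, credentials, and paths:

```powershell
# List remote files, then fetch files matching a wildcard; -Wait blocks until each tool exits.
Start-Process -Wait -NoNewWindow ncftpls.exe `
    -ArgumentList '-u', 'user', '-p', 'password', 'ftp://ftp.example.com/pub/'

# ncftpget syntax: -u user -p password host local-dir remote-files
Start-Process -Wait -NoNewWindow ncftpget.exe `
    -ArgumentList '-u', 'user', '-p', 'password', 'ftp.example.com', 'C:\downloads', '/pub/*.csv'
```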

A file list can be constructed in a variable and used with a regular FTP command:
$FileList = "file1_$cycledate.csv
file2_$cycledate.csv
file3_$cycledate.csv
file4_$cycledate.csv"
"open $FTPServer
user $FTPUser $FTPPassword
ascii
cd report
" +
($FileList.Split("`n") | %{ "mget $_" }) | ftp -i -n
Note the split is on newlines, since that is what separates the entries in $FileList.

Related

How to generate Powershell registry import script?

Basically I simply want to right-click on a branch in Regedit and say 'Generate Powershell script for import'. So that instead of a .reg file I get a PS script which will import/create elsewhere the entire selected registry branch with all keys/values etc.
I thought this would be a standard thing somewhere but I can't find anything, nor anyone with the same question, which surprises me.
Of course I could code it all out in PS but I'm feeling really lazy...
What you're looking for would indeed be convenient, but, as of this writing:
There is no official mechanism for customizing the regedit.exe utility's GUI that I'm aware of - unlike the (registry-based) mechanism for customizing File Explorer's shortcut menus.
Conceivably, specialized tools / advanced WinAPI-based techniques exist to achieve that.
Separately, there's no packaged PowerShell solution that I'm aware of that creates self-contained .ps1 scripts that bundle registry-import code with the data to import.
Leaving the regedit.exe GUI-integration aspect out of the picture, the building blocks of what you're looking for are:
(a) Using reg.exe export to export a given registry key's subtree to a .reg file.
(b) Later using reg.exe import to import such a file.
PowerShell code that combines (a) and (b) as follows:
It performs (a) ...
... and embeds the resulting .reg file's content in a dynamically generated script (.ps1) ...
which, when executed on a given machine, imports the embedded data into the registry, via (b).
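In isolation, the two building blocks look like this (the key path and file path are just examples):

```powershell
# (a) Export a key's subtree to a .reg file; /y overwrites an existing file without prompting.
reg.exe export 'HKEY_CURRENT_USER\Console' "$env:TEMP\console.reg" /y

# (b) Import that file back into the registry (on the same or another machine).
reg.exe import "$env:TEMP\console.reg"
```

The function below automates (a) and embeds the result so that the generated script only needs (b) at run time.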
Below is function New-RegistryImportScript, which implements the steps above; here's a sample invocation:
Get-Item HKCU:\Console | New-RegistryImportScript -OutPath .
The above creates script .\Import-RegKey_HKEY_CURRENT_USER_Console.ps1, which has the data from the HKEY_CURRENT_USER\Console registry key (subtree) embedded and, when executed, imports that data into the registry.
The script file name was auto-generated, from the given key path, because only an output directory was specified to -OutPath (. to target the current dir.), but you may specify a file path instead, so as to use a file name of choice.
As for regedit.exe integration: Invoke shortcut-menu command Copy Key Name on the key of interest, and then pass it as an argument to New-RegistryImportScript; e.g.:
# 'HKEY_CURRENT_USER\Console' is an example path copied from regedit.exe
New-RegistryImportScript HKEY_CURRENT_USER\Console .
New-RegistryImportScript source code:
function New-RegistryImportScript {
  <#
  .SYNOPSIS
  Generates a self-contained registry-import script.

  .DESCRIPTION
  Generates a self-contained registry-import script that bundles the
  data exported from a given registry key (subtree), using `reg.exe`
  behind the scenes.

  By default, the content of the generated script is output; redirect
  it to a file as needed.

  Alternatively, use -OutPath to directly save it to a file.
  If you specify a *directory*, a file name is auto-generated as
  Import-RegKey_<sanitized_key_path>.ps1, where <sanitized_key_path>
  is the input key path with all non-alphanumeric characters replaced with
  "_".

  If you provide multiple key paths via the pipeline, a *single* output file
  is created if you pass a *file* path to -OutPath.
  With a *directory* path, an auto-named script is generated for each
  input key path.

  .EXAMPLE
  Get-Item HKCU:\Console | New-RegistryImportScript -OutPath .

  Creates automatically named script .\Import-RegKey_HKEY_CURRENT_USER_Console.ps1
  with the data exported from HKEY_CURRENT_USER\Console embedded in it.
  #>
  param(
    [Alias('PSPath')]
    [Parameter(Mandatory, ValueFromPipeline, ValueFromPipelineByPropertyName)] [string] $KeyPath,
    [string] $OutPath
  )

  begin {
    # Code to add at the top and bottom of the generated script
    $scriptProlog = @'
[CmdletBinding()] param()
$tempFile = "$env:TEMP\" + [System.IO.Path]::GetRandomFileName() + '.reg'
& {
'@
    $scriptEpilog = @'
} | Set-Content -Encoding Unicode -LiteralPath $tempFile
reg.exe import $tempFile
Remove-Item -LiteralPath $tempFile
exit $LASTEXITCODE
'@

    if ($env:OS -ne 'Windows_NT') { throw "This command runs on Windows only." }

    # Note: For cross-PS-edition compatibility we ensure that UTF-8 files *with BOM* are created.
    $enc = if ($IsCoreCLR) { 'utf8BOM' } else { 'utf8' }
    $autoGenerateName = $OutPath -and (Test-Path -Type Container -LiteralPath $OutPath)
    if (-not $OutPath) {
      $scriptProlog # Output the prolog to the success output stream.
    } elseif (-not $autoGenerateName) {
      if (($parentPath = (Split-Path -Parent $OutPath)) -and -not (Test-Path -Type Container -LiteralPath $parentPath)) {
        throw "Cannot find part of the output path: $OutPath"
      }
      Write-Verbose "Generating script `"$OutPath`"..."
      # Initialize the single output file.
      $scriptProlog | Set-Content -LiteralPath $OutPath -Encoding $enc
    }
  }

  process {
    # First, try to convert to a full, provider-native path.
    $nativeRegPath = Convert-Path -ErrorAction Ignore -LiteralPath $KeyPath
    if (-not $nativeRegPath) { $nativeRegPath = $KeyPath } # Assume that a native registry path was directly given.
    # Resolve it to a full, native registry path via a Get-Item call.
    # By using "Microsoft.PowerShell.Core\Registry::" as the prefix, we rule out non-registry paths.
    # !! Sadly, even the .Name property does NOT contain the *case-exact* form of the key path - it reflects the case *as specified*.
    # !! However, given that the registry is inherently case-INsensitive, this should not matter.
    $nativeRegPath = (Get-Item -ErrorAction Ignore -LiteralPath "Microsoft.PowerShell.Core\Registry::$nativeRegPath").Name
    if (-not $nativeRegPath) {
      Write-Error "Not an (existing) registry path: `"$KeyPath`""
      return
    }

    Write-Verbose "Targeting registry key `"$nativeRegPath`""

    # Export the target key's subtree from the registry.
    $tempFile = New-TemporaryFile
    reg.exe export $nativeRegPath $tempFile /y >$null # Creates a UTF-16LE file.
    if ($LASTEXITCODE) {
      Write-Error "Export of registry key `"$nativeRegPath`" failed."
      return
    }
    $regFileContent = Get-Content -Raw $tempFile
    $tempFile | Remove-Item

    # Create the part of the generated script that has the exported
    # data embedded as a here-string.
    $scriptEmbeddedData = @"
Write-Verbose "Importing into ``"$nativeRegPath``"..."
@'
$regFileContent
'@
"@
    if (-not $OutPath) {
      $scriptEmbeddedData # output to the success output stream
    }
    else {
      if ($autoGenerateName) {
        # Auto-generate a filename for the key path at hand.
        $OutFile = Join-Path $OutPath ('Import-RegKey_' + ($nativeRegPath -replace '[^\p{L}\d]', '_') + '.ps1')
        Write-Verbose -Verbose "Generating auto-named script `"$OutFile`"..."
        $scriptProlog, $scriptEmbeddedData, $scriptEpilog | Set-Content -Encoding $enc $OutFile
      } else {
        # Append the embedded data to the single output script.
        $scriptEmbeddedData | Add-Content -Encoding $enc $OutPath
      }
    }
  }

  end {
    if (-not $OutPath) {
      # Output the epilog.
      $scriptEpilog
    }
    elseif (-not $autoGenerateName) {
      # Single output file? Append the epilog.
      $scriptEpilog | Add-Content -Encoding $enc $OutPath
    }
  }
}

Powershell move files to a new folder that are not still writing to the source folder

I have a PowerShell script that's moving files from a source directory over to a target directory every 15 minutes. Files of around 1 MB are moved into the source directory by an SFTP server... so the files can be written at any time by the SFTP clients.
The Move-Item command is moving files, however it seems that it's moving them without making sure the file isn't still being written (in-use?).
I need some help coming up with a way to write the files from the source to the target and make sure the entire file gets to the target. Anyone run across this issue before with Powershell?
I searched and was able to find a few functions that said they solved the problem but when I tried them out I wasn't seeing the same results.
Existing PowerShell script is below:
Move-Item "E:\SFTP_Server\UserFolder\*.*" "H:\TargetFolder\" -Verbose -Force *>&1 | Out-File -FilePath E:\Powershell_Scripts\LOGS\MoveFilesToTarget-$(get-date -f yyyy-MM-dd-HH-mm-ss).txt
I ended up cobbling together a few things and got this working as I wanted. Basically I'm looping through the files and checking the length of each file once... then waiting a second and checking the length again to see if it's changed. This seems to be working well. Here's a copy of the script in case it helps anyone in the future!
$logfile = "H:\WriteTest\LogFile_$(get-date -format `"yyyyMMdd_hhmmsstt`").txt"

function log($string, $color)
{
    if ($color -eq $null) { $color = "white" }
    write-host $string -foregroundcolor $color
    $string | out-file -Filepath $logfile -append
}

$SourcePath = "E:\SFTP_Server\UserFolder\"
$TargetPath = "H:\TargetFolder\"

$Stuff = Get-ChildItem "$SourcePath\*.*" | select name, fullname

ForEach ($I in $Stuff) {
    log "Starting to process $($I.name)" green
    $newfile = $TargetPath + $I.name
    $LastLength = 1
    $NewLength = (Get-Item $I.fullname).length
    while ($NewLength -ne $LastLength) {
        $LastLength = $NewLength
        Start-Sleep -Seconds 1
        log "Waiting 1 Second" green
        $NewLength = (Get-Item $I.fullname).length
        log "Current File Length = $NewLength" green
    }
    log "File Not In Use - Ready To Move!" green
    Move-Item $I.fullname $TargetPath
}

Powershell not sending the right path for a file as argument

I'm trying to apply a hash function to all the files inside a folder as a kind of version control. The idea is to create a text file that lists the name of each file and its generated checksum. Digging online I found some code that should do the trick (in theory):
$list = Get-ChildItem 'C:\users\public\documents\folder' -Filter *.cab
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
foreach ($file in $list) {
    $return = "" | Select Name, Hash
    $returnname = $file.Name
    $returnhash = [System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file.Name)))
    $return = "$returnname,$returnhash"
    Out-File -FilePath .\mylist.txt -Encoding Default -InputObject ($return) -Append
}
When I run it however, I get an error because it tries to read the files from c:\users\me\, the folder where I'm running the script. And the file c:\users\me\aa.cab does not exist and hence can't be reached.
I've tried everything that I could think of, but no luck. I'm using Windows 7 with Powershell 2.0, if that helps in any way.
Try with .FullName instead of just .Name; a bare file name gets resolved against the current directory, which is why the script looks in c:\users\me.
$returnhash = [System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file.FullName)))

WinSCP XML log with PowerShell to confirm multiple file upload

With my script, I am attempting to scan a directory for a subdirectory that is automatically created each day that contains the date in the directory name. Once it finds yesterdays date (since I need to upload previous day), it looks for another subdirectory, then any files that contain "JONES". Once it finds those files, it does a foreach loop to upload them using winscp.com.
My issue is that I'm trying to use the .xml log created from winscp to send to a user to confirm uploads. The problem is that the .xml file contains only the last file uploaded.
Here's my script:
# Set yesterday's date (since uploads will happen at 2am)
$YDate = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')
# Find directory w/ yesterday's date in its name
$YesterdayFolder = Get-ChildItem -Path "\\Path\to\server" | Where-Object {$_.FullName.Contains($YDate)}
If ($YesterdayFolder) {
    # We specify the directory where all files that we want to upload are contained
    $Dir = $YesterdayFolder
    # List every file to upload
    $FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") | Where-Object {$_.Name.StartsWith("JONES","CurrentCultureIgnoreCase")}
    foreach ($item in ($FilesToUpload))
    {
        $PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USERNAME:PASSWORD@ftps.hostname.com:21/dropoff/ -explicitssl" "put """"' + $Item.FullName + '""""" "exit"'
        Invoke-Expression $PutCommand
    }
} Else {
    # Something else will go here
}
I feel that it's my $PutCommand line all being contained within the ForEach loop, and it just overwrites the xml file each time it connects/exits, but I haven't had any luck breaking that script up.
You are running WinSCP again and again for each file. Each run overwrites the log of the previous run.
Call WinSCP only once instead. That's even better anyway, as you avoid re-connecting for each file.
$FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") |
    Where-Object {$_.Name.StartsWith("JONES","CurrentCultureIgnoreCase")}

$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USERNAME:PASSWORD@ftps.hostname.com:21/dropoff/ -explicitssl" '

foreach ($item in ($FilesToUpload))
{
    $PutCommand += '"put """"' + $Item.FullName + '""""" '
}

$PutCommand += '"exit"'

Invoke-Expression $PutCommand
Though all you really need to do is check WinSCP's exit code. If it is 0, all went fine; no need for the XML log as proof.
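Checking the exit code might look like this (the script file name is a placeholder):

```powershell
# Run winscp.com with a script file; $LASTEXITCODE holds the external program's exit code.
& "C:\Program Files (x86)\WinSCP\winscp.com" /script=upload.txt

if ($LASTEXITCODE -eq 0) {
    Write-Host "All uploads succeeded"
} else {
    Write-Host "Upload failed with exit code $LASTEXITCODE"
}
```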
And even better, use the WinSCP .NET assembly from PowerShell script, instead of driving WinSCP from command-line. It does all error checking for you (you get an exception if anything goes wrong). And you avoid all nasty stuff of command-line (like escaping special symbols in credentials and filenames).
try
{
    # Load WinSCP .NET assembly
    Add-Type -Path "WinSCPnet.dll"

    # Set up session options
    $sessionOptions = New-Object WinSCP.SessionOptions -Property @{
        Protocol = [WinSCP.Protocol]::Ftp
        FtpSecure = [WinSCP.FtpSecure]::Explicit
        TlsHostCertificateFingerprint = "xx:xx:xx:xx:xx:xx..."
        HostName = "ftps.hostname.com"
        UserName = "username"
        Password = "password"
    }

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.Open($sessionOptions)

        # Upload files
        foreach ($item in ($FilesToUpload))
        {
            $session.PutFiles($Item.FullName, "/dropoff/").Check()
            Write-Host "Upload of $($Item.FullName) succeeded"
        }
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }

    exit 0
}
catch [Exception]
{
    Write-Host "Error: $($_.Exception.Message)"
    exit 1
}

Pipe all Write-Output to the same Out-File in PowerShell

As the title suggests, how do you make it so that all of the Write-Output calls - no matter where they appear - automatically append to your defined log file? That way the script is nicer to read and it removes a tiny bit of work!
A little example is below; I'd like to see none of the "| Out-File" calls if possible, yet still have the output go to that file!
$Author = 'Max'
$Time = Get-Date -Format "HH:mm:ss.fff"
$Title = "Illegal Software Removal"
$LogName = "Illegal_Remove_$($env:COMPUTERNAME).log"
$Log = "C:\Windows\Logs\Software" + "\" + $LogName
$RemoteLog = "\\Server\Adobe Illegal Software Removal"
Set-PSBreakpoint -Variable Time -Mode Read -Action { $global:Time = Get-Date -format "HH:mm:ss.fff" } | Out-Null
If((Test-Path $Log) -eq $False){ New-Item $Log -ItemType "File" -Force | Out-Null }
Else { $Null }
"[$Time][Startup] $Title : Created by $Author" | Out-File $Log -Append
"[$Time][Startup] Configuring initial variables required before run..." | Out-File $Log -Append
EDIT: This needs to work on PS v2.0. I don't want the output to appear on screen at all, only in the log. So I'd keep the same functionality, but the script would look like this:
"[$Time][Startup] $Title : Created by $Author"
"[$Time][Startup] Configuring initial variables required before run..."
You have two options. One is to do the redirection at the point the script is invoked, e.g.:
PowerShell.exe -Command "& {c:\myscript.ps1}" > c:\myscript.log
Or you can use the Start-Transcript cmdlet to record everything (except exe output) the shell sees. After the script is done, call Stop-Transcript.
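A sketch of the transcript approach, using the question's log path as an example. Two caveats: Start-Transcript also echoes output to the console, so it doesn't fully satisfy the "log only" requirement, and on PowerShell v2.0 it only works in the console host:

```powershell
# Everything written to the host from here on is also recorded in the transcript file.
Start-Transcript -Path "C:\Windows\Logs\Software\Illegal_Remove_$($env:COMPUTERNAME).log" -Append

"[$(Get-Date -Format 'HH:mm:ss.fff')][Startup] Script started"
# ... rest of the script; bare strings and Write-Output land in the transcript ...

Stop-Transcript
```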