How to generate Powershell registry import script? - powershell

Basically I simply want to right-click on a branch in Regedit and say 'Generate Powershell script for import'. So that instead of a .reg file I get a PS script which will import/create elsewhere the entire selected registry branch with all keys/values etc.
I thought this would be a standard thing somewhere but I can't find anything, nor anyone with the same question, which surprises me.
Of course I could code it all out in PS but I'm feeling really lazy...

What you're looking for would indeed be convenient, but, as of this writing:
There is no official mechanism for customizing the regedit.exe utility's GUI that I'm aware of - unlike the (registry-based) mechanism for customizing File Explorer's shortcut menus.
Conceivably, specialized tools / advanced WinAPI-based techniques exist to achieve that.
Separately, there's no packaged PowerShell solution that I'm aware of that creates self-contained .ps1 scripts that bundle registry-import code with the data to import.
Leaving the regedit.exe GUI-integration aspect out of the picture, the building blocks of what you're looking for are:
(a) Using reg.exe export to export a given registry key's subtree to a .reg file.
(b) Later using reg.exe import to import such a file.
The PowerShell code below combines (a) and (b) as follows:
It performs (a) ...
... and embeds the resulting .reg file's content in a dynamically generated script (.ps1) ...
which, when executed on a given machine, imports the embedded data into the registry, via (b).
Below is function New-RegistryImportScript, which implements the steps above; here's a sample invocation:
Get-Item HKCU:\Console | New-RegistryImportScript -OutPath .
The above creates script .\Import-RegKey_HKEY_CURRENT_USER_Console.ps1, which has the data from the HKEY_CURRENT_USER\Console registry key (subtree) embedded and, when executed, imports that data into the registry.
The script file name was auto-generated, from the given key path, because only an output directory was specified to -OutPath (. to target the current dir.), but you may specify a file path instead, so as to use a file name of choice.
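As a quick illustration of the file-name sanitization (the regex shown is the one the function below uses):

```powershell
# Sanitize a registry key path for use in an auto-generated file name:
# every character that is neither a letter nor a digit becomes "_".
$keyPath  = 'HKEY_CURRENT_USER\Console'
$fileName = 'Import-RegKey_' + ($keyPath -replace '[^\p{L}\d]', '_') + '.ps1'
$fileName   # -> Import-RegKey_HKEY_CURRENT_USER_Console.ps1
```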
As for regedit.exe integration: Invoke shortcut-menu command Copy Key Name on the key of interest, and then pass it as an argument to New-RegistryImportScript; e.g.:
# 'HKEY_CURRENT_USER\Console' is an example path copied from regedit.exe
New-RegistryImportScript HKEY_CURRENT_USER\Console .
New-RegistryImportScript source code:
function New-RegistryImportScript {
<#
.SYNOPSIS
Generates a self-contained registry-import script.
.DESCRIPTION
Generates a self-contained registry-import script that bundles the
data exported from a given registry key (subtree), using `reg.exe`
behind the scenes.
By default, the content of the generated script is output; redirect
it to a file as needed.
Alternatively, use -OutPath to directly save it to a file.
If you specify a *directory*, a file name is auto-generated as
Import-RegKey_<sanitized_key_path>.ps1, where <sanitized_key_path>
is the input key path with all non-alphanumeric characters replaced with
"_".
If you provide multiple key paths via the pipeline, a *single* output file
is created if you pass a *file* path to -OutPath.
With a *directory* path, an auto-named script is generated for each
input key path.
.EXAMPLE
Get-Item HKCU:\Console | New-RegistryImportScript -OutPath .
Creates automatically named script .\Import-RegKey_HKEY_CURRENT_USER_Console.ps1
with the data exported from HKEY_CURRENT_USER\Console embedded in it.
#>
param(
[Alias('PSPath')]
[Parameter(Mandatory, ValueFromPipeline, ValueFromPipelineByPropertyName)] [string] $KeyPath,
[string] $OutPath
)
begin {
# Code to add at the top and bottom of the generated script
$scriptProlog = @'
[CmdletBinding()] param()
$tempFile = "$env:TEMP\" + [System.IO.Path]::GetRandomFileName() + '.reg'
& {
'@
$scriptEpilog = @'
} | Set-Content -Encoding Unicode -LiteralPath $tempFile
reg.exe import $tempFile
Remove-Item -LiteralPath $tempFile
exit $LASTEXITCODE
'@
if ($env:OS -ne 'Windows_NT') { throw "This command runs on Windows only." }
# Note: For cross-PS-edition compatibility we ensure that UTF-8 files *with BOM* are created.
$enc = if ($IsCoreCLR) { 'utf8BOM' } else { 'utf8' }
$autoGenerateName = $OutPath -and (Test-Path -Type Container -LiteralPath $OutPath)
if (-not $OutPath) {
$scriptProlog # Output the prolog to the success output stream.
} elseif (-not $autoGenerateName) {
if (($parentPath = (Split-Path -Parent $OutPath)) -and -not (Test-Path -Type Container -LiteralPath $parentPath)) {
throw "Cannot find part of the output path: $OutPath"
}
Write-Verbose "Generating script `"$OutPath`"..."
# Initialize the single output file.
$scriptProlog | Set-Content -LiteralPath $OutPath -Encoding $enc
}
}
process {
# First, try to convert to a full, provider-native path.
$nativeRegPath = Convert-Path -ErrorAction Ignore -LiteralPath $KeyPath
if (-not $nativeRegPath) { $nativeRegPath = $KeyPath } # Assume that a native registry path was directly given.
# Resolve it to a full, native registry path via a Get-Item call.
# By using "Microsoft.PowerShell.Core\Registry::" as the prefix, we rule out non-registry paths.
# !! Sadly, even the .Name property does NOT contain the *case-exact* form of the key path - it reflects the case *as specified*.
# !! However, given that the registry is inherently case-INsensitive, this should not matter.
$nativeRegPath = (Get-Item -ErrorAction Ignore -LiteralPath "Microsoft.PowerShell.Core\Registry::$nativeRegPath").Name
if (-not $nativeRegPath) {
Write-Error "Not an (existing) registry path: `"$KeyPath`""
return
}
Write-Verbose "Targeting registry key `"$nativeRegPath`""
# Export the target key's subtree from the registry.
$tempFile = New-TemporaryFile
reg.exe export $nativeRegPath $tempFile /y >$null # Creates a UTF-16LE file.
if ($LASTEXITCODE) {
Write-Error "Export of registry key `"$nativeRegPath`" failed."
return
}
$regFileContent = Get-Content -Raw $tempFile
$tempFile | Remove-Item
# Create the part of the generated script that has the exported
# data embedded as a here-string.
$scriptEmbeddedData = @"
Write-Verbose "Importing into ``"$nativeRegPath``"..."
@'
$regFileContent
'@
"@
if (-not $OutPath) {
$scriptEmbeddedData # output to the success output stream
}
else {
if ($autoGenerateName) {
# Auto-generate a filename for the key path at hand.
$OutFile = Join-Path $OutPath ('Import-RegKey_' + ($nativeRegPath -replace '[^\p{L}\d]', '_') + '.ps1')
Write-Verbose -Verbose "Generating auto-named script `"$OutFile`"..."
$scriptProlog, $scriptEmbeddedData, $scriptEpilog | Set-Content -Encoding $enc $OutFile
} else {
# Append the embedded data to the single output script.
$scriptEmbeddedData | Add-Content -Encoding $enc $OutPath
}
}
}
end {
if (-not $OutPath) {
# Output the epilog.
$scriptEpilog
}
elseif (-not $autoGenerateName) {
# Single output file? Append the epilog.
$scriptEpilog | Add-Content -Encoding $enc $OutPath
}
}
}
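Putting the pieces together: a generated .ps1 file has roughly the following shape (abbreviated here, with placeholder data for illustration):

```powershell
# Sketch of a script generated by New-RegistryImportScript (placeholder data).
[CmdletBinding()] param()
$tempFile = "$env:TEMP\" + [System.IO.Path]::GetRandomFileName() + '.reg'
& {
  Write-Verbose "Importing into `"HKEY_CURRENT_USER\Console`"..."
@'
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Console]
; ... exported values embedded here ...
'@
} | Set-Content -Encoding Unicode -LiteralPath $tempFile
reg.exe import $tempFile    # import the embedded data
Remove-Item -LiteralPath $tempFile
exit $LASTEXITCODE
```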

Related

Log PowerShell commands to LogFile

I have basic PowerShell script with logging function and some commands to run.
I'm looking for a solution of logging into log file commands what are executed.
For now I only know this, but its quite annoying to copy + paste all command to have been logged:
$LogPath = "C:\Logs"
$FileName = (Get-Item $PSCommandPath).Basename
$LogFile = $LogPath + "\" + $FileName + ".log"
Function WriteLog
{
Param ([string]$LogString)
$Stamp = (Get-Date).toString("yyyy-MM-dd HH:mm:ss")
$LogMessage = "$Stamp $LogString"
Add-content $LogFile -value $LogMessage
}
WriteLog "***********************"
WriteLog ""
WriteLog "Command1"
Command1
WriteLog "Command2"
Command2
WriteLog "Command3"
Command3
WriteLog "Command4"
Command4
WriteLog "Command5"
Command5
WriteLog ""
WriteLog "***********************"
I suggest the following:
Modify your function to alternatively accept a script block ({ ... }) representing the command to execute.
If a script block is given, use its string representation as the log message, and then execute it.
# Create the logging function in a *dynamic module* (see below).
# Place this at the top of your script.
$null = New-Module {
# Define the log-file path.
$LogPath = 'C:\Logs'
$FileName = (Get-Item $PSCommandPath).Basename
$LogFile = $LogPath + '\' + $FileName + '.log'
# Create / truncate the file.
New-Item -Force -ErrorAction Stop $LogFile
function Add-LogMessage {
[CmdletBinding(DefaultParameterSetName = 'String')]
param(
[Parameter(Position = 0, Mandatory, ParameterSetName = 'ScriptBlock')]
[scriptblock] $ScriptBlock
,
[Parameter(Position = 0, ParameterSetName = 'String')]
[string] $String
)
# If a script block was given, use its string representation
# as the log string.
if ($ScriptBlock) {
# Make the string representation single-line by replacing newlines
# (and line-leading whitespace) with "; "
$String = $ScriptBlock.ToString().Trim() -replace '\r?\n *', '; '
}
# Create a timestamped message and append it to the log file.
$stamp = (Get-Date).ToString("yyyy-MM-dd HH:mm:ss")
$logMessage = "$stamp $String"
Add-Content -LiteralPath $LogFile -Value $logMessage
# If a script block was given, execute it now.
if ($ScriptBlock) {
# Because this function is defined in a (dynamic) module,
# its invocation doesn't create a child scope of the *caller's* scope,
# and invoking the given script block, which is bound to the caller's scope,
# with . (dot-sourcing) runs it *directly in the caller's scope*.
. $ScriptBlock
}
}
}
Note:
The function name adheres to PowerShell's verb-noun convention, using Add, which is an approved verb; however, for brevity the aspect of situationally also performing execution (for which Invoke would be the approved verb) is not reflected in the name.
Your script would then look something like this:
Add-LogMessage "***********************"
Add-LogMessage ""
Add-LogMessage { Command1 }
Add-LogMessage { Command2 }
# ...
Add-LogMessage "***********************"
Note:
By placing the function inside a (dynamic, transient) module created via New-Module, its invocation does not create a child scope of the caller's scope.
When a script block created by a literal ({ ... }) in the caller's scope is passed, it can then be invoked with ., the dot-sourcing operator, which executes it directly in the caller's scope, which means that the script block's code is free to modify the script's variables, the same way that placing that code directly in the script would.
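A minimal demo of that scoping behavior, independent of the logging function (the function name here is made up for illustration):

```powershell
# Define a function inside a dynamic module, so calling it does NOT
# create a child scope of the caller's scope.
$null = New-Module {
  function Invoke-InCallerScope {
    param([scriptblock] $ScriptBlock)
    # The script block is bound to the caller's scope; dot-sourcing
    # therefore runs it *directly in the caller's scope*.
    . $ScriptBlock
  }
}

$count = 0
Invoke-InCallerScope { $count = 42 }
$count   # -> 42: the script block modified the caller's variable
```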
If you want the function to also log a given script block's output (while still printing it to the display), you can use Tee-Object as follows (for simplicity I'm assuming the same target log file, adjust as needed):
. $ScriptBlock | Tee-Object -Append -FilePath $LogFile
Caveat: As of PowerShell 7.2.x, Tee-Object uses a fixed character encoding, namely UTF-16LE ("Unicode") in Windows PowerShell and BOM-less UTF-8 in PowerShell (Core) 7+. GitHub issue #11104 suggests adding an -Encoding parameter (which only future PowerShell (Core) versions would benefit from).
Therefore, if you're using Windows PowerShell and you're targeting the same log file for capturing the output, be sure to modify the Add-Content call with -Encoding Unicode as follows:
Add-Content -Encoding Unicode -LiteralPath $LogFile -Value $logMessage
Alternatively, if you want to avoid UTF-16LE ("Unicode") files for their size (with all-ASCII characters, they're twice the size of ANSI and UTF-8 files), you can use one of the workarounds discussed in this answer.

Hidden variable Password parameter - powershell 4.0

I am creating a release definition with a PowerShell script that replaces values in a file with environment variables from the release definition. It works, but it doesn't seem to catch the password variable, which is hidden in the release definition. Is there a way to tell PowerShell to look for hidden variables?
UPDATE: Here is the script. It finds all the variables in $paramsFilePath that are not hidden; my password in the release definition's environment variables is hidden, and the script doesn't find it.
param(
[string]$paramsFilePath
)
Write-Verbose -Verbose "Entering script Replace-SetParameters.ps1"
Write-Verbose -Verbose ("Path to SetParametersFile: {0}" -f $paramsFilePath)
# get the environment variables
$vars = Get-ChildItem -path env:*
# read in the setParameters file
$contents = Get-Content -Path $paramsFilePath
# perform a regex replacement
$newContents = "";
$contents | % {
$line = $_
if ($_ -match "__(\w+)__") {
$setting = Get-ChildItem -path env:* | ? { $_.Name -eq $Matches[1] }
if ($setting) {
Write-Verbose -Verbose ("Replacing key {0} with value from environment" -f $setting.Name)
$line = $_ -replace "__(\w+)__", $setting.Value
}
}
$newContents += $line + [Environment]::NewLine
}
Write-Verbose -Verbose "Overwriting SetParameters file with new values"
Set-Content $paramsFilePath -Value $newContents
Write-Verbose -Verbose "Exiting script Replace-SetParameters.ps1"
Unlike normal variables, the password you are trying to get is a secret variable.
Secret Variables
We recommend that you make the variable Secret if it contains a
password, keys, or some other kind of data that you need to avoid
exposing.
The variable replacement we do is on the task inputs; we don't parse the scripts. To use secret variables, you have to take those as inputs into your script; we explicitly do not populate them into the environment. You could take a look at this discussion: Use hidden / secret variables in commands
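For example, instead of expecting the secret in $env:, declare it as an explicit script parameter and pass it via the task's arguments field (the variable name $(Password) below is hypothetical):

```powershell
# Task "Arguments" field in the release definition (sketch):
#   -paramsFilePath "$(ParamsFilePath)" -password "$(Password)"
# The secret reaches the script only because it is passed explicitly.
param(
    [string] $paramsFilePath,
    [string] $password   # receives the secret variable's value
)
```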

WinSCP XML log with PowerShell to confirm multiple file upload

With my script, I am attempting to scan a directory for a subdirectory that is automatically created each day and contains the date in the directory name. Once it finds yesterday's date (since I need to upload the previous day), it looks for another subdirectory, then any files that contain "JONES". Once it finds those files, it runs a foreach loop to upload them using winscp.com.
My issue is that I'm trying to use the .xml log created from winscp to send to a user to confirm uploads. The problem is that the .xml file contains only the last file uploaded.
Here's my script:
# Set yesterday's date (since uploads will happen at 2am)
$YDate = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')
# Find Directory w/ Yesterday's Date in name
$YesterdayFolder = Get-ChildItem -Path "\\Path\to\server" | Where-Object {$_.FullName.contains($YDate)}
If ($YesterdayFolder) {
#we specify the directory where all files that we want to upload are contained
$Dir= $YesterdayFolder
#list every sql server trace file
$FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") | Where-Object {$_.Name.StartsWith("JONES","CurrentCultureIgnoreCase")}
foreach($item in ($FilesToUpload))
{
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USERNAME:PASSWORD@ftps.hostname.com:21/dropoff/ -explicitssl" "put """"' + $Item.FullName + '""""" "exit"'
Invoke-Expression $PutCommand
}
} Else {
#Something Else will go here
}
I feel that it's my $PutCommand line all being contained within the ForEach loop, and it just overwrites the xml file each time it connects/exits, but I haven't had any luck breaking that script up.
You are running WinSCP again and again for each file. Each run overwrites the log of the previous run.
Call WinSCP only once instead. That's also better because you avoid re-connecting for each file.
$FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") |
Where-Object {$_.Name.StartsWith("JONES","CurrentCultureIgnoreCase")}
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USERNAME:PASSWORD@ftps.hostname.com:21/dropoff/ -explicitssl" '
foreach($item in ($FilesToUpload))
{
$PutCommand += '"put """"' + $Item.FullName + '""""" '
}
$PutCommand += '"exit"'
Invoke-Expression $PutCommand
Though all you really need to do is check WinSCP's exit code: if it is 0, all went fine. No need to have the XML log as proof.
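For instance, after the single Invoke-Expression call above, a check along these lines (sketch) confirms success without parsing the XML log:

```powershell
# Run the assembled WinSCP command, then test its exit code.
Invoke-Expression $PutCommand
if ($LASTEXITCODE -eq 0) {
    Write-Host "All uploads succeeded."
} else {
    Write-Error "WinSCP reported an error (exit code $LASTEXITCODE)."
}
```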
And even better, use the WinSCP .NET assembly from PowerShell script, instead of driving WinSCP from command-line. It does all error checking for you (you get an exception if anything goes wrong). And you avoid all nasty stuff of command-line (like escaping special symbols in credentials and filenames).
try
{
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Ftp
FtpSecure = [WinSCP.FtpSecure]::Explicit
TlsHostCertificateFingerprint = "xx:xx:xx:xx:xx:xx..."
HostName = "ftps.hostname.com"
UserName = "username"
Password = "password"
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Upload files
foreach ($item in ($FilesToUpload))
{
$session.PutFiles($Item.FullName, "/dropoff/").Check()
Write-Host "Upload of $($Item.FullName) succeeded"
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
catch [Exception]
{
Write-Host "Error: $($_.Exception.Message)"
exit 1
}

How do I make a PowerShell script traverse zip files and report based off Select-String -Pattern

I have the following that is working, but I need to also have the ability to read the contents of compressed (zip) files
function Search-Files {
param ([string[]]$Servers, [string]$SearchPath, [string]$SearchItem, [string[]]$LogName)
ForEach ($Server in $Servers) {
if ($LogName -eq $null) {
dir -Path \\$server\$SearchPath -Recurse -Force -ErrorAction SilentlyContinue -WarningAction SilentlyContinue | Select-String -pattern $SearchItem -ErrorAction SilentlyContinue -WarningAction SilentlyContinue | Select-Object Filename, Path, Matches, LineNumber
}
Else {
dir -Path \\$server\$SearchPath -Recurse -Force -ErrorAction SilentlyContinue -WarningAction SilentlyContinue | ? {$_.Name -match $LogName} | Select-String -pattern $SearchItem -ErrorAction SilentlyContinue -WarningAction SilentlyContinue | Select-Object Filename, Path, Matches, LineNumber
}
}
}
Currently I am getting the following out put displayed which is what I would like to do for zip files as well
ip.ininlog \CO200197L\C$\Temp\Test\Test\ip\ip.ininlog {3030872954} 136594
I have found the following just not sure how to proceed to get them implemented
Grep File in Zip
List File in Zip
I need the ability to traverse all zip files that are stored in a directory
Sample of Directory Structure
2014-07-01 - root
zip.zip
zip_1.zip
zip_2.zip
etc
In case you have the .NET 4.5 Framework installed, you can use 4.5's built-in ZIP support to extract files to a temporary path and run the selection on the temporary file. If .NET 4.5 is not available, I recommend using SharpCompress (https://sharpcompress.codeplex.com/), which works in a similar way.
The following code snippet demonstrates extracting a ZIP archive into a temporary file, running the selection process from your script and the cleanup after the extraction. You can significantly simplify the code by extracting the entire ZIP file at once (just use ExtractToDirectory() on the archive) if it contains only the files you are seeking.
# import .NET 4.5 compression utilities
Add-Type -As System.IO.Compression.FileSystem;
# the input archive
$archivePath = "C:\sample.zip";
# open archive for reading
$archive = [System.IO.Compression.ZipFile]::OpenRead($archivePath);
try
{
# enumerate all entries in the archive, which includes both files and directories
foreach($archiveEntry in $archive.Entries)
{
# if the entry is not a directory (which ends with /)
if($archiveEntry.FullName -notmatch '/$')
{
# get temporary file -- note that this will also create the file
$tempFile = [System.IO.Path]::GetTempFileName();
try
{
# extract to file system
[System.IO.Compression.ZipFileExtensions]::ExtractToFile($archiveEntry, $tempFile, $true);
# create PowerShell backslash-friendly path from ZIP path with forward slashes
$windowsStyleArchiveEntryName = $archiveEntry.FullName.Replace('/', '\');
# run selection
Get-ChildItem $tempFile | Select-String -pattern "yourpattern" | Select-Object @{Name="Filename";Expression={$windowsStyleArchiveEntryName}}, @{Name="Path";Expression={Join-Path $archivePath (Split-Path $windowsStyleArchiveEntryName -Parent)}}, Matches, LineNumber
}
finally
{
Remove-Item $tempFile;
}
}
}
}
finally
{
# release archive object to prevent leaking resources
$archive.Dispose();
}
If you have multiple ZIP files in the directory, you can enumerate them as follows (using your example script):
$zipArchives = Get-ChildItem -Path \\$server\$SearchPath -Recurse "*.zip";
foreach($zipArchive in $zipArchives)
{
$archivePath = $zipArchive.FullName;
...
}
You can place the demo code in the ... placeholder or move it to a PowerShell function.
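Alternatively, on PowerShell 5+ (which ships the Microsoft.PowerShell.Archive module), a simpler, if less memory-efficient, sketch is to expand each archive into a temporary directory and reuse the question's Select-String pipeline ($server, $SearchPath, and $SearchItem are the variables from the question's function):

```powershell
# For each ZIP, expand to a temp dir, search the extracted files, then clean up.
foreach ($zipArchive in Get-ChildItem -Path \\$server\$SearchPath -Recurse -Filter *.zip) {
    $tempDir = Join-Path $env:TEMP ([System.IO.Path]::GetRandomFileName())
    try {
        Expand-Archive -LiteralPath $zipArchive.FullName -DestinationPath $tempDir
        Get-ChildItem $tempDir -Recurse -File |
            Select-String -Pattern $SearchItem |
            Select-Object Filename, Path, Matches, LineNumber
    }
    finally {
        Remove-Item $tempDir -Recurse -Force   # always remove the extracted copy
    }
}
```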
Sometimes it is not desirable to extract a zip entry as a file. Instead, it may be preferable to work with the file in memory. Extracting a zip entry containing XML or JSON text so it can be parsed in memory is an example.
Here is a technique that will allow you to do this. This example assumes there is a Zip entry with a name ending in .json and it is this file which is to be retrieved. Clearly the idea can be modified to handle different cases.
This code should work with any version of the .NET Framework that includes the System.IO.Compression namespace.
try
{
# import .NET 4.5 compression utilities
Add-Type -As System.IO.Compression.FileSystem;
# A variable to hold the recovered JSON content
$json = $null
$zip = [IO.Compression.ZipFile]::OpenRead($zipFileName)
$zip.Entries |
Where-Object { $_.Name.EndsWith(".json") } |
ForEach-Object {
# Use a MemoryStream to hold the inflated file content
$memoryStream = New-Object System.IO.MemoryStream
# Read the entry
$file = $_.Open()
# Copying inflates the entry content
$file.CopyTo($memoryStream)
# Make sure the entry is closed
$file.Dispose()
# After copying, the cursor will be at the end of the stream
# so set the position to the beginning or there will be no output
$memoryStream.Position = 0
# Use a StreamReader because it allows the content to be
# read as a string in one go
$reader = New-Object System.IO.StreamReader($memoryStream)
# Read the content as a string
$json = $reader.ReadToEnd()
# Close the reader and memory stream
$reader.Dispose()
$memoryStream.Dispose()
}
# Finally close the zip file. This is necessary
# because the zip file does not get closed automatically
$zip.Dispose()
# Do something with the JSON in memory
if ( $json -ne $null )
{
$objects = $json | ConvertFrom-Json
}
}
catch
{
# Report errors
}

FTP Download Multiple Files using PowerShell

I’m new to PowerShell and am trying to convert a batch file that downloads multiple files based on names and extension from a directory on an ftp site. While I’ve found several examples that download a file, I’m struggling to find one that shows how to download multiple files. In a batch I can quite simply use the ftp.exe and the mget command with wildcards??
Can someone please point me in the right direction.
Thanks in advance.
John
There are multiple ways to achieve this. One is to use the System.Net.FtpWebRequest as shown in this example:
http://www.systemcentercentral.com/BlogDetails/tabid/143/IndexID/81125/Default.aspx
Or there are /n Software NetCmdlets you can use:
http://www.nsoftware.com/powershell/tutorials/FTP.aspx
In a batch I can quite simply use the ftp.exe and the mget command
with wildcards??
You can do the same in PowerShell if you want to.
For a more PowerShell-idiomatic way, you can use FtpWebRequest. See here: http://msdn.microsoft.com/en-us/library/ms229711.aspx. You can build on the example to download multiple files in a loop.
But the bottom line is, you do not have to convert something you have in batch to PowerShell. You can, if you want, but what you have in batch, especially when calling external programs, should work just as well.
Another resource you might want to check: PowerShell FTP Client Module
http://gallery.technet.microsoft.com/scriptcenter/PowerShell-FTP-Client-db6fe0cb
Oddly enough there are no built in cmdlets to deal with FTP. I'm not sure why the PowerShell team made that decision but it means you'll have to rely on using .NET code, a third party script/module/snap-in or a Win32 program such as FTP.exe as others have already answered with.
Here's is an example of downloading multiple files (binary and text) using .NET code:
$files = "Firefox Setup 9.0.exe", "Firefox Setup 9.0.exe.asc"
$ftpFolder = 'ftp://ftp.mozilla.org/pub/firefox/releases/9.0/win32/en-US'
$outputFolder = (Resolve-Path "~\My Documents").Path
foreach ($file in $files) {
try {
$uri = $ftpFolder + '/' + $file
$request = [Net.WebRequest]::Create($uri)
$request.Method = [Net.WebRequestMethods+Ftp]::DownloadFile
$responseStream = $request.GetResponse().GetResponseStream()
$outFile = Join-Path $outputFolder -ChildPath $file
$fs = New-Object System.IO.FileStream $outFile, "Create"
[byte[]] $buffer = New-Object byte[] 4096
do {
$count = $responseStream.Read($buffer, 0, $buffer.Length)
$fs.Write($buffer, 0, $count)
} while ($count -gt 0)
} catch {
throw "Failed to download file '{0}/{1}'. The error was {2}." -f $ftpFolder, $file, $_
} finally {
if ($fs) { $fs.Flush(); $fs.Close() }
if ($responseStream) { $responseStream.Close() }
}
}
@Jacob: You need the ::ListDirectory method to make a list. Then you have to output it to a text file with the Out-File command. After that, you import the list with the Get-Content command. So with a text file, you can make a collection of objects with a foreach loop (don't forget to skip the last line with the '-cne' condition).
Inside this loop you call your FTP-download function with the loop variable as its parameter.
Understood? Not sure if my explanation is clear.
So there's an example from one of my script :
$files = Get-FtpList $ftpSource $ftpDirectory $ftpLogin $ftpPassword | Out-File -Encoding UTF8 -FilePath list.txt
$list = Get-Content -Encoding UTF8 -Path list.txt
foreach ($entry in $list -cne "")
{
Get-FtpFile $ftpSource $ftpDirectory $entry $target $ftpLogin $ftpPassword
Start-Sleep -Milliseconds 10
}
Hope it works now for you.
PS:Get-FtpList and Get-FtpFile are custom functions.
This is what I did. As I needed to download a file based on a pattern, I dynamically created a command file and then let ftp do the rest.
I used basic PowerShell commands; I did not need to download any additional components.
I first check if the requisite number of files exists; if they do, I invoke FTP a second time with an mget.
I run this from a Windows 2008 server connecting to a Windows XP remote server.
function make_ftp_command_file($p_file_pattern,$mget_flag)
{
# This function dynamically prepares the FTP file
# The file needs to be prepared daily because the pattern changes daily
# Powershell default encoding is Unicode
# Unicode command files are not compatible with FTP so we need to make sure we create an ASCII File
write-output "USER" | out-file -filepath C:\fc.txt -encoding ASCII
write-output "ftpusername" | out-file -filepath C:\fc.txt -encoding ASCII -Append
write-output "password" | out-file -filepath C:\fc.txt -encoding ASCII -Append
write-output "ASCII" | out-file -filepath C:\fc.txt -encoding ASCII -Append
If($mget_flag -eq "Y")
{
write-output "prompt" | out-file -filepath C:\fc.txt -encoding ASCII -Append
write-output "mget $p_file_pattern" | out-file -filepath C:\fc.txt -encoding ASCII -Append
}
else
{
write-output "ls $p_file_pattern" | out-file -filepath C:\fc.txt -encoding ASCII -Append
}
write-output quit | out-file -filepath C:\fc.txt -encoding ASCII -Append
}
########################### Init Section ###############################
$yesterday = (get-date).AddDays(-1)
$yesterday_fmt = date $yesterday -format "yyyyMMdd"
$file_pattern = "BRAE_GE_*" + $yesterday_fmt + "*.csv"
$file_log = $yesterday_fmt + ".log"
echo $file_pattern
echo $file_log
############################## Main Section ############################
# Change location to folder where the files need to be downloaded
cd c:\remotefiles
# Dynamically create the FTP Command to get a list of files from the Remote Servers
echo "Call function that creates a FTP Command "
make_ftp_command_file $file_pattern N
#echo "Connect to remote site via FTP"
# Connect to Remote Server and get file listing
ftp -n -v -s:C:\Clover\scripts\fc.txt 10.129.120.31 > C:\logs\$file_log
$matches=select-string -pattern "BRAE_GE_[A-Z][A-Z]*" C:\logs\$file_log
# Check if the required number of Files available for download
if ($matches.count -eq 36)
{
# Create the ftp command file
# this time the command file has an mget rather than an ls
make_ftp_command_file $file_pattern Y
# Change directory if not done so
cd c:\remotefiles
# Invoke Ftp with newly created command file
ftp -n -v -s:C:\Clover\scripts\fc.txt 10.129.120.31 > C:\logs\$file_log
}
else
{
echo "Full set of Files not available"
}
It's not Powershell specific. But I've tried many other solutions and so far
The http://ncftp.com/ client works the best. It comes with ncftpls.exe for listing remote files and ncftpget.exe for getting files. Use them with Start-Process -Wait
A file list can be constructed in a variable and used with a regular FTP command. Note that the list is newline-separated, so it must be split on newlines, not spaces:
$FileList = "file1_$cycledate.csv
file2_$cycledate.csv
file3_$cycledate.csv
file4_$cycledate.csv"
"open $FTPServer
user $FTPUser $FTPPassword
ascii
cd report
" +
(($FileList -split '\r?\n' | %{ "mget $_" }) -join "`n") | ftp -i -n