I have a tool that logs some data onto a file. I'd like to tail the file and send the last line of data via mosquitto_pub.
I've used the PowerShell "Get-Content" command without success.
Here's my command:
Get-Content -Path "C:\test.txt" -Wait | .\mosquitto_pub.exe -t "Events"
But nothing is published by mosquitto_pub.
If I use Get-Content -Path "C:\test.txt" -Wait
I see the tail of the file in stdout.
What's wrong with my solution?
Thanks!
Read this Q and A.
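In short, mosquitto_pub ignores piped input unless it is told to read from stdin. If your build supports the -l option (publish each line read from stdin as a separate message), the original pipeline may work directly; a minimal sketch, untested against your broker:
# Assumes this mosquitto_pub build supports -l (read newline-delimited messages from stdin);
# -Tail 1 starts from the file's last line rather than replaying the whole file.
Get-Content -Path "C:\test.txt" -Wait -Tail 1 | .\mosquitto_pub.exe -t "Events" -l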
An alternate approach
$minsToRunFor = 10
$secondsToRunFor = $minsToRunFor * 60
foreach ($second in 1..$secondsToRunFor) {
    $lastline = Get-Content -Path "C:\test.txt" | Select-Object -Last 1
    # added condition as per VonPryz's good point
    # (otherwise will publish lastline regardless of whether it's new or not)
    if ($lastline -ne $oldlastline) {
        .\mosquitto_pub.exe -t "Events" -m "$lastline"
    }
    $oldlastline = $lastline
    Start-Sleep -Seconds 1
}
Related
I use the following command in Powershell to convert files in the background but would like to log the results all in one file. Now the -RedirectStandardOutput replaces the file each run.
foreach ($l in gc ./files.txt) {Start-Process -FilePath "c:\Program Files (x86)\calibre2\ebook-convert.exe" -Argumentlist "'$l' '$l.epub'" -Wait -WindowStyle Hidden -RedirectStandardOutput log.txt}
I tried with a redirect but then the log is empty.
If possible I would like to keep it a one-liner.
foreach ($l in gc ./files.txt) {Start-Process -FilePath "c:\Program Files (x86)\calibre2\ebook-convert.exe" -Argumentlist "`"$l`" `"$l.epub`"" -Wait -WindowStyle Hidden *> log.txt}
If sequential, synchronous execution is acceptable, you can simplify your command to use a single output redirection (the assumption is that ebook-convert.exe is a console-subsystem application, which PowerShell therefore executes synchronously, in a blocking manner):
Get-Content ./files.txt | ForEach-Object {
    & 'c:\Program Files (x86)\calibre2\ebook-convert.exe' $_ "$_.epub"
} *> log.txt
Placing * before > tells PowerShell to redirect all output streams, which in the case of external programs means both stdout and stderr.
If you want to control the character encoding, use Out-File - which > effectively is an alias for - with its -Encoding parameter; or, preferably with text output - which external-program output always is in PowerShell - use Set-Content. To also capture stderr output, append *>&1 to the command in the pipeline segment before the Out-File / Set-Content call.
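For instance, a sketch of the Out-File variant, assuming UTF-8 is the desired encoding:
# *>&1 merges stderr into the success stream inside the pipeline,
# so Out-File captures both streams with the chosen encoding.
Get-Content ./files.txt | ForEach-Object {
    & 'c:\Program Files (x86)\calibre2\ebook-convert.exe' $_ "$_.epub" *>&1
} | Out-File -Encoding utf8 log.txt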
Note that PowerShell never passes raw output from external programs through to files - they are first always decoded into .NET strings, based on the encoding stored in [Console]::OutputEncoding (the system's active legacy OEM code page by default), and then re-encoded on saving to a file, using the file-writing cmdlet's own defaults, unless overridden with -Encoding - see this answer for more information.
If you want asynchronous, parallel execution (such as via Start-Process, which is asynchronous by default), your best bet is to:
write to separate (temporary) files:
Pass a different output file to -RedirectStandardOutput / -RedirectStandardError in each invocation.
Note that if you want to merge stdout and stderr output and capture it in the same file, you'll have to call your .exe file via a shell (possibly another PowerShell instance) and use its redirection features; for PowerShell, it would be *>log.txt; for cmd.exe (as shown below), it would be > log.txt 2>&1
wait for all launched processes to finish:
Pass -PassThru to Start-Process and collect the process-information objects returned.
Then use Wait-Process to wait for all processes to terminate; use the -Timeout parameter as needed.
and then merge them into a single log file.
Here's an implementation:
$procsAndLogFiles =
    Get-Content ./files.txt | ForEach-Object -Begin { $i = 0 } {
        # Create a distinct log file for each process,
        # and return its name along with a process-information object representing
        # each process as a custom object.
        $logFile = 'log{0:000}.txt' -f ++$i
        [pscustomobject] @{
            LogFile = $logFile
            Process = Start-Process -PassThru -WindowStyle Hidden `
                -FilePath 'cmd.exe' `
                -ArgumentList "/c `"`"c:\Program Files (x86)\calibre2\ebook-convert.exe`" `"$_`" `"$_.epub`" >`"$logFile`" 2>&1`""
        }
    }
# Wait for all processes to terminate.
# Add -Timeout and error handling as needed.
$procsAndLogFiles.Process | Wait-Process
# Merge all log files.
Get-Content -LiteralPath $procsAndLogFiles.LogFile > log.txt
# Clean up.
Remove-Item -LiteralPath $procsAndLogFiles.LogFile
If you want throttled parallel execution, so as to limit how many background processes can run at a time:
# Limit how many background processes may run in parallel at most.
$maxParallelProcesses = 10
# Initialize the log file.
# Use -Force to unconditionally replace an existing file.
New-Item -Force log.txt
# Initialize the list in which those input files whose conversion
# failed due to timing out are recorded.
$allTimedOutFiles = [System.Collections.Generic.List[string]]::new()
# Process the input files in batches of $maxParallelProcesses
Get-Content -ReadCount $maxParallelProcesses ./files.txt |
    ForEach-Object {
        $i = 0
        $launchInfos = foreach ($file in $_) {
            # Create a distinct log file for each process,
            # and return its name along with the input file name / path, and
            # a process-information object representing each process, as a custom object.
            $logFile = 'log{0:000}.txt' -f ++$i
            [pscustomobject] @{
                InputFile = $file
                LogFile = $logFile
                Process = Start-Process -PassThru -WindowStyle Hidden `
                    -FilePath 'cmd.exe' `
                    -ArgumentList "/c `"`"c:\Program Files (x86)\calibre2\ebook-convert.exe`" `"$file`" `"$file.epub`" >`"$logFile`" 2>&1`""
            }
        }
        # Wait for the processes to terminate, with a timeout.
        $launchInfos.Process | Wait-Process -Timeout 30 -ErrorAction SilentlyContinue -ErrorVariable errs
        # If not all processes terminated within the timeout period,
        # forcefully terminate those that didn't.
        if ($errs) {
            $timedOut = $launchInfos | Where-Object { -not $_.Process.HasExited }
            Write-Warning "Conversion of the following input files timed out; the processes will be killed:`n$($timedOut.InputFile)"
            $timedOut.Process | Stop-Process -Force
            $allTimedOutFiles.AddRange(@($timedOut.InputFile))
        }
        # Merge all temp. log files and append to the overall log file.
        $tempLogFiles = Get-Item -ErrorAction Ignore -LiteralPath ($launchInfos.LogFile | Sort-Object)
        $tempLogFiles | Get-Content >> log.txt
        # Clean up.
        $tempLogFiles | Remove-Item
    }
# * log.txt now contains all combined logs
# * $allTimedOutFiles now contains all input file names / paths
# whose conversion was aborted due to timing out.
Note that the above throttling technique isn't optimal, because each batch of inputs is waited for together, and only then is the next batch started. A better approach is to launch a new process as soon as one of the available parallel "slots" frees up, as shown in the next section; however, note that PowerShell (Core) 7+ is required.
PowerShell (Core) 7+: Efficiently throttled parallel execution, using ForEach-Object -Parallel:
PowerShell (Core) 7+ introduced thread-based parallelism to the ForEach-Object cmdlet, via the -Parallel parameter, which has built-in throttling that defaults to a maximum of 5 threads, but can be controlled explicitly via the -ThrottleLimit parameter.
This enables efficient throttling, as a new thread is started as soon as an available slot opens up.
The following is a self-contained example that demonstrates the technique; it works on both Windows and Unix-like platforms:
Inputs are 9 integers, and the conversion process is simulated simply by sleeping a random number of seconds between 1 and 9, followed by echoing the input number.
A timeout of 6 seconds is applied to each child process, meaning that a random number of child processes will time out and be killed.
#requires -Version 7
# Use ForEach-Object -Parallel to launch child processes in parallel,
# limiting the number of parallel threads (from which the child processes are
# launched) via -ThrottleLimit.
# -AsJob returns a single job whose child jobs track the threads created.
$job =
    1..9 | ForEach-Object -ThrottleLimit 3 -AsJob -Parallel {
        # Determine a temporary, thread-specific log file name.
        $logFile = 'log_{0:000}.txt' -f $_
        # Pick a random sleep time that may or may not be smaller than the timeout period.
        $sleepTime = Get-Random -Minimum 1 -Maximum 9
        # Launch the external program asynchronously and save information about
        # the newly launched child process.
        if ($env:OS -eq 'Windows_NT') {
            $ps = Start-Process -PassThru -WindowStyle Hidden cmd.exe "/c `"timeout $sleepTime >NUL & echo $_ >$logFile 2>&1`""
        }
        else { # macOS, Linux
            $ps = Start-Process -PassThru sh "-c `"{ sleep $sleepTime; echo $_; } >$logFile 2>&1`""
        }
        # Wait for the child process to exit within a given timeout period.
        $ps | Wait-Process -Timeout 6 -ErrorAction SilentlyContinue
        # Check if a timeout has occurred (implied by the process not having exited yet).
        $timedOut = -not $ps.HasExited
        if ($timedOut) {
            # Note: Only [Console]::WriteLine produces immediate output, directly to the display.
            [Console]::WriteLine("Warning: Conversion timed out for: $_")
            # Kill the timed-out process.
            $ps | Stop-Process -Force
        }
        # Construct and output a custom object that indicates the input at hand,
        # the associated log file, and whether a timeout occurred.
        [pscustomobject] @{
            InputFile = $_
            LogFile = $logFile
            TimedOut = $timedOut
        }
    }
# Wait for all child processes to exit or be killed
$processInfos = $job | Receive-Job -Wait -AutoRemoveJob
# Merge all temporary log files into an overall log file.
$tempLogFiles = Get-Item -ErrorAction Ignore -LiteralPath ($processInfos.LogFile | Sort-Object)
$tempLogFiles | Get-Content > log.txt
# Clean up the temporary log files.
$tempLogFiles | Remove-Item
# To illustrate the results, show the overall log file's content
# and which inputs caused timeouts.
[pscustomobject] @{
    CombinedLogContent = Get-Content -Raw log.txt
    InputsThatFailed = ($processInfos | Where-Object TimedOut).InputFile
} | Format-List
# Clean up the overall log file.
Remove-Item log.txt
You can use redirection and append to files if you don't use Start-Process, but a direct invocation:
foreach ($l in gc ./files.txt) {& 'C:\Program Files (x86)\calibre2\ebook-convert.exe' "$l" "$l.epub" *>> log.txt}
For the moment I'm using an adaptation of mklement0's answer.
ebook-convert.exe often hangs so I need to close it down if the process takes longer than the designated time.
This needs to run asynchronously because of the number of files and the processor time taken (5 to 25%, depending on the conversion).
The timeout needs to be per file, not on the whole of the jobs.
$procsAndLogFiles =
    Get-Content ./files.txt | ForEach-Object -Begin { $i = 0 } {
        # Create a distinct log file for each process,
        # and return its name along with a process-information object representing
        # each process as a custom object.
        $logFile = 'd:\temp\log{0:000}.txt' -f ++$i
        Write-Host "$(Get-Date) $_"
        [pscustomobject] @{
            LogFile = $logFile
            Process = Start-Process `
                -PassThru `
                -FilePath "c:\Program Files (x86)\calibre2\ebook-convert.exe" `
                -ArgumentList "`"$_`" `"$_.epub`"" `
                -WindowStyle Hidden `
                -RedirectStandardOutput $logFile `
                | Wait-Process -Timeout 30
        }
    }
# Wait for all processes to terminate.
# Add -Timeout and error handling as needed.
$procsAndLogFiles.Process
# Merge all log files.
Get-Content -LiteralPath $procsAndLogFiles.LogFile > log.txt
# Clean up.
Remove-Item -LiteralPath $procsAndLogFiles.LogFile
Since the problem in my other answer was not completely solved (not killing all the processes that take longer than the timeout limit), I rewrote it in Ruby.
It's not PowerShell, but if you land on this question and also know Ruby, it could help you.
I believe it's the use of Threads that solves the killing issue.
require 'logger'
LOG = Logger.new("log.txt")
PROGRAM = 'c:\Program Files (x86)\calibre2\ebook-convert.exe'
LIST = 'E:\ebooks\english\_convert\mobi\files.txt'
TIMEOUT = 30
MAXTHREADS = 6
def run(file, log: nil)
  output = ""
  # 2>&1 inside the command string merges stderr into the captured output.
  command = %Q{"#{PROGRAM}" "#{file}" "#{file}.epub" 2>&1}
  IO.popen(command) do |io|
    begin
      while (line = io.gets) do
        output += line
        log.info line.chomp if log
      end
    rescue => ex
      log.error ex.message
      system("taskkill /f /pid #{io.pid}") rescue log.error $!
    end
  end
  if File.exist? "#{file}.epub"
    puts "converted #{file}.epub"
    File.delete(file)
  else
    puts "error #{file}"
  end
  output
end
threads = []
File.readlines(LIST).each do |file|
  file.chomp! # remove line feed
  # some checks
  if !File.exist? file
    puts "not found #{file}"
    next
  end
  if File.exist? "#{file}.epub"
    puts "skipping #{file}"
    File.delete(file) if File.exist? file
    next
  end
  # go on with the conversion
  thread = Thread.new { run(file, log: LOG) }
  threads << thread
  next if threads.length < MAXTHREADS
  threads.each do |t|
    t.join(TIMEOUT)
    unless t.alive?
      t.kill
      threads.delete(t)
    end
  end
end
I'm using FFmpeg with PowerShell.
I have a loop that goes through a folder of mpg files and grabs the names to a variable $inputName.
FFmpeg then converts each one to an mp4.
Works
Batch Processing
$files = Get-ChildItem "C:\Path\" -Filter *.mpg;
foreach ($f in $files) {
    $inputName = $f.Name; #name + extension
    $outputName = (Get-Item $inputName).Basename; #name only
    ffmpeg -y -i "C:\Users\Matt\Videos\$inputName" -c:v libx264 -crf 25 "C:\Users\Matt\Videos\$outputName.mp4"
}
Not Working
Batch Processing with Process Priority
$files = Get-ChildItem "C:\Path\" -Filter *.mpg;
foreach ($f in $files) {
    $inputName = $f.Name; #name + extension
    $outputName = (Get-Item $inputName).Basename; #name only
    ($Process = Start-Process ffmpeg -NoNewWindow -ArgumentList '-y -i "C:\Users\Matt\Videos\$inputName" -c:v libx264 -crf 25 "C:\Users\Matt\Videos\$outputName.mp4"' -PassThru).PriorityClass = [System.Diagnostics.ProcessPriorityClass]::AboveNormal;
    Wait-Process -Id $Process.id
}
If I set the Process Priority using Start-Process PriorityClass, the $inputName variable is no longer recognized.
Error:
C:\Users\Matt\Videos\$inputName: No such file or directory
Let's go over a few basic things.
In PowerShell we love piping |; it allows us to pass the information from one command to another command.
A good example of this is the ForEach you have.
Instead of Foreach($F in $Files) you can pipe | into a Foreach-Object:
Get-ChildItem "C:\Path\" -Filter *.mpg | Foreach-Object{
$_
}
When piping | to a command, PowerShell automatically creates the variable $_, which is the object that is passed in the pipe |.
The next thing is that there are 2 types of quotes: " and '.
If you use ' then everything is taken literally. Example:
$FirstName = "TestName"
'Hey There $FirstName'
Will return
Hey There $FirstName
While " allows you to use Variables in it. Example
$FirstName = "TestName"
'Hey There $FirstName'
Will return
Hey There TestName
Now one last thing before we fix this. In PowerShell we have an escape char ` aka a backtick. It's located beside the number 1 on the keyboard, sharing a key with the tilde. You use it to allow the use of characters that would otherwise break out of the quotes. Example:
"`"Hey There`""
Would return
"Hey There"
OK, so now that we've covered the basics, let's fix up the script:
Get-ChildItem "C:\Users\Matt\Videos\" -Filter *.mpg -File | Foreach-Object{
($Process = Start-Process ffmpeg -NoNewWindow -ArgumentList "-y -i `"$($_.FullName)`" -c:v libx264 -crf 25 `"C:\Users\Matt\Videos\$($_.Name)`"" -PassThru).PriorityClass = [System.Diagnostics.ProcessPriorityClass]::AboveNormal;
Try{
Wait-Process -Id $Process.id
}catch{
}
}
In the case above I changed
Add -File to the Get-ChildItem to designate that you only want Files returned not folders
Pipe | into a Foreach-Object
Changed the outer quotes of the -ArgumentList value to double quotes " instead of literal single quotes '
Removed the $InputName and $OutputName in favor of the Foreach-Object variable $_
Is it possible to remove or replace the last character on the last non-whitespace line of a file using PowerShell 1?
I'm trying to get an Uptime log that is precise to within 5 minutes.
I've found that there are built-in logs and commands, accessible through the command prompt, that can tell me when the computer was last booted up or when it shut down correctly. But the native uptime log only records once every 24 hours, so if there is a power failure I won't know how long the system has been offline with any precision finer than 24 hours.
So I have created the following script:
$bootTime = (Get-WmiObject Win32_OperatingSystem).LastBootUpTime
$formBootTime = [Management.ManagementDateTimeConverter]::ToDateTime($bootTime)
$uptime = (Get-Date)-$formBootTime
"$formBootTime,$(Get-Date),{0:00},{1:00},{2:00},{3:00}" -f $uptime.Days,$uptime.Hours,$uptime.Minutes,$uptime.Seconds >> C:\UptimeTracker.csv
However, this gets tediously long to scroll through when I want to evaluate how long my machine has been running over the last X days.
So I thought I would add a marker to identify the current or most recent Uptime log per any given Boot.
But in order for that to work I would need to be able to remove said marker as soon as the previous record is no longer the relevant record.
$bootTime = (Get-WmiObject Win32_OperatingSystem).LastBootUpTime
$formBootTime = [Management.ManagementDateTimeConverter]::ToDateTime($bootTime)
$file = (Get-Content c:\UptimeTracker.csv | Measure-Object)
$numberOfLines = $file.Count
$numberOfWords = (Get-Content c:\UptimeTracker.csv | Select -Index ($numberOfLines -1) | Measure-Object -word)
$Line = Get-Content c:\UptimeTracker.csv | Select -Index ($numberOfLines -2)
$wordArray = $Line.Split(",")
$LastLineBT = $wordArray[0]
if ($LastLineBT -eq $formBootTime) {
    $unmark = "true"
} else {
    $unmark = "false"
}
if ($unmark -eq "true") { <remove last character of file> }
$uptime = (Get-Date)-$formBootTime
"$formBootTime,$(Get-Date),{0:00},{1:00},{2:00},{3:00},X" -f $uptime.Days,$uptime.Hours,$uptime.Minutes, $uptime.Seconds >> C:\UptimeTracker.csv
Some of the above is borrowed and modified from: https://stackoverflow.com/a/16210970/11035837
I have seen several methods that receive the file as the input file and write to a different output file, and from there it would be easy to script renaming the new and old files to switch their positions (new, old, standby - and rotate). The reason I'm trying not to rewrite the whole file is to reduce those instances where the command/script is interrupted and the action doesn't complete. Ideally the only time the action doesn't complete would be on a power failure.

However, I have already seen in a previous version that it would occasionally skip 5-minute intervals, for up to 15 minutes, without any change in the last reported boot time. I suspect this has to do with other higher-priority processes preventing the task scheduler from running the script. If this is the case, then a complete rewrite of the file failing partway through the script would lose some percentage of the existing log data, and I would rather miss the latest record than all the data.
Nothing I have found indicates any ability to remove/replace the last character (or two, since one is a newline char), and neither have I found anything that explicitly declares this is not possible. I have found declarations that it is not possible to selectively replace inner or beginning content without a complete rewrite.
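(For reference, the underlying .NET file API can overwrite bytes in place without rewriting the file. A hedged sketch, assuming a single-byte encoding such as ASCII and a file whose last data line ends in X followed by CRLF; note that PowerShell's >> redirection may write UTF-16 by default, which would double these offsets:)
$fs = [System.IO.File]::Open('C:\UptimeTracker.csv', 'Open', 'ReadWrite')
try {
    # Step back over "X<CR><LF>" and overwrite the X with a space, in place.
    $null = $fs.Seek(-3, [System.IO.SeekOrigin]::End)
    $fs.WriteByte([byte][char]' ')
} finally {
    $fs.Close()
}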
Barring a definitive answer, or if the definitive answer is that this cannot be done, I will attempt something like the following:
if($unmark == "true"){
$input = "C:\UptimeTracker_CUR.csv"
$output = "C:\UptimeTracker_NEW.csv"
$content = Get-Content $input
$content[-2] = $content[-2] -replace 'X', ' '
$content | Set-Content $output
Rename-Item -Path "C:\UptimeTracker_CUR.csv" -NewName "C:\UptimeTracker_SBY.csv"
Rename-Item -Path "C:\UptimeTracker_NEW.csv" -NewName "C:\UptimeTracker_CUR.csv"
}
EDIT - due to multi-read comment by TheMadTechnician
...
$file = Get-Content c:\UptimeTracker.csv
$fileMeasure = ($file | Measure-Object)
$numberOfLines = $fileMeasure.Count
$numberOfWords = ($file | Select -Index ($numberOfLines -1) | Measure-Object -word)
$Line = $file | Select -Index ($numberOfLines -2)
...
...
if($unmark == "true"){
$output = "C:\UptimeTracker_NEW.csv"
$file[-2] = $file[-2] -replace 'X', ' '
$file | Set-Content $output
Rename-Item -Path "C:\UptimeTracker.csv" -NewName "C:\UptimeTracker_SBY.csv"
Rename-Item -Path "C:\UptimeTracker_NEW.csv" -NewName "C:\UptimeTracker.csv"
}
You read the whole file in several times, which has got to be slowing the whole script down. I would suggest reading the whole file in, determining if you need to clear your flag, then do so when you output, adding your new line to the file. Assuming you aren't still running PowerShell v2, you can do this:
$bootTime = (Get-WmiObject Win32_OperatingSystem).LastBootUpTime
$formBootTime = [Management.ManagementDateTimeConverter]::ToDateTime($bootTime)
$uptime = (Get-Date)-$formBootTime
$File = Get-Content c:\UptimeTracker.csv -Raw
$(if($File.Trim().Split("`n")[-1].Split(',')[0] -eq $formBootTime){
    $File.Trim() -replace 'X(?=\s*$)',' '
}else{
    $File.Trim()
}),("$formBootTime,$(Get-Date),{0:00},{1:00},{2:00},{3:00},X" -f $uptime.Days,$uptime.Hours,$uptime.Minutes,$uptime.Seconds)|Set-Content c:\UptimeTracker.csv
If you are running an old version, you will not have the -Raw option for Get-Content. As a workaround you can do this instead, and the same solution should still work.
$bootTime = (Get-WmiObject Win32_OperatingSystem).LastBootUpTime
$formBootTime = [Management.ManagementDateTimeConverter]::ToDateTime($bootTime)
$uptime = (Get-Date)-$formBootTime
$File = (Get-Content c:\UptimeTracker.csv) -join "`n"
$(if($File.Trim().Split("`n")[-1].Split(',')[0] -eq $formBootTime){
    $File.Trim() -replace 'X(?=\s*$)',' '
}else{
    $File.Trim()
}),("$formBootTime,$(Get-Date),{0:00},{1:00},{2:00},{3:00},X" -f $uptime.Days,$uptime.Hours,$uptime.Minutes,$uptime.Seconds)|Set-Content c:\UptimeTracker.csv
This is going to be slower, so it should be considered a secondary option, since you'll have to read the whole file in as an array of strings and then convert it to a single multi-line string.
I'm trying to send new lines added to a log file to slack if they have a certain word(s) in it - see in slack if I have errors.
What I hoped would work is:
$message = Get-Content '\\path\to\file.log' -Wait -Tail 0 | Select-String -Pattern 'ERROR'
foreach($line in $message) { Send-SlackMsg -Text "$($line)" -Channel $channel }
Sadly, it does not. I've replaced Send-SlackMsg with Write-Host just to see if that is the issue but it's not.
Get-Content '\\path\to\file.log' -Wait -Tail 0 | Select-String -Pattern 'ERROR'
Works like a charm in my console.
How can I make PowerShell perform an action when a new line appears and it matches the pattern?
Per the comment from Mathias, you could use ForEach-Object:
Get-Content '\\path\to\file.log' -Wait -Tail 0 | Select-String -Pattern 'ERROR' | ForEach-Object { Send-SlackMsg -Text $_ -Channel $channel }
The problem with your code was that, because of the -Wait switch, execution never proceeded past the $message = line. ForEach-Object, however, processes objects via the pipeline, so as soon as a new line matching 'ERROR' is written to the file it is passed down the pipeline and into the ForEach-Object.
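One nuance: Select-String emits MatchInfo objects rather than plain strings, so if Send-SlackMsg needs the raw text it may be safer to pass the line explicitly; a minor, hedged variation:
# $_.Line holds the matched line's text; Send-SlackMsg and $channel are from the question.
Get-Content '\\path\to\file.log' -Wait -Tail 0 |
    Select-String -Pattern 'ERROR' |
    ForEach-Object { Send-SlackMsg -Text $_.Line -Channel $channel }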
I'm new to PowerShell and am trying to convert a batch file that downloads multiple files, based on names and extension, from a directory on an FTP site. While I've found several examples that download a file, I'm struggling to find one that shows how to download multiple files. In a batch file I can quite simply use ftp.exe and the mget command with wildcards.
Can someone please point me in the right direction.
Thanks in advance.
John
There are multiple ways to achieve this. One is to use the System.Net.FtpWebRequest as shown in this example:
http://www.systemcentercentral.com/BlogDetails/tabid/143/IndexID/81125/Default.aspx
Or there are /n Software NetCmdlets you can use:
http://www.nsoftware.com/powershell/tutorials/FTP.aspx
In a batch I can quite simply use the ftp.exe and the mget command
with wildcards??
You can do the same in PowerShell if you want to.
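For example, a hedged sketch of driving ftp.exe from PowerShell with a generated command script (server name, credentials, and paths below are placeholders):
# ftp.exe cannot read Unicode command files, so write the script as ASCII.
@"
open ftp.example.com
user myuser mypassword
cd /pub/data
prompt
mget *.csv
quit
"@ | Out-File -FilePath ftpcmds.txt -Encoding ASCII
ftp -n -s:ftpcmds.txt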
For a more PowerShell-native way, you can use FtpWebRequest. See here: http://msdn.microsoft.com/en-us/library/ms229711.aspx. You can build on the example to download multiple files in a loop.
But the bottom line is, you do not have to convert something you have in batch to PowerShell. You can, if you want, but what you have in batch, especially when calling external programs, should work just as well.
Another resource you might want to check: PowerShell FTP Client Module
http://gallery.technet.microsoft.com/scriptcenter/PowerShell-FTP-Client-db6fe0cb
Oddly enough, there are no built-in cmdlets to deal with FTP. I'm not sure why the PowerShell team made that decision, but it means you'll have to rely on .NET code, a third-party script/module/snap-in, or a Win32 program such as ftp.exe, as others have already answered.
Here's an example of downloading multiple files (binary and text) using .NET code:
$files = "Firefox Setup 9.0.exe", "Firefox Setup 9.0.exe.asc"
$ftpFolder = 'ftp://ftp.mozilla.org/pub/firefox/releases/9.0/win32/en-US'
$outputFolder = (Resolve-Path "~\My Documents").Path
foreach ($file in $files) {
    try {
        $uri = $ftpFolder + '/' + $file
        $request = [Net.WebRequest]::Create($uri)
        $request.Method = [Net.WebRequestMethods+Ftp]::DownloadFile
        $responseStream = $request.GetResponse().GetResponseStream()
        $outFile = Join-Path $outputFolder -ChildPath $file
        $fs = New-Object System.IO.FileStream $outFile, "Create"
        [byte[]] $buffer = New-Object byte[] 4096
        do {
            $count = $responseStream.Read($buffer, 0, $buffer.Length)
            $fs.Write($buffer, 0, $count)
        } while ($count -gt 0)
    } catch {
        throw ("Failed to download file '{0}/{1}'. The error was {2}." -f $ftpFolder, $file, $_)
    } finally {
        if ($fs) { $fs.Flush(); $fs.Close() }
        if ($responseStream) { $responseStream.Close() }
    }
}
@Jacob: You need the ::ListDirectory method to make a list. Afterwards, you output it to a text file with the Out-File command, and then import the list with the Get-Content command. With that text file you can process the entries in a foreach loop (don't forget to skip the last, empty line with the '-cne' condition).
You include your download-ftp function in this loop, with the loop variable as its parameter.
Understood? Not sure if my explanation is good.
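For context, a hedged sketch of the ::ListDirectory request this refers to (anonymous access and the example host are assumptions):
# List a remote FTP directory via FtpWebRequest and save it for later processing.
$request = [System.Net.FtpWebRequest]::Create('ftp://ftp.example.com/pub/')
$request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader $response.GetResponseStream()
$reader.ReadToEnd() | Out-File -Encoding UTF8 -FilePath list.txt
$reader.Close(); $response.Close()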
So here's an example from one of my scripts:
Get-FtpList $ftpSource $ftpDirectory $ftpLogin $ftpPassword | Out-File -Encoding UTF8 -FilePath list.txt
$list = Get-Content -Encoding UTF8 -Path list.txt
foreach ($entry in $list -cne "")
{
    Get-FtpFile $ftpSource $ftpDirectory $entry $target $ftpLogin $ftpPassword
    Start-Sleep -Milliseconds 10
}
Hope it works now for you.
PS: Get-FtpList and Get-FtpFile are custom functions.
This is what I did. As I needed to download a file based on a pattern, I dynamically created a command file and then let ftp do the rest.
I used basic PowerShell commands; I did not need to download any additional components.
I first check if the requisite number of files exists; if they do, I invoke FTP a second time with an mget.
I run this from a Windows 2008 server connecting to a Windows XP remote server.
function make_ftp_command_file($p_file_pattern, $mget_flag)
{
    # This function dynamically prepares the FTP command file.
    # The file needs to be prepared daily because the pattern changes daily.
    # PowerShell's default encoding is Unicode, and Unicode command files are
    # not compatible with FTP, so we need to make sure we create an ASCII file.
    write-output "USER" | out-file -filepath C:\fc.txt -encoding ASCII
    write-output "ftpusername" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    write-output "password" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    write-output "ASCII" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    If ($mget_flag -eq "Y")
    {
        write-output "prompt" | out-file -filepath C:\fc.txt -encoding ASCII -Append
        write-output "mget $p_file_pattern" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    }
    else
    {
        write-output "ls $p_file_pattern" | out-file -filepath C:\fc.txt -encoding ASCII -Append
    }
    write-output quit | out-file -filepath C:\fc.txt -encoding ASCII -Append
}
########################### Init Section ###############################
$yesterday = (get-date).AddDays(-1)
$yesterday_fmt = date $yesterday -format "yyyyMMdd"
$file_pattern = "BRAE_GE_*" + $yesterday_fmt + "*.csv"
$file_log = $yesterday_fmt + ".log"
echo $file_pattern
echo $file_log
############################## Main Section ############################
# Change location to folder where the files need to be downloaded
cd c:\remotefiles
# Dynamically create the FTP Command to get a list of files from the Remote Servers
echo "Call function that creates a FTP Command "
make_ftp_command_file $file_pattern N
#echo "Connect to remote site via FTP"
# Connect to Remote Server and get file listing
ftp -n -v -s:C:\Clover\scripts\fc.txt 10.129.120.31 > C:\logs\$file_log
$matches=select-string -pattern "BRAE_GE_[A-Z][A-Z]*" C:\logs\$file_log
# Check if the required number of Files available for download
if ($matches.count -eq 36)
{
# Create the ftp command file
# this time the command file has an mget rather than an ls
make_ftp_command_file $file_pattern Y
# Change directory if not done so
cd c:\remotefiles
# Invoke Ftp with newly created command file
ftp -n -v -s:C:\Clover\scripts\fc.txt 10.129.120.31 > C:\logs\$file_log
}
else
{
echo "Full set of Files not available"
}
It's not PowerShell specific, but I've tried many other solutions and so far the http://ncftp.com/ client works the best. It comes with ncftpls.exe for listing remote files and ncftpget.exe for getting files. Use them with Start-Process -Wait.
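For example, a hedged sketch (host, credentials, and paths are placeholders; check ncftpget's documentation for the exact switches available in your version):
# Download all matching remote files and wait for the transfer to finish.
Start-Process -Wait -NoNewWindow -FilePath 'C:\ncftp\ncftpget.exe' `
    -ArgumentList '-u', 'ftpuser', '-p', 'secret', 'ftp.example.com', 'C:\downloads', '/pub/reports/*.csv'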
A file list can be constructed in a variable and used with a regular FTP command:
$FileList="file1_$cycledate.csv
file2_$cycledate.csv
file3_$cycledate.csv
file4_$cycledate.csv"
"open $FTPServer
user $FTPUser $FTPPassword
ascii
cd report
" +
(($FileList.Split("`n") | %{ "mget $($_.Trim())" }) -join "`n") | ftp -i -n