I want this script to choose the latest file from a folder and then send it via FTP to the server.
I think it is choosing the file correctly, because there is a new file on the FTP server after running it. However, the script never finishes and keeps printing:
uploading .....
uploading .....
uploading .....
$Dir="C:/log1"
$ftp = "ftpftpftp"
$user = "useruseruser"
$pass = "passpasspass"
$latest = Get-ChildItem -Path $Dir | Sort-Object LastAccessTime -Descending | Select-Object -First 1
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
for($latest){
    "Uploading $latest..."
    $uri = New-Object System.Uri($ftp+$latest.Name)
    $webclient.UploadFile($uri, $latest.FullName)
}
I think you are using the wrong construct by accident. As written, you have created an infinite loop, because the for statement has no condition that would let it exit.
A simple example of such a loop would be
for(){"Hello? Is it me you are looking for?"}
It should be structured like this:
for (initialization; condition; repeat){code block}
An example would be:
for($index = 1; $index -lt 6; $index++){ $index }
You do not need a loop here at all, as long as $Dir is not empty. For a little error prevention, you can wrap the upload in if($latest){}, which only runs the block when $latest actually contains a file (in this code structure):
if($latest){
    "Uploading $latest..."
    $uri = New-Object System.Uri($ftp+$latest.Name)
    $webclient.UploadFile($uri, $latest.FullName)
}
Your sample output does not have a file name in it, so I suspect your $Dir contains no files?
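Putting it together, here is a minimal sketch of the whole upload. I'm assuming LastWriteTime as the "newest file" criterion, since Windows frequently has last-access-time updates disabled, which makes LastAccessTime unreliable; the server URL below is a placeholder:
$Dir = "C:/log1"
$ftp = "ftp://ftp.example.com/inbound/"   # placeholder server URL
$user = "useruseruser"
$pass = "passpasspass"

# LastWriteTime is usually a safer "newest" criterion than LastAccessTime
$latest = Get-ChildItem -Path $Dir -File |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

if ($latest) {
    "Uploading $latest..."
    $webclient = New-Object System.Net.WebClient
    $webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
    $uri = New-Object System.Uri($ftp + $latest.Name)
    $webclient.UploadFile($uri, $latest.FullName)
}
else {
    "No files found in $Dir"
}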
I use the code below to download a file from this website:
https://www.mcafee.com/enterprise/en-us/downloads/security-updates.html
However, the name of the file being downloaded changes every day. For example: 'mediumepo4981dat.zip' today and 'mediumepo4980dat.zip' yesterday.
How can I create a script that I can run daily and that picks up the new file name dynamically?
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
$uri = "https://download.nai.com/products/datfiles/med/mediumepo4981dat.zip"
$filename = "C:\DownloadTest\mediumepo4981dat.zip"
$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
$wc.DownloadFile($uri, $filename)
Please let me know if you need any further clarification.
Note: Looking for a solution without Invoke-WebRequest as that does not work in my current environment.
It seems we can assume that the last file listed is always the newest. I can't confirm this will always be true (it may change at some point), but for the time being, this is how you can get the last file:
$uri = [uri] "https://download.nai.com/products/datfiles/med"
$wr = Invoke-WebRequest $uri
$file = $wr.ParsedHtml.getElementsByTagName('a') |
    Select-Object -Last 1 -ExpandProperty TextContent
$toDownload = [uri]::new($uri, $file)
Now you can combine this with the rest of your script:
$filename = Join-Path 'C:\DownloadTest\' -ChildPath $file
$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
$wc.DownloadFile($toDownload.AbsoluteUri, $filename)
Note that this will work only in Windows PowerShell.
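If you need this under PowerShell Core, where the ParsedHtml property is not available, a rough alternative is to extract the link targets from the raw HTML with a regex. This is only a sketch, assuming the download page is a plain index with simple href attributes:
$uri = [uri] "https://download.nai.com/products/datfiles/med/"
$html = (Invoke-WebRequest $uri).Content

# Take the last .zip link on the page (assumes a plain directory-style listing)
$file = [regex]::Matches($html, 'href="([^"]+\.zip)"') |
    ForEach-Object { $_.Groups[1].Value } |
    Select-Object -Last 1

$toDownload = [uri]::new($uri, $file)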
$uri = [uri] "https://download.nai.com/products/datfiles/med"
$wr = Invoke-WebRequest $uri
$file = $wr.ParsedHtml.getElementsByTagName('a') |
    Select-Object -Last 1 -ExpandProperty TextContent
$toDownload = [uri]::new($uri, $file)
is missing the trailing '/' in $uri. It should be:
$uri = [uri] "https://download.nai.com/products/datfiles/med/"
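The trailing slash matters because [uri]::new($baseUri, $relative) resolves the relative name against the base: without it, the last path segment is replaced. You can see this directly (hypothetical host for illustration):
[uri]::new([uri]"https://host/products/med",  "file.zip").AbsoluteUri
# -> https://host/products/file.zip      ("med" is replaced)
[uri]::new([uri]"https://host/products/med/", "file.zip").AbsoluteUri
# -> https://host/products/med/file.zip  ("med/" is kept as a directory)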
My PowerShell script works fine this way (it uses FTP to send IHM_OTP_CRT_20190812_0701.txt to the server and saves it as newfile.txt):
$File = "Z:\Export\IHM_OTP_CRT_20190812_0701.txt"
$ftp = "ftp://ftpuse:mypass@e2b.kpsci.com/inbound/newfile.txt"
"ftp url: $ftp"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
"Uploading $File..."
$webclient.UploadFile($uri, $File)
But it does not work if I change the second line
$ftp = "ftp://ftpuse:mypass@e2b.kpsci.com/inbound/newfile.txt"
to the following
$File = Get-ChildItem -Recurse | Where-Object { $_.Name -match 'IHM_OTP_CRT_.' } | sort LastWriteTime | select -last 1
$NewFileName = $File.Name
$ftp = "ftp://ftpuse:mypass@e2b.kpsci.com/inbound/" + $NewFileName
and this is driving me crazy.
I've tried various concatenation methods: I used string interpolation ("$var1$var2") saved into $NewFileName to avoid the + sign, and I've used parentheses around the argument, like:
$ftp = ("ftp://ftpuse:mypass@e2b.kpsci.com/inbound/" + $NewFileName)
And the more frustrating part is that when I use echo, the concatenated string looks perfect. Also, this works fine:
$ftp = "ftp://ftpuse:mypass@e2b.kpsci.com/inbound/" + "IHM_OTP_CRT_20190812_0701.txt"
So it's only concatenating with a separate object (even if it's a string) that doesn't work: I can concatenate the literal "blah" but not a variable that equals "blah". I've spent hours on this, and I can't imagine it should be this difficult.
The error I am receiving is:
"Exception calling "UploadFile" with 2 argument(s): "The requested URI is invalid for this FTP command...at :17 char: 22..."
The error makes sense to me: it suggests that my concatenated object is being passed as a separate argument to the UploadFile method, but I can't understand how to make it clear that I intend to pass a single string.
The issue is that calling .ToString() on the object returned by Get-ChildItem yields only the file name, not the full path. Therefore, in the WebClient upload you have to pass the .FullName property:
$webclient.UploadFile($uri, $File.FullName)
Full code:
$File = Get-ChildItem -Recurse | Where-Object { $_.Name -match 'IHM_OTP_CRT_.' } | sort LastWriteTime | select -last 1
$NewFileName = $File.Name
$ftp = "ftp://ftpuse:mypass@e2b.kpsci.com/inbound/" + $NewFileName
"ftp url: $ftp"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
"Uploading $File..."
$webclient.UploadFile($uri, $File.FullName)
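A quick way to see the difference (behaviour as in Windows PowerShell; the exact ToString output can vary between PowerShell versions):
$File = Get-ChildItem -Recurse |
    Where-Object { $_.Name -match 'IHM_OTP_CRT_' } |
    Sort-Object LastWriteTime | Select-Object -Last 1

"$File"          # expands via ToString() -> just the file name
$File.FullName   # the absolute path that WebClient.UploadFile needs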
I would like to write a PowerShell script to download all files and subfolders from my FTP server. I found a script that downloads all files from one specific folder, but I would also like to download the subfolders and their files.
#FTP Server Information - SET VARIABLES
$ftp = "ftp://ftp.abc.ch/"
$user = 'abc'
$pass = 'abc'
$folder = '/'
$target = "C:\LocalData\Powershell\"
#SET CREDENTIALS
$credentials = new-object System.Net.NetworkCredential($user, $pass)
function Get-FtpDir ($url, $credentials) {
    $request = [Net.WebRequest]::Create($url)
    $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
    if ($credentials) { $request.Credentials = $credentials }
    $response = $request.GetResponse()
    $reader = New-Object IO.StreamReader $response.GetResponseStream()
    $reader.ReadToEnd()
    $reader.Close()
    $response.Close()
}
#SET FOLDER PATH
$folderPath= $ftp + "/" + $folder + "/"
$Allfiles=Get-FTPDir -url $folderPath -credentials $credentials
$files = ($Allfiles -split "`r`n")
$files
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
$counter = 0
foreach ($file in ($files | where {$_ -like "*.*"})){
    $source = $folderPath + $file
    $destination = $target + $file
    $webclient.DownloadFile($source, $destination)
    #PRINT FILE NAME AND COUNTER
    $counter++
    $counter
    $source
}
Thanks for your help (:
Neither the .NET Framework nor PowerShell has explicit support for recursive file operations (including downloads). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, downloading files and recursing into subdirectories (listing them again, etc.)
The tricky part is to tell files and subdirectories apart. There's no portable way to do that with the .NET Framework (FtpWebRequest or WebClient). Unfortunately, the .NET Framework does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for files and succeed for directories (or vice versa). I.e. you can try to download the "name": if that succeeds, it's a file; if it fails, it's a directory. (A minimal sketch of this probe appears after the example below.)
You may be lucky and, in your specific case, be able to tell a file from a directory by its name (i.e. all your files have an extension, while subdirectories do not).
You can use a long directory listing (the LIST command = the ListDirectoryDetails method) and try to parse the server-specific listing. Many FTP servers use a *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format):
function DownloadFtpDirectory($url, $credentials, $localPath)
{
    $listRequest = [Net.WebRequest]::Create($url)
    $listRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
    $listRequest.Credentials = $credentials

    $lines = New-Object System.Collections.ArrayList

    $listResponse = $listRequest.GetResponse()
    $listStream = $listResponse.GetResponseStream()
    $listReader = New-Object System.IO.StreamReader($listStream)
    while (!$listReader.EndOfStream)
    {
        $line = $listReader.ReadLine()
        $lines.Add($line) | Out-Null
    }
    $listReader.Dispose()
    $listStream.Dispose()
    $listResponse.Dispose()

    foreach ($line in $lines)
    {
        $tokens = $line.Split(" ", 9, [StringSplitOptions]::RemoveEmptyEntries)
        $name = $tokens[8]
        $permissions = $tokens[0]

        $localFilePath = Join-Path $localPath $name
        $fileUrl = ($url + $name)

        if ($permissions[0] -eq 'd')
        {
            if (!(Test-Path $localFilePath -PathType Container))
            {
                Write-Host "Creating directory $localFilePath"
                New-Item $localFilePath -Type Directory | Out-Null
            }

            DownloadFtpDirectory ($fileUrl + "/") $credentials $localFilePath
        }
        else
        {
            Write-Host "Downloading $fileUrl to $localFilePath"

            $downloadRequest = [Net.WebRequest]::Create($fileUrl)
            $downloadRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
            $downloadRequest.Credentials = $credentials

            $downloadResponse = $downloadRequest.GetResponse()
            $sourceStream = $downloadResponse.GetResponseStream()
            $targetStream = [System.IO.File]::Create($localFilePath)
            $buffer = New-Object byte[] 10240
            while (($read = $sourceStream.Read($buffer, 0, $buffer.Length)) -gt 0)
            {
                $targetStream.Write($buffer, 0, $read)
            }
            $targetStream.Dispose()
            $sourceStream.Dispose()
            $downloadResponse.Dispose()
        }
    }
}
Use the function like:
$credentials = New-Object System.Net.NetworkCredential("user", "mypassword")
$url = "ftp://ftp.example.com/directory/to/download/"
DownloadFtpDirectory $url $credentials "C:\target\directory"
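For completeness, the first option above (probing an entry with a download attempt) could look roughly like this. Test-FtpIsFile is a hypothetical helper, and server behaviour varies, so treat this as a sketch only:
# Try RETR on the entry: if the server accepts it, the entry is a file;
# if it refuses, assume the entry is a directory.
function Test-FtpIsFile($entryUrl, $credentials)
{
    $request = [Net.WebRequest]::Create($entryUrl)
    $request.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
    $request.Credentials = $credentials
    try
    {
        $response = $request.GetResponse()
        $response.Dispose()
        return $true
    }
    catch [Net.WebException]
    {
        return $false
    }
}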
The DownloadFtpDirectory code is translated from my C# example in C# Download all files and subdirectories through FTP.
Note, though, that Microsoft does not recommend FtpWebRequest for new development.
If you want to avoid the trouble of parsing server-specific directory listing formats, use a third-party library that supports the MLSD command and/or parsing the various LIST listing formats, and that supports recursive downloads.
For example, with the WinSCP .NET assembly, you can download a whole directory with a single call to Session.GetFiles:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = "user"
    Password = "mypassword"
}

$session = New-Object WinSCP.Session

try
{
    # Connect
    $session.Open($sessionOptions)

    # Download files
    $session.GetFiles("/directory/to/download/*", "C:\target\directory\*").Check()
}
finally
{
    # Disconnect, clean up
    $session.Dispose()
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
The Session.GetFiles method is recursive by default.
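Should you ever need only the top-level files, the recursion can be suppressed with a file mask that excludes directories. A small sketch using the documented TransferOptions.FileMask syntax ("|*/" excludes all directories):
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "|*/"   # exclude all directories

$session.GetFiles(
    "/directory/to/download/*", "C:\target\directory\*",
    $False, $transferOptions).Check()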
(I'm the author of WinSCP)
For retrieving files/folders from FTP via PowerShell, I wrote some functions; you can even get hidden items from the FTP server.
Example for getting all files and subfolders (even hidden ones) in a specific folder:
Get-FtpChildItem -ftpFolderPath "ftp://myHost.com/root/leaf/" -userName "User" -password "pw" -Directory -File
You can just copy the functions from the following module without needing any third-party library installed:
https://github.com/AstralisSomnium/PowerShell-No-Library-Just-Functions/blob/master/FTPModule.ps1
I need to download a file from a website every hour. Right now I have:
$url = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$path = "C:\MISO.csv"
# param([string]$url, [string]$path)
if (!(Split-Path -Parent $path) -or !(Test-Path -PathType Container (Split-Path -Parent $path))) {
    $path = Join-Path $pwd (Split-Path -Leaf $path)
}
"Downloading [$url]`nSaving at [$path]"
$client = new-object System.Net.WebClient
$client.DownloadFile($url, $path)
#$client.DownloadData($url, $path)
$path
PAUSE
This gets the response from the website but prompts me to open or save the file. I just want it to save the file. Thank you for any help.
Try this:
$url = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$data = Invoke-WebRequest -Uri $url
$path = "C:\MISO.csv"
$data.Content | Out-File $path
Of course, there is no error handling or checking. It assumes that Invoke-WebRequest completed successfully and that your endpoint returns raw CSV data.
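If the endpoint returns binary data, or you want the bytes written exactly as served (Out-File re-encodes text), you can let Invoke-WebRequest write the file itself:
Invoke-WebRequest -Uri $url -OutFile $path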
In my opinion, use wget (which in Windows PowerShell is an alias for Invoke-WebRequest). The following works for me:
$url = "https://www.misoenergy.org/ria/Consolidated.aspx?format=csv"
$path = "d:\my_cs_folder\MISO.csv"
wget -OutFile $path -Uri $url
I removed the checks for readability. The destination file path is changed too. Be advised that you may get "permission denied" when placing your file in the root of the system drive.
I've been attempting a similar thing for a while now and it's very annoying: most approaches do not work correctly, and sending keystrokes to the Save button requires the window to be active, at least as far as I can see.
Because of this, it will not work while the computer is in use. I still need to figure out a way around that, but for now I've gotten it to work like this:
# SendKeys lives in System.Windows.Forms, so load the assembly first
Add-Type -AssemblyName System.Windows.Forms

$IE = New-Object -ComObject InternetExplorer.Application
$IE.Visible = $true
$IE.Navigate($url)

# Find the IE process so its window can be brought to the foreground
$IEProc = Get-Process | Where-Object {$_.MainWindowHandle -eq $IE.HWND}
$WS = New-Object -ComObject WScript.Shell
$WS.AppActivate($IEProc.id)

Start-Sleep 2

# Alt+S presses Save on IE's download notification bar
[Windows.Forms.SendKeys]::SendWait('%{s}')
I'm aware of it being very ugly but it does the job :(