I need to create a script that does the following:
Copies all files in a folder to an FTP site.
If the copy was successful move the files to an archive.
The archive should be a freshly created folder with today's date (so we know when they were transmitted).
I've tried to cannibalise other scripts to get something to work, but I'm not getting anywhere, so I need some help; I've been working on this for hours.
I'm using the WinSCP DLL only because my other (working) script uses SFTP, which needs it. I know plain FTP doesn't, but I couldn't find any easily transferable code, so I'm trying to modify that script instead.
So, here's the code I have, which doesn't even run, never mind run properly. Can someone help me get it right? Sorry it's a bit of a mess.
param (
$localPath = "c:\test\source",
$remotePath = "/upload",
$folder = ($_.CreationTime | Get-Date -Format yyyyMMdd)
# not sure this works but don't see how to point the destination
# to the newly created folder
$backupPath = "c:\test\destination\$folder"
)
try
{
# Load WinSCP .NET assembly
Add-Type -Path "C:\Windows\System32\WindowsPowerShell\v1.0\WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::ftp
HostName = "xxxxxxxx"
UserName = "xxxxxxxx"
Password = "xxxxxxxx"
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Upload files, collect results
$transferResult = $session.PutFiles($localPath, $remotePath)
# Iterate over every transfer
foreach ($transfer in $transferResult.Transfers)
{
# Success or error?
if ($transfer.Error -eq $Null)
{
# If today's folder doesn't exist, create it
if (!(Test-Path $BackupPath))
{
New-Item -ItemType Directory -Force -Path $BackupPath
}
Write-Host ("Upload of {0} succeeded, moving to Uploaded folder" -f
$transfer.FileName)
# Upload succeeded, move source file to backup
Move-Item $transfer.FileName $backupPath
}
else
{
Write-Host ("Upload of {0} failed: {1}" -f
$transfer.FileName, $transfer.Error.Message)
}
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
catch [Exception]
{
Write-Host ("Error: {0}" -f $_.Exception.Message)
exit 1
}
So there's the code. I'm happy to use built-in PowerShell for the FTP side to simplify it; I just want it to work.
I'm not sure what your concern with the code is. It looks pretty much OK, except for a syntax error when setting $folder:
Why are you even trying to use $_.CreationTime as the folder timestamp? Just use the current date:
$folder = (Get-Date -Format "yyyyMMdd")
See Formatting timestamps in PowerShell in the WinSCP documentation.
Also, I do not see the point of setting $folder and $backupPath in the param block; move them after it. If you want to keep them there anyway, you are missing a comma after the $folder assignment.
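In that case, a minimal sketch of the corrected param block (with the comma and a plain Get-Date call) could be:
param (
    $localPath = "c:\test\source",
    $remotePath = "/upload",
    # Current date; $_ has no meaning at this point
    $folder = (Get-Date -Format "yyyyMMdd"),
    # Note the comma ending the previous line
    $backupPath = "c:\test\destination\$folder"
)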
Other than that, your code should work.
You cannot simplify it by using the built-in PowerShell (or rather .NET) FTP functionality, as it does not have commands as powerful as the WinSCP .NET assembly's.
I'd write the code as:
$localPath = "C:\source\local\path\*"
$remotePath = "/dest/remote/path/"
$folder = (Get-Date -Format "yyyyMMdd")
$backupPath = "C:\local\backup\path\$folder"
# If today's folder doesn't exist, create it
if (!(Test-Path $BackupPath))
{
New-Item -ItemType Directory -Force -Path $BackupPath | Out-Null
}
try
{
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::ftp
HostName = "ftp.example.com"
UserName = "username"
Password = "password"
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Upload files, collect results
$transferResult = $session.PutFiles($localPath, $remotePath)
# Iterate over every transfer
foreach ($transfer in $transferResult.Transfers)
{
# Success or error?
if ($transfer.Error -eq $Null)
{
Write-Host ("Upload of $($transfer.FileName) succeeded, " +
"moving to backup")
# Upload succeeded, move source file to backup
Move-Item $transfer.FileName $backupPath
}
else
{
Write-Host ("Upload of $($transfer.FileName) failed: " +
"$($transfer.Error.Message)")
}
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
catch [Exception]
{
Write-Host "Error: $($_.Exception.Message)"
exit 1
}
Based on Moving local files to different location after successful upload.
Related
I am using the following slightly modified script (from https://winscp.net/eng/docs/script_local_move_after_successful_upload) to upload files to an SFTP site on AWS, and in a production environment it will have to process approximately 0.5M small files.
param (
$localPath = "C:\FTP\*.DAT",
$remotePath = "/",
$backupPath = "C:\FTP\Complete\"
)
try
{
# Load WinSCP .NET assembly
#Add-Type -Path "WinSCPnet.dll"
$ScriptPath = $(Split-Path -Parent $MyInvocation.MyCommand.Definition)
[Reflection.Assembly]::UnsafeLoadFrom( $(Join-Path $ScriptPath "WinSCPnet.dll") ) | Out-Null
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Sftp
HostName = "somewhere.com"
UserName = "user"
SshHostKeyFingerprint = "ssh-rsa 2048 XXXXXXXI"
SshPrivateKeyPath = "C:\blahblah.ppk"
}
$session = New-Object WinSCP.Session
$transferOptions = New-Object WinSCP.TransferOptions
# Look to ignore any file property change errors
$transferOptions.FilePermissions = $Null # This is default
$transferOptions.PreserveTimestamp = $False
try
{
# Connect
$session.Open($sessionOptions)
# Upload files, collect results
$transferResult = $session.PutFiles($localPath, $remotePath)
# Iterate over every transfer
foreach ($transfer in $transferResult.Transfers)
{
# Success or error?
if ($Null -eq $transfer.Error)
{
Write-Host "Upload of $($transfer.FileName) succeeded, moving to backup"
# Upload succeeded, move source file to backup
Move-Item $transfer.FileName $backupPath
}
else
{
Write-Host "Upload of $($transfer.FileName) failed: $($transfer.Error.Message)"
}
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
catch
{
Write-Host "Error: $($_.Exception.Message)"
exit 1
}
The script works great, but unfortunately it only uploads one file before disconnecting. The script fails if there is an error uploading any of the files, so in that sense it is working as expected.
But the error I am getting is:
"Upload of file 'somefile.DAT' was successful, but error occurred while setting the permissions and/or timestamp.
If the problem persists, turn off setting permissions or preserving timestamp. Alternatively you can turn on 'Ignore permission errors' option.
The server does not support the operation.
Error code: 8
Error message from server (US-ASCII): SETSTAT unsupported"
I think the following settings may be configured incorrectly, but I'm not sure what I am doing wrong here. Thoughts?
$transferOptions.FilePermissions = $Null # This is default
$transferOptions.PreserveTimestamp = $False
I've actually managed to get this to work by modifying the session and transfer options:
$session.Open($sessionOptions)
$transferOptions = New-Object WinSCP.TransferOptions
# Look to ignore any file property change errors
$transferOptions.PreserveTimeStamp = $false
$transferOptions.FilePermissions = $Null
$transferOptions.AddRawSettings("IgnorePermErrors", "1")
# Upload files, collect results
$transferResult = $session.PutFiles($localPath, $remotePath, $False, $transferOptions)
Using PowerShell and WinSCP, I'm trying to create a script that checks an SFTP remote directory daily to see whether there are more than 4 files in it.
If there are 4 files or fewer it's okay, but if there are more than 4 files it should output an error message.
Thanks to WinSCP, the connection is created automatically and I can connect to the SFTP server as below:
& "C:\Program Files (x86)\WinSCP\WinSCP.com"
/log="C:\Users\scripts\WinSCP.log" /ini=nul
/command
"open sftp://..."
"cd" `
"cd ./my remote directory"
#"ls *.csv"
#"exit"
$winscpResult = $LastExitCode
if ($winscpResult -eq 0)
{
Write-Host "Success"
}
else
{
Write-Host "Error"
}
exit $winscpResult
I don't know whether I should do this with WinSCP scripting or with the .NET assembly:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Sftp
HostName = ""
UserName = ""
Password = ""
SshHostKeyFingerprint = ""
TimeoutInMilliseconds = 60000
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Your code
}
finally
{
$session.Dispose()
}
However, I currently don't know how to write the check condition.
Once the script is done, the goal is to run it daily from Jenkins.
Could you help me build the check condition with an if/else? Thanks!
With the WinSCP .NET assembly, it's trivial:
$files =
$session.ListDirectory($remotePath).Files |
Where-Object { -Not $_.IsDirectory }
$count = $files.Count
if ($count -gt 4)
{
Write-Host "There are more than 4 files in $remotePath"
}
I finally modified the script from Martin and now it can successfully check whether there are more than 4 files in my directory:
$remotePath = "/my-directory"
$files = $session.ListDirectory($remotePath).Files | Where-Object { -Not $_.IsDirectory }
$count = $files.Count
if ($count -le 4)
{
Write-Host "There are less than 4 files into the"$remotePath" directory. All good!"
}
else
{
Write-Host "There are more than 4 files into the"$remotePath" directory. Please check!"
}
I actually needed the .Files property after $session.ListDirectory.
Thanks!
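For completeness, since the goal is a daily Jenkins job, here is a minimal sketch of wiring the same check to an exit code so Jenkins can flag the build; the host name, credentials and fingerprint are placeholders to fill in:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
$remotePath = "/my-directory"
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = "example.com"
    UserName = "username"
    Password = "password"
    SshHostKeyFingerprint = "ssh-rsa 2048 ..."
}
$session = New-Object WinSCP.Session
try
{
    # Connect
    $session.Open($sessionOptions)
    # Count the plain files in the remote directory
    $files =
        $session.ListDirectory($remotePath).Files |
        Where-Object { -Not $_.IsDirectory }
    if ($files.Count -le 4)
    {
        Write-Host "There are 4 or fewer files in the $remotePath directory. All good!"
        exit 0
    }
    else
    {
        Write-Host "There are more than 4 files in the $remotePath directory. Please check!"
        exit 1
    }
}
finally
{
    # Disconnect, clean up
    $session.Dispose()
}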
Trying to run the following script: connect to an FTP/SFTP server and get the files from a remote path to a local path.
Since there is a list of sources and destinations, I decided to create a CSV file and pass the values using $_.Source and $_.Destination.
$session.GetFiles() does not seem to work when I import my CSV file.
But if I hard-code the path, it works:
$session.GetFiles("c:\source\test", "c:\destination\", $False, $transferOptions)
Goal: the script reads the list of sources and moves the files according to their destinations. It should also go back a number of days and download the latest files.
Error: "No such File" or "cant get attributes of file"
try{
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::sftp
HostName = "xxxxx"
UserName = "xxxxr"
Password = "xxxxxxx"
PortNumber="xx"
FTPMode="Active"
GiveUpSecurityAndAcceptAnySshHostKey = $true
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Download files
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
Import-Csv -Path "C:\movefiles.csv" -ErrorAction Stop
$transferResult =
$session.GetFiles($_.Source, $_.Destination, $False, $transferOptions)
# Throw on any error
$transferResult.Check()
# Print results
foreach ($transfer in $transferResult.Transfers)
{
Write-Host "Download of $($transfer.FileName) succeeded"
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
#exit 0
}
catch
{
Write-Host "Error: $($_.Exception.Message)"
#exit 1
}
By commenting out the line
$transferResult.Check()
the script seems to work. But I'm still missing how to get only new files.
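For what it is worth, the root cause looks like the Import-Csv output never being used: nothing is piped into a loop, so $_.Source and $_.Destination are empty when GetFiles runs. A sketch that replaces the transfer part of the inner try block, looping over the CSV rows explicitly (assuming columns named Source and Destination) and, as one possible way to pick up only recent files, adding a WinSCP file-mask time constraint, could look like this:
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
# Only files modified within the last 7 days (adjust the constraint as needed)
$transferOptions.FileMask = "*>=7D"
# Read the CSV and loop over its rows
$rows = Import-Csv -Path "C:\movefiles.csv" -ErrorAction Stop
foreach ($row in $rows)
{
    $transferResult =
        $session.GetFiles($row.Source, $row.Destination, $False, $transferOptions)
    # Throw on any error for this row
    $transferResult.Check()
    # Print results
    foreach ($transfer in $transferResult.Transfers)
    {
        Write-Host "Download of $($transfer.FileName) succeeded"
    }
}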
I am using the following script to copy files from my local folder to SFTP. Once I upload the file I then want to move the file to a subfolder. I want to upload only files in the C:\Users\Administrator\Desktop\ftp folder, and not files in the other subfolders.
param (
$backupPath = "C:\Users\Administrator\Desktop\ftp\moved"
)
# Load the Assembly and setup the session properties
try
{
# Load WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"
$session = New-Object WinSCP.Session
$filelist = Get-ChildItem C:\Users\Administrator\Desktop\ftp
# Connect And send files, then close session
try
{
# Connect
$session.Open($sessionOptions)
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
foreach ($file in $filelist)
{
$transferResult = $session.PutFiles("C:\Users\Administrator\Desktop\ftp\$file", "/", $False, $transferOptions)
foreach ($transfer in $transferResult.Transfers)
{
Write-Host "Upload of $($transfer.FileName) succeeded"
Move-Item $transfer.FileName $backupPath
}
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
# Catch any errors
catch
{
Write-Host "Error: $($_.Exception.Message)"
exit 1
}
At the moment, if I run the script, all files under the moved subfolder also get uploaded to the SFTP server, and I only need to upload the files in the root of the ftp folder.
I guess I will need to change this line, but I'm not sure how:
$filelist = Get-ChildItem C:\Users\Administrator\Desktop\ftp
To skip the folders, add the -File switch to Get-ChildItem (as already commented by @Scepticalist):
$filelist = Get-ChildItem -File C:\Users\Administrator\Desktop\ftp
Though you can achieve the same with less code, if you let WinSCP iterate the files (and skip the folders):
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
$transferOptions.FileMask = "|*/" # Exclude the folders
$transferResult = $session.PutFiles(
"C:\Users\Administrator\Desktop\ftp\*", "/", $False, $transferOptions)
foreach ($transfer in $transferResult.Transfers)
{
Write-Host "Upload of $($transfer.FileName) succeeded"
Move-Item $transfer.FileName $backupPath
}
And you do not need any Get-ChildItem call.
The above is basically the code from WinSCP article Moving local files to different location after successful upload, just with an exclusion of subfolders.
Though note that your code (contrary to the article) does not test for a successful upload, so you would move even files that failed to upload. Make sure you add the $transfer.Error -eq $Null test:
foreach ($transfer in $transferResult.Transfers)
{
# Success or error?
if ($transfer.Error -eq $Null)
{
Write-Host "Upload of $($transfer.FileName) succeeded, moving to backup"
# Upload succeeded, move source file to backup
Move-Item $transfer.FileName $backupPath
}
else
{
Write-Host "Upload of $($transfer.FileName) failed: $($transfer.Error.Message)"
}
}
Try to use:
Get-Item C:\Users\Administrator\Desktop\ftp -include *.*
This should not get into subfolders.
I would like to be able to monitor the changes in a folder for a short period of time when a lot of files will be created and other changes will be made.
The code below is working but doesn't catch all the changes.
$folder = 'C:\Data'
$timeout = 1000
$FileSystemWatcher = New-Object System.IO.FileSystemWatcher $folder
Write-Host "Press CTRL+C to abort monitoring $folder"
while ($true) {
$result = $FileSystemWatcher.WaitForChanged('All', $timeout)
if ($result.TimedOut -eq $false)
{
Write-Warning ('File {0} : {1}' -f $result.ChangeType, $result.Name)
}
}
Write-Host 'Monitoring aborted.'
If I use this on C:\Data it works, but:
When I create a .txt it reports 'New Text Document.txt' two times.
Then when I type in the name for the new txt it outputs it three times, or the other way around.
Please see below the output of
creating a hello.txt in my folder
creating a new folder with the name HiThere and
then renaming hello.txt to someTxt.txt
then deleting them both
output:
Press CTRL+C to abort monitoring C:\Data
WARNING: File Created : New Text Document.txt
WARNING: File Changed : New Text Document.txt
WARNING: File Changed : New Text Document.txt
WARNING: File Changed : hello.txt
WARNING: File Created : New folder
WARNING: File Renamed : HiThere
WARNING: File Renamed : someTxt.txt
WARNING: File Changed : someTxt.txt
WARNING: File Changed : someTxt.txt
WARNING: File Changed : someTxt.txt
WARNING: File Deleted : someTxt.txt
WARNING: File Deleted : HiThere
More problems: if I use this on a network drive then not all the changes are caught. (And that is the whole point of this script: to monitor a folder on a mapped drive.)
You can test the code on your machine by only changing the folder path.
Using Powershell ISE 3.0
Instead of the while ($true) loop, have you tried Register-ObjectEvent?
I just tested one of my scripts using this method and it could easily handle 2000 empty files (generated in PowerShell). Unfortunately, this was on a local machine.
Instructions: define the function as normal and off you go.
The command you use is:
Start-BackupScript -WatchFolder "C:\temp\my watch folder\" -DestinationFolder "C:\temp\backup\"
The script now monitors "C:\temp\my watch folder\" for new files created in that specific folder and will move them to "C:\temp\backup\". It will also append the date and time to the file name.
Let's say you have started the script with the parameters above. You now place "hello_world.txt" in the watch folder. The script will move the file to "C:\temp\backup\" with the new filename being "hello_world_2016-02-10_10-00-00.txt".
The script runs in the background. If you want to know how it's doing, use the command:
Receive-Job $backupscript -Keep
There you can see what it has been doing and when. Please note that the -Keep parameter keeps the output in the "log", so you can check it again later.
Script:
function Start-BackupScript
{
[CmdletBinding()]
Param
(
[Parameter()]
[String]$WatchFolder,
[Parameter()]
[String]$DestinationFolder
)
Process
{
$filter = '*.*'
$fsw = New-Object IO.FileSystemWatcher $WatchFolder, $filter -Property @{IncludeSubdirectories = $false;NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
$action = {
$fileMissing = $false
$FileInUseMessage = $false
$copied = $false
$file = Get-Item $Args.FullPath
$dateString = Get-Date -format "_yyyy-MM-dd_HH-mm-ss"
$DestinationFolder = $event.MessageData
$DestinationFileName = $file.basename + $dateString + $file.extension
$resultfilename = Join-Path $DestinationFolder $DestinationFileName
Write-Output ""
while(!$copied) {
try {
Move-Item -Path $file.FullName -Destination $resultfilename -ErrorAction Stop
$copied = $true
}
catch [System.IO.IOException] {
if(!$FileInUseMessage) {
Write-Output "$(Get-Date -Format "yyyy-MM-dd @ HH:mm:ss") - $file in use. Waiting to move file"
$FileInUseMessage = $true
}
Start-Sleep -s 1
}
catch [System.Management.Automation.ItemNotFoundException] {
$fileMissing = $true
$copied = $true
}
}
if($fileMissing) {
Write-Output "$(Get-Date -Format "yyyy-MM-dd @ HH:mm:ss") - $file not found!"
} else {
Write-Output "$(Get-Date -Format "yyyy-MM-dd @ HH:mm:ss") - Moved $file to backup! `n`tFilename: `"$resultfilename`""
}
}
$backupscript = Register-ObjectEvent -InputObject $fsw -EventName "Created" -Action $action -MessageData $DestinationFolder
Write-Host "Started. WatchFolder: `"$($WatchFolder)`" DestinationFolder: `"$($DestinationFolder)`". Job is in: `$backupscript"
}
}
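If you later want to stop the background watcher, a sketch along these lines should work, using the standard eventing cmdlets and assuming the FileSystemWatcher subscription is the only one of its kind:
# List the active event subscriptions created by Register-ObjectEvent
Get-EventSubscriber
# Unregister the FileSystemWatcher subscription so the action stops firing
Get-EventSubscriber |
    Where-Object { $_.SourceObject -is [System.IO.FileSystemWatcher] } |
    Unregister-Event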
Have you looked at Register-WMIEvent?
Something like this:
Register-WmiEvent -Query "SELECT * FROM __InstanceModificationEvent WITHIN 5 WHERE TargetInstance ISA 'CIM_DataFile' and TargetInstance.Path = '\\Users\\Administrator\\' and targetInstance.Drive = 'C:' and (targetInstance.Extension = 'txt' or targetInstance.Extension = 'doc' or targetInstance.Extension = 'rtf') and targetInstance.LastAccessed > '$($cur)' " -sourceIdentifier "Accessor" -Action $action
You can monitor a folder and then specific extensions within the folder. And then you set up a PowerShell scriptblock to handle any accesses. There's more here on this if you're interested: https://blog.varonis.com/practical-powershell-for-it-security-part-i-file-event-monitoring/
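The query above references $cur and $action without showing them; a minimal sketch of both, to be defined before the Register-WmiEvent call (the timestamp format and the logging inside the action are just assumptions), might be:
# Current time in the DMTF format WMI expects, used as the LastAccessed lower bound
$cur = [System.Management.ManagementDateTimeConverter]::ToDmtfDateTime((Get-Date))
# Scriptblock that runs whenever the WMI event fires
$action = {
    $file = $Event.SourceEventArgs.NewEvent.TargetInstance
    Write-Host "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Accessed: $($file.Name)"
}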
Agree with some of the comments above that file monitoring is not reliable: lags and hiccups. Anyway, hope this helps.