Moving (not copying) remote files after download with WinSCP .NET assembly - PowerShell

I have this script that downloads all .txt and .log files, but I need to move them to another directory on the server after the download.
So far I just keep getting errors like "cannot move "file" to "/file"".
try
{
    # Load WinSCP .NET assembly
    Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

    # Setup session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.Protocol = [WinSCP.Protocol]::ftp
    $sessionOptions.HostName = "host"
    $sessionOptions.PortNumber = "port"
    $sessionOptions.UserName = "user"
    $sessionOptions.Password = "pass"

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.DisableVersionCheck = "true"
        $session.Open($sessionOptions)

        $localPath = "C:\users\user\desktop\file"
        $remotePath = "/"
        $fileName = "*.txt"
        $fileNamee = "*.log"
        $remotePath2 = "/completed"

        $directoryInfo = $session.ListDirectory($remotePath)
        $directoryInfo = $session.ListDirectory($remotePath2)

        # Download the file
        $session.GetFiles(($remotePath + $fileName), $localPath).Check()
        $session.GetFiles(($remotePath + $fileNamee), $localPath).Check()

        $session.MoveFile(($remotePath + $fileName, $remotePath2)).Check()
        $session.MoveFile(($remotePath + $fileNamee, $remotePath2)).Check()
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }

    exit 0
}
catch [Exception]
{
    Write-Host $_.Exception.Message
    exit 1
}

You have many problems in your code:
The targetPath argument of the Session.MoveFile method is a path to move/rename the file to.
So if you use the target path /completed, you are trying to move the file to the root folder and rename it to completed, while you probably want to move the file to the folder /completed and keep its name. For that, use the target path /completed/ (or /completed/* to make it more obvious).
Your current code fails because you are renaming the file to the name of an already existing folder.
You actually have the same bug in the .GetFiles: you are downloading all files (both *.txt and *.log) to the folder C:\users\user\desktop and saving them all under the same name, file, overwriting one another.
You have parentheses incorrectly around both arguments, instead of around the first argument only. While I'm no PowerShell expert, I'd actually say you are omitting the second argument of the method completely this way.
Further, note that the MoveFile method does not return anything (contrary to the GetFiles), so there's no object to call the .Check() method on.
The MoveFile (note the singular, compared to the GetFiles) moves only a single file, so you should not use a file mask. The present implementation does allow a file mask, but this use is undocumented and may be deprecated in future versions.
Anyway, the best solution is to iterate over the list of actually downloaded files, as returned by the GetFiles, and move the files one by one.
This way you avoid a race condition, where you download a set of files, new files are added (which you did not download), and you incorrectly move them to the "completed" folder.
The code should look like this (for the first set of files only, i.e. the *.txt):
$remotePath2 = "/completed/"
...
$transferResult = $session.GetFiles(($remotePath + $fileName), $localPath)
$transferResult.Check()

foreach ($transfer in $transferResult.Transfers)
{
    $session.MoveFile($transfer.FileName, $remotePath2)
}
Note that this does not include a fix for the $localPath, as I'm not sure what the path C:\users\user\desktop\file actually means.
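For completeness, both file sets can be handled with the same download-then-move pattern; a minimal sketch (untested), reusing the $remotePath, $localPath, and $remotePath2 values from above:
foreach ($mask in "*.txt", "*.log")
{
    $transferResult = $session.GetFiles(($remotePath + $mask), $localPath)
    $transferResult.Check()

    foreach ($transfer in $transferResult.Transfers)
    {
        # Move each successfully downloaded file to the "completed" folder
        $session.MoveFile($transfer.FileName, $remotePath2)
    }
}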
There's actually very similar sample code available:
Moving local files to different location after successful upload

Have you checked to make sure your process has rights to move files to the new directory?

I am doing what Martin suggests here, with success, but I was stuck for some time:
After running session.MoveFile(), the file was gone from the origin folder, but it was not showing up in the destination folder.
The files only showed up in the destination after the session was disposed automatically, after some time period (around 30 minutes, I guess).
To avoid this confusion, dispose the session explicitly, like this:
session.Dispose();
I know this is trivial, but I hope that you don't run into the same problem.


PowerShell: Error when reading files that are open in other programs

Hello PowerShell Experts,
The script snippet below works when adding files to a Zip file. However, if a file to be added is open in another program, it fails with the exception "The process cannot access the file[..]". I tried using [IO.FileShare]::ReadWrite but no success yet.
Any suggestions as to how to open the files for reading and write them to the zip, regardless of whether the file is open in another program or not?
Script Source
# write entries with relative paths as names
foreach ($fname in $FullFilenames) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''
    Write-Output $rname
    $zentry = $zip.CreateEntry($rname)
    $zentryWriter = New-Object -TypeName System.IO.BinaryWriter $zentry.Open()
    $zentryWriter.Write([System.IO.File]::ReadAllBytes($fname)) #FAILS HERE
    $zentryWriter.Flush()
    $zentryWriter.Close()
}
Since we're missing some important parts of your code, I'll just suggest what might work in this case, based on assumptions drawn from your comments.
First you would open the file with FileShare.ReadWrite:
$handle = [System.IO.File]::Open($fname, 'Open', 'Read', 'ReadWrite')
Then you should be able to use the .CopyTo(Stream) method from FileStream:
$zentry = $zip.CreateEntry($rname)
$zstream = $zentry.Open()
$handle.CopyTo($zstream)
$zstream.Flush()
$zstream.Dispose()
$handle.Dispose()
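Put together with the loop from your question, a minimal sketch (assuming $zip and $FullFilenames are set up as in your original script):
foreach ($fname in $FullFilenames) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''
    # Open with FileShare.ReadWrite so a file held open by another program can still be read
    $handle = [System.IO.File]::Open($fname, 'Open', 'Read', 'ReadWrite')
    $zentry = $zip.CreateEntry($rname)
    $zstream = $zentry.Open()
    $handle.CopyTo($zstream)
    $zstream.Dispose()
    $handle.Dispose()
}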

Powershell: Path as input parameter is cropped [duplicate]

This question already has an answer here: How to access a PSDrive from System.IO.File calls?
I have a PowerShell script that I want to execute through the command line. For this I need to input paths to XML files. It's better to show the problem:
Error: File not Found
I have all my files in the same directory C:\Users\fynne\test (the reason for that are the users I'm writing this script for), but somehow the test directory is skipped while loading the XML (C:\Users\fynne\01-38-029.xml not found). I have no idea why this happens.
This is how I load the xml.
$PatientA = $args[0]
$PatientB = $args[1]
$XmlA = New-Object System.XML.XMLDocument
$XmlA.Load($PatientA)
$XmlB = New-Object System.XML.XMLDocument
$XmlB.Load($PatientB)
Does somebody know why, and is there a fix for it? I know that I can fix it by using some string manipulation and $pwd, but I'd rather not.
Thanks
PowerShell will attempt to resolve non-absolute paths relative to the current provider location - but .NET methods (like XmlDocument.Load()) will resolve them relative to the working directory of the current process instead.
You can manually convert a relative path to an absolute one with Convert-Path:
$PatientA, $PatientB = $args[0..1] | Convert-Path
$XmlA = New-Object System.XML.XMLDocument
$XmlA.Load($PatientA)
$XmlB = New-Object System.XML.XMLDocument
$XmlB.Load($PatientB)
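You can see the mismatch for yourself by comparing the two locations; a quick illustration (the second value typically stays wherever the process was started):
Set-Location C:\Users\fynne\test
$PWD.Path                        # C:\Users\fynne\test (PowerShell's provider location)
[Environment]::CurrentDirectory  # often still C:\Users\fynne (process working directory)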
If you want to attempt further validation of the path, you can also use Resolve-Path, which includes provider metadata:
$PatientA = $args[0] | Resolve-Path
if ($PatientA.Provider.Name -ne 'FileSystem') {
    throw 'First path was not a file system path...'
}

An exception occurred during a WebClient request (Powershell)

I'm trying to copy a directory from our HTTP server using PowerShell. I would like to copy its entire contents, including subfolders, into the local drive of my current server. The point of this is server deployment automation, so that my boss can run my PowerShell script and have an entire server set up with all our folders copied to its C: drive. This is the code I have:
$source = "http://servername/serverupdates/deploy/Program%20Files/"
$destination = "C:\Program Files"
$client = new-object System.Net.WebClient
$client.DownloadFile($source, $destination)
When I run the script in PowerShell ISE as admin, I get the error message:
"Exception calling "DownloadFile" with "2" argument(s): "An exception occurred during a WebClient request."
Any suggestions on what could be going on?
I have also tried this block of code, but nothing happens when I run it, no errors or anything.
$source = "http://serverName/serverupdates/deploy/Program%20Files/"
$webclient = New-Object system.net.webclient
$destination = "c:/users/administrator/desktop/test/"

Function Copy-Folder([string]$source, [string]$destination, [bool]$recursive) {
    if (!$(Test-Path($destination))) {
        New-Item $destination -type directory -Force
    }

    # Get the file list from the web page
    $webString = $webClient.DownloadString($source)
    $lines = [Regex]::Split($webString, "<br>")

    # Parse each line, looking for files and folders
    foreach ($line in $lines) {
        if ($line.ToUpper().Contains("HREF")) {
            # File or Folder
            if (!$line.ToUpper().Contains("[TO PARENT DIRECTORY]")) {
                # Not Parent Folder entry
                $items = [Regex]::Split($line, """")
                $items = [Regex]::Split($items[2], "(>|<)")
                $item = $items[2]
                if ($line.ToLower().Contains("&lt;dir&gt;")) {
                    # Folder
                    if ($recursive) {
                        # Subfolder copy required
                        Copy-Folder "$source$item/" "$destination$item/" $recursive
                    } else {
                        # Subfolder copy not required
                    }
                } else {
                    # File
                    $webClient.DownloadFile("$source$item", "$destination$item")
                }
            }
        }
    }
}
System.Net.WebClient.DownloadFile expects the second parameter to be a filename, not a directory. It can't download a directory recursively; it can only download a single file.
For the second part, run it line by line and see what happens. Note that the snippet only defines the Copy-Folder function and never calls it, which would explain why nothing happens. Also, parsing HTML to get paths is prone to error and is generally advised against.
My advice: don't use HTTP for this. Copy the stuff from a file share; it's only one line and saves you a lot of trouble. If you have to use HTTP, download an archive and extract it in the target directory.
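If you go the archive route, a rough sketch might look like this (the URL and paths are placeholders; Expand-Archive requires PowerShell 5 or later):
# Illustrative only: download a single archive and extract it in place
$zipUrl  = "http://servername/serverupdates/deploy.zip"
$zipFile = Join-Path $env:TEMP "deploy.zip"
(New-Object System.Net.WebClient).DownloadFile($zipUrl, $zipFile)
Expand-Archive -Path $zipFile -DestinationPath "C:\Program Files" -Force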
Besides @Gerald Schneider's answer, beware that the same WebException might also occur if the client process does not have the needed permission to create the output file.
I would suggest you take the following strategy (a sketch follows the list):
1. Download the file to a unique filename with a .tmp (or .txt) extension in the Windows temporary folder, to avoid write-permission and other permission issues
2. Move the temporary file to the destination folder
3. Rename the temporary file to the destination filename
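A rough sketch of that strategy (the $source and $destination values and the final filename are illustrative only):
# Illustrative only: download to a uniquely named file in the temp folder first
$tempFile = Join-Path ([System.IO.Path]::GetTempPath()) ([System.IO.Path]::GetRandomFileName() + ".tmp")
$client = New-Object System.Net.WebClient
$client.DownloadFile($source, $tempFile)

# Then move it to the destination folder under its final name
Move-Item -Path $tempFile -Destination (Join-Path $destination "update.exe")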
Hope it helps :-)
In addition to the other answers, the error might also occur if you've run out of disk space.

How do I query a file on FTP server in PowerShell to determine if an upload is required?

The project is an MVC website coded and built using VS2017 and (on premises) TFS2017. The Build Definition is currently working and publishing to the staging location upon check-in.
The PowerShell script below, derived from David Kittle's website, is being used but it uploads all files every time. I abbreviated the listing using comments to focus on the part of the script for which I'd like to ask for help/guidance.
# Setup the FTP connection, destination URL and local source directory
# Put the folders and files to upload into $Srcfolders and $SrcFiles
# Create destination folders as required
# Start file uploads
foreach ($entry in $SrcFiles)
{
    # Create full destination filename from $entry and put it into $DesFile
    $uri = New-Object System.Uri($DesFile)
    # NEED TO GET THE REMOTE FILE DATA HERE TO TEST AGAINST THE LOCAL FILE
    If (#perform a test to see if the file needs to be uploaded)
    { $webclient.UploadFile($uri, $SrcFullname) }
}
In the last few lines of the script above, I need to determine if a source file requires upload. I am assuming I can check the time stamp to determine this. So:
If my assumption is wrong, please advise the best way to check for a required upload.
If my assumption is correct, how do I (1) retrieve the time stamp from the remote server and then (2) make the check against the local file?
You can use the FtpWebRequest class with its GetDateTimestamp FTP "method" and parse the UTC timestamp string it returns. The format is specified by RFC 3659 to be YYYYMMDDHHMMSS[.sss].
That will work only if the FTP server supports the MDTM command that the method uses under the hood (most servers do, but not all).
$url = "ftp://ftp.example.com/remote/folder/file.txt"

$ftprequest = [System.Net.FtpWebRequest]::Create($url)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp

$response = $ftprequest.GetResponse().StatusDescription
$tokens = $response.Split(" ")
$code = $tokens[0]

if ($code -eq 213)
{
    Write-Host "Timestamp is $($tokens[1])"
}
else
{
    Write-Host "Error $response"
}
It would output something like:
Timestamp is 20171019230712
Now you parse it and compare it against the UTC timestamp of a local file:
(Get-Item "file.txt").LastWriteTimeUtc
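For the parsing step, a minimal sketch, assuming $tokens[1] holds the YYYYMMDDHHMMSS string from the code above and the server omits the optional fractional seconds:
# Parse the MDTM timestamp (UTC) and compare it with the local file's UTC write time
$remoteWriteTime = [DateTime]::ParseExact(
    $tokens[1], "yyyyMMddHHmmss",
    [System.Globalization.CultureInfo]::InvariantCulture)
$localWriteTime = (Get-Item "file.txt").LastWriteTimeUtc

if ($localWriteTime -gt $remoteWriteTime)
{
    # Local file is newer, so an upload is required
    Write-Host "Upload required"
}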
Or save yourself some time and use an FTP library/tool that can do this for you.
For example with WinSCP .NET assembly, you can synchronize a whole local folder with a remote folder with one call to the Session.SynchronizeDirectories. Or you can limit the synchronization to a set of files only (see the sketch after the example below).
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Ftp
$sessionOptions.HostName = "ftpsite.com"

$session = New-Object WinSCP.Session

# Connect
$session.Open($sessionOptions)

$result = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Remote, "C:\local\folder", "/remote/folder", $False)
$result.Check()
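And to limit the synchronization to a set of files only, one way is a file mask on the transfer options; a sketch, reusing the session from the example above:
# A sketch: restrict the synchronization to *.txt files via TransferOptions.FileMask
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "*.txt"

$result = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Remote, "C:\local\folder", "/remote/folder",
    $False, $False, [WinSCP.SynchronizationCriteria]::Time, $transferOptions)
$result.Check()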
To use the assembly, just extract the contents of the .NET assembly package to your script folder. No other installation is needed.
The assembly supports not only the MDTM, but also other alternative methods to retrieve the timestamp.
(I'm the author of WinSCP)

Error in PowerShell due to copying the content of an S3 bucket

I copy the content of an S3 bucket to a local directory; however, I get an error output from PowerShell:
Copy-S3Object : The requested range is not satisfiable
It is pointing to this command:
Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
Why do I get this error? Note that the files that need to be copied do indeed get copied locally.
I can't say why you are getting that error returned from S3, but I can tell you that if you are copying multiple objects you probably want to use the -LocalFolder parameter, not -LocalFile. -LocalFolder will preserve the prefixes as subpaths.
When downloading one or more objects from S3, the Read-S3Object cmdlet works the same as Copy-S3Object, but uses -KeyPrefix to specify the common prefix the objects share, and -Folder to indicate the folder they should be downloaded to.
This also reminds me I need to check why we used -LocalFolder on Copy-, and -Folder on Read- although I suspect aliases may also be available to make them consistent.
HTH
(Edit): I spent some time this morning reviewing the cmdlet code and it doesn't appear to me the cmdlet would work as-is on a multi-object download, even though it has a -LocalFolder parameter. If you have a single object to download, then using -Key/-LocalFile is the correct parameter combination. If -LocalFolder is passed, the cmdlet sets up internally to do a single file download instead of treating -Key as a common key prefix to a set of objects. So, I think we have a bug here that I'm looking into.
In the meantime, I would use Read-S3Object to do your downloads. It supports both single (-Key) or multi-object download (-KeyPrefix) modes. https://docs.aws.amazon.com/powershell/latest/reference/index.html?page=Read-S3Object.html&tocid=Read-S3Object
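For example, a multi-object download with Read-S3Object might look like this (the bucket, prefix, and folder values are placeholders):
# Illustrative only: download every object sharing a common key prefix
Read-S3Object -BucketName $bucket -KeyPrefix "data/" -Folder "C:\Downloads\data" -Region $region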
This seems to occur with folders that do not contain files, since the copy wants to copy files.
I accepted this error and trapped it:
catch [Amazon.S3.AmazonS3Exception]
{
    # get error record
    [Management.Automation.ErrorRecord]$e = $_

    # retrieve information about runtime error
    $info = [PSCustomObject]@{
        Exception = $e.Exception.Message
        Reason    = $e.CategoryInfo.Reason
        Target    = $e.CategoryInfo.TargetName
        Script    = $e.InvocationInfo.ScriptName
        Line      = $e.InvocationInfo.ScriptLineNumber
        Column    = $e.InvocationInfo.OffsetInLine
        ErrorCode = $e.Exception.ErrorCode
    }

    if ($info.ErrorCode -eq "InvalidRange") {
        # do nothing
    } else {
        # output information; post-process collected info and log it (optional)
        Write-Host $info -ForegroundColor Red
    }
}
This happened to me when I tried to download a file which had more than one dot in it. Simplifying the file name fixed the error.
File name that gave me the error: myfile-18.10.exe
File name that worked: myfile-1810.exe