PowerShell: Error when reading files that are open in other programs

Hello PowerShell Experts,
The script snippet below works when adding files to Zip file. However, if the file to be added is open in another program then it fails with exception, "The process cannot access the file[..]". I tried using [IO.FileShare]::ReadWrite but no success yet.
Any suggestion as to how to open the files for reading and writing to zip regardless whether the file is open in another program or not?
Script Source
# write entries with relative paths as names
foreach ($fname in $FullFilenames) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''
    Write-Output $rname
    $zentry = $zip.CreateEntry($rname)
    $zentryWriter = New-Object -TypeName System.IO.BinaryWriter $zentry.Open()
    $zentryWriter.Write([System.IO.File]::ReadAllBytes($fname)) #FAILS HERE
    $zentryWriter.Flush()
    $zentryWriter.Close()
}

Since we're missing some important parts of your code, I'll assume what might work in this case, based on your comments.
First you would open the file with FileShare.ReadWrite:
$handle = [System.IO.File]::Open($fname, 'Open', 'Read', 'ReadWrite')
Then you should be able to use the .CopyTo(Stream) method from FileStream:
$zentry = $zip.CreateEntry($rname)
$zstream = $zentry.Open()
$handle.CopyTo($zstream)
$zstream.Flush()
$zstream.Dispose()
$handle.Dispose()
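Putting both pieces together, a minimal sketch of the whole loop might look like this (it assumes, as in your snippet, that $zip is an already opened [System.IO.Compression.ZipArchive] and that $FullFilenames holds the full paths):

# write entries with relative paths as names
foreach ($fname in $FullFilenames) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''
    Write-Output $rname

    # Open the source file while allowing other processes to keep it open for read/write
    $handle = [System.IO.File]::Open($fname, 'Open', 'Read', 'ReadWrite')
    try {
        $zentry = $zip.CreateEntry($rname)
        $zstream = $zentry.Open()
        try {
            # Stream the file contents straight into the zip entry
            $handle.CopyTo($zstream)
            $zstream.Flush()
        }
        finally {
            $zstream.Dispose()
        }
    }
    finally {
        $handle.Dispose()
    }
}

Note that this can still fail if the other program opened the file without sharing read access; FileShare.ReadWrite only states what sharing you allow, not what the other process allowed.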

Related

Powershell - Download the latest FTP files from Ftp server [duplicate]

I am working on a PowerShell script, which will pull files from an FTP site. The files are uploaded to the FTP site every hour so I need to download the most recent one. The code I currently have downloads all the files from today instead of just one file. How do I make it download only the most recent file?
Here is the code that I am currently using
$ftpPath = 'ftp://***.***.*.*'
$ftpUser = '******'
$ftpPass = '******'
$localPath = 'C:\Temp'
$Date = Get-Date -Format "ddMMyyyy"
$Files = 'File1', 'File2'

function Get-FtpDir ($url, $credentials)
{
    $request = [Net.FtpWebRequest]::Create($url)
    if ($credentials) { $request.Credentials = $credentials }
    $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
    (New-Object IO.StreamReader $request.GetResponse().GetResponseStream()) -split "`r`n"
}

$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($ftpUser, $ftpPass)
$webclient.BaseAddress = $ftpPath

Foreach ($item in $Files)
{
    Get-FTPDir $ftpPath $webclient.Credentials |
        ? { $_ -Like $item+$Date+'*' } |
        % {
            $webClient.DownloadFile($_, (Join-Path $localPath $_))
        }
}
It's not easy with the FtpWebRequest. For your task, you need to know file timestamps.
Unfortunately, there's no really reliable and efficient way to retrieve timestamps using the features offered by FtpWebRequest/.NET framework/PowerShell, as they do not support the FTP MLSD command. The MLSD command provides a listing of a remote directory in a standardized machine-readable format. The command and the format are standardized by RFC 3659.
Alternatives you can use, which are supported by the .NET framework:
ListDirectoryDetails method (an FTP LIST command) to retrieve details of all files in a directory, and then deal with the FTP server-specific format of the details (a *nix format similar to the ls *nix command is the most common; a drawback is that the format may change over time, as the "May 8 17:48" format is used for newer files and the "Oct 18 2009" format for older files)
GetDateTimestamp method (an FTP MDTM command) to individually retrieve the timestamp of each file. The advantage is that the response is standardized by RFC 3659 to YYYYMMDDHHMMSS[.sss]. The disadvantage is that you have to send a separate request for each file, which can be quite inefficient. (A minimal sketch of this approach follows below.)
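For illustration, a minimal sketch of the second approach (MDTM via GetDateTimestamp) might look like this; the URL and credentials are placeholders:

# Hypothetical example: ask the server for the modification timestamp of one remote file
$request = [System.Net.FtpWebRequest]::Create("ftp://example.com/path/file.txt")
$request.Credentials = New-Object System.Net.NetworkCredential("user", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp

$response = $request.GetResponse()
$response.LastModified   # the MDTM response, parsed into a DateTime by FtpWebResponse
$response.Close()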
Some references:
C# class to parse WebRequestMethods.Ftp.ListDirectoryDetails FTP response
Parsing FtpWebRequest ListDirectoryDetails line
Retrieving creation date of file (FTP)
Alternatively, use a 3rd party FTP library that supports the MLSD command, and/or supports parsing of the proprietary listing format.
For example, the WinSCP .NET assembly supports both.
Example code:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "example.com"
    UserName = "user"
    Password = "mypassword"
}

$session = New-Object WinSCP.Session

# Connect
$session.Open($sessionOptions)

# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)

# Select the most recent file
$latest =
    $directoryInfo.Files |
    Where-Object { -Not $_.IsDirectory } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

# Any file at all?
if ($latest -eq $Null)
{
    Write-Host "No file found"
    exit 1
}

# Download the selected file
$sourcePath = [WinSCP.RemotePath]::EscapeFileMask($remotePath + $latest.Name)
$session.GetFiles($sourcePath, $localPath).Check()
For a full code, see Downloading the most recent file (PowerShell).
(I'm the author of WinSCP)
I tried this, but I get an error:
Error: Exception calling "ListDirectory" with "1" argument(s): "Error listing directory '/path/'.
Could not retrieve directory listing
Can't open data connection for transfer of "/path/"
I read a lot about this problem on the internet, but could not find a solution that seemed fairly simple, and I am not a network setup wizard. So I chose a different approach. In our case, the filename of the file whose download I want to automate has the date specified in it: backup_2018_08_03_020003_1048387.bak
So we can get the file by using mget *2018_08_03* in a command-line ftp session.
Our backup procedure runs every morning at 01:00 AM, so we have a backup each day that we can fetch.
Of course, it would have been prettier and nicer to have a script that fetched the latest backup file based on the backup file timestamps, just in case something goes wrong with the latest backup or the backup file naming format changes. The script only fetches the backup for internal development purposes, so it's not a big deal if it breaks. I will look into this later and check whether I can make a cleaner solution.
I made a batch script which just asks for today's backup file with ordinary ftp command-prompt scripting.
It is important to get the formatting of today's date right. It must match the formatting of the date in the filename exactly.
If you want to use the script you should replace the variables with your own information. You should also have write access to the directory where you run it from.
This is the script that I made:
@Echo Off
Set _FTPServerName=xxx.xxx.xx.xxx
Set _UserName=Username
Set _Password=Password
Set _LocalFolder=C:\Temp
Set _RemoteFolder="/path/"
Set _Filename=*%date:~-4,4%_%date:~-7,2%_%date:~-10,2%*
Set _ScriptFile=ftptempscript
:: Create script
>"%_ScriptFile%" Echo open %_FTPServerName%
>>"%_ScriptFile%" Echo %_UserName%
>>"%_ScriptFile%" Echo %_Password%
>>"%_ScriptFile%" Echo lcd %_LocalFolder%
>>"%_ScriptFile%" Echo cd %_RemoteFolder%
>>"%_ScriptFile%" Echo binary
>>"%_ScriptFile%" Echo mget -i %_Filename%
>>"%_ScriptFile%" Echo quit
:: Run script
ftp -s:"%_ScriptFile%"
del "%_ScriptFile%"

How do I run a PowerShell script on a file from the context menu?

I have written a PS script which replaces a specific string at the beginning of the file, adds another piece of string to the end of the file, and finally puts out an XML file.
My code might be ugly (I am not a programmer/engineer or anything, just trying to make life easier for some family members who are running a small business), but it works:
$content = Get-Content -Path 'C:\Users\blabla\Desktop\4440341930.txt'
$newContent = $content -replace 'text to be replaced','this is going to replace stuff'
$newContent | Set-Content -Path 'C:\Users\blabla\Desktop\4440341930.txt'
Add-Content C:\Users\blabla\Desktop\4440341930.txt '</Items>'
$x = [xml](Get-Content "C:\Users\blabla\Desktop\4440341930.txt")
$x.Save("C:\Users\blabla\Desktop\4440341930.xml")
I would like them to be able to run this script from the context menu, by right-clicking on a txt file. I did a little research and I kind of get what I have to add to the Registry; however, I'm not sure how to make it work. Since the path of each file that they are going to right-click on is going to be different, the path that I'm specifying in $content is not going to work.
What do I have to modify in my code to be able to add it to the Registry?
To accomplish this you need to:
Create a Shortcut in the SendTo Folder: "$DestinationPath\AppData\Roaming\Microsoft\Windows\SendTo"
The target: "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe"
The Arguments: -File "d:\path\your PS1 file"
In your program read the file name passed by Explorer as:
Param
(
    [Parameter(Mandatory=$false)]
    [String] $FilePath
)
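Applied to the script from the question, a minimal sketch of the parameterized version might look like this (the replacement strings are the placeholders from the question):

Param
(
    [Parameter(Mandatory=$false)]
    [String] $FilePath
)

# Work on whatever file was passed in instead of a hard-coded path
$content = Get-Content -Path $FilePath
$newContent = $content -replace 'text to be replaced', 'this is going to replace stuff'
$newContent | Set-Content -Path $FilePath
Add-Content -Path $FilePath -Value '</Items>'

# Save the result next to the original file, with an .xml extension
$x = [xml](Get-Content -Path $FilePath)
$x.Save([System.IO.Path]::ChangeExtension($FilePath, 'xml'))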
I've written a Setup function that accomplishes steps 1-3, which I include in all the programs I want on the context menu; I then just run the program with the -Setup switch. We're not supposed to post developed code here, but if you can't figure it out, let me know and I'll post it and hope I don't get killed for it. LOL!
UPDATE:
If you want to pass more than one file you need to process the files a little differently. Delete the Param block above and then use this type of code to retrieve the files.
If ($Args.Count -eq 0) {
    # Nothing was passed from File Explorer
    $Message = "No Files were passed from File Explorer."
    # $MsgBox, $Buttons, $MBIcons and Show-PowerShell are helpers defined elsewhere in the full script
    [Void]$MsgBox::Show(
        "$Message", "System Exit", $Buttons::OK, $MBIcons::Stop)
    Show-PowerShell
    Exit #Comment out for testing from ISE!
}
Else {
    $FilesToCopy = $Args
}

An exception occurred during a WebClient request (Powershell)

I'm trying to copy a directory from our HTTP server using PowerShell. I would like to copy its entire contents, including subfolders, onto the local drive of my current server. The point of this is server deployment automation, so that my boss can run my PowerShell script and have an entire server set up with all our folders copied to its C: drive. This is the code I have:
$source = "http://servername/serverupdates/deploy/Program%20Files/"
$destination = "C:\Program Files"
$client = new-object System.Net.WebClient
$client.DownloadFile($source, $destination)
When I run the script in Powershell ISE as admin, I get the error message
"Exception calling "DownloadFile" with "2" argument(s): "An exception occurred during a WebClient request."
Any suggestions on what could be going on?
I have also tried this block of code, but nothing happens when I run it, no errors or anything.
$source = "http://serverName/serverupdates/deploy/Program%20Files/"
$webclient = New-Object system.net.webclient
$destination = "c:/users/administrator/desktop/test/"
Function Copy-Folder([string]$source, [string]$destination, [bool]$recursive) {
if (!$(Test-Path($destination))) {
New-Item $destination -type directory -Force
}
# Get the file list from the web page
$webString = $webClient.DownloadString($source)
$lines = [Regex]::Split($webString, "<br>")
# Parse each line, looking for files and folders
foreach ($line in $lines) {
if ($line.ToUpper().Contains("HREF")) {
# File or Folder
if (!$line.ToUpper().Contains("[TO PARENT DIRECTORY]")) {
# Not Parent Folder entry
$items =[Regex]::Split($line, """")
$items = [Regex]::Split($items[2], "(>|<)")
$item = $items[2]
if ($line.ToLower().Contains("<dir&gt")) {
# Folder
if ($recursive) {
# Subfolder copy required
Copy-Folder "$source$item/" "$destination$item/" $recursive
} else {
# Subfolder copy not required
}
} else {
# File
$webClient.DownloadFile("$source$item", "$destination$item")
}
}
}
}
}
System.Net.WebClient.DownloadFile expects the second parameter to be a filename, not a directory. It can't download a directory recursively, it can only download a single file.
For the second part, run it line by line and see what happens. But parsing HTML to get paths is prone to error and is generally advised against.
My advice: Don't use HTTP for this. Copy the stuff from a file share; it's only one line and saves you a lot of trouble. If you have to use HTTP, download an archive and extract it in the target directory.
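If you go the archive route, a minimal sketch might look like this (the archive URL and paths are placeholders, and Expand-Archive requires PowerShell 5.0 or the Microsoft.PowerShell.Archive module):

# Hypothetical archive prepared on the server, plus local paths
$source      = 'http://servername/serverupdates/deploy.zip'
$archive     = Join-Path $env:TEMP 'deploy.zip'
$destination = 'C:\Program Files'

# DownloadFile needs a file name as the second argument, not a directory
$client = New-Object System.Net.WebClient
$client.DownloadFile($source, $archive)

# Unpack the archive into the target directory and clean up
Expand-Archive -Path $archive -DestinationPath $destination -Force
Remove-Item $archive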
Besides @Gerald Schneider's answer, beware that the same WebException might also occur if the client process does not have the needed permission to create the output file.
I would suggest the following strategy (a minimal sketch follows the list):
Download the file to a unique filename with a .tmp (.txt) extension in the Windows temporary folder, to avoid write-permission and other permission issues
Move the temporary file to the destination folder
Rename the temporary file to the destination filename
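Something like this, where the URL and target path are placeholders:

# Hypothetical source file and final destination
$url       = 'http://servername/serverupdates/deploy/somefile.xml'
$finalPath = 'C:\Program Files\somefile.xml'

# 1. Download to a unique temporary file name in the Windows temp folder
$tempFile = Join-Path $env:TEMP ([System.IO.Path]::GetRandomFileName() + '.tmp')
(New-Object System.Net.WebClient).DownloadFile($url, $tempFile)

# 2. + 3. Move the temporary file to the destination folder under the final name
Move-Item -Path $tempFile -Destination $finalPath -Force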
Hope it helps :-)
In addition to the other answers, the error might also occur if you've run out of disk space.

Moving (not copying) remote files after download with WinSCP .NET assembly

I have this script that downloads all .txt and .log files. But I need to move them to another directory on the server after the download.
So far I just keep getting errors like "cannot move "file" to "/file".
try
{
    # Load WinSCP .NET assembly
    Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

    # Setup session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.Protocol = [WinSCP.Protocol]::ftp
    $sessionOptions.HostName = "host"
    $sessionOptions.PortNumber = "port"
    $sessionOptions.UserName = "user"
    $sessionOptions.Password = "pass"

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.DisableVersionCheck = "true"
        $session.Open($sessionOptions)

        $localPath = "C:\users\user\desktop\file"
        $remotePath = "/"
        $fileName = "*.txt"
        $fileNamee = "*.log"
        $remotePath2 = "/completed"

        $directoryInfo = $session.ListDirectory($remotePath)
        $directoryInfo = $session.ListDirectory($remotePath2)

        # Download the file
        $session.GetFiles(($remotePath + $fileName), $localPath).Check()
        $session.GetFiles(($remotePath + $fileNamee), $localPath).Check()
        $session.MoveFile(($remotePath + $fileName, $remotePath2)).Check()
        $session.MoveFile(($remotePath + $fileNamee, $remotePath2)).Check()
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }

    exit 0
}
catch [Exception]
{
    Write-Host $_.Exception.Message
    exit 1
}
You have many problems in your code:
The targetPath argument of the Session.MoveFile method is a path to move/rename the file to.
So, if you use the target path /completed, you are trying to move the file to the root folder and rename it to completed, while you probably want to move the file to the folder /completed and keep its name. For that, use the target path /completed/ (or /completed/* to make it more obvious).
Your current code fails because you are renaming the file to the name of an already existing folder.
You actually have the same bug in the .GetFiles. You are downloading all files (both *.txt and *.log) to the folder C:\users\user\desktop and saving them all to the same file named file, overwriting one another.
You have the brackets incorrectly around both arguments, instead of around the first argument only. While I'm no PowerShell expert, I'd say that this way you are actually omitting the second argument of the method completely.
Further, note that the MoveFile method does not return anything (contrary to GetFiles), so there's no object to call the .Check() method on.
MoveFile (note the singular, compared to GetFiles) moves only a single file, so you should not use a file mask. (The present implementation actually allows the use of a file mask, but this is undocumented and may be deprecated in future versions.)
Anyway, the best solution is to iterate the list of actually downloaded files, as returned by GetFiles, and move the files one by one.
This way you avoid a race condition, where you download a set of files, new files are added in the meantime (which you didn't download), and you incorrectly move those to the "completed" folder as well.
The code should look like (for the first set of files only, i.e. the *.txt):
$remotePath2 = "/completed/"
...
$transferResult = $session.GetFiles(($remotePath + $fileName), $localPath)
$transferResult.Check()
foreach ($transfer in $transferResult.Transfers)
{
    $session.MoveFile($transfer.FileName, $remotePath2)
}
Note that this does not include a fix for the $localPath, as I'm not sure what the path C:\users\user\desktop\file is actually meant to be.
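To cover both file sets from the question, a minimal sketch would simply repeat this per mask (reusing the variables from the question's code):

foreach ($mask in '*.txt', '*.log')
{
    # Download everything matching the mask, then move only what was actually downloaded
    $transferResult = $session.GetFiles(($remotePath + $mask), $localPath)
    $transferResult.Check()

    foreach ($transfer in $transferResult.Transfers)
    {
        $session.MoveFile($transfer.FileName, $remotePath2)
    }
}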
There's actually a very similar sample code available:
Moving local files to different location after successful upload
Have you checked to make sure your process has rights to move files to the new directory?
I am doing what Martin suggests here, with success.
But I was stuck for some time.
After running $session.MoveFile(), the file is gone from the origin folder, but it does not show up in the destination folder.
The files only appear in the destination after the session is disposed automatically, after some period of time (around 30 minutes, I guess).
To avoid this confusion, dispose the session explicitly.
Like this:
$session.Dispose()
I know this is trivial, but I hope you don't run into the same problem.