An exception occurred during a WebClient request (PowerShell)

I'm trying to copy a directory from our HTTP server using PowerShell, including its entire contents and subfolders, to the local drive of my current server. The point of this is server deployment automation, so that my boss can run my PowerShell script and have an entire server set up with all our folders copied to its C: drive. This is the code I have:
$source = "http://servername/serverupdates/deploy/Program%20Files/"
$destination = "C:\Program Files"
$client = new-object System.Net.WebClient
$client.DownloadFile($source, $destination)
When I run the script in PowerShell ISE as admin, I get the error message:
"Exception calling "DownloadFile" with "2" argument(s): "An exception occurred during a WebClient request."
Any suggestions on what could be going on?
I have also tried this block of code, but nothing happens when I run it, no errors or anything.
$source = "http://serverName/serverupdates/deploy/Program%20Files/"
$webclient = New-Object system.net.webclient
$destination = "c:/users/administrator/desktop/test/"
Function Copy-Folder([string]$source, [string]$destination, [bool]$recursive) {
if (!$(Test-Path($destination))) {
New-Item $destination -type directory -Force
}
# Get the file list from the web page
$webString = $webClient.DownloadString($source)
$lines = [Regex]::Split($webString, "<br>")
# Parse each line, looking for files and folders
foreach ($line in $lines) {
if ($line.ToUpper().Contains("HREF")) {
# File or Folder
if (!$line.ToUpper().Contains("[TO PARENT DIRECTORY]")) {
# Not Parent Folder entry
$items =[Regex]::Split($line, """")
$items = [Regex]::Split($items[2], "(>|<)")
$item = $items[2]
if ($line.ToLower().Contains("<dir&gt")) {
# Folder
if ($recursive) {
# Subfolder copy required
Copy-Folder "$source$item/" "$destination$item/" $recursive
} else {
# Subfolder copy not required
}
} else {
# File
$webClient.DownloadFile("$source$item", "$destination$item")
}
}
}
}
}

System.Net.WebClient.DownloadFile expects the second parameter to be a file name, not a directory. It can't download a directory recursively; it can only download a single file.
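For example, downloading a single file looks something like this (the file name below is just a placeholder; point it at an actual file on your server):

# A minimal sketch: both arguments must refer to a file, not a folder
$source = "http://servername/serverupdates/deploy/Program%20Files/somefile.exe"   # placeholder file name
$destination = "C:\Program Files\somefile.exe"   # full file path, not just a folder
$client = New-Object System.Net.WebClient
$client.DownloadFile($source, $destination)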
For the second part, run it line by line and see what happens. But parsing HTML to get paths is prone to error and is generally advised against.
My advice: don't use HTTP for this. Copy the files from a file share; it's only one line and saves you a lot of trouble. If you have to use HTTP, download an archive and extract it in the target directory.
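For instance, assuming the deploy folder is also exposed as a file share (the share path below is an assumption), the whole copy becomes:

# A minimal sketch, assuming \\servername\serverupdates is reachable as an SMB share
$source = "\\servername\serverupdates\deploy\Program Files"
$destination = "C:\Program Files"
# Copy the folder's contents, including subfolders, into the destination
Copy-Item -Path "$source\*" -Destination $destination -Recurse -Force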

Besides @Gerald Schneider's answer, beware that the same WebException might also occur if the client process does not have the permissions needed to create the output file.
I would suggest the following strategy:
Download the file to a unique file name with a .tmp (or .txt) extension in the Windows temporary folder, to avoid write-permission and other permission issues
Move the temporary file to the destination folder
Rename the temporary file to the destination file name
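A minimal sketch of that strategy (the URL and final destination are placeholders):

# Download to a uniquely named temporary file first, then move it into place
$source = "http://servername/serverupdates/deploy/somefile.zip"
$destination = "C:\Deploy\somefile.zip"
$tempFile = Join-Path ([System.IO.Path]::GetTempPath()) ([System.IO.Path]::GetRandomFileName() + ".tmp")
$client = New-Object System.Net.WebClient
$client.DownloadFile($source, $tempFile)
# Move the temporary file to the destination folder and give it its final name in one step
Move-Item -Path $tempFile -Destination $destination -Force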
Hope it helps :-)

In addition to the other answers, the error might occur if you've run out of disk space.

Related

PowerShell: Error when reading files that are open in other programs

Hello PowerShell Experts,
The script snippet below works when adding files to a Zip file. However, if a file to be added is open in another program, it fails with the exception "The process cannot access the file[..]". I tried using [IO.FileShare]::ReadWrite but no success yet.
Any suggestion as to how to open the files for reading and writing them to the zip, regardless of whether the file is open in another program or not?
Script Source
# write entries with relative paths as names
foreach ($fname in $FullFilenames) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''
    Write-Output $rname
    $zentry = $zip.CreateEntry($rname)
    $zentryWriter = New-Object -TypeName System.IO.BinaryWriter $zentry.Open()
    $zentryWriter.Write([System.IO.File]::ReadAllBytes($fname)) #FAILS HERE
    $zentryWriter.Flush()
    $zentryWriter.Close()
}
Since we're missing some important parts of your code, I'll just assume what might work in this case, with the following assumptions based on your comments.
First you would open the file with FileShare.ReadWrite:
$handle = [System.IO.File]::Open($fname, 'Open', 'Read', 'ReadWrite')
Then you should be able to use the .CopyTo(Stream) method from FileStream:
$zentry = $zip.CreateEntry($rname)
$zstream = $zentry.Open()
$handle.CopyTo($zstream)
$zstream.Flush()
$zstream.Dispose()
$handle.Dispose()
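Put together with the loop from the question, a sketch might look like this (assuming $zip and $FullFilenames are set up as in your original script):

# write entries with relative paths as names, reading files that other processes may have open
foreach ($fname in $FullFilenames) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''
    Write-Output $rname
    # Open the source file while allowing other processes to keep it open for read/write
    $handle = [System.IO.File]::Open($fname, 'Open', 'Read', 'ReadWrite')
    try {
        $zentry = $zip.CreateEntry($rname)
        $zstream = $zentry.Open()
        $handle.CopyTo($zstream)
        $zstream.Flush()
        $zstream.Dispose()
    }
    finally {
        $handle.Dispose()
    }
}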

PowerShell - Download the latest files from an FTP server

I am working on a PowerShell script that will pull files from an FTP site. The files are uploaded to the FTP site every hour, so I need to download the most recent one. The code I currently have downloads all of today's files instead of just one. How do I make it download only the most recent file?
Here is the code that I am currently using:
$ftpPath = 'ftp://***.***.*.*'
$ftpUser = '******'
$ftpPass = '******'
$localPath = 'C:\Temp'
$Date = Get-Date -Format "ddMMyyyy"
$Files = 'File1', 'File2'

function Get-FtpDir ($url, $credentials)
{
    $request = [Net.FtpWebRequest]::Create($url)
    if ($credentials) { $request.Credentials = $credentials }
    $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
    (New-Object IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd() -split "`r`n"
}

$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($ftpUser, $ftpPass)
$webclient.BaseAddress = $ftpPath

Foreach ($item in $Files)
{
    Get-FtpDir $ftpPath $webclient.Credentials |
        ? { $_ -Like $item + $Date + '*' } |
        % {
            $webClient.DownloadFile($_, (Join-Path $localPath $_))
        }
}
It's not easy with the FtpWebRequest. For your task, you need to know file timestamps.
Unfortunately, there's no really reliable and efficient way to retrieve timestamps using the features offered by FtpWebRequest/.NET Framework/PowerShell, as they do not support the FTP MLSD command. The MLSD command provides a listing of a remote directory in a standardized machine-readable format. Both the command and the format are standardized by RFC 3659.
Alternatives which you can use, and which are supported by the .NET Framework:
The ListDirectoryDetails method (an FTP LIST command) to retrieve details of all files in a directory, after which you deal with the FTP-server-specific format of the details (a *nix format similar to the ls *nix command is the most common; the drawback is that the format may change over time, as the "May 8 17:48" format is used for newer files and the "Oct 18 2009" format for older files)
The GetDateTimestamp method (an FTP MDTM command) to individually retrieve timestamps for each file. The advantage is that the response is standardized by RFC 3659 to YYYYMMDDHHMMSS[.sss]. The disadvantage is that you have to send a separate request for each file, which can be quite inefficient (see the sketch below).
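For illustration, a minimal sketch of the second approach with FtpWebRequest (the URL and credentials are placeholders):

# Query the timestamp of a single remote file via the FTP MDTM command
$request = [System.Net.FtpWebRequest]::Create("ftp://example.com/path/file.txt")
$request.Credentials = New-Object System.Net.NetworkCredential("user", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $request.GetResponse()
$timestamp = $response.LastModified   # the parsed MDTM reply
$response.Close()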
Some references:
C# class to parse WebRequestMethods.Ftp.ListDirectoryDetails FTP response
Parsing FtpWebRequest ListDirectoryDetails line
Retrieving creation date of file (FTP)
Alternatively, use a 3rd party FTP library that supports the MLSD command, and/or supports parsing of the proprietary listing format.
For example WinSCP .NET assembly supports both.
An example code:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "example.com"
    UserName = "user"
    Password = "mypassword"
}

$session = New-Object WinSCP.Session

# Connect
$session.Open($sessionOptions)

# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)

# Select the most recent file
$latest =
    $directoryInfo.Files |
    Where-Object { -Not $_.IsDirectory } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

# Any file at all?
if ($latest -eq $Null)
{
    Write-Host "No file found"
    exit 1
}

# Download the selected file
$sourcePath = [WinSCP.RemotePath]::EscapeFileMask($remotePath + $latest.Name)
$session.GetFiles($sourcePath, $localPath).Check()
For a full code, see Downloading the most recent file (PowerShell).
(I'm the author of WinSCP)
I tried this, but I get an error:
Error: Exception calling "ListDirectory" with "1" argument(s): "Error listing directory '/path/'.
Could not retrieve directory listing
Can't open data connection for transfer of "/path/"
I read a lot about this problem on the internet, but could not find a solution that seemed fairly simple, and I am not a network setup wizard. So I chose a different approach. In our case, the name of the file I want to automate the download for has the date specified in it: backup_2018_08_03_020003_1048387.bak
So we can get the file by using mget *2018_08_03* in a command-line ftp session.
Our backup procedure runs every morning at 01:00 AM, so we have a backup each day that we can fetch.
Of course, it would have been prettier and nicer to have a script that fetched the latest backup file based on the backup file timestamps, just in case something went wrong with the latest backup or the backup file naming format changes. This is just a script to fetch the backup for internal development purposes, so it's not a big deal if it breaks. I will look into this later and check whether I can make a cleaner solution.
I made a batch script which just asks for today's backup file using ordinary ftp command-prompt scripting.
It is important to get the formatting of today's date right. It must correctly match the formatting of the date in the filename.
If you want to use the script, you should replace the variables with your own information. You should also have write access to the directory you run it from.
This is the script that I made:
@Echo Off
Set _FTPServerName=xxx.xxx.xx.xxx
Set _UserName=Username
Set _Password=Password
Set _LocalFolder=C:\Temp
Set _RemoteFolder="/path/"
Set _Filename=*%date:~-4,4%_%date:~-7,2%_%date:~-10,2%*
Set _ScriptFile=ftptempscript
:: Create script
>"%_ScriptFile%" Echo open %_FTPServerName%
>>"%_ScriptFile%" Echo %_UserName%
>>"%_ScriptFile%" Echo %_Password%
>>"%_ScriptFile%" Echo lcd %_LocalFolder%
>>"%_ScriptFile%" Echo cd %_RemoteFolder%
>>"%_ScriptFile%" Echo binary
>>"%_ScriptFile%" Echo mget -i %_Filename%
>>"%_ScriptFile%" Echo quit
:: Run script
ftp -s:"%_ScriptFile%"
del "%_ScriptFile%"

Error in PowerShell when copying the contents of an S3 bucket

I copy the contents of an S3 bucket to a local directory; however, I get an error output from PowerShell.
Copy-S3Object : The requested range is not satisfiable
It is pointing to this command:
Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
Why do I get this error? Note that the files that need to be copied do indeed get copied locally.
I can't say why you are getting that error returned from S3, but I can tell you that if you are copying multiple objects you probably want to use the -LocalFolder parameter, not -LocalFile. -LocalFolder will preserve the prefixes as subpaths.
When downloading one or more objects from S3, the Read-S3Object cmdlet works the same as Copy-S3Object, but uses -KeyPrefix to specify the common prefix the objects share, and -Folder to indicate the folder they should be downloaded to.
This also reminds me I need to check why we used -LocalFolder on Copy-, and -Folder on Read- although I suspect aliases may also be available to make them consistent.
HTH
(Edit): I spent some time this morning reviewing the cmdlet code and it doesn't appear to me the cmdlet would work as-is on a multi-object download, even though it has a -LocalFolder parameter. If you have a single object to download, then using -Key/-LocalFile is the correct parameter combination. If -LocalFolder is passed, the cmdlet sets up internally to do a single file download instead of treating -Key as a common key prefix to a set of objects. So, I think we have a bug here that I'm looking into.
In the meantime, I would use Read-S3Object to do your downloads. It supports both single-object (-Key) and multi-object (-KeyPrefix) download modes. https://docs.aws.amazon.com/powershell/latest/reference/index.html?page=Read-S3Object.html&tocid=Read-S3Object
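For example, a multi-object download might look like this (the key prefix and local folder below are placeholders):

# Download every object under the given key prefix into a local folder
Read-S3Object -BucketName $bucket -KeyPrefix "backups/" -Folder "C:\Temp\backups" -Region $region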
This seems to occur with folders that do not contain files, since the copy wants to copy files.
I accepted this error and trapped it:
catch [Amazon.S3.AmazonS3Exception]
{
    # get error record
    [Management.Automation.ErrorRecord]$e = $_

    # retrieve information about runtime error
    $info = [PSCustomObject]@{
        Exception = $e.Exception.Message
        Reason    = $e.CategoryInfo.Reason
        Target    = $e.CategoryInfo.TargetName
        Script    = $e.InvocationInfo.ScriptName
        Line      = $e.InvocationInfo.ScriptLineNumber
        Column    = $e.InvocationInfo.OffsetInLine
        ErrorCode = $e.Exception.ErrorCode
    }

    if ($info.ErrorCode -eq "InvalidRange") {
        # do nothing
    } else {
        # output information. Post-process collected info, and log info (optional)
        Write-Host $info -ForegroundColor Red
    }
}
This happened to me when I tried to download a file which had more than one dot in it. Simplifying the file name fixed the error.
File name that gave me error: myfile-18.10.exe
File name that worked: myfile-1810.exe

PowerShell script for copying and logging

I have searched for similar answers to this and I am still going round in circles.
I am new to any form of scripting, so this is a bastardised script. The script basically copies log files and data from several locations to a remote server and appends to a log each time it does so, but for the life of me I can't get it to work over the network, only locally, by changing $dirname = "D:\${env:computername}".
I would appreciate any feedback and help. This came about from a batch file I created, and I thought I would try to progress in the dark arts.
The script is going to be scheduled as a task that runs when a machine connects to the network.
Thanks in advance.
Update
I get no output or error message in the log file at all, no text or data of any type. As for error messages: I am trying to copy from local to server in a VM scenario and it will not run, but if I apply this on the local machine it will copy C: to D: with no problem. As I said, complete novice.
Missing function body in function declaration
At line:2 char:1
<<<< c:\script\copy_log.ps1
+ CategoryInfo : ParserError: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : MissingFunctionBody
Apologies for the format; I had to type it out as I can't copy and paste from the unit.
UPDATE
I figured out that the share to the other server was not shared correctly. I fixed this, but the script still does not create a log file.
function CopyLogFiles ($sourcePackage) { # used this syntax as I couldn't get anything else to work and took it from here
    $dirName = "\\server\$sourcePackage" # server it is going to
    if (!(Test-Path $dirName)) { mkdir $dirName }
    Copy-Item -Path "C:\Program Files (x86)\ESS-T\$sourcePackage\Logs" -Destination $dirName -Recurse -Force
}

CopyLogFiles AppLauncher_V2.0.0.7
CopyLogFiles MMA_V2.0.0.12
CopyLogFiles MML_V2.0.0.4
CopyLogFiles SerialDataReader_V2.0.0.5

function Log-Write {
    Param ([string]$LogString)
    Add-Content $LogFile -Value $LogString
}

$LogFile = "C:\Program Files (x86)\ESS-T\.log"
Don't reinvent the wheel. Copy-Item is convenient for small cases, but Windows has had robocopy included with every install since Windows 7, and it's faster, more robust, and has logging built in with the /log:FILENAME switch.
https://technet.microsoft.com/en-us/library/cc733145.aspx
Go ahead and test for the existence of your destination and create it manually in your PowerShell script, but leave the logging of the copy operation to robocopy.
Edit: You aren't creating the logfile because you don't define the logfile name until after the rest of your code runs.
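A minimal sketch of that approach, reusing one of the packages from your script (the log file path is an assumption):

$source = "C:\Program Files (x86)\ESS-T\AppLauncher_V2.0.0.7\Logs"
$destination = "\\server\AppLauncher_V2.0.0.7"
$logFile = "C:\Temp\copy_log.txt"
# Create the destination yourself, then let robocopy do the copy and the logging
if (!(Test-Path $destination)) { New-Item -ItemType Directory -Path $destination | Out-Null }
# /E copies subdirectories including empty ones; /LOG+: appends to the log file
robocopy $source $destination /E /LOG+:$logFile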

Moving (not copying) remote files after download with WinSCP .NET assembly

I have this script that downloads all .txt and .log files. But I need to move them to another directory on the server after the download.
So far I just keep getting errors like "cannot move "file" to "/file".
try
{
    # Load WinSCP .NET assembly
    Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

    # Setup session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.Protocol = [WinSCP.Protocol]::ftp
    $sessionOptions.HostName = "host"
    $sessionOptions.PortNumber = "port"
    $sessionOptions.UserName = "user"
    $sessionOptions.Password = "pass"

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.DisableVersionCheck = "true"
        $session.Open($sessionOptions)

        $localPath = "C:\users\user\desktop\file"
        $remotePath = "/"
        $fileName = "*.txt"
        $fileNamee = "*.log"
        $remotePath2 = "/completed"

        $directoryInfo = $session.ListDirectory($remotePath)
        $directoryInfo = $session.ListDirectory($remotePath2)

        # Download the file
        $session.GetFiles(($remotePath + $fileName), $localPath).Check()
        $session.GetFiles(($remotePath + $fileNamee), $localPath).Check()
        $session.MoveFile(($remotePath + $fileName, $remotePath2)).Check()
        $session.MoveFile(($remotePath + $fileNamee, $remotePath2)).Check()
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }

    exit 0
}
catch [Exception]
{
    Write-Host $_.Exception.Message
    exit 1
}
You have many problems in your code:
The targetPath argument of the Session.MoveFile method is a path to move/rename the file to.
So, if you use the target path /completed, you are trying to move the file to the root folder and rename it to completed, while you probably want to move the file to the folder /completed and keep its name. For that, use the target path /completed/ (or /completed/* to make it more obvious).
Your current code fails because you are renaming the file to the name of an already existing folder.
You actually have the same bug in the .GetFiles: you are downloading all files (both *.txt and *.log) to the folder C:\users\user\desktop and saving them all under the same file name, overwriting one another.
You have brackets incorrectly around both arguments, instead of around the first argument only. While I'm no PowerShell expert, I'd actually say you are omitting the second argument of the method completely this way.
Further, note that the MoveFile method does not return anything (contrary to the GetFiles). So there's no object to call the .Check() method on.
The MoveFile (note the singular, compared to the GetFiles) moves only a single file, so you should not use a file mask. The present implementation actually allows the use of a file mask, but this use is undocumented and may be deprecated in future versions.
Anyway, the best solution is to iterate the list of actually downloaded files, as returned by the GetFiles, and move the files one by one.
This way you avoid a race condition, where you download a set of files, new files are added (which you didn't download), and you incorrectly move those to the "completed" folder.
The code should look like (for the first set of files only, i.e. the *.txt):
$remotePath2 = "/completed/"
...
$transferResult = $session.GetFiles(($remotePath + $fileName), $localPath)
$transferResult.Check()
foreach ($transfer in $transferResult.Transfers)
{
$session.MoveFile($transfer.FileName, $remotePath2)
}
Note that this does not include a fix for the $localPath, as I'm not sure what the path C:\users\user\desktop\file actually means.
There's actually a very similar sample code available:
Moving local files to different location after successful upload
Have you checked to make sure your process has rights to move files to the new directory?
I am doing what Martin suggests here, with success.
But I was stuck for some time.
After running $session.MoveFile(), the file in the origin folder is gone, but it does not show up in the destination folder.
The files only show up in the destination after the session is disposed, which happens automatically after some time period (around 30 minutes, I guess).
To avoid this confusion, dispose of the session yourself.
Like this:
$session.Dispose()
I know this is trivial, but I hope that you don't run into the same problem.