I have a PowerShell script that I want to execute from the command line. For this I need to pass in paths to XML files. It's easier to just show the problem.
Error: File not Found
I have all my files in the same directory, C:\Users\fynne\test (the reason for that is the users I'm writing this script for), but somehow the test directory is skipped while loading the XML (C:\Users\fynne\01-38-029.xml not found). I have no idea why this happens.
This is how I load the XML.
$PatientA = $args[0]
$PatientB = $args[1]
$XmlA = New-Object System.XML.XMLDocument
$XmlA.Load($PatientA)
$XmlB = New-Object System.XML.XMLDocument
$XmlB.Load($PatientB)
Does somebody know why, and is there a fix for it? I know that I can fix it with some string manipulation and $pwd, but I'd rather not.
Thanks
PowerShell will attempt to resolve non-absolute paths relative to the current provider location - but .NET methods (like XmlDocument.Load()) will resolve them relative to the working directory of the current process instead.
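You can see the mismatch for yourself; a minimal illustration (using the path from your question):

# PowerShell's provider location follows Set-Location...
Set-Location C:\Users\fynne\test
Get-Location                     # C:\Users\fynne\test

# ...but the process working directory used by .NET methods does not
[Environment]::CurrentDirectory  # may still be e.g. C:\Users\fynne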
You can manually convert a relative path to an absolute one with Convert-Path:
$PatientA,$PatientB = $args[0..1] |Convert-Path
$XmlA = New-Object System.XML.XMLDocument
$XmlA.Load($PatientA)
$XmlB = New-Object System.XML.XMLDocument
$XmlB.Load($PatientB)
If you want to attempt further validation of the path, you can also use Resolve-Path which includes provider metadata:
$PatientA = $args[0] | Resolve-Path
if ($PatientA.Provider.Name -ne 'FileSystem') {
    throw 'First path was not a file system path...'
}
Hello PowerShell Experts,
The script snippet below works when adding files to a Zip file. However, if the file to be added is open in another program, it fails with the exception "The process cannot access the file[..]". I tried using [IO.FileShare]::ReadWrite but no success yet.
Any suggestion as to how to open the files for reading and writing them to the zip, regardless of whether the file is open in another program or not?
Script Source
# write entries with relative paths as names
foreach ($fname in $FullFilenames) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''
    Write-Output $rname
    $zentry = $zip.CreateEntry($rname)
    $zentryWriter = New-Object -TypeName System.IO.BinaryWriter $zentry.Open()
    $zentryWriter.Write([System.IO.File]::ReadAllBytes($fname)) #FAILS HERE
    $zentryWriter.Flush()
    $zentryWriter.Close()
}
Since we're missing some important parts of your code, I'll just assume what might work in this case, with the following assumptions based on your comments.
First you would open the file with FileShare.ReadWrite:
$handle = [System.IO.File]::Open($fname, 'Open', 'Read', 'ReadWrite')
Then you should be able to use the .CopyTo(Stream) method from FileStream:
$zentry = $zip.CreateEntry($rname)
$zstream = $zentry.Open()
$handle.CopyTo($zstream)
$zstream.Flush()
$zstream.Dispose()
$handle.Dispose()
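Put together with your original loop, a rough sketch (untested, reusing your existing $zip and $FullFilenames variables) might look like this:

foreach ($fname in $FullFilenames) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''

    # Open the source file while allowing other processes to keep it open for read/write
    $handle = [System.IO.File]::Open($fname, 'Open', 'Read', 'ReadWrite')

    # Stream the file contents straight into the zip entry
    $zentry = $zip.CreateEntry($rname)
    $zstream = $zentry.Open()
    $handle.CopyTo($zstream)

    $zstream.Flush()
    $zstream.Dispose()
    $handle.Dispose()
}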
The project is an MVC website coded and built using VS2017 and (on premises) TFS2017. The Build Definition is currently working and publishing to the staging location upon check-in.
The PowerShell script below, derived from David Kittle's website, is being used but it uploads all files every time. I abbreviated the listing using comments to focus on the part of the script for which I'd like to ask for help/guidance.
# Setup the FTP connection, destination URL and local source directory
# Put the folders and files to upload into $Srcfolders and $SrcFiles
# Create destination folders as required

# start file uploads
foreach ($entry in $SrcFiles)
{
    # Create full destination filename from $entry and put it into $DesFile
    $uri = New-Object System.Uri($DesFile)

    # NEED TO GET THE REMOTE FILE DATA HERE TO TEST AGAINST THE LOCAL FILE
    If (#perform a test to see if the file needs to be uploaded)
    { $webclient.UploadFile($uri, $SrcFullname) }
}
In the last few lines of the script above, I need to determine if a source file requires upload. I am assuming I can check the timestamp to determine this. So:
If my assumption is wrong, please advise the best way to check for a required upload.
If my assumption is correct, how do I (1) retrieve the time stamp from the remote server and then (2) make the check against the local file?
You can use the FtpWebRequest class with its GetDateTimestamp FTP "method" and parse the UTC timestamp string it returns. The format is specified by RFC 3659 to be YYYYMMDDHHMMSS[.sss].
That will work only if the FTP server supports the MDTM command, which the method uses under the covers (most servers do, but not all).
$url = "ftp://ftp.example.com/remote/folder/file.txt"

$ftprequest = [System.Net.FtpWebRequest]::Create($url)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp

$response = $ftprequest.GetResponse().StatusDescription
$tokens = $response.Split(" ")
$code = $tokens[0]

if ($code -eq 213)
{
    Write-Host "Timestamp is $($tokens[1])"
}
else
{
    Write-Host "Error $response"
}
It would output something like:
Timestamp is 20171019230712
Now you parse it, and compare against a UTC timestamp of a local file:
(Get-Item "file.txt").LastWriteTimeUtc
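A minimal sketch of that comparison (assuming the server returned the basic YYYYMMDDHHMMSS form without the optional fractional seconds):

# Parse the MDTM response (UTC) and compare with the local file's UTC timestamp
$remoteWriteTime = [DateTime]::ParseExact(
    $tokens[1], "yyyyMMddHHmmss",
    [System.Globalization.CultureInfo]::InvariantCulture)
$localWriteTime = (Get-Item "file.txt").LastWriteTimeUtc

if ($localWriteTime -gt $remoteWriteTime)
{
    Write-Host "Local file is newer - upload required"
}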
Or save yourself some time and use an FTP library/tool that can do this for you.
For example, with the WinSCP .NET assembly you can synchronize a whole local folder with a remote folder with one call to Session.SynchronizeDirectories. Or you can limit the synchronization to a set of files only.
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Ftp
$sessionOptions.HostName = "ftpsite.com"

$session = New-Object WinSCP.Session

# Connect
$session.Open($sessionOptions)

$result = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Remote, "C:\local\folder", "/remote/folder", $False)
$result.Check()
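To limit the synchronization to a subset of files, one option (a sketch assuming the documented WinSCP.TransferOptions.FileMask property; the mask itself is a placeholder) is to pass a transfer options object:

$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "*.aspx; *.css"   # placeholder mask for illustration

$result = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Remote, "C:\local\folder", "/remote/folder",
    $False, $False, [WinSCP.SynchronizationCriteria]::Time, $transferOptions)
$result.Check()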
To use the assembly, just extract the contents of the .NET assembly package to your script folder. No other installation is needed.
The assembly supports not only the MDTM, but also other alternative methods to retrieve the timestamp.
(I'm the author of WinSCP)
I copy the content of an S3 bucket to a local directory; however, I get an error output from PowerShell.
Copy-S3Object : The requested range is not satisfiable
It is pointing to this command:
Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
Why do I get this error? Note that the files that need to be copied do indeed get copied locally.
I can't say why you are getting that error returned from S3, but I can tell you that if you are copying multiple objects you probably want to use the -LocalFolder parameter, not -LocalFile. -LocalFolder will preserve the prefixes as subpaths.
When downloading one or more objects from S3, the Read-S3Object cmdlet works the same as Copy-S3Object, but uses -KeyPrefix to specify the common prefix the objects share, and -Folder to indicate the folder they should be downloaded to.
This also reminds me I need to check why we used -LocalFolder on Copy-, and -Folder on Read- although I suspect aliases may also be available to make them consistent.
HTH
(Edit): I spent some time this morning reviewing the cmdlet code and it doesn't appear to me the cmdlet would work as-is on a multi-object download, even though it has a -LocalFolder parameter. If you have a single object to download, then using -Key/-LocalFile is the correct parameter combination. If -LocalFolder is passed, the cmdlet sets up internally to do a single file download instead of treating -Key as a common key prefix to a set of objects. So, I think we have a bug here that I'm looking into.
In the meantime, I would use Read-S3Object to do your downloads. It supports both single-object (-Key) and multi-object (-KeyPrefix) download modes. https://docs.aws.amazon.com/powershell/latest/reference/index.html?page=Read-S3Object.html&tocid=Read-S3Object
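For example, a rough sketch of the multi-object form (the prefix and folder values here are placeholders):

# Download every object under the given key prefix into a local folder
Read-S3Object -BucketName $bucket -KeyPrefix "logs/2017/" -Folder "C:\temp\s3-download" -Region $region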
This seems to occur with folders that do not contain files, since the copy wants to copy files.
I accepted this error and trapped it.
catch [Amazon.S3.AmazonS3Exception]
{
    # get error record
    [Management.Automation.ErrorRecord]$e = $_

    # retrieve information about the runtime error
    $info = [PSCustomObject]@{
        Exception = $e.Exception.Message
        Reason    = $e.CategoryInfo.Reason
        Target    = $e.CategoryInfo.TargetName
        Script    = $e.InvocationInfo.ScriptName
        Line      = $e.InvocationInfo.ScriptLineNumber
        Column    = $e.InvocationInfo.OffsetInLine
        ErrorCode = $e.Exception.ErrorCode
    }

    if ($info.ErrorCode -eq "InvalidRange") {
        # do nothing
    } else {
        # output information; post-process the collected info and log it (optional)
        Write-Host $info -ForegroundColor Red
    }
}
This happened to me when I tried to download a file which had more than one dot in its name. Simplifying the file name fixed the error.
File name that gave me error: myfile-18.10.exe
File name that worked: myfile-1810.exe
I have this script that downloads all .txt and .log files, but I need to move them to another directory on the server after the download.
So far I just keep getting errors like "Cannot move 'file' to '/file'".
try
{
    # Load WinSCP .NET assembly
    Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

    # Setup session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.Protocol = [WinSCP.Protocol]::ftp
    $sessionOptions.HostName = "host"
    $sessionOptions.PortNumber = "port"
    $sessionOptions.UserName = "user"
    $sessionOptions.Password = "pass"

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.DisableVersionCheck = "true"
        $session.Open($sessionOptions)

        $localPath = "C:\users\user\desktop\file"
        $remotePath = "/"
        $fileName = "*.txt"
        $fileNamee = "*.log"
        $remotePath2 = "/completed"

        $directoryInfo = $session.ListDirectory($remotePath)
        $directoryInfo = $session.ListDirectory($remotePath2)

        # Download the file
        $session.GetFiles(($remotePath + $fileName), $localPath).Check()
        $session.GetFiles(($remotePath + $fileNamee), $localPath).Check()
        $session.MoveFile(($remotePath + $fileName, $remotePath2)).Check()
        $session.MoveFile(($remotePath + $fileNamee, $remotePath2)).Check()
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }

    exit 0
}
catch [Exception]
{
    Write-Host $_.Exception.Message
    exit 1
}
You have many problems in your code:
The targetPath argument of the Session.MoveFile method is the path to move/rename the file to.
So, if you use the target path /completed, you are trying to move the file to the root folder and rename it to completed, while you probably want to move the file to the folder /completed and keep its name. For that, use the target path /completed/ (or /completed/* to make it more obvious).
Your current code fails because you are renaming the file to the name of an already existing folder.
You actually have the same bug in the .GetFiles calls. You are downloading all files (both *.txt and *.log) to the folder C:\users\user\desktop and saving them all under the same name, file, overwriting one another.
You have brackets incorrectly around both arguments, instead of around the first argument only. While I'm no PowerShell expert, I'd actually say you are omitting the second argument of the method completely this way.
Further, note that the MoveFile method does not return anything (contrary to the GetFiles). So there's no object to call the .Check() method on.
The MoveFile method (note the singular, compared to GetFiles) moves only a single file, so you should not use a file mask. The present implementation happens to accept a file mask, but this use is undocumented and may be deprecated in future versions.
Anyway, the best solution is to iterate the list of actually downloaded files, as returned by GetFiles, and move the files one by one.
This way you avoid a race condition, where you download a set of files, new files are added (which you didn't download), and you incorrectly move them to the "completed" folder.
The code should look like (for the first set of files only, i.e. the *.txt):
$remotePath2 = "/completed/"
...
$transferResult = $session.GetFiles(($remotePath + $fileName), $localPath)
$transferResult.Check()
foreach ($transfer in $transferResult.Transfers)
{
    $session.MoveFile($transfer.FileName, $remotePath2)
}
Note that this does not include a fix for the $localPath, as I'm not sure what the path C:\users\user\desktop\file is actually meant to be.
There's actually a very similar sample code available:
Moving local files to different location after successful upload
Have you checked to make sure your process has rights to move files to the new directory?
I am doing what Martin suggests here, with success.
But I was stuck for some time.
After running $session.MoveFile(), the file in the origin folder is gone, but it does not show up in the destination folder.
The files only appear in the destination after the session is disposed automatically after some time period (around 30 minutes, I guess).
To avoid this confusion, dispose of the session explicitly, like this:
$session.Dispose()
I know this is trivial, but I hope you don't run into the same problem.
I'm running a Windows Service (Hudson) which in turn spawns a PowerShell process to run my custom PowerShell commands. Part of my script is to unzip a file using CopyHere. When I run this script locally, I see a progress dialog pop up as the files are extracted and copied. However, when this runs under the service, it hangs at the point where a dialog would otherwise appear.
Here's the unzip portion of my script.
# Extract the contents of a zip file to a folder
function Extract-Zip {
    param([string]$zipFilePath, [string]$destination)

    if (test-path($zipFilePath)) {
        $shellApplication = new-object -com shell.application
        $zipFile = get-item $zipFilePath
        $zipFolder = $shellApplication.NameSpace($zipFile.fullname)
        $destinationFile = get-item $destination
        $destinationFolder = $shellApplication.NameSpace($destinationFile.fullname)
        $destinationFolder.CopyHere($zipFolder.Items())
    }
}
I suspect that because it's running under a service process, which is headless (no interaction with the desktop), it's somehow stuck trying to display a dialog.
Is there a way around this?
If it's still relevant, I managed to fix this by passing 1564 as the CopyHere flags parameter.
So in my case the extract-zip function looks like:
function Expand-ZIPFile {
    param(
        $file, $destination
    )

    $shell = new-object -com shell.application
    $zip = $shell.NameSpace($file)
    foreach ($item in $zip.items())
    {
        $shell.Namespace($destination).copyhere($item, 1564)
        "$($item.path) extracted"
    }
}
The description of 1564 (the sum 4 + 8 + 16 + 512 + 1024) can be found here - http://msdn.microsoft.com/en-us/library/windows/desktop/bb787866(v=vs.85).aspx:
(4) Do not display a progress dialog box.
(8) Give the file being operated on a new name in a move, copy, or rename operation if a file with the target name already exists.
(16) Respond with "Yes to All" for any dialog box that is displayed.
(512) Do not confirm the creation of a new directory if the operation requires one to be created.
(1024) Do not display a user interface if an error occurs.
If this is running on Vista or Windows 7, popping up UI from a service isn't going to be seen by the end user as you suspected. See this paper on Session 0 Isolation. However, does the progress dialog require user input? If not, I wouldn't think that would cause the service to hang. I would look for an option to disable the progress display. If you can't find that, then try switching to another ZIP extractor. PSCX 1.2 comes with an Expand-Archive cmdlet. I'm sure there are also others available.
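If switching extractors is an option, one UI-free alternative (not mentioned in the original answers; a minimal sketch assuming .NET 4.5 or later is installed on the machine running the service) is the System.IO.Compression.ZipFile class:

# Extract a zip archive without any shell UI, using the .NET framework directly
Add-Type -AssemblyName System.IO.Compression.FileSystem
[System.IO.Compression.ZipFile]::ExtractToDirectory($zipFilePath, $destination)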
Looking at the documentation for PowerShell, it looks like the -NonInteractive option may help here.