I'm using PowerShell to copy files to a remote computer with the following command:
Copy-Item -Path [MyPath]\* -Destination \\[server]\[MyPath] -force
It works great, but sometimes I receive the following error message: "The process cannot access the file [...] because it is being used by another process.".
Is it possible to force the file to be copied even if it's in use?
The only way to do that is to get rid of the handle the process has on the file you are overwriting.
It's either a service or a desktop application accessing the file. You can find out what has the file open using handle.exe from Sysinternals.
Once you know what is accessing the file, you can stop or kill it remotely (assuming you have permission to do so).
Stop a remote service
Stop a remote process (Invoke-Command)
Stop a remote process (WMI)
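For example, a minimal sketch of that workflow, assuming handle.exe is available on the remote server, PowerShell Remoting is enabled, and server01, SomeService, SomeApp and the file path are placeholders rather than values from the question:

# Find out which process holds a handle to the file (handle.exe must exist on server01).
Invoke-Command -ComputerName server01 -ScriptBlock {
    & handle.exe -accepteula "C:\MyPath\locked.txt"
}

# If it turns out to be a service, stop it remotely...
Invoke-Command -ComputerName server01 -ScriptBlock {
    Stop-Service -Name 'SomeService'
}

# ...or, if it is a regular process, stop that instead.
Invoke-Command -ComputerName server01 -ScriptBlock {
    Stop-Process -Name 'SomeApp' -Force
}

Once the handle is released, the original Copy-Item command should go through.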
So here is the situation: I am trying to automate copying some files from a network drive into a local folder on one of my servers. The task seems simple, and when I try the code with PowerShell or with xcopy on the command line, both work great.
I've installed a Jenkins agent on this Windows Server 2016 machine and run the agent as a service. When I try to run the same code from the Jenkins agent, it never works.
I tried starting the agent service as Local System and as the Windows network administrator, who has all the rights.
With PowerShell, I tried these lines:
Copy-Item -Path "\\server IP\directory\*" -Destination "D:\Directory\" -Verbose
and
Copy-Item -Path "z:\*" -Destination "D:\Directory\" -Verbose
Both return no error but did not copy the files, and when I tried the same thing with xcopy I just got "File not found" and nothing was copied:
xcopy "\\server IP\directory\*" "D:\Directory\" /f /s /h /y
xcopy "z:\*" "D:\Directory\" /f /s /h /y
With PowerShell, I also tried putting the Copy-Item command into a script and calling only the script from the Jenkins agent, and that didn't work either.
I am now running in circles and wonder how we are supposed to work with network drives from the Jenkins agent, or what I am doing wrong.
Note that other PowerShell code works fine locally.
I tried starting the agent service as Local System and as the Windows network administrator, who has all the rights
Local System doesn't have any network permissions by default. It is the machine account, so you would have to give the machine account access to \\server\share. That is not advisable, though, because permissions should be granted at a finer granularity. Also, Local System has far more local rights than Jenkins needs.
I don't know what you mean by "Windows network administrator". It sounds like that account would also have too many rights.
Usually, one creates a dedicated Jenkins (domain) user. This user will be granted access to any network paths it needs. Then you have two options:
Always run the Jenkins service as this user (easiest way).
Run Jenkins under another local user and connect to network drives as the domain user only on demand: net use \\server\share /user:YourDomain\Jenkins <Password>. This adds some security, as you don't need to give the domain user any local permissions.
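For the second option, a rough sketch of what the Jenkins build step could run might look like this (the share path, account name and password variable are placeholders, not values from the question):

# Connect to the share on demand with the dedicated domain user.
net use "\\server\directory" /user:YourDomain\Jenkins $env:JENKINS_SHARE_PASSWORD

# Copy the files now that the connection is authenticated.
Copy-Item -Path "\\server\directory\*" -Destination "D:\Directory\" -Verbose

# Drop the connection again so the credentials don't linger in the service session.
net use "\\server\directory" /delete

In a real job, the password would come from the Jenkins credentials store rather than a plain environment variable.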
Both return no error but did not copy the files
To improve error reporting, I suggest you set $ErrorActionPreference = 'Stop' at the beginning of your PowerShell scripts. This way the script stops execution and reports an error as soon as the first one happens. I usually wrap my PS scripts in a try/catch block to also show a script stack trace, which makes it much easier to pinpoint error locations:
$ErrorActionPreference = 'Stop'   # Make the script stop at the 1st error

try {
    Copy-Item -Path "\\server IP\directory\*" -Destination "D:\Directory\" -Verbose
    # Possibly more commands...

    # Indicate success to Jenkins
    exit 0
}
catch {
    Write-Host "SCRIPT ERROR: $($_.Exception)"

    if( $_.ScriptStackTrace ) {
        Write-Host ( "`t$($_.ScriptStackTrace)" -replace '\r?\n', "`n`t" )
    }

    # Indicate failure to Jenkins
    exit 1
}
In this case the stack trace is not very helpful, but it will come in handy when you call functions from other scripts, which in turn may call other scripts.
Please note: security is not an issue at all with this application.
I have a "MachineA" that has a file we will call file.txt. I need to get this file onto a "MachineB". Here's the catch, this needs to be done by remotely sending an Invoke-Command script from MachineB to MachineA to kick back the file.
What I have tried so far:
I have setup a shared drive (using net use) on MachineB that MachineA can access. This drive is known as z:\ to MachineA.
I can send this command via Invoke-Command from MachineB to MachineA and it will transfer a file from one directory to another within MachineA:
xcopy c:\users\administrator\desktop\file.txt c:\file.txt
Now if I send the following command in the same manner from MachineB to MachineA, it complains about not being able to see drive Z:
xcopy c:\users\administrator\desktop\file.txt z:\FilesFromVMs\file.txt
Note: if I run the second command directly on MachineA from PowerShell, it moves the file, but I need to be able to do this remotely via Invoke-Command.
If security is not an issue, maybe try using UNC paths (\\MachineA\C$\Users\...) instead of relying on mapped drive letters? Perhaps that'll do the trick.
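One way to read that suggestion for this setup is a sketch like the following; the share layout on MachineB (here its administrative C$ share) is an assumption, not something stated in the question:

# Run from MachineB; the script block on MachineA writes straight to the UNC path
# instead of the Z: drive, which only exists in MachineA's interactive session.
Invoke-Command -ComputerName MachineA -ScriptBlock {
    Copy-Item -Path 'C:\Users\Administrator\Desktop\file.txt' -Destination '\\MachineB\C$\FilesFromVMs\file.txt'
}

Note that if the destination share is locked down, this can still fail with the double-hop problem, since the remote session on MachineA cannot forward your credentials to MachineB.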
The script I'm working on right now performs a bunch of work on the email server for processing litigation hold requests. One component of this script copies all the data from an identified folder on a file server to a secure backup location that can't be altered, which happens to reside on the same remote server. I have the script working fine, but it copies all the files across the network from the file server to the email server where the PowerShell script is run, and then pushes them right back to the "hold" folder on the same file server they came from. This is creating severe delays, as some of these folders contain several thousand files ranging from a few bytes to several megabytes each.
I'm looking for a way for my PowerShell script running on the email server to issue a remote command to the file server to copy a folder from one directory to another (all on the same remote server). Any suggestions?
Currently, I'm using the robocopy command in my script but I'm open to other ways of performing this.
robocopy "Source Folder" "Destination Folder" /e
I'm looking for something like:
PSEXEC \\FileServer -s -d Copy "Source folder" to "Destination folder"
I just can't seem to find any material on the old google train that satisfies the situation.
psexec \\fileserver -s robocopy "source" "destination" /e
should work just fine. With PSRemoting enabled on the fileserver you could also do
Invoke-Command -ComputerName fileserver -ScriptBlock {
    & robocopy "source" "destination" /e
}
Note that this assumes local paths on the remote server. If you're working with network shares on the remote host you'll be running into the second hop problem.
For some strange reason the first attempt to use PSEXEC prior to posting resulted in an error stating it didn't know how to handle the command. But I plugged in the PSEXEC command again and it worked perfectly today.
Thank you.
I'm running the command shown in the image and it's giving that error.
Please suggest a way forward.
Actually, I am trying to copy the build from the local system to the remote server using a PowerShell script,
and I want to copy the files from one remote server to another remote server using the local system.
Please provide solutions for this, and what are the requirements?
You can use Copy-Item:
Copy-Item -Path "Your\path\to\folderorfile" -Destination "\\servernameorip\dest\location\"
If you have different credentials on the destination server, you can either trick Windows by creating a local account with the same name and password on your machine, or copy with different credentials.
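For the "copy with different credentials" route, one option (a sketch, not part of the original answer) is to map the destination with New-PSDrive and explicit credentials before copying; the drive name and paths are placeholders:

# Ask for (or otherwise obtain) credentials that are valid on the destination server.
$cred = Get-Credential

# Map the remote share with those credentials, copy, then remove the mapping again.
New-PSDrive -Name Dest -PSProvider FileSystem -Root "\\servernameorip\dest\location" -Credential $cred | Out-Null
Copy-Item -Path "Your\path\to\folderorfile" -Destination "Dest:\"
Remove-PSDrive -Name Dest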
I am trying to execute a PowerShell script remotely using Invoke-Command. The script relies on a configuration file that is available over the local network. The script is called in the following way:
Invoke-Command -ComputerName 192.168.137.181 -FilePath c:\scripts\script.ps1 -ArgumentList \\192.168.137.1\share\config.xml
The configuration, as you can see, is an XML file, and it's loaded using:
$xml = New-Object XML
$xml.Load($args[0])
When the script is called locally on the machine, it runs without any problems and reads the configuration file. However, when I run it from a different machine using Invoke-Command, I get the
"Access to the path '\\192.168.137.1\share\config.xml' is denied"
exception, which is thrown when executing the Load method.
The file is accessible to everyone with read and write permissions.
Both the machine on which the script should run (.181) and the machine from which it is physically invoked have the same credentials, thus I do not pass them to the Invoke-Command cmdlet.
The share machine (.1) has different credentials, but this was never an issue when calling the script locally from .181.
Can you please point me in the right direction? I'm stuck on this step and can't find a solution by myself. I tried downloading the XML as a string using the WebClient.DownloadString method and passing credentials for the share machine, but it did not help.
Thanks in advance
It is probably a double-hop issue. You have to use CredSSP to delegate your credentials to the remote computer.
Try the solution mentioned here: http://blogs.msdn.com/b/clustering/archive/2009/06/25/9803001.aspx
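A hedged sketch of the CredSSP setup that article describes, using the machine names from the question (run the first command on the calling machine and the second on .181):

# On the calling machine: allow delegating your credentials to the target computer.
Enable-WSManCredSSP -Role Client -DelegateComputer 192.168.137.181 -Force

# On the remote machine (192.168.137.181): allow it to accept delegated credentials.
Enable-WSManCredSSP -Role Server -Force

# Then pass explicit credentials and request CredSSP authentication.
Invoke-Command -ComputerName 192.168.137.181 -Authentication Credssp -Credential (Get-Credential) -FilePath c:\scripts\script.ps1 -ArgumentList \\192.168.137.1\share\config.xml

CredSSP needs to be enabled on both sides, and should be limited to the machines that actually need it.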