How do I issue a remote copy command? - powershell

The script I'm working on right now performs a bunch of work on the email server for processing litigation hold requests. One component of this script copies all the data from an identified folder on a file server to a secure, unalterable backup location that happens to reside on the same remote server. I have the script working fine, but it copies all the files across the network from the file server to the email server where the PowerShell script runs, and then pushes them right back to the "hold" folder on the same file server they came from. This is creating severe delays, as some of these folders contain several thousand files ranging from a few bytes to several megabytes each.
I'm looking for a way for my PowerShell script running on the email server to issue a remote command to the file server to copy a folder from one directory to another (all on the same remote server). Any suggestions?
Currently, I'm using the robocopy command in my script but I'm open to other ways of performing this.
robocopy "Source Folder" "Destination Folder" /e
I'm looking for something like:
PSEXEC \\FileServer -s -d Copy "Source folder" to "Destination folder"
I just can't seem to find any material on the old google train that satisfies the situation.

psexec \\fileserver -s robocopy "source" "destination" /e
should work just fine. With PSRemoting enabled on the fileserver you could also do
Invoke-Command -ComputerName fileserver -ScriptBlock {
    & robocopy "source" "destination" /e
}
Note that this assumes local paths on the remote server. If you're working with network shares on the remote host you'll be running into the second hop problem.
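If you do need to reach a different network share from inside the remote session, one common workaround for the second hop is to pass explicit credentials into the session and map the share there. A minimal sketch — the server names, share path, and credential are all placeholders, not from the question:

```powershell
# Sketch: avoid the second hop by authenticating explicitly
# inside the remote session. All names below are placeholders.
$cred = Get-Credential  # an account with rights on \\otherserver\share

Invoke-Command -ComputerName fileserver -ScriptBlock {
    param($c)
    # Map the share with fresh, explicit credentials inside the session
    New-PSDrive -Name X -PSProvider FileSystem -Root '\\otherserver\share' -Credential $c | Out-Null
    & robocopy 'X:\source' 'D:\destination' /e
    Remove-PSDrive -Name X
} -ArgumentList $cred
```

Because New-PSDrive authenticates with the supplied credential rather than the delegated session token, the share access doesn't rely on credential delegation.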

For some strange reason, my first attempt to use PSEXEC prior to posting resulted in an error stating it didn't know how to handle the command. But I plugged in the PSEXEC command again today and it worked perfectly.
Thank you.

Related

Copy file from a Network Drive to a local Drive with a Jenkins Agent

So here is the situation: I am trying to automate the copy of some files that are in a network drive into a local folder on one of my servers. The task seems simple, and when I try the code with PowerShell or with xcopy at the command line, both work pretty great.
I've installed a Jenkins agent on this Windows Server 2016 server and run the agent as a service. When I try to run the same code from the Jenkins agent, it never works.
I tried starting the agent service as local system and as the Windows network administrator who has all the rights.
I tried with PowerShell those lines :
Copy-Item -Path "\\server IP\directory\*" -Destination "D:\Directory\" -Verbose
and
Copy-Item -Path "z:\*" -Destination "D:\Directory\" -Verbose
Both return no error but did not copy the files, and when I tried the same code with xcopy I just receive "file not found" and the file was not copied.
xcopy "\\server IP\directory\*" "D:\Directory\" /f /s /h /y
xcopy "z:\*" "D:\Directory\" /f /s /h /y
With PowerShell, I also tried putting the Copy-Item command into a script and only calling the script from the Jenkins agent, and it also didn't work.
I am now running in circles and wonder how we are supposed to work with network drives from the Jenkins agent, or what I am doing wrong.
Note that other PowerShell code works great locally.
I tried starting the agent service as local system and as the Windows network administrator who has all the rights
Local system doesn't have any network permissions by default. This is the machine account, so you would have to give the machine access to "\\server\share". It is not advisable though, because permissions should be granted at a finer granularity. Also, local system has too many local rights, which Jenkins doesn't need.
I don't know what you mean by "Windows Network Administrator". It sounds like this one would also have too many rights.
Usually, one creates a dedicated Jenkins (domain) user. This user will be granted access to any network paths it needs. Then you have two options:
1. Always run the Jenkins service as this user (easiest way).
2. Run Jenkins under another local user and connect to network drives as the domain user only on demand: net use \\server\share /user:YourDomain\Jenkins <Password>. This adds some security, as you don't need to give any local permissions to the domain user.
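The on-demand variant can be scripted inside the Jenkins job itself. A sketch — the share path and domain user are the placeholders from above, and reading the password from an environment variable (e.g. a Jenkins credential binding) is my assumption, not something from the question:

```powershell
# Connect to the share on demand, copy, then always disconnect.
# \\server\share and YourDomain\Jenkins are placeholders; the password
# is assumed to come from a Jenkins credential binding, not hard-coded.
net use \\server\share /user:YourDomain\Jenkins $env:JENKINS_SHARE_PW
try {
    Copy-Item -Path '\\server\share\directory\*' -Destination 'D:\Directory\' -Verbose
}
finally {
    net use \\server\share /delete
}
```

The try/finally ensures the connection is removed even if the copy fails, so stale mappings don't accumulate on the agent.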
Both return no error but did not copy the files
To improve error reporting, I suggest you set $ErrorActionPreference = 'Stop' at the beginning of your PowerShell scripts. This way the script will stop execution and show an error as soon as the first error happens. I usually wrap my PS scripts in a try/catch block to also show a script stack trace, which makes it much easier to pinpoint error locations:
$ErrorActionPreference = 'Stop' # Make the script stop at the 1st error
try {
    Copy-Item -Path "\\server IP\directory\*" -Destination "D:\Directory\" -Verbose
    # Possibly more commands...

    # Indicate success to Jenkins
    exit 0
}
catch {
    Write-Host "SCRIPT ERROR: $($_.Exception)"
    if( $_.ScriptStackTrace ) {
        Write-Host ( "`t$($_.ScriptStackTrace)" -replace '\r?\n', "`n`t" )
    }

    # Indicate failure to Jenkins
    exit 1
}
In this case the stack trace is not very helpful, but it will come in handy when you call functions from other scripts, which in turn may call other scripts too.

Trying to code a PowerShell script that uploads a folder and overwrites another folder

So I'm basically new to PowerShell coding and I don't really know if I'm doing this right - just started a few days ago.
My main goal is to send a folder to an FTP server and overwrite the folder that's already on the server, due to a daily upload. Sorry, I'm a bit of a newbie and would like a few recommendations on whether I'm doing this well or not. Thank you.
I tried to code something using PowerShell and a batch file.
And it actually works, but only locally on my PC (it copies the folder and pastes it into the wanted folder). It just won't work when trying to do this on a remote FTP server that's saved on my PC.
script:
[string]$sourceDirectory = "C:\test\*"
[string]$destinationDirectory = "C:\Users\c0ld\Desktop/receive "
Copy-item -Force -Recurse -Verbose $sourceDirectory -Destination $destinationDirectory
exit
Bat script:
@ECHO OFF
PowerShell.exe -Command "& 'C:/test.ps1'"
PAUSE
So this works when working on actual local folder, but when trying to do this on a network FTP folder it's actually not working.
First, make sure you have the correct permissions to access the server.
Then remember (this got me a couple of times) that instead of C:\path, you would use the admin share to access a remote PC, which is \\remotecomputer\c$\path.
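Note that Copy-Item only works with file-system paths (local or UNC); it cannot talk to an actual FTP server. If the target really is FTP, one option is System.Net.WebClient. A sketch — the host name, credentials, and remote folder are all placeholders, and the remote folder is assumed to already exist:

```powershell
# Sketch: upload every file in a local folder to an FTP server.
# ftp.example.com, user, password and /receive are placeholders.
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential('user', 'password')

Get-ChildItem 'C:\test' -File | ForEach-Object {
    $uri = "ftp://ftp.example.com/receive/$($_.Name)"
    $client.UploadFile($uri, $_.FullName)   # STOR overwrites an existing file
}
$client.Dispose()
```

For subfolders you would need to recurse and issue MKD requests yourself (or use a library such as WinSCP's .NET assembly), since WebClient only transfers single files.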

Windows - copy file from machine A to machine B through Powershell executed from machine B on machine A

Please note: security is not an issue at all with this application.
I have a "MachineA" that has a file we will call file.txt. I need to get this file onto a "MachineB". Here's the catch, this needs to be done by remotely sending an Invoke-Command script from MachineB to MachineA to kick back the file.
What I have tried so far:
I have setup a shared drive (using net use) on MachineB that MachineA can access. This drive is known as z:\ to MachineA.
I can send this command via Invoke-Command from MachineB to MachineA and it will transfer a file from one directory to another directory within MachineA
xcopy c:\users\administrator\desktop\file.txt c:\file.txt
Now if I send the following command in the same manner from MachineB to MachineA it will complain about not being able to see drive Z.
xcopy c:\users\administrator\desktop\file.txt z:\FilesFromVMs\file.txt
Note: if I run the second command directly on MachineA from PowerShell, it will move the file, but I need to be able to do this remotely via Invoke-Command.
If security is not an issue, maybe try using UNC paths (\\MachineA\C$\Users...\) instead of relying on mapped drive letters? Perhaps that'll do the trick.
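Putting that together, the remote command could write straight to MachineB's admin share instead of the mapped Z: drive. A sketch using the machine names from the question — the destination folder is a placeholder, and depending on how authentication is delegated this can still run into the second-hop problem:

```powershell
# Run on MachineB; MachineA pushes the file back over the UNC path
# instead of the Z: mapping (which only exists in an interactive session).
Invoke-Command -ComputerName MachineA -ScriptBlock {
    xcopy c:\users\administrator\desktop\file.txt \\MachineB\c$\FilesFromVMs\ /y
}
```

The mapped drive fails because drive letters are per-logon-session: the Z: mapping made interactively on MachineA doesn't exist in the session Invoke-Command creates.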

Bat Backup Script

What I am trying to do is back up a user profile from their local workstation to our backup servers and send me an email once it's complete. I currently have this in two different scripts. It would be nice if we could make this one script; if I need two scripts, that won't be a problem.
The first script is the backup, and it has been working just fine.
robocopy C:\Users\TravisWhiteman.ArchwaySys\AppData \\10.1.10.6\WorkstationBackup\Test\AppData /mir /W:3 /R:1 /log:CopylogAppData.txt
robocopy C:\Users\TravisWhiteman.ArchwaySys\Desktop \\10.1.10.6\WorkstationBackup\Test\Desktop /mir /W:3 /R:1 /log:CopylogDesktop.txt
robocopy C:\Users\TravisWhiteman.ArchwaySys\Documents \\10.1.10.6\WorkstationBackup\Test\Documents /mir /W:3 /R:1 /log:CopylogDocuments.txt
robocopy C:\Users\TravisWhiteman.ArchwaySys\Downloads \\10.1.10.6\WorkstationBackup\Test\Downloads /mir /W:3 /R:1 /log:CopylogDownloads.txt
Now I want to add a few features, and I don't know how. I want to change it from manually setting the user profile directory to having the system automatically find out who the user is. I think it's something like %USERNAME%. The goal of having the system figure out the user is so I don't have to change C:\Users\TravisWhiteman.ArchwaySys for every workstation. All of our workstations turn on automatically 10 minutes before the scheduled backup task, in case a user were to shut off their computer.
Basically, what you need is the profile path of the currently logged on user for a list of remote computers.
Steps for each computer:
Get the currently logged on user's login name (here is the method I currently use)
Get the SID for this user - let's say $userSID (a method is described here)
Browse this registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList\$userSID on the remote computer, and read the value ProfileImagePath, it holds the local profile path for this user on this computer (example of remote registry access)
Convert the local path to a network path (C:\Users\... -> \\computerName\c$\Users)
Call robocopy and get some coffee.
One could simply go for \\computer\c$\Users\$userLogin, but as the OP's example demonstrates, Windows sometimes appends your domain name to your user name in your local profile folder name, in quite an unpredictable fashion.
(the Remote Registry service must be running on the remote computers)
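The steps above can be sketched like this — the computer name is a placeholder, it assumes Windows PowerShell with WMI available, and as noted it needs the Remote Registry service running on the target:

```powershell
# Sketch: resolve the logged-on user's profile path on a remote computer.
$computer = 'WORKSTATION01'   # placeholder

# 1. Currently logged-on user, as DOMAIN\login
$login = (Get-WmiObject Win32_ComputerSystem -ComputerName $computer).UserName

# 2. Translate the login to a SID
$domain, $user = $login -split '\\'
$nt = New-Object System.Security.Principal.NTAccount($domain, $user)
$userSID = $nt.Translate([System.Security.Principal.SecurityIdentifier]).Value

# 3. Read ProfileImagePath from the remote registry
$reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $computer)
$key = $reg.OpenSubKey("SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList\$userSID")
$localProfile = $key.GetValue('ProfileImagePath')   # e.g. C:\Users\TravisWhiteman.ArchwaySys

# 4. Convert the local path to a network path for robocopy
$networkProfile = "\\$computer\$($localProfile.Replace(':', '$'))"
```

$networkProfile can then be fed to the existing robocopy lines as the source, with the subfolder (AppData, Desktop, ...) appended.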
If the workstation was shut down and then woken up, I'd target the last modified folder in C:\Users.

How to force the copy of a file using powershell

I'm using PowerShell to copy a file to a remote computer with the following command:
Copy-Item -Path [MyPath]\* -Destination \\[server]\[MyPath] -force
It's working great, but sometime I'm receiving the following error message: "The process cannot access the file [...] because it is being used by another process.".
Is it possible to force the file to be copy even if it's in use?
The only way to do that is to get rid of the handle the process has to the file you are overwriting.
It's either a service or a desktop application accessing the file. You can find out what has access to the file using handle.exe from SysInternals.
Once you know what is accessing the file you can stop/kill it remotely (assuming you have permissions to do so).
Stop a remote service
Stop a remote process (Invoke-Command), Stop a remote process (WMI)
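A sketch of the Invoke-Command variant, assuming handle.exe has already told you which process holds the file — the server name, process name, and paths are placeholders, not from the question:

```powershell
# Sketch: stop the process holding the file on the remote server,
# then retry the copy. 'server' and 'SomeApp' are placeholders taken
# from what handle.exe reported.
Invoke-Command -ComputerName server -ScriptBlock {
    Stop-Process -Name SomeApp -Force -ErrorAction SilentlyContinue
}
Copy-Item -Path 'C:\MyPath\*' -Destination '\\server\MyPath' -Force
```

If the holder is a service rather than an application, use Get-Service -ComputerName / Stop-Service instead, and consider restarting it after the copy completes.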