Robocopy commands to copy a file to over 50 remote machines

I started looking at Robocopy yesterday to try to copy and overwrite a file from one source to many remote computers. I've tried Robocopy to copy files to a remote machine, but it doesn't work; I get the same error as the person in the link. Does anybody have any suggestions, or can you point me in the right direction? Thank you!

You could just use PowerShell for this. It has an inefficiency in that it copies to one machine at a time, but that shouldn't be an issue for 50-ish machines. Something like this could help if you made a PowerShell script:
# Read the list of target computers (one name per line)
$computers = Get-Content "C:\filewithcomputers.txt"
$fileToCopy = "C:\filetocopy.txt"
ForEach ($computer in $computers) {
    # Copy to the Temp folder via the administrative C$ share
    Copy-Item -Path $fileToCopy -Destination "\\$computer\C`$\Temp"
}
This would copy the file $fileToCopy to each server listed in C:\filewithcomputers.txt, assuming that file contains a list of computer names with each one on its own line. The file would be copied to the Temp folder on each machine; update the paths as required for your scenario. I only suggest this since you tagged powershell-remoting. If you are not adept with PowerShell, maybe someone else can give you an answer closer to what you are looking for. Using Robocopy for one file seems tedious.
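For reference, C:\filewithcomputers.txt would just contain one machine name per line, e.g. (hypothetical names):
server01
server02
server03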
If you want to check whether a folder exists and is accessible, you could do something like this:
$computers = Get-Content "C:\filewithcomputers.txt"
$fileToCopy = "C:\filetocopy.txt"
ForEach ($computer in $computers) {
    $destinationx86 = "\\$computer\C`$\Program Files (x86)"
    $destination = "\\$computer\C`$\Program Files"
    If (Test-Path $destinationx86) {
        # Copy this to Program Files (x86)
        Copy-Item -Path $fileToCopy -Destination $destinationx86
    } Else {
        # Copy this to Program Files
        Copy-Item -Path $fileToCopy -Destination $destination
    }
}

If you need to connect with different credentials, you can use:
$credential = Get-Credential
New-PSDrive -Name "Computer01" -PSProvider FileSystem -Root "\\Computer01\Share" -Credential $credential -Scope global
Now you can copy to, e.g., Computer01:\Folder01\.
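For example (a sketch reusing the $fileToCopy variable from above), you could then copy through the mapped drive and remove the mapping when done:
Copy-Item -Path $fileToCopy -Destination "Computer01:\Folder01\"
Remove-PSDrive -Name "Computer01"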

If you have set your environment up to support PSRemoting and have placed the file on a file share, you can use PowerShell Remoting to instruct many computers to retrieve the file themselves nearly simultaneously with Invoke-Command. You can limit the number of simultaneous actions using -ThrottleLimit, depending on the size of the source file and how robust the network and server are:
$computers = Get-Content "C:\filewithcomputers.txt"
$originalsource = "\\fileserver\shared\payload.exe"
$originaldestination = "c:\"
$scriptblockcontent = {
    param($source, $destination)
    Copy-Item -Path $source -Destination $destination
}
Invoke-Command -ComputerName $computers -ScriptBlock $scriptblockcontent `
    -ThrottleLimit 50 -ArgumentList $originalsource, $originaldestination
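If you are not sure remoting is set up, a quick sanity check could look like this (a sketch; Enable-PSRemoting must be run elevated on each target):
# On each target machine (elevated), enable remoting once:
Enable-PSRemoting -Force
# From the admin machine, verify a target listener responds:
Test-WSMan -ComputerName server01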

Powershell script searching files on domain

Very new to PowerShell and AD, so apologies if this post has an obvious answer. I have done some research and I am still not finding the answers I am looking for. My script is below for reference.
I have created a simple PowerShell script that will run on an admin VM I have set up on my domain. I have a separate SQL VM running a backup job that consumes a lot of storage over time. I am trying to run this very simple cleanup script. My question is: do I need to modify this script in order to store it on my admin VM but have it run on my SQL VM? Or can I leave the path as is and just set it up in Task Scheduler? I have tried targeting the FQDN and the IP, but it doesn't seem to be working either way.
$backups_file = 'E:\blahBlahBla\SQL\Backups' or
$backups_file = '<IP_ADDRESS>\E:\blahBlahBla\SQL\Backups' or
$backups_file = '<FQDN>E:\blahBlahBla\SQL\Backups'
$backup_file_exist = (Test-Path -Path $backups_file)
if ($backup_file_exist) {
    # Verifies the folder exists
    Write-Output -InputObject "This folder exists"
    # Returns all the files in the folder
    Get-ChildItem -Path $backups_file
    # Deletes all files in the folder that are older than 7 days
    Get-ChildItem -Path $backups_file -Recurse |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
        Remove-Item
}
else {
    Write-Output -InputObject "Unable to access this directory."
}
Thanks.
Well, all of your $backups_file attempts look wrong to me.
If you want to access a directory on a remote system, it has to be at least a file share or an administrative share like \\computer\e$\folder\folder\.
But why use file shares or something like that when you can simply connect to a PowerShell session on the remote host? Here is an example:
$mySQLServer = "Server1.domain.name", "server2.domain.name"
$backupFolder = "E:\blahBlahBla\SQL\Backups"
foreach ($server in $mySQLServer)
{
    $session = New-PSSession -ComputerName $server #maybe -Credential if needed
    Invoke-Command -Session $session -ArgumentList $backupFolder -ScriptBlock {
        param(
            $directory
        )
        if (Test-Path -Path $directory)
        {
            # Verifies the folder exists
            Write-Output -InputObject "This folder exists"
            # Returns all the files in the folder
            Get-ChildItem -Path $directory
            # Deletes all files in the folder that are older than 7 days
            Get-ChildItem -Path $directory -Recurse |
                Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
                Remove-Item
        }
    }
    Remove-PSSession -Session $session
}
Good Luck!

Copy one particular file from multiple remote servers and copy the same to the C drive of each server as a backup

I need to take a backup of an XML file from multiple remote servers, and that backup must be copied to the C drive of each remote server. I was trying to do this with something like the code below.
Can anyone help me correct this, please?
ForEach ($Computer in Get-Content C:\servers.txt)
{
    $file = Get-ChildItem "c:\clusterstorage\*.xml*" -Recurse
    Invoke-Command -AsJob -ComputerName $Computer -ScriptBlock {Invoke-Expression -Command "Copy-Item -Path '\\$Computer\$file' -Destination '\\$Computer\C:\'"}
}
You can use Copy-Item directly inside the loop; you do not need Invoke-Expression in this context. Also note that $Computer and $file from the calling scope are not visible inside the remote script block unless you pass them in (for example with $Using:).
It should work then.
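A minimal sketch of what that could look like, assuming the intent is for each server to back up its own XML files to its own C drive (the script block runs entirely on the remote machine, so no variables need to be passed in):
ForEach ($Computer in Get-Content C:\servers.txt)
{
    Invoke-Command -AsJob -ComputerName $Computer -ScriptBlock {
        # Runs on the remote server: find its XML files and copy them
        # to its own C:\ drive as a backup
        Get-ChildItem 'C:\ClusterStorage\*.xml' -Recurse |
            Copy-Item -Destination 'C:\' -Force
    }
}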

Copy files to remote computer

As part of a project requirement, I am preparing a script to copy files from the local computer to remote servers (with username and password).
I have tried the ways below with files of 27 KB and 50 MB in size.
i. Using ReadAllBytes and WriteAllBytes.
This works for the small 27 KB file, whereas the 50 MB file drives CPU to 100% and takes far too much time:
$myfile = [System.IO.File]::ReadAllBytes("C:\Temp\test\a.txt")
$Stat = $null
$session=$null
$session = New-PSSession -computerName $server -credential $user
$Stat = Invoke-Command -Session $session -ArgumentList $myfile -Scriptblock {[System.IO.File]::WriteAllBytes("C:\temp\a.txt", $args)} -ErrorAction Stop
ii. Copying with Copy-Item, but the issue is that the target directory is not a mounted share:
$Stat = Invoke-Command -ComputerName $server -ScriptBlock { Copy-Item -Path "C:\Temp\test\a.txt" -Destination "C:\temp\a.txt" -Recurse -Force -PassThru -Verbose } -Credential $user
Stuck with both approaches; please suggest any other way to achieve this without mounting the target folder.
Copy-Item -Path "C:\Temp\test\a.txt" -Destination "\\$($server)\c$\temp\a.txt"
Use the built-in administrative drive shares (C$, D$, and so on) to copy it over; you may need to provide credentials for this.
You might find this helper function useful for building the remote path correctly:
Function Get-RemotePath($Server, $Path) {
    # Convert a local path (C:\Temp\File.txt) to an admin-share UNC path
    "\\$($Server)\$($Path -replace ':','$')"
}
Get-RemotePath -Server "SERVER01" -Path "C:\Temp\File.txt"
# Output: \\SERVER01\C$\Temp\File.txt
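If credentials are needed, one option (a sketch, reusing the $user credential object from the question) is to map the admin share as a temporary PSDrive first:
# Map the remote admin share with explicit credentials
New-PSDrive -Name "Remote" -PSProvider FileSystem -Root "\\SERVER01\C$\Temp" -Credential $user | Out-Null
Copy-Item -Path "C:\Temp\test\a.txt" -Destination "Remote:\a.txt"
Remove-PSDrive -Name "Remote"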
Why not use WMI to copy the file instead?
It can be asynchronous and is very efficient.
I have a post here which explains it.
Powershell - Copying File to Remote Host and Executing Install exe using WMI
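To sketch the idea (an assumption about how the linked post approaches it; server name and paths are placeholders), WMI's Win32_Process class can launch a copy command on the remote host:
# Launch a copy command remotely via WMI; returns immediately with a process ID
Invoke-WmiMethod -ComputerName $server -Class Win32_Process -Name Create `
    -ArgumentList 'cmd.exe /c copy "\\fileserver\share\a.txt" C:\Temp\a.txt'
Be aware that the remote process cannot reuse your credentials against the file share (the classic double-hop restriction), so the share permissions need to allow it.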

Powershell Remote file copy is not working

I am trying to copy files remotely to several IIS servers from one source server.
$sourcepath is a UNC path like \\server\c$\path\ and $destinationpath is a local folder, c:\data\destination\.
The strange thing is that when I run this on the local server, it works perfectly.
$cmd = $sqlConnection.CreateCommand()
$cmd.CommandText = "SELECT * from infrastructure"
$Serverinfo = $cmd.ExecuteReader()
try
{
    while ($Serverinfo.Read())
    {
        $servername = $Serverinfo.GetValue(1)
        $location = $Serverinfo.GetValue(2)
        #Invoke-Command -ComputerName $servername { new-item -Path $Using:destinationpath -name $Using:versionnumber -Itemtype directory }
        Invoke-Command -ComputerName $servername { Copy-Item -Path $Using:sourcepath -Destination $Using:destinationpath }
        #Invoke-Command -ComputerName $servername { Import-Module WebAdministration ; New-WebApplication -force -Site "Default Web Site" -Name $Using:versionnumber -PhysicalPath $Using:destinationpath$Using:versionnumber }
    }
}
What you have posted is incomplete: it has a try without a catch block, and $sourcepath and $destinationpath are never defined, as others have said.
But even if you address all three of those issues, there is another likely cause.
You will face issues with credential delegation (the double-hop problem): you are remoting to one server using PSRP and copying a file onto that machine, while your source file comes from a UNC path that requires its own authentication, and the remote session cannot forward your credentials to that share.
I can give you two better alternatives; the first one is probably the proper solution.
If you are on PS 5.0 or later, you can use the -ToSession parameter of the Copy-Item cmdlet, which copies through the existing session rather than making a second network hop:
$Session = New-PSSession -ComputerName $ServerName
Copy-Item -Path \\Server\c$\path -Destination C:\data\Destination -ToSession $Session
For more info, see Get-Help Copy-Item.
Edit:
The second one:
Copy-Item -Path <Sourcepath> -Destination \\destinationserver\c$\folder
This copies from the source to the destination by accessing the shared (administrative) folder of the destination system, so no remote session is involved.
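For instance, with the paths from the question (a sketch; the IIS server name is a placeholder):
Copy-Item -Path \\server\c$\path\* -Destination \\iisserver01\c$\data\destination\ -Recurse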

Call VBScript With Return Value on Remote Machine With Powershell

I need to call a remote VB script from Powershell, and the VB script needs to run on the remote machine.
I have been using ([wmiclass]"\\$computer\root\cimv2:Win32_Process").Create("C:\test.vbs")
This works; however, I can't get a return value from the script, just a return value from the Win32_Process Create call.
I would convert the whole thing to PowerShell, but I can't, as I'm connecting to a legacy domain on which I can't install additional tools, so I have to call the remote VBScript.
This is an old question, but I would like to share my solution. It's the same as the one posted by Ansgar, but it's been tested and works fine:
$VNC = '\\share\software\AppName\_Install_Silent.vbs'
$Computer = 'RemoteHost'
$TMP = "\\$Computer\c$\TEMP"
if (!(Test-Path $TMP)) {
    New-Item -Path $TMP -ItemType Directory
}
# Copy the whole script folder to the remote machine first
Copy-Item -LiteralPath (Split-Path $VNC -Parent) -Destination $TMP -Container -Recurse -Force -Verbose
# Rebuild the script's path as seen locally from the remote machine
$LocalPath = Join-Path 'C:\TEMP' (Join-Path (Split-Path $VNC -Parent | Split-Path -Leaf) (Split-Path $VNC -Leaf))
Invoke-Command -ScriptBlock {cscript.exe $Using:LocalPath} -ComputerName $Computer
# A restart of the remote host might be needed
The difference is that you have to copy the files to the remote machine first to avoid the double-hop issue, and then you can run the install with the $Using: variable.
Hope this helps someone.
I'd probably try either remote invocation:
Invoke-Command -ScriptBlock { cscript.exe "C:\test.vbs" } -Computer $computer
or PsExec:
PsExec \\$computer cscript.exe "C:\test.vbs"
Can't test either of them right now, though.
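Not part of the original answers, but to actually capture a return value from the VBScript over remoting, one sketch is to return both the script's echoed output and its exit code (the value passed to WScript.Quit) from the session:
$result = Invoke-Command -ComputerName $computer -ScriptBlock {
    # Capture whatever the script prints via WScript.Echo
    $output = cscript.exe //NoLogo "C:\test.vbs"
    # $LASTEXITCODE holds the value passed to WScript.Quit
    [pscustomobject]@{ Output = $output; ExitCode = $LASTEXITCODE }
}
$result.Output
$result.ExitCode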