As part of a project requirement, I am preparing a script to copy files from a local computer to remote servers (with username and password).
I have tried the approaches below with files of 27 KB and 50 MB.
i. Using ReadAllBytes and WriteAllBytes
This works for the small 27 KB file, but for the 50 MB file it pegs the CPU at 100% and takes far too long:
$myfile = [System.IO.File]::ReadAllBytes("C:\Temp\test\a.txt")
$Stat = $null
$session=$null
$session = New-PSSession -computerName $server -credential $user
$Stat = Invoke-Command -Session $session -ArgumentList $myfile -Scriptblock {[System.IO.File]::WriteAllBytes("C:\temp\a.txt", $args)} -ErrorAction Stop
ii. I tried to copy with Copy-Item, but the issue is that the target directory is not mounted as a share:
$Stat = Invoke-Command -ComputerName $server -ScriptBlock { Copy-Item -Path "C:\Temp\test\a.txt" -Destination "C:\temp\a.txt" -Recurse -Force -PassThru -Verbose } -Credential $user
I'm stuck with both approaches; please suggest another way to achieve this without mounting the target folder.
Copy-Item -Path "C:\Temp\test\a.txt" -Dest "\\$($server)\c$\temp\a.txt"
Use the built-in administrative drive shares to copy it over; you may need to provide credentials for this.
You might find this helper function useful for building the remote path correctly:
Function Get-RemotePath($Server,$Path){
"\\$($Server)\$($Path -replace ':','$')"
}
Get-RemotePath -Server "SERVER01" -Path "C:\Temp\File.txt"
\\SERVER01\C$\Temp\File.txt
Why not use WMI to copy the file instead?
It can be asynchronous and is very efficient.
I have a post here which explains it.
Powershell - Copying File to Remote Host and Executing Install exe using WMI
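As a rough sketch of that approach (hedged: `$server`, `$cred`, and the paths are placeholders, and the context the remote command runs under must be able to reach both paths), WMI's Win32_Process class can launch the copy on the remote host:

```powershell
# Sketch: start a copy asynchronously on the remote host via WMI.
# Win32_Process.Create returns immediately with a process ID; it does
# not wait for the copy to finish. $server and $cred are placeholders.
$copyCmd = 'cmd.exe /c copy C:\Temp\test\a.txt C:\temp\a.txt'
$result = Invoke-WmiMethod -ComputerName $server -Credential $cred `
    -Class Win32_Process -Name Create -ArgumentList $copyCmd
if ($result.ReturnValue -eq 0) {
    Write-Host "Copy started as remote process $($result.ProcessId)"
}
```

See the linked post for the full details of the technique.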
I am trying to install ActivClient remotely onto some machines, I copy the file to the Public Downloads separately.
I am running into an issue where when I try to run this script, it runs locally on my machine. I need to run that Deploy-Application.PS1 file for this.
I also can't figure out how to unblock the entire folder; there are a few subfolders and files I would like to unblock.
I can't remote into the computer through RDP because I need ActivClient installed to remote in. Remote PowerShell is the only method I can find to run this.
If I am missing information or need to provide more, please let me know.
$mypath = $MyInvocation.MyCommand.Path
$Path = Split-Path $mypath -Parent
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic')
$title = 'Computer Name'
$msg = 'Enter the Computer Name you need to reinstall ActivClient on:'
$CName = [Microsoft.VisualBasic.Interaction]::InputBox($msg, $title)
if (Test-Connection -ComputerName $CName -Count 1){
Write-Host "Entering Remote Powershell Session"
$PSSession = New-PSSession -Name $CName
Invoke-Command -Session $PSSession -ScriptBlock{
Get-ChildItem "C:\Users\Public\Downloads\SDC NIPR - ActivClient v7.2.x - 210811_18NOV" | Unblock-File
cd "C:\Users\Public\Downloads\SDC NIPR - ActivClient v7.2.x - 210811_18NOV"
.\Deploy-Application.ps1 Install 'NonInteractive'
pause
}
}
Use New-PSSession -ComputerName $CName, not New-PSSession -Name $CName.
-Name only gives the session being created a friendly name; it is unrelated to which computer you target. In the absence of a -ComputerName argument, the local machine is (somewhat uselessly) targeted, which is why the script ran on your local machine.
Note that if you're only calling Invoke-Command remotely once per remote computer, you needn't create a session explicitly and can instead use -ComputerName $CName directly with Invoke-Command:
# No need for New-PSSession.
Invoke-Command -ComputerName $CName -ScriptBlock { ... }
Add -Recurse -File to your Get-ChildItem call in order to unblock all files in the target folder recursively (all files in the target folder's subtree).
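With the path from the question, the unblock call would then become:

```powershell
# -Recurse -File unblocks every file in the folder's subtree,
# not just the top-level entries.
Get-ChildItem "C:\Users\Public\Downloads\SDC NIPR - ActivClient v7.2.x - 210811_18NOV" -Recurse -File |
    Unblock-File
```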
I'm having a heck of a time running a script on a remote system efficiently. When run locally the command takes 20 seconds. When run using Invoke-Command the command takes 10 or 15 minutes - even when the "remote" computer is my local machine.
Can someone explain to me the difference between these two commands and why Invoke-Command takes SO much longer?
Run locally on MACHINE:
get-childitem C:\ -Filter *.pst -Recurse -Force -ErrorAction SilentlyContinue
Run remotely against \\MACHINE (behaves the same whether MACHINE is my local machine or a remote machine):
invoke-command -ComputerName MACHINE -ScriptBlock {get-childitem C:\ -Filter *.pst -Recurse -Force -ErrorAction SilentlyContinue}
Note: The command returns 5 file objects
EDIT: I think part of the problem may be reparse points. When run locally, get-childitem (and DIR /a-l) do not follow junction points. When run remotely they do, even if I use the -Attributes !ReparsePoint switch.
EDIT2: Yet, if I run the command invoke-command -ComputerName MACHINE -ScriptBlock {get-childitem C:\ -Attributes !ReparsePoint -Force -ErrorAction SilentlyContinue} I don't see the junction points (i.e. Documents and Settings). So it is clear that neither DIR /a-l nor get-childitem -Attributes !ReparsePoint prevents recursing into a reparse point; it appears those switches only filter the entry itself from the output.
Thanks a bunch!
It appears the issue is the Reparse Points. For some reason, access is denied to reparse points (like Documents and Settings) when the command is run locally. Once the command is run remotely, both DIR and Get-ChildItem will recurse into reparse points.
Using the -Attributes !ReparsePoint for get-childitem and the /a-l switch for DIR does not prevent this. Instead, it appears those switches only prevent the reparse point from appearing in the file listing output, but it does not prevent the command from recursing into those folders.
Instead I had to write a recursive script and do the directory recursion myself. It's a little bit slower on my machine. Instead of around 20 seconds locally, it took about 1 minute. Remotely it took closer to 2 minutes.
Here is the code I used:
EDIT: With all the problems with PowerShell 2.0, PowerShell remoting, and memory usage of the original code, I had to update my code as shown below.
function RecurseFolder($path) {
$files=@()
$directory = @(get-childitem $path -Force -ErrorAction SilentlyContinue | Select FullName,Attributes | Where-Object {$_.Attributes -like "*directory*" -and $_.Attributes -notlike "*reparsepoint*"})
foreach ($folder in $directory) { $files+=@(RecurseFolder $folder.FullName) }
$files+=@(get-childitem $path -Filter "*.pst" -Force -ErrorAction SilentlyContinue | Where-Object {$_.Attributes -notlike "*directory*" -and $_.Attributes -notlike "*reparsepoint*"})
$files
}
If it's a large directory structure and you just need the full path names of the files, you should be able to speed that up considerably by using the legacy dir command instead of Get-ChildItem:
invoke-command -ComputerName MACHINE -ScriptBlock {cmd /c dir c:\*.pst /s /b /a-d /a-l}
Try using a remote session:
$yourLoginName = 'loginname'
$server = 'targetserver'
$t = New-PSSession $server -Authentication CredSSP -Credential (Get-Credential $yourLoginName)
cls
"$(get-date) - Start"
$r = Invoke-Command -Session $t -ScriptBlock{[System.IO.Directory]::EnumerateFiles('c:\','*.pst','AllDirectories')}
"$(get-date) - Finish"
I faced the same problem.
It appears that PowerShell has problems with the transfer of plain arrays through PSRemoting.
This little experiment demonstrates it (UPDATED):
$session = New-PSSession localhost
$arrayLen = 1024000
Measure-Command{
Invoke-Command -Session $session -ScriptBlock {
Write-Host ("Preparing test array {0} elements length..." -f $using:arrayLen)
$global:result = [Byte[]]::new($using:arrayLen)
[System.Random]::new().NextBytes($result)
}
} |% {Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds)}
Measure-Command{
Invoke-Command -Session $session -ScriptBlock {
Write-Host ("Transfer array ({0})" -f $using:arrayLen)
return $result
} | Out-Null
} |% {Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds)}
Measure-Command{
Invoke-Command -Session $session -ScriptBlock {
Write-Host ("Transfer same array nested in a single object")
return @{array = $result}
}
} |% {Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds)}
And my output (times in seconds):
Preparing test array 1024000 elements length...
Completed in 0.0211385 sec
Transfer array (1024000)
Completed in 48.0192142 sec
Transfer same array nested in a single object
Completed in 0.0990711 sec
As you can see, the bare array took nearly a minute to transfer.
The same data nested in a single object transferred in a fraction of a second, despite its size.
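A workaround consistent with the experiment above (a sketch, not timed here) is to suppress pipeline enumeration so the array crosses the session boundary as one object:

```powershell
Invoke-Command -Session $session -ScriptBlock {
    # The unary comma wraps the byte array in a one-element array, so it
    # is serialized as a single object instead of being streamed element
    # by element - the same effect as nesting it in a hashtable.
    ,$global:result
}
```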
I am writing a script that creates a new VM, connects to it via New-PSSession, and runs several commands to alter a few settings and copy folders from a UNC path to the local C:\ and C:\Program Files (x86).
Everything works fine except the copy part - I get an error saying "permission denied".
I run the script itself as domain admin, and the credentials I pass also have domain admin rights.
For example:
$source = '\\server\share'
$cred = Get-Credential
$pss = New-PSSession -ComputerName "$computername" -Credential $cred
Invoke-Command -Session $pss -ScriptBlock {
Copy-Item "$source\FOLDER" -Destination 'C:\FOLDER' -Recurse -Force -Credential $cred
}
Even if I pass the credentials, it fails. A quick search suggests "use robocopy", but in my opinion it must be possible to copy files from a UNC path to a local directory with PowerShell, even if that directory is basically protected by Microsoft.
I think you need to pass the variable to the remote session:
Invoke-Command -Session $pss -Args $source -ScriptBlock {Copy-Item "$($args[0])\FOLDER" ...}
Otherwise $source is resolved as a variable in the remote session, and since it doesn't exist there, PowerShell will try to copy from \FOLDER.
You can try to use the -ToSession parameter without entering the PSSession:
$ses = New-PSSession -ComputerName WorkStation
Copy-Item -ToSession $ses -Destination C:\users\TempUser\Documents\ -Path '\\10.0.0.1\share\' -Recurse
My solution for Copy-Item is:
-Destination $($(dir "env:programfiles(x86)").value + "\Avest\AvPCM_nces\")
I am trying to copy files remotely to several IIS servers from one source server.
sourcepath is a UNC path like \\server\c$\path\ and destinationpath is a local folder c:\data\destination\
The strange thing is that when I run this on the local server it works perfectly.
$cmd = $sqlConnection.CreateCommand()
$cmd.CommandText ="SELECT * from infrastructure"
$Serverinfo = $cmd.ExecuteReader()
try
{
while ($Serverinfo.Read())
{
$servername = $Serverinfo.GetValue(1)
$location = $Serverinfo.GetValue(2)
#Invoke-Command -ComputerName $servername { new-item -Path $Using:destinationpath -name $Using:versionnumber -Itemtype directory }
Invoke-Command -ComputerName $servername { Copy-Item -Path $Using:sourcepath -destination $Using:destinationpath }
#Invoke-Command -ComputerName $servername {Import-Module WebAdministration ; New-WebApplication -force -Site "Default Web Site" -Name $Using:versionnumber -PhysicalPath $Using:destinationpath$Using:versionnumber }
}
}
What you have posted is incomplete without a catch block, and without the source and destination paths defined, as others have said.
But here is a likely cause even once you address all three of those constraints:
You will face issues with credential delegation (the "double hop" problem). You are remoting to one server using PSRP and copying a file to that machine, while taking your source file from a UNC path which requires separate authentication.
I can give you two better alternatives; of course, the first one is the proper solution.
If you are on PS 5.0 or later, you can use the -ToSession parameter of the Copy-Item cmdlet:
$Session = New-PSSession -ComputerName $ServerName
Copy-Item -Path \\Server\c$\path -Destination C:\data\Destination -ToSession $Session
For more info, see Get-Help Copy-Item.
Edit:
The second one:
Copy-Item -Path <Sourcepath> -Destination \\destinationserver\c$\folder
This copies from the source to the destination by accessing the shared folder of the destination system.
I have a requirement to copy a file from a local machine to a remote machine using PowerShell. I can copy the file to the remote computer using the following command:
copy-item -Path d:\Shared\test.txt -Destination \\server1\Shared
The above command uses a network share path to copy the file. I don't want to use the network share option, as the folder will not be shared on the remote machine. I tried the following commands, but they are not working:
copy-item -Path d:\Shared\test.txt -Destination \\server1\c$\Shared
Invoke-Command -ComputerName \\server -ScriptBlock {
copy-item -Path D:\Shared\test.txt -Destination C:\Shared
}
Please let me know how to make it work without using a UNC path. I have full permissions on that folder on the remote machine.
The quickest way I found to do this, since the account being used is Administrator, is the following:
New-PSDrive -Name X -PSProvider FileSystem -Root \\MyRemoteServer\c$\My\Folder\Somewhere\
cd X:\
cp ~\Desktop\MyFile.txt .\
## Important: cd out of X:\ before removing the drive, to unmount the share
cd c:\
Remove-PSDrive X
Works every time.
You must have a shared folder to be able to copy files from one host to another, either on the remote host if you want to push the file:
Copy-Item -Path D:\folder\test.txt -Destination \\server1\remoteshare\
or on the local host if you want to pull the file:
Invoke-Command -ComputerName server1 -ScriptBlock {
Copy-Item -Path \\localcomputer\localshare\test.txt -Destination C:\Shared\
}
Administrative shares (\\server1\c$) can only be used if your account has admin privileges on that particular computer.
If there is not an accessible share, you'll have to make the file content itself an argument to the script:
Invoke-Command -ComputerName server1 -ScriptBlock {
$args[0] | Set-Content C:\Shared\test.txt
} -ArgumentList (Get-Content D:\Shared\test.txt -Raw)
PowerShell 5 (Windows Server 2016)
Also downloadable for earlier versions of Windows. -ToSession can also be used.
$b = New-PSSession B
Copy-Item -FromSession $b C:\Programs\temp\test.txt -Destination C:\Programs\temp\test.txt
Earlier versions of PowerShell
Does not require anything special; these hidden administrative shares exist on all machines.
Copy-Item -Path \\serverb\c$\programs\temp\test.txt -Destination \\servera\c$\programs\temp\test.txt
Invoke-Command -ComputerName compname -Credential $cred -ScriptBlock { Get-Content C:\myfolder\result.txt } >>res.txt
Note that C:\myfolder\result.txt is on the remote computer.
Here's a script that worked for me for small files. Run as admin.
#pre 5.0 powershell copy-item to remote computer
Write-Host "Remote copy a file"
$servers = @("server01.dot.com", "server02.dot.com")
foreach($server in $servers) {
$username = 'USERNAME'
$password = 'PASSWORD'
$pw = ConvertTo-SecureString $password -AsPlainText -Force
$cred = New-Object Management.Automation.PSCredential ($username, $pw)
$myfile = [System.IO.File]::ReadAllBytes("C:\Temp\srctest.txt")
$s = New-PSSession -ComputerName $server -Credential $cred
Invoke-Command -Session $s -ArgumentList $myfile -ScriptBlock {[System.IO.File]::WriteAllBytes("C:\Temp\desttest.txt", $args)}
Write-Host "Completed"
Remove-PSSession $s
}