PowerShell Invoke-Command: return output upon completion

I would like to copy in parallel using robocopy and return the log data once each copy is done.
If I use invoke-command server1,server2,server3 { robocopy }, it spits out the log lines for server1, server2 and server3 interleaved as they copy. This gets very confusing and hard to read. Is there a way to have the log or progress returned only when copying is done?
for example:
server3 done.. display all copied files
server1 done.. display all copied files
server2 done.. display all copied files
instead of
server3 display copied files
server1 display copied files
server2 display copied files
server2 display copied files
server1 display copied files
server2 display copied files
I know I could use runspaces for multithreading and check the handles for completion before displaying the data (which is what I am currently doing), but I would like to know if the same is possible with invoke-command, since it is easier :).
Thanks

To expand on my comment, you could create your desired action as a scriptblock, use the -AsJob parameter to make background jobs for the copy actions, then wait for the jobs to finish and output the results when they're done.
$SB = {
& robocopy \\FileServer\Share\KittenPictures C:\webroot\MySite\Images *.*
}
$Servers = ('ServerA','ServerB','ServerC')
$Jobs = Invoke-Command -ScriptBlock $SB -ComputerName $Servers -AsJob
# -AsJob returns one parent job with a child job per computer
while ($running = $Jobs.ChildJobs | Where-Object State -eq 'Running') {
    # block until one of the still-running child jobs finishes,
    # then emit that server's complete output in one piece
    $running | Wait-Job -Any | Receive-Job
}
Remove-Job $Jobs

Related

Using Invoke-Command on multiple computers: need to export results to the machine where Invoke-Command is running, naming each file uniquely

I am running invoke-command on lots of servers to gather info in parallel, then exporting the data locally on the machine where invoke-command is run. The problem I am having is exporting each job as its own unique $server.txt file when running invoke-command in parallel.
Contents of servers.txt file
Server1
Server2
Server3
Here is my current code:
icm -ComputerName (Get-Content c:\temp\servers.txt) -ErrorAction SilentlyContinue -ThrottleLimit 15 -ScriptBlock{
$A=get some local server info
$B=get some local server info
$C=get some local server info
echo $A,$B,$C} | out-file c:\temp\$server.txt
The problem I have is that I want to export the results with a filename that is the name of the server the script ran on, and I cannot get a $server variable when using invoke-command to run these jobs in parallel.
I don't want to use a foreach loop because that doesn't run in parallel.
I want the output for each server to be in a unique file, like this:
C:\temp\server1.txt
C:\temp\server2.txt
C:\temp\server3.txt
EDIT:
I guess my other question, or a workaround, would be: is there any way to use a variable from inside the invoke-command script block outside of it?
I am gathering the machine name inside the invoke-command, but need it to set the filename of the export file.
Thank you
invoke-command -ComputerName (Get-Content c:\temp\servers.txt) -ErrorAction SilentlyContinue -ThrottleLimit 15 -ScriptBlock {
    $data = @(
        get some local server info
        get some local server info
        get some local server info
    )
    $server = $env:computername
    $data | Set-Content c:\temp\$server.txt
}
should do the trick (note that this writes each file on the remote server itself) - but you could also receive the data directly:
$result = invoke-command -ComputerName (Get-Content c:\temp\servers.txt) -ErrorAction SilentlyContinue -ThrottleLimit 15 -ScriptBlock {
    $data = @(
        get some local server info
        get some local server info
        get some local server info
    )
    return $data
}
$result | export-csv C:\data.csv -delimiter ";" -noclobber -notypeinformation
I would return an object instead; then you get the PSComputerName property added on.
icm localhost, localhost, localhost { # elevated prompt for this example
    $A, $B, $C = 1, 2, 3
    [pscustomobject]@{ A = $A; B = $B; C = $C }
} | ft
A B C PSComputerName RunspaceId
- - - -------------- ----------
1 2 3 localhost 40636f4b-0b65-494f-9912-82464e34c0f2
1 2 3 localhost 857d514c-8080-40ce-8848-d9b62088d75d
1 2 3 localhost 6ee0fd30-fb3a-4ad7-abba-bb2da0fbbece
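If you still want one file per server, as in the original question, the PSComputerName property that comes back on each object lets you split the combined result afterwards. A sketch, reusing the placeholder object from above:

```powershell
# run the script block on each server; PSComputerName is added automatically
$result = icm -ComputerName (Get-Content c:\temp\servers.txt) -ScriptBlock {
    [pscustomobject]@{ A = 1; B = 2; C = 3 }  # placeholder for real server info
}
# split the combined output into one file per server
$result | Group-Object PSComputerName | ForEach-Object {
    $_.Group | Out-File "c:\temp\$($_.Name).txt"
}
```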

Pulling Win10 activation codes from remote computers

I'm brand new to PS scripting, so bear with me :)
I'm trying to create a PS script that writes the Win10 activation code to a file, then copies that file to a central repo for manual activation later.
I'm creating a PS script and trying to run
cscript.exe c:\windows\system32\slmgr.vbs -dti > $SourceDir\$env:computername.txt
$SourceDir = '\\computer01\c$\temp'
I need to run it from one computer, remotely connecting to every computer on the network, creating the computername.txt file then copying that file back to a central repository for all the files.
What I have so far:
$s1 = New-PSSession -ComputerName computer01 -Credential $AdminCred
Test-Connection -ComputerName computer01
$id = '\\computer01\windows\system32'
$SourceDir = '\\computer01\c$\temp'
md $SourceDir
$GetActID = cscript.exe $id\slmgr.vbs -dti > $SourceDir\$env:computername.txt
Invoke-Command -Session $s1 -ScriptBlock { $Using:GetActID }
Then I call a batch file that copies the computername.txt file from the computer01 over to a repository where they are going to sit.
I FINALLY got it working correctly, except the file isn't being named after computer01; it's being named with the hostname of the computer I'm running the script from, so the filenames are all identical. I had the naming piece working, but I had to change the way I was remoting into the computer and now it no longer names correctly.
Any idea on how I could get it to name the file to be related to the remote computer?
**I'm still working on the piece of the puzzle where it reads an Excel sheet pulled from AD and pulls the host names from that sheet to connect to each machine; I believe I'll be adding a ForEach loop in there somehow for that.
Although I'm not sure how you are getting the list of "every computer on the network", chances are you are doing it with:
# get a list of all AD computers (their names only)
$computers = (Get-ADComputer -Filter *).Name
Then I don't think you need to have every computer save the file on its own disk and later copy these files to a central share.
Instead, just capture the info in a variable and, after the loop, write a structured CSV file to the central share combining all computer names and install IDs, so you can open it in Excel.
Using the array of computer names from above, iterate through them:
$result = $computers | ForEach-Object {
    # test if the computer can be reached
    if (Test-Connection -ComputerName $_ -Count 1 -Quiet) {
        $installId = Invoke-Command -ComputerName $_ -ScriptBlock {
            cscript.exe //nologo "$env:SystemRoot\System32\slmgr.vbs" -dti
        }
        # $installId is returned as an array!
        # output an object with two properties
        [PsCustomObject]@{
            Computer  = $_
            InstallId = $installId[0] -replace '\D'  # remove everything non-numeric
        }
    }
    else {
        Write-Warning "Computer $_ is not responding"
    }
}
# now you can display the result on screen
$result | Format-Table -AutoSize
# or by means of the GridView if you prefer
$result | Out-GridView -Title 'Computer InstallIds'
# and save the results in your central share as structured CSV file
$result | Export-Csv -Path '\\server\share\restofpath\ComputerInstallIds.csv' -NoTypeInformation
You may have to append -Credential $adminCreds to the Invoke-Command call to make sure you have permission to run the code in the scriptblock on each machine. The easiest way of obtaining that credential is to start off with $adminCreds = Get-Credential -Message "Please enter administrator credentials"
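Put together, the credential handling might look like this sketch, where $computerName stands in for the pipeline variable $_ used inside the loop above:

```powershell
# prompt once, then reuse the credential for every remote call
$adminCreds = Get-Credential -Message "Please enter administrator credentials"

$installId = Invoke-Command -ComputerName $computerName -Credential $adminCreds -ScriptBlock {
    cscript.exe //nologo "$env:SystemRoot\System32\slmgr.vbs" -dti
}
```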

How to capture external command progress in PowerShell?

I'm using a PowerShell script to synchronize files between network directories; robocopy runs in the background.
To capture the output and give statistics to the user, I'm currently doing something like:
$out = (robocopy $src $dst $options)
Once that is done, a custom Windows Forms dialog is presented with a multi-line text box containing the output string.
However, doing it this way halts script execution until the file copy is done. Since all the other input screens are presented to the user as graphical dialogues, I would like to show progress to the user graphically as well.
Is there a way to capture the stdout from robocopy on the fly?
Then the next question would be:
How to pipe that output into a form with a text box?
You can run the robocopy job in the background and keep checking on the progress: Start-Job starts the process in the background, and Receive-Job gives you all the data that has been printed so far.
$job = Start-Job -ScriptBlock { robocopy $using:src $using:dst $using:options }
while (($job | Get-Job).HasMoreData -or ($job | Get-Job).State -eq 'Running') {
    # Receive-Job returns only the output produced since the last call,
    # so each iteration emits just the new lines
    Receive-Job $job
    Start-Sleep 1
}
Remove-Job $job
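For the second part of the question (piping the output into a form with a text box), one option is to append each batch of new output to a WinForms text box while polling the job. This is a sketch assuming the same $src/$dst/$options variables; DoEvents keeps the dialog responsive without a full event loop:

```powershell
Add-Type -AssemblyName System.Windows.Forms

# minimal non-modal form with a multi-line text box
$form = New-Object System.Windows.Forms.Form
$form.Text = 'Robocopy progress'
$textBox = New-Object System.Windows.Forms.TextBox
$textBox.Multiline = $true
$textBox.ScrollBars = 'Vertical'
$textBox.Dock = 'Fill'
$form.Controls.Add($textBox)
$form.Show()

$job = Start-Job -ScriptBlock { robocopy $using:src $using:dst $using:options }
while (($job | Get-Job).HasMoreData -or ($job | Get-Job).State -eq 'Running') {
    $new = Receive-Job $job
    if ($new) { $textBox.AppendText(($new -join "`r`n") + "`r`n") }
    [System.Windows.Forms.Application]::DoEvents()  # keep the UI responsive
    Start-Sleep -Milliseconds 500
}
Remove-Job $job
```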

Remote command output to text file (remote system)

I know I must be using these commands wrong, but I can't seem to find a solution. I believe the issue is with my use of invoke-command and out-file. I'm trying to check whether a process is running on multiple remote machines and write their states back to a text file on the host system. Even if it wrote to the remote system I could work with that, but I can't seem to get anything.
$MyDomain = 'mydomain'
$MyClearTextUsername = 'myusername'
$MyClearTextPassword = 'mypassword'
$MyUsernameDomain = $MyDomain + '\' + $MyClearTextUsername
$SecurePassword = ConvertTo-SecureString -String $MyClearTextPassword -AsPlainText -Force
$MyCredentials = New-Object System.Management.Automation.PSCredential $MyUsernameDomain, $SecurePassword
$Servers = ( "server1","server2","server3")
$output = foreach ($Server in $Servers)
{
$Session = New-PSSession -ComputerName $Server -Credential $MyCredentials
Invoke-Command -Session $Session -ScriptBlock
{
Get-Service -Name service | select name, status, PSComputername, Runspaceid
} | Out-File -filepath 'c:\TEMP\check.txt'
}
Write-output $output | Out-File -filepath 'c:\TEMP\check.txt'
edit: I don't believe the last line is needed but I threw it in just to see if I could get anything out.
You are not capturing anything in $output because you are redirecting all of the output from your Invoke-Command to Out-File -filepath 'c:\TEMP\check.txt'. Get-Service doesn't return that much data, especially once it's been deserialized on its way back from the remote session, so I wouldn't bother with the Select statement. Even if you do want to include it, you are specifying PSComputerName, which doesn't get added until the data comes back from the remote system, so you may want to move that Select outside the scriptblock, after the Invoke-Command in the pipeline.
Also, since you are outputting with Out-File, your local file is overwritten each time that call is made: the first server's results are saved, then overwritten by the second server's results, then by the third's. After that, since $output contains nothing (all output was redirected to file), you output an empty variable to the same file, effectively erasing even the third server's service state.
But this really all becomes a moot point if the script is run with credentials that have access to the remote servers. You can specify one or more computer names to the Get-Service cmdlet, so this could become as simple as:
$Results = Get-Service Service -ComputerName 'Server1','Server2','Server3'
$Results | Select name, status, PSComputername, Runspaceid | Set-Content 'C:\TEMP\check.txt'
Just to make sure... you are looking for a service right? Not just a process? Because if it isn't a service you would need to use Get-Process instead of Get-Service.
If you want to output the data to the remote server you could do:
$output = foreach ($Server in $Servers)
{
    $Session = New-PSSession -ComputerName $Server -Credential $MyCredentials
    Invoke-Command -Session $Session -ScriptBlock {
        Get-Service -Name service | Tee-Object -FilePath C:\Temp\ServiceState.txt
    } | select name, status, PSComputername, Runspaceid
}
$output | Out-File -filepath 'c:\TEMP\check.txt'
That should make a file in the C:\Temp folder on each server with the state of the service, as well as pass the information back to the local host, where it is passed to Select and stored in $output. At the end I output $output to file, just as you did.
I guess in the end you could just remove the Out-File call from within your loop, and it would probably do what you want it to.
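In other words, something like this sketch (keeping your variable names), where the single Out-File happens once, after the loop:

```powershell
$output = foreach ($Server in $Servers) {
    $Session = New-PSSession -ComputerName $Server -Credential $MyCredentials
    Invoke-Command -Session $Session -ScriptBlock {
        Get-Service -Name service
    }
    Remove-PSSession $Session
}
# one write at the end, so earlier servers' results are not overwritten
$output | Out-File -FilePath 'c:\TEMP\check.txt'
```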

Executing a batch file from local server to multiple remote servers using powershell

I would like to execute the batch file below (using PowerShell) from the local server, which will get results from multiple remote servers, and I would like those results in the C:\temp folder.
@echo off
cd "C:\Program Files\Tivoli\TSM\baclient"
dsmc.exe q mgmtclass > C:\temp\TSMmgmtclass.txt
After that, I would like to get those output results using a PowerShell script, as mentioned below.
Get-Content -Path 'C:\Program Files\tivoli\tsm\baclient\dsmerror.log' | select-object -last 15
The best way to do this is probably to use Invoke-Command and eliminate the .bat file completely. It would look something like this:
$scriptBlock = {
    # dsmc.exe's output is emitted from the script block automatically
    & "C:\Program Files\Tivoli\TSM\baclient\dsmc.exe" q mgmtclass
}
Invoke-Command -ComputerName $computernames -ScriptBlock $scriptBlock | Out-File $logfile
In this case, $computernames would be an array holding the name of each computer you want to run the command against, and $logfile is just the path you want the contents written to.
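For example (the server names and log path here are illustrative), including the dsmerror.log tail from the question:

```powershell
$computernames = 'Server1', 'Server2', 'Server3'
$logfile = 'C:\temp\TSMmgmtclass.txt'

$scriptBlock = {
    & "C:\Program Files\Tivoli\TSM\baclient\dsmc.exe" q mgmtclass
}
Invoke-Command -ComputerName $computernames -ScriptBlock $scriptBlock | Out-File $logfile

# the last 15 lines of each server's dsmerror.log can be fetched the same way
Invoke-Command -ComputerName $computernames -ScriptBlock {
    Get-Content 'C:\Program Files\tivoli\tsm\baclient\dsmerror.log' | Select-Object -Last 15
}
```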