Installing Windows Features on remote Server 2012 using PowerShell 3.0 - powershell

I am wondering which is best practice, considering both examples will probably work. Using the built-in help examples, I have written a script to install Windows features on remote servers. Here is my code:
$servers = ('server1', 'server2', 'server3', 'server4')
ForEach ($server in $servers) {
    Install-WindowsFeature -Name Desktop-Experience -ComputerName $server -IncludeAllSubFeature -IncludeManagementTools -Restart
}
Would the above be preferred OR should I wrap the "Install-WindowsFeature ..." in an "Invoke-Command" block like the following?
Invoke-Command -ComputerName server1, server2, server3, server4 -ScriptBlock {
    Install-WindowsFeature -Name Desktop-Experience -IncludeAllSubFeature -IncludeManagementTools -Restart
}
Thanks for your insight!

Personally I would use the former (directly call Install-WindowsFeature -ComputerName $server rather than wrap it in a separate Invoke-Command) in this case, for the following reasons:
You may be hard-coding the feature names now, but in the future you may want to put those in a variable. If you put them in a variable, you'll have to pass it as a parameter into the Invoke-Command's script block. This is entirely possible, but more work.
By using your own loop, you can write progress messages, logging, etc.
You gain nothing by using Invoke-Command in this case, because you're only running a single command on each remote computer; wrapping one command in a script block is just extra ceremony compared with passing -ComputerName directly. Invoke-Command starts to pay off when you need to run several commands together on the remote side.
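For example, here's a rough sketch of the loop approach with the feature names in a variable and a simple progress/status line (the feature list, server names, and output format are just placeholders):
$servers  = 'server1', 'server2', 'server3', 'server4'
$features = 'Desktop-Experience'
foreach ($server in $servers) {
    Write-Progress -Activity "Installing Windows features" -Status $server
    # Install-WindowsFeature returns a result object we can log per server
    $result = Install-WindowsFeature -Name $features -ComputerName $server -IncludeAllSubFeature -IncludeManagementTools -Restart
    Write-Output "$server : ExitCode=$($result.ExitCode), RestartNeeded=$($result.RestartNeeded)"
}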

Related

Run PowerShell script on multiple servers in parallel

Probably a repetitive question, but I need some to-the-point answers as I'm still learning. I want to run a PowerShell script on all remote servers simultaneously to install an application. I tried the cmdlet below, but it executes the servers one after the other, and I would also like to capture the status or job output to a file. Any help is much appreciated. Thanks
$servers = Get-Content "C:\Dumps\Scripts\servers.txt"
foreach ($server in $servers) {
    Invoke-Command -ComputerName $server -FilePath C:\Dumps\Scripts\CompleteSilent.ps1
}
Simply use $Servers as the -ComputerName argument; Invoke-Command will run them in parallel.
You can adjust the degree of parallelism with the -ThrottleLimit parameter.
Invoke-Command -ComputerName $Servers -Filepath C:\Dumps\Scripts\CompleteSilent.ps1
Note: if the remote script returns objects or writes to the screen, the output may come back out of order. That can be handled, but since this is an installation you may not run into that particular issue.
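If you also want per-server status or a job output file, one option (a sketch using the same script path; the log path is a placeholder) is to run the whole thing as a job and collect the results when it finishes:
$servers = Get-Content "C:\Dumps\Scripts\servers.txt"
# -AsJob returns immediately; -ThrottleLimit caps how many servers run at once
$job = Invoke-Command -ComputerName $servers -FilePath C:\Dumps\Scripts\CompleteSilent.ps1 -ThrottleLimit 16 -AsJob
Wait-Job $job | Out-Null
# Collect whatever the remote script wrote to the pipeline into a log file
Receive-Job $job | Out-File "C:\Dumps\Scripts\install-results.txt"
# Per-server success/failure state of the child jobs
$job.ChildJobs | Select-Object Location, State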

How to remotely deploy multiple PowerShell cmdlets/scripts to multiple remote devices?

I'm looking to use a single host server to maintain a PowerShell script, with global variables, that can be interpreted and run on several other devices in the network cluster.
On the main host I'd specifically like to maintain a list of variables for the IP addresses of each other device that I want to run the scripts against, but how to run the script is something I'm having a hard time determining. There are several things I need to do to each other machine in the cluster (change the computer name, modify the time zone and time, configure the network adapters... there's a decent list of stuff). The cmdlets to perform those tasks on the individual machines are no problem; I have all of that written out and tested. I just don't know what my options are for where that script is stored. Preferably, I think I'd like to declare all of the variables for everything that needs to be done on all machines at the top of the file on the main host, and then break down everything that needs to be done to each host in the same file. I know it will get a little messy, but that would make maintaining the cmdlets for each device much easier, especially when it comes to testing and making changes. Am I trying to do the impossible here??
I learned about using Enter-PSSession as well as Invoke-Command, but each seems to have its own challenges. With Enter-PSSession I cannot seem to find a way to wait for the script to connect to each host before it moves on to the next line. I've tried piping to Out-Null, as well as adding a Start-Sleep line. I don't want to have to manually connect to each host and then manually run the list of commands against each host. Invoke-Command doesn't seem to let me break the -ScriptBlock section out into multiple lines.
Are there any suggestions for the best way to run the script from the main host so that it performs all of my cmdlets on multiple machines, without any additional human interaction??
Thanks so much!!
-Andrew
EDIT: I found that I can break the ScriptBlock line (contrary to what I thought didn't work yesterday). Here is basically what I'm trying to accomplish, though of course the below does not work when calling the variables from the top of the file:
#Edit These Variables
$NewName_Server2 = "Server2"
$NewName_Server3 = "Server3"
$NewName_Server4 = "Server4"
$IPAddress_Server2 = "10.10.10.2"
$IPAddress_Server3 = "10.10.10.3"
$IPAddress_Server4 = "10.10.10.4"
$TimeZone = "US Eastern Standard Time"
#Do Not Edit These Variables
$Server2 = "192.168.1.2"
$Server3 = "192.168.1.3"
$Server4 = "192.168.1.4"
#Configure Server 2
Invoke-Command -ComputerName $Server2 -ArgumentList $local -ScriptBlock {
Rename-Computer -NewName $NewName_Server2
New-NetIPAddress -InterfaceAlias "Wired Ethernet Connection" -IPv4Address $IPAddress_Server2
Set-TimeZone -ID $TimeZone
Restart-Computer -Force
}
#Configure Server 3
Invoke-Command -ComputerName $Server3 -ArgumentList $local -ScriptBlock {
Rename-Computer -NewName $NewName_Server3
New-NetIPAddress -InterfaceAlias "Wired Ethernet Connection" -IPv4Address $IPAddress_Server3
Set-TimeZone -ID $TimeZone
Restart-Computer -Force
}
#Configure Server 4
Invoke-Command -ComputerName $Server4 -ArgumentList $local -ScriptBlock {
Rename-Computer -NewName $NewName_Server4
New-NetIPAddress -InterfaceAlias "Wired Ethernet Connection" -IPv4Address $IPAddress_Server4
Set-TimeZone -ID $TimeZone
Restart-Computer -Force
}
You can use the $using: scope modifier to access local variables inside the remote script block. (I don't know what $local is supposed to be, by the way.)
$a = 'hi'
invoke-command comp001,comp002 { $using:a }
hi
hi
The other way is to use a param() block together with -ArgumentList; it's not as well documented, and passing arrays that way is trickier.
invoke-command comp001,comp002 { param($b) $b } -args $a
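Applied to the question's "Configure Server 2" block, a sketch with $using: might look like this. Note that New-NetIPAddress's parameter is -IPAddress (not -IPv4Address as in the question), and the /24 prefix length here is only an assumption:
Invoke-Command -ComputerName $Server2 -ScriptBlock {
    # $using: pulls the values defined at the top of the local script into the remote session
    Rename-Computer -NewName $using:NewName_Server2
    New-NetIPAddress -InterfaceAlias "Wired Ethernet Connection" -IPAddress $using:IPAddress_Server2 -PrefixLength 24
    Set-TimeZone -Id $using:TimeZone
    Restart-Computer -Force
}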

Powershell Invoke-GPUpdate - "No Logoff" possible?

Looking to run Invoke-GPUpdate -Force against a group of remote computers and respond to the logoff prompt with "No".
Tried:
Echo "n" | invoke-gpupdate
Error: Invoke-GPUpdate does not accept pipeline input
Command Used:
Invoke-GPUpdate -Computer $computer -RandomDelayInMinutes 0 -force
Unfortunately, this cmdlet schedules a run of gpupdate that happens separately (out of process), so the prompt doesn't come from within PowerShell and PowerShell's standard ways of dealing with prompts don't apply. There is a -LogOff parameter, but it's a switch, which implies its value is only meant to trigger the logoff. You can try passing it as -LogOff:$false, but most likely that won't get rid of the prompt.
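For completeness, that switch-style attempt would look like this (and, as noted, it probably won't suppress the prompt):
Invoke-GPUpdate -Computer $computer -RandomDelayInMinutes 0 -Force -LogOff:$false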
I think your best chance is not to use this cmdlet, but to instead use Invoke-Command with gpupdate.exe directly:
Invoke-Command -ComputerName $computer -ScriptBlock {
    # the two 'n' characters answer the logoff and reboot prompts
    echo nn | gpupdate.exe /force
}
But this requires that PowerShell remoting is enabled on the machines you want to manage.
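If remoting is enabled, the same workaround extends naturally to a group of computers; the list source below is just an example:
$computers = Get-Content "C:\Scripts\computers.txt"   # or any list of computer names
Invoke-Command -ComputerName $computers -ScriptBlock {
    # the two 'n' characters answer the logoff and reboot prompts
    echo nn | gpupdate.exe /force
}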

PowerShell failing to copy from UNC path

I'm working on a script to copy a folder from a UNC path to a local server. I'm remotely running my script through an interactive session and utilizing Invoke-Command -ScriptBlock like so:
Invoke-Command -ComputerName MyServer -ScriptBlock $Script
This is the script to do the copying:
$Script = {
    try {
        New-PSDrive -Name MyDrive -PSProvider FileSystem -Root \\uncpathserver\e$\SourceCode\ -Credential Contoso\me
        Copy-Item -Path \\uncpathserver\e$\SourceCode\* -Destination E:\Inetpub\Target -Recurse -Force
    }
    catch {
        Write-Host "Failed to copy!"
    }
}
It is failing and throwing my catch block every time. I can't seem to figure out what I am missing to get this to work - it seems so simple and I hope I'm not missing something blatantly obvious.
EDIT:
I was able to get it to work by running the script from my local PC instead of from a server. I'm also doing the file copy outside the $Script block now. This is what the new code looks like:
$MyServers = @("server-01", "server-02")
foreach ($server in $MyServers) {
    $TargetSession = New-PSSession -ComputerName $server -Credential contoso\me
    Copy-Item -ToSession $TargetSession -Path C:\Source\TheCode\ -Destination "E:\InetPub\wherethecodegoes" -Recurse -Force
}
Everything else I'm doing inside my $Script block (which has been omitted here for troubleshooting's sake) is working A-OK. I do have to enter my credentials for each server, but given the small number of servers I'm working with, that isn't a deal breaker.
Sounds like a 'Kerberos double hop' problem.
Short-Answer
Avoid the problem. From your system, set up two PSDrives, then copy \\uncpathserver\e$\SourceCode\ to \\RemoteIISserver\E$\Inetpub\Target\.
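A minimal sketch of that short answer, assuming the drive names, the credential, and the \\RemoteIISserver path stand in for your real target:
$cred = Get-Credential Contoso\me
# Two drives mapped from the machine you are sitting at, so no second hop is needed
New-PSDrive -Name Src -PSProvider FileSystem -Root '\\uncpathserver\e$\SourceCode' -Credential $cred | Out-Null
New-PSDrive -Name Dst -PSProvider FileSystem -Root '\\RemoteIISserver\E$\Inetpub\Target' -Credential $cred | Out-Null
Copy-Item -Path 'Src:\*' -Destination 'Dst:\' -Recurse -Force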
Long-Answer
From your system (System A), you are remotely executing a script (on System B) that will copy a remote folder (from System C).
It should work, but it doesn't. That's because when you (specifically, your account) connect from System A to System B, and System B then asks System C for something on your behalf, System C doesn't trust the request.
A quick Google of the problem will show a myriad of ways around this issue; however:
Not all methods are secure (example: CredSSP)
Not all methods will work on your version of Windows (which is...?)
Not all methods will work with PowerShell
One secure method that does work with PowerShell leverages delegation.
This can be a bit daunting to set up, and I suggest you read up on it thoroughly.
## Module 'ActiveDirectory' from RSAT-AD-PowerShell Windows feature required.
$ServerA = $env:COMPUTERNAME
$ServerB = Get-ADComputer -Identity ServerB
$ServerC = Get-ADComputer -Identity ServerC
Delegate 'Server B' to access 'Server C':
# Set the resource-based Kerberos constrained delegation
Set-ADComputer -Identity $ServerC -PrincipalsAllowedToDelegateToAccount $ServerB
# Confirm AllowedToActOnBehalfOfOtherIdentity.Access is correct (indirectly).
Get-ADComputer -Identity $ServerC -Properties PrincipalsAllowedToDelegateToAccount
Wait about 15 minutes for 'Server B' to sync up (or just reboot it).
You can force this with the following (note: $cred should contain your credentials):
Invoke-Command -ComputerName $ServerB.Name -Credential $cred -ScriptBlock {
klist purge -li 0x3e7
}
Run a test hop:
Invoke-Command -ComputerName $ServerB.Name -Credential $cred -ScriptBlock {
Test-Path \\$($using:ServerC.Name)\C$
Get-Process lsass -ComputerName $($using:ServerC.Name)
Get-EventLog -LogName System -Newest 3 -ComputerName $($using:ServerC.Name)
}
The downside is that you have to set up every remote target (every 'Server C') this way, but the upside is that it's secure.

How to remotely check the status of a web application pool with PowerShell?

I see a lot of scripts for recycling application pools on a web server running IIS 7, but is there a way to check, with PowerShell, whether a web application pool is running or stopped? I can't seem to figure out a way to remotely have Get-WebAppPoolState return the status of the application pool, and my Google-fu has not been able to come up with a replacement. I can remotely get gwmi to work and recycle or start my app pools, but ideally I only want to run this if the app pool is actually stopped.
Would I need to work this out with PsExec, or is there an alternative, similar to gwmi, that gives me a one-line command to query the app pool on the IIS 7 server and return its status?
You can use Invoke-Command to invoke the Get-WebAppPoolState cmdlet on the remote machine.
$appPoolStatus = Invoke-Command -ComputerName RemoteHostName {Import-Module WebAdministration; Get-WebAppPoolState DefaultAppPool}
$appPoolStatus.Value
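If the goal is to act only when the pool is actually stopped, a sketch building on the same call might look like this (the server and pool names are placeholders):
$appPoolStatus = Invoke-Command -ComputerName RemoteHostName -ScriptBlock {
    Import-Module WebAdministration
    Get-WebAppPoolState DefaultAppPool
}
if ($appPoolStatus.Value -eq 'Stopped') {
    # Only start the pool when the remote check says it is down
    Invoke-Command -ComputerName RemoteHostName -ScriptBlock {
        Import-Module WebAdministration
        Start-WebAppPool -Name DefaultAppPool
    }
}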
Note that if you are going to use variables defined locally on the calling machine, you will have to pass them into the remote session explicitly. This post explains the rules well:
http://blogs.msdn.com/b/powershell/archive/2009/12/29/arguments-for-remote-commands.aspx
Example:
$appPoolName = "SomeAppPoolName"
$appPoolStatus = Invoke-Command -ComputerName RemoteHostName { param($apn) Import-Module WebAdministration; Get-WebAppPoolState $apn} -Args $appPoolName
You can use this if you want prettier output:
Invoke-Command -ComputerName RemoteHostName -ScriptBlock { Import-Module WebAdministration; Get-WebAppPoolState | % { @{ ($_.ItemXPath -split "'")[1] = "$($_.Value)" } } }
For older versions of IIS (before version 8):
$items = get-wmiobject -namespace 'root\MicrosoftIISv2' -computername $server -class 'IIsApplicationPoolSetting'