syntax in for-loop using local variables - powershell

Many of my scripts use PSSessions quite often to check on available shares on several remote PCs. PS-Remoting is great.
Below is the simple working code-snippet:
if (Invoke-Command -Session $PSess -ScriptBlock {Get-SMBShare -Name $using:shareName_1 -ea 0}) {
    $mapPS_1=0
    write-Host "available: $shareName_1"
} else {
    $mapPS_1=1
    write-Host "NOT available: $shareName_1"
}
Instead of repeating this code several times (only the shareName variable differs, per the list below), I would like to create a for-loop containing just that code.
Here is a list of shares to be checked upon:
$shareName_1='TC-MEDIA-SYSTEM'
$shareName_2='TC-MEDIA-DATA1'
$shareName_3='V-15TB_01'
$shareName_4='W-LL-503DD'
$shareName_5='X-TL-503AA'
$shareName_6='Y-LL-503BB'
$shareName_7='Z-LL-503CC'
$shareName_8='DUMPSTER_01'
$shareName_9='BKUP-NAS-01'
$shareName_10='TC-DUMPSTER_02'
As one can read from the code-snippet, depending on the resulting output, one local variable $mapPS gets manipulated.
All this works fine as shown above.
The trouble I am facing is to find the proper syntax for this to run in a for-loop.
I'd assume that the variable $using:shareName_$i is the troubling factor.
This is what I've got so far. Unfortunately, not bearing any fruit.
for ($i = 1; $i -lt 11; $i++) {
    if (Invoke-Command -Session $PSess -ScriptBlock {Get-SMBShare -Name $using:(Get-Variable -Name shareName_$i).Value} -ea 0) {
        Set-Variable -Name mapPS_$i value 0
        write-Host "available: $shareName_$i"
    } else {
        Set-Variable -Name mapPS_$i value 1
        write-Host "NOT available: $shareName_$i"
    }
}
I'm sure Powershell has a solution for this - I just don't know it.
Any hints leading to the solution are appreciated.

Generally, consider storing your share names in an array instead of in individual variables (e.g., $shareNames ='TC-MEDIA-SYSTEM', 'TC-MEDIA-DATA1', ...), which allows you to reference them by index (e.g., $shareNames[0]) instead of having to resort to variable indirection via the Get-Variable cmdlet.
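For illustration, a rough, untested sketch of that array-based variant; it collects the per-share results into a single $mapPS array instead of $mapPS_1..$mapPS_10 (variable names here are assumptions, not from the original post):
$shareNames = 'TC-MEDIA-SYSTEM', 'TC-MEDIA-DATA1', 'V-15TB_01'   # ... and so on
$mapPS = foreach ($shareName in $shareNames) {
    if (Invoke-Command -Session $PSess -ScriptBlock { Get-SMBShare -Name $using:shareName -ea 0 }) {
        write-Host "available: $shareName"
        0   # emitted value is collected into $mapPS
    } else {
        write-Host "NOT available: $shareName"
        1
    }
}
# $mapPS[0] now corresponds to $shareNames[0], and so on.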
A simple solution is to create an aux. local variable in the loop that your remote script can reference via the $using: scope (which doesn't support commands or expressions):
foreach ($i in 1..10) {
    $shareName = Get-Variable -ValueOnly shareName_$i
    if (Invoke-Command -Session $PSess -ScriptBlock { Get-SMBShare -Name $using:shareName } -ea 0) {
        Set-Variable -Name mapPS_$i -Value 0
        write-Host "available: $shareName"
    }
    else {
        Set-Variable -Name mapPS_$i -Value 1
        write-Host "NOT available: $shareName"
    }
}

If else statement inside foreach results in overwrite each time it loops

I apologize for the unclear title. I'm finding it hard to articulate my problem. I'm writing a script in PowerShell for the first time. I primarily use Python for short scripts, but for this task I'm forced to use PowerShell because of some limitations where I need to use PowerCLI cmdlets. Let me quickly explain the problem. This is to create and/or assign tags to VMs in vSphere.
I read the list of VMs into a variable $vms2tag. These are the ones that need to be tagged.
I read a JSON file into a variable and create tag name and description variables based on the data in the JSON (there are key-value pairs that I can plug directly into the names and descriptions). This file also has a 'Server' key whose value is the VM name exactly as it appears in the "Output-VM.csv" file, which has data about every single VM that exists. Only the VMs in $vms2tag need to be tagged.
Based on some if else conditions like if tag category exists, or if tag exists, it will either create one or use/assign one.
Basically, the following code "works" in the sense that it will create these tags, BUT each tag quickly gets overwritten by the next $vm, so the only tag that sticks around on all the $vms is the one created for the last VM in the list.
$myJson = Get-Content 'C:\For-Powershell.json' | Out-String | ConvertFrom-Json
$vms2tag = Get-Content 'C:\Output-VM.txt'
foreach ($vm in $vms2tag) {
    For ($j=0; $j -lt $myJson.Length; $j++) {
        if ($vm -eq $myJson.Server[$j]) {
            Write-Output "Match!"
            # Variables for Application Owner
            $nameAO = [string]$myJson.Application_Owner[$j]
            $descriptionAO = [string]$myJson.Application_Owner[$j]
            # check if Tag Category and/or Tag exist
            if ((Get-TagCategory -Name "app_owner") -eq $null) {
                New-TagCategory -Name "app_owner" -Cardinality "Multiple"
            }
            if ((Get-Tag -Category "app_owner" | Set-Tag -Name $nameAO -Description $descriptionAO) -eq $null) {
                $myTagAO = Get-TagCategory -Name "app_owner" | New-Tag -Name $nameAO -Description $descriptionAO
                New-TagAssignment -Tag $myTagAO -Entity $myJson.Server[$j]
            }
            else {
                $myTagAO = Get-Tag -Category "app_owner" | Set-Tag -Name $nameAO -Description $descriptionAO
                New-TagAssignment -Tag $myTagAO -Entity $myJson.Server[$j]
            }
        }
    }
}
I tested while the script runs, and the tags are properly applied to each VM based on its data, but when I refresh after the script completes, the tags on each VM exist but are incorrect: they contain the information that's valid only for the last VM in the $vms2tag list. It seems pretty simple, but I just don't see where I'm messing up. My guess is that the if/else statements are nested incorrectly? It took me a while (~6 hours) to get this to work, as I had other issues with the script, but when I finally got the tags to set correctly based on the other conditions, I ended up with this problem, so it's possible I'm just burnt out and not seeing it.
The problem is with the Tag logic. The following line is overwriting existing tags every loop:
if ((Get-Tag -Category "app_owner" | Set-Tag -Name $nameAO -Description $descriptionAO) -eq $null) {
The Set-Tag cmdlet should never be used in a test to find existing tags.
I would write the test and assignment block like the following:
$myTagAO = Get-Tag -Category "app_owner" -Name $nameAO -ErrorAction SilentlyContinue
if ($myTagAO -eq $null) {
    $myTagAO = Get-TagCategory -Name "app_owner" | New-Tag -Name $nameAO -Description $descriptionAO
}
New-TagAssignment -Tag $myTagAO -Entity $myJson.Server[$j]
This ensures that each tag is only created once, with the appropriate description.
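Putting it together, the inner block from the question might then look roughly like this (an untested sketch using the question's variable names, not the answerer's exact code):
if ($null -eq (Get-TagCategory -Name "app_owner" -ErrorAction SilentlyContinue)) {
    New-TagCategory -Name "app_owner" -Cardinality "Multiple"
}
$myTagAO = Get-Tag -Category "app_owner" -Name $nameAO -ErrorAction SilentlyContinue
if ($null -eq $myTagAO) {
    $myTagAO = Get-TagCategory -Name "app_owner" | New-Tag -Name $nameAO -Description $descriptionAO
}
New-TagAssignment -Tag $myTagAO -Entity $myJson.Server[$j]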

PowerShell Jobs, writing to a file

Having some problems getting a Start-Job script block to output to a file. The following three lines of code work without any problem:
$about_name = "C:\0\ps_about_name.txt"
$about = get-help about_* | select Name,Synopsis
if (-not (Test-Path $about_name)) { ($about | select Name | sort Name | Out-String).replace("[Aa]bout_", "") > $about_name }
The file is created in C:\0\
But I need to do a lot of collections like this, so I naturally looked at stacking them in parallel as separate jobs. I followed online examples and so put the last line in the above as a script block invoked by Start-Job:
Start-Job { if (-not (Test-Path $about_name)) { ($about | select Name | sort Name | Out-String).replace("[Aa]bout_", "") > $about_name } }
The Job is created, goes to status Running, and then to status Completed, but no file is created. Without Start-Job, all works, with Start-Job, nothing... I've tried a lot of variations on this but cannot get it to create the file. Can someone advise what I am doing wrong in this please?
IMO, the simplest way to get around this problem is by use of the $using scope modifier.
$about_name = "C:\0\ps_about_name.txt"
$about = get-help about_* | select Name,Synopsis
$sb = {
    if (-not (Test-Path $using:about_name)) {
        $using:about.Name -replace '^about_' | Sort-Object > $using:about_name
    }
}
Start-Job -Scriptblock $sb
Explanation:
$using allows you to access local variables in a remote command. This is particularly useful when running Start-Job and Invoke-Command. The syntax is $using:localvariable.
This particular problem is a variable scope issue. Start-Job creates a background job with its own scope. When using the -ScriptBlock parameter, you are working within that scope; it does not know about variables defined in your current scope/session. Therefore, you must use a technique that defines the variable within the job's scope, passes in the variable's value, or accesses the local scope from the script block. You can read more about scopes at about_Scopes.
As an aside, character sets like [Aa] are not supported by the .NET .Replace() method, which does literal string replacement. You need to switch to the -replace operator to use a regex. I updated the code to perform the replacement with -replace, which is case-insensitive by default.
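A quick illustration of the difference (hypothetical sample string):
'About_Scopes'.Replace('[Aa]bout_', '')   # no change - .Replace() treats the pattern literally
'About_Scopes' -replace '^about_'         # 'Scopes' - -replace uses regex and is case-insensitive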
HCM's perfectly fine solution uses a technique that passes the value into the job's script block. By defining a parameter within the script block, you can pass a value into that parameter by use of -ArgumentList.
Another option is to just define your variables within the Start-Job script block.
$sb = {
    $about_name = "C:\0\ps_about_name.txt"
    $about = get-help about_* | select Name,Synopsis
    if (-not (Test-Path $about_name)) {
        $about.Name -replace '^about_' | Sort-Object > $about_name
    }
}
Start-Job -Scriptblock $sb
You've got to send your parameters to your job.
This does not work:
$file = "C:\temp\_mytest.txt"
start-job {"_" | out-file $file}
While this does:
$file = "C:\temp\_mytest.txt"
start-job -ArgumentList $file -scriptblock {
    Param($file)
    "_" | out-file $file
}

Where is $PSCmdlet.ShouldProcess state stored between calls (and how can it be reset)?

Consider the following contrived example:
function Test-ProcessContinue {
    [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact='High')]
    Param()
    for ($i = 1; $i -le 3; $i++) {
        if ($PSCmdlet.ShouldProcess("$i", "Process")) {
            Write-Output "Processing $i"
        }
        else {
            Write-Verbose "No chosen"
        }
    }
    for ($i = 1; $i -le 3; $i++) {
        if ($PSCmdlet.ShouldProcess("$i", "Process")) {
            Write-Output "Processing $i"
        }
        else {
            Write-Verbose "No chosen"
        }
    }
    $yta = $false; $nta = $false
    for ($i = 1; $i -le 3; $i++) {
        if ($PSCmdlet.ShouldContinue("$i", "Continue", [ref]$yta, [ref]$nta) -or $yta) {
            Write-Output "Continuing with $i"
        }
        elseif ($nta) {
            Write-Verbose "No to all chosen"
            break
        }
        else {
            Write-Verbose "No chosen"
        }
    }
}
...and one of its potential outputs:
PS C:\> Test-ProcessContinue -Verbose
Confirm
Are you sure you want to perform this action?
Performing the operation "Process" on target "1".
[Y] Yes [A] Yes to All [N] No [L] No to All [S] Suspend [?] Help (default is "Y"): a
Processing 1
Processing 2
Processing 3
Processing 1
Processing 2
Processing 3
Continue
1
[Y] Yes [A] Yes to All [N] No [L] No to All [S] Suspend [?] Help (default is "Y"): a
Continuing with 1
Continuing with 2
Continuing with 3
In the case of the ShouldContinue loop (third for loop), I can see that the overload with the two by-reference boolean parameters is responsible for storing whether the end-user chose Yes to All or No to All into those two booleans.
However, in the case of the two ShouldProcess blocks (first two for loops), how is this state preserved?
In particular, in between the first two ShouldProcess blocks, how could I check if Yes to All or No to All were specified and/or what would I need to reset or clear in order to make the second ShouldProcess block ask for confirmation again?
(Favouring ShouldContinue over ShouldProcess is an option for fine-grained control, but it appears to lose the native/built-in support for [CmdletBinding(SupportsShouldProcess=$true)].)
First, I'll address $PSCmdlet.ShouldContinue. This is basically a way to prompt on your own regardless of Confirm preferences.
$PSCmdlet.ShouldProcess on the other hand, doesn't always prompt. It takes into account the ConfirmImpact (which you set to High), and the $ConfirmPreference automatic variable, which defaults to High. The valid values are None, Low, Medium, and High and are meant to indicate how much of an impact a change has, so when $ConfirmPreference's value is equal to or less than a command's ConfirmImpact value, then ShouldProcess will prompt.
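To make that interplay concrete, here is a small illustrative sketch (the function name and strings are made up, not from the question):
function Test-Impact {
    [CmdletBinding(SupportsShouldProcess, ConfirmImpact='Medium')]
    param()
    if ($PSCmdlet.ShouldProcess('target', 'action')) { 'performed' }
}

$ConfirmPreference = 'High'      # preference higher than the Medium impact: no prompt
Test-Impact
$ConfirmPreference = 'Medium'    # preference equal to or less than the impact: ShouldProcess prompts
Test-Impact
Test-Impact -WhatIf              # prints a "What if:" message and skips the action entirely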
I know this isn't your direct question, but the background is important for answering what you should do.
The direct question: "where is the answer stored?" has a boring answer: it's stored in an internal variable in the class that defines the ShouldProcess method.
So, no, you can't get at it yourself, unfortunately.
But that brings us back to .ShouldContinue, which can take those references and store those values for you, so when you want the values, and want to be able to make decisions with them, you should use .ShouldContinue.
But, you should really use both. Because they do different things.
.ShouldProcess isn't just responsible for confirmation prompts, it's also responsible for handling -WhatIf/$WhatIfPreference; when you say your command SupportsShouldProcess you are also saying it supports -WhatIf. If you don't use .ShouldProcess, you'll get into the situation of having commands that appear to be safe but actually take action anyway.
So a pattern of something like this would cover your bases:
if ($PSCmdlet.ShouldProcess('thing', 'do')) {
    if ($PSCmdlet.ShouldContinue('prompt', 'caption')) {
        # do it
    }
}
Problem with this goes back to your confirm impact and preferences. If those line up, or if the user invoked your command with -Confirm, you'll be prompting twice: once in the .ShouldProcess and then again in the .ShouldContinue.
That kind of sucks unfortunately.
I wrote a thing that seems to work around this. It's predicated first on a function that allows you to run an arbitrary scriptblock with confirmation, so that you can still run .ShouldProcess while suppressing its prompt.
Then it also tries to calculate whether a prompt is needed or not, and then selectively call .ShouldContinue. I didn't demonstrate storing or resetting the yesToAll and noToAll vars because you already know how to do that.
This is mainly to demo a pattern that could be used to adhere to standard confirmation prompt semantics, with discoverability, support for the -Confirm parameter, $ConfirmPreference, and ConfirmImpact, while maintaining support for -Verbose and -WhatIf.
function Test-Should {
    [CmdletBinding(SupportsShouldProcess, ConfirmImpact = 'High')]
    param()
    Begin {
        $local:ShouldConfirm = $ConfirmPreference -ne [System.Management.Automation.ConfirmImpact]::None -and
            $ConfirmPreference -le [System.Management.Automation.ConfirmImpact]::High # hard coded :(
        function Invoke-CommandWithConfirmation {
            [CmdletBinding(SupportsShouldProcess)]
            param(
                [Parameter(Mandatory)]
                [ScriptBlock]
                $ScriptBlock
            )
            Invoke-Command -NoNewScope -ScriptBlock $ScriptBlock
        }
    }
    Process {
        if (Invoke-CommandWithConfirmation -ScriptBlock { $PSCmdlet.ShouldProcess('target', 'action') } -Confirm:$false) {
            if (-not $local:ShouldConfirm -or $PSCmdlet.ShouldContinue('query', 'caption')) {
                'Hi' | Write-Host
                'Hello' | Write-Verbose
            }
        }
    }
}
Invocations:
Test-Should
Test-Should -Confirm
Test-Should -Confirm:$false
Test-Should -Verbose
Test-Should -Verbose -WhatIf
Test-Should -WhatIf -Confirm
Test-Should -WhatIf -Confirm:$false
And so on, with different values of $ConfirmPreference and different values for the command's ConfirmImpact.
The one annoying thing is the value I marked as hard coded: it has to match what you set as the ConfirmImpact for that command.
It turns out it's kind of a pain in the ass to get at that value programmatically, but maybe you could work that in in some way.
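If you want to experiment, one possible way to read a function's own ConfirmImpact at run time is via the script block's attributes; this is an assumption on my part and I haven't verified it across PowerShell versions:
# Hypothetical sketch: inspect the CmdletBinding attribute on the current function
$impact = ($MyInvocation.MyCommand.ScriptBlock.Attributes |
    Where-Object { $_ -is [System.Management.Automation.CmdletBindingAttribute] }).ConfirmImpact
If that works in your environment, it could replace the hard-coded comparison value.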
This is mostly an extension of briantist's answer. I was struggling with how to fully implement this for a while, but I have gotten it working the way I want and wanted to share in case anyone else has a similar goal. I am hoping to turn this into a complete class that does all of this for me, but baby steps.
Function New-Function {
[CmdletBinding(
ConfirmImpact='None',
DefaultParameterSetName="Default",
HelpURI="",
SupportsPaging=$False,
SupportsShouldProcess=$True,
PositionalBinding=$True
)] Param(
[Parameter(ValueFromPipeline)]
$Items,
$Impact = 'Medium',
[Switch]$Force
)
Begin {
$PSCmdlet.GetDynamicParameters()
If (-Not $PSBoundParameters.ContainsKey('Verbose')) {
$VerbosePreference = $PSCmdlet.SessionState.PSVariable.GetValue('VerbosePreference')
}
If (-not $PSBoundParameters.ContainsKey('Confirm')) {
$ConfirmPreference = $PSCmdlet.SessionState.PSVariable.GetValue('ConfirmPreference')
}
If (-not $PSBoundParameters.ContainsKey('WhatIf')) {
$WhatIfPreference = $PSCmdlet.SessionState.PSVariable.GetValue('WhatIfPreference')
}
New-Variable -Name YesToAll -Value $False -Verbose:$False -Confirm:$False -WhatIf:$False
New-Variable -Name NoToAll -Value $False -Verbose:$False -Confirm:$False -WhatIf:$False
[Bool]$Local:ShouldConfirm = $Force -OR $ConfirmPreference -ne [System.Management.Automation.ConfirmImpact]::None -and $ConfirmPreference -le [System.Management.Automation.ConfirmImpact]::$Impact
$Local:ShouldProcess = ([Scriptblock]::Create(("Function ShouldProcess{{[CmdletBinding(SupportsShouldProcess, ConfirmImpact='{0}')]Param()`$PSCmdlet.ShouldProcess('{1}','{2}')}};ShouldProcess -Confirm:`$False" -f $Impact,'Target','Action')))
$Local:ShouldContinue = ([ScriptBlock]::Create(("Function ShouldContinue {{[CmdletBinding(SupportsShouldProcess, ConfirmImpact='{0}')]Param()New-Variable -Name YesToAll -Value `$PSCmdlet.GetVariableValue('YesToAll') -Verbose:`$False -Confirm:`$False -WhatIf:`$False;New-Variable -Name NoToAll -Value `$PSCmdlet.GetVariableValue('NoToAll') -Verbose:`$False -Confirm:`$False -WhatIf:`$False;`$PSCmdlet.ShouldContinue('{1}','{2}',[Ref]`$YesToAll,[Ref]`$NoToAll);Set-Variable -Name YesToAll -Value `$YesToAll -Scope 1 -Confirm:`$False -Verbose:`$False -WhatIf:`$False;Set-Variable -Name NoToAll -Value `$NoToAll -Scope 1 -Confirm:`$False -Verbose:`$False -WhatIf:`$False;}};ShouldContinue" -f $Impact,'Target','Action')))
}
Process {
ForEach ($Item in $Items) {
IF ((Invoke-Command -NoNewScope -ScriptBlock $Local:ShouldProcess) -eq $True) {
If ($Force -or $Local:ShouldConfirm -eq $False -or (Invoke-Command -NoNewScope -ScriptBlock $Local:ShouldContinue)) {
IF ($Local:Force) {
Write-Verbose -Message 'Force'
} ElseIf ($Local:YesToAll -eq $True) {
Write-Verbose -Message 'YesToAll'
} Else {
Write-Verbose -Message 'Yes'
}
Write-Host $Item
} Else {
If ($Local:NoToAll -eq $True) {
Write-Verbose -Message 'NoToAll'
} Else {
Write-Verbose -Message 'No'
}
}
}
}
}
End {
}
}
$ConfirmPreference = 'High'
'1','2','3','4','5','6','7','8','9' | New-Function -Impact Medium

What's the fastest way to get online computers

I'm writing a function which returns all Online Computers in our network, so I can do stuff like this:
Get-OnlineComputers | % { get-process -computername $_ }
Now I basically got my function ready, but it's taking way too long.
I want to return only computers which have WinRM active, but I also want to provide the option (a switch parameter) to get every computer, even those which don't have WinRM set up.
This is my function. First, it creates a PSSession to the domain controller to get all computers in our LAN. Then, for each computer, it tests whether WinRM is active or whether it responds to ping; if so, the computer name is returned.
$session = New-PSSession Domaincontroller
$computers = Invoke-Command -Session $session { Get-ADComputer -filter * } | select -ExpandProperty Name
$computers | % {
    if ($IncludeNoWinRM.IsPresent)
    {
        $ErrorActionPreference = "SilentlyContinue"
        $ping = Test-NetConnection $_
        if ($ping.PingSucceeded -eq 'True')
        {
            $_
        }
    }
    else
    {
        $ErrorActionPreference = "SilentlyContinue"
        $WinRM = Test-WSMan $_
        if ($WinRM)
        {
            $_
        }
    }
}
Is this the best way I can go to check my online computers? Does anyone have a faster and better idea?
Thanks!
A very quick solution is to use the -Quiet parameter of the Test-Connection cmdlet.
so for example:
$ping = Test-Connection "Computer" -Quiet -Count 1
if ($ping)
{
    "Online"
}
else
{
    "Offline"
}
If that's not fast enough for you, you can use the Send method of the System.Net.NetworkInformation.Ping class.
here's a sample function:
Function Test-Ping
{
    Param($computer = "127.0.0.1")
    $ping = new-object System.Net.NetworkInformation.Ping
    Try
    {
        [void]$ping.send($computer,1)
        $Online = $true
    }
    Catch
    {
        $Online = $False
    }
    Return $Online
}
Regarding running it on multiple computers, I suggest using runspaces, as they're the fastest multithreading you can get with PowerShell.
For more information see:
Runspaces vs Jobs
Basic Runspaces implementation
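To sketch the runspace-pool idea those links describe, an untested outline follows; the computer names are placeholders, not from the question:
$computers = 'PC01', 'PC02', 'PC03'    # placeholder names
$pool = [runspacefactory]::CreateRunspacePool(1, 20)
$pool.Open()
$jobs = foreach ($c in $computers) {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript({
        param($name)
        [pscustomobject]@{
            Computer = $name
            Online   = Test-Connection -ComputerName $name -Quiet -Count 1
        }
    }).AddArgument($c)
    [pscustomobject]@{ PowerShell = $ps; Handle = $ps.BeginInvoke() }
}
$results = foreach ($j in $jobs) {
    $j.PowerShell.EndInvoke($j.Handle)   # collect each result as it finishes
    $j.PowerShell.Dispose()
}
$pool.Close()
($results | Where-Object Online).Computer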
Boe Prox (master of runspaces) has written a function which is available from the Powershell Gallery. I've linked the script below.
He uses many of the answers already given to achieve the simultaneous examination of 100s of computers by name. The script gets WMI network information if test-connection succeeds. It should be fairly easy to adapt to get any other information you want, or just return the result of the test-connection.
The script actually uses runspace pools rather than straight runspaces to limit the amount of simultaneous threads that your loop can spawn.
Boe also wrote the PoSH-RSJob module already referenced. This script will achieve what you want in native PoSH without having to install his module.
https://gallery.technet.microsoft.com/scriptcenter/Speedy-Network-Information-5b1406fb

Powershell parallel set-variable

I have a script that works the way I want, but it's slow. I tried using the same method in a workflow with foreach -parallel, but the Set-Variable command is not something that can be used within a workflow. I wanted to see if the way I'm doing this is incorrect and if there's a better way. The reason I want to do parallel requests is that the script can take quite a long time to complete when expanded to 20+ servers, as it does each server in turn, whereas doing them all at once would be quicker.
Below is a dumbed down version of the script (that works without parallel foreach) but it's effectively what I need to get working:
$servers = @("server1", "server2");
foreach ($s in $servers) {
    $counter_value = get-counter "\\$s\counter_name"
    Set-Variable -name "${s}counter" -value $counter_value
    write-host ${server1counter}
}
Commands not supported in workflows need to be executed in an InlineScript block. Try (untested):
workflow t {
    $servers = @("server1", "server2");
    foreach -parallel ($s in $servers) {
        inlinescript {
            $counter_value = get-counter "\\$using:s\counter_name"
            Set-Variable -name "$($using:s)counter" -value $counter_value
            # write-host with a PerformanceCounterSampleSet isn't a good combination. You'll only get the typename since it's a complex type (multiple properties etc.)
            write-host (Get-Variable "$($using:s)counter" -ValueOnly)
        }
    }
}
t
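As a follow-up thought (not from the original answer): variables created inside the InlineScript don't survive outside the workflow, so a leaner pattern is usually to emit objects from the parallel loop and let workflow output collection gather them. A rough, untested sketch, reusing the placeholder counter path from the question:
workflow Get-Counters {
    param([string[]]$Servers)
    foreach -parallel ($s in $Servers) {
        inlinescript {
            # Emit one object per server instead of setting per-server variables
            [pscustomobject]@{
                Server  = $using:s
                Counter = get-counter "\\$using:s\counter_name"
            }
        }
    }
}
$results = Get-Counters -Servers 'server1', 'server2'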