Why does PowerShell code behave differently when called from a function?

Experienced C# developer learning PowerShell here, so I am sure I am just missing something silly. What I am trying to do is write a function that simply writes its input to a temporary file in JSON format. I have code that works fine if I run it 'inline', but that same code writes an empty file when called from a function.
Here is the code:
function Dump-File {
    param (
        [Parameter(Mandatory=$true)]
        $Input
    )
    $tmp = New-TemporaryFile
    $Input | ConvertTo-Json | Out-File $tmp.FullName
    Write-Output "Dump file written: $($tmp.FullName)"
}
$args = @{}
$args.Add('a', 1)
$args.Add('b', 2)
$args.Add('c', 3)
$args.Add('d', 4)
# results in json written to temp file
$tmp = New-TemporaryFile
$args | ConvertTo-Json | Out-File $tmp.FullName
Write-Output "args dumped: $($tmp.FullName)"
# results in empty temp file
Dump-File $args
Can anyone help me understand why the code called inline works but the same code does not work when I wrap it up as a function?

$Input is an automatic variable.
Changing the name of your Dump-File parameter to $somethingelse will resolve your problem. Never use $input as a parameter or variable name.
Automatic variables are to be considered read-only.
About Automatic Variables
SHORT DESCRIPTION
Describes variables that store state information for
PowerShell. These variables are created and maintained by PowerShell.
LONG DESCRIPTION
Conceptually, these variables are considered to be
read-only. Even though they can be written to, for backward
compatibility they should not be written to.
Here is a list of the automatic variables in PowerShell:
...
$INPUT
Contains an enumerator that enumerates all input that is passed to a function. The $input variable is available only to functions and script blocks (which are unnamed functions). In the Process block of a function, the $input variable enumerates the object that is currently in the pipeline. When the Process block completes, there are no objects left in the pipeline, so the $input variable enumerates an empty collection. If the function does not have a Process block, then in the End block, the $input variable enumerates the collection of all input to the function.
Source: About_Automatic_Variables
This information is also available through the Get-Help command:
Get-Help about_Automatic_Variables
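For illustration, here is the function from the question with the parameter renamed (the name $Data is arbitrary); with this change the JSON is written as expected:
function Dump-File {
    param (
        [Parameter(Mandatory=$true)]
        $Data
    )
    $tmp = New-TemporaryFile
    # $Data is an ordinary parameter now, not the automatic $input enumerator
    $Data | ConvertTo-Json | Out-File $tmp.FullName
    Write-Output "Dump file written: $($tmp.FullName)"
}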

Related

Powershell script parameter as variable name

I'm quite new to PowerShell scripting and I have hit a bump where I obviously don't know how to ask Google the right question.
I am writing a script to be called from a system that only allows me to add a single parameter to the command line, but I in fact need more values to execute the script.
My idea is to build a variable for each possible parameter, and then use the variable going forward (Simplified):
$name1= "value1","value2","value3"
$name2= "value4","value5","value6"
$name3= "value7","vlaue8","value9"
foreach ($value in $nameX) { }
and then call the script like: script.ps1 nameX
But how to convert the parameter into the name of the corresponding variable?
Or are there easier ways...?
You should be able to solve your problem with the Get-Variable cmdlet:
# The one and only argument passed: the name of a variable defined inside
# the script; e.g., 'name1'
$variableName = $args[0]
# Define the variables that the argument can refer to:
$name1= "value1","value2","value3"
$name2= "value4","value5","value6"
$name3= "value7","vlaue8","value9"
# Use Get-Variable to get a variable's value by name.
# (Error handling omitted for brevity.)
foreach ($value in (Get-Variable $variableName -ValueOnly)) {
    # ...
}
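Invocation would then look like this (assuming the code above is saved as script.ps1):
.\script.ps1 name2
# the loop then iterates over value4, value5, value6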

How to pass a custom function inside a ForEach-Object -Parallel

I can't find a way to pass the function. Just variables.
Any ideas without putting the function inside the ForEach loop?
function CustomFunction {
    Param (
        $A
    )
    Write-Host $A
}
$List = "Apple", "Banana", "Grape"
$List | ForEach-Object -Parallel {
    Write-Host $using:CustomFunction $_
}
The solution isn't quite as straightforward as one would hope:
# Sample custom function.
function Get-Custom {
    Param ($A)
    "[$A]"
}
# Get the function's definition *as a string*
$funcDef = ${function:Get-Custom}.ToString()
"Apple", "Banana", "Grape" | ForEach-Object -Parallel {
# Define the function inside this thread...
${function:Get-Custom} = $using:funcDef
# ... and call it.
Get-Custom $_
}
Note: This answer contains an analogous solution for using a script block from the caller's scope in a ForEach-Object -Parallel script block.
Note: If your function were defined in a module that is placed in one of the locations known to the module-autoloading feature, your function calls would work as-is with ForEach-Object -Parallel, without extra effort - but each thread would incur the cost of (implicitly) importing the module.
The above approach is necessary, because - aside from the current location (working directory) and environment variables (which apply process-wide) - the threads that ForEach-Object -Parallel creates do not see the caller's state, notably neither with respect to variables nor functions (and also not custom PS drives and imported modules).
As of PowerShell 7.2.x, an enhancement is being discussed in GitHub issue #12240 to support copying the caller's state to the parallel threads on demand, which would make the caller's functions automatically available.
Note that redefining the function in each thread via a string is crucial, as an attempt to make do without the aux. $funcDef variable and trying to redefine the function with ${function:Get-Custom} = ${using:function:Get-Custom} fails, because ${function:Get-Custom} is a script block, and the use of script blocks with the $using: scope specifier is explicitly disallowed in order to avoid cross-thread (cross-runspace) issues.
However, ${function:Get-Custom} = ${using:function:Get-Custom} would work with Start-Job; see this answer for an example.
It would not work with Start-ThreadJob, even though Start-ThreadJob currently allows you, syntactically, to write & ${using:function:Get-Custom} $_: there, ${using:function:Get-Custom} is preserved as a script block (unlike with Start-Job, where it is deserialized as a string, which is itself surprising behavior - see GitHub issue #11698), even though it shouldn't be. Direct cross-thread use of [scriptblock] instances causes obscure failures, which is why ForEach-Object -Parallel prevents it in the first place.
A similar loophole that leads to cross-thread issues even with ForEach-Object -Parallel is using a command-info object obtained in the caller's scope with Get-Command as the function body in each thread via the $using: scope: this too should be prevented, but isn't as of PowerShell 7.2.7 - see this post and GitHub issue #16461.
${function:Get-Custom} is an instance of namespace variable notation, which allows you to both get a function (its body as a [scriptblock] instance) and to set (define) it, by assigning either a [scriptblock] or a string containing the function body.
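A quick sketch of the get and set directions (function names are arbitrary):
function Say-Hi { 'hi' }
$body = ${function:Say-Hi}        # get: a [scriptblock] containing the body
${function:Say-Hello} = $body     # set: defines a new function from it
Say-Hello                         # -> hi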
I just figured out another way using Get-Command, which works with the call operator; $a ends up being a FunctionInfo object.
EDIT: I'm told this isn't thread safe, but I don't understand why.
function hi { 'hi' }
$a = get-command hi
1..3 | foreach -parallel { & $using:a }
hi
hi
hi
So I figured out another little trick that may be useful for people trying to add the functions dynamically, particularly if you might not know the name of it beforehand, such as when the functions are in an array.
# Store the current function list in a variable
$initialFunctions = Get-ChildItem Function:
# Source all .ps1 files in the current folder and all subfolders
Get-ChildItem . -Recurse | Where-Object { $_.Name -like '*.ps1' } |
    ForEach-Object { . "$($_.FullName)" }
# Get only the functions that were added above, and store them in an array
$functions = @()
Compare-Object $initialFunctions (Get-ChildItem Function:) -PassThru |
    ForEach-Object { $functions = @($functions) + @($_) }
1..3 | ForEach-Object -Parallel {
    # Pull the $functions array from the outer scope and set each function
    # to its definition
    $using:functions | ForEach-Object {
        Set-Content "Function:$($_.Name)" -Value $_.Definition
    }
    # Call one of the functions in the sourced .ps1 files by name
    SourcedFunction $_
}
The main "trick" of this is using Set-Content with Function: plus the function name, since PowerShell essentially treats each entry of Function: as a path.
This makes sense when you consider the output of Get-PSDrive. Since each of those entries can be used as a "Drive" in the same way (i.e., with the colon).
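Both behaviors are easy to verify (a quick sketch; prompt is a function that exists in any session):
Get-PSDrive                   # lists Function:, Variable:, Env:, Alias:, ... alongside C:
Get-Content Function:prompt   # reads the body of the built-in prompt function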

Passing an object from one script to another

I am having an issue passing an array member to another script. I have a VM build script that pulls from a CSV, and I end up with a $VM object with .Name, .CPU, .RAM, .IP, etc. I want to pass that VM object to another script (inside the new server), which can then act on it, but I am unable to do so. I have been testing the correct syntax just to pass a simple array, like below, but am still not successful:
CSV:
Name,NumCPU,MemoryGB,IPAddress
JLTest01,2,4,172.24.16.25
Script1:
Function TestMe {
    [CmdLetBinding()]
    Param (
        [Parameter(Mandatory, Position=1)]
        [array]$arr
    )
    $arr | Out-GridView
}
TestMe
Calling Script:
$aVMs = Import-Csv -Path "PathToCsv"
foreach ($VM in $aVMs) {
    $command = "<path>\TestMe.ps1 " + "-arr $($VM)"
    Invoke-Expression $command
}
This produces an error, which seems to occur when the array is parsed. The error states:
The term 'JLTest01' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:48 + ... \Desktop\TestMe.ps1 -arr @{Name=JLTest01; NumCPU ...
Just trying to figure out what I am doing wrong exactly, and what I need to do to pass the object to the second script.
Don't use Invoke-Expression (which is rarely the right tool and should generally be avoided for security reasons):
The stringification of the custom objects output by Import-Csv performed by $($VM) does not preserve the original objects; it results in a hashtable-like representation that isn't suitable for programmatic processing and breaks the syntax of the command line you're passing to Invoke-Expression.
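You can see the problematic stringification for yourself (using the question's CSV):
$VM = Import-Csv -Path "PathToCsv" | Select-Object -First 1
"$($VM)"   # -> @{Name=JLTest01; NumCPU=2; MemoryGB=4; IPAddress=172.24.16.25}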
Instead, just invoke your script directly:
$aVMs = Import-Csv -Path "PathToCsv"
.\TestMe.ps1 -arr $aVMs
Note that I'm passing $aVMs as a whole to your script, given that your -arr parameter is array-typed.
If you'd rather process the objects one by one, stick with the foreach approach (but then you should declare the type of your $arr parameter as [pscustomobject] rather than [array]):
$aVMs = Import-Csv -Path "PathToCsv"
foreach ($VM in $aVMs) {
    .\TestMe.ps1 -arr $VM
}
Another option is to declare $arr as accepting pipeline input, add a process block to your script, and then pipe $aVMs to your script ($aVMs | .\TestMe.ps1).
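A sketch of that pipeline-enabled variant (parameter name kept from the question; the process block runs once per piped object):
# TestMe.ps1
[CmdletBinding()]
Param (
    [Parameter(Mandatory, ValueFromPipeline)]
    [pscustomobject]$arr
)
begin   { $all = [System.Collections.Generic.List[object]]::new() }
process { $all.Add($arr) }      # collect each piped object
end     { $all | Out-GridView } # show them all at once
# Calling script:
$aVMs = Import-Csv -Path "PathToCsv"
$aVMs | .\TestMe.ps1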
Also, don't nest a function of the same name inside your .ps1 script that you call from within the script, especially not without passing the arguments through; scripts can directly declare parameters, just like functions:
[CmdLetBinding()]
Param (
    [Parameter(Mandatory, Position=1)]
    [array]$arr
)
$arr | Out-GridView

In Powershell, what is $input for? [duplicate]

This question already has an answer here:
Pass object[] into a function in PowerShell
I was trying to create a hash table,
$input = @{'G'=100;'E'=50;'D'=35;'A'=100}
and could not figure out for the life of me why it wouldn't display as usual with commands like Write-Host, or by simply entering $input. Write-Host returned System.Collections.ArrayList+ArrayListEnumeratorSimple; $input returned nothing. No error was thrown.
On a hunch I renamed the hash table, and boom, it appears as normal. Opening a new PowerShell tab in the ISE, I observe that the variable $input is offered by IntelliSense even though I have not defined it in this environment.
Now I'm curious: what is this system variable $input for? I'm on version 4.
It's an automatic variable:
$INPUT
Contains an enumerator that enumerates all input that is passed
to a function. The $input variable is available only to functions and
script blocks (which are unnamed functions). In the Process block of a
function, the $input variable enumerates the object that is currently
in the pipeline. When the Process block completes, there are no
objects left in the pipeline, so the $input variable enumerates an
empty collection. If the function does not have a Process block, then
in the End block, the $input variable enumerates the collection of all
input to the function.
This is also available in PowerShell:
Get-Help about_Automatic_Variables
I also have an open feature request for Set-StrictMode to handle detection of this.

Pass a variable to another script

I have two PowerShell scripts.
The first script has the following code:
$var = "abc"
$DIR = "C:\"
$SCRIPT_NAME = "abc.ps1"
&"${DIR}\${SCRIPT_NAME}" #execute the second script
If I want to pass the variable $var to the second script, how do I achieve that? What code do I need to put in both the first and the second script?
Parameters (Recommended): Use parameters to pass values to the second script.
Step2.ps1:
param ($myparameter)
write-host $myparameter
Step1.ps1:
$var = "abc"
$DIR = "C:\"
$SCRIPT_NAME = "step2.ps1"
&"${DIR}\${SCRIPT_NAME}" -myparameter $var
Alternative: You could also have used arguments via $args (extra values not bound to a parameter). You can access the first argument using $args[0]. I would, however, always recommend parameters, as arguments need to be passed in a specific order (if multiple arguments are passed), etc.
Step2.ps1:
write-host $args[0]
Step1.ps1:
$var = "abc"
$DIR = "C:\"
$SCRIPT_NAME = "step2.ps1"
&"${DIR}\${SCRIPT_NAME}" $var
There are several ways to do what you want, two of which have already been suggested by @Frode F.
Pass the variable as a (named) parameter:
# script1.ps1
$var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}" -Foo $var
# script2.ps1
Param($foo)
Write-Output $foo
This is the cleanest solution. You have a well-defined interface and pass the variable in a clear-cut way from one script to another.
Parameter definitions will also allow you to make a parameter mandatory (so that the script will ask the user to provide input if the parameter was omitted), require a particular data type, easily incorporate validation routines, or add comment-based help.
# script2.ps1
<#
.SYNOPSIS
Short description of the script or function.
.DESCRIPTION
Longer description of what the script or function actually does.
.PARAMETER Foo
Description of the parameter Foo.
#>
[CmdletBinding()]
Param(
    [Parameter(Mandatory=$true, Position=0, ValueFromPipeline=$true)]
    [ValidateRange(2,42)]
    [int]$foo
)
Write-Output $foo
See Get-Help about_Function_Advanced_Parameters for more information.
Pass the variable as an unnamed argument:
# script1.ps1
$var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}" $var
# script2.ps1
Write-Output $args[0]
This is the second best approach, because you still pass the variable in a clear-cut way, but the interface isn't as well defined as before.
Define the variable as an environment variable:
# script1.ps1
$env:var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}"
# script2.ps1
Write-Output $env:var
This is a less clean approach than the argument-based ones, as the variable is passed using a "side-channel" (the process environment, which is inherited by child processes).
Just define the variable in the first script and use it in the second one:
# script1.ps1
$var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}"
# script2.ps1
Write-Output $var
This will work as well, because by using the call operator (&) the second script is run in the same context as the first script and thus has access to the same variables. However, "passing" a variable like this will easily break if someone runs the second script in a different context/scope or modifies it without being aware of the implicit dependency.
If you want to go this route it's usually better to use the first script for variable (and function) definitions only, and dot-source it in the second script, so that the definitions are imported into the scope of the second script:
# script1.ps1
$var = 'foo'
# script2.ps1
. 'C:\path\to\script1.ps1'
Write-Output $var
Technically, passing values via a file would be another option (a minimal sketch follows the list below). However, I would recommend against using this approach for several reasons:
it's prone to errors due to improper permissions (could be mitigated by creating the file in the $env:TEMP folder),
it's prone to littering the filesystem if you don't clean up the file afterwards,
it needlessly generates disk I/O when simple in-memory operations provided by the language would suffice.
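For completeness, the file-based approach might look like this (file name and location are arbitrary):
# script1.ps1
$var = 'foo'
Set-Content -Path "$env:TEMP\passed-var.txt" -Value $var
& 'C:\some\folder\script2.ps1'
# script2.ps1
$var = Get-Content -Path "$env:TEMP\passed-var.txt"
Write-Output $var
Remove-Item "$env:TEMP\passed-var.txt"   # clean up to avoid littering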