Passing arguments to job initialization script - PowerShell

I have multiple jobs and for every job I want to have the same initialization script that sets some things up. I'd like to pass some arguments to the initialization script, but unfortunately the arguments passed using -ArgumentList seem to be only accessible in the actual job script.
Here's an example that demonstrates the argument only being accessible in the actual script:
function StartJob([ScriptBlock] $script, [string] $name, [ScriptBlock] $initialization_script = $null, $argument = $null)
{
Start-Job -ScriptBlock $script -Name $name -InitializationScript $initialization_script -ArgumentList $argument | Out-Null
}
[ScriptBlock] $initialization_script =
{
# The argument given to StartJob should be accessible here
param($test)
echo "Test: $test"
}
[ScriptBlock] $actual_script =
{
param($test)
echo "Test: $test"
}
StartJob $actual_script "Test job" $initialization_script "Have this string in the `$initialization_script"
@(Get-Job).ForEach({
# Wait for the job to finish, remove it and output its results
Write-Host "$($_.Name) results:"
Receive-Job -Job $_ -Wait -AutoRemoveJob | Write-Host
})
How would I be able to access the arguments passed in the $initialization_script?

AFAIK it's not possible to pass parameters to initialization scripts. Init scripts are designed to be reusable scriptblocks that load known resources. If something can't be defined once, then it's unique to that job's scriptblock and doesn't belong in an init script. You have a few alternatives:
If you have a module (.psm1 and maybe a .psd1), place it in one of the module folders (see $env:PSModulePath for paths) so that you can simply write Import-Module MyImportantModule in your initialization script.
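For example, a minimal sketch of that approach (MyImportantModule is just a placeholder name for a module discoverable via $env:PSModulePath):
$init = { Import-Module MyImportantModule }
Start-Job -ScriptBlock { Get-Date } -Name "Test job" -InitializationScript $init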
If you can't use the solution above, I would add a parameter to the actual script and pass in the path as a regular argument.
[ScriptBlock] $actual_script =
{
# The argument given to StartJob should be accessible here
param($test, $ModulePath)
#Import-Module $ModulePath
echo "Test: $test"
}
Start-Job -ScriptBlock $actual_script -Name "Test job" -ArgumentList "First argument", "c:\mymodule.ps1"
Or you could generate the initialization scriptblock in your script so it's dynamic:
$ModulePath = "c:\mymodule.ps1"
$init = @"
#Import-Module "$ModulePath"
#Something-Else
"#
$initsb = [scriptblock]::Create($init)
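The generated block can then be handed to Start-Job like any other initialization script; a quick sketch reusing $actual_script from the question:
Start-Job -ScriptBlock $actual_script -Name "Test job" -InitializationScript $initsb -ArgumentList "Some argument"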

Related

Send string parameters to a Start-Job script block

I need to initialize a job using the shell. The job will be a delay plus a call to a VBScript. The following code works fine. For my example, the VBScript is just a single line with a MsgBox "Hello world!"
$functions = {
Function execute_vbs {
param ([string]$path_VBScript, [int]$secs)
Start-Sleep -Seconds $secs
cscript /nologo $path_VBScript
}
}
$seconds = 2
Start-Job -InitializationScript $functions -ScriptBlock {execute_vbs -path_VBScript 'C:\Users\[USERNAME]\Desktop\hello_world.vbs' -secs $seconds} -Name MyJob
The problem comes the moment I want to parameterize the VBScript path (the idea is to make several calls to different VBScripts).
When I do this, the command seems to ignore the parameter input. I did other tests with an int parameter and they worked fine; the problem seems to be only with string parameters. The following code does not work:
$functions = {
Function execute_vbs {
param ([string]$path_VBScript, [int]$secs)
Start-Sleep -Seconds $secs
cscript /nologo $path_VBScript
}
}
$input = 'C:\Users\[USERNAME]\Desktop\hello_world.vbs'
$seconds = 2
Start-Job -InitializationScript $functions -ScriptBlock {execute_vbs -path_VBScript $input -secs $seconds} -Name MyJob
I've also tried using the -ArgumentList parameter, but it has the same problem.
Any idea?
The problem is that the $input and $seconds variables inside your script block are in a different scope and are effectively different variables from the ones in the main script.
I've modified your script slightly to remove the call to VBScript to make it easier to reproduce here - my example code is:
$functions = {
Function execute_vbs {
param ([string]$path_VBScript, [int]$secs)
Start-Sleep -Seconds $secs
write-output "filename = '$path_VBScript'"
write-output "secs = '$secs'"
}
}
Here are two ways to fix it:
The Using: scope modifier
See https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_scopes?view=powershell-7#the-using-scope-modifier for the full details, but basically:
For any script or command that executes out of session, you need the Using scope modifier to embed variable values from the calling session scope, so that out of session code can access them.
$filename = 'C:\Users\[USERNAME]\Desktop\hello_world.vbs'
$seconds = 2
$job = Start-Job -InitializationScript $functions -ScriptBlock {
execute_vbs -path_VBScript $using:filename -secs $using:seconds
} -Name MyJob
wait-job $job
receive-job $job
# output:
# filename = 'C:\Users\[USERNAME]\Desktop\hello_world.vbs'
# secs = '2'
Note the $using before the variable names inside the script block - this allows you to "inject" the variables from your main script into the scriptblock.
ScriptBlock Parameters
You can define parameters on the script block just as you would for a function, and then provide the values with the -ArgumentList parameter when you invoke Start-Job.
$filename = 'C:\Users\[USERNAME]\Desktop\hello_world.vbs'
$seconds = 2
$job = Start-Job -InitializationScript $functions -ScriptBlock {
param( [string] $f, [int] $s )
execute_vbs -path_VBScript $f -secs $s
} -ArgumentList @($filename, $seconds) -Name MyJob
wait-job $job
receive-job $job
# output:
# filename = 'C:\Users\[USERNAME]\Desktop\hello_world.vbs'
# secs = '2'

Function not accessible in a ScriptBlock

I have a script that defines some functions and then, in the very same script, multiple jobs that use those functions. When I start a new job, the functions don't seem to be accessible in the [ScriptBlock] that I have for my jobs.
Here's a minimal example demonstrating this:
# A simple test function
function Test([string] $string)
{
Write-Output "I'm a $string"
}
# My test job
[ScriptBlock] $test =
{
Test "test function"
}
# Start the test job
Start-Job -ScriptBlock $test -Name "Test" | Out-Null
# Wait for jobs to complete and print their output
@(Get-Job).ForEach({
Wait-Job -Job $_ | Out-Null
Receive-Job -Job $_ | Write-Host
})
# Remove the completed jobs
Remove-Job -State Completed
The error that I get in PowerShell ISE is:
The term 'Test' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
+ CategoryInfo : ObjectNotFound: (Test:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
+ PSComputerName : localhost
Start-Job runs jobs in separate PowerShell processes, so jobs do not have access to the session state of the calling PowerShell session. You need to define the functions used by the jobs in every job. An easy way to do that without duplicating the code is the -InitializationScript parameter, where all common functions can be defined.
$IS = {
function CommonFunction1 {
'Do something'
}
function CommonFunction2 {
'Do something else'
}
}
$SB1 = {
CommonFunction1
CommonFunction2
}
$SB2 = {
CommonFunction2
CommonFunction1
}
$Job1 = Start-Job -InitializationScript $IS -ScriptBlock $SB1
$Job2 = Start-Job -InitializationScript $IS -ScriptBlock $SB2
Receive-Job $Job1,$Job2 -Wait -AutoRemoveJob
Just extending PetSerAl's answer: you can use runspaces for this if you want faster code that is a bit better organised. Check out Stack Overflow question 39180266.
So when you run something in a different runspace, you need to import the functions in both of them. The finished structure would look like this:
Module: functions.ps1 - here you store the functions to share with both scopes.
Main script: script.ps1 - basically your script, with the runspaces, but without the functions from functions.ps1.
At the beginning of script.ps1, simply call Import-Module .\functions.ps1 to get access to your functions. Remember that a runspace has a different scope, and inside its scriptblock you have to call Import-Module once again. Full example:
#file functions.ps1
function add($inp) {
return $inp + 2
}
#file script.ps1
Import-Module .\functions.ps1 #or dot-source it: . .\functions.ps1
Import-Module .\invoke-parallel.ps1 #external module
$argument = 10 #it may be any object, even your custom class
$results = $argument | Invoke-Parallel -ScriptBlock {
Import-Module .\functions.ps1 #you may have to use an absolute path here, because PSScriptRoot may be different/undefined in a new runspace
return (add $_) # $_ is the object passed in from the calling scope (strictly speaking, the relationship between the scopes is not parent-child)
}
echo $results # it's 12
echo (add 5) # it's 7
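If you would rather not depend on the external Invoke-Parallel module, the same idea can be sketched with the built-in runspace API; the path and the value 10 below are only illustrative, and error handling is omitted:
$ps = [powershell]::Create()
[void]$ps.AddScript({
param($path, $value)
. $path #dot-source the shared functions inside the new runspace
add $value
}).AddArgument("$PSScriptRoot\functions.ps1").AddArgument(10)
$out = $ps.Invoke() #a collection holding 12
$ps.Dispose()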

Reuse PowerShell functions in Script Block [duplicate]

I feel like I'm missing something that should be obvious, but I just can't figure out how to do this.
I have a ps1 script that has a function defined in it. It calls the function and then tries using it remotely:
function foo
{
Param([string]$x)
Write-Output $x
}
foo "Hi!"
Invoke-Command -ScriptBlock { foo "Bye!" } -ComputerName someserver.example.com -Credential someuser@example.com
This short example script prints "Hi!" and then crashes saying "The term 'foo' is not recognized as the name of a cmdlet, function, script file, or operable program."
I understand that the function is not defined on the remote server because it is not in the ScriptBlock. I could redefine it there, but I'd rather not. I'd like to define the function once and use it either locally or remotely. Is there a good way to do this?
You need to pass the function itself (not a call to the function in the ScriptBlock).
I had the same need just last week and found this SO discussion
So your code will become:
Invoke-Command -ScriptBlock ${function:foo} -argumentlist "Bye!" -ComputerName someserver.example.com -Credential someuser@example.com
Note that by using this method, you can only pass parameters into your function positionally; you can't make use of named parameters as you could when running the function locally.
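To make the limitation concrete, here is a small sketch (the two-parameter version of foo is made up for the example; omitting -ComputerName makes Invoke-Command run locally):
function foo
{
Param([string]$x, [string]$y)
"$x / $y"
}
Invoke-Command -ScriptBlock ${function:foo} -ArgumentList "first", "second"
#Output: first / second -- the arguments bind strictly by position; named -x/-y binding is not available here.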
You can pass the definition of the function as a parameter, and then redefine the function on the remote server by creating a scriptblock and then dot-sourcing it:
$fooDef = "function foo { ${function:foo} }"
Invoke-Command -ArgumentList $fooDef -ComputerName someserver.example.com -ScriptBlock {
Param( $fooDef )
. ([ScriptBlock]::Create($fooDef))
Write-Host "You can call the function as often as you like:"
foo "Bye"
foo "Adieu!"
}
This eliminates the need to have a duplicate copy of your function. You can also pass more than one function this way, if you're so inclined:
$allFunctionDefs = "function foo { ${function:foo} }; function bar { ${function:bar} }"
You can also put the function(s) as well as the script in a file (foo.ps1) and pass that to Invoke-Command using the FilePath parameter:
Invoke-Command -ComputerName server -FilePath .\foo.ps1
The file will be copied to the remote computers and executed.
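For that approach, a hypothetical foo.ps1 would both define and call the function, since the whole file is executed on the remote machine:
function foo
{
Param([string]$x)
Write-Output $x
}
foo "Bye!"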
Although that's an old question I would like to add my solution.
Funnily enough, the param list of the scriptblock within function Test does not take an argument of type [scriptblock] and therefore needs conversion.
Function Write-Log
{
param(
[string]$Message
)
Write-Host -ForegroundColor Yellow "$($env:computername): $Message"
}
Function Test
{
$sb = {
param(
[String]$FunctionCall
)
[Scriptblock]$WriteLog = [Scriptblock]::Create($FunctionCall)
$WriteLog.Invoke("There goes my message...")
}
# Get function stack and convert to type scriptblock
[scriptblock]$writelog = (Get-Item "Function:Write-Log").ScriptBlock
# Invoke command and pass function in scriptblock form as argument
Invoke-Command -ComputerName SomeHost -ScriptBlock $sb -ArgumentList $writelog
}
Test
Another possibility is passing a hashtable to our scriptblock containing all the functions that you would like to have available in the remote session:
Function Build-FunctionStack
{
param([ref]$dict, [string]$FunctionName)
($dict.Value).Add((Get-Item "Function:${FunctionName}").Name, (Get-Item "Function:${FunctionName}").Scriptblock)
}
Function MyFunctionA
{
param([string]$SomeValue)
Write-Host $SomeValue
}
Function MyFunctionB
{
param([int]$Foo)
Write-Host $Foo
}
$functionStack = @{}
Build-FunctionStack -dict ([ref]$functionStack) -FunctionName "MyFunctionA"
Build-FunctionStack -dict ([ref]$functionStack) -FunctionName "MyFunctionB"
Function ExecuteSomethingRemote
{
$sb = {
param([Hashtable]$FunctionStack)
([Scriptblock]::Create($functionStack["MyFunctionA"])).Invoke("Here goes my message");
([Scriptblock]::Create($functionStack["MyFunctionB"])).Invoke(1234);
}
Invoke-Command -ComputerName SomeHost -ScriptBlock $sb -ArgumentList $functionStack
}
ExecuteSomethingRemote
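As a variation on the same idea (a sketch, not part of the original answer): instead of invoking the stored scriptblocks directly, the remote scriptblock could re-register them as named functions via the Function: drive, so they can be called normally:
$sb = {
param([Hashtable]$FunctionStack)
foreach ($name in $FunctionStack.Keys)
{
Set-Item -Path "Function:$name" -Value ([Scriptblock]::Create($FunctionStack[$name]))
}
MyFunctionA "Here goes my message"
MyFunctionB 1234
}
Invoke-Command -ComputerName SomeHost -ScriptBlock $sb -ArgumentList $functionStack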

PowerShell: Call functions outside scriptblock

I was reading this post about getting functions passed into a scriptblock for use with jobs:
Powershell start-job -scriptblock cannot recognize the function defined in the same file?
I get how that works by passing the function in as a variable, and it works for the simple example. What about a real-world solution though, is there a more elegant way of handling this?
I have a script I'm using to deploy changes to vendor software. It reads an XML file that tells it how to navigate the environment and performs the various tasks, i.e.: map drives, stop services, call a Perl installation script. I would like to provide a parameter to the script to allow it to run concurrently, so that if the Perl script takes 5 minutes (not uncommon) and you're rolling out to 11 servers you're not waiting for the script to run for an hour.
I'm just going to post some snippets since the full script is a little lengthy. A log function:
function Log
{
Param(
[parameter(ValueFromPipeline=$true)]
$InputObject,
[parameter()]
[alias("v")]
$verbosity = $debug
)
$messageIndex = [array]::IndexOf($verbosityArray, $verbosity)
$verbosityIndex = [array]::IndexOf($verbosityArray, $loggingVerbosity)
if($messageIndex -ge $VerbosityIndex)
{
switch($verbosity)
{
$debug {Write-Host $verbosity ": " $InputObject}
$info {Write-Host $verbosity ": " $InputObject}
$warn {Write-Host $verbosity ": " $InputObject -ForegroundColor yellow}
$error {Write-Host $verbosity ": " $InputObject -ForegroundColor red}
}
}
}
Here's another function that calls the log function:
function ExecuteRollout
{
param(
[parameter(Mandatory=$true)]
[alias("ses")]
$session,
[parameter(Mandatory=$true)]
$command
)
#invoke command
Invoke-Command -session $session -ScriptBlock {$res = cmd /v /k `"$args[0]`"} -args $command
#get the return code from the remote session
$res = Invoke-Command -session $session {$res}
Log ("Command Output: "+$res)
$res = [string] $res
$exitCode = $res.substring($res.IndexOf("ExitCode:"), 10)
$exitCode = $exitCode.substring(9,1)
Log ("Exit code: "+$exitCode)
return $exitCode
}
And lastly a snippet from my main so you can get an idea of what's going on. $target.Destinations.Destination contains all the servers the deployment will go to and the relevant information about them. I removed some variable setup and logging to make this more compact, so yes, you'll see variables referenced that are never defined:
#Execute concurrently
$target.Destinations.Destination | %{
$ScriptBlock = {
$destination = $args[0]
Log -v $info ("Starting remote session on: "+$destination.Server)
$session = New-PSSession -computerName $destination.Server
$InitializeRemote -session $session -destination $destination
#Gets a little tricky here, we need to keep the cmd session so it doesn't lose the sys vars set by env.bat
#String everything together with &'s
$cmdString = $destDrive + ": & call "+$lesDestDir+"data\env.bat & cd "+$rolloutDir+" & perl ..\JDH-rollout-2010.pl "+$rollout+" NC,r:\les & echo ExitCode:!errorlevel!"
Log ("cmdString: "+$cmdString)
Log -v $info ("Please wait, executing the rollout now...")
$exitCode = $ExecuteRollout -session $session -command $cmdString
Log ("ExitCode: "+$exitCode)
#respond to return code from rollout script
$HandleExitCode -session $session -destination $destination -exitCode $exitCode
$CleanUpRemote -session $session -destination $destination
}
Start-Job $ScriptBlock -Args $_
}
So if I go with the approach in the link, I'd be converting all my functions to variables and passing them in to the script block. Currently, my log function logs at DEBUG by default unless the verbosity parameter is explicitly passed as a different verbosity. If I convert my functions to variables, however, PowerShell doesn't seem to like this syntax:
$Log ("Print this to the log")
So I think I'd need to use the parameter all the time now:
$Log ("Print this to the log" -v $debug
So bottom line it looks like I just need to pass all my functions as variables to the script block and change some formatting when I call them. It's not a huge effort, but I'd like to know if there's a better way before I start hacking my script up. Thanks for the input and for looking, I know this is quite a long post.
I started another post about passing parameters to functions stored as variables; the answer to that also resolves this issue. That post can be found here:
Powershell: passing parameters to functions stored in variables
The short answer is that you can use the -InitializationScript parameter of Start-Job to feed all your functions in, if you wrap them in a block and store that in a variable.
Example:
# concurrency
$func = {
function Logx
{
param(
[parameter(ValueFromPipeline=$true)]
$msg
)
Write-Host ("OUT:"+$msg)
}
}
# Execution starts here
cls
$colors = @("red","blue","green")
$colors | %{
$scriptBlock =
{
Logx $args[0]
Start-Sleep 9
}
Write-Host "Processing: " $_
Start-Job -InitializationScript $func -scriptblock $scriptBlock -args $_
}
Get-Job
while(Get-Job -State "Running")
{
write-host "Running..."
Start-Sleep 2
}
# Output
Get-Job | Receive-Job
# Cleanup jobs
Remove-Job *
