I have a script that requires a number of parameters:
param ([string]$FOO = "foo", [string]$CFG = '\ps\bcpCopyCfg.ps1', [string]$CFROM = "none",
    [string]$CTO = "none", [switch]$HELP = $FALSE, [switch]$FULL = $FALSE, [string]$CCOL = "none",
    [string]$CDSQUERY = "none", [string]$CMSSRV = "none",
    [string]$CSYBDB = "none", [string]$CMSDB = "none")
when called from the command prompt e.g.
powershell .\bcpCopy.ps1 -CFROM earn_n_deduct_actg -CTO fin_earn_n_deduct_actg -CCOL f_edeh_doc_id
everything works fine.
I need, however, to start many (dozens of) instances of the script in parallel, and I've written a wrapper script that calls the one doing the actual work as a job.
I prepare an array with the arguments (including the keywords like "-CFG") and pass it to Start-Job:
# Prepare script block to be released
$ARGS=("-CFG ", $CFG, "-CSYBDB ", $SYBDB, "-CMSDB ",$MSDB, "-CFROM ", $SYBTBL, "-CTO ",$MSTBL)
if ($FULL) {
$ARGS = $ARGS + " -FULL"
} else {
$ARGS = $ARGS + " -CCOL $($args[5]) "
}
"Argument array:"
$ARGS
start-job -scriptblock {powershell.exe -file '\ps\bcpCopy.ps1'} -ArgumentList $ARGS
Unfortunately, the called script does not receive the arguments: the caller prints the array and it looks fine:
Argument array:
-CFG
\ps\bcpCopyCfgOAH.ps1
-CSYBDB
vnimisro
-CMSDB
IMIS_UNOV
-CFROM
earn_n_deduct_ref
-CTO
fin_earn_n_deduct_ref
-FULL
but the output from the called scripts says that the only parameter received is the configuration file -- all the rest are at their default values.
PS C:\ps> receive-job 1391
12/17/2010 10:54:14 Starting the upload of table none; source db none;
12/17/2010 10:54:14 Target table is none; target db is none;
12/17/2010 10:54:14 Config file is \ps\bcpCopyCfg.ps1.
12/17/2010 10:54:14 Target server (MS SQL) is secap900-new
12/17/2010 10:54:14 Source database must be specified. Exiting...
Can you please point out what I'm doing wrong?
I'm not sure what exactly you're trying to do, but this looks wrong:
start-job -scriptblock {
powershell.exe -file '\ps\bcpCopy.ps1'} -ArgumentList $ARGS
You are needlessly creating an entirely new PowerShell process. Try this instead:
start-job -scriptblock { & 'c:\ps\bcpCopy.ps1' @args } -ArgumentList $ARGS
The "#args" syntax is called "splatting." This will expand the passed arguments and ensure each element is treated as a parameter. The ampersand (&) is the "call" operator.
Another way is to put the @ before the array. I changed the $ARGS variable to $flags to differentiate it from the automatic $args variable inside the scriptblock ($ARGS and $args are the same variable, since PowerShell variable names are case-insensitive).
$flags = @("-CFG ", $CFG, "-CSYBDB ", $SYBDB, "-CMSDB ",$MSDB, "-CFROM ", $SYBTBL, "-CTO ",$MSTBL)
If($FULL) {
$flags = $flags + " -FULL"
}
Else {
$flags = $flags + " -CCOL $($args[5]) "
}
"Argument array:"
$flags
start-job -scriptblock {powershell.exe -file '\ps\bcpCopy.ps1' $args} -ArgumentList $flags
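For what it's worth, a tiny runnable sketch showing that -ArgumentList items land in the automatic $args variable inside the job's scriptblock (no param block needed):
$demo = { "received: $($args -join ' ')" }
Start-Job -ScriptBlock $demo -ArgumentList 'a', 'b' | Wait-Job | Receive-Job
# prints: received: a b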
Related
I'm looking for a method to create a wrapper around Invoke-Command that restores the current directory that I'm using on the remote machine before invoking my command. Here's what I tried to do:
function nice_invoke {
param(
[string]$Computer,
[scriptblock]$ScriptBlock
)
Set-PSDebug -Trace 0
$cwd = (Get-Location).Path
write-host "cmd: $cwd"
$wrapper = {
$target = $using:cwd
if (-not (Test-Path "$target")) {
write-host "ERROR: Directory doesn't exist on remote"
exit 1
}
else {
Set-Location $target
}
$sb = $using:ScriptBlock
$sb.Invoke() | out-host
}
# Execute Command on remote computer in Same Directory as Local Machine
Invoke-Command -Computer pv3039 -ScriptBlock $wrapper
}
Command Line:
PS> nice_invoke -Computer pv3039 -ScriptBlock {get-location |out-host; get-ChildItem | out-host }
Error Message:
Method invocation failed because [System.String]
does not contain a method named 'Invoke'.
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : MethodNotFound
+ PSComputerName : pv3039
You can't pass a ScriptBlock like this with the $using: scope; it will get rendered to a string literal first. Use the [ScriptBlock]::Create(string) method within your $wrapper block instead, to create a ScriptBlock from a String:
$sb = [ScriptBlock]::Create($using:ScriptBlock)
$sb.Invoke() | Out-Host
Alternatively, you could use Invoke-Command -ArgumentList $ScriptBlock, but you still have the same issue of the ScriptBlock getting rendered as a string. Nonetheless, here is an example for this case as well:
# Call `Invoke-Command -ArgumentList $ScriptBlock`
# $args[0] is the first argument passed into the `Invoke-Command` block
$sb = [ScriptBlock]::Create($args[0])
$sb.Invoke() | Out-Host
Note: While I kept the format here the way you were attempting to run the ScriptBlock in your original code, the idiomatic way to run ScriptBlocks locally (from the perspective of the nested ScriptBlock, it is a local execution on the remote machine) is to use the call operator, like & $sb, rather than $sb.Invoke().
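A quick local sketch of that difference:
$sb = { Get-Date }
& $sb          # call operator: invokes the scriptblock and streams its output
$sb.Invoke()   # method call: returns a Collection[PSObject] rather than streaming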
With either approach, the nested ScriptBlock will now execute from within the wrapper block. This limitation is similar to how some other types are incompatible with shipping across remote connections, or will not survive serialization with Export-CliXml/Import-CliXml; it is simply a limitation of the ScriptBlock type.
Worth noting: this limitation persists whether you use Invoke-Command or another cmdlet that initiates execution via a child PowerShell session, such as Start-Job. So the solution is the same either way.
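To see that serialization behavior for yourself, here is a small local experiment (the temp-file path is arbitrary):
$sb = { Get-Date }
$sb | Export-CliXml -Path "$env:TEMP\sb.clixml"
$restored = Import-CliXml -Path "$env:TEMP\sb.clixml"
$restored.GetType().Name                      # String -- the ScriptBlock did not survive
([scriptblock]::Create($restored)).Invoke()   # rebuild it and run it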
function nice_invoke {
param(
[string]$Computer,
[scriptblock]$ScriptBlock
)
Set-PSDebug -Trace 0
$cwd = (Get-Location).Path
write-host "cmd: $cwd"
$wrapper = {
$target = $using:cwd
if (-not (Test-Path "$target")) {
write-host "ERROR: Directory doesn't exist on remote"
exit 1
}
else {
Set-Location $target
}
$sb = [scriptblock]::Create($using:ScriptBlock)
$sb.Invoke()
}
# Execute Command on remote computer in Same Directory as Local Machine
Invoke-Command -ComputerName $Computer -ScriptBlock $wrapper
}
nice_invoke -Computer pv3039 -ScriptBlock {
hostname
get-location
#dir
}
I have multiple jobs and for every job I want to have the same initialization script that sets some things up. I'd like to pass some arguments to the initialization script, but unfortunately the arguments passed using -ArgumentList seem to be only accessible in the actual job script.
Here's an example that demonstrates the argument only being accessible in the actual script:
function StartJob([ScriptBlock] $script, [string] $name, [ScriptBlock] $initialization_script = $null, $argument = $null)
{
Start-Job -ScriptBlock $script -Name $name -InitializationScript $initialization_script -ArgumentList $argument | Out-Null
}
[ScriptBlock] $initialization_script =
{
# The argument given to StartJob should be accessible here
param($test)
echo "Test: $test"
}
[ScriptBlock] $actual_script =
{
param($test)
echo "Test: $test"
}
StartJob $actual_script "Test job" $initialization_script "Have this string in the `$initialization_script"
@(Get-Job).ForEach({
# Wait for the job to finish, remove it and output its results
Write-Host "$($_.Name) results:"
Receive-Job -Job $_ -Wait -AutoRemoveJob | Write-Host
})
How would I be able to access the arguments passed in the $initialization_script?
AFAIK it's not possible to pass parameters to initialization scripts. Init scripts are designed to be reusable scriptblocks that load known resources. If something can't be defined once, then it's unique to that job's scriptblock and doesn't belong in an init script. You have a few alternatives:
If you have a module (a .psm1 and maybe a .psd1), place it in one of the module folders (see $env:PSModulePath for the paths) so you can simply write Import-Module MyImportantModule in your initialization script.
If you can't use the solution above, I would add a parameter to the actual script and pass in the path as a regular argument.
[ScriptBlock] $actual_script =
{
# The argument given to StartJob should be accessible here
param($test, $ModulePath)
Import-Module $ModulePath
echo "Test: $test"
}
Start-Job -ScriptBlock $actual_script -Name "Test job" -ArgumentList "First argument", "c:\mymodule.ps1"
Or you could generate the initialization scriptblock in your script so it's dynamic:
$ModulePath = "c:\mymodule.ps1"
$init = #"
#Import-Module "$ModulePath"
#Something-Else
"#
$initsb = [scriptblock]::Create($init)
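The generated scriptblock can then be passed to Start-Job like any other init script; a short sketch, keeping in mind that the module path above is only a placeholder:
Start-Job -InitializationScript $initsb -ScriptBlock {
    # anything the init script imported or defined is available here
    Get-Command -CommandType Function | Select-Object -First 5
} | Wait-Job | Receive-Job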
I have a script that has some functions and then multiple jobs in the very same script that use those functions. When I start a new job they don't seem to be accessible in the [ScriptBlock] that I have for my jobs.
Here's a minimal example demonstrating this:
# A simple test function
function Test([string] $string)
{
Write-Output "I'm a $string"
}
# My test job
[ScriptBlock] $test =
{
Test "test function"
}
# Start the test job
Start-Job -ScriptBlock $test -Name "Test" | Out-Null
# Wait for jobs to complete and print their output
@(Get-Job).ForEach({
Wait-Job -Job $_ |Out-Null
Receive-Job -Job $_ | Write-Host
})
# Remove the completed jobs
Remove-Job -State Completed
The error that I get in PowerShell ISE is:
The term 'Test' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
+ CategoryInfo : ObjectNotFound: (Test:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
+ PSComputerName : localhost
Start-Job runs jobs in separate PowerShell processes, so jobs do not have access to the session state of the calling PowerShell session. You need to define the functions used by a job within that job. An easy way to do that without duplicating code is the -InitializationScript parameter, where all the common functions can be defined.
$IS = {
function CommonFunction1 {
'Do something'
}
function CommonFunction2 {
'Do something else'
}
}
$SB1 = {
CommonFunction1
CommonFunction2
}
$SB2 = {
CommonFunction2
CommonFunction1
}
$Job1 = Start-Job -InitializationScript $IS -ScriptBlock $SB1
$Job2 = Start-Job -InitializationScript $IS -ScriptBlock $SB2
Receive-Job $Job1,$Job2 -Wait -AutoRemoveJob
Just to extend PetSerAl's answer: you can use runspaces for this if you want faster and somewhat better organised code. Check out this question:
39180266
When you run something in a different runspace, you need to import your functions in both of them. The finished structure would look like this:
Module: functions.ps1 - here you store the functions shared by both scopes.
Main script: script.ps1 - basically your script, with runspaces, but without the functions from functions.ps1.
At the beginning of script.ps1, simply call Import-Module .\functions.ps1 to get access to your functions. Remember that a runspace has a different scope, so in its scriptblock you have to call Import-Module once again. Full example:
#file functions.ps1
function add($inp) {
return $inp + 2
}
#file script.ps1
Import-Module .\functions.ps1 # or you can use a "dot call": . .\functions.ps1
Import-Module .\invoke-parallel.ps1 # it's an external module
$argument = 10 # it may be any object, even your custom class
$results = $argument | Invoke-Parallel -ScriptBlock {
Import-Module .\functions.ps1 # you may have to use an absolute path here, because in a new runspace $PSScriptRoot may be different/undefined
return (add $_) # $_ is simply the object passed in from the "parent" scope; in fact, the relationship between the scopes is not child-parent
}
echo $results # it's 12
echo (add 5) # it's 7
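On PowerShell 7+ you can get the same runspace-based parallelism without the external Invoke-Parallel module by using the built-in ForEach-Object -Parallel; the same rule applies, so the functions must be re-imported inside the block (a sketch under that assumption):
# file script7.ps1 (PowerShell 7+ only)
$results = 10, 20 | ForEach-Object -Parallel {
    . "$using:PSScriptRoot\functions.ps1"   # dot-source the shared functions into this runspace
    add $_
}
$results   # 12 and 22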
I was reading this post about getting functions passed into a scriptblock for use with jobs:
Powershell start-job -scriptblock cannot recognize the function defined in the same file?
I get how that works by passing the function in as a variable, and it works for the simple example. What about a real-world solution, though: is there a more elegant way of handling this?
I have a script I'm using to deploy changes to vendor software. It reads an XML file that tells it how to navigate the environment and performs the various tasks, i.e. map drives, stop services, call a Perl installation script. I would like to provide a parameter to the script to allow it to run concurrently; this way, if the Perl script takes 5 minutes (not uncommon) and you're rolling out to 11 servers, you're not waiting for the script to run for an hour.
I'm just going to post some snippets since the full script is a little lengthy. A log function:
function Log
{
Param(
[parameter(ValueFromPipeline=$true)]
$InputObject,
[parameter()]
[alias("v")]
$verbosity = $debug
)
$messageIndex = [array]::IndexOf($verbosityArray, $verbosity)
$verbosityIndex = [array]::IndexOf($verbosityArray, $loggingVerbosity)
if($messageIndex -ge $VerbosityIndex)
{
switch($verbosity)
{
$debug {Write-Host $verbosity ": " $InputObject}
$info {Write-Host $verbosity ": " $InputObject}
$warn {Write-Host $verbosity ": " $InputObject -ForegroundColor yellow}
$error {Write-Host $verbosity ": " $InputObject -ForegroundColor red}
}
}
}
Here's another function that calls the log function:
function ExecuteRollout
{
param(
[parameter(Mandatory=$true)]
[alias("ses")]
$session,
[parameter(Mandatory=$true)]
$command
)
#invoke command
Invoke-Command -session $session -ScriptBlock {$res = cmd /v /k `"$args[0]`"} -args $command
#get the return code from the remote session
$res = Invoke-Command -session $session {$res}
Log ("Command Output: "+$res)
$res = [string] $res
$exitCode = $res.substring($res.IndexOf("ExitCode:"), 10)
$exitCode = $exitCode.substring(9,1)
Log ("Exit code: "+$exitCode)
return $exitCode
}
And lastly a snippet from my main so you can get an idea of what's going on. $target.Destinations.Destination will contain all the servers the deployment will go to, along with relevant information about them. I removed some variable setup and logging to make this more compact, so, yes, you'll see variables referenced that are never defined:
#Execute concurrently
$target.Destinations.Destination | %{
$ScriptBlock = {
$destination = $args[0]
Log -v $info ("Starting remote session on: "+$destination.Server)
$session = New-PSSession -computerName $destination.Server
$InitializeRemote -session $session -destination $destination
#Gets a little tricky here, we need to keep the cmd session so it doesn't lose the sys vars set by env.bat
#String everything together with &'s
$cmdString = $destDrive + ": & call "+$lesDestDir+"data\env.bat & cd "+$rolloutDir+" & perl ..\JDH-rollout-2010.pl "+$rollout+" NC,r:\les & echo ExitCode:!errorlevel!"
Log ("cmdString: "+$cmdString)
Log -v $info ("Please wait, executing the rollout now...")
$exitCode = $ExecuteRollout -session $session -command $cmdString
Log ("ExitCode: "+$exitCode)
#respond to return code from rollout script
$HandleExitCode -session $session -destination $destination -exitCode $exitCode
$CleanUpRemote -session $session -destination $destination
}
Start-Job $ScriptBlock -Args $_
}
So if I go with the approach in the link, I'd be converting all my functions to variables and passing them in to the script block. Currently, my log function will by default log at DEBUG unless the verbosity parameter is explicitly passed as a different verbosity. If I convert my functions to variables, however, PowerShell doesn't seem to like this syntax:
$Log ("Print this to the log")
So I think I'd need to use the parameter all the time now:
$Log ("Print this to the log" -v $debug)
So, bottom line, it looks like I just need to pass all my functions as variables to the script block and change some formatting when I call them. It's not a huge effort, but I'd like to know if there's a better way before I start hacking my script up. Thanks for the input and for looking; I know this is quite a long post.
I started another post about passing parameters to functions stored as variables; the answer to that also resolves this issue. That post can be found here:
Powershell: passing parameters to functions stored in variables
The short answer is that you can use the -InitializationScript parameter of Start-Job to feed all your functions in, if you wrap them in a block and store that in a variable.
Example:
# concurrency
$func = {
function Logx
{
param(
[parameter(ValueFromPipeline=$true)]
$msg
)
Write-Host ("OUT:"+$msg)
}
}
# Execution starts here
cls
$colors = @("red","blue","green")
$colors | %{
$scriptBlock =
{
Logx $args[0]
Start-Sleep 9
}
Write-Host "Processing: " $_
Start-Job -InitializationScript $func -scriptblock $scriptBlock -args $_
}
Get-Job
while(Get-Job -State "Running")
{
write-host "Running..."
Start-Sleep 2
}
# Output
Get-Job | Receive-Job
# Cleanup jobs
Remove-Job *
I'm trying to run a job in the background that runs a .exe with parameters, where the executable's path contains spaces. For example:
$exec = "C:\Program Files\foo.exe"
and I want to run this with parameters:
foo.exe /param1 /param2, etc.
I know that Start-Job does this, but I've tried tons of different combinations and it either gives me an error because of the whitespace or because of the parameters. Can someone help me out with the syntax here? I need to assume that $exec is the path of the executable, because it is part of a configuration file and could change later on.
One way to do this is to use a script block with a param block.
If there is a single argument with a space in it, such as a file/folder path, it should be quoted so it is treated as a single item. The arguments are passed to the script block as an array.
This example uses a script block, but you can also use a PowerShell script via the -FilePath parameter of the Start-Job cmdlet instead of the -ScriptBlock parameter.
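For instance, a minimal sketch of that -FilePath variant; the script path and its parameter names here are hypothetical:
# Contents of C:\scripts\Run-Foo.ps1:
#   param([string]$ExePath, [string[]]$ExeArgs)
#   & $ExePath @ExeArgs
$job = Start-Job -FilePath 'C:\scripts\Run-Foo.ps1' -ArgumentList 'C:\Program Files\foo.exe', ('/param1', '/param2')
Wait-Job $job | Receive-Job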
Here is another example that has arguments with spaces:
$scriptBlock = {
param (
[string] $Source,
[string] $Destination
)
$output = & xcopy $Source $Destination 2>&1
return $output
}
$job = Start-Job -scriptblock $scriptBlock -ArgumentList 'C:\My Folder', 'C:\My Folder 2'
Wait-Job $job
Receive-Job $job
Here is an example using the $args built-in variable instead of the param block.
$scriptBlock = {
$output = & xcopy $args 2>&1
return $output
}
$path1 = "C:\My Folder"
$path2 = "C:\My Folder 2"
"hello world" | Out-File -FilePath "$path1\file.txt"
$job = Start-Job -scriptblock $scriptBlock -ArgumentList $path1, $path2
Wait-Job $job
Receive-Job $job
Andy's trick generally works very well. If you have parameter sets, or otherwise want to move complex information into the job, you can also try this technique:
$jobArgs = @{Source="foo"; Destination="bar"}
$jobArgs | Export-CliXml -Path "$env:TEMP\MyArgs.clixml"
and in the job...
Start-Job {
.... $jobArgs = Import-CliXml -Path "$env:TEMP\MyArgs.clixml"
} | Wait-Job | Receive-Job
I use both approaches routinely.
I use the -ArgumentList / -ScriptBlock parameters when:
I am not dealing with parameter sets
I'm using an in-memory job (like the -AsJob capability in ShowUI or WPK) where the arguments are real objects, and they cannot die
I'm running in a user context where I can run a job but can't store to disk (web servers, ISO-compliant labs, etc.)
If I need complex arguments and they don't need to be passed in memory (or it's otherwise convenient to have them on disk later), I'll use the hashtable approach, as in the sketch below.
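Putting the pieces together, a self-contained sketch of the CliXml hand-off (the keys and paths are illustrative):
# Writer side: persist the complex arguments to disk
$jobArgs = @{ Source = 'C:\My Folder'; Destination = 'C:\My Folder 2' }
$jobArgs | Export-CliXml -Path "$env:TEMP\MyArgs.clixml"
# Job side: rehydrate the hashtable and use it
Start-Job {
    $jobArgs = Import-CliXml -Path "$env:TEMP\MyArgs.clixml"
    "Copying from $($jobArgs.Source) to $($jobArgs.Destination)"
} | Wait-Job | Receive-Job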
Hope this helps!