Is there any way to run parallel programmed functions in PowerShell?
Something like:
Function BuildParallel($configuration)
{
    $buildJob = {
        param($configuration)
        Write-Host "Building with configuration $configuration."
        RunBuilder $configuration;
    }
    $unitJob = {
        param()
        Write-Host "Running unit."
        RunUnitTests;
    }
    Start-Job $buildJob -ArgumentList $configuration
    Start-Job $unitJob
    While (Get-Job -State "Running")
    {
        Start-Sleep 1
    }
    Get-Job | Receive-Job
    Get-Job | Remove-Job
}
This does not work: it complains about not recognizing "RunUnitTests" and "RunBuilder", which are functions declared in the same script file. Apparently this happens because the script block is a new context and knows nothing about the functions declared in the same file.
I could try to use -InitializationScript in Start-Job, but both RunUnitTests and RunBuilder call more functions declared in the same file or referenced from other files, so...
I'm sure there's a way to do this, since it's just modular programming (functions, routines and all that stuff).
You could have the functions in a separate file and import them into the current context wherever needed via dot-sourcing. I do this in my PowerShell profile, so some of my custom functions are available in every session.
$items = Get-ChildItem "$PSprofilePath\functions"
$items | ForEach-Object {
    . $_.FullName
}
If you wanted to import a single file, it would just be:
. C:\some\path\RunUnitTests.ps1
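Applied to the original question, the same dot-sourcing can happen inside each job via -InitializationScript, so the helper functions exist in the job's context. A minimal sketch, assuming the shared functions live in a hypothetical C:\build\BuildFunctions.ps1:
$init = { . C:\build\BuildFunctions.ps1 }   # hypothetical path; dot-sources RunBuilder, RunUnitTests, etc.

Function BuildParallel($configuration)
{
    Start-Job { param($configuration) RunBuilder $configuration } -ArgumentList $configuration -InitializationScript $init
    Start-Job { RunUnitTests } -InitializationScript $init
    Get-Job | Wait-Job | Receive-Job
    Get-Job | Remove-Job
}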
Related
Assuming Get-Foo, Get-Foo2 and Deploy-Jobs are three functions that are part of a very large module, I would like to use Get-Foo and Get-Foo2 in Deploy-Jobs's Start-ThreadJob (below) without reloading the entire module each time.
Is a working example available for how to do this?
function Deploy-Jobs {
    foreach ($Device in $Devices) {
        Start-ThreadJob -Name $Device -ThrottleLimit 50 -InitializationScript $initScript -ScriptBlock {
            param($Device)
            Get-Foo | Get-Foo2 -List
        } -ArgumentList $Device | Out-Null
    }
}
The method you can use to pass a function's definition to a different scope is the same for Invoke-Command (when PSRemoting), Start-Job, Start-ThreadJob and ForEach-Object -Parallel. Since you want to invoke two different functions in your job's script block, I don't think -InitializationScript is an option, and even if it were, it might make the code more complicated than it should be.
You can use this as an example of how to store two function definitions in an array ($def), which is then passed to the scope of each ThreadJob; the array is then used to define each function in that scope so each job can call them.
function Say-Hello {
    "Hello world!"
}
function From-ThreadJob {
    param($i)
    "From ThreadJob # $i"
}
$def = @(
    ${function:Say-Hello}.ToString()
    ${function:From-ThreadJob}.ToString()
)
function Run-Jobs {
    param($numberOfJobs, $functionDefinitions)
    $jobs = foreach ($i in 1..$numberOfJobs) {
        Start-ThreadJob -ScriptBlock {
            # bring the function definitions into this scope
            $helloFunc, $threadJobFunc = $using:functionDefinitions
            # define them in this scope
            ${function:Say-Hello} = $helloFunc
            ${function:From-ThreadJob} = $threadJobFunc
            # sleep a random number of seconds
            Start-Sleep (Get-Random -Maximum 10)
            # combine the output from both functions
            (Say-Hello) + (From-ThreadJob -i $using:i)
        }
    }
    Receive-Job $jobs -AutoRemoveJob -Wait
}
Run-Jobs -numberOfJobs 10 -functionDefinitions $def
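The same technique carries over to ForEach-Object -Parallel (PowerShell 7+). A minimal sketch reusing the $def array defined above:
1..10 | ForEach-Object -Parallel {
    # bring the function definitions into this runspace and redefine them
    $helloFunc, $threadJobFunc = $using:def
    ${function:Say-Hello} = $helloFunc
    ${function:From-ThreadJob} = $threadJobFunc
    (Say-Hello) + (From-ThreadJob -i $_)
} -ThrottleLimit 5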
I'm having some problems getting a Start-Job script block to output to a file. The following three lines of code work without any problem:
$about_name = "C:\0\ps_about_name.txt"
$about = get-help about_* | select Name,Synopsis
if (-not (Test-Path $about_name)) { ($about | select Name | sort Name | Out-String).replace("[Aa]bout_", "") > $about_name }
The file is created in C:\0\
But I need to do a lot of collections like this, so I naturally looked at stacking them in parallel as separate jobs. I followed online examples and so put the last line in the above as a script block invoked by Start-Job:
Start-Job { if (-not (Test-Path $about_name)) { ($about | select Name | sort Name | Out-String).replace("[Aa]bout_", "") > $about_name } }
The job is created, goes to status Running, and then to status Completed, but no file is created. Without Start-Job it all works; with Start-Job, nothing. I've tried a lot of variations on this but cannot get it to create the file. Can someone advise what I am doing wrong here, please?
IMO, the simplest way to get around this problem is by use of the $using scope modifier.
$about_name = "C:\0\ps_about_name.txt"
$about = get-help about_* | select Name,Synopsis
$sb = {
    if (-not (Test-Path $using:about_name)) {
        $using:about.Name -replace '^about_' | Sort-Object > $using:about_name
    }
}
Start-Job -Scriptblock $sb
Explanation:
$using allows you to access local variables in a remote command. This is particularly useful when running Start-Job and Invoke-Command. The syntax is $using:localvariable.
This particular problem is a variable scope issue. Start-Job creates a background job with its own scope. When using -Scriptblock parameter, you are working within that scope. It does not know about variables defined in your current scope/session. Therefore, you must use a technique that will define the variable within the scope, pass in the variable's value, or access the local scope from the script block. You can read more about scopes at About_Scopes.
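A quick way to see the scope boundary for yourself (a two-line demo):
$x = 'outer'
Start-Job { "job sees: [$x]" } | Receive-Job -Wait -AutoRemoveJob   # prints "job sees: []" because $x is not defined in the job's scope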
As an aside, character sets ([]) are not supported by the .NET .Replace() method; you need to switch to the -replace operator to use those. I updated the code to perform the replace case-insensitively with -replace.
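For illustration, a quick comparison (hypothetical input string):
'About_Scopes'.Replace('[Aa]bout_', '')   # no change; .Replace() treats '[Aa]bout_' literally
'About_Scopes' -replace '^about_'         # 'Scopes'; -replace is regex-based and case-insensitive by default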
HCM's perfectly fine solution uses a technique that passes the value into the job's script block: by defining a parameter within the script block, you can pass a value into that parameter by use of -ArgumentList.
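A sketch of that technique applied to this case:
$sb = {
    param($aboutName, $about)
    if (-not (Test-Path $aboutName)) {
        $about.Name -replace '^about_' | Sort-Object > $aboutName
    }
}
Start-Job -ScriptBlock $sb -ArgumentList $about_name, $about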
Another option is to just define your variables within the Start-Job script block.
$sb = {
    $about_name = "C:\0\ps_about_name.txt"
    $about = get-help about_* | select Name,Synopsis
    if (-not (Test-Path $about_name)) {
        $about.Name -replace '^about_' | Sort-Object > $about_name
    }
}
Start-Job -Scriptblock $sb
You've got to send your parameters to your job.
This does not work:
$file = "C:\temp\_mytest.txt"
start-job {"_" | out-file $file}
While this does:
$file = "C:\temp\_mytest.txt"
start-job -ArgumentList $file -scriptblock {
    Param($file)
    "_" | out-file $file
}
I have a script that works the way I want, but it's slow. I tried using the same method in a workflow with foreach -parallel, but Set-Variable is not a command that can be used within a workflow. I wanted to see whether the way I'm doing this is incorrect and whether there's a better way to get what I want. The reason I want parallel requests is that the script can take quite a long time to complete when expanded to 20+ servers, since it processes each server in turn; being able to do them all in one go would be quicker.
Below is a simplified version of the script (which works without parallel foreach), but it's effectively what I need to get working:
$servers = @("server1", "server2");
foreach ($s in $servers) {
    $counter_value = get-counter "\\$s\counter_name"
    Set-Variable -name "${s}counter" -value $counter_value
    write-host ${server1counter}
}
Commands not supported in workflows need to be executed in an InlineScript. Try (untested):
workflow t {
    $servers = @("server1", "server2");
    foreach -parallel ($s in $servers) {
        inlinescript {
            $counter_value = get-counter "\\$using:s\counter_name"
            Set-Variable -name "$($using:s)counter" -value $counter_value
            # write-host with a PerformanceCounterSampleSet isn't a good combination.
            # You'll only get the type name, since it's a complex type (multiple properties etc.)
            write-host (Get-Variable "$($using:s)counter" -ValueOnly)
        }
    }
}
t
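Note that variables set inside an InlineScript don't survive past it, so if you need the samples afterwards, a simpler pattern is to emit objects and collect the workflow's output. A minimal sketch, assuming the same counter path as above:
workflow Get-CounterSamples {
    param([string[]]$Servers)
    foreach -parallel ($s in $Servers) {
        inlinescript {
            # emit one object per server; the workflow's output collects them all
            [pscustomobject]@{
                Server = $using:s
                Sample = get-counter "\\$using:s\counter_name"
            }
        }
    }
}
$results = Get-CounterSamples -Servers server1, server2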
Let's say I have a simple scope that is book-ended with Push-Location and Pop-Location:
Function MyFunction($Location)
{
    Push-Location $Location
    # do other stuff here
    Pop-Location
}
Is there any way to set it up at the beginning of the scope so that I don't have to remember to put the Pop-Location at the end? Something like this:
Function MyFunction($Location)
{
    Setup-BothPushAndPopHere $Location
    # do other stuff here
    # at the end of the scope, Pop-Location is automatically called
}
Short answer: No.
My take on Push-Location and Pop-Location is that you should generally avoid them, and adapt your script to use the path names in commands instead; in other words, instead of:
Push-Location $Location
Get-ChildItem
Pop-Location
Just do:
Get-ChildItem $Location
(simplified example)
If you must use the pattern, consider try/finally:
Push-Location $Location
try {
    # ...
} finally {
    Pop-Location
}
This helps with unexpected exceptions or the user interrupting program execution.
I typically use the try/finally pattern when the code is outside my control; most often when loading the SQLPS module, since it changes the current location to the SQL Server provider, which in my experience makes everything that uses the current location much slower.
As Eris points out, it's also useful when dealing with native applications. This can be especially true if it's painful to escape quotes around path names with spaces, or the application wouldn't handle it correctly anyway.
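For example, a hypothetical native tool that mishandles quoted paths with spaces can be run from its own directory instead:
Push-Location 'C:\Program Files\Some Tool'   # hypothetical install path
try {
    & .\sometool.exe -out report.txt         # hypothetical native executable and arguments
} finally {
    Pop-Location
}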
Depending on what you're trying to do in the # do other stuff here code, you could execute those commands as a script block in a child process. A skeleton-code example:
$ScriptBlock = {
    Push-Location $using:Location   # the job has its own scope, so reference the caller's variable via $using: (or pass it with -ArgumentList)
    # commands
}
$Job = Start-Job -ScriptBlock $ScriptBlock # Add other arguments if needed
# Check the status of the task:
Get-Job -Id $Job.Id
# Wait for the job state to be 'Completed' (do-while, maybe?)
# If your ScriptBlock writes any output, collect it in a variable:
$Output = Receive-Job $Job
# Clean up:
Remove-Job $Job
The point of this approach is that you spawn a job to do the work (in the background) and you needn't worry about Pop-Location as you just let that child scope exit while you carry on doing whatever you need to do in your main script.
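To fill in the wait step above, Wait-Job is the simplest option, though a do/while poll also works. A minimal sketch:
# Simplest: block until the job finishes
Wait-Job $Job | Out-Null

# Or poll, if you want to do other work between checks:
do {
    Start-Sleep -Seconds 1
    # ... other work ...
} while ((Get-Job -Id $Job.Id).State -eq 'Running')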
There are other posts here on Stack Exchange that go into more detail about PowerShell jobs.
Here is a cool way to do it with script blocks:
function withPath($action, $path) {
    Push-Location $path
    try
    {
        & $action
    }
    finally
    {
        Pop-Location
    }
}
withPath {
    Get-Location | Write-Host
    withPath {
        Get-Location | Write-Host
        throw "oh no, an inner exception!"
    } 'Program Files' # relative path
} 'C:\'