Start-Job: Calling other functions in a script - PowerShell

After reading a lot of Q&A here on SO about Start-Job, I still cannot understand what I am doing wrong...
The main idea: I need to run a lot of functions that call other functions with different parameters. Something like this:
function Base-Function {
    PARAM(
        [string]
        $Param
    )
    # I will do something with $Param
}
function Copy-File {
    PARAM(
        [string]
        $CopyFileParam
    )
    Base-Function -Param $CopyFileParam
}
function Read-File {
    PARAM(
        [string]
        $ReadFileParam
    )
    Base-Function -Param $ReadFileParam
}
function Move-File {
    PARAM(
        [string]
        $MoveFileParam
    )
    Base-Function -Param $MoveFileParam
}
So - I am trying to call Copy-File, Read-File and Move-File simultaneously:
function Main-Function {
    $copyFileArgs = @{ "CopyFileParam" = 1 }
    Start-Job -ScriptBlock ${Function:Copy-File} -ArgumentList $copyFileArgs
    $readFileArgs = @{ "ReadFileParam" = 2 }
    Start-Job -ScriptBlock ${Function:Read-File} -ArgumentList $readFileArgs
    ...
    ...
}
but of course I cannot call Base-Function inside the Copy-File function this way, so I added the -InitializationScript argument:
$init = {
    function Base-Function {
        PARAM(
            [string]
            $Param
        )
        # I will do something with $Param
    }
}
and then I call it like this:
function Main-Function {
    $copyFileArgs = @{ "CopyFileParam" = 1 }
    Start-Job -ScriptBlock ${Function:Copy-File} -ArgumentList $copyFileArgs -InitializationScript $init
}
but then I get an error:
OpenError: [localhost] An error occurred while starting the background process. Error reported: An error occurred trying to start process 'C:\Program Files\PowerShell\7\pwsh.exe' with working directory 'C:\Projects\powershell\project'. The filename or extension is too long..
So my questions are:
Any suggestions for simultaneously calling different functions that, in their turn, call other in-script functions?
Why do I get the error The filename or extension is too long?
Here is a link to an example PowerShell script: Gist
Run the script and let it finish.
In the same shell window, check for jobs: Get-Job
Check the output of the running job: Receive-Job xxx
See that the output of the job is:
ObjectNotFound: The term 'Base-Function' is not recognized as a name of a cmdlet, function, script file, or executable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

Sorry for misinforming you; the code above is correct and functional (many thanks to @mklement0 for hints and suggestions).
The actual problem that I encountered was in this line:
OpenError: [localhost] An error occurred while starting the background process.
Error reported: An error occurred trying to start process 'C:\Program Files\PowerShell\7\pwsh.exe' with working directory 'C:\Projects\powershell\project'.
The filename or extension is too long..
The filename or extension is too long. -> this means that there is a limit on the number of characters that can be passed in the -InitializationScript parameter. You can check it in the Gist example above - everything works OK.
Here is the Stack Overflow question that gave me this idea: Max size of ScriptBlock / InitializationScript for Start-Job in PowerShell
Once I moved all the code out of the -InitializationScript parameter into a .ps1 script file and then dot-sourced it inside the function, everything started working perfectly.
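For example, here is a minimal sketch of that approach (the file name SharedFunctions.ps1 and the argument value are assumptions for illustration):
# SharedFunctions.ps1 (hypothetical file) would contain Base-Function, Copy-File, Read-File and Move-File.
function Main-Function {
    $scriptPath = Join-Path $PSScriptRoot 'SharedFunctions.ps1'
    Start-Job -ScriptBlock {
        param($path, $copyFileParam)
        . $path    # dot-source the shared functions inside the background job's process
        Copy-File -CopyFileParam $copyFileParam
    } -ArgumentList $scriptPath, 1
}
This keeps the function definitions out of -InitializationScript entirely, so the command-line length limit no longer applies.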
Thanks again @mklement0

Related

Why won't my DSC configuration let me pass a [ref], and what's an alternative?

I'm trying to pass a variable to my DSC configuration, and have it modified inside it, then returned. To do so, I wanted to use a ref var.
Like so in my PowerShell script:
configuration My_DSC_config
{
    param (
        [ref]$data
    )
    Node $AllNodes.Where({$_.Role -eq "test"}).nodename
    {
        #some code....
        $data.Value = 10
    }
}
$myVar = 3
My_DSC_config -data ([ref]$myVar) -ConfigurationData <my_config> -OutputPath <mofPath>
Start-DscConfiguration -Path <mofPath> -credential <myCreds>
However, when I start the PowerShell script, I get an error:
"New-Object : The argument "2" must not be of type System.Management.Automation.PSReference. Don't use [ref]."
Why so? And if using [ref] isn't possible, how can I get a 'global' variable to change inside of my configuration call?

touch Function in PowerShell

I recently added a touch function to my PowerShell profile file
PS> notepad $profile
function touch {Set-Content -Path ($args[0]) -Value ($null)}
Saved it and ran a test for
touch myfile.txt
error returned:
touch : The term 'touch' is not recognized as the name of a cmdlet, function,
script file, or operable program. Check the spelling of the name, or if a path
was included, verify that the path is correct and try again.
At line:1 char:1
+ touch myfile
+ ~~~~~
+ CategoryInfo : ObjectNotFound: (touch:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
With PowerShell there are naming conventions for functions. It is highly recommended to stick with them, if only to stop getting warnings when you put those functions in a module and import it.
A good read about naming conventions can be found here.
Having said that, PowerShell DOES offer you the feature of aliasing, and that is what you can see in the function below.
As Jeroen Mostert and the others have already explained, a touch function is NOT about destroying the content, but only about setting the LastWriteTime property to the current date.
This function allows you to specify a date yourself in the NewDate parameter, but if you leave it out, it will default to the current date and time.
function Set-FileDate {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline = $true, Mandatory = $true, Position = 0)]
        [string[]]$Path,
        [Parameter(Mandatory = $false, Position = 1)]
        [datetime]$NewDate = (Get-Date),
        [switch]$Force
    )
    Get-Item $Path -Force:$Force | ForEach-Object { $_.LastWriteTime = $NewDate }
}
Set-Alias Touch Set-FileDate -Description "Updates the LastWriteTime for the file(s)"
Now the function has a name PowerShell won't object to, but by using Set-Alias you can reference it in your code by calling it touch.
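For example (a quick usage sketch; the file is assumed to exist already, since the function uses Get-Item):
touch .\example.txt                                  # LastWriteTime becomes the current date/time
Set-FileDate .\example.txt -NewDate '2023-01-01'     # or pass an explicit date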
Here is a version that creates a new file if it does not exist or updates the timestamp if it does exist.
Function Touch-File
{
    $file = $args[0]
    if ($file -eq $null) {
        throw "No filename supplied"
    }
    if (Test-Path $file)
    {
        (Get-ChildItem $file).LastWriteTime = Get-Date
    }
    else
    {
        echo $null > $file
    }
}
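A quick usage sketch (the file name is illustrative):
Touch-File .\newfile.txt    # file does not exist yet: creates an empty file
Touch-File .\newfile.txt    # file now exists: only its LastWriteTime is updated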
If you have a set of your own custom functions stored in a .ps1 file, you must first import them before you can use them, e.g.
Import-Module .\MyFunctions.ps1 -Force
To avoid confusion:
If you have placed your function definition in your $PROFILE file, it will be available in future PowerShell sessions, but not in the current one, unless you run . $PROFILE in the current session to reload the updated profile.
Also note that loading of $PROFILE (all profiles) can be suppressed by starting a session with powershell.exe -NoProfile (Windows PowerShell) / pwsh -NoProfile (PowerShell (Core)).
As Jeroen Mostert points out in a comment on the question, naming your function touch is problematic, because your function unconditionally truncates an existing target file (discards its content), whereas the standard touch utility on Unix-like platforms leaves the content of existing files alone and only updates their last-write (and last-access) timestamps.
See this answer for more information about the touch utility and how to implement equivalent behavior in PowerShell.

Testing for mandatory parameters with Pester

I'm trying to figure out how to have Pester test for parameters that are missing:
Find-Waldo.Tests.ps1
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.', '.'
Describe 'Mandatory parameters' {
    it 'ComputerName' {
        {
            $Params = @{
                #ComputerName = 'MyPc'
                ScriptName = 'Test'
            }
            . "$here\$sut" @Params
        } | Should throw
    }
}
Find-Waldo.ps1
Param (
    [Parameter(Mandatory)]
    [String]$ComputerName,
    [String]$ScriptName
)
Function Find-Waldo {
    [CmdletBinding()]
    Param (
        [String]$FilePath
    )
    'Do something'
}
Every time I try to assert the result or simply run the test, it will prompt me for the ComputerName parameter instead of failing the test.
Am I missing something super obvious here? Is there a way to test for the presence of mandatory parameters?
Per the comments from Mathias, you can't really test for whether a Mandatory parameter is missing, because PowerShell prompts for it rather than throwing an error. Per the comment he linked to from the Pester team, you could use Get-Command to test for the Mandatory parameter setting in the script (assuming it is the only parameter attribute set for that variable):
(Get-Command "$here\$sut").Parameters['ComputerName'].Attributes.Mandatory | Should Be $true
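In context, that check could sit in its own test, for example:
Describe 'Mandatory parameters' {
    it 'ComputerName is marked Mandatory' {
        (Get-Command "$here\$sut").Parameters['ComputerName'].Attributes.Mandatory | Should Be $true
    }
}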
An alternative option would be to not use Mandatory parameters in this instance, and instead have a script block that does a Throw as the default value of the parameter:
Param (
    [String]$ComputerName = $(Throw '-ComputerName is required'),
    [String]$ScriptName
)
If the script is always used as part of an automated process (instead of via user execution) this might be preferred as it allows you to control/capture its behavior and avoids it getting stuck during execution. You can then test the script as you had originally proposed:
Describe 'Mandatory parameters' {
    it 'ComputerName' {
        {
            $Params = @{
                #ComputerName = 'MyPc'
                ScriptName = 'Test'
            }
            . "$here\$sut" @Params
        } | Should throw '-ComputerName is required'
    }
}
Although the accepted answer indicates that this isn't possible, it actually is possible. Here is the solution that I developed to solve this problem.
It 'Should fail when no priority is specified, for a valid process name' {
    {
        $ScriptBlock = {
            Import-Module -Name $args[0]
            Set-ProcessPriority -Name System
        }
        Start-Job -ScriptBlock $ScriptBlock -ArgumentList $HOME/git/ProcessPriority/src/ProcessPriority | Wait-Job | Receive-Job
    } | Should -Throw
}
What you'll notice from the above example is:
🚀 The code being tested has been wrapped in a PowerShell ScriptBlock
🚀 We invoke a PowerShell background job, containing the test code
🚀 We wait for the background job to complete, and then receive the results
🚀 If you run the Get-Job command, you'll notice that there is a job in the Blocked status
The exception that's thrown by the background job is similar to the following:
The Wait-Job cmdlet cannot finish working, because one or more jobs are blocked waiting for user interaction. Process interactive job output by using the Receive-Job cmdlet, and then try again.
You'll notice that I hard-coded the filesystem path to the module. I am not sure how to pass this as an argument into the "outer" ScriptBlock that Pester is invoking for us. Perhaps someone has a suggestion on how to accomplish that final piece of the puzzle.
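One possible way to avoid the hard-coded path is Pester's -TestCases parameter, which injects values into the test's scriptblock; a sketch, assuming the same module path as in the example above:
It 'Should fail when no priority is specified, for a valid process name' -TestCases @(
    @{ ModulePath = "$HOME/git/ProcessPriority/src/ProcessPriority" }
) {
    param($ModulePath)
    {
        Start-Job -ScriptBlock {
            Import-Module -Name $args[0]    # the path now arrives through -ArgumentList
            Set-ProcessPriority -Name System
        } -ArgumentList $ModulePath | Wait-Job | Receive-Job
    } | Should -Throw
}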
What's uniquely interesting about PowerShell background jobs is that you can actually resume a job in the Blocked status, and it will prompt you for input, even though it threw the earlier exception.

Passing parameters to a PowerShell job [duplicate]

This question already has an answer here:
Parenthesis Powershell functions
(1 answer)
Closed 7 years ago.
I've been toying around with this dang parameter passing to PowerShell jobs.
I need to get two variables from the script calling the job into the job. First I tried using -ArgumentList, and then using $args[0] and $args[1] in the -ScriptBlock that I provided.
function Job-Test([string]$foo, [string]$bar){
    Start-Job -ScriptBlock {
        #need to use the two args in here
    } -Name "Test" -ArgumentList $foo, $bar
}
However, I realized that -ArgumentList gives these as parameters to -FilePath, so I moved the code from the scriptblock into its own script that requires two parameters, and then pointed -FilePath at this script.
function Job-Test([string]$foo, [string]$bar){
    $myArray = @($foo,$bar)
    Start-Job -FilePath .\Prog\august\jobScript.ps1 -Name 'Test' -ArgumentList $myArray
}

#\Prog\august\jobScript.ps1 :
Param(
    [array]$foo
)
#use $foo[0] and $foo[1] here
Still not working. I tried putting the info into an array and then passing only one parameter, but still to no avail.
When I say no avail: I am getting the data that I need, but it all seems to be compressed into the first element.
For example, say I passed in the name of a file as $foo and its path as $bar; for each method I tried, I would get $args[0] as "filename path" and $args[1] would be empty.
i.e.:
function Job-Test([string]$foo, [string]$bar){
    $myArray = @($foo,$bar)
    Start-Job -FilePath .\Prog\august\jobScript.ps1 -Name 'Test' -ArgumentList $myArray
}
Then I called:
$foo = "hello.txt"
$bar = "c:\users\world"
Job-Test($foo,$bar)
I had jobScript.ps1 simply Out-File the two variables to a log on separate lines and it looked like this:
log.txt:
hello.txt c:\users\world
#(empty line)
where it should have been:
hello.txt
c:\users\world
You don't need to call the function like you would in Java. Just append the two variables to the end of the function call: Job-Test $foo $bar
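Putting it together, a minimal sketch of the working pattern (the parameter names and log path are illustrative):
function Job-Test([string]$foo, [string]$bar){
    Start-Job -ScriptBlock {
        param($name, $path)
        $name, $path | Out-File .\log.txt    # each value arrives in its own parameter, on its own line
    } -Name 'Test' -ArgumentList $foo, $bar
}
Job-Test 'hello.txt' 'c:\users\world'    # space-separated: two arguments, not one array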

PowerShell parameter namespace collision

I'm a Powershell beginner, although not a programming n00b. I'm trying to create an IDisposable/RAII-style failsafe pattern, sort of like in:
http://www.sbrickey.com/Tech/Blog/Post/IDisposable_in_PowerShell
So I have:
Function global:FailSafeGuard
{
    param (
        [parameter(Mandatory=$true)] [ScriptBlock] $execute,
        [parameter(Mandatory=$true)] [ScriptBlock] $cleanup
    )
    Try { &$execute }
    Finally { &$cleanup }
}
I'm trying to use it to perform a bunch of tasks in a different directory, using Push-Location on the way in and Pop-Location on the way out. So I have:
Function global:Push-Location-FailSafe
{
    param (
        $location,
        [ScriptBlock] $execute
    )
    FailSafeGuard {
        Push-Location $location;
        &$execute
    } { Pop-Location }
}
I find that the $execute param in Push-Location-FailSafe collides with the $execute param in the FailSafeGuard function.
Push-Location-FailSafe "C:\" {dir}
The expression after '&' in a pipeline element produced an invalid object. It must result in a command name, script block or CommandInfo object.
At C:\TEMP\b807445c-1738-49ff-8109-18db972ab9e4.ps1:line:20 char:10
+ &$ <<<< execute
The reason I think it's a name-collision is that if I rename $execute to $execute2 in Push-Location-FailSafe, it works fine:
Push-Location-FailSafe "C:\" {dir}
Directory: C:\
Mode LastWriteTime Length Name
---- ------------- ------ ----
d---- 2011-08-18 21:34 cygwin
d---- 2011-08-17 01:46 Dell
[snip]
What's wrong with my understanding of parameters?
Your problem is with scriptblocks and how they handle variables. Variables inside a scriptblock don't expand until it is executed. Because of this, you are hitting a loop. Let me show you:
When you call the Push-Location-FailSafe function, your variable looks like this:
[DBG]: PS C:\>> (Get-Variable execute).Value
dir
But when you then call your inner function FailSafeGuard, your $execute variable changes to this:
[DBG]: PS C:\>> (Get-Variable execute).Value
Push-Location $location;
& $execute
Now when your try { } block starts executing, it begins to expand the variables. When it expands $execute, it will look like this:
Try {
    Push-Location $location;
    & $execute
}
Then it expands $execute again. Your try block is now:
Try {
    Push-Location $location;
    & {
        Push-Location $location;
        & $execute
    }
}
And you've got yourself an infinite loop caused by recursion. To fix this, you can expand your $execute variable inside a string that you then create a scriptblock out of, like this:
Function global:Push-Location-FailSafe
{
    param (
        $location,
        [ScriptBlock] $execute
    )
    FailSafeGuard ([ScriptBlock]::Create("
    Push-Location $location;
    & $execute")) { Pop-Location }
}
Be aware that this particular solution will have a problem when $execute includes variables inside, e.g. $execute = { $h = dir }, because it will try to expand $h when it creates the scriptblock.
An easier and better way to solve it is just to use different variable names, so there's no collision in the first place :-)
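A minimal sketch of that rename, matching the asker's observation above:
Function global:Push-Location-FailSafe
{
    param (
        $location,
        [ScriptBlock] $execute2    # renamed, so '& $execute2' inside the block no longer resolves to FailSafeGuard's own $execute
    )
    FailSafeGuard {
        Push-Location $location;
        & $execute2
    } { Pop-Location }
}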