PowerShell Start-Job scope

I have a long script. I have a function for logging:
function Log ([string]$Content) {
    $Date = Get-Date
    Add-Content -Path $LogPath -Value ("$Date : $Content")
}
At some point in the script I need to run jobs in parallel.
I have a list of computer names, and I need to run psexec against each of them; this should be done as jobs so they run in parallel.
Start-Job -ScriptBlock {
    Log "$line Has called"
    $Program_List = gc $USERS_DB_PATH\$line.txt | select -Skip 1
    if (Test-Connection $line -Quiet) {
        ForEach ($program in $Program_List) {
            Log "$line $program"
            #"line $line bot is $bot pwd is $pwd"
            psexec \\"$line" -u bla.local\"$bot" -p $pwd cmd bla
        }
    }
    else {
        Log "Cannot Connect to $line"
    }
}
#Remove-Item "$USERS_DB_PATH\$line.txt"
}
I understand this has something to do with scope, but how can I make this script block see the Log function and all the necessary variables? They all come up empty.

tl;dr
Reference variables from the caller's scope via the $using: scope.
Recreate your Log function in the context of the background job, using $function:Log = $using:function:Log
Start-Job -ScriptBlock {
    # Required in Windows PowerShell only (if needed):
    # change to the same working directory as the caller.
    Set-Location -LiteralPath ($using:PWD).ProviderPath
    # Recreate the Log function.
    $function:Log = $using:function:Log
    # All variable values from the *caller*'s scope must be $using: prefixed.
    Log "$using:line Has called"
    # ...
}
Read on for an explanation.
See the bottom section for better alternatives to Start-Job: Start-ThreadJob and ForEach-Object -Parallel (PowerShell (Core) 7+ only).
A background job runs in an invisible PowerShell child process, i.e. a separate powershell.exe (Windows PowerShell) or pwsh (PowerShell (Core) 7+) process.
Such a child process:
does not load $PROFILE files.
knows nothing about the caller's state; that is, it doesn't have access to the caller's variables, functions, aliases, ... defined in the session; only environment variables are inherited from the caller.
Conversely, this means that only the following commands are available by default in background jobs:
external programs and *.ps1 scripts, via the directories listed in the $env:PATH environment variable.
commands in modules available via the module-autoloading feature, from directories listed in the $env:PSModulePath environment variable (which has a default value).
Passing caller-state information to background jobs:
Variables:
While you cannot pass variables as such to background jobs, you can pass their values, using the $using: scope; in other words: you can get the value of but not update a variable in the caller's scope - see the conceptual about_Remote_Variables.
Alternatively, pass the value as an argument via Start-Job's -ArgumentList (-Args) parameter, which the -ScriptBlock argument must then access in the usual manner: either via the automatic $args variable or via explicitly declared parameters, using a param() block.
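For example, a minimal sketch of the -ArgumentList approach (variable name and value are illustrative):
$line = 'server01'  # illustrative value
Start-Job -ScriptBlock {
    param([string] $ComputerName)  # bound positionally from -ArgumentList
    "Processing $ComputerName"
} -ArgumentList $line | Wait-Job | Receive-Job  # -> Processing server01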
Functions:
Analogously, you cannot pass a function as such, but only a function's body, and the simplest way to do that is via namespace variable notation; e.g. to get the body of function foo, use $function:foo; to pass it to a background job (or remote call), use $using:function:foo.
Since namespace variable notation can also be used to assign values, assigning to $function:foo creates or updates a function named foo, so that $function:foo = $using:function:foo effectively recreates a foo function in the background session.
Note that while $function:foo returns the function body as a [scriptblock] instance, $using:function:foo turns into a string during serialization (see GitHub issue #11698); fortunately, however, you can also create function bodies from strings.
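To make this concrete, here's a minimal, self-contained sketch (the Add-One function is hypothetical):
function Add-One ([int] $Number) { $Number + 1 }
Start-Job -ScriptBlock {
    # $using:function:Add-One arrives as a *string* here; assigning it to
    # $function:Add-One (re)creates the function in the job's session.
    $function:Add-One = $using:function:Add-One
    Add-One 41
} | Wait-Job | Receive-Job  # -> 42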
Working directory:
In Windows PowerShell, background jobs use a fixed working directory: the user's Documents folder. To ensure that the background job uses the same directory as the caller, call
Set-Location -LiteralPath ($using:PWD).ProviderPath as the first statement inside the script block passed to -ScriptBlock.
In PowerShell (Core) 7+, background jobs now - fortunately - use the same working directory as the caller.
Caveat re type fidelity:
Since values must be marshaled across process boundaries, serialization and deserialization of values is of necessity involved. Background jobs use the same serialization infrastructure as PowerShell's remoting, which - with the exception of a handful of well-known types, including .NET primitive types - results in loss of type fidelity, both on passing values to background jobs and on receiving output from them - see this answer.
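A quick sketch that shows the effect (the job emits a live System.IO.DirectoryInfo instance, but the caller receives a method-less "property bag"):
$item = Start-Job { Get-Item $HOME } | Wait-Job | Receive-Job
$item.pstypenames[0]  # -> Deserialized.System.IO.DirectoryInfo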
Preferable alternative to background jobs: thread jobs, via Start-ThreadJob:
PowerShell (Core) 7+ comes with the ThreadJob module, which offers the Start-ThreadJob cmdlet; in Windows PowerShell you can install it on demand.
Additionally, PowerShell (Core) 7+ offers essentially the same functionality as an extension to the ForEach-Object cmdlet, via the -Parallel parameter, which executes a script block passed to it in a separate thread for each input object.
Start-ThreadJob fully integrates with PowerShell's other job-management cmdlets, but uses threads (i.e. in-process concurrency) rather than child processes, which implies:
much faster execution
use of fewer resources
no loss of type fidelity (though you can run into thread-safety issues and explicit synchronization may be required)
Also, the caller's working directory is inherited.
The need for $using: / -ArgumentList equally applies.
For ForEach-Object -Parallel an improvement is being considered to allow copying the caller's state to the thread script blocks on an opt-in basis - see GitHub issue #12240.
This answer provides an overview of ForEach-Object -Parallel and compares and contrasts Start-Job and Start-ThreadJob.
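Applied to the original scenario, a Start-ThreadJob-based sketch might look as follows ($ComputerNames stands in for the OP's list of names; the $using: and function-recreation techniques apply as before):
$jobs = foreach ($line in $ComputerNames) {
    Start-ThreadJob -ScriptBlock {
        # Recreate Log and reference the caller's values, as with Start-Job.
        $function:Log = $using:function:Log
        Log "$using:line Has called"
        # ... per-computer work goes here ...
    }
}
$jobs | Receive-Job -Wait -AutoRemoveJob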

Related

Invoke-command and running ps1 with parameters

I'm trying to run a script using invoke-command to install defender for endpoint with some associated parameters.
If I run a standard ps1 using invoke-command it works with no issues. However, if I run the following:
Invoke-Command -ComputerName NAME -FilePath \\srv\share\install.ps1 -OnboardingScript \\srv\share\WindowsDefenderATPonboardingscript.cmd -Passive
I receive "A parameter cannot be found that matches parameter name 'OnboardingScript'". Can someone please help me understand how I invoke a command and run a script with parameters?
The parameters are already defined in the install.ps1 file:
https://github.com/microsoft/mdefordownlevelserver/blob/main/Install.ps1
Many thanks in advance
Your Invoke-Command call has a syntax problem, as Santiago Squarzon points out:
Any pass-through arguments - those to be seen by the script whose path is passed to -FilePath - must be specified via the -ArgumentList (-Args) parameter, as an array.
# Simplified example with - of necessity - *positional* arguments only.
# See below.
Invoke-Command -ComputerName NAME -FilePath .\foo.ps1 -Args 'bar', 'another arg'
The same applies to the more common invocation form that uses a script block ({ ... }), via the (potentially positionally implied) -ScriptBlock parameter.
However, there's a catch: Only positional arguments can be passed that way, which:
(a) requires that the target script support positional argument binding for all arguments of interest...
(b) ... which notably precludes passing switch parameters (type [switch]), such as -Passive in your call.
(c) requires you to pass the invariably positional arguments in the correct order.
Workaround:
Use a -ScriptBlock-based invocation, which allows for regular argument-passing with the usual support for named arguments (including switches):
If, as in your case, the script file is accessible by a UNC path visible to the remote session as well, you can simply call it from inside the remote script block.
Note: It isn't needed in your case, but you generally may need $using: references in order to incorporate values from the local session into the arguments - see further below for an example.
Invoke-Command -ComputerName NAME {
& \\srv\share\install.ps1 -OnboardingScript \\srv\share\WindowsDefenderATPonboardingscript.cmd -Passive
}
Otherwise (typically, a script file local to the caller):
Use a $using: reference to pass the content (source code) of your script file to the remote session, parse it into a script block there, and execute that script block with the arguments of interest:
$scriptContent = Get-Content -Raw \\srv\share\install.ps1
Invoke-Command -ComputerName NAME {
& ([scriptblock]::Create($using:scriptContent)) -OnboardingScript \\srv\share\WindowsDefenderATPonboardingscript.cmd -Passive
}
Small caveat: Since the original script file's source code is executed in memory in the remote session, file-related reflection information won't be available, such as the automatic variables that report a script file's full path and directory path ($PSCommandPath and $PSScriptRoot).
That said, the same applies to use of the -FilePath parameter, which essentially uses the same technique of copying the source code rather than a file to the remote session, behind the scenes.
Thanks for your reply. I have managed to get this working by adding -ScriptBlock { . "\\srv\share ..." }

Start-Process, Invoke-Command or?

Using the program got your back or GYB. I run the following command
Start-Process -FilePath 'C:\Gyb\gyb.exe' -ArgumentList @("--email <Email Address>", "--action backup", "--local-folder $GYBfolder", "--service-account", "--batch-size 4") -Wait
The issue is that when the process is done my script does not complete.
$GYBfolder = $GYBfolder.Replace('"', "")
$output = [PSCustomObject]@{
Name = $SourceGYB
Folder = $GYBfolder
}
$filename = "C:\reports\" + $SourceGYB.Split("#")[0] + "_Backup.csv"
$output | Export-Csv $filename -NoTypeInformation | Format-Table text-align=left -AutoSize
Return $filename
For some reason the script stops right before the return.
I am curious to know if I should be using a different command to run GYB?
Any thoughts on why the script does not process the return?
There's great information in the comments, but let me attempt a systematic overview:
To synchronously execute external console applications and capture their output, call them directly (C:\Gyb\gyb.exe ... or & 'C:\Gyb\gyb.exe' ...), do not use Start-Process - see this answer.
Only if gyb.exe were a GUI application would you need Start-Process -Wait in order to execute it synchronously.
A simple, but non-obvious shortcut is to pipe the invocation to another command, such as Out-Null, which also forces PowerShell to wait (e.g. gyb.exe | Out-Null) - see below.
When Start-Process is appropriate, the most robust way to pass all arguments is as a single string encoding all arguments, with appropriate embedded "..." quoting, as needed; this is unfortunate, but required as a workaround for a long-standing bug: see this answer.
Invoke-Command's primary purpose is to invoke commands remotely; while it can be used locally, there's rarely a good reason to do so, as &, the call operator is both more concise and more efficient - see this answer.
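For instance, these two local invocations are equivalent, with & being the lighter-weight choice:
& { 'hello' }               # call operator: concise and efficient
Invoke-Command { 'hello' }  # works locally too, but adds overhead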
When you use an array to pass arguments to an external application, each element must contain just one argument, where parameter names and their values are considered distinct arguments; e.g., you must use @('--action', 'backup', ...) rather than
@('--action backup', ...)
Therefore, use the following to run your command synchronously:
If gyb.exe is a console application:
# Note: Enclosing @(...) is optional
$argList = '--email', $emailAddress, '--action', 'backup', '--local-folder', $GYBfolder, '--service-account', '--batch-size', 4
# Note: Stdout and stderr output will print to the current console, unless captured.
& 'C:\Gyb\gyb.exe' $argList
If gyb.exe is a GUI application, which necessitates use of Start-Process -Wait (a here-string is used, because it makes embedded quoting easier):
# Note: A GUI application typically has no stdout or stderr output, and
# Start-Process never returns the application's *output*, though
# you can ask to have a *process object* returned with -PassThru.
Start-Process -Wait 'C:\Gyb\gyb.exe' #"
--email $emailAddress --action backup --local-folder "$GYBfolder" --service-account --batch-size 4
#"
The shortcut mentioned above - piping to another command in order to force waiting for a GUI application to exit - despite being obscure, has two advantages:
Normal argument-passing syntax can be used.
The automatic $LASTEXITCODE variable is set to the external program's process exit code, which does not happen with Start-Process. While GUI applications rarely report meaningful exit codes, some do, notably msiexec.
# Pipe to | Out-Null to force waiting (argument list shortened).
# $LASTEXITCODE will reflect gyb.exe's exit code.
# Note: In the rare event that the target GUI application explicitly
# attaches to the caller's console and produces output there,
# pipe to `Write-Output` instead, and possibly apply 2>&1 to
# the application call so as to also capture std*err* output.
& 'C:\Gyb\gyb.exe' --email $emailAddress --action backup | Out-Null
Note: If the above unexpectedly does not run synchronously, the implication is that gyb.exe itself launches another, asynchronous operation. There is no generic solution for that, and an application-specific one would require you to know the internals of the application and would be nontrivial.
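If you do need an exit code from a Start-Process-launched program, a sketch of the -PassThru approach (argument list shortened):
# -PassThru returns a System.Diagnostics.Process object; combined with
# -Wait, its .ExitCode property is populated once the process ends.
$proc = Start-Process -FilePath 'C:\Gyb\gyb.exe' -ArgumentList '--action backup' -Wait -PassThru
$proc.ExitCode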
A note re argument passing with direct / &-based invocation:
Passing an array as-is to an external program essentially performs splatting implicitly, without the need to use @argList[1]. That is, it passes each array element as its own argument.
By contrast, if you were to pass $argList to a PowerShell command, it would be passed as a single, array-valued argument, so @argList would indeed be necessary in order to pass the elements as separate, positional arguments. However, the more typical form of splatting used with PowerShell commands is to use a hashtable, which allows named arguments to be passed (parameter name-value pairs); e.g., to pass a value to a PowerShell command's -LiteralPath parameter:
$argHash = @{ LiteralPath = $somePath; ... }; Set-Content @argHash
[1] $argList and @argList are largely identical in this context, but, strangely, @argList honors use of --%, the stop-parsing symbol, even though it only makes sense in a literally specified argument list.

What is shortest possible way to download script from HTTP and run it with parameters using Powershell?

I have a PowerShell script file stored in an internal artifact server. The script URL is http://company-server/bootstrap.ps1.
What is a concise way to download that script and execute with a custom parameter?
I want to send such a command to users, who will copy-paste it over and over, so it must be a single-line and should be short.
What I currently have works, but it is long and unwieldy:
$c=((New-Object System.Net.WebClient).DownloadString('http://company-server/bootstrap.ps1'));Invoke-Command -ScriptBlock ([Scriptblock]::Create($c)) -ArgumentList 'RunJob'
I am wondering if there is shorter way to do this.
Note: From a code golf perspective, the solutions below could be shortened further, by eliminating insignificant whitespace; e.g., &{$args[0]}hi instead of & { $args[0] } hi. However, in the interest of readability such whitespace was kept.
A short formulation of a command that downloads a script via HTTP and executes it locally, optionally with arguments, is probably the following, taking advantage of:
alias irm for Invoke-RestMethod, in lieu of (New-Object System.Net.WebClient).DownloadString()
omitting quoting where it isn't necessary
relying on positional parameter binding
& ([scriptblock]::Create((irm http://company-server/bootstrap.ps1))) RunJob
RunJob is the OP's custom argument to pass to the script.
An even shorter, but perhaps more obscure approach is to use iex, the built-in alias for Invoke-Expression, courtesy of this GitHub comment.
iex "& { $(irm http://company-server/bootstrap.ps1) } RunJob"
As an aside: in general use, Invoke-Expression should be avoided.
The command uses an expandable string ("...", string interpolation) to create a string with the remote script's content enclosed in a script block { ... }, which is then invoked in a child scope (&). Note how the arguments to pass to the script must be inside "...".
However, there is a general caveat (which doesn't seem to be a problem for you): if the script terminates with exit, the calling PowerShell instance is exited too.
There are two workarounds:
Run the script in a child process:
powershell { iex "& { $(irm http://company-server/bootstrap.ps1) } RunJob" }
Caveats:
The above only works from within PowerShell; from outside of PowerShell, you must use powershell -c "..." instead of powershell { ... }, but note that properly escaping embedded double quotes, if needed (for a URL with PS metacharacters and/or custom arguments with, say, spaces), can get tricky.
If the script is designed to modify the caller's environment, the modifications will be lost due to running in a child process.
Save the script to a temporary file first:
Note: The command is spread across multiple lines for readability, but it also works as a one-liner:
& {
$f = Join-Path ([IO.Path]::GetTempPath()) ([IO.Path]::GetRandomFileName() + '.ps1');
irm http://company-server/bootstrap.ps1 > $f;
& $f RunJob;
ri $f
}
The obvious down-side is that the command is much longer.
Note that the command is written with robustness and cross-platform compatibility in mind, so that it also works in PowerShell Core, on all supported platforms.
Depending on what platforms you need to support / what assumptions you're willing to make (e.g., that the current dir. is writeable), the command can be shortened.
Potential future enhancements
GitHub issue #5909, written as of PowerShell Core 6.2.0-preview.4 and revised as of PowerShell Core 7.0, proposes enhancing the Invoke-Command (icm) cmdlet to greatly simplify download-script-and-execute scenarios, so that you could invoke the script in question as follows:
# WISHFUL THINKING as of PowerShell Core 7.0
# iwr is the built-in alias for Invoke-WebRequest
# icm is the built-in alias for Invoke-Command.
iwr http://company-server/bootstrap.ps1 | icm -Args RunJob
GitHub issue #8835 goes even further, suggesting an RFC be created to introduce a new PowerShell provider that allows URLs to be used in places where only files were previously accepted, enabling calls such as:
# WISHFUL THINKING as of PowerShell Core 7.0
& http://company-server/bootstrap.ps1 RunJob
However, while these options are very convenient, there are security implications to consider.
Here is a shorter solution (158 chars.)
$C=(New-Object System.Net.WebClient).DownloadString("http://company-server/bootstrap.ps1");icm -ScriptBlock ([Scriptblock]::Create($c)) -ArgumentList "RunJob"
Here is 121
$C=(curl http://company-server/bootstrap.ps1).content;icm -ScriptBlock ([Scriptblock]::Create($c)) -ArgumentList "RunJob"
Here is 108
$C=(curl http://company-server/bootstrap.ps1).content;icm ([Scriptblock]::Create($c)) -ArgumentList "RunJob"
Here is 98
$C=(iwr http://company-server/bootstrap.ps1).content;icm -sc([Scriptblock]::Create($c)) -ar RunJob
Thanks to Ansgar Wiechers

Call a PowerShell script in a new, clean PowerShell instance (from within another script)

I have many scripts. After making changes, I like to run them all to see if I broke anything. I wrote a script to loop through each, running it on fresh data.
Inside my loop I'm currently running powershell.exe -command <path to script>. I don't know if that's the best way to do this, or if the two instances are totally separate from each other.
What's the preferred way to run a script in a clean instance of PowerShell? Or should I be saying "session"?
Using powershell.exe seems to be a good approach but with its pros and cons, of course.
Pros:
Each script is invoked in a separate clean session.
Even crashes do not stop the whole testing process.
Cons:
Invoking powershell.exe is somewhat slow.
Testing depends on exit codes but 0 does not always mean success.
Neither of these cons is mentioned in the question as a potential problem.
The demo script is below. It has been tested with PS v2 and v3. Script names may include special characters like spaces, apostrophes, brackets, backticks, and dollar signs. One requirement mentioned in the comments is the ability of scripts to get their own paths in their code. With the proposed approach, a script can get its own path as
$MyInvocation.MyCommand.Path
# make a script list, use the full paths or explicit relative paths
$scripts = @(
'.\test1.ps1' # good name
'.\test 2.ps1' # with a space
".\test '3'.ps1" # with apostrophes
".\test [4].ps1" # with brackets
'.\test `5`.ps1' # with backticks
'.\test $6.ps1' # with a dollar
'.\test ''3'' [4] `5` $6.ps1' # all specials
)
# process each script in the list
foreach ($script in $scripts) {
    # make a command; mind &, ' around the path, and escaping '
    $command = "& '" + $script.Replace("'", "''") + "'"
    # invoke the command, i.e. the script in a separate process
    powershell.exe -command $command
    # check the exit code (assuming 0 is for success)
    if ($LastExitCode) {
        # in this demo just write a warning
        Write-Warning "Script $script failed."
    }
    else {
        Write-Host "Script $script succeeded."
    }
}
If you're on PowerShell 2.0 or higher, you can use jobs to do this. Each job runs in a separate PowerShell process e.g.:
$scripts = ".\script1.ps1", ".\script2.ps1"
$jobs = #()
foreach ($script in $scripts)
{
$jobs += Start-Job -FilePath $script
}
Wait-Job $jobs
foreach ($job in $jobs)
{
"*" * 60
"Status of '$($job.Command)' is $($job.State)"
"Script output:"
Receive-Job $job
}
Also, check out the PowerShell Community Extensions. It has a Test-Script command that can detect syntax errors in a script file. Of course, it won't catch runtime errors.
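A usage sketch (assuming PSCX is installed; the exact parameters and output of Test-Script are assumptions):
Import-Module Pscx
Test-Script .\test1.ps1  # hypothetical invocation; reports syntax errors, if any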
One tip for PowerShell V3 users: we (the PowerShell team) added a new API on the Runspace class called ResetRunspace(). This API resets the global variable table back to the initial state for that runspace (as well as cleaning up a few other things). What it doesn't do is clean out function definitions, types and format files or unload modules. This allows the API to be much faster. Also note that the Runspace has to have been created using an InitialSessionState object, not a RunspaceConfiguration instance. ResetRunspace() was added as part of the Workflow feature in V3 to support parallel execution efficiently in a script.
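A rough sketch of that API from PowerShell itself (the pattern follows the description above; the details are assumptions, not a definitive recipe):
$iss = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()
$rs  = [System.Management.Automation.Runspaces.RunspaceFactory]::CreateRunspace($iss)
$rs.Open()
# ... run pipelines / PowerShell instances bound to $rs ...
$rs.ResetRunspace()  # resets the global variable table to its initial state
$rs.Close()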
The two instances are totally separate, because they are two different processes. Generally, starting a PowerShell process for every script run is not the most efficient approach. Depending on the number of scripts and how often you re-run them, it may affect your overall performance. If it doesn't, I would leave everything as is.
Another option would be to run in the same runspace (this is a correct word for it), but clean everything up every time. See this answer for a way to do it. Or use below extract:
$sysvars = get-variable | select -Expand name
function remove-uservars {
    get-variable |
        where { $sysvars -notcontains $_.name } |
        remove-variable
}

Best practices for writing PowerShell scripts for local and remote usage

What are some of the best practices for writing scripts that will execute in a remote context?
For instance, I just discovered that built-in var $Profile doesn't exist during remote execution.
Profile
You've discovered one main difference, $profile not being configured.
Buried in MSDN here are some FAQs about remote PowerShell; alternatively, run get-help about_Remote_FAQ.
Under the "WHERE ARE MY PROFILES?" (heh) it explains:
For example, the following command runs the CurrentUserCurrentHost profile
from the local computer in the session in $s.
invoke-command -session $s -filepath $profile
The following command runs the CurrentUserCurrentHost profile from
the remote computer in the session in $s. Because the $profile variable
is not populated, the command uses the explicit path to the profile.
invoke-command -session $s {. "$home\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1"}
Serialization
Another difference that may affect you: instead of the .NET objects returned by commands being directly returned, when you run commands remotely and return their output, the objects get serialized and deserialized over the wire. Many objects support this fine, but some do not. PowerShell automatically removes methods on objects that are no longer "hooked up", leaving them as basically data structures... but it does re-hook methods on some types, like DirectoryInfo.
Usually you do not have to worry about this, but if you're returning complex objects over a pipe, you might...
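A quick probe, sketched (SERVER is a placeholder computer name):
# The remote side returns a live Process object, but the local session
# receives a deserialized property bag without methods.
$p = Invoke-Command -ComputerName SERVER { Get-Process -Id $PID }
$p.pstypenames[0]  # -> Deserialized.System.Diagnostics.Process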
Script blocks don't act as closures, like they do normally:
$var = 5
$sb = { $var }
& $sb                                    # 5
Start-Job $sb | Wait-Job | Receive-Job   # nothing
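Two common workarounds, sketched (PowerShell 3+ for $using:):
$var = 5
# Workaround 1: reference the caller's value via the $using: scope.
Start-Job { $using:var } | Wait-Job | Receive-Job  # 5
# Workaround 2: pass the value as an argument.
Start-Job { param($v) $v } -ArgumentList $var | Wait-Job | Receive-Job  # 5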