Jenkinsfile PowerShell - variable from one session to another?

I'm trying to find a cleaner way (NOT using a file or Export-CliXml) to pass a PowerShell variable from one session to another, or from one stage to another in a Jenkinsfile.
I'm trying to avoid using the approach from the link below:
How to pass powershell variable from one session to another or from one stage to another in terms of Jenkinsfile

In the context of PowerShell you can take a variable from one PSSession, and set it in another PSSession with:
Invoke-Command -Session $session -ArgumentList $variable {
    Set-Variable VariableName $args[0]
}
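To read the value back later in that same session, a quick follow-up along these lines should work (sketch):
Invoke-Command -Session $session { Get-Variable VariableName -ValueOnly }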
But if the execution engine is NOT PowerShell (as in this case, where the Jenkinsfile runs two instances of powershell.exe at different stages), you only have two options:
Set the output from a shell command to a variable in the Jenkinsfile, and reference it later when building your next PowerShell command
Serialize the PowerShell object to disk with Export-CliXml and read it in from a different PowerShell session using Import-CliXml, as outlined in the question you linked to (a minimal sketch follows)
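For completeness, here is that sketch ($myObject and the temp-file path are just placeholders):
# Stage 1: serialize the object to disk
$myObject | Export-Clixml -Path "$env:TEMP\myObject.xml"

# Stage 2, in a separate powershell.exe invocation: rehydrate it
$myObject = Import-Clixml -Path "$env:TEMP\myObject.xml"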

Related

Invoke-command and running ps1 with parameters

I'm trying to run a script using invoke-command to install defender for endpoint with some associated parameters.
If I run a standard ps1 using invoke-command it works with no issues. However, if I run the following:
Invoke-Command -ComputerName NAME -FilePath \\srv\share\install.ps1 -OnboardingScript \\srv\share\WindowsDefenderATPonboardingscript.cmd -Passive
I receive "A parameter cannot be found that matches parameter name 'OnboardingScript'". Can someone please help me understand how I invoke a command and run a script with parameters?
The parameters are already defined in the Install.ps1 file:
https://github.com/microsoft/mdefordownlevelserver/blob/main/Install.ps1
Many thanks in advance
Your Invoke-Command call has a syntax problem, as Santiago Squarzon points out:
Any pass-through arguments - those to be seen by the script whose path is passed to -FilePath - must be specified via the -ArgumentList (-Args) parameter, as an array.
# Simplified example with - of necessity - *positional* arguments only.
# See below.
Invoke-Command -ComputerName NAME -FilePath .\foo.ps1 -Args 'bar', 'another arg'
The same applies to the more common invocation form that uses a script block ({ ... }), via the (potentially positionally implied) -ScriptBlock parameter.
However, there's a catch: only positional arguments can be passed that way, which:
(a) requires that the target script support positional argument binding for all arguments of interest,
(b) notably precludes passing switch parameters (type [switch]), such as -Passive in your call, and
(c) requires you to pass the arguments in the correct order.
Workaround:
Use a -ScriptBlock-based invocation, which allows for regular argument-passing with the usual support for named arguments (including switches):
If, as in your case, the script file is accessible by a UNC path visible to the remote session as well, you can simply call it from inside the remote script block.
Note: It isn't needed in your case, but you generally may need $using: references in order to incorporate values from the local session into the arguments - see further below for an example.
Invoke-Command -ComputerName NAME {
    & \\srv\share\install.ps1 -OnboardingScript \\srv\share\WindowsDefenderATPonboardingscript.cmd -Passive
}
Otherwise (typically, a script file local to the caller):
Use a $using: reference to pass the content (source code) of your script file to the remote session, parse it into a script block there, and execute that script block with the arguments of interest:
$scriptContent = Get-Content -Raw \\srv\share\install.ps1
Invoke-Command -ComputerName NAME {
    & ([scriptblock]::Create($using:scriptContent)) -OnboardingScript \\srv\share\WindowsDefenderATPonboardingscript.cmd -Passive
}
Small caveat: Since the original script file's source code is executed in memory in the remote session, file-related reflection information won't be available, such as the automatic variables that report a script file's full path and directory path ($PSCommandPath and $PSScriptRoot).
That said, the same applies to use of the -FilePath parameter, which essentially uses the same technique of copying the source code rather than a file to the remote session, behind the scenes.
Thanks for your reply. I have managed to get this working by adding -ScriptBlock { . "\\srv\share etc" }

Azure Automation - how to split out common functions

I'm wondering if there's a way I can define common functions in a separate runbook in Azure Automation? For instance, I've got a logging function that timestamps the messages and exits on errors that I use in multiple runbooks. I'd like to define it once, and then call it from the other runbooks. And if I change it in the future, I only have to change it in one place.
I know I can define parent / child runbooks and call them inline, which led me to wonder if I could split out, for instance, a function definition and then call that runbook from another runbook to "import" the function into the current runbook. So for instance, I have a runbook called "Test-FunctionDefinition" with the following code:
function Test-FunctionDefinition {
    param (
        [String] $TestParam
    )
    Write-Output "Output from test function: $TestParam"
}
I'd like to be able to call it inline from another runbook like this to define the function, and then be able to use that function:
& .\Test-FunctionDefinition.ps1
Test-FunctionDefinition -TestParam "Test String"
I tried creating the two runbooks, but while it appears to call the runbook "Test-FunctionDefinition" fine on line 1, subsequently calling the function on line 3 fails with:
Test-FunctionDefinition : The term 'Test-FunctionDefinition' is not recognized as the name of a cmdlet, function, script file, or operable program.
Is what I'm trying to do possible? I realize I could just modify my runbook and call & .\Test-FunctionDefinition.ps1 -TestParam "Test String", but would prefer to do it the other way if possible.
It looks like it should be possible - Create modular runbooks in Automation
You have two options:
Inline - Child runbooks run in the same job as the parent.
Cmdlet - A separate job is created for the child runbook.
For a PowerShell runbook it should be as easy as this:
$vm = Get-AzVM -ResourceGroupName "LabRG" -Name "MyVM"
$output = .\PS-ChildRunbook.ps1 -VM $vm -RepeatCount 2 -Restart $true
I did some more testing to make sure my original assumption (defining a function in one script and calling it in another) worked as I expected, and I got the same error. It appears that you need to dot-source the script rather than invoke it with &. This worked both in a PowerShell console and in Azure Automation:
. .\Test-FunctionDefinition.ps1
Test-FunctionDefinition -TestParam "Test String"
Note the starting '.' instead of '&'. I'll need to look up what the difference is between those...
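For what it's worth, the difference is scope: the call operator (&) runs the script in a child scope, so any functions it defines vanish when the script returns, while dot-sourcing (.) runs it in the caller's scope, so the definitions persist. A minimal local sketch (Define-Greeting.ps1 is a hypothetical file that only defines a function):
# Define-Greeting.ps1 (hypothetical) contains:
#     function Get-Greeting { "Hello" }

& .\Define-Greeting.ps1    # child scope: Get-Greeting disappears when the script exits
Get-Greeting               # error: not recognized

. .\Define-Greeting.ps1    # current scope: Get-Greeting persists
Get-Greeting               # Hello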

determine if powershell called the running exe?

From a running exe, how can one easily determine whether the exe was invoked from powershell? I've not found a predefined environment variable that is a reliable indicator.
My specific issue is that I'm trying to modify PATH and other env vars in an existing PS session from the exe (a Go static linked exe) by creating a "runner" .bat/.ps1 that mangles the env vars of the currently running cmd.exe or PS. If the exe was called from PS, I'll create a .ps1. If the exe was called from cmd.exe, I'll create a .bat. Ideally, I'd use a .bat with something like the following to handle PS:
rem This doesn't work
powershell -C "& { $env:FAKE_PATH_2='C:\ruby193\bin' }"
rem This also doesn't work
powershell -C "& { [Environment]::SetEnvironmentVariable('FAKE_PATH_3', 'Sneaky 1') }"
rem This also doesn't work
powershell -C [Environment]::SetEnvironmentVariable('FAKE_PATH_4', 'Sneaky 2')
but none of the above propagate the env vars to the existing PS session. I'm looking for a solution that doesn't require wrapper .bat/.ps1 scripts to set up and call the exe.
Any creative, low-complexity ideas?
You can use WMI to find the parent process ID and then determine if that is PowerShell. I'll show an example here in PowerShell but you would need to convert that to the appropriate WMI API for your EXE:
$parentPid = (Get-WmiObject -Class Win32_Process -Filter "ProcessId='$pid'").ParentProcessId
(Get-Process -Id $parentPid).ProcessName
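If Get-WmiObject isn't available (the WMI cmdlets are not present in PowerShell 7), the CIM cmdlets give the same answer; an equivalent sketch:
$parentPid = (Get-CimInstance Win32_Process -Filter "ProcessId = $PID").ParentProcessId
(Get-Process -Id $parentPid).ProcessName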
That said, the rest of the question isn't very clear to me. Executing this:
powershell -C "& { [Environment]::SetEnvironmentVariable('FAKE_PATH_3', 'Sneaky 1') }"
Starts a new PowerShell EXE and doesn't modify an existing PowerShell session. In fact, modifying an existing EXE's env block is going to be tricky. And if the EXE doesn't monitor env block changes via WM_SETTINGCHANGE, it just won't work unless you get help from the EXE itself (like having PowerShell check for some sentinel to tell it to modify its env vars).

Is it possible to call a powershell script within another script as a variable?

I have a PowerShell script where the user passes in a script path as a parameter. After it is passed in, I cannot call the script by using the variable. Is there any way to call a PowerShell script from within another PowerShell script, when that script needs to be called from a variable?
param(
    [string]$hostval,
    [string]$scriptpath
)
Invoke-Command -ComputerName $hostval -ScriptBlock { $scriptpath } -Credential $cred
This does not work, and I'm not sure if what I want is possible. Is there a parameter type (ex: [script]$scriptpath) that I can use so the script can be called from $scriptpath?
It sounds like you need to use the -FilePath parameter, instead of -Scriptblock:
-FilePath <String>
Runs the specified local script on one or more remote computers. Enter the path and file name of the script, or pipe a script path to Invoke-Command. The script must reside on the local computer or in a directory that the local computer can access. Use the ArgumentList parameter to specify the values of parameters in the script.
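Applied to the variables in your param block, a sketch of that approach would look like:
Invoke-Command -ComputerName $hostval -FilePath $scriptpath -Credential $cred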

Best practices for writing PowerShell scripts for local and remote usage

What are some of the best practices for writing scripts that will execute in a remote context?
For instance, I just discovered that built-in var $Profile doesn't exist during remote execution.
Profile
You've discovered one main difference, $profile not being configured.
Buried in MSDN there are some FAQs about remote PowerShell, or run get-help about_Remote_FAQ.
Under the "WHERE ARE MY PROFILES?" (heh) it explains:
For example, the following command runs the CurrentUserCurrentHost profile
from the local computer in the session in $s.
invoke-command -session $s -filepath $profile
The following command runs the CurrentUserCurrentHost profile from
the remote computer in the session in $s. Because the $profile variable
is not populated, the command uses the explicit path to the profile.
invoke-command -session $s {. "$home\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1"}
Serialization
Another difference that may affect you is that instead of the .NET objects returned by commands being just directly returned, when you run them remotely and return them, they get serialized and deserialized over the wire. Many objects support this fine, but some do not. Powershell automatically removes methods on objects that are no longer "hooked up", and they're basically data structures then... but it does re-hook methods on some types like DirectoryInfo.
Usually you do not have to worry about this, but if you're returning complex objects over a pipe, you might...
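A quick way to see the effect (sketch; assumes an already-established PSSession in $s): deserialized objects carry a Deserialized.* type name and lose their methods.
$local  = Get-Process -Id $PID
$remote = Invoke-Command -Session $s { Get-Process -Id $PID }

$local.PSTypeNames[0]    # System.Diagnostics.Process
$remote.PSTypeNames[0]   # Deserialized.System.Diagnostics.Process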
Script blocks don't act as closures, like they do normally:
$var = 5
$sb = { $var }
& $sb                                     # 5
Start-Job $sb | Wait-Job | Receive-Job    # nothing
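If the job needs the value, pass it in explicitly; two possible variants (the $using: form requires PowerShell 3.0 or later):
Start-Job { $using:var } | Wait-Job | Receive-Job                         # 5
Start-Job { param($v) $v } -ArgumentList $var | Wait-Job | Receive-Job    # 5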