I'm wondering if there's a way I can define common functions in a separate runbook in Azure Automation? For instance, I've got a logging function that timestamps the messages and exits on errors that I use in multiple runbooks. I'd like to define it once, and then call it from the other runbooks. And if I change it in the future, I only have to change it in one place.
I know I can define parent / child runbooks and call them inline, which led me to wonder if I could split out, say, a function definition and then call that runbook from another runbook to "import" the function into the current one. For instance, I have a runbook called "Test-FunctionDefinition" with the following code:
function Test-FunctionDefinition() {
    param (
        [String] $TestParam
    )
    Write-Output "Output from test function: $TestParam"
}
I'd like to be able to call it inline from another runbook like this to define the function, and then be able to use that function:
& .\Test-FunctionDefinition.ps1
Test-FunctionDefinition -TestParam "Test String"
I tried creating the two runbooks, and while the call to the "Test-FunctionDefinition" runbook on the first line appears to work fine, the subsequent call to the function fails with:
Test-FunctionDefinition : The term 'Test-FunctionDefinition' is not recognized as the name of a cmdlet, function, script file, or operable program.
Is what I'm trying to do possible? I realize I could just modify my runbook and call & .\Test-FunctionDefinition.ps1 -TestParam "Test String", but would prefer to do it the other way if possible.
It looks like it should be possible - Create modular runbooks in Automation
You have two options:
Inline - Child runbooks run in the same job as the parent.
Cmdlet - A separate job is created for the child runbook.
For a PowerShell runbook it should be as easy as this:
$vm = Get-AzVM -ResourceGroupName "LabRG" -Name "MyVM"
$output = .\PS-ChildRunbook.ps1 -VM $vm -RepeatCount 2 -Restart $true
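For comparison, the cmdlet method would look roughly like this (a hedged sketch; the automation account and resource group names are placeholders). Because it queues a separate job, anything the child runbook defines is not visible in the parent's scope:
# Hedged sketch of the cmdlet method; account/resource group names are placeholders
$job = Start-AzAutomationRunbook -AutomationAccountName 'MyAutomationAccount' `
    -ResourceGroupName 'MyResourceGroup' `
    -Name 'PS-ChildRunbook' `
    -Parameters @{ VM = $vm; RepeatCount = 2; Restart = $true }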
I did some more testing to make sure my original assumption (defining a function in one script and calling it in another) worked as I expected, and I got the same error. It appears that you need to dot-source rather than use the &. This worked both in a PowerShell console and Azure Automation:
. .\Test-FunctionDefinition.ps1
Test-FunctionDefinition -TestParam "Test String"
Note the starting '.' instead of '&'. I'll need to look up what the difference is between those...
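(For reference, as far as I understand it: the call operator & runs the script in a child scope, so the function definition is discarded when the script ends, while dot-sourcing runs it in the current scope, so the definition sticks around. A quick illustration:)
& .\Test-FunctionDefinition.ps1             # child scope: the function is discarded afterwards
Test-FunctionDefinition -TestParam "Hi"     # fails: not recognized

. .\Test-FunctionDefinition.ps1             # current scope: the function persists
Test-FunctionDefinition -TestParam "Hi"     # works: Output from test function: Hi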
Related
I'm wondering what I'm missing here. I have a powershell script that calls another script with some parameters to execute as a way to keep things tidy. Here is what works:
C:\Scripts\Project\coolscript.ps1 -projname 'my.project' -domain 'work'
I want others to be able to use this script without having to change anything, so I thought I could make the path relative instead of the full one starting from C: and execute the script like this:
$pathname = $PSScriptRoot + '\coolscript.ps1'
$pathname -projname 'my.project' -domain 'work'
However, I get an error that says 'unexpected token in expression or statement' for everything after $pathname.
Any ideas what I'm missing? Thank you!
Use the Call operator (&) as follows:
& $pathname -projname 'my.project' -domain 'work'
Call operator &
Runs a command, script, or script block. The call operator, also known as the "invocation operator," lets you run commands that are
stored in variables and represented by strings or script blocks. The
call operator executes in a child scope. For more about scopes, see
about_scopes.
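Applied to your example, a sketch of the full fix:
# Build the path relative to the script's own folder, then invoke it with &
$pathname = Join-Path $PSScriptRoot 'coolscript.ps1'
& $pathname -projname 'my.project' -domain 'work'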
I wrote a script to build all .net projects in a folder.
Issue
The issue is I am getting a missing function error when I call Build-Sollution.
What I tried
I made sure the function was declared before I used it, so I am not really sure why it says it is not defined.
I am new to PowerShell, but I would think a function calling another function should work like this?
Thanks in advance!
Please see below for the error message and code.
Error Message
Build-Sollution:
Line |
   3 |          Build-Sollution $_
     |          ~~~~~~~~~~~~~~~
     | The term 'Build-Sollution' is not recognized as the name of a cmdlet, function, script file, or operable program.
     | Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Code
param (
    #[Parameter(Mandatory=$true)][string]$plugin_path,
    [string]$depth = 5
)

$plugin_path = 'path/to/sollutions/'

function Get-Sollutions {
    Get-ChildItem -File -Path $plugin_path -Include *.sln -Recurse
}

function Build-Sollution($solution) {
    dotnet build $solution.fullname
}

function Build-Sollutions($solutions) {
    $solutions | ForEach-Object -Parallel {
        Build-Sollution $_
    }
}

$solutions_temp = Get-Sollutions
Build-Sollutions $solutions_temp
From PowerShell ForEach-Object Parallel Feature | PowerShell
Script blocks run in a context called a PowerShell runspace. The runspace context contains all of the defined variables, functions and loaded modules.
...
And each runspace must load whatever module is needed and have any variable be explicitly passed in from the calling script.
So in this case, the easiest solution is to define Build-Sollution inside Build-Sollutions.
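A minimal sketch of that fix, keeping the names from your script:
function Build-Sollutions($solutions) {
    $solutions | ForEach-Object -Parallel {
        # Define the function inside the parallel script block so that
        # each runspace gets its own copy of it
        function Build-Sollution($solution) {
            dotnet build $solution.FullName
        }
        Build-Sollution $_
    }
}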
As for this...
I am new to PowerShell, but I would think a function calling another
function should work like this?
... you cannot use the functions until you load your code into memory. You need to run the code before the functions are available.
If you are in the ISE or VSCode and the script is not saved, select all and use the run key: in the ISE, F8 runs the selection and F5 runs everything; in VSCode, F8 runs the selection and Ctrl+F5 runs everything. You can also just use the menu options.
If you are doing this from the consolehost, then run the script using dot sourcing:
. .\UncToYourScript.ps1
It's OK to be new; we all started somewhere, but it's vital that you get ramped up first. So, beyond what I address here, be sure to spend time on YouTube searching for Beginning, Intermediate, and Advanced PowerShell videos. There are tons of free training resources all over the web, and using the built-in help files would have given you the answer as well.
about_Scripts
SCRIPT SCOPE AND DOT SOURCING Each script runs in its own scope. The
functions, variables, aliases, and drives that are created in the
script exist only in the script scope. You cannot access these items
or their values in the scope in which the script runs.
To run a script in a different scope, you can specify a scope, such as
Global or Local, or you can dot source the script.
The dot sourcing feature lets you run a script in the current scope
instead of in the script scope. When you run a script that is dot
sourced, the commands in the script run as though you had typed them
at the command prompt. The functions, variables, aliases, and drives
that the script creates are created in the scope in which you are
working. After the script runs, you can use the created items and
access their values in your session.
To dot source a script, type a dot (.) and a space before the script
path.
See also:
'powershell .net projects build run scripts'
'powershell build all .net projects in a folder'
Simple build script using Power Shell
Update
As per your comments below:
Sure, the script should be saved, using whatever editor you choose.
The ISE does not use PSv7 by design, it uses WPSv5x and earlier.
The editor for PSv7 is VSCode. If you run a function that contains another function, you have explicitly loaded everything in that call, and as such it's available.
However, you are saying you are using PSv7, so you need to run your code in the PSv7 consolehost or VSCode, not the ISE.
Windows PowerShell (powershell.exe and powershell_ise.exe) and PowerShell Core (pwsh.exe) are two different environments, with two different executables, designed to run side by side on Windows, but you do have to explicitly choose which one to use, or write your code to branch to the appropriate code segment depending on which host you started.
For example, let's say I wanted to run a console command and I am in the ISE, but I need to run that in Pwsh. I use a function like this that I have in a custom module autoloaded via my PowerShell profiles:
# Call code by console executable
Function Start-ConsoleCommand
{
    [CmdletBinding(SupportsShouldProcess)]
    [Alias('scc')]
    Param
    (
        [string]$ConsoleCommand,
        [switch]$PoSHCore
    )

    If ($PoSHCore)
    { Start-Process pwsh -ArgumentList "-NoExit","-Command &{ $ConsoleCommand }" -PassThru -Wait }
    Else
    { Start-Process powershell -ArgumentList "-NoExit","-Command &{ $ConsoleCommand }" -PassThru -Wait }
}
All this code is doing is taking whatever command I send it and if I use the PoSHCore switch...
scc -ConsoleCommand 'SomeCommand' -PoSHCore
... it will shell out to PowerShell Core and run the code; otherwise, it just runs from the ISE.
If you want to use the ISE with PSv7 and not do the shell-out thing, you need to force the ISE to use PSv7 to run code. See:
Using PowerShell Core 6 and 7 in the Windows PowerShell ISE
I'm trying to find a cleaner way (NOT using a file or Export-Clixml) to pass a PowerShell variable from one session to another, or from one stage to another in a Jenkinsfile.
I'm trying to avoid using the approach from the link below:
How to pass powershell variable from one session to another or from one stage to another in terms of Jenkinsfile
In the context of PowerShell you can take a variable from one PSSession, and set it in another PSSession with:
Invoke-Command -Session $session -ArgumentList $variable {
Set-Variable VariableName $args[0]
}
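For example, a quick round-trip sketch (the computer name is just a placeholder):
$session  = New-PSSession -ComputerName 'Server01'   # placeholder target
$variable = Get-Date

Invoke-Command -Session $session -ArgumentList $variable {
    Set-Variable -Name VariableName -Value $args[0]
}

Invoke-Command -Session $session { $VariableName }   # the value is now available in that session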
But if the execution engine is NOT PowerShell (as in this case, where the Jenkinsfile runs two instances of powershell.exe at different stages), you only have two options:
Set the output from a shell command to a variable in the Jenkinsfile, and reference it later when building your next PowerShell command
Serialize the PowerShell object to disk with Export-CliXml and read it in from a different PowerShell session using Import-CliXml, as outlined in the question you linked to
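For what it's worth, option 2 is only a couple of lines per stage (a rough sketch; the file path and variable name are illustrative, and WORKSPACE is the standard Jenkins workspace variable):
# Stage 1: serialize the object to disk
$buildInfo | Export-Clixml -Path "$env:WORKSPACE\buildinfo.xml"

# Stage 2 (a later powershell.exe invocation): read it back
$buildInfo = Import-Clixml -Path "$env:WORKSPACE\buildinfo.xml"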
I am trying to execute a DSC script via Set-AzureRmVMCustomScriptExtension within an Azure Automation runbook.
I do not want to use Azure Automation's DSC, since there are other (non-DSC) actions in my runbook I need to execute, and I don't want to use Set-AzureRmVMDscExtension because my file is not located in Blob storage but is accessible via the FileUri parameter offered by Set-AzureRmVMCustomScriptExtension.
When I run a DSC file (.ps1) via Set-AzureRmVMCustomScriptExtension, I get no errors and it generates the local .mof. However, it doesn't actually execute anything within the configuration brackets, i.e., the DSC PowerShell code that needs to run.
The .ps1 has the DSC command to apply the configuration. If I run the script while remoted onto the server via the PowerShell ISE, it runs; it's only when I call it from Set-AzureRmVMCustomScriptExtension that it doesn't execute the configuration (but it still creates the .mof).
Is there a permissions issue at play? DSC runs at the system level, I believe, and it had no issues generating the .mof and no errors importing modules, etc. It's like it's just ignoring the configuration.
UPDATE 1
After more testing, we have found that when we use -Argument in Set-AzureRMVMCustomScriptExtension, it fails to run the DSC Configuration. Removing it allows the DSC Configuration to run... however, we need to pass the correct Arguments in from the Runbook/Set-AzureRMVMCustomScriptExtension in order for it to have the correct values.
Here is where we figured out the Args: How can I pass multiple arguments to Set-AzureRmVMCustomScriptExtension?
After multiple attempts, and after isolating the issue to how I was passing arguments via the -Argument switch, I found that one of my arguments had spaces in it. When I called -Argument as follows, it treated that single argument as multiple arguments and did not pass it into my script as expected, since -Argument is space delimited. NOTE: The Azure Automation runbook prompts for the values:
Input for Value 1: myvalue1
Input for Value 2: my value 2
[parameter(Mandatory=$true)][String] $value1,
[parameter(Mandatory=$true)][String] $value2
-Argument "$value1 value2"
# Value 1 seen as myvalue1, while Value 2 seen as three separate values of "my" "value" "2" since -Argument is space delimited.
Changing this to wrap the $value2 argument in escaped quotes fixed the issue, as follows:
Input for Value 1: myvalue1
Input for Value 2: my value 2
[parameter(Mandatory=$true)][String] $value1,
[parameter(Mandatory=$true)][String] $value2
-Argument "$value1 `"value2`""
# Value 1 seen as myvalue1, while Value 2 seen as "my value 2"
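Put together, the extension call looks roughly like this (a sketch only; the resource group, VM, location, extension name, and file/script names are placeholders, with -Argument carrying the quoted values):
Set-AzureRmVMCustomScriptExtension -ResourceGroupName 'MyRG' -VMName 'MyVM' `
    -Location 'eastus' -Name 'DscViaCustomScript' `
    -FileUri 'https://example.com/MyDscScript.ps1' -Run 'MyDscScript.ps1' `
    -Argument "$value1 `"$value2`""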
I have written a simple PowerShell script to gather some files and zip them into one folder; let's call it Script.ps1. I want the script to run every time Jenkins does a new build, and I would also like the name of the zip file to be the BUILD_NUMBER.
How can I create a variable in PowerShell that is Jenkins's current build number? At the moment I am calling Script.ps1 in the 'execute shell' section of the configuration.
I'm not familiar with Jenkins, but I believe BUILD_NUMBER is an environment variable.
To access it in PowerShell, use $env:BUILD_NUMBER
E.g., if using 7-Zip:
7z.exe a "$env:BUILD_NUMBER.zip" C:\Build\Path
You can add arguments to your Script.ps1. Just use Param at the top of the script:
Param( $BuildNumber ) #must be first statement in script
# your current script goes here
Then you can call the script from Jenkins, passing BUILD_NUMBER as an argument. Refer to this question for calling a PowerShell script with arguments.
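For illustration, if the build step itself runs PowerShell, the call could be as simple as this (script path and parameter name are just examples):
# Hypothetical call passing the Jenkins build number into the script
.\Script.ps1 -BuildNumber $env:BUILD_NUMBER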
You could also use the PowerShell Plugin; see here.
It allows you to enter PowerShell commands directly in a text box in Jenkins.
Then you can use the available environment variables (see the plugin's documentation). The usage is a little cryptic:
(Get-Content ./Sources/SlmMachineControl/GUI/Project1.rc).replace("`"FileVersion`", `"1.0.0.0`"" ,"`"FileVersion`" `"2.3.$($env:BUILD_NUMBER).0`"") | Set-Content ./Sources/SlmMachineControl/GUI/Project1.rc
Also note the escaping of the double-quotes. Took me quite a while :)