Import strings from a text file and only expand variables [duplicate] - powershell

I'm trying to write a function that will print a user-supplied greeting addressed to a user-supplied name. I want to use expanding strings the way I can in this code block:
$Name = "World"
$Greeting = "Hello, $Name!"
$Greeting
Which successfully prints Hello, World!. However, when I try to pass these strings as parameters to a function like so,
function HelloWorld
{
Param ($Greeting, $Name)
$Greeting
}
HelloWorld("Hello, $Name!", "World")
I get the output
Hello, !
World
Upon investigation, PowerShell seems to be ignoring $Name in "Hello, $Name!" completely, as running
HelloWorld("Hello, !", "World")
produces output identical to the above. Additionally, it doesn't seem to regard "World" as the value of $Name, since running
function HelloWorld
{
Param ($Greeting, $Name)
$Name
}
HelloWorld("Hello, $Name!", "World")
produces no output.
Is there a way to get the expanding string to work when passed in as a function parameter?

In order to delay string interpolation and perform it on demand, with then-current values, you must use $ExecutionContext.InvokeCommand.ExpandString()[1] on a single-quoted string that acts as a template:
function HelloWorld
{
Param ($Greeting, $Name)
$ExecutionContext.InvokeCommand.ExpandString($Greeting)
}
HelloWorld 'Hello, $Name!' 'World' # -> 'Hello, World!'
Note how 'Hello, $Name!' is single-quoted to prevent instant expansion (interpolation).
Also note how HelloWorld is called with its arguments separated by spaces, not commas, and without (...).
In PowerShell, functions are invoked like command-line executables - foo arg1 arg2 - not like C# methods - foo(arg1, arg2) - see Get-Help about_Parsing.
If you accidentally use , to separate your arguments, you'll construct an array that a function sees as a single argument.
To help you avoid accidental use of method syntax, you can use Set-StrictMode -Version 2 or higher, but note that that entails additional strictness checks.
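To see the pitfall in action, here is a quick sketch (Show-Args is a hypothetical helper, not part of the question):
function Show-Args { Param ($First, $Second) "First: [$First] Second: [$Second]" }
Show-Args('a', 'b') # method syntax: $First receives the whole array -> 'First: [a b] Second: []'
Show-Args 'a' 'b'   # command syntax: -> 'First: [a] Second: [b]'
Set-StrictMode -Version 2
Show-Args('a', 'b') # now fails loudly instead of silently mis-binding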
Note that since PowerShell functions by default also see variables defined in the parent scope (all ancestral scopes), you could simply define any variables that the template references in the calling scope instead of declaring individual parameters such as $Name:
function HelloWorld
{
Param ($Greeting) # Pass the template only.
$ExecutionContext.InvokeCommand.ExpandString($Greeting)
}
$Name = 'World' # Define the variable(s) used in the template.
HelloWorld 'Hello, $Name!' # -> 'Hello, World!'
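Because the expansion happens only when the function runs, later calls pick up the then-current value of the variable:
$Name = 'PowerShell'
HelloWorld 'Hello, $Name!' # -> 'Hello, PowerShell!'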
Caveat: PowerShell string interpolation supports full commands - e.g., "Today is $(Get-Date)" - so unless you fully control or trust the template string, this technique can be a security risk.
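For instance, expansion executes any subexpression embedded in the template at that point (harmless here, but it could be any command):
$ExecutionContext.InvokeCommand.ExpandString('Today is $(Get-Date)') # runs Get-Date during expansion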
Ansgar Wiechers proposes a safe alternative based on .NET string formatting via PowerShell's -f operator and indexed placeholders ({0}, {1}, ...). Note that you then can no longer apply transformations to the arguments as part of the template string or, more generally, embed commands in it:
function HelloWorld
{
Param ($Greeting, $Name)
$Greeting -f $Name
}
HelloWorld 'Hello, {0}!' 'World' # -> 'Hello, World!'
Pitfalls:
PowerShell's string expansion uses the invariant culture, whereas the -f operator performs culture-sensitive formatting (snippet requires PSv3+):
$prev = [cultureinfo]::CurrentCulture
# Temporarily switch to culture with "," as the decimal mark
[cultureinfo]::CurrentCulture = 'fr-FR'
# string expansion: culture-invariant: decimal mark is always "."
$v = 1.2; "$v" # -> '1.2'
# -f operator: culture-sensitive: decimal mark is now ","
'{0}' -f $v # -> '1,2'
[cultureinfo]::CurrentCulture = $prev
PowerShell's string expansion supports expanding collections (arrays) - it expands them to a space-separated list - whereas the -f operator only supports scalars (single values):
$arr = 'one', 'two'
# string expansion: array is converted to space-separated list
"$arr" # -> 'one two'
# -f operator: array elements are syntactically treated as separate values
# so only the *first* element replaces {0}
'{0}' -f $arr # -> 'one'
# If you use a *nested* array to force treatment as a single array-argument,
# you get a meaningless representation (.ToString() called on the array)
'{0}' -f (, $arr) # -> 'System.Object[]'
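If you do want -f to emit the array's space-separated representation, one workaround (a sketch using the standard -join operator) is to stringify the array yourself first:
'{0}' -f ($arr -join ' ') # -> 'one two'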
[1] Surfacing the functionality of the $ExecutionContext.InvokeCommand.ExpandString() method in a more discoverable way, namely via an Expand-String cmdlet, is the subject of GitHub feature-request issue #11693.

Your issue occurs because the $Name string replacement is happening outside of the function, before the $Name variable is populated inside of the function.
You could do something like this instead:
function HelloWorld
{
Param ($Greeting, $Name)
$Greeting -replace '\$Name',$Name
}
HelloWorld -Greeting 'Hello, $Name!' -Name 'World'
By using single quotes, we send the literal greeting of Hello, $Name in and then perform the replacement inside the function using -replace (we have to put a \ before the $ in the pattern we're replacing because $ is a regex special character).
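If you'd rather not escape regex metacharacters by hand, a small variant (same function, different -replace pattern) is to let the .NET [regex]::Escape method do it for you:
$Greeting -replace [regex]::Escape('$Name'), $Name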

Related

PowerShell Classifier with File Server Resource Manager

I'm trying to use the Windows PowerShell Classifier in FSRM on Server 2019. I need it to look for files that start with "~$" and classify them with a Yes or No property I created.
I would also be fine with a REGEX code as well.
This is what I have but it's not working:
# Global variables available:
# $ModuleDefinition (IFsrmPipelineModuleDefinition)
# $Rule (IFsrmClassificationRule)
# $PropertyDefinition (IFsrmPropertyDefinition)
#
# And (optionally) any parameters you provide in the Script parameters box below,
# i.e. "$a = 1; $b = 2" . The string you enter is treated as a script and executed so the
# variables you define become globally available
# optional function to specify when the behavior of this script was last modified
# if it consumes additional files. emit one value of type DateTime
#
# function LastModified
# {
# }
# required function that outputs a value to be assigned to the specified property for each file classified
# emitting no value is allowed, which causes no value to be assigned for the property
# emitting more than one value will result in errors during classification
# begin and end are optional; process is required
#
function GetPropertyValueToApply
{
# this parameter is of type IFsrmPropertyBag
# it also has an additional method, GetStream, which returns a IO.Stream object to use for
# reading the contents of the file. Make sure to close the stream after you are done reading
# from the file
param
(
[Parameter(Position = 0)] $PropertyBag
)
Process
{
$FileName = $_.Name
If($FileName -like "~$*")
{
$True
}
Else
{
$False
}
}
}
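In case it's useful, the likely culprit is that $_ is never populated here; the file name must come from the $PropertyBag parameter instead. A minimal sketch of the intended logic, assuming the IFsrmPropertyBag's Name property returns the file name:
function GetPropertyValueToApply
{
    param
    (
        [Parameter(Position = 0)] $PropertyBag
    )
    Process
    {
        # Single quotes keep the pattern literal; $ is not special to -like,
        # so '~$*' matches any name that starts with "~$".
        If ($PropertyBag.Name -like '~$*')
        {
            $True
        }
        Else
        {
            $False
        }
    }
}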

Reuse parameters among multiple powershell functions

Consider two PowerShell functions:
function A {
param(
[Parameter()]$x,
[Parameter()]$y
)
Write-Host $x $y
}
function B {
param(
[Parameter()]$x,
[Parameter()]$z
)
Write-Host $x $z
}
I'd like to define a single parameter $x once (which could have fairly complex attributes that must be kept identical for both functions) and re-use it in both functions, so something like:
$x = {[Parameter()]$x}
function A {
param(
$x,
[Parameter()]$y
)
Write-Host $x $y
}
function B {
param(
$x,
[Parameter()]$z
)
Write-Host $x $z
}
(How) is this possible?
To reuse parameter declarations across functions - as requested in your question - see the following section.
To reuse parameter values (arguments) across functions, by way of presets (default value), see the bottom section.
In order to reuse parameter declarations - short of using design-time templating to generate source code - you need to define a script block that creates a dynamic parameter, which can then be passed to the dynamicparam block of multiple advanced functions:
using namespace System.Management.Automation
# The script block that declares the dynamic parameter to be shared
# across functions.
$sharedDynParam = {
# Define the -x parameter dynamically.
$paramName = 'x'
$dict = [RuntimeDefinedParameterDictionary]::new()
$dict.Add(
$paramName,
[RuntimeDefinedParameter]::new(
$paramName,
[datetime], # Type the parameter [datetime], for instance.
[ParameterAttribute] @{
Mandatory = $true # Make the parameter mandatory, for instance.
# ParameterSetName = 'default' # Assign it to a parameter set, if needed.
}
)
)
# Return the dictionary
return $dict
}
function A {
[CmdletBinding()]
param(
$y
)
# Assign the shared dynamic parameter.
dynamicparam { & $sharedDynParam }
# The use of `dynamicparam { ... }` requires use of an explicit
# `process { ... }` block (and optionally `begin { ... }` and
# `end { ... }`, as needed).
process {
# Note: A dynamic -x parameter cannot be accessed as $x
# Instead, it must be accessed via the $PSBoundParameters dictionary.
"[$($PSBoundParameters['x'])] - [$y]"
}
}
function B {
[CmdletBinding()]
param(
$z
)
# Assign the shared dynamic parameter.
dynamicparam { & $sharedDynParam }
process {
"[$($PSBoundParameters['x'])] - [$z]"
}
}
# Sample calls,
A -x '1970-01-01' -y yval
B -x '1970-01-02' -z zval
Output:
[01/01/1970 00:00:00] - [yval]
[01/02/1970 00:00:00] - [zval]
To preset the value of a parameter by a given name across commands, use the $PSDefaultParameterValues preference variable:
# Preset a parameter value for all commands ('*') that have
# an -x ('x') parameter.
$PSDefaultParameterValues = @{ '*:x' = [pscustomobject] @{ foo = 1; bar = 2 } }
function A {
[CmdletBinding()] # This makes the function an *advanced* one, which respects
# $PSDefaultParameterValues; similarly, at least one
# parameter-individual [Parameter()] attribute does the same.
param(
$x,
$y
)
"[$x] - [$y]"
}
function B {
[CmdletBinding()]
param(
$x,
$z
)
"[$x] - [$z]"
}
# Sample calls, without an -x argument, relying on
# $PSDefaultParameterValues to provide it automatically.
A -y yval
B -z zval
Output, showing that parameter -x was automatically bound via $PSDefaultParameterValues:
[@{foo=1; bar=2}] - [yval]
[@{foo=1; bar=2}] - [zval]

PowerShell: DynamicParam: get list of passed parameters?

Good afternoon
Unfortunately, PowerShell is not able to select the parameter set based on the parameter types, for example: if the 2nd parameter is passed as an Int, then select ParameterSet1, otherwise use ParameterSet2.
Therefore I would like to manually detect the passed Parameter-Combinations.
Is it possible to get the list of passed parameters in DynamicParam, something like this?:
Function Log {
[CmdletBinding()]
Param ()
DynamicParam {
# Is is possible to access the passed Parameters?,
# something like that:
If (Args[0].ParameterName -eq 'Message') { … }
# Or like this:
If (Args[0].Value -eq '…') { … }
}
…
}
Thanks a lot for any help and light!
Thomas
This first finding was wrong:
"I've found the magic: by using $PSBoundParameters we can access the passed parameters."
This is the correct but very disappointing answer:
It's very annoying and hard to believe, but it looks like PowerShell does not expose any information about the dynamically passed arguments inside the DynamicParam block.
The following example uses the New-DynamicParameter function as defined here:
Can I make a parameter set depend on the value of another parameter?
Function Test-DynamicParam {
[CmdletBinding()]
Param (
[string]$FixArg
)
DynamicParam {
# At this point, $PSBoundParameters only contains
# the parameters declared in Param():
# Key Value
# --- -----
# FixArg Hello
# Add the dynamic parameter DynArg:
New-DynamicParameter -Name 'DynArg' -Type 'string' -HelpMessage 'DynArg help'
# Here, the content of $PSBoundParameters has not been adjusted:
# Key Value
# --- -----
# FixArg Hello
}
Begin {
# Finally - but too late to dynamically react! -
# $PSBoundParameters knows all Parameters (as expected):
# Key Value
# --- -----
# FixArg Hello
# DynArg World
}
Process {
…
}
}
# Pass a fixed and dynamic parameter
Test-DynamicParam -FixArg 'Hello' -DynArg 'World'

Sed: Add a line at the start of each TCL proc

I have a TCL proc like this, and want to add a line - the puts " entered myproc" line - right after the start of the proc:
proc myproc { {filename "input.txt"}
{var1 "x"}
{var2 "y"}
{var3 "z"}
{var4 ""}
{var5 "0"}
{var6 "0"}
{var7 0}
} {
puts " entered myproc"
Can you help? It should also work for
proc myproc2 { N val } {
puts " entered myproc"
# comment line
set ret {}
for { set i 0 } { $i < $N } { incr i } { lappend ret $val }
return $ret
}
If all you want to do is get an execution trace of your code, such as a call stack dump, then you don't need to modify your source code at all. You can use Tcl itself to do it for you.
Tcl has no reserved keywords, none at all. Not even proc is reserved. You can therefore redefine it:
rename proc _proc
# Now proc no longer exists but we have _proc instead.
# Use it to redefine "proc":
_proc proc {name arguments body} {
set body "puts \"entered $name\";$body"
_proc $name $arguments $body
}
Just do that before running any of your own code and you'll find that every proc prints out when it's being entered on each call.
This is how a lot of Tcl debuggers and profilers work - using Tcl to redefine itself.
From your comments it looks like you're trying to also print how deep the stack is with each call. To do that you need to add more code to each proc definition. The most straightforward way is of course something like this:
_proc proc {name arguments body} {
set preamble "set dist2top \[info level\];puts \"\$dist2top entered $name\""
set body "$preamble;$body"
_proc $name $arguments $body
}
But as you can see, writing code inside strings can quickly become unmanageable. There are several tricks you can use to make it more manageable. One of the more common is to split $body by line and use list commands to manipulate the code; that removes at least one level of quoting hell. My favorite is to use a templating technique similar to how you'd write HTML templates in MVC frameworks. I usually use string map for this:
_proc proc {name arguments body} {
_proc $name $arguments [string map [list %NAME% $name %BODY% $body] {
set dist2top [info level]
puts "$dist2top entered: %NAME%"
%BODY%
}]
}
The last argument in the _proc definition is just a string, but it looks like a code block, which makes it easier to read. No nasty quoting hell with this technique.
Using awk you can do:
awk '/^ *proc/ {$0 = $0 "\nputs \" entered myproc\""} 1' RS= proc-file.tcl
Gives this file:
proc myproc { {filename "input.txt"}
{var1 "x"}
{var2 "y"}
{var3 "z"}
{var4 ""}
{var5 "0"}
{var6 "0"}
{var7 0}
} {
puts " entered myproc"

Dot-sourcing functions from file to global scope inside of function

I want to import an external function from a file without converting it to a module (we have hundreds of file-per-function scripts, so treating them all as modules is overkill).
Here is code to illustrate. Please notice that I have some additional logic in Import-Function - adding the scripts' root folder, checking file existence, and throwing a special error - to avoid duplicating that code in each script that requires this kind of import.
C:\Repository\Foo.ps1:
Function Foo {
Write-Host 'Hello world!'
}
C:\InvocationTest.ps1:
# Wrapper func
Function Import-Function ($Name) {
# Checks and exception throwing are omitted
. "C:\Repository\$name.ps1"
# Foo function can be invoked in this scope
}
# Wrapped import
Import-Function -Name 'Foo'
Foo # Exception: The term 'Foo' is not recognized
# Direct import
. "C:\Repository\Foo.ps1"
Foo # 'Hello world!'
Is there any trick, to dot source to global scope?
You can't make the script run in a parent scope, but you can create a function in the global scope by explicitly scoping it.
Would something like this work for you?
# Wrapper func
Function Import-Function ($Path) {
# Checks and exception throwing are omitted
$script = Get-Content $Path -Raw
# Rescope root-level function definitions to the global scope; the (?m)
# option makes ^ match at the start of each line within the file's text.
$script = $script -replace '(?m)^function\s+((?!global[:]|local[:]|script[:]|private[:])[\w-]+)', 'function Global:$1'
. ([scriptblock]::Create($script))
}
The above regex only targets root functions (functions that are left-justified; no whitespace to the left of the word function). To target all functions regardless of spacing (including nested functions), change the -replace line to:
$script = $script -replace '(?m)^\s*function\s+((?!global[:]|local[:]|script[:]|private[:])[\w-]+)', 'function Global:$1'
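A quick usage sketch, assuming C:\Repository\Foo.ps1 contains the Foo definition from the question:
Import-Function -Path 'C:\Repository\Foo.ps1'
Foo # -> 'Hello world!', because the definition was rewritten to 'function Global:Foo'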
You can change the functions that are defined in the dot-sourced files so that they are defined in the global scope:
function Global:Get-SomeThing {
# ...
}
When you dot source that from within a function, the function defined in the dot sourced file will be global. Not saying this is best idea, just another possibility.
Just dot-source the function call as well - the dot operator then runs Import-Function directly in the caller's scope, so the nested dot-source takes effect there:
. Import-Function -Name 'Foo'
Foo # Hello world!
I can't remember a way to run a function in global scope right now. You could do something like this:
$name = "myscript"
$myimportcode = {
# Checks and exception throwing are omitted
. .\$name.ps1
# Foo function can be invoked in this scope
}
Invoke-Expression -Command $myimportcode.ToString()
When you convert the scriptblock to a string with .ToString() and run it via Invoke-Expression, the code executes in the current scope (and $name expands at that point), so the dot-sourced definitions land in the caller's scope.