command line arguments from file - powershell

I have a PowerShell script for which I expect to pass quite a few arguments on the command line. Having many arguments is not a problem, since it is configured as a scheduled task, but I'd like to make things easier for support people, so that if they ever need to run the script from the command line they have fewer things to type.
So, I am considering the option of having the arguments in a text file, either a Java-style properties file with key/value pairs on each line, or a simple XML file that would have one element per argument with a name element and a value element.
arg1=value1
arg2=value2
I'm interested in the views of PowerShell experts on the two options, and also whether this is the right thing to do.
Thanks

If you want to use the ini file approach, you can easily parse the ini data like so:
$ini = ConvertFrom-StringData (Get-Content .\args.ini -raw)
$ini.arg1
The one downside to this approach is that all the arg values are strings. This works fine for strings and even numbers, AFAICT. Where it falls down is with [switch] args: passing True or $true doesn't have the desired effect.
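If a [switch] (or boolean) parameter is involved, one workaround is to convert that entry to a real [bool] before splatting. A minimal sketch, assuming the .ini file holds parameters for New-PSSession and contains a line like EnableNetworkAccess=True:
$ini = ConvertFrom-StringData (Get-Content .\args.ini -Raw)
$ini.EnableNetworkAccess = [bool]::Parse($ini.EnableNetworkAccess)  # string 'True' -> $true
$s = New-PSSession @ini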
Another approach is to put the args in a .ps1 file and execute it to get back a hashtable with all the args that you can then splat, e.g.:
-- args.ps1 --
@{
    ComputerName = 'localhost'
    Name = 'foo'
    ThrottleLimit = 50
    EnableNetworkAccess = $true
}
$cmdArgs = .\args.ps1
$s = New-PSSession @cmdArgs

To extend Keith's answer slightly, it's possible to set actual variables based on the values from the config file e.g.
$ini = ConvertFrom-StringData (Get-Content .\args.ini -raw)
$ini.keys | foreach-object { set-variable -name $_ -value $ini.Item($_) }
So a file containing
a=1
b=2
c=3
When processed with above code should lead to three variables being created:
$a = 1, $b = 2, $c = 3
I expect this would work for simple string values, but I'm fairly sure it would not be 'smart' enough to convert, say, myArray = 1,2,3,4,5 into an actual array.
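If you do need an array (or another non-string type), you could post-process the value yourself. A minimal sketch, assuming args.ini contains a line like myArray=1,2,3,4,5:
$ini = ConvertFrom-StringData (Get-Content .\args.ini -Raw)
$myArray = $ini.myArray -split ','   # string '1,2,3,4,5' -> array of strings '1','2',...
$myArray = [int[]] $myArray          # optional: cast the elements to the type you need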

Read value of variable in .ps1 and update the same variable in another .ps1

I'm trying to find an efficient way to read the value of a string variable in a PowerShell .ps1 file and then update the same variable/value in another .ps1 file. In my specific case, I would update a variable for the version # on script one and then I would want to run a script to update it on multiple other .ps1 files. For example:
1_script.ps1 - Script I want to read variable from
$global:scriptVersion = "v1.1"
2_script.ps1 - script I would want to update variable on (Should update to v1.1)
$global:scriptVersion = "v1.0"
I would want to update 2_script.ps1 to set the variable to "v1.1" as read from 1_script.ps1. My current method is using get-content with a regex to find a line starting with my variable, then doing a bunch of replaces to get the portion of the string I want. This does work, but it seems like there is probably a better way I am missing or didn't get working correctly in my tests.
My Modified Regex Solution Based on the Answer by @mklement0:
I slightly modified @mklement0's solution, because dot-sourcing the first script was causing it to run:
$file1 = ".\1_script.ps1"
$file2 = ".\2_script.ps1"
$fileversion = (Get-Content $file1 | Where-Object {$_ -match '(?m)(?<=^\s*\$global:scriptVersion\s*=\s*")[^"]+'}).Split("=")[1].Trim().Replace('"','')
(Get-Content -Raw $file2) -replace '(?m)(?<=^\s*\$global:scriptVersion\s*=\s*")[^"]+',$fileversion | Set-Content $file2 -NoNewLine
Generally, the most robust way to parse PowerShell code is to use the language parser. However, reconstructing source code, with modifications after parsing, may situationally be hampered by the parser not reporting the details of intra-line whitespace - see this answer for an example and a discussion.[1]
Pragmatically speaking, using a regex-based -replace solution is probably good enough in your simple case (note that the value to update is assumed to be enclosed in "..." - but matching could be made more flexible to support '...' quoting too):
# Dot-source the first script in order to obtain the new value.
# Note: This invariably executes *all* top-level code in the script.
. .\1_script.ps1
# Outputs to the display.
# Append
# | Set-Content -Encoding utf8 2_script.ps1
# to save back to the input file.
(Get-Content -Raw 2_script.ps1) -replace '(?m)(?<=^\s*\$global:scriptVersion\s*=\s*")[^"]+', $global:scriptVersion
For an explanation of the regex and the ability to experiment with it, see this regex101.com page.
[1] Syntactic elements are reported in terms of line and column position, and columns are character-based, meaning that spaces and tabs are treated the same, so that a difference of, say, 3 character positions can represent 3 spaces, 3 tabs, or any mix of it - the parser won't tell you. However, if your approach allows keeping the source code as a whole while only removing and splicing in certain elements, that won't be a problem, as shown in iRon's helpful answer.
To complement the helpful answer from @mklement0: in case you do go for the PowerShell abstract syntax tree (AST) class, you might use the Extent.StartOffset/Extent.EndOffset properties to reconstruct your script:
Using NameSpace System.Management.Automation.Language

$global:scriptVersion = 'v1.1'   # . .\Script1.ps1
$Script2 = {                     # = Get-Content -Raw .\Script2.ps1
    [CmdletBinding()]param()
    begin {
        $global:scriptVersion = "v1.0"
    }
    process {
        $_
    }
    end {}
}.ToString()

$Ast = [Parser]::ParseInput($Script2, [ref]$null, [ref]$null)
$Extent = $Ast.Find(
    {
        $args[0] -is [AssignmentStatementAst] -and
        $args[0].Left.VariablePath.UserPath -eq 'global:scriptVersion' -and
        $args[0].Operator -eq 'Equals'
    }, $true
).Right.Extent

-Join (
    $Script2.SubString(0, $Extent.StartOffset),
    $global:scriptVersion,
    $Script2.SubString($Extent.EndOffset)
) # |Set-Content .\Script2.ps1

How to escape the following regex, so it's usable in PowerShell?

As far as I know, there is no way in PowerShell to execute an exe with parameters specified in a variable and direct the return of the exe into a variable. Therefore I am currently writing a small function to make this possible. But now I am stuck at the point that the parameters have to be passed individually when calling with &. (This is not necessary for all programs, but some programs cause problems if you pass all parameters as a string in a variable) Therefore I want to use a split to write the parameters passed to my function into an array. And then pass the array with the parameters in my exe call.
For this I have the following regex:
[^\s"']+|"([^"]*)"|'([^']*)'
This regex allows single and double quotes to be taken into account when passing parameters, so that text with spaces inside quotes is not split.
But unfortunately I don't have the slightest idea how to best escape this regex so that it doesn't cause any problems in the PowerShell script.
Here then still my function to make it a little easier to understand:
The function executes the file passed in the $Path parameter with the parameters from $Arguments. Before the execution I try to split $Arguments with the regex. As the return value of the function, you get an object with the ExitCode and the output of the executed file. Here you can see my attempt, but the quotes cause problems with the following code.
function Invoke-Process ($Path,$Arguments){
    [PsObject[]] $ReturnValue = @()
    $Params=$Arguments -split([regex]::escape([^\s"']+|"([^"]*)"|'([^']*)'))
    $ExCode = 0
    try {
        $ProcOutput = & $Path $Params | out-string
    }catch{
        $ProcOutput = "Failed: $($_.Exception.Message)"
        $ExCode = 1
    }
    $ReturnValue += [PsObject]@{ ExitCode = $ExCode; Output = $ProcOutput}
    Return $ReturnValue
}
The function is called as follows:
$ExePath = "C:\arcconf\arcconf.exe"
$ExeParams = "getconfig 1"
$Output = Invoke-Process $ExePath $ExeParams
I hope you can understand my problem. I am also open to other ways of writing the function.
Greetings
There's nothing you need to escape - the pattern is perfectly valid.
All you need is a string literal type that won't treat the quotation marks as special. For this, I would suggest a verbatim here-string:
@'
This is a single-line string ' that " can have all sorts of verbatim quotation marks
'@
The qualifiers for a here-string are @' as the last token on the preceding line and '@ as the first token on the following line (for an expandable here-string, use @" and "@).
Try running it with valid sample input:
$pattern = @'
[^\s"']+|"([^"]*)"|'([^']*)'
'@
'getconfig 1 "some string" unescaped stuff 123' |Select-String $pattern -AllMatches |ForEach-Object {$_.Matches.Value}
Which, as expected, returns:
getconfig
1
"some string"
unescaped
stuff
123
As an alternative to a here-string, you can use a regular verbatim string literal. The only character you need to escape is ', and the escape sequence is simply two in a row: '', so your source code becomes:
$pattern = '[^\s"'']+|"([^"]*)"|''([^'']*)'''
Mathias R. Jessen's helpful answer answers your immediate question well.
Taking a step back:
there is no way in PowerShell to execute an exe with parameters specified in a variable and direct the return of the exe into a variable
No: PowerShell does support both of those things:
$output = someExternalProgram ... captures the stdout output from an external program in variable $output; use redirection 2>&1 to also capture stderr output; use >$null or 2>$null to selectively discard stdout or stderr output - see this answer for details.
someExternalProgram $someVariable ... passes the value of variable $someVariable as an argument to an external program; if $someVariable is an array (collection) of values, each element is passed as a separate argument. Instead of a variable reference, you may also use the output from a command or expression, via (...); e.g., someExternalProgram (1 + 2) passes 3 as the argument - see this answer for details.
Note: An array's elements getting passed as individual arguments only happens for external programs by default; to PowerShell-native commands, arrays are passed as a whole, as a single argument - unless you explicitly use splatting, in which case you must pass @someVariable rather than $someVariable. For external programs, @someVariable is effectively the same as $someVariable. While array-based splatting also works for PowerShell-native commands, for passing positional arguments only, the typical and more robust and complete approach is to use hashtable-based splatting, where the target parameters are explicitly identified.
A general caveat that applies whether or not you use variables or literal values as arguments: Up to at least PowerShell 7.2.x, passing empty-string arguments or arguments with embedded " chars. to external programs is broken - see this answer.
With the above in mind you can rewrite your function as follows, which obviates the need for - ultimately brittle - regex parsing:
function Invoke-Process {
    # Split the arguments into the executable name / path and all
    # remaining arguments.
    $exe, $passThruArgs = $args
    try {
        # Call the executable with the pass-through arguments
        # and capture its *stdout* output.
        $output = & $exe @passThruArgs
        $exitCode = $LASTEXITCODE # Save the process' exit code.
    } catch {
        # Note: Only if $exe is empty or isn't a valid executable name or path
        # does a trappable statement-terminating error occur.
        # By contrast, if the executable can be called in principle, NO
        # trappable error occurs if the process exit code is nonzero.
        $exitCode = 127 # Use a special exit code.
        $output = $_.ToString() # Use the exception message.
    }
    # Construct and output a custom object containing
    # the captured stdout output and the process' exit code.
    [pscustomobject] @{
        ExitCode = $exitCode
        Output = $output
    }
}
Sample calls:
Invoke-Process cmd /c 'echo hi!'
# Equivalent, using an array variable for the pass-through arguments.
$argsForCmd = '/c', 'echo hi!'
Invoke-Process cmd @argsForCmd
Note:
Since Invoke-Process is a PowerShell command, splatting (@argsForCmd) is necessary here in order to pass the array elements as individual arguments, which inside the function are then reflected in the automatic $args variable.
The automatic $args variable is only available in simple functions, as opposed to advanced ones, which behave like cmdlets and therefore automatically support additional features, such as common parameters; to make your function an advanced one, replace the line $exe, $passThruArgs = $args at the top of the function with the following:
[CmdletBinding()]
param(
    [Parameter(Mandatory)]
    [string] $exe,
    [Parameter(ValueFromRemainingArguments)]
    [string[]] $passThruArgs
)

How do you expand a powershell variable as parameters to another program?

First I have tried to locate other questions and these all seem related to replacement rather than expansion.
How to expand variable in powershell?
Powershell variable expansion when calling other programs
Powershell variable expansion in parameters
How to expand a variable when calling another powershell instance?
So I must first create clarity on what I mean by "expansion".
> $TEST="Foo Bar"
> echo foo bar
foo
bar
> echo $TEST
Foo Bar
When passing two parameters, PowerShell's implementation of echo will print each parameter on its own line.
If I echo a variable containing two values, it is passed as a single parameter. I would like the variable to be expanded into the multiple arguments it contains, getting the behavior from the first example.
I have looked at:
Single quotes
Parentheses
Curly braces in different forms
Is this possible in powershell?
Context Update:
Unfortunately I'm not the one setting the variable; it comes from GitLab environment variables.
It sounds like you want the @ splat operator:
$test = "foo","bar"
echo @test
This will have the exact same effect as echo foo bar
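If, as in the GitLab case, the value arrives as a single string rather than an array, one hedged workaround is to split it into an array first and then splat that (splitting on whitespace is an assumption here and would break values that themselves contain spaces):
$TEST = 'Foo Bar'          # e.g. the value handed to you in an environment variable
$testArgs = -split $TEST   # unary -split: splits on whitespace -> 'Foo', 'Bar'
echo @testArgs             # now behaves like: echo Foo Bar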
How to pass a list of arguments into a powershell commandlet?
Let's say you need to pass a bunch of CSV header fields into the -Header argument, which is part of the ConvertFrom-Csv cmdlet, in order to convert an array into CSV objects.
# The test array
$pac = @("app1", "app2")
# The needed command:
$pac | ConvertFrom-Csv -Header id, gui, src, sco, name, loc
# This works, but is too verbose, and can't be modified.
# So you Try-1:
$hdr = 'id,gui,src,sco,name,loc'
$pac | ConvertFrom-Csv -Header $hdr
# Not working!
Q: What is going on?
A-1: It's complicated, but the solution is easy, once you find it.
A-2: There are 3 main solutions to skin this cat.
Solution-1
Write the header variable as a string, and use the -split operator inline.
Caveats: Watch your spaces! And it's harder to read.
Advantage: Simple.
$hdr = 'id,gui,src,sco,name,loc'
$pac | ConvertFrom-Csv -Header ($hdr -split ',')
Solution-2
Write the header variable as a quoted list.†
Caveats: Still long and verbose.
Advantage: Easy to read.
† I was never able to use the splat @ operator here (but see the hashtable-splatting sketch after Solution-3 below).
$hdr = 'id', 'gui', 'src', 'sco', 'name', 'loc'
$pac | ConvertFrom-Csv -Header $hdr
Solution-3
Write the header with a tricky [just my way of naming it] argument function that simply returns its arguments.
Caveats: Is tricky at first sight.
Advantage: No quotes and easy to read, and can be reused.
function ql { $args }
$hdr = (ql id, gui, src, sco, name, loc)
$pac | ConvertFrom-Csv -Header $hdr
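Regarding the footnote about the splat operator: array splatting binds positionally, which doesn't suit -Header, but hashtable splatting, which names the target parameter explicitly, should work here. A small sketch:
$csvArgs = @{ Header = 'id', 'gui', 'src', 'sco', 'name', 'loc' }
$pac | ConvertFrom-Csv @csvArgs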

Can I Pipe Powershell output to an accelerator?

I've obtained a file path to an xml-resource, by interrogating task scheduler arguments.
I'd like to pipe these file paths to [xml], to return data using XPath.
Online I see accelerators and variables are used, eg
$xml = [XML](Get-Content .\Test.xml)
I tried piping to ConvertTo-Xml, but that gives an XML object containing the file path, so I need to convert to [xml] - hoping to do this in the pipeline, potentially for more than one XML document.
Is it possible to pipe to [typeaccelerators] ?
Should I be piping to New-Object, or Tee-Variable, as required?
I hope to eventually be able to construct a one-liner to interrogate several nodes (eg LastRan, LastResult)
Currently I have this, which only works for one:
([xml](Get-Content ((Get-ScheduledTask -TaskPath *mytask* | select -First 1).Actions.Arguments | % {$_.Split('"')[-2]}))).MyDocument.LastRan
returns the value of LastRan, from MyDocument node.
Thanks in advance 👍
If you want to take pipeline input you need to make a function and set the parameter attribute ValueFromPipeline
Function Convert-XML {
    Param(
        [Parameter(ValueFromPipeline)]$xml
    )
    process {
        [xml]$xml
    }
}
Then you could take the content of an xml file (all at once, not line by line)
Get-Content .\Test.xml -Raw | Convert-XML
Of course, to get your one-liner you'd probably want to add the logic for that in the function; however, this is how you'd handle pipeline input.
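For instance, here is a hedged sketch of a variant that also pulls out specific nodes (MyDocument, LastRan and LastResult are the node names from your question; adjust them to your actual schema):
Function Convert-XML {
    Param(
        [Parameter(ValueFromPipeline)]$xml
    )
    process {
        $doc = [xml]$xml
        [pscustomobject]@{
            LastRan    = $doc.MyDocument.LastRan
            LastResult = $doc.MyDocument.LastResult
        }
    }
}
Get-Content .\Test.xml -Raw | Convert-XML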

Updating a txt file of variables in powershell

So my challenge today.
I have a config file (really just a txt document) that stores variables holding information passed between scripts or used after restarts.
I am looking for a more efficient way to read and update the file. Currently I read the file with:
Get-Content $current\Install.cfg | ForEach-Object {
    Set-Variable -Name line -Value $_
    $a, $b = $line.Split('=')
    Set-Variable -name $a -Value $b
}
But to overwrite the contents, I recreate the file with:
ECHO OSV=$OSV >>"$ConfigLoc\tool.cfg"
ECHO OSb=$OSb >>"$ConfigLoc\tool.cfg"
ECHO cNum=$cNum >>"$ConfigLoc\tool.cfg"
ECHO cCode=$cCode >>"$ConfigLoc\tool.cfg"
ECHO Comp=$Comp >>"$ConfigLoc\tool.cfg"
Each time I have added a new saved variable, I have just hardcoded the new variable into both the original config file and the config updater.
As my next updates require an additional 30 variables on top of my current 15, I would like something like:
Get-Content $current\Install.cfg | ForEach-Object {
    Set-Variable -Name line -Value $_
    $a, $b = $line.Split('=')
    ECHO $a=$$a
}
Where $$a uses the variable $a in the loop as the variable name to load the value.
The best example I can show to clarify is:
ECHO $a=$$a (as written in the current loop)
should behave like
ECHO OSV=$OSV (as the line actually appears in the code now)
Not sure how to clarify this anymore, or how to achieve it with the variable title also being a variable.
If you want to create a file that has name=value parameters, here's an alternate suggestion. This is a snippet of a real script I use every day. You might modify it so it reads your .csv input and uses it instead of the hard coded values.
$Sites = ("RawSiteName|RoleName|DevUrl|SiteID|HttpPort|HttpsPort", `
"SiteName|Name of role|foo.com|1|80|443" `
) | ConvertFrom-CSV -Delimiter "|"
$site = $sites[0]
Write-Host "RawSiteName =$($site.RawSiteName)"
You might be able to use something similar to $text = Get-Content MyParameters.csv and pipe that to the ConvertFrom-CSV cmdlet. I realize it's not a direct answer to what you are doing but it will let you programmatically create a file to pass across scripts.
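A minimal sketch of that idea, assuming a MyParameters.csv file that uses the same |-delimited layout as the hard-coded example above (the file name and columns are hypothetical):
$Sites = Get-Content .\MyParameters.csv | ConvertFrom-Csv -Delimiter '|'
$site = $Sites[0]
Write-Host "RawSiteName = $($site.RawSiteName)"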
Thanks for the help everyone. This is the solution I am going with. Importing and exporting couldn't be simpler. If I have to manually update the XML install defaults, I can do so with ease, which is also amazing. I also love the fact that even if you import as $Test you can still use $original to access the variables. I will be creating multiple hashtables to organize the different data I will be using going forward and just import/export it in a $config variable as the master.
$original = @{
    OSV='0'
    OSb='0'
    cNum='00000'
    cCode='0000'
    Client='Unknown'
    Comp='Unknown'
}
$original | Export-Clixml $Home\Desktop\sample.cfg
$Test = Import-Clixml $Home\Desktop\sample.cfg
Write $Test
Write $original.Client
In essence, you're looking for variable indirection: accessing a variable indirectly, via its name stored in another variable.
In PowerShell, Get-Variable allows you to do that, as demonstrated in the following:
# Sample variables.
$foo='fooVal'
$bar='barVal'
# List of variables to append to the config file -
# the *names* of the variables above.
$varsToAdd =
'foo',
'bar'
# Loop over the variable names and use string expansion to create <name>=<value> lines.
# Note how Get-Variable is used to retrieve each variable's value via its *name*.
$(foreach ($varName in $varsToAdd) {
"$varName=$(Get-Variable $varName -ValueOnly)"
}) >> "$ConfigLoc/tool.cfg"
With the above, the following lines are appended to the output *.cfg file:
foo=fooVal
bar=barVal
Note that you can read such a file more easily with the ConvertFrom-StringData cmdlet, which outputs a hashtable with the name-value pairs from the file:
$htSettings = Get-Content -Raw "$ConfigLoc/tool.cfg" | ConvertFrom-StringData
Accessing $htSettings.foo would then return fooVal, for instance.
With a hashtable as the settings container, updating the config file becomes easier, as you can simply recreate the file with all settings and their current values:
$htSettings.GetEnumerator() |
ForEach-Object { "$($_.Key)=$($_.Value)" } > "$ConfigLoc/tool.cfg"
Note: PowerShell by default doesn't enumerate the entries of a hashtable in the pipeline, which is why .GetEnumerator() is needed.
Generally, though, this kind of manual serialization is fraught, as others have pointed out, and there are more robust - though typically less friendly - alternatives.
With your string- and line-based serialization approach, there are two things to watch out for:
All values are saved as strings, so you have to manually reconvert to the desired data type, if necessary - and even possible, given that not all objects provide meaningful string representations. (Generally, the most robust serialization format is Export-Clixml, but note that it is not a friendly format - be careful with manual edits.)
ConvertFrom-StringData will fail with duplicate names in the config file, which means you have to manually ensure that you create no duplicate entries when you append to the file - if you use the above approach of recreating the file from a hashtable every time, however, you're safe.