On a whim, in order to learn PowerShell, which I know I'm woefully behind on, I decided to write a super simple script to use MSBuild to clean a bunch of solutions in code, then build them with a certain set of arguments. Here's the script:
$sourceDir = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
$cleanArgs = '/t:clean /verbosity:q /nologo /P:Configuration=Debug'
$buildArgs = '/verbosity:q /nologo /P:Platform="Any CPU";Configuration=Debug'
$solutions = @("asdf.sln","qwer.sln","zxcv.sln","yuio.sln","hjkl.sln","nm.sln")
function buildWithArgs([string[]]$solutionsToBuild = @(), [string]$args = 'default')
{
Write-Host args: $args
foreach($sln in $solutionsToBuild)
{
& $env:WINDIR\Microsoft.NET\Framework64\v4.0.30319\msbuild.exe "$sourceDir\$sln $args" | Out-Default
}
}
buildWithArgs $solutions $buildArgs
buildWithArgs $solutions $cleanArgs
Now, my problem is that the function "buildWithArgs" doesn't respect the $args parameter, which is always empty when I run or debug the script, no matter what I pass in. The $solutionsToBuild parameter works flawlessly, however. Do I have improper syntax? Even if it were the case that my $cleanArgs string was somehow improperly formatted or not escaped, I can't even make the function work with simple strings like "asdf", as $args is always empty when I run it. Thanks for any help you can provide to what is assuredly some small, silly error!
First, $args is a special, built-in variable in PowerShell, which represents all the arguments to a function. One of the side effects of this is that you can't see its value in the ISE debugger. Try changing its name to MSBuildArgs.
Make sure you define $buildArgs and $cleanArgs as arrays; you're currently defining them as single strings. You do this so you can use PowerShell's splat operator (@ instead of $ when referencing a variable), which sends each element of an array as a separate argument to a command.
Lastly, you don't need to quote parameters that are stored in variables. PowerShell will do any quoting for you.
After making these changes, you'll have something like this:
$sourceDir = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
$cleanArgs = @('/t:clean','/verbosity:q','/nologo','/P:Configuration=Debug')
$buildArgs = @('/verbosity:q','/nologo','/P:Platform="Any CPU";Configuration=Debug')
$solutions = @("asdf.sln","qwer.sln","zxcv.sln","yuio.sln","hjkl.sln","nm.sln")
function buildWithArgs([string[]]$solutionsToBuild = @(), [string[]]$MSBuildArgs = 'default')
{
Write-Host args: $MSBuildArgs
foreach($sln in $solutionsToBuild)
{
$msbuildPath = Join-Path $env:WINDIR Microsoft.NET\Framework64\v4.0.30319\msbuild.exe -Resolve
& $msbuildPath (Join-Path $sourceDir $sln -Resolve) @MSBuildArgs | `
Out-Default
}
}
buildWithArgs $solutions $buildArgs
buildWithArgs $solutions $cleanArgs
$args is a special automatic variable in PowerShell functions. It contains an array of the arguments that are not otherwise bound to the declared parameters of the function. By plain bad luck/inexperience, you happened to name one of your declared parameters the same, which will cause issues.
If you rename your parameter to something else, say $BuildArgs, it should be properly bound.
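To see what $args normally holds, here is a minimal, hypothetical demo (Show-Args is a made-up function name):
# $args collects only the arguments that were NOT bound to a declared parameter.
function Show-Args([string]$Name) {
    "Bound parameter : $Name"
    "Unbound (`$args): $args"
}

Show-Args 'hello' 'extra1' 'extra2'
# Bound parameter : hello
# Unbound ($args): extra1 extra2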
Note, however, that your current code will execute msbuild with a single argument consisting of the solution path and the args string joined together; that is not the correct syntax for invoking msbuild with multiple arguments. If this causes issues for you, I would suggest opening another question or searching online for that particular issue, just so this question does not get too big.
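Should you want to tackle it anyway, the usual approach is to pass one array element per MSBuild switch, since PowerShell passes each element of an array to an external program as a separate argument. A rough, untested sketch, reusing $sourceDir and $solutions from the script above:
# One element per switch; each becomes its own argument to msbuild.exe.
$msbuildPath = "$env:WINDIR\Microsoft.NET\Framework64\v4.0.30319\msbuild.exe"
$cleanArgs   = '/t:clean', '/verbosity:q', '/nologo', '/P:Configuration=Debug'

foreach ($sln in $solutions) {
    & $msbuildPath "$sourceDir\$sln" $cleanArgs | Out-Default
}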
I have a self elevate snippet which is quite wordy, so I decided instead of duplicating it at the top of every script that needs to be run as admin to move it into a separate .ps1:
function Switch-ToAdmin {
# Self-elevate the script if required
if (-not ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] 'Administrator')) {
if ([int](Get-CimInstance -Class Win32_OperatingSystem | Select-Object -ExpandProperty BuildNumber) -ge 6000) {
$Cmd = @(
"-Command Set-Location `"$(Get-Location)`"; & `"$PSCommandPath`""
"-$($PSBoundParameters.Keys)"
)
$ProcArgs = @{
FilePath = 'PowerShell.exe'
Verb = 'RunAs'
ArgumentList = $Cmd
}
Start-Process @ProcArgs
Exit
}
}
}
So for every script that needs elevation I'd prepend
. "$PSScriptRoot\self-elevate.ps1"
Switch-ToAdmin
# rest of script
Doing the above successfully triggers the UAC prompt, but the rest of the script doesn't get executed.
Is this sorta stuff disallowed?
Darin and iRon have provided the crucial pointers:
Darin points out that the automatic $PSCommandPath variable in your Switch-ToAdmin function does not contain the full path of the script from which the function is called, but that of the script in which the function is defined, even if that script's definitions are loaded directly into the scope of your main script via ., the dot-sourcing operator.
The same applies analogously to the automatic $PSScriptRoot variable, which reflects the defining script's full directory path.
Also, more generally, the automatic $PSBoundParameters variable inside a function reflects that function's bound parameters, not its enclosing script's.
iRon points out that the Get-PSCallStack cmdlet can be used to get information about a script's callers, starting at index 1; the first object returned (index 0, when Get-PSCallStack output is captured in an array) represents the current command. Index 1 therefore refers to the immediate caller, which from the perspective of your dot-sourced script is your main script.
Therefore:
Replace $PSCommandPath with $MyInvocation.PSCommandPath, via the automatic $MyInvocation variable. $MyInvocation.PSCommandPath truly reflects the caller's full script path, irrespective of where the called function was defined.
Alternatively, use (Get-PSCallStack)[1].ScriptName, which despite what the property name suggests, returns the full path of the calling script too.
Replace $PSBoundParameters with (Get-PSCallStack)[1].InvocationInfo.BoundParameters (a combined sketch of both substitutions follows this list).
Note that there's also (Get-PSCallStack)[1].Arguments, but it seems to contain a single string only, containing a representation of all arguments that is only semi-structured and therefore doesn't allow robust reconstruction of the individual parameters.
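Putting both substitutions together, the dot-sourced function might look like this. This is a sketch only: it keeps the structure of your original snippet, drops the OS build-number check for brevity, and still inherits the argument-passing limitation discussed in the aside below.
function Switch-ToAdmin {
    # Self-elevate the *calling* script if required.
    if (-not ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] 'Administrator')) {
        # Caller-aware substitutes for $PSCommandPath and $PSBoundParameters:
        $callerPath   = $MyInvocation.PSCommandPath                          # or: (Get-PSCallStack)[1].ScriptName
        $callerParams = (Get-PSCallStack)[1].InvocationInfo.BoundParameters
        $Cmd = @(
            "-Command Set-Location `"$(Get-Location)`"; & `"$callerPath`""
            "-$($callerParams.Keys)"   # still subject to the single-switch limitation discussed below
        )
        $ProcArgs = @{
            FilePath     = 'PowerShell.exe'
            Verb         = 'RunAs'
            ArgumentList = $Cmd
        }
        Start-Process @ProcArgs
        Exit
    }
}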
As an aside:
Even if $PSBoundParameters contained the intended information, "-$($PSBoundParameters.Keys)" would only succeed in passing the bound parameters through if your script defines only one parameter, if that parameter is a [switch] parameter, and if it is actually passed in every invocation.
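To see why, here is what that interpolation produces once more than one parameter is bound (the hashtable below is just a stand-in for $PSBoundParameters):
$fakeBoundParameters = @{ Force = $true; Name = 'widget' }
"-$($fakeBoundParameters.Keys)"   # yields something like "-Force Name": a single malformed token, and the parameter values are lost entirely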
Passing arguments through robustly in this context is hard to do, and has inherent limitations - see this answer for a - complex - attempt to make it work as well as possible.
Issue
The called PowerShell script accepts parameters, but not all of them:
Current Set-Up and code:
I have a common folder where two .ps1 scripts are located:
DoWork.ps1
Workmanager.ps1
Workmanager.ps1 calls DoWork.ps1:
$targetPath="M:\target"
echo "target path: $targetPath"
start powershell {.\DoWork.ps1 -target $targetPath -tempdrive D:\}
output (as expected):
target path: M:\target
DoWork.ps1 contains some start code:
param
(
[string]$tempdrive,
[string]$target,
[int] $threads = 8,
[int] $queuelength = -1
)
echo "variables:"
echo "temp drive: $tempdrive"
echo "target path: $target"
Unexpectedly, $target is not being assigned. Previously I had the variable named $targetpath, which did not work either.
variables:
temp drive: D:\
target path:
Findings
It appears that the issue lies in Workmanager.ps1. Specifying the parameter as a fixed string rather than as a variable passes the parameter correctly. Any solution for this?
start powershell {.\DoWork.ps1 -target "foo" -tempdrive D:\}
When you use a ScriptBlock as an argument to powershell.exe, variables aren't evaluated until after the new session starts. $targetPath has not been set in the child PowerShell process called by Workmanager.ps1, and so it has no value. This is expected behavior for a ScriptBlock in general; it behaves this way in other contexts too.
The solution is mentioned in the help text for powershell -?:
[-Command { - | <script-block> [-args <arg-array>] <========== THIS GUY
| <string> [<CommandParameters>] } ]
You must provide the -args parameter, which will be passed to the ScriptBlock on execution (separate multiple arguments with a ,). Arguments are passed positionally and must be referenced as though you were processing a function's arguments manually via the $args array. For example:
$name = 'Bender'
& powershell { Write-Output "Hello, $($args[0])" } -args $name
However, especially with more complicated ScriptBlock bodies, having to remember which index of $args[i] contains the value you want at a given time is a pain in the butt. Luckily, we can use a little trick with defining parameters within the ScriptBlock to help:
$name = 'Bender'
& powershell { param($name) Write-Output "Hello, $name" } -args $name
This will print Hello, Bender as expected.
Some additional pointers:
The ScriptBlock can be multiline, as though you were defining a function. The examples above are single-line due to their simplicity.
A ScriptBlock is just an unnamed function, which is why defining parameters and referencing arguments within one works the same way.
To exemplify this behavior outside of powershell.exe -Command, Invoke-Command requires you to pass variables to its ScriptBlock in a similar fashion. Note, however, that that answer uses an already-defined function body as the ScriptBlock (which is totally valid to do).
You don't need to use Start-Process here (start is its alias), at least as demonstrated in your example. You can simply use the call operator & unless you need to do something more complex than "run the program and wait for it to finish". See this answer of mine for more information; a sketch applying this to your example follows these pointers.
If you opt to pass a string to powershell.exe instead, you don't need to provide arguments and your variables will get rendered in the current PowerShell process. However, so will any other unescaped variables that might be intended to be set within the child process, so be careful with this approach. Personally, I prefer using a ScriptBlock regardless, and just deal with the extra parameter definition and arguments.
Using the call operator & is optional when you are not executing a path rendered as a string. It can be omitted in the examples above, but it is more useful like so:
& "C:\The\Program Path\Contains\spaces.exe"
& $programPathAsAVariable
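Putting these pointers together for the Workmanager.ps1 example, the call might look roughly like this. This is a sketch; it assumes DoWork.ps1 should run in the current console rather than in a new window, and that the current directory contains DoWork.ps1, as in your original call.
$targetPath = 'M:\target'

# Call operator + a ScriptBlock with declared parameters; the values travel via -args.
& powershell { param($target, $tempdrive) .\DoWork.ps1 -target $target -tempdrive $tempdrive } -args $targetPath, 'D:\'
This runs DoWork.ps1 synchronously in the same console; keep Start-Process only if you specifically need a separate window.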
I'm trying to create a script that handles copying folders and files.
I need to pass the switches '-Recurse -Container' if the source is a folder, and nothing if it's a file.
Is there a way to create a variable that holds '-Recurse -Container' and pass it to the command like this:
$copy_args = '-Recurse -container '
Copy-Item $tmptmp\$file -Destination \\$server\d$\$tmpprd\ $copy_args -Force
thanks
Mor
The best way to do this is with a technique called splatting. You create a hashtable of the parameters you want to pass and then you use @ with the variable name (instead of $) to indicate that you want to splat it into the cmdlet's parameters:
$copy_args = @{
Recurse = $true
Container = $true
}
Copy-Item $tmptmp\$file -Destination \\$server\d$\$tmpprd\ @copy_args -Force
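Since the switches should only apply when the source is a folder, you can build the hashtable conditionally before splatting it. A sketch reusing the variable names from the question; the Test-Path -PathType Container check is an assumption about how you distinguish the two cases:
# Parameters that always apply:
$copy_args = @{
    Destination = "\\$server\d$\$tmpprd\"
    Force       = $true
}
# Add the switches only when the source is a folder:
if (Test-Path -Path "$tmptmp\$file" -PathType Container) {
    $copy_args.Recurse   = $true
    $copy_args.Container = $true
}
Copy-Item -Path "$tmptmp\$file" @copy_args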
Mark Wragg's helpful answer recommends splatting, which gives you the most flexibility.
As an aside:
Setting -Recurse is sufficient in your case, because it implies -Container
In fact, you can even use -Recurse unconditionally, because it is simply ignored if the source path is a file.
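For example, both of the following behave identically for a single file (illustrative paths):
Copy-Item -Recurse 'C:\temp\report.txt' -Destination 'D:\backup\'   # -Recurse is simply ignored for a file
Copy-Item          'C:\temp\report.txt' -Destination 'D:\backup\'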
On occasion you may want to conditionally pass a switch directly, without the added verbosity of splatting.
Given that the syntax - -SomeSwitch:$boolVar or -SomeSwitch:(<boolExpression>) (optionally with whitespace after :) - isn't obvious, let me demonstrate:
Using a Boolean variable:
# The source path.
$sourcePath = "$tmptmp\$file"
# Set the Boolean value that will turn the -Recurse switch on / off.
$doRecurse = Test-Path -PathType Container $sourcePath # $true if $sourcePath is a dir.
# Use -Recurse:$doRecurse
Copy-Item -Recurse:$doRecurse $sourcePath -Destination \\$server\d$\$tmpprd\ -Force
Alternatively, using a Boolean expression:
Copy-Item -Recurse:(Test-Path -PathType Container $sourcePath) $sourcePath -Destination \\$server\d$\$tmpprd\ -Force
Note that the : to separate the parameter name from the argument is a necessity in the case of a switch parameter, so as to indicate that the argument is intended for the switch (which normally doesn't take an argument) rather than being a separate, positional argument.
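To illustrate, using Get-ChildItem merely as a stand-in cmdlet that has a -Recurse switch:
Get-ChildItem -Recurse:$false C:\temp   # $false is the argument *to -Recurse*: recursion is turned off
Get-ChildItem -Recurse $false C:\temp   # $false becomes a separate positional argument (a path named 'False'), which is not what you want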
Caveat: Both in this case and with splatting, passing an effective $false to a switch is technically not the same as omitting the switch, and there are situations where the difference matters.
Read on to learn more.
Technically, a cmdlet or advanced function can distinguish between an omitted switch and one with a $false argument via the automatic $PSBoundParameters variable, which contains a dictionary of all explicitly passed parameters.
In the case of the common -Confirm parameter, this distinction is used intentionally - which is atypical.
Here's a simple demonstration:
# Sample advanced function that supports -Confirm with a medium impact level.
function foo {
[CmdletBinding(SupportsShouldProcess, ConfirmImpact='Medium')]
param()
if ($PSCmdlet.ShouldProcess('dummy')) { 'do it' }
}
# Invocation *with -Confirm* prompts unconditionally.
foo -Confirm # ditto with -Confirm:$true
# Invocation *without -Confirm*:
# Whether you'll be prompted depends on the value of the $ConfirmPreference
# variable: If the value is 'Medium' or 'Low', you'll be prompted.
foo
# Invocation with *-Confirm:$false* NEVER prompts,
# irrespective of the $ConfirmPreference value.
foo -Confirm:$false
I have two PowerShell scripts.
The first script has the following code:
$var = "abc"
$DIR = "C:\"
$SCRIPT_NAME = "abc.ps1"
&"${DIR}\${SCRIPT_NAME}" #execute the second script
If I want to pass the variable $var to the second script, how do I achieve that? What code do I need to put in both the first and the second script?
Parameters (Recommended): Use parameters to pass values to the second script.
Step2.ps1:
param ($myparameter)
write-host $myparameter
Step1.ps1:
$var = "abc"
$DIR = "C:\"
$SCRIPT_NAME = "step2.ps1"
&"${DIR}\${SCRIPT_NAME}" -myparameter $var
Alternative: You could also have used arguments via $args (extra values not linked to a parameter). You can access the first argument using $args[0]. I would, however, always recommend parameters, as arguments need to be passed in a specific order (if multiple arguments are passed), etc.
Step2.ps1:
write-host $args[0]
Step1.ps1:
$var = "abc"
$DIR = "C:\"
$SCRIPT_NAME = "step2.ps1"
&"${DIR}\${SCRIPT_NAME}" $var
There are several ways to do what you want, two of which have already been suggested by @FrodeF.
Pass the variable as a (named) parameter:
# script1.ps1
$var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}" -Foo $var
# script2.ps1
Param($foo)
Write-Output $foo
This is the cleanest solution. You have a well-defined interface and pass the variable in a clear-cut way from one script to another.
Parameter definitions will also allow you to make a parameter mandatory (so that the script will ask the user to provide input if the parameter was omitted), require a particular data type, easily incorporate validation routines, or add comment-based help.
# script2.ps1
<#
.SYNOPSIS
Short description of the script or function.
.DESCRIPTION
Longer description of what the script or function actually does.
.PARAMETER Foo
Description of the parameter Foo.
#>
[CmdletBinding()]
Param(
[Parameter(Mandatory=$true, Position=0, ValueFromPipeline=$true)]
[ValidateRange(2,42)]
[int]$foo
)
Write-Output $foo
See Get-Help about_Function_Advanced_Parameters for more information.
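For instance, assuming the above is saved as script2.ps1 in C:\some\folder, invocations would behave like this (illustrative values):
& 'C:\some\folder\script2.ps1' -Foo 7    # prints 7
& 'C:\some\folder\script2.ps1' -Foo 99   # fails: 99 is outside the ValidateRange(2,42) bounds
& 'C:\some\folder\script2.ps1'           # prompts for Foo, because the parameter is mandatory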
Pass the variable as an unnamed argument:
# script1.ps1
$var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}" $var
# script2.ps1
Write-Output $args[0]
This is the second best approach, because you still pass the variable in a clear-cut way, but the interface isn't as well defined as before.
Define the variable as an environment variable:
# script1.ps1
$env:var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}"
# script2.ps1
Write-Output $env:var
This is a less clean approach than the argument-based ones, as the variable is passed using a "side-channel" (the process environment, which is inherited by child processes).
Just define the variable in the first script and use it in the second one:
# script1.ps1
$var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}"
# script2.ps1
Write-Output $var
This will work as well, because by using the call operator (&) the second script is run in the same context as the first script and thus has access to the same variables. However, "passing" a variable like this will easily break if someone runs the second script in a different context/scope or modifies it without being aware of the implicit dependency.
If you want to go this route it's usually better to use the first script for variable (and function) definitions only, and dot-source it in the second script, so that the definitions are imported into the scope of the second script:
# script1.ps1
$var = 'foo'
# script2.ps1
. 'C:\path\to\script1.ps1'
Write-Output $var
Technically, passing values via a file would be another option. However, I would recommend against using this approach for several reasons:
it's prone to errors due to improper permissions (could be mitigated by creating the file in the $env:TEMP folder),
it's prone to littering the filesystem if you don't clean up the file afterwards,
it needlessly generates disk I/O when simple in-memory operations provided by the language would suffice.
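For completeness, such a file-based hand-off would look roughly like this. A sketch only; the temp-file name is made up, and as argued above, parameters remain the better choice:
# script1.ps1
$var = 'foo'
$tempFile = Join-Path $env:TEMP 'script1-var.txt'
Set-Content -Path $tempFile -Value $var
& 'C:\some\folder\script2.ps1'

# script2.ps1
$tempFile = Join-Path $env:TEMP 'script1-var.txt'
$var = Get-Content -Path $tempFile
Remove-Item $tempFile   # clean up so the file doesn't linger
Write-Output $var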
Here is a sample PowerShell script (it doesn't work) that illustrates what I want to do:
$BuildRoot = '_Provided from script parameter_'
$Files = 'a.dll', 'b.dll', 'c.dll'
$BuiltFiles = $Files | Join-Path $BuildRoot
I have a list of filenames, and a directory name, and I want to join them all together, simple. The problem is that this doesn't work because the Join-Path parameter -ChildPath accepts input from the pipeline ByPropertyName, so the following error is reported:
The input object cannot be bound to any parameters for the command
either because the command does not take pipeline input or the input
and its properties do not match any of the parameters that take
pipeline input.
I can "fix" it by changing the line to the following:
$BuiltFiles = $Files | Select @{ Name = "ChildPath"; Expression = {$_} } | Join-Path $BuildRoot
Basically, the Select operation wraps each string in an object whose ChildPath property holds the original value. This works, but it introduces a lot of syntactic noise to accomplish something that seems so trivial. If this is the only way to do it, so be it, but I'd like to make this script maintainable for people in the future, and this is a little hard to grok at first glance.
Is there a cleaner way to accomplish what I'm trying to do here?
You can accomplish this a little easier like so:
$Files = 'a.dll', 'b.dll', 'c.dll'
$Files | Join-Path $BuildRoot -ChildPath {$_}
Note: you don't want to put {} around the files. That creates a scriptblock in PowerShell which is essentially an anonymous function. Also, when a parameter is pipeline bound you can use the trick of supplying a scriptblock ({}) where $_ is defined in that scriptblock to be the current pipeline object.