I was not able to figure out whether the following is possible. It looks like it is not, but I thought I would ask the community.
Let's say I have simple handling of script parameters, like this:
[CmdletBinding()]
Param(
    [parameter(Position=0)]
    [string] $version,

    [parameter(Position=1)]
    # FIND THIS VERSION IN SOURCE. **VERSION MUST HAVE BEEN SET BY NOW**
    [ValidateScript( { SomeModule\Search $version $_; $true } )]
    [string] $source
)
Now, 'version' represents a component's version and 'source' represents some source (location) to search for that component (there could be multiple sources). The earliest point I can actually call the search is in the ValidateScript{}, and this is good: processing of the other parameters will not proceed if some error happens in the search, and it eliminates a bunch of if(...){} statements later in the code that would otherwise be needed to check whether a parameter was passed before acting on it. However, the order of the supplied values is important:
This is fine. Values are supplied in the right order
MyScript.ps1 -version 12345ABC -source 'filesystem'
This is fine. Values are supplied in the right order. Positional binding is in effect.
MyScript.ps1 12345ABC 'filesystem'
This will not work. Values are supplied in the wrong order (from the script's point of view).
MyScript.ps1 -source 'filesystem' -version 12345ABC
I do not want to put a restriction on the caller requiring a specific order when parameter names are used.
As a workaround I could re-arrange $args, but I cannot have any code prior to [CmdletBinding()]. I could have a new script that changes the order in $args and then calls MyScript.ps1, though that wouldn't be a nice solution. I am using PowerShell 4.0.
I suspect the problem is that you are trying to use one parameter in the other's validation script, and PowerShell appears to run the validation while parsing the invocation rather than at actual execution. If you move the validation for $source into the body of the function, you will probably not have any further problem. I have not yet tried this, but will when I get a chance, and will comment to confirm at that time.
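A minimal sketch of that idea, keeping the placeholder Search call from the question and doing the cross-parameter check in the script body instead of in ValidateScript:
[CmdletBinding()]
Param(
    [parameter(Position=0)]
    [string] $version,

    [parameter(Position=1)]
    [string] $source
)

# Both parameters are bound by the time the body runs,
# no matter which order they were supplied in.
if ($source) {
    SomeModule\Search $version $source   # placeholder search call from the question
}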
If I remember correctly, you can omit the Position declarations completely and PowerShell will process the parameters positionally, in declaration order, if you don't specify the parameter name (-name) on the command line.
[CmdletBinding()]
Param(
    [string] $version,

    # FIND THIS VERSION IN SOURCE. **VERSION MUST HAVE BEEN SET BY NOW**
    [ValidateScript( { SomeModule\Search $version $_; $true } )]
    [string] $source
)
At this point the following should work:
MyScript.ps1 12345ABC 'filesystem'
And also this should work:
MyScript.ps1 -source 'filesystem' -version 12345ABC
Named parameters can be in any order as they are not bound by Position.
In your script these two are interchangeable as they are named parameters.
MyScript.ps1 -version 12345ABC -source 'filesystem'
MyScript.ps1 -source 'filesystem' -version 12345ABC
Positional parameters are not explicitly named, and do need to be in the correct order as defined by Position in your params.
MyScript.ps1 12345ABC 'filesystem'
Related
I have recently had to take a huge leap from my Unix scripting to the MS side of things and find myself overwhelmed by PowerShell.
My situation is as follows:
I have a script, script.ps1, which can only be run under a specific Windows account. To facilitate its use, it was decided that if the user runs the script from a different account, it will prompt for credentials and restart itself from within (similar to recursion), while, importantly, maintaining the input parameters.
I have found that Invoke-Command is probably what I am looking for, but I cannot seem to build the PowerShell call for it.
My code snippet looks like this:
if (!([System.Environment]::UserName -eq $user)) {
    $Credential = Get-Credential -Credential INTRANET\$user
    Invoke-Command -FilePath $script -Credential $Credential -ArgumentList $arguments
}
where $user contains the desired user, $script contains the file path of script.ps1, and $arguments contains the command-line arguments that were passed to the script, as a string, i.e. -order 66 -location UAT.
But currently I get an error:
Parameter set cannot be resolved using the specified named parameters.
...
FullyQualifiedErrorId : AmbiguousParameterSet
I tried shuffling the parameters around and I tried using Start-Process instead of Invoke-Command, but everything resulted in the same or similar errors.
Also, because I am really new to PowerShell, please do not hesitate to offer a different solution if it is viable. I do not know the capabilities of the language well.
Lastly, please note that the starting point is always a PowerShell prompt running under a non-elevated user account. Unfortunately, starting PowerShell under a different account in the first place is not an option for us.
The problem is probably that you are passing the parameters stored in the variable $arguments as a string in the regular format, like you said: -order 66 -location UAT
The -ArgumentList parameter works differently: it's an array used for array splatting. So you can't pass the values by parameter name; you have to pass them by parameter order, e.g.:
$Arguments = @(66,'uat')
Invoke-Command -FilePath $script -Credential $Credential -ArgumentList $Arguments
See Parameter Argumentlist.
See Array Splatting.
The value 66 is passed to the first parameter, the value 'uat' to the second, and so on. So you must know the order of the parameters and insert the related values into the array at the right positions.
To control the position of the parameters, the param specification in the other script should at least have:
param (
    [parameter(Position=1)]
    [int]$order,

    [parameter(Position=2)]
    [string]$location
)
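Applied to the snippet from the question, the calling side could then look roughly like this (a sketch; $order and $location stand for the values that were previously baked into the $arguments string):
if (!([System.Environment]::UserName -eq $user)) {
    $Credential = Get-Credential -Credential INTRANET\$user

    # Build the argument array in the order declared by Position in script.ps1
    $Arguments = @($order, $location)

    Invoke-Command -FilePath $script -Credential $Credential -ArgumentList $Arguments
}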
To present the problem, I have this simple script saved as a PowerShell module (test.psm1):
Write-Verbose 'Verbose message'
In real life it includes commands to import additional functions, but that is irrelevant at the moment.
If I run Import-Module .\test.psm1 -Verbose -Force I get only
VERBOSE: Loading module from path 'C:\tmp\test.psm1'.
My Write-Verbose is ignored 😟
I tried adding [CmdletBinding()] but it also did not work:
[cmdletbinding()]
param()
Write-Verbose 'Verbose message'
Any clue how to provide Verbose output while importing the PowerShell module?
P.S. I do not want to display verbose information always, only if -Verbose is specified. Here is my expected output for the two different cases:
PS C:\> Import-Module .\test.psm1 -Verbose -Force # with verbose output
VERBOSE: Loading module from path 'C:\tmp\test.psm1'.
VERBOSE: Verbose message
PS C:\> Import-Module .\test.psm1 -Force # without verbose output
PS C:\>
That is an interesting situation. I have a theory, but if anyone can prove me wrong, I would be more than happy.
The short answer: you probably cannot do what you want by playing with -Verbose only. There may be some workarounds, but the shortest path could be setting $VerbosePreference.
First of all, we need to understand the lifetime of a module when it is imported:
When a module is imported, a new session state is created for the module, and a System.Management.Automation.PSModuleInfo object is created in memory. A session state is created for each module that is imported (this includes the root module and any nested modules). The members that are exported from the root module, including any members that were exported to the root module by any nested modules, are then imported into the caller's session state. [..] To send output to the host, users should run the Write-Host cmdlet.
The last line is the first hint that pointed me to a solution: when a module is imported, a new session state is created, but only exported elements are attached to the global session state. This means that the test.psm1 code is executed in a session state different from the one where you run Import-Module; therefore the -Verbose option, which is local to that single command, is not propagated.
Instead (and this is an assumption of mine, since I did not find it in the documentation), configuration from the global session state is visible to all the child session states. Why is this important? Because there are two ways to turn on verbosity:
The -Verbose option, which does not work in this case because it is local to the command
$VerbosePreference, which sets the verbosity for the entire session using a preference variable
I tried the second approach and it worked, despite not being so elegant:
$VerbosePreference = "Continue" # print all the verbose messages, disabled by default
Import-Module .\test.psm1 -Force
$VerbosePreference = "SilentlyContinue" # restore default value
Now some considerations:
Specifying -Verbose on the Import-Module command is redundant
You can still override the verbosity configuration inside your module script, by using
Write-Verbose -Message "Verbose message" -Verbose:$false
As @Vesper pointed out, $false will always suppress the Write-Verbose output. Instead, you may want to parameterize that option with a Boolean variable assigned in a previous check. Something like:
if (...)
{
    $forceVerbose = $true
}
else
{
    $forceVerbose = $false
}
Write-Verbose -Message "Verbose message" -Verbose:$forceVerbose
There might be other less invasive workarounds (for instance centered on Write-Host), or even a real solution. As I said, it is just a theory.
Marco Luzzara's answer is spot on (and deserves the bounty, in my opinion) with regard to the module being run in its own session state, and the fact that by design you can't access those variables.
An alternative to setting $VerbosePreference and restoring it is to have your module take a parameter specifically for this purpose. You touched on this a little bit by trying to add [CmdletBinding()] to your module; the problem is that you have no way to pass in named parameters, only unnamed arguments, via Import-Module -ArgumentList, so you can't specifically pass $true for -Verbose.
Instead you can specify your own parameter and use it.
(psm1)
[CmdletBinding()]param([bool]$myverbose)
Write-Verbose "Message" -Verbose:$myverbose
followed with:
Import-Module test.psm1 -Force -ArgumentList $true
In the above example it applies only to that specific command, so you would have to set -Verbose:$myverbose every time.
But you could apply it to the module's $VerbosePreference:
[CmdletBinding()]param([bool]$myverbose)
$VerbosePreference = if ($myverbose) { 'Continue' } else { 'SilentlyContinue' }
Write-Verbose "Message"
That way it applies throughout.
At this point I should mention the drawback of what I'm showing: you might notice I didn't include -Verbose in the Import-Module call, and that's because it doesn't change the behavior inside the module. The verbose messages from inside will be shown purely based on the argument you passed in, regardless of the -Verbose setting on Import-Module.
An all-in-one solution then goes back to Marco's answer: manipulating $VerbosePreference on the caller's side. I think it's the only way to get both behaviors aligned, but only if you don't use the -Verbose switch on Import-Module to override it.
On the other hand, within a scope, like within an advanced function that can take -Verbose, setting the switch changes the local value of $VerbosePreference. That can lead us to wrap Import-Module in our own function:
function Import-ModuleVerbosely {
[CmdletBinding()]
param($Name, [Switch]$Force)
Import-Module $Name -Force:$Force
}
Great! Now we can call Import-ModuleVerbosely test.psm1 -Force -Verbose. But... it didn't work. Import-Module did recognize the verbose setting but it didn't make it down into the module this time.
Although I haven't been able to find a way to verify it, I suspect it's because the variable is set to Private (even though Get-Variable seems to say otherwise), so that value doesn't make it through this time. Whatever the reason, we can go back to making our module accept a value. This time, let's make it the same type for ease of use:
(psm1)
[CmdletBinding()]param([System.Management.Automation.ActionPreference]$myverbose)
if ($myverbose) { $VerbosePreference = $myverbose }
Write-Verbose "message"
Then let's change the function:
function Import-ModuleVerbosely {
[CmdletBinding()]
param($Name, [Switch]$Force)
Import-Module $Name -Force:$Force -ArgumentList $VerbosePreference
}
Hey, now we're getting somewhere! But... it's kind of clunky, isn't it?
You could go further with it, making a full-on proxy function for Import-Module and then aliasing it as Import-Module to replace the real one, as sketched below.
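A rough sketch of that direction (not a complete proxy: it forwards only the parameters used in this answer, and instead of an alias it simply shadows the cmdlet with a function of the same name, since functions take precedence over cmdlets):
function Import-Module {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory = $true)] [string] $Name,
        [switch] $Force
    )
    # Pass the caller's effective verbosity into the module as an argument,
    # then call the real cmdlet by its module-qualified name.
    Microsoft.PowerShell.Core\Import-Module -Name $Name -Force:$Force -ArgumentList $VerbosePreference
}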
Ultimately you're trying to do something not really supported, so it depends how far you want to go.
I have a PowerShell script that contains this at the top:
Param(
# [snip]
[string] [Parameter(Mandatory=$true)] $Server
)
In my VSTS Release definition, I added an Azure PowerShell task calling the script, and passed the arguments as:
-Server '$(ServerName)' [snip]
However, when I trigger a new Release, at the step for this script, I get this error:
##[error]A parameter cannot be found that matches parameter name 'Server'.
I verified in the log output that the server name is passed properly. I even copy/pasted the command logged, and after fixing paths, it ran locally with no problems.
Why is this happening, and how can I fix it?
Changing the name of the argument fixed the issue. My script now contains:
Param(
# [snip]
[string] [Parameter(Mandatory=$true)] $ServerName
)
And I pass the arguments as:
-ServerName '$(ServerName)' [snip]
As for the why, I can only speculate. I checked the source code but couldn't find anything obvious. My only guess is that the Azure PowerShell task overwrote $Server for some reason (though I'm not sure why the log output would show the correct argument in that case).
Your $Server variable is already declared. Here is a trick for when you want to test code like this: first remove the variable and clear the screen, as in this example:
cls
Remove-Variable server

function ssssss
{
    param
    (
        [string][parameter(Mandatory = $true)]$server
    )
    Write-Host $server
}

ssssss -server "Enter servername here"
It's working fine.
For me, the error was in the way I was passing the arguments to the task. When I passed them all on one line, it actually worked.
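For illustration, that means the task's script arguments field ended up as a single line such as the following (the second parameter here is just a made-up placeholder alongside the -ServerName parameter from the answer above):
-ServerName '$(ServerName)' -AnotherParam '$(AnotherValue)'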
I'm attempting to run a PowerShell script with the input being the results of another PowerShell cmdlet. Here's the cross-forest Exchange 2013 PowerShell command I can run successfully for one user by specifying the -Identity parameter:
.\Prepare-MoveRequest.ps1 -Identity "user@domain.com" -RemoteForestDomainController "dc.remotedomain.com" $Remote -UseLocalObject -OverwriteLocalObject -Verbose
I want to run this command for all MailUsers. Therefore, what I want to run is:
Get-MailUser | select windowsemailaddress | .\Prepare-MoveRequest.ps1 -RemoteForestDomainController "dc.remotedomain.com" $Remote -LocalForestDomainController "dc.localdomain.com" -UseLocalObject -OverwriteLocalObject -Verbose
Note that I removed the -Identity parameter because I was feeding it from each Get-MailUser's WindowsEmailAddress property value. However, this returns with a pipeline input error.
I also tried exporting the WindowsEmailAddress property values to a CSV, and then reading it as per the following site, but I also got a pipeline problem: http://technet.microsoft.com/en-us/library/ee861103(v=exchg.150).aspx
Import-Csv mailusers.csv | Prepare-MoveRequest.ps1 -RemoteForestDomainController DC.remotedomain.com -RemoteForestCredential $Remote
What is the best way to feed the windowsemailaddress field from each MailUser to my Prepare-MoveRequest.ps1 script?
EDIT: I may have just figured it out with the following foreach addition to my Import-Csv option above. I'm testing it now:
Import-Csv mailusers.csv | foreach { Prepare-MoveRequest.ps1 -Identity $_.windowsemailaddress -RemoteForestDomainController DC.remotedomain.com -RemoteForestCredential $Remote }
You should declare your custom function called Prepare-MoveRequest instead of simply making it a script. Then, dot-source the script that declares the function, and then call the function. To accept pipeline input into your function, you need to declare one or more parameters that use the appropriate parameter attributes, such as ValueFromPipeline or ValueFromPipelineByPropertyName. Here is the official MSDN documentation for parameter attributes.
For example, let's say I was developing a custom Stop-Process cmdlet. I want to stop a process based on the ProcessID (or PID) of a Windows process. Here is what the command would look like:
function Stop-CustomProcess {
# Specify the CmdletBinding() attribute for our
# custom advanced function.
[CmdletBinding()]
# Specify the PARAM block, and declare the parameter
# that accepts pipeline input
param (
[Parameter(ValueFromPipelineByPropertyName = $true)]
[int] $Id
)
# You must specify the PROCESS block, because we want this
# code to execute FOR EACH process that is piped into the
# cmdlet. If we do not specify the PROCESS block, then the
# END block is used by default, which only would run once.
process {
Write-Verbose -Message ('Stopping process with PID: {0}' -f $ID);
# Stop the process here
}
}
# 1. Launch three (3) instances of notepad
1..3 | % { notepad; };
# 2. Call the Stop-CustomProcess cmdlet, using pipeline input
Get-Process notepad | Stop-CustomProcess -Verbose;
# 3. Do an actual clean-up
Get-Process notepad | Stop-Process;
Now that we've taken a look at an example of building the custom function ... once you've defined your custom function in your script file, dot-source it in your "main" script.
# Import the custom function into the current session
. $PSScriptRoot\Prepare-MoveRequest.ps1
# Call the function
Get-MailUser | Prepare-MoveRequest -RemoteForestDomainController dc.remotedomain.com $Remote -LocalForestDomainController dc.localdomain.com -UseLocalObject -OverwriteLocalObject -Verbose;
# Note: Since you've defined a parameter named `-WindowsEmailAddress` that uses the `ValueFromPipelineByPropertyName` attribute, the value of each object will be bound to the parameter, as it passes through the `PROCESS` block.
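Alternatively, if you would rather not modify the Exchange-provided Prepare-MoveRequest.ps1 itself, a small wrapper function along the same lines can accept pipeline input and call the script once per object. This is only a sketch, assuming the script sits in the current directory and takes the parameters shown in the question:
function Invoke-PrepareMoveRequest {
    [CmdletBinding()]
    param (
        # Binds the WindowsEmailAddress property of each piped MailUser object
        [Parameter(Mandatory = $true, ValueFromPipelineByPropertyName = $true)]
        [string] $WindowsEmailAddress,

        [Parameter(Mandatory = $true)]
        [System.Management.Automation.PSCredential] $RemoteForestCredential
    )
    process {
        # Runs once for each piped object
        .\Prepare-MoveRequest.ps1 -Identity $WindowsEmailAddress `
            -RemoteForestDomainController 'dc.remotedomain.com' `
            -RemoteForestCredential $RemoteForestCredential `
            -LocalForestDomainController 'dc.localdomain.com' `
            -UseLocalObject -OverwriteLocalObject -Verbose
    }
}

Get-MailUser | Invoke-PrepareMoveRequest -RemoteForestCredential $Remote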
EDIT: I would like to point out that your edit to your post does not properly handle parameter binding in PowerShell. It may achieve the desired results, but it does not teach the correct method of binding parameters in PowerShell. You don't have to use the ForEach-Object to achieve your desired results. Read through my post, and I believe you will increase your understanding of parameter binding.
My foreach loop did the trick.
Import-Csv mailusers.csv | foreach { Prepare-MoveRequest.ps1 -Identity $_.windowsemailaddress -RemoteForestDomainController DC.remotedomain.com -RemoteForestCredential $Remote }
I have the following Param block at the start of my script:
Param
(
[Parameter(Mandatory = $true)]
[ValidateScript({Test-Path $_ -PathType Leaf})]
[string]$Config,
[switch]$OverThresholdOnly,
[switch]$SendEmail,
[switch]$Debug
)
When I run the script I get the error:
"A parameter with the name 'Debug' was defined multiple times for this command. At line:1 char:1"
Line:1 and char:1 is the start of the Param block.
If I change $Debug to $Verbose I get the same error about Verbose. I've tried putting $Debug at the top of the Param block, with the same error.
If I remove the [ValidateScript] section it works fine.
Can anybody tell me why it does this, why [ValidateScript] is using $Debug, and how to get around this short of renaming the variable?
PowerShell has common parameters, which exist for every cmdlet.
Check out get-help about_common_parameters or click here.
-Debug and -Verbose are two of the common parameters. Choose a different name to avoid the naming collision.
When you add the parameter attribute, it changes the way PowerShell treats the parameters: the script becomes an advanced function at that point, a.k.a. a script cmdlet. Script cmdlets receive the common parameters automatically.
Check out get-help about_Functions_Advanced or click here
And get-help about_Functions_Advanced_Parameters or click here
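For example, renaming the switch avoids the collision; and since the Parameter attribute already makes the script an advanced function, callers get the built-in -Debug common parameter anyway, which you can detect via $PSBoundParameters if needed (a sketch):
Param
(
    [Parameter(Mandatory = $true)]
    [ValidateScript({Test-Path $_ -PathType Leaf})]
    [string]$Config,
    [switch]$OverThresholdOnly,
    [switch]$SendEmail,
    [switch]$DebugOutput   # renamed to avoid clashing with the common -Debug parameter
)

if ($PSBoundParameters['Debug']) {
    Write-Debug 'The common -Debug parameter was supplied.'
}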