I'm trying to understand PowerShell, but find some things not so intuitive. My understanding is that the pipeline passes objects rather than text, as traditional shells do, and that $_ refers to the current object in the pipeline. Why, then, does the following not work:
get-date|Write-Host "$_"
The errormessage is:
Write-Host : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
At line:1 char:10
+ get-date|Write-Host $_
+ ~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (10-9-2014 15:17:00:PSObject) [Write-Host], ParameterBindingException
+ FullyQualifiedErrorId : InputObjectNotBound,Microsoft.PowerShell.Commands.WriteHostCommand
$_ is the current single item in the pipeline. To write each item in the pipeline you would write
get-date | foreach { Write-Host $_ }
Or in the short form
get-date |% { Write-Host $_ }
Conceptually, ForEach-Object is a cmdlet that receives a script block and pipeline input, and applies the script block to each item in the pipeline. You can't just write code with $_ anywhere - you need a command that explicitly states that it accepts pipeline input.
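For illustration, here is a minimal sketch of a function that does declare pipeline input (the function name Write-Each and its parameter are made up for this example, not part of the original question):
function Write-Each {
    param(
        [Parameter(ValueFromPipeline)]
        $InputObject
    )
    process {            # runs once per object received from the pipeline
        Write-Host $InputObject
    }
}
Get-Date | Write-Each    # works, because $InputObject is declared as pipeline-bindable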
And $_ refers to the current object in the pipeline
Indeed, the automatic $_ variable refers to the current pipeline object, but only in script blocks { ... }, notably those passed to the ForEach-Object and Where-Object cmdlets.
Outside of script blocks it has no meaningful value.
Therefore, the immediate fix to your command is the following:
Get-Date | ForEach-Object { Write-Host $_ }
However, note that:
Write-Host is typically the wrong tool to use, unless the intent is to write to the display only, bypassing the success output stream and with it the ability to send output to other commands, capture it in a variable, or redirect it to a file.
To output a value, use it by itself; e.g., $value, instead of Write-Host $value (or use Write-Output $value); see this answer. To explicitly print only to the display but with rich formatting, use Out-Host.
Therefore, if merely outputting each pipeline input object is the goal, Get-Date | ForEach-Object { $_ } would do, where the ForEach-Object call is redundant if each input object is to simply be passed through (without transformation); that is, in the latter case just Get-Date would do.
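To illustrate the difference between the two output paths (a quick sketch; the variable names are illustrative):
# Write-Host writes to the display (information stream), so nothing is captured:
$fromHost = Get-Date | ForEach-Object { Write-Host $_ }
$null -eq $fromHost      # -> True (the date was only printed to the screen)
# Outputting the value itself sends it to the success stream, so it can be captured:
$fromOutput = Get-Date | ForEach-Object { $_ }
$null -eq $fromOutput    # -> False; $fromOutput now holds the [datetime] object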
As for what you tried:
get-date|Write-Host "$_"
As noted, the use of $_ in this context is pointless, but the reason for the error message you saw is unrelated to that problem:
Instead, the reason for the error is that you're mistakenly trying to provide input to Write-Host both via the pipeline (Get-Date | Write-Host ...) and by way of an argument (... | Write-Host "$_").
Given that the argument ("$_") (positionally) binds to the -Object parameter, the pipeline input then has no parameter left to bind to, which causes the error at hand.
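To see the two forms of input in isolation (a quick sketch):
# Pipeline input alone binds to -Object and works:
Get-Date | Write-Host
# An explicit argument alone also works:
Write-Host (Get-Date)
# Combining both is what fails: the argument claims -Object, leaving the
# pipeline input with no parameter left to bind to.
Get-Date | Write-Host "$_"   # -> InputObjectNotBound error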
Related
My PowerShell function should accept a list of valid paths of mixed files and/or directories, either as a named parameter or via the pipeline, filter for files that match a pattern, and return the list of files.
$Paths = 'C:\MyFolder\','C:\MyFile'
This works: Get-Files -Paths $Paths
This doesn't: $Paths | Get-Files
$PSDefaultParameterValues = @{
"Get-Files:Paths" = ( Get-Location ).Path
}
[regex]$DateRegex = '(20\d{2})([0-1]\d)([0-3]\d)'
[regex]$FileNameRegex = '^term2-3\.7_' + $DateRegex + '\.csv$'
Function Get-Files {
[CmdletBinding()]
[OutputType([System.IO.FileInfo[]])]
[OutputType([System.IO.FileInfo])]
param (
[Parameter(
Mandatory = $false, # Should default to $cwd provided by $PSDefaultParameterValues
ValueFromPipeline,
HelpMessage = "Enter filesystem paths that point either to files directly or to directories holding them."
)]
[String[]]$Paths
)
begin {
[System.IO.FileInfo[]]$FileInfos = @()
[System.IO.FileInfo[]]$SelectedFileInfos = @()
}
process { foreach ($Path in $Paths) {
Switch ($Path) {
{ Test-Path -Path $Path -PathType leaf } {
$FileInfos += (Get-Item $Path)
}
{ Test-Path -Path $Path -PathType container } {
foreach ($Child in (Get-ChildItem $Path -File)) {
$FileInfos += $Child
}
}
Default {
Write-Warning -Message "Path not found: $Path"
continue
}
}
$SelectedFileInfos += $FileInfos | Where-Object { $_.Name -match $FileNameRegex }
$FileInfos.Clear()
} }
end {
Return $SelectedFileInfos | Get-Unique
}
}
I found that both versions work if I remove the default parameter value. Why?
Why does passing a parameter via the pipeline cause an error when that parameter has a default defined in PSDefaultParameterValues, and is there a way to work around this?
Mathias R. Jessen provided the crucial pointer in a comment:
A parameter that is bound via an entry in the dictionary stored in the $PSDefaultParameterValues preference variable is bound before it is potentially bound via the pipeline, just like passing a parameter value explicitly, as an argument would.
Once a given parameter is bound that way, it cannot be bound again via the pipeline, causing an error:
The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
As you can see, the specific problem at hand - a parameter already being bound - is unfortunately not covered by this message. The unspoken part is that once a given parameter has been bound by argument (possibly via $PSDefaultParameterValues), it is removed from the set of candidate pipeline-binding parameters the input could bind to, and if there are no candidates remaining, the error occurs.
The only way to override a $PSDefaultParameterValues preset value is to use an (explicit) argument.
This comment on a related GitHub issue provides details on the order of parameter binding.
A simplified way to reproduce the problem:
& {
# Preset the -Path parameter for Get-Item
# In any later Get-Item calls that do not use -Path explicitly, this
# is the same as calling Get-Item -Path /
$PSDefaultParameterValues = @{ 'Get-Item:Path' = '/' }
# Trying to bind the -Path parameter *via the pipeline* now fails,
# because it has already been bound via $PSDefaultParameterValues.
# Even without the $PSDefaultParameterValues entry in the picture,
# you'd get the same error with: '.' | Get-Item -Path /
'.' | Get-Item
# By contrast, using an *argument* allows you to override the preset.
Get-Item -Path .
}
What's happening here?!
This is a timing issue.
PowerShell attempts to bind and process parameter arguments in roughly* the following order:
Explicitly named parameter arguments are bound (e.g. -Param $value)
Positional arguments are bound (abc in Write-Host abc)
Default parameter values are applied for any parameter that wasn't processed during the previous two steps - note that applicable $PSDefaultParameterValues always take precedence over defaults defined in the parameter block
Resolve parameter set, validate all mandatory parameters have values (this only fails if there are no upstream commands in the pipeline)
Invoke the begin {} blocks on all commands in the pipeline
For any commands downstream in a pipeline: wait for input and then start binding it to the most appropriate parameter that hasn't been handled in previous steps, and invoke process {} blocks on all commands in the pipeline
As you can see, the value you assign to $PSDefaultParameterValues takes effect in step 3 - long before PowerShell even has a chance to start binding the piped string values to -Paths, in step 6.
*) this is a gross over-simplification, but the point remains: default parameter values must have been handled before pipeline binding starts.
How to work around it?
Given the procedure described above, we should be able to work around this behavior by explicitly naming the parameter we want to bind the pipeline input to.
But how do you combine -Paths with pipeline input?
By supplying a delay-bind script block (or a "pipeline-bound parameter expression" as they're sometimes called):
$Paths | Get-Files -Paths { $_ }
This will cause PowerShell to recognize -Paths during step 1 above - at which point the default value assignment is skipped.
Once the command starts receiving input, it transforms the input value and binds the resulting value to -Paths by executing the { $_ } block - since we just output the item as-is, the effect is exactly the same as when the pipeline input is bound implicitly.
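A delay-bind script block may also transform the piped value before it is bound; for example (the trimming shown here is purely illustrative, not part of the original function):
# Hypothetical transformation: strip a trailing backslash from each piped path
# before it is bound to -Paths.
$Paths | Get-Files -Paths { $_.TrimEnd('\') }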
Digging deeper
If you want to learn more about what happens "behind the curtain", the best tool available is Trace-Command:
$PSDefaultParameterValues['Get-Files:Paths'] = $PWD
Trace-Command -Expression { $Paths |Get-Files } -Name ParameterBinding -PSHost
I should mention that the ParameterBinding tracer is very verbose - which is great for surmising what's going on - but the output can be a bit overwhelming, in which case you might want to replace the -PSHost parameter with -FilePath .\path\to\output.txt to write the trace output to a file.
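For example (a sketch; the output file name is illustrative):
# Write the (very verbose) parameter-binding trace to a file instead of the host.
Trace-Command -Expression { $Paths | Get-Files } -Name ParameterBinding -FilePath .\binding-trace.txt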
Basically I'm trying to get the below "inline if-statement" function working (credit here)
Function IIf($If, $Then, $Else) {
If ($If -IsNot "Boolean") {$_ = $If}
If ($If) {If ($Then -is "ScriptBlock") {&$Then} Else {$Then}}
Else {If ($Else -is "ScriptBlock") {&$Else} Else {$Else}}
}
Using PowerShell v5 it doesn't seem to work for me and calling it like
IIf "some string" {$_.Substring(0, 4)} "no string found :("
gives the following error:
You cannot call a method on a null-valued expression.
At line:1 char:20
+ IIf "some string" {$_.Substring(0, 4)} "no string found :("
+ ~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvokeMethodOnNull
So, as a more general question, how do you make $_ available to the scriptblock passed into a function?
I kind of tried following this answer, but it seems it's meant for passing it to a separate process, which is not what I'm looking for.
Update:
It seems the issue is that I have the function in a module rather than directly in a script/PS session. A workaround would be to avoid putting it in the module, but I feel a module is more portable, so I'd like to figure out a solution for that.
There are two changes worth making, which make your problem go away:
Do not try to assign to $_ directly; it is an automatic variable under PowerShell's control, not meant to be set by user code (even though it may work situationally, it shouldn't be relied upon).
Instead, use the ForEach-Object cmdlet to implicitly set $_ via its -InputObject parameter.
Note that using ForEach-Object with -InputObject rather than with pipeline input is unusual, because it results in atypical behavior: even collections passed to -InputObject are passed as a single object to the -Process block; that is, the usual enumeration does not take place. In the context at hand, however, this is precisely what is desired: whatever $If represents should be passed as-is to the -Process script block, even if it happens to be a collection.
Use the -is operator with type literals such as [Boolean], not type names such as "Boolean".
Function IIf($If, $Then, $Else) {
If ($If) {
If ($Then -is [scriptblock]) { ForEach-Object -InputObject $If -Process $Then }
Else { $Then }
} Else {
If ($Else -is [scriptblock]) { ForEach-Object -InputObject $If -Process $Else }
Else { $Else }
}
}
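With those changes in place, the call from the question behaves as intended (results shown as comments):
IIf "some string" { $_.Substring(0, 4) } "no string found :("   # -> some
IIf $null         { $_.Substring(0, 4) } "no string found :("   # -> no string found :(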
As for what you tried:
In a later update you state that your IIf function is defined in a module, which explains why your attempt to set $_ by direct assignment ($_ = $If, which, as stated, is to be avoided in general) was ineffective:
It created a function-local $_ instance, which the $Then script block, due to being bound to the scope of the (module-external) caller, does not see.
The reason is that each module has its own scope domain (hierarchy of scopes aka session state), which only shares the global scope with non-module callers - see the bottom section of this answer for more information about scopes in PowerShell.
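Here is a small sketch of that scoping behavior, using a dynamic module (the function and variable names are made up for illustration):
# The function lives in its own (dynamic) module, i.e. in its own scope domain.
$null = New-Module {
    function Test-Underscore ($Block) {
        $_ = 'module-local value'   # creates a $_ that is local to the module's scope
        & $Block                    # the block remains bound to the *caller's* scope
    }
}
Test-Underscore { "the caller's block sees: [$_]" }   # -> the caller's block sees: []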
Here's what I'm trying to do:
param([Switch]$myparameter)
If($myparameter -eq $true) {$export = Export-CSV c:\temp\temp.csv}
Get-MyFunction | $export
If $myparameter is passed, export the data to said location. Else, just display the normal output (in other words, ignore the $export). What doesn't work here is setting $export to the "Export-csv...". Wrapping it in quotes does not work.
I'm trying to avoid an if, then statement saying "if it's passed, export this. If it's not passed, output data"
I have a larger module that everything works in so there is a reason behind why I am looking to do it this way. Please let me know if any additional information is needed.
Thank you everyone in advance.
tl;dr:
param([Switch] $myparameter)
# Define the core command as a *script block* (enclosed in { ... }),
# to be invoked later, either with operator . (no child variable scope)
# or & (with child variable scope)
$scriptBlock = { Get-MyFunction }
# Invoke the script block with . (or &), and pipe it to the Export-Csv cmdlet,
# if requested.
If ($myparameter) { # short for: ($myparameter -eq $True), because $myparameter is a switch
. $scriptBlock | Export-Csv c:\temp\temp.csv
} else {
. $scriptBlock
}
TessellatingHeckler's answer is concise, works, and uses a number of advanced features cleverly - however, while it avoids an if statement, as requested, doing so may not yield the best or most readable solution in this case.
What you're looking for is to store a command in a variable for later execution, but your own attempt to do so:
If ($myparameter -eq $true) { $export = Export-CSV c:\temp\temp.csv }
results in immediate execution, which is not only unintended, but fails, because the Export-Csv cmdlet is missing input in the above statement.
You can store a snippet of source code for later execution in a variable via a script block, simply by enclosing the snippet in { ... }, which in your case would mean:
If ($myparameter -eq $true) { $export = { Export-Csv c:\temp\temp.csv } }
Note that what you pass to if is itself a script block, but it is by definition one that is executed as soon as the if condition is found to be true.
A variable containing a script block can then be invoked on demand, using one of two operators:
., the "dot-sourcing" operator, which executes the script block in the current scope.
&, the call operator, which executes the script block in a child scope with respect to potential variable definitions.
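The difference matters if the script block defines variables; a brief sketch (the variable name is illustrative):
$sb = { $inside = 42 }
& $sb       # runs in a child scope: $inside does not survive the call
$inside     # -> (nothing)
. $sb       # runs in the current scope: $inside survives
$inside     # -> 42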
However, given that you only need the pipeline with an additional command if switch $myparameter is specified, it's better to change the logic:
Store the shared core command, Get-MyFunction, in a script block, in variable $scriptBlock.
Invoke that script block in an if statement, either standalone (by default), or by piping it to Export-Csv (if -MyParameter was specified).
I'm trying to avoid an if, then statement
Uh, if you insist...
param([Switch]$myparameter)
$cmdlet, $params = (('Write-Output', @{}),
('Export-Csv', @{'LiteralPath'='c:\temp\temp.csv'}))[$myparameter]
Get-MyFunction | & $cmdlet @params
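The trick here is that the [switch] value is used as an array index: $false selects element 0 and $true selects element 1. A quick sketch of that indexing behavior:
('Write-Output', 'Export-Csv')[$false]   # -> Write-Output
('Write-Output', 'Export-Csv')[$true]    # -> Export-Csv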
What works -
Let's say I have a script block which I use with the Select-Object cmdlet.
$jobTypeSelector = `
{
if ($_.Type -eq "Foo")
{
"Bar"
}
elseif ($_.Data -match "-Action ([a-zA-Z]+)")
{
$_.Type + " [" + $Matches[1] + "]"
}
else
{
$_.Type
}
}
$projectedData = $AllJobs | Select-Object -Property State, @{Name="Type"; Expression=$jobTypeSelector}
This works fine, and I get the results as expected.
What I am trying to do -
However, at a later point in code, I want to reuse the scriptblock defined as $jobTypeSelector.
For example, I expected the code below to take $fooJob (note that it is a single object) passed as a parameter, use it for the $_ automatic variable in the script block, and return the same result as when it is executed in the context of the Select-Object cmdlet.
$fooType = $jobTypeSelector.Invoke($fooJob)
What doesn't work -
It does not work as I expected and I get back an empty value.
What I have already tried -
I checked: the properties are all correctly set, and it's not due to the relevant property itself being blank or $null.
I looked on the internet, and it seemed pretty hard to find any relevant page on the subject, but I eventually found one that came close to explaining the issue in a slightly different context - calling script blocks in PowerShell. The blog doesn't directly answer my question, and the relevant explanation only leads to a solution that would be very ugly and hard to read and maintain, in my opinion.
Question -
So, what is the best way to invoke a script block that uses the $_ automatic variable (instead of a param block) for a single object?
After fiddling around with various options, I ended up solving the problem in a somewhat hackish way. But I find it to be the best solution because it's small and easy to read, maintain, and understand.
Even though we are talking about a single object, use it in the pipeline (which is when PowerShell defines the $_ automatic variable) with the ForEach-Object cmdlet:
$fooType = $fooJob | ForEach-Object $jobTypeSelector
You can certainly use foreach or ForEach-Object as you mention.
You can also pipe to the ScriptBlock directly, if you change it from a function ScriptBlock to a filter ScriptBlock by setting IsFilter to $true:
$jobTypeSelector.IsFilter = $true
$fooType = $fooJob | $jobTypeSelector
But, what would be even better is if you used a named function instead of an anonymous ScriptBlock, for example:
function Get-JobType
{
Param (
[object] $Job
)
if ($Job.Type -eq "Foo")
{
"Bar"
}
elseif ($Job.Data -match "-Action ([a-zA-Z]+)")
{
$Job.Type + " [" + $Matches[1] + "]"
}
else
{
$Job.Type
}
}
Then you can use it with Select-Object aka select like this:
$projectedData = $AllJobs |
select -Property State, @{Name="Type"; Expression={Get-JobType $_}}
Or with a single job, like this:
$fooType = Get-JobType $fooJob
I have many Oracle forms in one folder and I want to compile those forms with the frmcmp command in a PowerShell script.
I have written the following PowerShell script:
$module="module="
get-childitem "C:\forms\fortest" -recurse |
where { $_.extension -eq ".fmb" } |
foreach {
C:\Oracle\Middleware\Oracle_FRHome1\BIN\frmcmp $module $_.FullName userid=xyz/xyz@xyz Output_File=C:\forms\11\common\fmx\$_.BaseName+'.fmx'
}
But this one is not working. I am new to PowerShell.
However, when I try to compile a single form through the command prompt, it works, like the following:
frmcmp module=C:\forms\src\xyz.fmb userid=xyz/xyz@xyz Output_File=C:\forms\11\common\fmx\xyz.fmx
When you want to use variables inside a string in PowerShell, you have different options. To start with, you will always need to wrap the string in " rather than ' if you want variables to be expanded inside it.
$myVariable = "MyPropertyValue"
Write-Host "The variable has the value $MyVariable"
The above code would yield the output:
The variable has the value MyPropertyValue
If you want to use a property of a variable (or any expression) and insert it into the string, you need to wrap it in the string with $(expression goes here), e.g.
$MyVariable = New-Object PSObject -Property @{ MyPropertyName = 'MyPropertyValue' }
# The following will fail getting the property since it will only consider
# the variable name as code, not the dot or the property name. It will
# therefore ToString the object and append the literal string .MyPropertyName
Write-Host "Failed property value retrieval: $MyVariable.MyPropertyName"
# This will succeed, since it's wrapped as code.
Write-Host "Successful property value retrieval: $($MyVariable.MyPropertyName)"
# You can have any code in those wrappers, for example math.
Write-Host "Maths calculating: 3 * 27 = $( 3 * 27 )"
The above code would yield the following output:
Failed property value retrieval: @{MyPropertyName=MyPropertyValue}.MyPropertyName
Successful property value retrieval: MyPropertyValue
Maths calculating: 3 * 27 = 81
I generally try to use the Start-Process cmdlet when I start processes in PowerShell, since it gives me additional control over the started process. This means that you could use something similar to the following:
Get-ChildItem "C:\forms\fortest" -Filter "*.fmb" -recurse | Foreach {
$FormPath = $_.FullName
$ResultingFileName = $_.BaseName
Start-Process -FilePath "C:\Oracle\Middleware\Oracle_FRHome1\BIN\frmcmp.exe" -ArgumentList "module=$FormPath", "userid=xyz/xyz@xyz", "Output_File=C:\forms\11\common\fmx\$ResultingFileName.fmx"
}
You could also add the -Wait parameter to the Start-Process command if you want to wait until the current compilation has completed before moving on to the next item.
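For example (a sketch based on the command above):
# Same call as above, but -Wait blocks until frmcmp exits before the loop
# moves on to the next form.
Start-Process -FilePath "C:\Oracle\Middleware\Oracle_FRHome1\BIN\frmcmp.exe" -ArgumentList "module=$FormPath", "userid=xyz/xyz@xyz", "Output_File=C:\forms\11\common\fmx\$ResultingFileName.fmx" -Wait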