I am trying to write a PowerShell script that can get pipeline input (and is expected to do so), but trying something like
ForEach-Object {
# do something
}
doesn't actually work when using the script from the command line as follows:
1..20 | .\test.ps1
Is there a way?
Note: I know about functions and filters. This is not what I am looking for.
In PowerShell v2 you can also accept pipeline input (ByValue or ByPropertyName), add parameter aliases, etc.:
function Get-File {
    param(
        [Parameter(
            Position=0,
            Mandatory=$true,
            ValueFromPipeline=$true,
            ValueFromPipelineByPropertyName=$true)]
        [Alias('FullName')]
        [String[]]$FilePath
    )

    process {
        foreach ($path in $FilePath) {
            Write-Host "file path is: $path"
        }
    }
}
# test ValueFromPipelineByPropertyName
dir | Get-File
# test ValueFromPipeline (byValue)
"D:\scripts\s1.txt","D:\scripts\s2.txt" | Get-File
- or -
dir *.txt | foreach {$_.fullname} | Get-File
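For instance, the byValue test should print something like this (output reconstructed from the Write-Host line above):
file path is: D:\scripts\s1.txt
file path is: D:\scripts\s2.txt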
This works and there are probably other ways to do it:
foreach ($i in $input) {
    $i
}
17:12:42 PS>1..20 | .\cmd-input.ps1
1
2
3
-- snip --
18
19
20
Search for "powershell $input variable" and you will find more information and examples.
A couple are here:
PowerShell Functions and Filters PowerShell Pro!
(see the section on "Using the PowerShell Special Variable “$input”")
"Scripts, functions, and script blocks all have access to the $input variable, which provides an enumerator over the elements in the incoming pipeline. "
or
$input gotchas « Dmitry’s PowerBlog PowerShell and beyond
"... basically $input in an enumerator which provides access to the pipeline you have."
This applies to the PowerShell command line, not the legacy Windows Command Processor (cmd.exe).
You can either write a filter which is a special case of a function like so:
filter SquareIt([int]$num) { $_ * $_ }
or you can create a similar function like so:
function SquareIt([int]$num) {
    Begin {
        # Executes once before first item in pipeline is processed
    }

    Process {
        # Executes once for each pipeline object
        $_ * $_
    }

    End {
        # Executes once after last pipeline object is processed
    }
}
The above works as an interactive function definition, or if it's in a script it can be dotted into your global session (or another script). However, your example indicated you wanted a script, so here it is in a script that is directly usable (no dotting required):
--- Contents of test.ps1 ---
param([int]$num)

Begin {
    # Executes once before first item in pipeline is processed
}

Process {
    # Executes once for each pipeline object
    $_ * $_
}

End {
    # Executes once after last pipeline object is processed
}
With PowerShell V2, this changes a bit with "advanced functions", which imbue functions with the same parameter binding features that compiled cmdlets have. See this blog post for an example of the differences. Also note that in the advanced function case you don't use $_ to access the pipeline object. With advanced functions, pipeline objects get bound to a parameter just like they do with a cmdlet.
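For instance, a minimal sketch of the same script written as a V2 advanced script (my illustration; the parameter name $Number is arbitrary):
# test.ps1 (V2 advanced script): pipeline input binds to $Number instead of $_
[CmdletBinding()]
param(
    [Parameter(ValueFromPipeline=$true)]
    [int]$Number
)

process {
    $Number * $Number
}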
The following are the simplest possible examples of scripts/functions that use piped input. Each behaves the same as piping to the "echo" cmdlet.
As Scripts:
# Echo-Pipe.ps1

Begin {
    # Executes once before first item in pipeline is processed
}

Process {
    # Executes once for each pipeline object
    echo $_
}

End {
    # Executes once after last pipeline object is processed
}

# Echo-Pipe2.ps1

foreach ($i in $input) {
    $i
}
As functions:
Function Echo-Pipe {
    Begin {
        # Executes once before first item in pipeline is processed
    }

    Process {
        # Executes once for each pipeline object
        echo $_
    }

    End {
        # Executes once after last pipeline object is processed
    }
}

Function Echo-Pipe2 {
    foreach ($i in $input) {
        $i
    }
}
E.g.
PS > . theFileThatContainsTheFunctions.ps1 # This includes the functions into your session
PS > echo "hello world" | Echo-Pipe
hello world
PS > cat aFileWithThreeTestLines.txt | Echo-Pipe2
The first test line
The second test line
The third test line
I'm making some modules to make REST API calls easier for our software (since all resources use different logic for how I can find them and how they should be called, but that's a separate story). I'm making Get-, Set-, New-, and Remove- functions for each resource, and I just found that you can't pipe more than one object to functions that take arrays from the pipeline. Here's an illustration of the problem that you can copy-paste into your environment:
function Get-MyTest {
    param(
        [string[]]$MyInput
    )
    [array]$Output = @()
    $i = 0
    foreach ($Item in $MyInput) {
        $i++
        $Output += [pscustomobject]@{
            Name = $Item
            Id   = $i
        }
    }
    return $Output
}
function Get-IncorrectResults {
    param(
        [parameter(ValueFromPipeline)][array]$MyIncorrectInput
    )
    foreach ($Item in $MyIncorrectInput) {
        Write-Host "Current object: $Item `n "
    }
}
The Get function can return more than one object. Each returned object is a PSCustomObject, so when several are returned the result is an array of PSCustomObjects. Here's the problem:
This works:
$Results = Get-MyTest -MyInput "Test1","Test2","Test3"
Get-IncorrectResults -MyIncorrectInput $Results
This only returns the last item:
$Results = Get-MyTest -MyInput "Test1","Test2","Test3"
$Results | Get-IncorrectResults
If the Get part returns more than one object, only the last object is passed on to Remove-MyModule. I tried changing the parameter definition from [pscustomobject[]] to [array] but it's the same result. What is the reason for dropping all but the last array item when piping, but not when passing the array as a parameter?
The structure of an advanced PowerShell function is as follows:
function Function-Name {
    param(
        <# parameter definitions go here #>
    )
    begin {
        <# this runs once when the pipeline starts #>
    }
    process {
        <# this runs once for every input item piped to the function #>
    }
    end {
        <# this runs once at the end of the pipeline #>
    }
}
When you omit the begin/process/end blocks, PowerShell assumes your function body is really the end block - so this function definition:
function Function-Name {
    param()
    Do-Something
}
... is the exact same as:
function Function-Name {
    param()
    begin {}
    process {}
    end { Do-Something }
}
Notice that the process block - the only piece of the function body that repeats with new input - is assumed empty, and the Do-Something statement will not be executed until after the last input item has been received.
This is why it looks like it "drops" everything but the last input object.
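You can see this with a quick demo (the function name here is just for illustration):
function Test-NoProcess {
    param([Parameter(ValueFromPipeline)]$x)
    "got: $x"   # this is the implicit end block, so it runs only once
}
1..3 | Test-NoProcess   # prints only "got: 3"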
Place the code that consumes the pipeline-bound variable inside the process block and it'll work:
function Remove-MyModule {
    param(
        [parameter(ValueFromPipeline)][pscustomobject[]]$MyModules
    )
    process {
        foreach ($Module in $MyModules) {
            Invoke-RestMethod -Method Delete -Uri "MyURL" -Body (ConvertTo-Json $Module) -UseDefaultCredentials
        }
    }
}
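Applied to the Get-IncorrectResults function from the question, the same fix looks like this (my restatement, renamed to match its now-correct behavior):
function Get-CorrectResults {
    param(
        [parameter(ValueFromPipeline)][array]$MyIncorrectInput
    )
    process {
        foreach ($Item in $MyIncorrectInput) {
            Write-Host "Current object: $Item `n "
        }
    }
}
$Results | Get-CorrectResults now prints all three objects.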
For more information about the syntax and semantics of advanced functions, see the about_Functions_Advanced_Methods help file
In Powershell, I frequently use a StreamReader to iterate over files and read/manipulate text. Instead of constantly having to type a script similar to:
## StreamReader setup / file availability check
try {
    ## Create Stream Reader
    $readStream = [System.IO.StreamReader]::new($Path)

    ## Do stuff ...
} finally {
    $readStream.close()
}
How can I make the entire setup/open/close process into a function that I can call whenever I need to automate the 'Do Stuff' portion of my above code? I am aware of how to make functions, but I can't figure out how to turn this into a usable function so that I only have to write and edit it once but can use it many times.
This may not be the most elegant solution but it does work.
You have a different function for each type of processing; I just called my test one Process-Stream.
Function Process-Stream {
    Do {
        $Line = $readStream.ReadLine()
        "$Line"
    } While ($readStream.EndOfStream -eq $False)
} #End Function Process-Stream
Next you have a function that does all of your setup and error processing for the Stream.
Function Get-Stream {
    Param (
        [Parameter(Mandatory=$True)]
        [String] $SourcePath,
        [Parameter(Mandatory=$True)]
        [String] $ProcessFunction
    )

    try {
        ## Create Stream Reader
        $readStream = [System.IO.StreamReader]::new("$SourcePath")
        & $ProcessFunction
    } finally {
        $readStream.close()
    }
} #End Function Get-Stream
Now you just call Get-Stream with the name of your processing function.
PS> Get-Stream -SourcePath "G:\Test\StreamIOTest.txt" -ProcessFunction Process-Stream
Line 1
Line 2
Line 3
Line 4
PS>
Note: the test text file I used had 4 lines. Don't forget you need to have the functions loaded!
Updated: I realized after I posted that I should have parameterized the file to be read and passed that into Get-Stream also.
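Along the same lines, a variant (my own sketch, not part of the original answer) is to accept a [scriptblock] so the 'Do Stuff' portion can be supplied inline per call:
Function Invoke-WithStream {
    Param (
        [Parameter(Mandatory=$True)]
        [String] $SourcePath,
        [Parameter(Mandatory=$True)]
        [scriptblock] $Process
    )
    try {
        $readStream = [System.IO.StreamReader]::new($SourcePath)
        while (-not $readStream.EndOfStream) {
            # hand the current line to the caller-supplied scriptblock
            & $Process $readStream.ReadLine()
        }
    } finally {
        $readStream.Close()
    }
}

# usage:
Invoke-WithStream -SourcePath 'G:\Test\StreamIOTest.txt' -Process { param($Line) "Line: $Line" }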
HTH
Assuming that you want to process your text files line by line:
There's no need to deal with [System.IO.StreamReader] instances directly - you can use PowerShell's built-in features.
In the simplest case, if performance isn't paramount, combine Get-Content with ForEach-Object:
Get-Content $Path | ForEach-Object { <# Do Stuff with $_, the line at hand #> }
When performance matters, use a switch statement with the -File parameter:
switch -File $Path { Default { <# Do Stuff with $_, the line at hand #> } }
I want to use start-job to run a .ps1 script requiring a parameter. Here's the script file:
#Test-Job.ps1
Param (
    [Parameter(Mandatory=$True)][String]$input
)
$output = "$input to output"
return $output
and here is how I am running it:
$input = "input"
Start-Job -FilePath 'C:\PowerShell\test_job.ps1' -ArgumentList $input -Name "TestJob"
Get-Job -name "TestJob" | Wait-Job | Receive-Job
Get-Job -name "TestJob" | Remove-Job
Run like this, it returns " to output", so $input is null in the script run by the job.
I've seen other questions similar to this, but they mostly use -Scriptblock in place of -FilePath. Is there a different method for passing parameters to files through Start-Job?
tl;dr
$input is an automatic variable (value supplied by PowerShell) and shouldn't be used as a custom variable.
Simply renaming $input to, say, $InputObject solves your problem.
As Lee_Dailey notes, $input is an automatic variable and shouldn't be assigned to (it is automatically managed by PowerShell to provide an enumerator of pipeline input in non-advanced scripts and functions).
Regrettably and unexpectedly, several automatic variables, including $input, can be assigned to: see this answer.
$input is a particularly insidious example, because if you use it as a parameter variable, any value you pass to it is quietly discarded, because in the context of a function or script $input invariably is an enumerator for any pipeline input.
Here's a simple example to demonstrate the problem:
PS> & { param($input) "[$input]" } 'hi'
# !! No output - the argument was quietly discarded.
That the built-in definition of $input takes precedence can be demonstrated as follows:
PS> 'ho' | & { param($input) "[$input]" } 'hi'
ho # !! pipeline input took precedence
While you can technically get away with using $input as a regular variable (rather than a parameter variable) as long as you don't cross scope boundaries, custom use of $input should still be avoided:
& {
    $input = 'foo'   # TO BE AVOIDED
    "[$input]"       # Technically works: -> '[foo]'
    & { "[$input]" } # FAILS, due to child scope: -> '[]'
}
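Applied to the question's code, the fix is simply to rename the variable on both sides (a sketch; $myValue is an arbitrary name of my choosing):
#Test-Job.ps1
Param (
    [Parameter(Mandatory=$True)][String]$InputObject
)
$output = "$InputObject to output"
return $output

$myValue = "input"
Start-Job -FilePath 'C:\PowerShell\test_job.ps1' -ArgumentList $myValue -Name "TestJob"
Get-Job -Name "TestJob" | Wait-Job | Receive-Job   # now returns "input to output"
Get-Job -Name "TestJob" | Remove-Job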
In a Jenkins Pipeline I want to return a value from PowerShell to the pipeline, but I don't know how.
Example:
pipeline {
    agent any
    stages {
        stage('Return Value') {
            steps {
                parameters([
                    string(name: 'Value1'),
                ])
                powershell '''
                    parameters for conection ...
                    extra parameters ....
                    $resultQuery = Invoke-Sqlcmd @conection -QueryTimeout 0 -ErrorAction Stop
                    $value1 = $resultQuery.code <# 1000 #>
                    $message = $resultQuery.message <# Some message #>
                '''
            }
        }
        stage('Another Step') {
            steps {
                // I want ... if ($value1 <= 1000)
                //     do something
            }
        }
    }
}
Then I want to get $value1 out of the PowerShell script so I can use it in another step.
I tried with $ENV but it doesn't work:
$ENV:Value1 = $resultQuery.code
Any ideas?
I've used this:
powershell('''
"env.PACKAGE_VERSION='$newversion'" | Out-File packageVersion.properties -Encoding ASCII
''')
later:
script {
    load('packageVersion.properties')
}
using the value:
echo("---- PACKAGE_VERSION: ${env.PACKAGE_VERSION} ----")
If you have a powershell script that just outputs the single piece of text you want, then you can use the returnStdout param to get that value back to the pipeline script:
steps {
script {
env.MY_RESULT = powershell(returnStdout: true, script:'echo hi')
}
echo "${env.MY_RESULT}" // prints "hi"
}
more here: https://www.jenkins.io/blog/2017/07/26/powershell-pipeline/
I'm not familiar with Jenkins, but have you tried using Write-Output $value1 or return $value1?
I found that in some of my PowerShell scripts, anything I output is captured and returned to the calling function.
Of course, you will need to somehow save the value on the Jenkins side to reuse it.
Another way would be to save the value to a file and read it back from the file. You could write it using $value1 | Out-File C:\temp\temp.txt and then read it using Get-Content C:\temp\temp.txt in a separate script.
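For example, a minimal sketch of that file-based approach across two stages (the stage names and paths are my own placeholders):
stage('Produce') {
    steps {
        powershell '''
            $value1 = 1000   # stand-in for $resultQuery.code
            $value1 | Out-File C:/temp/temp.txt
        '''
    }
}
stage('Consume') {
    steps {
        script {
            env.VALUE1 = powershell(returnStdout: true, script: 'Get-Content C:/temp/temp.txt').trim()
        }
        echo "value1 is ${env.VALUE1}"
    }
}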
I normally do the following to invoke a script block containing $_:
$scriptBlock = { $_ <# do something with $_ here #> }
$theArg | ForEach-Object $scriptBlock
In effect, I am creating a pipeline which will give $_ its value (within the ForEach-Object invocation).
However, when looking at the source code of the LINQ module, it defines and uses the following function to invoke the delegate:
# It is actually surprisingly difficult to write a function (in a module)
# that uses $_ in scriptblocks that it takes as parameters. This is a strange
# issue with scoping that seems to only matter when the function is a part
# of a module which has an isolated scope.
#
# In the case of this code:
# 1..10 | Add-Ten { $_ + 10 }
#
# ... the function Add-Ten must jump through hoops in order to invoke the
# supplied scriptblock in such a way that $_ represents the current item
# in the pipeline.
#
# Which brings me to Invoke-ScriptBlock.
# This function takes a ScriptBlock as a parameter, and an object that will
# be supplied to the $_ variable. Since the $_ may already be defined in
# this scope, we need to store the old value, and restore it when we are done.
# Unfortunately this can only be done (to my knowledge) by hitting the
# internal api's with reflection. Not only is this an issue for performance,
# it is also fragile. Fortunately this appears to still work in PowerShell
# version 2 through 3 beta.
function Invoke-ScriptBlock {
    [CmdletBinding()]
    param (
        [Parameter(Position=1,Mandatory=$true)]
        [ScriptBlock]$ScriptBlock,

        [Parameter(ValueFromPipeline=$true)]
        [Object]$InputObject
    )

    begin {
        # equivalent to calling the $ScriptBlock.SessionState property:
        $SessionStateProperty = [ScriptBlock].GetProperty('SessionState',([System.Reflection.BindingFlags]'NonPublic,Instance'))
        $SessionState = $SessionStateProperty.GetValue($ScriptBlock, $null)
    }

    process {
        $NewUnderBar = $InputObject
        $OldUnderBar = $SessionState.PSVariable.GetValue('_')
        try {
            $SessionState.PSVariable.Set('_', $NewUnderBar)
            $SessionState.InvokeCommand.InvokeScript($SessionState, $ScriptBlock, @())
        }
        finally {
            $SessionState.PSVariable.Set('_', $OldUnderBar)
        }
    }
}
This strikes me as a bit low-level. Is there a recommended, safe way of doing this?
You can invoke scriptblocks with the ampersand (the call operator). No need to use ForEach-Object.
$scriptblock = { ## whatever }
& $scriptblock

@(1,2,3) | % { & { write-host $_ } }
To pass parameters:
$scriptblock = {write-host $args[0]}
& $scriptblock 'test'
$scriptBlock = {param($NamedParam) write-host $NamedParam}
& $scriptBlock -NamedParam 'test'
If you're going to be using this inside of Invoke-Command, you could also use the $using: scope modifier.
$test = 'test'
$scriptblock = {write-host $using:test}
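Such a block can then be run in a remote session, for example (the computer name is a placeholder; $using: only resolves when the scriptblock runs in another session such as via Invoke-Command or Start-Job, not with a plain & invocation):
Invoke-Command -ComputerName 'Server01' -ScriptBlock $scriptblock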