In PowerShell, is there a way to pipe output to another script?

I have a script that writes output to a log file and also to the console. I am running the command Add-WindowsFeatures... I want to take the output of this command and pipe it to my script. Is that possible?

Absolutely. You just need to include the CmdletBinding attribute on your param statement, then add an attribute to one of your parameters which tells PowerShell how pipeline input binds to it. For instance, put this in c:\temp\get-extension.ps1:
[CmdletBinding()]
Param(
    [Parameter(Mandatory=$true,
               ValueFromPipeline=$true)]
    [System.IO.FileInfo[]] $file
)
process {
    $file.Extension
}
Then, you can do this:
dir -File | C:\temp\get-extension.ps1
Updating to address the latest comment: I'm guessing that setting the parameter type to [object[]]$stuff rather than [System.IO.FileInfo[]], and putting
$stuff | Out-File c:\logs\logfile.txt  # or wherever you want
in the process block, will get you close.
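That suggestion can be sketched as follows (the log path and the use of -Append are illustrative, not from the original answer):

```powershell
# get-logged.ps1 - sketch: accept arbitrary pipeline objects and append
# them to a log file as they stream through the process block.
[CmdletBinding()]
param(
    [Parameter(Mandatory = $true, ValueFromPipeline = $true)]
    [object[]] $stuff
)
process {
    # -Append so each pipeline item adds to the log instead of replacing it
    $stuff | Out-File -FilePath 'c:\logs\logfile.txt' -Append
}
```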

Related

Correct param type for piping from Get-Content (PowerShell)?

What is the correct type for piping all of the content from Get-Content?
My script:
param(
    [Parameter(ValueFromPipeline)]
    [???]$content
)
Write-Output $content
According to the docs for PowerShell 5.1, Get-Content returns "a collection of objects", but I'm not sure how to specify that in PowerShell. Without specifying a type for the [???], only the last line of the file is output.
Regarding your last comment:
When I specify [string] or [string[]], it is still only printing out the last line of the file. Is this related to missing the process block?
This is correct; otherwise the body of your function, script block, or script runs in the end block. If you want it to process input from the pipeline item by item, you must put your logic in a process block.
Note that this assumes you want to use ValueFromPipeline, which, in effect, turns it into an advanced function.
If you want it to be able to process pipeline input but also be compatible with positional binding and named parameters, use [string[]] and a loop:
param(
    [Parameter(ValueFromPipeline)]
    [string[]] $Content
)
process {
    foreach ($line in $Content) {
        $line
    }
}
Then, assuming the above is called myScript.ps1, you would be able to:
Get-Content .\test.txt | .\myScript.ps1
And also:
.\myScript.ps1 (Get-Content .\test.txt)
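To see why the process block matters, consider a hypothetical counter-example with no process block: the body then runs once, in the end block, after $Content has been re-bound for each pipeline item, so only the last item survives:

```powershell
# noProcess.ps1 - hypothetical: without a process block, the body is
# effectively the end block, and $Content holds only the last bound item.
param(
    [Parameter(ValueFromPipeline)]
    [string[]] $Content
)
$Content   # only the final line of the piped file is output
```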

Can I pipe PowerShell output to an accelerator?

I've obtained a file path to an XML resource by interrogating Task Scheduler arguments.
I'd like to pipe these file paths to [xml], to return data using XPath.
Online I see accelerators and variables used, e.g.
$xml = [XML](Get-Content .\Test.xml)
I tried piping to ConvertTo-Xml, but that produces an XML object containing the file path, so I still need to convert to [xml] - I'm hoping to do this in the pipeline, potentially for more than one XmlDocument.
Is it possible to pipe to [typeaccelerators]?
Should I be piping to New-Object, or Tee-Variable, as required?
I hope eventually to be able to construct a one-liner to interrogate several nodes (e.g. LastRan, LastResult).
Currently I have this, which only works for one:
([xml](Get-Content ((Get-ScheduledTask -TaskPath *mytask* | select -First 1).Actions.Arguments | % {$_.Split('"')[-2]}))).MyDocument.LastRan
It returns the value of LastRan from the MyDocument node.
Thanks in advance 👍
If you want to take pipeline input, you need to make a function and set the ValueFromPipeline parameter attribute:
Function Convert-XML {
    Param(
        [Parameter(ValueFromPipeline)]
        $xml
    )
    process {
        [xml]$xml
    }
}
Then you could take the content of an XML file (all at once, not line by line):
Get-Content .\Test.xml -Raw | Convert-XML
Of course, to get your one-liner you'd probably want to add the logic for that to the function. However, this is how you'd handle pipeline input.
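Building on that, one way the one-liner logic might be folded into a function (a sketch: the function name is made up, and the node names MyDocument.LastRan and MyDocument.LastResult are taken from the question):

```powershell
# Get-TaskXmlInfo - sketch: take file paths from the pipeline, load each
# as [xml], and project the nodes of interest as objects.
function Get-TaskXmlInfo {
    param(
        [Parameter(ValueFromPipeline)]
        [string] $Path
    )
    process {
        $doc = [xml](Get-Content -Path $Path -Raw)
        [pscustomobject]@{
            LastRan    = $doc.MyDocument.LastRan
            LastResult = $doc.MyDocument.LastResult
        }
    }
}
```

With that in place, several paths can stream through one pipeline instead of nesting everything in a single expression.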

Passing an object from one script to another

I am having an issue passing an array member to another script. I have a VM build script that pulls from a CSV; I end up with a $VM object with .Name, .CPU, .RAM, .IP, etc. I want to pass that VM object to another script (inside the new server), which can then act on it, but am unable to do so. I have been testing the correct syntax just to pass a simple array, as below, but am still not successful:
CSV:
Name,NumCPU,MemoryGB,IPAddress
JLTest01,2,4,172.24.16.25
Script1:
Function TestMe {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory, Position=1)]
        [array]$arr
    )
    $arr | Out-GridView
}
TestMe
Calling Script:
$aVMs = Import-Csv -Path "PathToCsv"
foreach ($VM in $aVMs) {
    $command = "<path>\TestMe.ps1 " + "-arr $($VM)"
    Invoke-Expression $command
}
This produces an error, which seems to be in parsing the array. The error states:
The term 'JLTest01' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:48 + ... \Desktop\TestMe.ps1 -arr #{Name=JLTest01; NumCPU ...
Just trying to figure out what I am doing wrong exactly, and what I need to do to pass the object to the second script.
Don't use Invoke-Expression (which is rarely the right tool and should generally be avoided for security reasons): the stringification of the custom objects output by Import-Csv that $($VM) performs does not preserve the original objects; it produces a hashtable-like representation that isn't suitable for programmatic processing and breaks the syntax of the command line you're passing to Invoke-Expression.
Instead, just invoke your script directly:
$aVMs = Import-Csv -Path "PathToCsv"
.\TestMe.ps1 -arr $aVMs
Note that I'm passing $aVMs as a whole to your script, given that your -arr parameter is array-typed.
If you'd rather process the objects one by one, stick with the foreach approach (but then you should declare the type of your $arr parameter as [pscustomobject] rather than [array]):
$aVMs = Import-Csv -Path "PathToCsv"
foreach ($VM in $aVMs) {
    .\TestMe.ps1 -arr $VM
}
Another option is to declare $arr as accepting pipeline input, add a process block to your script, and then pipe $aVMs to your script ($aVMs | .\TestMe.ps1).
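That pipeline option might look like this - a sketch of TestMe.ps1 rewritten to accept pipeline input (the begin/process/end structure is one way to collect the items for a single Out-GridView call):

```powershell
# TestMe.ps1 - sketch: bind one CSV row at a time from the pipeline,
# collect the rows, and display the whole set at the end.
[CmdletBinding()]
param(
    [Parameter(Mandatory, ValueFromPipeline)]
    [pscustomobject] $VM
)
begin   { $all = [System.Collections.Generic.List[object]]::new() }
process { $all.Add($VM) }
end     { $all | Out-GridView }
```

Invoked as: Import-Csv -Path "PathToCsv" | .\TestMe.ps1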
Also, don't nest a function of the same name inside your .ps1 script and then call it from within the script, especially not without passing the arguments through; scripts can declare parameters directly, just like functions:
[CmdletBinding()]
Param (
    [Parameter(Mandatory, Position=1)]
    [array]$arr
)
$arr | Out-GridView

In a PowerShell script, is it possible to tell whether a default parameter value is being used?

Let's say I have a simple script with one parameter, which has a default value:
param(
    [string] $logOutput = "C:\SomeFolder\File.txt"
)
# Script...
And let's say a user runs the script like so:
PS C:\> .\MyScript.ps1 -logOutput "C:\SomeFolder\File.txt"
Is there any way the script is able to know that the user explicitly entered a value (which happens to be the same as the default), rather than let the default be decided automatically?
In this example, the script is going to print some output to the given file. If the file was not specified by the user (i.e. the default was used automatically), then the script should not produce an error when it is unable to write to that location, and should just carry on silently. However, if the user did specify the location and the script is unable to write to it, then it should produce an error and stop, warning the user. How could I implement this kind of logic?
The simplest way to tell whether a parameter was specified when it has a default is to check $PSBoundParameters. For example:
if ($PSBoundParameters.ContainsKey('logOutput')) {
    # User specified -logOutput
} else {
    # User did not specify -logOutput
}
Would this work for you?
function Test-Param
{
    [CmdletBinding()]
    param(
        [ValidateScript({
            Try {
                $null | Set-Content $_ -ErrorAction Stop
                Remove-Item $_
                $True
            }
            Catch { $False }
        })]
        [string] $logOutput = 'C:\SomeFolder\File.txt'
    )
}
The validation script only runs when a value is passed via the -logOutput parameter, and it throws an error if it is unable to write to that file location. If the parameter is not specified, the test is not run and $logOutput is set to the default value.

Using an answer file with a PowerShell script

I have a PowerShell script with a number of parameters at the start:
param(
    [switch] $whatif,
    [string] $importPath = $(Read-Host "Full path to import tool"),
    [string] $siteUrl = $(Read-Host "Enter URL to create or update"),
    [int] $importCount = $(Read-Host "Import number")
)
Is there any way I can run this against an answer file to avoid entering the parameter values every time?
I'm not sure I understand the reason for the question. All you have to do to call your script is something like:
.\script.ps1 -whatif -importPath import_path -siteUrl google.com -importCount 1
The Read-Host calls are there as defaults, executed (and their results assigned to the parameters) only if you don't specify the values. As long as you have the above command (saved in a file so that you can copy and paste it into the console, or run it from another script), you don't have to enter the values again and again.
Start by setting the function or script up to accept pipeline input.
[CmdletBinding(SupportsShouldProcess=$True,ConfirmImpact='Low')]
param(
    [Parameter(Mandatory=$True,ValueFromPipelineByPropertyName=$True)]
    [string] $importPath,

    [Parameter(Mandatory=$True,ValueFromPipelineByPropertyName=$True)]
    [string] $siteUrl,

    [Parameter(Mandatory=$True,ValueFromPipelineByPropertyName=$True)]
    [int] $importCount
)
Notice that I removed your manually-created -whatif. No need for it - I'll get to it in a second. Also note that Mandatory=$True will make PowerShell prompt for a value if it isn't provided, so I removed your Read-Host.
Given the above, you could create an "answer file" that is a CSV file. Make an importPath column, a siteURL column, and an importCount column in the CSV file:
importPath,siteURL,importCount
"data","data",1
"x","y",2
Then do this:
Import-CSV my-csv-file.csv | ./My-Script
Assuming your script is My-Script.ps1, of course.
Now, to -whatif. Within the body of your script, do this:
if ($PSCmdlet.ShouldProcess($target)) {
    # do whatever your action is here
}
This assumes you're doing something to $target, which might be a path, a computer name, a URL, or whatever. It's the thing you're modifying in your script. Put your modification actions/commands inside that if construct. Doing this, along with the SupportsShouldProcess() declaration at the top of the script, will enable -whatif and -confirm support. You don't need to code those parameters yourself.
What you're building is called an "advanced function," or, since it's a script, I guess it'd be an "advanced script." Utilizing pipeline input parameters in this fashion is the PowerShell way of doing things.
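Putting those pieces together, a minimal sketch of the script (only the siteUrl parameter is shown; the action inside ShouldProcess is a placeholder):

```powershell
# My-Script.ps1 - sketch: pipeline-bound parameter plus -WhatIf support.
[CmdletBinding(SupportsShouldProcess = $true, ConfirmImpact = 'Low')]
param(
    [Parameter(Mandatory = $true, ValueFromPipelineByPropertyName = $true)]
    [string] $siteUrl
)
process {
    if ($PSCmdlet.ShouldProcess($siteUrl, 'Create or update site')) {
        # real modification action goes here
        Write-Verbose "Processing $siteUrl"
    }
}
```

Running Import-CSV my-csv-file.csv | .\My-Script.ps1 -WhatIf then prints "What if:" messages for each row instead of performing the action.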
To my knowledge, PowerShell doesn't have a built-in understanding of answer files. You'll have to pass the values in somehow, or read them from the answer file yourself.
Wrapper. You could write another script that calls this script with the same parameters you want to use every time. You could also make a wrapper script that reads the values from the answer file and then passes them in.
Optional Parameters. Or you could change the parameters to use defaults that indicate no parameters were passed, then check for a file of a specific name to read values from. If the file isn't found, then prompt for the values.
If the format of the answer file is flexible (i.e., you're only going to be using it with this PowerShell script), you could get much closer to the behavior of an actual answer file by writing it as a PowerShell script itself and dot-sourcing it:
if (Test-Path 'myAnswerFile') {
    . 'myAnswerFile'
    # process whatever was sourced from the answer file, if necessary
} else {
    # prompt for values
}
It still requires removing the Read-Host calls from the parameters of the script.
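For completeness, the dot-sourced answer file would itself just be PowerShell assignments - a hypothetical example (the values are illustrative; the variable names match the script's parameters):

```powershell
# myAnswerFile.ps1 - hypothetical answer file, dot-sourced by the main script
$importPath  = 'C:\tools\import.exe'
$siteUrl     = 'https://example.com'
$importCount = 1
```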
Following on from Joel, you could set up a different parameter set based around a switch, -answerfile.
If that's set, the function looks for an answer file and parses through it - as he said, you'll need to do that yourself. If it's not set and the others are, then the function is used with the parameters given. A minor benefit I see is that you can still make the parameters mandatory when used that way.
Matt