Is there a way to echo a PowerShell command line?

I've got several Powershell scripts under construction, and one thing I'd like to do in them is spit out a line at the top of the output echoing the command line used.
Use case is: output is being redirected to a file, and a year from now when someone examines that file, I want them to be able to copy/paste the command from the output file to regenerate the same output where the only differences are chronological. [ Okay, that was a little too generic ... first case: I'm examining ACLs and want to be able to repeat the same examination on the newer data at any point in the future by simply copy/pasting the same command. ]
My script begins with the parameter definitions:
[CmdletBinding()]
Param(
[string] $filter="Name -like '*'",
[string] $user=$null,
[switch] $test01=$false,
[switch] $test02=$false
)
What I'm doing now is a fall-back position: knowing what parameters can be accepted, I'm dumping out the names and values of those parameters:
if ($user.Length -eq 0) { $u = "NULL" } else { $u = "|$user|" }
if ($test01) { $u += ", -TEST01" }
if ($test02) { $u += ", -TEST02" }
"RUN BEGINS at $((get-date).ToString('F')) -- Filter is |$filter|, User is $u"
Ugly, hacky, not even a hint of portability in it, and definitely NOT a copy/paste of the command.
Regardless, this CAN be mangled into a command line; but not generically, and not with 100% surety.
I've tried using $args, but apparently either defining named parameters or using CmdletBinding() breaks that mechanism, because it's always empty. I've also tried $PSBoundParameters, Get-History, and even bash-like $0 .. $9 variables. So far, nothing I can find gives the command line that launched the script that's running.
$PSBoundParameters is close: it's got all the right data as key/value pairs that could be built up into a command line, but it still isn't a command line, and would require mangling to get it into one.
Get-History came even closer, as it includes a complete command line; the problem is it gives the command run RIGHT BEFORE the command that launched the script, not the command that launched it.
Running out of options ... but am way open to suggestion.

Found it! And it was as simple as I'd hoped it would be. [ to use ... finding it was a pain ]
$MyInvocation.Line
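For reference, a minimal sketch of how that might look at the top of a script like the one above (the RUN BEGINS wording is just carried over from the logging line in the question):
# $MyInvocation.Line holds the literal text of the command that launched the script,
# so this log line can be copied straight out of the output file and rerun later.
"RUN BEGINS at $((Get-Date).ToString('F')) -- $($MyInvocation.Line)"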

While I agree that $MyInvocation.Line will get the literal command used, which seems to be what you want on the surface, I'd still argue that the data in $PSBoundParameters is more useful long term, simply because you can't guarantee users will call your function in a way that makes the recorded command line actually useful.
Consider the common case where callers have declared variables to hold parameter values:
$myfilter = "Name -like '*Joe*'"
MyFunction -filter $myfilter
Consider the case where callers create a hashtable to splat with:
$myParams = @{
filter = "Name -like '*Joe*'"
test01 = $true
}
MyFunction @myParams
If you only record the command line, you'd lose the parameter data in both of these cases. And if you really want a literal command that people can copy/paste from the log, it shouldn't be that hard to generate a synthetic command based on the data in $PSBoundParameters. It doesn't have to be literally the same command as long as the same parameter data gets passed in, right?
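For what it's worth, here's a rough sketch of such a synthetic command built from $PSBoundParameters (it assumes the values are plain strings, numbers, or switches; values containing single quotes would need extra escaping):
$parts = foreach ($kv in $PSBoundParameters.GetEnumerator()) {
    if ($kv.Value -is [switch]) {
        # re-emit switches that were set as bare flags
        if ($kv.Value) { "-$($kv.Key)" }
    }
    else {
        # single-quote everything else; good enough for simple values
        "-$($kv.Key) '$($kv.Value)'"
    }
}
"RUN BEGINS at $((Get-Date).ToString('F')) -- $($MyInvocation.MyCommand.Name) $($parts -join ' ')"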

Related

Powershell Profile to append parameters to a certain command

I have a certain command that I want to be able to append a parameter to via a PowerShell profile function. I'm not quite sure of the best way to capture each time this command is run, so any insight would be helpful.
Command: terraform plan
Each time a plan is run I want to be able to check the parameters and see if -lock=true is passed in, and if not then append -lock=false to it. Is there a suitable way to capture when this command is run, without just creating a whole new function that builds that command? So far the only way I've seen to capture commands is with Start-Transcript, but that doesn't quite get me to where I need to be.
The simplest approach is to create a wrapper function that analyzes its arguments and adds -lock=false as needed before calling the terraform utility.
function terraform {
$passThruArgs = $args
if (-not ($passThruArgs -match '^-lock=')) { $passThruArgs += '-lock=false'}
& (Get-Command -Type Application terraform) $passThruArgs
}
The above uses the same name as the utility, effectively shadowing the latter, as is your intent.
However, I would caution against using the same name for the wrapper function, as it can make it hard to understand what's going on.
Also, if defined globally via $PROFILE or interactively, any unsuspecting code run in the same session will call the wrapper function, unless an explicit path or the shown Get-Command technique is used.
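With the wrapper in place, calls look unchanged from the caller's point of view; for example:
terraform plan              # wrapper appends -lock=false
terraform plan -lock=true   # a -lock= argument is already present, so the arguments pass through as-is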
Not to take away from the other answer posted, but to offer an alternative solution, here's my take:
$Global:CMDLETCounter = 0
$ExecutionContext.InvokeCommand.PreCommandLookupAction = {
Param($CommandName, $CommandLookupEvents)
if ($CommandName -eq 'terraform' -and $Global:CMDLETCounter -eq 0)
{
$Global:CMDLETCounter++
$CommandLookupEvents.CommandScriptBlock = {
if ($Global:CMDLETCounter -eq 1)
{
if (-not ($args -match ($newArg = '-lock=')))
{
$args += "${newArg}true"
}
}
& "terraform" #args
$Global:CMDLETCounter--
}
}
}
You can make use of the $ExecutionContext automatic variable to tap into PowerShell's command lookup and insert your own logic for a specific command. In your case that's terraform: each time the command is looked up, the existing arguments are checked for -lock=, and if it's not found, -lock=true is appended to the current arguments before the command is executed.
The counter you see ($Global:CMDLETCounter) is there to prevent an endless loop, since the script block calls terraform itself and would otherwise recursively trigger its own lookup hook with nothing to halt it.
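One caveat worth knowing: the hook stays active for the rest of the session. If you want to undo it later, the action can be cleared by assigning $null to it:
# remove the custom command-lookup hook and reset the guard counter
$ExecutionContext.InvokeCommand.PreCommandLookupAction = $null
$Global:CMDLETCounter = 0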

Referencing text after script is called within PS1 Script

Let's take the PowerShell statement below as an example:
powershell.exe c:\temp\windowsbroker.ps1 IIS
Is it possible to have it scripted within windowsbroker.ps1 to check for that IIS string, and if it's present to do a specific install script? The broker script would be intended to install different applications depending on what string followed it when it was called.
This may seem like an odd question, but I've been using CloudFormation to spin up application environments and I'm specifying an "ApplicationStack" parameter that will be referenced at the time when the powershell script is run so it knows which script to run to install the correct application during bootup.
What you're trying to do is called argument or parameter handling. In its simplest form PowerShell provides all arguments to a script in the automatic variable $args. That would allow you to check for an argument IIS like this:
if ($args -contains 'iis') {
# do something
}
or like this if you want the check to be case-sensitive (which I wouldn't recommend, since Windows and PowerShell usually aren't):
if ($args -ccontains 'IIS') {
# do something
}
However, since apparently you want to use the argument as a switch to trigger specific behavior of your script, there are better, more sophisticated ways of doing this. You could add a Param() section at the top of your script and check if the parameter was present in the arguments like this (for a list of things to install):
Param(
[Parameter()]
[string[]]$Install
)
$Install | ForEach-Object {
switch ($_) {
'IIS' {
# do something
}
...
}
}
or like this (for a single option):
Param(
[switch]$IIS
)
if ($IIS.IsPresent) {
# do something
}
You'd run the script like this:
powershell "c:\temp\windowsbroker.ps1" -Install "IIS",...
or like this respectively:
powershell "c:\temp\windowsbroker.ps1" -IIS
Usually I'd prefer switches over parameters with array arguments (unless you have a rather extensive list of options), because with the latter you have to worry about the spelling of the array elements, whereas with switches you get a built-in spell check.
Using a Param() section will also automatically add a short usage description to your script:
PS C:\temp> Get-Help windowsbroker.ps1
windowsbroker.ps1 [-IIS]
You can further enhance this online help to your script via comment-based help.
Using parameters has a lot of other advantages on top of that (even though they probably aren't of that much use in your scenario). You can do parameter validation, make parameters mandatory, define default values, read values from the pipeline, make parameters depend on other parameters via parameter sets, and so on. See the about_Functions_Advanced_Parameters help topic for more information.
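As a rough sketch, comment-based help plus one of the validation features mentioned above could look like this (the ValidateSet values other than IIS are made-up placeholders):
<#
.SYNOPSIS
    Installs the requested application stack(s).
.PARAMETER Install
    One or more application names to install, e.g. IIS.
#>
Param(
    [Parameter(Mandatory=$true)]
    [ValidateSet('IIS','SQL','Apache')]
    [string[]]$Install
)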
Yes, they are called positional parameters. You provide the parameters at the beginning of your script:
Param(
[string]$appToInstall
)
You could then write your script as follows:
switch ($appToInstall){
"IIS" {"Install IIS here"}
}

Passing string variable to Sharepoint command without desired outcome

I have a number of fields that I am using to build a string for an extraction of values from a SharePoint 2013 list.
I use this to build the string.
foreach($Column in $StringColumns){
$Fields=$Fields+"`""+$Column+"`""
if($Loop -ne $ColumnCount){
$Fields=$Fields+","
$Loop++}
}
I take the built $Fields [string] variable and pass it to this command.
$SPList.getitems($queryfromsource)[$ItemNumber][$Fields]
The result is that I receive no output from the command. What makes it odd is that I can confirm that $Fields has the appropriate string in it for that command. I have done so by calling it in the console and then copying the output into the SharePoint command directly. When I do that, I receive the output I am looking for.
This seems like it should be incredibly simple but it is driving me insane.
Would something like this work?
foreach ($Column in $StringColumns)
{
$Fields+= "`"$($Column)`""
if ($Loop -ne $ColumnCount)
{
$Fields+= ","
$Loop++
# Only use Write-Host for debugging
Write-Host "Column $Loop of $ColumnCount"
}
}
Do you initialize $Loop somewhere? You might try having it print the loop number you're on.
You probably should learn a bit about string-building techniques in PowerShell. The convoluted construct you have there is not just too much work to create, it's also difficult to debug, as your example proves perfectly. You need to initialize $Loop to 1, not 0 (I assume that's what you are doing; please always post the complete code), or put the $Loop++ at the beginning of the loop; otherwise the string ends with a trailing comma. You also need to initialize the $Fields variable to an empty string.
Have you checked the string output of your script by printing the actual variable? If you check it in the console, you might have already initialized some variable which you forgot to do in the script.
A much easier way to build the string:
$columns = 'Title', 'Created', 'Description'
$string = "`"$($columns -join '","')`""

Indent output from private functions in Powershell

I have a bunch of functions that I call that produce output that is displayed to the console. Functions might look something like the following:
exec { & .\xunit.console.clr4 tests.xunit }
#or
exec { & .\nuget.exe pack $source_dir\ZocMonLib\NuSpec\ZocMon.nuspec -OutputDirectory $build_dir\local -Symbols -Version $version }
Now I know I could do something like powershell indentation but that only works if I control the output.
How do I do the indenting of output for these private functions?
Ok, I wrote a version that does the line wrapping right. But it's slightly complicated. I posted it on PoshCode http://poshcode.org/3386
That should work for Write-Host or Write-Verbose, but it will not work if those functions are actually outputting objects -- you'd have to pipe to Write-Host.
The function on PoshCode will (optionally) auto-indent based on the stack depth, but also allow you to specify -Pad 5 or something to manually indent, so you can call nuget.exe ... | write-host -pad 5 or just stick | Write-Host wherever you need it, and then set $WriteVerboseAutoIndent = $true ...
Hope that helps -- it does do manual line wrapping on the output of exes, so it should work.
There's not a great solution, because PowerShell doesn't always run in the console window. Other hosting applications might or might not support tab characters, and might not even support Write-Host. If your goal is strictly to support console display, consider writing a "Format-Console" function.
nuget list NuGetPowerTools | Format-Console
Inside that function, you can capture the pipeline input (which I presume would be strings since this is an external command). Each line of output would be a single String object, so...
Write-Host " $x"
Would display that indented by four spaces.
function Format-Console {
[CmdletBinding()]
param([Parameter(ValueFromPipeline=$True)][string[]]$inputObject)
PROCESS { Write-Host "    $inputObject" }
}
That's kinda quick and dirty, but assuming you only ever pipe strings to it, it'll work. Building this as a function lets it be more reusable; using the Format- verb cues other users that the output of this isn't intended to be consumable. It technically isn't a true "Format" cmdlet, since it doesn't output internal formatting directives, but it's consistent with the usage pattern for the other Format cmdlets.
Can't you assign the result of your private functions to a string and "tab" that string?
$x = nuget list NuGetPowerTools
Write-Host "`t`t$x"

Using an answer file with a PowerShell script

I have a PowerShell script with a number of 'params' at the start:
param(
[switch] $whatif,
[string] $importPath = $(Read-Host "Full path to import tool"),
[string] $siteUrl = $(Read-Host "Enter URL to create or update"),
[int] $importCount = $(Read-Host "Import number")
)
Is there any way I can run this against an answer file to avoid entering the parameter values every time?
I am not getting the reason for the question. All you have to do to call your script is something like:
.\script.ps1 -whatif -importPath import_path -siteUrl google.com -importCount 1
The Read-Host calls are there as defaults, to be executed (and the values read and assigned to the parameters) only if you don't specify the values. As long as you have the above command (saved in a file so that you can copy and paste it into the console, run it from another script, or whatever), you don't have to enter the values again and again.
Start by setting the function or script up to accept pipeline input.
[CmdletBinding(SupportsShouldProcess=$True,ConfirmImpact='Low')]
param(
[Parameter(Mandatory=$True,ValueFromPipelineByPropertyName=$True)]
[string] $importPath,
[Parameter(Mandatory=$True,ValueFromPipelineByPropertyName=$True)]
[string] $siteUrl,
[Parameter(Mandatory=$True,ValueFromPipelineByPropertyName=$True)]
[int] $importCount
)
Notice that I removed your manually-created -whatif. No need for it - I'll get to it in a second. Also note that Mandatory=$True will make PowerShell prompt for a value if it isn't provided, so I removed your Read-Host.
Given the above, you could create an "answer file" that is a CSV file. Make an importPath column, a siteURL column, and an importCount column in the CSV file:
importPath,siteURL,importCount
"data","data",1
"x","y",2
Then do this:
Import-CSV my-csv-file.csv | ./My-Script
Assuming your script is My-Script.ps1, of course.
Now, to -whatif. Within the body of your script, do this:
if ($pscmdlet.shouldprocess($target)) {
# do whatever your action is here
}
This assumes you're doing something to $target, which might be a path, a computer name, a URL, or whatever. It's the thing you're modifying in your script. Put your modification actions/commands inside that if construct. Doing this, along with the SupportsShouldProcess() declaration at the top of the script, will enable -whatif and -confirm support. You don't need to code those parameters yourself.
What you're building is called an "Advanced Function," or if it's just a script then I guess it'd be an "Advanced Script." Utilizing pipeline input parameters in this fashion is the "PowerShell way of doing things."
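With that in place, a dry run against the answer file is just a matter of adding the common -WhatIf parameter:
Import-CSV my-csv-file.csv | ./My-Script -WhatIf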
To my knowledge, PowerShell doesn't have a built-in understanding of answer files. You'll have to pass the values in somehow or read them yourself from the answer file.
Wrapper. You could write another script that calls this script with the same parameters you want to use every time. You could also make a wrapper script that reads the values from the answer file and then passes them in (a minimal sketch follows below).
Optional Parameters. Or you could change the parameters to use defaults that indicate no parameters were passed, then check for a file of a specific name to read values from. If the file isn't found, then prompt for the values.
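A minimal sketch of the wrapper idea (the file names and values here are made up; substitute your real script and answers):
# Run-Import.ps1 -- calls the real script with the answers baked in
.\script.ps1 -importPath 'C:\tools\import.exe' -siteUrl 'http://example.com' -importCount 1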
If the format of the answer file is flexible (i.e., you're only going to be using it with this PowerShell script), you could get much closer to the behavior of an actual answer file by writing it as a PowerShell script itself and dot-sourcing it.
if (test-path 'myAnswerfile'){
. 'myAnswerFile'
#process whatever was sourced from the answer file, if necessary
} else {
#prompt for values
}
It still requires removing the Read-Host calls from the parameters of the script.
Following on from Joel, you could set up a different parameter set based around the switch -answerfile.
If that's set, the function will look for an answer file and parse through it; as he said, you'll need to do that yourself. If it's not set and the others are, then the function is used with the parameters given. A minor benefit I see is that you can still have the parameters mandatory when used that way.
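A rough sketch of what that split might look like (untested; the parameter set names are just placeholders):
[CmdletBinding(DefaultParameterSetName='Manual')]
Param(
    [Parameter(Mandatory=$true, ParameterSetName='AnswerFile')]
    [switch]$answerfile,

    [Parameter(Mandatory=$true, ParameterSetName='Manual')]
    [string]$importPath,

    [Parameter(Mandatory=$true, ParameterSetName='Manual')]
    [string]$siteUrl,

    [Parameter(Mandatory=$true, ParameterSetName='Manual')]
    [int]$importCount
)

if ($PSCmdlet.ParameterSetName -eq 'AnswerFile') {
    # read and parse the answer file here
}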
Matt