The way our business is set up, our custom cmdlets are spread out across the network in several different larger files. We reference these files in the usual "Microsoft.PowerShell_profile.ps1".
Is there something I can run within PowerShell to find where a cmdlet is defined, instead of manually going through all the files referenced in "Microsoft.PowerShell_profile.ps1" to find it?
E.g. if I have written a cmdlet called Get-UserExpiry and it is saved in C:\User\Name\Documents\CustomCmds.ps1, and I include that file path in Microsoft.PowerShell_profile.ps1, is there a command I can use to find that file path if all I know is the cmdlet name?
Get-Command is what you need. That being said, depending on the command you are testing and the type of the command (external application, function, cmdlet, profile function), the command path won't always be assigned to the same property / subproperty.
Here's a way to get the path no matter where it is disclosed.
Function definition
Here we check the possible locations of the path based on the Get-Command result, filter out everything that is $null or empty and pick the first result we get.
Function Get-CommandLocation($Command) {
    $definition = Get-Command -Name $Command
    @(
        $definition.Module.Path
        $definition.Source
        $definition.ScriptBlock.File
    ) | Where-Object { $_ } | Select-Object -First 1
}
Examples
Get-CommandLocation Get-Item # Native cmdlet
# Value obtained from (Get-Command -Name 'Get-Item').Module.Path
# Return C:\Windows\system32\WindowsPowerShell\v1.0\Modules\Microsoft.PowerShell.Management\Microsoft.PowerShell.Management.psd1
Get-CommandLocation Edit-Profile # Custom profile function (Test with a function in your profile (if any))
# Value obtained from (Get-Command -Name 'Edit-Profile').ScriptBlock.File
# Return C:\Users\SagePourpre\Documents\WindowsPowerShell\Microsoft.VSCode_profile.ps1
Get-CommandLocation New-ExternalHelp # PlatyPS module downloaded from the gallery
# Value obtained from (Get-Command New-ExternalHelp).Module.Path
# Return C:\Program Files\WindowsPowerShell\Modules\platyPS\0.14.2\platyPS.psm1
Get-CommandLocation -command cmd # External Program
# Value located in (Get-Command -Name 'cmd').Source
# Return C:\Windows\system32\cmd.exe
Environment:
Windows Server 2016
Windows 10 Pro
PowerShell 5.1
$myVariable is empty, and I'm expecting there to be a string value.
Invoke-Command -ComputerName WKSP000D1E3F -Credential $creds -ScriptBlock {
sqlcmd -E -Q "select top 1 FirstName from customers" -d database1 -S "(localdb)\ProjectsV13" | Tee-Object -Variable myVariable
}
Write-Host $myVariable
Cpt.Whale has provided the crucial pointer in a comment: you fundamentally cannot set local variables from a script block being executed remotely (via Invoke-Command -ComputerName) - you must use output from the script block to communicate data back to the caller.
While you could apply Tee-Object locally instead (Invoke-Command ... | Tee-Object), there's a simpler solution, which works with all cmdlets, including cmdlet-like (advanced) functions and scripts:
Use the common -OutVariable (-ov) parameter to capture a cmdlet's output in a self-chosen variable while passing that output through:
# Note the `-OutVariable myVariable` part
# and that the variable name must be specified *without* a leading "$"
# Output is still being passed through.
Invoke-Command -OutVariable myVariable -ComputerName WKSP000D1E3F -Credential $creds -ScriptBlock {
sqlcmd -E -Q "select top 1 FirstName from customers" -d database1 -S "(localdb)\ProjectsV13"
}
# $myVariable now contains the captured content.
By contrast, if you want to capture output only, without also passing it through (to the display, by default), you can heed Santiago Squarzon's advice and simply assign the Invoke-Command call to your variable ($myVariable = Invoke-Command ...).
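For example, a minimal sketch of that direct-assignment approach, using the same remoting call as in the question (output is captured but no longer passed through to the display):
$myVariable = Invoke-Command -ComputerName WKSP000D1E3F -Credential $creds -ScriptBlock {
    sqlcmd -E -Q "select top 1 FirstName from customers" -d database1 -S "(localdb)\ProjectsV13"
}
Write-Host $myVariable   # the captured output is now available locally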
Notes re -OutVariable (-ov):
As shown above, and as with Tee-Object -Variable in your question, the name of the self-chosen target variable must be specified without a leading $, e.g. -OutVariable var, not -OutVariable $var; if you did the latter, the value of a preexisting $var variable (if defined) would be used as the variable name.
Unlike directly captured output, the target variable always receives array(-like) data, specifically, an instance of the System.Collections.ArrayList class - even if only one output object is present; e.g.:
$v1 = Write-Output -OutVariable v2 'one'
"types: v1: $($v1.GetType().Name) vs. v2: $($v2.GetType().Name)"
# -> 'types: v1: String vs. v2: ArrayList'
That is, while directly capturing output captures a single output object as-is, and multiple ones in a regular PowerShell array (of type [object[]]), -OutVariable always creates an ArrayList - see GitHub issue #3154 for a discussion of this inconsistency.
With commands that do not support -OutVariable, namely simple (non-advanced) scripts and functions as well as external programs:
To pass the output through in streaming fashion, i.e. as it becomes available, pipe to Tee-Object -Variable; e.g.:
# Passes output through as it is being emitted.
some.exe | Tee-Object -Variable myVariable
Otherwise - i.e. if it is acceptable to collect all output first, before passing it through - simply enclose an assignment statement in (...) to pass its value through - this approach performs better than Tee-Object -Variable; e.g.:
# Collects all output first, then passes it through.
($myVariable = some.exe)
I have a PowerShell console script (not a GUI) that has the following code:
$Servername = Read-host -Prompt "What is the server name?"
However, when running the script I want to type a few characters at time and have it lookup in external text file called servernames.txt for matches and the more characters I have the finer the results (Ideally I want to be able to select the match directly from the dynamic lookup).
The purpose is to make it easier to type the names of hundreds of servers: you wouldn't need to remember every name, because the servernames.txt file will hold the whole server inventory.
I thought about Out-GridView, but I'm not sure that would work in a console script. Ideally it should not pop up another window.
The Register-ArgumentCompleter cmdlet registers a custom argument completer. An argument completer allows you to provide dynamic tab completion at run time for any command that you specify.
Here is a function example tailored to your question (it assumes C:\servernames.txt):
$scriptBlock = {
param($commandName, $parameterName, $wordToComplete, $commandAst, $fakeBoundParameters)
Get-Content C:\servernames.txt | Where-Object {
$_ -like "*$wordToComplete*"
} | ForEach-Object {
"'$_'"
}
}
# Register the above script block for the ComputerName parameter of the Test-DynamicArguments function
Register-ArgumentCompleter -CommandName Test-DynamicArguments -ParameterName ComputerName -ScriptBlock $scriptBlock
function Test-DynamicArguments {
[CmdletBinding()]
param
(
$ComputerName
)
"You Selected $ComputerName"
}
Now try Test-DynamicArguments with -ComputerName and part of a server name; you can press Tab to cycle through the options, or Ctrl+Space to show them all.
Read the Register-ArgumentCompleter help page for more info. Hope this helps.
Is there a way to provide PowerShell parameters with a file?
At the moment I have a script which is called My_Script.ps1. To start this script I have to provide the right parameters in the command:
.\My_Script.ps1 -param1="x" -param2="x" -param3="x" -param4="x" -param5="x" -param6="x" ...
This works, but it isn't a very easy way to start the script. Is it possible in PowerShell to use a file in which you store your parameters and to use that file when you start the script?
Example
In My_Script.ps1 I add something like:
Param(
[string]$File="Path/to/file"
)
In my file I have something like
param1="x"
param2="x"
param3="x"
param4="x"
...
To execute the script you can edit the file and just start the script with .\My_Script.ps1
Another option:
Just use a .ps1 file as a config file and define your variables as you would in your main script:
$Param1 = "Value"
$Param2 = 42
Then you can use dot-sourcing or Import-Module to get the data from the config file:
. .\configfile.ps1
or
Import-Module .\Configfile.ps1
Afterwards, you can just use the variables.
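For instance, a minimal sketch using the variable names from the config file above (it assumes configfile.ps1 sits next to the main script):
. "$PSScriptRoot\configfile.ps1"    # dot-source the config; $Param1 and $Param2 are now in scope
Write-Output "Param1 is '$Param1' and Param2 is $Param2"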
In addition to splatting you can create variables from = separated values in a file.
param1=foo
param2=bar
param3=herp
param4=derp
Don't quote the values. The parameter names should be valid for a variable (no spaces etc.)
PowerShell 3 and newer:
(Get-Content c:\params.ini -raw | ConvertFrom-StringData).GetEnumerator() |
ForEach { Set-Variable $_.name $_.value }
PowerShell 2:
([IO.File]::ReadAllText('c:\params.ini') | ConvertFrom-StringData).GetEnumerator() |
ForEach { Set-Variable $_.name $_.value }
The code creates the variables in the current scope. It's possible to create them in a global/script/parent scope instead.
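For example, a hedged variant of the same code that creates the variables in the parent (caller's) scope via Set-Variable's -Scope parameter:
(Get-Content c:\params.ini -Raw | ConvertFrom-StringData).GetEnumerator() |
    ForEach-Object { Set-Variable -Name $_.Name -Value $_.Value -Scope 1 }   # 1 = parent scope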
You can use this blog posting for a start and declare your parameters in an INI-like format.
You could also use a CSV-like format and work with the Import-Csv cmdlet.
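For example, a hedged sketch of the CSV approach, assuming a params.csv file with Name and Value columns (the path, file layout, and parameter names are illustrative, taken from the question's example); the rows are turned into a hashtable and splatted onto the script:
# params.csv (assumed layout):
#   Name,Value
#   param1,x
#   param2,x
$splat = @{}
Import-Csv C:\params.csv | ForEach-Object { $splat[$_.Name] = $_.Value }
.\My_Script.ps1 @splat   # splat the name/value pairs onto the script's parameters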
I have a PowerShell script that I am debugging and would like to redirect all Write-Host statements to a file. Is there an easy way to do that?
Up to PowerShell 4.0, Write-Host sends the objects to the host. It does not return any objects.
Beginning with PowerShell 5.0, Write-Host is a wrapper for Write-Information, which allows you to write to the information stream and redirect it with 6>> file_name.
http://technet.microsoft.com/en-us/library/hh849877.aspx
However, if you have a lot of Write-Host statements, replace them all with a Write-Log function that lets you decide whether to output to the console, a file, the event log, or all three.
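Write-Log is not a built-in cmdlet; a minimal sketch of such a function (name, parameters, and log path are illustrative) could look like this:
function Write-Log {
    param(
        [string]$Message,
        [string]$LogFile = "$PSScriptRoot\script.log",
        [switch]$NoConsole
    )
    $line = "[{0:yyyy-MM-dd HH:mm:ss}] {1}" -f (Get-Date), $Message
    if (-not $NoConsole) { Write-Host $line }    # still show it in the console
    Add-Content -Path $LogFile -Value $line      # and append it to the log file
}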
Check also:
Add-Content
redirection operators like >, >>, 2>, 2>>, 2>&1
Write-Log
Tee-Object
Start-Transcript.
You can create a proxy function for Write-Host which sends objects to the standard output stream instead of merely printing them. I wrote the below cmdlet for just this purpose. It will create a proxy on the fly which lasts only for the duration of the current pipeline.
A full writeup is on my blog here, but I've included the code below. Use the -Quiet switch to suppress the console write.
Usage:
PS> .\SomeScriptWithWriteHost.ps1 | Select-WriteHost | out-file .\data.log # Pipeline usage
PS> Select-WriteHost { .\SomeScriptWithWriteHost.ps1 } | out-file .\data.log # Scriptblock usage (safer)
function Select-WriteHost
{
[CmdletBinding(DefaultParameterSetName = 'FromPipeline')]
param(
[Parameter(ValueFromPipeline = $true, ParameterSetName = 'FromPipeline')]
[object] $InputObject,
[Parameter(Mandatory = $true, ParameterSetName = 'FromScriptblock', Position = 0)]
[ScriptBlock] $ScriptBlock,
[switch] $Quiet
)
begin
{
function Cleanup
{
# Clear out our proxy version of write-host
remove-item function:\write-host -ea 0
}
function ReplaceWriteHost([switch] $Quiet, [string] $Scope)
{
# Create a proxy for write-host
$metaData = New-Object System.Management.Automation.CommandMetaData (Get-Command 'Microsoft.PowerShell.Utility\Write-Host')
$proxy = [System.Management.Automation.ProxyCommand]::create($metaData)
# Change its behavior
$content = if($quiet)
{
# In quiet mode, whack the entire function body,
# simply pass input directly to the pipeline
$proxy -replace '(?s)\bbegin\b.+', '$Object'
}
else
{
# In noisy mode, pass input to the pipeline, but allow
# real Write-Host to process as well
$proxy -replace '(\$steppablePipeline\.Process)', '$Object; $1'
}
# Load our version into the specified scope
Invoke-Expression "function ${scope}:Write-Host { $content }"
}
Cleanup
# If we are running at the end of a pipeline, we need
# to immediately inject our version into global
# scope, so that everybody else in the pipeline
# uses it. This works great, but it is dangerous
# if we don't clean up properly.
if($pscmdlet.ParameterSetName -eq 'FromPipeline')
{
ReplaceWriteHost -Quiet:$quiet -Scope 'global'
}
}
process
{
# If a scriptblock was passed to us, then we can declare
# our version as local scope and let the runtime take
# it out of scope for us. It is much safer, but it
# won't work in the pipeline scenario.
#
# The scriptblock will inherit our version automatically
# as it's in a child scope.
if($pscmdlet.ParameterSetName -eq 'FromScriptBlock')
{
. ReplaceWriteHost -Quiet:$quiet -Scope 'local'
& $scriptblock
}
else
{
# In a pipeline scenario, just pass input along
$InputObject
}
}
end
{
Cleanup
}
}
You can run your script in a secondary PowerShell shell and capture the output like this:
powershell -File 'Your-Script.ps1' > output.log
That worked for me.
Using redirection will cause Write-Host to hang. This is because Write-Host deals with various formatting issues that are specific to the current terminal being used. If you just want your script to have flexibility to output as normal (default to shell, with capability for >, 2>, etc.), use Write-Output.
Otherwise, if you really want to capture the peculiarities of the current terminal, Start-Transcript is a good place to start. Otherwise you'll have to hand-test or write some complicated test suites.
Try adding an asterisk * before the angle bracket > to redirect all streams:
powershell -File Your-Script.ps1 *> output.log
When stream redirection is requested, if no specific stream is indicated, then by default only the success stream (1>) is redirected. Write-Host is a wrapper for Write-Information, which writes to the information stream (6>). To redirect all streams, use *>.
PowerShell 7.1 supports redirection of multiple output streams:
Success Stream (#1): PowerShell 2.0 Write-Output
Error Stream (#2): PowerShell 2.0 Write-Error
Warning Stream (#3): PowerShell 3.0 Write-Warning
Verbose Stream (#4): PowerShell 3.0 Write-Verbose
Debug Stream (#5): PowerShell 3.0 Write-Debug
Information Stream (#6): PowerShell 5.0 Write-Information
All Streams (*): PowerShell 3.0
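For example, a hedged sketch of redirecting individual streams to separate files, or everything at once (script and file names are illustrative):
.\Your-Script.ps1 2> errors.log 3> warnings.log 6> information.log   # per-stream redirection
.\Your-Script.ps1 *> everything.log                                  # all streams at once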
This worked for me in my first PowerShell script, which I wrote a few days back:
function logMsg($msg)
{
Write-Output $msg
Write-Host $msg
}
Usage in a script:
logMsg("My error message")
logMsg("My info message")
PowerShell script execution call:
ps> .\myFirstScript.ps1 >> testOutputFile.txt
It's not exactly an answer to this question, but it might help someone trying to log both to the console and to a log file, which is what I achieved here :)
Define a function called Write-Host. Have it write to a file. You may have some trouble if some invocations use a weird set of arguments. Also, this will only work for invocations that are not module- or snap-in-qualified (e.g. Microsoft.PowerShell.Utility\Write-Host bypasses the function).
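A minimal sketch of that approach (the log path is illustrative); it is deliberately a simple, non-advanced function, so arguments it does not declare, such as -ForegroundColor, land in $args and are silently ignored:
function Write-Host {
    param($Object)
    Add-Content -Path "C:\temp\write-host.log" -Value $Object   # append to the file instead of printing
}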
If you have just a few Write-Host statements, you can use the "6>>" redirection operator to send them to a file:
Write-Host "Your message." 6>> file_path_or_file_name
This is the "Example 5: Suppress output from Write-Host" provided by Microsoft, modified accordingly to about_Operators.
I just added Start-Transcript at the top of the script and Stop-Transcript at the bottom.
The output file was intended to be named <folder where script resides>-<datestamp>.rtf, but for some reason the transcript file ended up where I did not expect it: on the desktop!
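If the transcript's location matters, a minimal sketch that pins it down with -Path (the folder and file-name pattern are illustrative):
Start-Transcript -Path "C:\Logs\MyScript-$(Get-Date -Format yyyyMMdd).log"
# ... script body, including Write-Host output ...
Stop-Transcript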
You should not use Write-Host if you wish to have the messages in a file. It is for writing to the host only.
Instead you should use a logging module, or Set/Add-Content.
I have found the best way to handle this is to have a logging function that detects whether there is a host UI and acts accordingly. When the script is executed in interactive mode, it shows the details in the host UI, but when it is run via WinRM or in non-interactive mode, it falls back on Write-Output so that you can capture the output using the > or *> redirection operators.
function Log-Info ($msg, $color = "Blue") {
if($host.UI.RawUI.ForegroundColor -ne $null) {
Write-Host "`n[$([datetime]::Now.ToLongTimeString())] $msg" -ForegroundColor $color -BackgroundColor "Gray"
} else {
Write-Output "`r`n[$([datetime]::Now.ToLongTimeString())] $msg"
}
}
In cases where you want to capture the full output with the Write-Host coloring, you can use the Get-ConsoleAsHtml.ps1 script to export the host's scrolling buffer to an HTML or RTF file.
Use Write-Output instead of Write-Host, and redirect it to a file like this:
Deploy.ps1 > mylog.log or Write-Output "Hello World!" > mylog.log
Try using Write-Output instead of Write-Host.
The output goes down the pipeline, but if this is the end of the pipe, it goes to the console.
> Write-Output "test"
test
> Write-Output "test" > foo.txt
> Get-Content foo.txt
test