Temporarily change powershell language to English? - powershell

I wrote some software that uses the output of system (powershell) commands, but did not foresee that the output would be different for languages other than English.
Is there a way to temporarily change the language in Powershell to English for just that one, single powershell session?
Notes
In case it is of significance, the particular powershell code I wish to run is netstat -n -a
I have come across some ways to change powershell language (e.g. here, here). But I want to be careful not to change it permanently! (that would be bad)

(a) For external programs such as netstat.exe, there is unfortunately no way (that I know of) to change the UI language in-session:
On Windows Server 2012 / Windows 8 and above, the Set-WinUILanguageOverride cmdlet allows you to (persistently) change the system-wide UI language for the current user, but that only takes effect in future logon sessions - that is, logging off and back on or a reboot are required.
As an aside: On Windows Server 2012 / Windows 8 and above, there is also the Set-Culture cmdlet, but its purpose is not to change the UI culture (display language), but only culture-specific settings such as date, number, and currency formats. It too changes the setting persistently for the current user, but only requires a new session (process) for the change to take effect.
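For completeness, here is a sketch of what those persistent settings look like (do not use this if you only want a temporary change, because both cmdlets modify the current user's settings):
# Persistent, current-user-only changes - shown for completeness, NOT for a single session.
Set-WinUILanguageOverride -Language en-US   # UI (display) language; takes effect at the next logon
Set-Culture -CultureInfo en-US              # date / number / currency formats; takes effect in new sessions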
(b) For PowerShell commands and .NET types, there is an in-session (non-persistent) solution - assuming the commands are culture-aware and come with localized strings:
Set [cultureinfo]::CurrentUICulture (temporarily) to the desired culture name (use [cultureinfo]::GetCultures('SpecificCultures') to see all predefined ones); e.g., [cultureinfo]::CurrentUICulture = 'en-US'
Complementarily, you may want to set [cultureinfo]::CurrentCulture (note the missing UI part) as well, which determines the culture-specific number, date, ... formatting.
In older versions of PowerShell / .NET, you'll have to set these properties on [System.Threading.Thread]::CurrentThread instead; e.g.,
[System.Threading.Thread]::CurrentThread.CurrentUICulture = 'en-US'
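To discover valid culture names to assign, you can enumerate the predefined cultures mentioned above; for instance, a quick sketch that narrows the list to the English variants:
# List the English-language cultures with their display names.
[cultureinfo]::GetCultures('SpecificCultures') | Where-Object Name -like 'en-*' | Select-Object Name, DisplayName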
See the bottom section for helper function Use-Culture that wraps this functionality for execution of code with a different culture temporarily in effect; here's an example call
with the culture-sensitive Get-LocalGroupMember cmdlet:
# Try with values other than "en-US", e.g. "fr-FR" to see localized
# values in the "ObjectClass" output column.
Use-Culture en-US { Get-LocalGroupMember Administrators }
An ad hoc example, if you don't want to define a helper function (only the UI culture is changed here):
& {
  $prev = [cultureinfo]::CurrentUICulture
  [cultureinfo]::CurrentUICulture = 'en-US'
  Get-LocalGroupMember Administrators
  [cultureinfo]::CurrentUICulture = $prev
}
Caveats:
PowerShell [Core] itself is not localized yet, as of v7.2.x; progress is being tracked in GitHub issue #666; however, the solution below does work with third-party modules that ship with localized messages and help content, as well as select Windows-specific modules that talk to platform APIs, such as the Microsoft.PowerShell.LocalAccounts module, whose Get-LocalGroupMember cmdlet was used in the example above.
Due to a bug in Windows PowerShell (PowerShell [Core] v6+ is not affected), in-session changes to [cultureinfo]::CurrentUICulture and [cultureinfo]::CurrentCulture are automatically reset at the command prompt, whenever a command finishes executing; however, for a given script the changes remain in effect for the entire script and its callees - see this answer.
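To see that Windows PowerShell caveat in action, submit the following two statements as separate commands at the prompt (a sketch; in PowerShell [Core] v6+ the change would persist):
[cultureinfo]::CurrentUICulture = 'fr-FR'   # submitted on its own...
[cultureinfo]::CurrentUICulture             # ...a subsequent prompt already reports the original UI culture again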
Taking a step back:
I wrote some software that uses the output of system (powershell) commands, but did not foresee that the output would be different for languages other than English.
This is precisely why it's generally worth looking for PowerShell-native solutions as opposed to calling external programs:
Instead of having to parse - possibly localized - text, as with netstat.exe, for instance, PowerShell commands return objects whose properties you can robustly access in a culture-independent fashion.
Specifically, Mathias R. Jessen suggests looking at Get-NetTCPConnection as a PowerShell alternative to netstat.exe (available on Windows Server 2012 / Windows 8 and above).
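For example, the listening TCP ports that netstat -n -a would report (minus the UDP endpoints) can be obtained as objects whose properties you can query directly, with no localized text to parse - a sketch:
# Objects, not localized text: filter and project properties directly.
Get-NetTCPConnection -State Listen |
  Select-Object LocalAddress, LocalPort, OwningProcess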
Function Use-Culture's source code:
Note: The code was gratefully adapted from this venerable blog post.
# Runs a script block in the context of the specified culture, without changing
# the session's culture persistently.
# Handy for quickly testing the behavior of a command in the context of a different culture.
# Example:
# Use-Culture fr-FR { Get-Date }
function Use-Culture
{
  param(
    [Parameter(Mandatory)] [cultureinfo] $Culture,
    [Parameter(Mandatory)] [scriptblock] $ScriptBlock
  )
  # Note: In Windows 10, a culture-info object can be created from *any* string.
  #       However, an identifier that doesn't refer to a *predefined* culture is
  #       reflected in .LCID containing 4096 (0x1000)
  if ($Culture.LCID -eq 4096) { Throw "Unrecognized culture: $($Culture.DisplayName)" }
  # Save the current culture / UI culture values.
  $PrevCultures = [Threading.Thread]::CurrentThread.CurrentCulture, [Threading.Thread]::CurrentThread.CurrentUICulture
  try {
    # (Temporarily) set the culture and UI culture for the current thread.
    [Threading.Thread]::CurrentThread.CurrentCulture = [Threading.Thread]::CurrentThread.CurrentUICulture = $Culture
    # Now invoke the given code.
    & $ScriptBlock
  }
  finally {
    # Restore the previous culture / UI culture values.
    [Threading.Thread]::CurrentThread.CurrentCulture = $PrevCultures[0]
    [Threading.Thread]::CurrentThread.CurrentUICulture = $PrevCultures[1]
  }
}

Original author of this code is @Scepticalist.
Run this from the PowerShell console. It will change the culture to en-US for the current session.
function Set-CultureWin([System.Globalization.CultureInfo] $culture) { [System.Threading.Thread]::CurrentThread.CurrentUICulture = $culture ; [System.Threading.Thread]::CurrentThread.CurrentCulture = $culture } ; Set-CultureWin en-US ; [system.threading.thread]::currentthread.currentculture
Then you have to use the command Get-NetTCPConnection instead of netstat. For its usage, see https://learn.microsoft.com/en-us/powershell/module/nettcpip/get-nettcpconnection?view=win10-ps
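For instance, a rough, culture-independent analog of netstat -n -a for TCP connections (a sketch, for interactive viewing):
Get-NetTCPConnection |
  Sort-Object State |
  Format-Table LocalAddress, LocalPort, RemoteAddress, RemotePort, State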

Related

Change title of another window through PowerShell

I have a VB application, which starts several instances of a third-party non-GUI application. To keep track of these multiple instances, I update their title, using the SetWindowText() function. This application however has the nasty habit of continuously updating the title, so each SetWindowText works only temporarily. As soon as you click anywhere in the screen, the title is changed back.
I found a way to update the title through PowerShell, using the following code:
$titletext = "My Title"
# Start a thread job to change the window title to $titletext
$null = Start-ThreadJob { param( $rawUI, $windowTitle )
  Start-Sleep -s 2
  if ( $rawUI.WindowTitle -ne $windowTitle ) {
    $rawUI.WindowTitle = $windowTitle
  }
} -ArgumentList $host.ui.RawUI, $titletext
& 'c:\Program Files\Application\Application.exe' '-id=userid -pass=password'
This works perfectly and the title change is permanent, which is exactly what I want. The only problem is that everything is being logged in the Windows PowerShell log, including the parameters -id= and -pass=.
A solution would be if I can start application.exe through my VB application and do the rename through a PowerShell script, but I don't know if that is possible through a ThreadJob.
Is it possible to start a ThreadJob and rename another window, maybe through its handle?
Changing the console title from inside that console is your best bet, which is what your PowerShell code does.
While it is possible to call the SetWindowText() API function to set another process' console-window title, this change isn't guaranteed to stay in effect, because any subsequent interaction with such a window causes the original window title to be restored (this behavior seems to be built into conhost.exe, the console host underlying regular console windows on Windows).
By contrast, setting the title of the console window associated with the current process does stay in effect (unless overridden again later), which is what the SetConsoleTitle() WinAPI function does (and which shell- and API-based mechanisms such as title in cmd.exe, and [Console]::Title / $host.UI.RawUI.WindowTitle in PowerShell presumably ultimately call).
Therefore, stick with your PowerShell approach and avoid the password-logging problem with the help of an environment variable, as detailed below.
Windows PowerShell's script-block logging - see about_Logging - logs the source code of script blocks as they are created.
You can prevent argument values from being logged if you - instead of providing literal arguments - provide them indirectly, via variables that you set from outside PowerShell.
Therefore:
Make your VB.NET application (temporarily) set an environment variable that contains the password. (Perhaps needless to say, storing and passing plain-text passwords is best avoided).
In your PowerShell script, refer to that environment variable instead of passing a literal password - that way, the actual password will not be shown in the logs.
For example, assuming that your VB.NET application has created environment variable MYPWD containing the password, before launching the PowerShell script:
$titletext = "My Title"
# Start a thread job to change the window title to $titletext
$null = Start-ThreadJob { param( $rawUI, $windowTitle )
  Start-Sleep -s 2
  if ( $rawUI.WindowTitle -ne $windowTitle ) {
    $rawUI.WindowTitle = $windowTitle
  }
} -ArgumentList $host.ui.RawUI, $titletext
# Note:
# * Assumes that your VB.NET application has set env. var. "MYPWD".
# * The arguments must be passed *individually*, not inside a single string.
& 'c:\Program Files\Application\Application.exe' -id=userid "-pass=$env:MYPWD"

powershell string urldecoding execution time

Performing URL decoding in a CGI hybrid script with a PowerShell one-liner:
echo %C4%9B%C5%A1%C4%8D%C5%99%C5%BE%C3%BD%C3%A1%C3%AD%C3%A9%C5%AF%C3%BA| powershell.exe "Add-Type -AssemblyName System.Web;[System.Web.HttpUtility]::UrlDecode($Input) | Write-Host"
The execution time of this one-liner is between 2-3 seconds on a virtual machine. Is it because a .NET object is employed? Is there any way to decrease the execution time? I also have a lightning-fast urldecode.exe utility written in C, but unfortunately it does not eat STDIN.
A note if you're passing the input data as a string literal, as in the sample call in the question (rather than as output from a command):
If you're calling from an interactive cmd.exe session, %C4%9B%C5%A1%C4%8D%C5%99%C5%BE%C3%BD%C3%A1%C3%AD%C3%A9%C5%AF%C3%BA works as-is - unless any of the tokens between paired % instances happen to be the name of an existing environment variable (which seems unlikely).
From a batch file, you'll have to escape the % chars by doubling them - see this answer. You can obtain the escaped string by applying the following PowerShell operation to the original string:
'%C4%9B%C5%A1%C4%8D%C5%99%C5%BE%C3%BD%C3%A1%C3%AD%C3%A9%C5%AF%C3%BA' -replace '%', '%%'
Is it because a .NET object is employed?
Yes; powershell.exe, as a .NET-based application, requires starting the .NET runtime (CLR), which is nontrivial in terms of performance.
Additionally, powershell.exe by default loads the initialization files listed in its $PROFILE variable, which can take additional time.
Pass the -NoProfile CLI option to suppress that.
I also have a lightning-fast urldecode.exe utility written in C, but unfortunately it does not eat STDIN.
If so, pass the data as an argument, if feasible; e.g.:
urldecode.exe "%C4%9B%C5%A1%C4%8D%C5%99%C5%BE%C3%BD%C3%A1%C3%AD%C3%A9%C5%AF%C3%BA"
If the data comes from another command's output, you can use for /f to capture in a variable first, and then pass the latter.
If you do need to call powershell.exe, PowerShell's CLI, after all:
There's not much you can do in terms of optimizing performance:
Add -NoProfile, as suggested.
Pass the input data as an argument.
Avoid unnecessary calls such as Write-Host and rely on PowerShell's implicit output behavior instead.[1]
powershell.exe -NoProfile -c "Add-Type -AssemblyName System.Web;[System.Web.HttpUtility]::UrlDecode('%C4%9B%C5%A1%C4%8D%C5%99%C5%BE%C3%BD%C3%A1%C3%AD%C3%A9%C5%AF%C3%BA')"
[1] Optional reading: How do I output machine-parseable data from a PowerShell CLI call?
Note: The sample commands are assumed to be run from cmd.exe / outside PowerShell.
PowerShell's CLI supports only text as output, not raw byte data.
In order to output data for later programmatic processing, you may have to explicitly ensure that what is output is machine-parseable rather than something that is meant for display only.
There are two basic choices:
Rely on PowerShell's default output formatting for outputting what are strings (text) to begin with, as well as for numbers - though for fractional and very large or small non-integer numbers additional effort may be required.
Explicitly use a structured, text-based data format, such as CSV or Json, to represent complex objects.
Rely on PowerShell's default output formatting:
If the output data is itself text (strings), no extra effort is needed. This applies to your case, and therefore simply implicitly outputting the string returned from the [System.Web.HttpUtility]::UrlDecode() call is sufficient:
# A simple example that outputs a 512-character string; note that
# printing to the _console_ (terminal) will insert line breaks for
# *display*, but the string data itself does _not_ contain any
# (other than a single _trailing_ one), irrespective of the
# console window width:
powershell -noprofile -c "'x' * 512"
If the output data comprises numbers, you may have to apply explicit, culture-invariant formatting if your code must run with different cultures in effect:
True integer types do not require special handling, as their string representation is in effect culture-neutral.
However, fractional numbers ([double], [decimal]) do use the current culture's decimal mark, which later processing may not expect:
# With, say, culture fr-FR (French) in effect, this outputs
# "1,2", i.e. uses a *comma* as the decimal mark.
powershell -noprofile -c "1.2"
# Simple workaround: Let PowerShell *stringify* the number explicitly
# in an *expandable* (interpolating) string, which uses
# the *invariant culture* for formatting, where the decimal
# mark is *always* "." (dot).
# The following outputs "1.2", irrespective of what culture is in effect.
powershell -noprofile -c " $num=1.2; \"$num\" "
Finally, very large and very small [double] values can result in exponential notation being output (e.g., 5.1E-07 for 0.00000051); to avoid that, explicit number formatting is required, which can be done via the .ToString() method:
# The following outputs 0.000051 in all cultures, as intended.
powershell -noprofile -c "$n=0.000051; $n.ToString('F6', [cultureinfo]::InvariantCulture)"
More work is needed if you want to output representations of complex objects in machine-parseable form, as discussed in the next section.
Relying on PowerShell's default output formatting is not an option in this case, because implicit output (and equivalent explicit Write-Output calls) causes the CLI to apply for-display-only formatting, which is meaningful to the human observer but cannot be robustly parsed.
# Produces output helpful to the *human observer*, but isn't
# designed for *parsing*.
# `Get-ChildItem` outputs [System.IO.FileSystemInfo] objects.
powershell -noprofile -c "Get-ChildItem /"
Note that use of Write-Host is not an alternative: Write-Host fundamentally isn't designed for data output, and the textual representations it creates for complex objects are typically not even meaningful to the human observer - see this answer for more information.
Use a structured, text-based data format, such as CSV or Json:
Note:
Hypothetically, the simplest approach is to use the CLI's -OutputFormat Xml option, which serializes the output using the XML-based CLIXML format PowerShell itself uses for remoting and background jobs - see this answer.
However, this format is only natively understood by PowerShell itself, and for third-party applications to parse it they'd have to be .NET-based and use the PowerShell SDK.
Also, this format is automatically used for both serialization and deserialization if you call another PowerShell instance from PowerShell, with the command specified as a script block ({ ... }) - see this answer. However, there is rarely a need to call the PowerShell CLI from PowerShell itself, and direct invocation of PowerShell code and scripts provides full type fidelity as well as better performance.
Finally, note that all serialization formats, including CSV and JSON discussed below, have limits with respect to faithfully representing all aspects of the data, though -OutputFormat Xml comes closest.
PowerShell comes with cmdlets such as ConvertTo-Csv and ConvertTo-Json, which make it easy to convert output to the structured CSV and JSON formats.
Using a Get-Item call to get information about PowerShell's installation directory ($PSHOME) as an example; Get-Item outputs a System.IO.DirectoryInfo instance in this case:
Use of ConvertTo-Csv:
C:\>powershell -noprofile -c "Get-Item $PSHOME | ConvertTo-Csv -NoTypeInformation"
"PSPath","PSParentPath","PSChildName","PSDrive","PSProvider","PSIsContainer","Mode","BaseName","Target","LinkType","Name","FullName","Parent","Exists","Root","Extension","CreationTime","CreationTimeUtc","LastAccessTime","LastAccessTimeUtc","LastWriteTime","LastWriteTimeUtc","Attributes"
"Microsoft.PowerShell.Core\FileSystem::C:\Windows\System32\WindowsPowerShell\v1.0","Microsoft.PowerShell.Core\FileSystem::C:\Windows\System32\WindowsPowerShell","v1.0","C","Microsoft.PowerShell.Core\FileSystem","True","d-----","v1.0","System.Collections.Generic.List`1[System.String]",,"v1.0","C:\Windows\System32\WindowsPowerShell\v1.0","WindowsPowerShell","True","C:\",".0","12/7/2019 4:14:52 AM","12/7/2019 9:14:52 AM","3/14/2021 10:33:10 AM","3/14/2021 2:33:10 PM","11/6/2020 3:52:41 AM","11/6/2020 8:52:41 AM","Directory"
Note: -NoTypeInformation is no longer needed in PowerShell (Core) 7+
Using ConvertTo-Json:
C:\>powershell -noprofile -c "Get-Item $PSHOME | ConvertTo-Json -Depth 1"
{
"Name": "v1.0",
"FullName": "C:\\Windows\\System32\\WindowsPowerShell\\v1.0",
"Parent": {
"Name": "WindowsPowerShell",
"FullName": "C:\\Windows\\System32\\WindowsPowerShell",
"Parent": "System32",
"Exists": true,
"Root": "C:\\",
"Extension": "",
"CreationTime": "\/Date(1575710092565)\/",
"CreationTimeUtc": "\/Date(1575710092565)\/",
"LastAccessTime": "\/Date(1615733476841)\/",
"LastAccessTimeUtc": "\/Date(1615733476841)\/",
"LastWriteTime": "\/Date(1575710092565)\/",
"LastWriteTimeUtc": "\/Date(1575710092565)\/",
"Attributes": 16
},
"Exists": true
// ...
}
Since JSON is a hierarchical data format, the serialization depth must be limited with -Depth in order to prevent "runaway" serialization when serializing arbitrary .NET types; this isn't necessary for [pscustomobject] and [hashtable] object graphs composed of primitive .NET types only.
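A simple way to sidestep the depth issue - and to keep the output small - is to project only the properties you actually need before serializing; a sketch:
C:\>powershell -noprofile -c "Get-Item $PSHOME | Select-Object Name, FullName, LastWriteTime | ConvertTo-Json"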

Powershell Pipeline data to external console application

I have a console application which can take standard input. It buffers up the data until the execute command, at which point it executes it all, and sends the output to standard output.
At the moment, I am running this application from Powershell, piping commands into it, and then parsing the output. The data piped in is relatively small; however this application is being called about 1000 times. Each time it is executed, it has to load, and create network connections. I am wondering whether it might be more efficient to pipeline all the commands into a single instantiation of the console application.
I have tried this by putting all the PowerShell code that manufactures the standard input for the console application into a function, then piping that function to the console application. This seems to work at first, but you eventually realise it is buffering up all the data in PowerShell until the function has finished, and only then sending it to the console's StdIn. You can see this because I have a whole load of Write-Host statements that flash by, and only then do you see the output.
e.g.
Function Run-Command1
{
  Write-Host "Run-Command1"
  "GET nethost xxxx COLS id,name"
  "EXEC"
}
Function Run-Command2
{
  Write-Host "Run-Command2"
  "GET nethost yyyy COLS id,name"
  "GET users yyyy COLS id,name"
  "EXEC"
}
...
Function Run-CommandX
{
  ...
}
Previously, I would use this as:
Run-Command1 | netapp.exe -connect QQQQ -U user -P password
Run-Command2 | netapp.exe -connect QQQQ -U user -P password
...
Run-CommandX | netapp.exe -connect QQQQ -U user -P password
But now I would like to do:
Function Run-Commands
{
  Run-Command1
  Run-Command2
  ...
  Run-CommandX
}
Run-Commands |
netapp.exe -connect QQQQ -U user -P password
Ideally, I would like the Powershell pipeline behaviour to be extended to an external application. Is this possible?
I would like the Powershell pipeline behaviour to be extended to an external application.
I have a whole load of Write-Host statements that flash by, and only then do you see the output.
Tip of the hat to marsze.
PowerShell [Core] v6+ performs no buffering at all, and sends (stringified) output as it is being produced by a command to an external program, in the same manner that output is streamed between PowerShell commands.[1]
PowerShell's legacy edition (versions up to 5.1), Windows PowerShell, buffers in that it collects all output from a command first before sending it (stringified) to an external program.
marsze's helpful answer shows a workaround based on direct use of .NET APIs.
However, I think even Windows PowerShell's behavior isn't the problem here: Your Run-Commands function executes very quickly - given that the functions it calls merely output string literals - and the resulting array of lines is then sent all at once to netapp.exe - and further processing, including when to produce output, is then up to netapp.exe. In PowerShell [Core] v6+, with PowerShell-side buffering out of the picture, the individual Run-Commmand<n> functions' output would be sent to netapp.exe ever so slightly earlier, but I wouldn't expect that to make a difference.
The upshot is that unless netapp.exe offers a way to adjust its input and output buffering, you won't be able to control the timing of its input processing and output production.
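If you want to see the PowerShell-side edition difference for yourself, here is a minimal sketch (Windows; findstr serves as a stand-in external program and is assumed to echo matches as it reads them): in PowerShell [Core] v6+ the three lines appear roughly one second apart, whereas in Windows PowerShell they appear together after about three seconds.
1..3 | ForEach-Object { "item $_"; Start-Sleep -Seconds 1 } | findstr "item"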
How PowerShell sends objects to an external program (native utility) via the pipeline:
It sends a stringified representation of each object:
in PowerShell [Core] v6+: as the object becomes available.
in Windows PowerShell: after having collected all output objects in memory first.
In other words: on the PowerShell side, from v6 onward, there is no buffering.[1]
However, receiving external programs typically do buffer the stdin (standard input) data they receive via the pipeline[2].
Similarly, external programs typically do buffer their stdout (standard output) streams (but PowerShell performs no additional buffering before passing the output on, such as to the terminal (console)).
PowerShell has no control over this behavior; either the external program itself offers an option to adjust buffering or, in limited cases on Linux, you can call the external program via the stdbuf utility.
Optional reading: How PowerShell stringifies objects when piping to external programs:
PowerShell, as of v7.1, knows only text when communicating with external programs; that is, data sent to such programs is converted to text, and output from such programs is interpreted as text - even though the underlying system IPC features are simply byte conduits.
The UTF-16-based .NET strings PowerShell uses are converted to byte streams for external programs based on the character encoding specified in the $OutputEncoding preference variable, which, regrettably, defaults to ASCII(!) in Windows PowerShell, and now sensibly to (BOM-less) UTF-8 in PowerShell [Core] v6+.
In other words: The encoding specified via $OutputEncoding must match the character encoding that the external program expects.
Conversely, it is the encoding specified in [Console]::OutputEncoding that determines how PowerShell interprets text received from an external program, i.e. how it converts the bytes received to .NET strings, line by line, with newlines stripped (which, when captured in a variable, amounts to either a single string, if only one line was output, or an array of strings).
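For instance, to make a session talk UTF-8 in both directions with external programs - especially relevant in Windows PowerShell, given its ASCII default for $OutputEncoding - a sketch:
# Encode data sent TO external programs as UTF-8, and interpret their output as UTF-8 as well.
$OutputEncoding = [Console]::OutputEncoding = [System.Text.UTF8Encoding]::new($false)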
The for-display representations you see in the PowerShell console (terminal) are also what is sent to external programs via the pipeline, as lines of text, specifically:
If an object (already) is a string (or [char] instance), PowerShell sends it as-is to the pipe, but with a platform-appropriate newline invariably appended.
That is, a CRLF newline is appended on Windows, and a LF-only newline on Unix-like platforms.
This behavior can be problematic, as there are situations where you do not want that, and there's no way to prevent it - see GitHub issue #5974, GitHub issue #13579, and this answer for a workaround.
If an object is, loosely speaking, a primitive type - something that is conceptually a single value, notably the various number types - it is stringified in a culture-sensitive manner, where available[3], and a platform-appropriate newline is again invariably appended.
E.g., with a French culture in effect (as reflected in Get-Culture), decimal fraction 1.2 - which PowerShell parses as a [double] value - is sent as 1,2<newline>.
Note that [bool] instances are not culture-sensitive and are always converted to strings True or False.
All other (complex) types are subject to PowerShell's rich for-display output formatting, and whatever you would see in the terminal (console) is also what is sent to external programs - which not only again potentially contains culture-sensitive representations, but is generally problematic in that these representations are designed for the human observer, not for programmatic processing.
The upshot:
Beware encoding problems - make sure $OutputEncoding and [Console]::OutputEncoding are set correctly.
To avoid unexpected culture-sensitivity and unexpected for-display formatting, it is best to deliberately construct the string representation you want to send.
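For example, rather than piping objects (and therefore their for-display representations) to an external program, build the exact lines yourself - a sketch, using findstr as a stand-in consumer:
# One explicit "name<TAB>size" line per file; integer sizes stringify culture-neutrally.
Get-ChildItem C:\Windows\*.exe |
  ForEach-Object { "{0}`t{1}" -f $_.Name, $_.Length } |
  findstr /i "explorer"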
[1] By default; however, you can explicitly request buffering - expressed as an object count - via the common -OutBuffer parameter.
[2] On recent macOS and Linux platforms, the stdin buffer size is 64KB. On Unix-like platforms, utilities typically switch to line-buffering in interactive invocations, i.e. when the stream in question is connected to a terminal.
[3] The behavior is delegated to the .ToString() method of a type at hand, i.e. whether or not that method outputs a culture-sensitive representation.
EDIT: As @mklement0 pointed out, this is different in PowerShell Core.
In PowerShell 5.1 (and lower), I think you would have to manually write each pipeline item to the external application's input stream.
Here's an attempt to build a function for that:
function Invoke-Pipeline {
  [CmdletBinding()]
  param (
    [Parameter(Mandatory, Position = 0)]
    [string]$FileName,
    [Parameter(Position = 1)]
    [string[]]$ArgumentList,
    [int]$TimeoutMilliseconds = -1,
    [Parameter(ValueFromPipeline)]
    $InputObject
  )
  begin {
    $process = [System.Diagnostics.Process]::Start((New-Object System.Diagnostics.ProcessStartInfo -Property @{
      FileName               = $FileName
      Arguments              = $ArgumentList
      UseShellExecute        = $false
      RedirectStandardInput  = $true
      RedirectStandardOutput = $true
    }))
    $output = [System.Collections.Concurrent.ConcurrentQueue[string]]::new()
    $event = Register-ObjectEvent -InputObject $process -EventName 'OutputDataReceived' -Action {
      $Event.MessageData.TryAdd($EventArgs.Data)
    } -MessageData $output
    $process.BeginOutputReadLine()
  }
  process {
    $process.StandardInput.WriteLine($InputObject)
    [string]$line = ""
    while (-not ($output.TryDequeue([ref]$line))) {
      Start-Sleep -Milliseconds 1
    }
    do {
      $line
    } while ($output.TryDequeue([ref]$line))
  }
  end {
    if ($TimeoutMilliseconds -lt 0) {
      # WaitForExit() without a timeout returns nothing; it only returns once the process has exited.
      $process.WaitForExit()
      $exited = $true
    }
    else {
      $exited = $process.WaitForExit($TimeoutMilliseconds)
    }
    if ($exited) {
      $process.Close()
    }
    else {
      try { $process.Kill() } catch {}
    }
  }
}
Run-Commands | Invoke-Pipeline netapp.exe "-connect QQQQ -U user -P password"
The problem is that there is no perfect solution, because by definition you cannot know when the external program will write something to its output stream, or how much.
Note: This function doesn't redirect the error stream. The approach would be the same though.

Understanding the get-culture command

I recently had some trouble with culture-dependent return values from my PowerShell script. The same script returned different values, depending on which machine it was run on.
So I thought that maybe the culture settings are different. For one server it returned:
get-culture : de-DE
for the other it was: en-US
One value is for the keyboard settings but what does the other (second) stand for?
And is the second value bound to the OS installation or is that just a setting?
Is there a command in powershell to change the value?
Of course I first read the help: get-help get-culture
DESCRIPTION
The Get-Culture cmdlet gets information about the current culture settings. This includes information about the
current language settings on the system, such as the keyboard layout, and the display format of items such as
numbers, currency, and dates.
But I am not satisfied with it.
The help for the cmdlet Get-Culture contains a subheading named Related Links. Please note the last 2 lines.
Related Links
Online Version: http://go.microsoft.com/fwlink/p/?linkid=293965
Set-Culture
Get-UICulture
When searching for help, also use the Get-Command cmdlet.
Get-Command "*culture*"
You can view your 'current culture' by using the built-in PowerShell variables:
$PSCulture
$PSUICulture
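To see both values - the formatting culture and the UI (display-language) culture - side by side:
Get-Culture   | Select-Object Name, DisplayName    # formats: dates, numbers, currency
Get-UICulture | Select-Object Name, DisplayName    # display language: messages, help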
The following code block outputs today's date using the short date pattern of three different cultures.
### Creates an array of cultureinfo objects:
$myCulturesArray = @(
  ( $myDECulture = New-Object System.Globalization.CultureInfo("de-DE") ),
  ( $myGBCulture = New-Object System.Globalization.CultureInfo("en-GB") ),
  ( $myUSCulture = New-Object System.Globalization.CultureInfo("en-US") )
);
### Outputs today's date using each CultureInfo object
$myCulturesArray | foreach {
  (Get-Date).ToString('d', $_ )
}
Further reading:
Tobias Weltner put together a very useful set of PDFs; volume 3 is on culture.
Also, at the prompt:
Get-Help Get-Culture -Full
help about_Script_Internationalization

Forcing PowerShell errors output in English on localized systems

I need to run some PowerShell scripts across various operating systems. Most of them are the English version; however, some are localized, for example German, French, Spanish, etc. The problem is that the local system administrators mostly do not know PowerShell, and in case the script fails and throws an error at them, instead of reading it they just send screenshots of such error messages to me, and if the cause of the error is not obvious I am stuck with typing it into Google Translate to find out what is going on.
Is there a switch I can run the whole script or single command with or a parameter or any other way to force errors in PowerShell to be displayed in English instead of the language that is default for that particular machine?
You can change the pipeline thread's CurrentUICulture like so:
[Threading.Thread]::CurrentThread.CurrentUICulture = 'fr-FR'; Get-Help Get-Process
I'm on an English system but before I executed the line above, I updated help like so:
Update-Help -UICulture fr-FR
With that, the Get-Help call above gave me French help on my English system. Note: if I put the call to Get-Help on a new line, it doesn't work. Confirmed that PowerShell resets the CurrentUICulture before the start of each pipeline, which is why it works when the commands are in the same pipeline.
In your case, you would need to have folks install English help using:
Update-Help -UICulture en-US
And then execute your script like so:
[Threading.Thread]::CurrentThread.CurrentUICulture = 'en-US'; .\myscript.ps1
[Threading.Thread]::CurrentThread.CurrentUICulture only affects the current one-liner, so you can use it for the execution of a single .ps1 file.
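Building on that, a sketch that runs a script with en-US in effect and restores the previous UI culture afterwards, all submitted as a single command (so Windows PowerShell's per-prompt reset doesn't interfere); myscript.ps1 stands in for your own script:
$prev = [Threading.Thread]::CurrentThread.CurrentUICulture; [Threading.Thread]::CurrentThread.CurrentUICulture = 'en-US'; try { .\myscript.ps1 } finally { [Threading.Thread]::CurrentThread.CurrentUICulture = $prev }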
If you want to change messages to English throughout every command in a PowerShell window, you have to change the culture setting cached in the PowerShell runtime with reflection, like this:
# example: Set-PowerShellUICulture -Name "en-US"
function Set-PowerShellUICulture {
  param([Parameter(Mandatory=$true)]
        [string]$Name)
  process {
    $culture = [System.Globalization.CultureInfo]::CreateSpecificCulture($Name)
    $assembly = [System.Reflection.Assembly]::Load("System.Management.Automation")
    $type = $assembly.GetType("Microsoft.PowerShell.NativeCultureResolver")
    $field = $type.GetField("m_uiCulture", [Reflection.BindingFlags]::NonPublic -bor [Reflection.BindingFlags]::Static)
    $field.SetValue($null, $culture)
  }
}
(from https://gist.github.com/sunnyone/7486486)