PowerShell stderr redirect to file inserts newlines

Edit: I created a PowerShell UserVoice "suggestion" for (against?) this behavior; feel free to upvote.
PowerShell (5.1.16299.98, Windows 10 Pro 10.0.16299) is inserting newlines into my stderr when I redirect to file—as if to format for the console. Let's generate error messages of arbitrary length:
class Program
{
    static void Main(string[] args)
    {
        System.Console.Error.WriteLine(new string('x', int.Parse(args[0])));
    }
}
I compiled the above to longerr.exe. Then I call it like this:
$ .\longerr.exe 60 2>error.txt
I ran the following script in a PowerShell console with window width 60:
$h = '.\longerr.exe : '.Length
$w = 60 - 1
$f = 'error.txt'
Remove-Item $f -ea Ignore
(($w-$h), ($w-$h+1), ($w), ($w+1), ($w*2-$h), ($w*2-$h+1)) |
% {
    $_ >> $f
    .\longerr.exe $_ 2>>$f
}
Now in a wider console I ran the following:
$ Get-Content $f | Select-String '^(?![+\t]|At line| )'
(I could have just opened the file in a text editor and trimmed lines.) Here's the output:
43
.\longerr.exe : xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
44
.\longerr.exe :
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
59
.\longerr.exe :
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
60
.\longerr.exe : xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxx
102
.\longerr.exe : xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
103
.\longerr.exe : xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
x
Why is PowerShell doing this? Can I make it stop? I'd rather not have to do something like this:
.\longerr.exe $_ 2>&1 |
% {
    if ($_ -is [System.Management.Automation.ErrorRecord]) {
        $_.Exception.Message | Out-File -FilePath $f -Append
    }
}
First, all the opening and closing of that file can be slow (I know I can add more code and use a StreamWriter), and second, there is still a problem (bug?) with that approach, which I won't go into in this question.
For a sanity check, I ran longerr.exe 1000 2>test.txt in cmd.exe; it inserted no spurious linebreaks.
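(Incidentally, the same cmd.exe-style redirection is available from the PowerShell prompt by letting cmd.exe do the redirect itself; that's a workaround rather than an explanation:)
cmd /c '.\longerr.exe 1000 2>test.txt'   # cmd.exe handles the 2>, so PowerShell never wraps stderr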

Thanks to TessellatingHeckler's comment pointing me to the question Why does PowerShell chops message on stderr?, I've been able to fix both problems in my question—the main one and the one I mentioned at the end. The key is the following:
The complete output on stderr of the executable is simply split across several objects of type System.Management.Automation.ErrorRecord. The actual splitting seems to be non deterministic (*). Moreover, the partial strings are stored inside the property Exception instead of TargetObject. Only the first ErrorRecord has a non-null TargetObject.
⋮
(*) It depends on the order of write/flush calls of the program in relation to the read calls of the Powershell. If one adds a fflush(stderr) after each fprintf() in my test program below, there will be much more ErrorRecord objects. Except the first one, which seems deterministic, some of them include 2 output lines and some of them 3.
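A quick way to see that splitting for yourself (a minimal sketch, assuming the single-argument longerr.exe built above) is to merge stderr into the pipeline and inspect each record:
.\longerr.exe 2000 2>&1 | % {
    if ($_ -is [System.Management.Automation.ErrorRecord]) {
        [pscustomobject]@{
            HasTargetObject = ($null -ne $_.TargetObject)   # only the first record should be True
            MessageLength   = $_.Exception.Message.Length   # length of this fragment of the stderr output
        }
    }
}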
With this, I was able to modify longerr.exe to also reproduce the bug I alluded to at the end:
class Program
{
    static void Main(string[] args)
    {
        if (args.Length == 1)
        {
            System.Console.Error.WriteLine(new string('x', int.Parse(args[0])));
        }
        else
        {
            for (int i = 0; i < int.Parse(args[1]); i++)
            {
                System.Console.Error.WriteLine("\n");
                System.Console.Error.WriteLine(new string('x', int.Parse(args[0])));
            }
        }
    }
}
Here's the PowerShell script which works (and efficiently):
$p_out = 'success.txt'
$p_err = 'error.txt'
try
{
    [Environment]::CurrentDirectory = $PWD
    $append = $false
    $out = [System.IO.StreamWriter]::new($p_out, $append)
    $err = [System.IO.StreamWriter]::new($p_err, $append)
    .\longerr.exe 2000 4 2>&1 |
    % {
        if ($_ -is [System.Management.Automation.ErrorRecord]) {
            # https://stackoverflow.com/a/33858097/2328341
            if ($_.TargetObject -ne $null) {
                $err.WriteLine();
            }
            $err.Write($_.Exception.Message)
        } else {
            $out.WriteLine($_)
        }
    }
}
finally
{
    $out.Close()
    $err.Close()
}
Notes:
Without the test for TargetObject, I would eliminate the main problem I was focusing on in my question, but I would still get the "bug" I mentioned at the end, which the linked SO question addresses.
I could have used Out-File (with -NoNewline) instead of StreamWriter, but there are two problems with that:
Windows Defender was making all the file opens and closes over an order of magnitude slower than when I turned off "Real-time protection" (either globally or on the directory containing error.txt and success.txt).
Even without Defender slowing things down, StreamWriter out-performs Out-File by over an order of magnitude. For reference, I'm using a Samsung 960 EVO for storage.
The StreamWriter(string, bool) constructor writes UTF-8 with no Byte-Order Mark (BOM), while PowerShell 5.1's redirection operators > and >> use UTF-16 LE with BOM. For reference, PowerShell 6.0 defaults to UTF-8 with no BOM.
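A rough way to verify those encodings (a sketch; point it at whichever output file you want to inspect, e.g. the error.txt produced above) is to dump the first few bytes:
[System.IO.File]::ReadAllBytes("$PWD\error.txt")[0..2] | % { '{0:X2}' -f $_ }
# FF FE ..      -> UTF-16 LE with a BOM (what > and >> produce in PowerShell 5.1)
# EF BB BF      -> UTF-8 with a BOM
# anything else -> no BOM (what the StreamWriter above produces)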
(I've included stdout for completeness in real-world situations.) Now, that's an absolutely ridiculous amount of work to get the following functionality of cmd.exe:
$ .\longerr.exe 2000 4 2>error.txt

Related

"sh: 1: file: not found" thrown in Perl

So this is an issue I see thrown around on several coding help sites, always with a slight variation. I'm not entirely familiar with what it means, and what's even more curious is that this error is thrown midway through a larger Upload.pm script and does not cause any sort of fatal error. It gets tossed into my error log somewhere during this unless conditional snippet:
# If this is the first slice, validate the file extension and mime-type.
# Mime-type of following slices should be "application/octet-stream".
unless ( defined $response{'error'} ) {
    if ( $slice->{'index'} == 1 ) {
        my ($filename, $directory, $extension) = fileparse($path.$parent_file, qr/\.[^.]*/);
        unless ( is_valid_filetype($slice->{'tmp_file'}, $extension) ) {
            $response{'error'} = "Invalid file type.";
            $response{'retry'} = 0;
        }
    }
}
Now, let me be perfectly honest. I don't really understand the error message, and I could really use some help understanding it, as well as solving it.
Our Perl-based web app has refused to let us upload files correctly since upgrading to Debian Bullseye, and I've been stuck debugging this code I didn't write for a few days now. I'm wondering if the upgrade deprecated some Perl modules, or if the paths to those modules no longer work?
I'm testing this in an Ubuntu-based Docker environment running Debian Bullseye with an Apache 2 server.
If you need any more context, clarification, etc, please let me know.
is_valid_filetype() looks like this:
sub is_valid_filetype
{
    my ($tmp_file, $extension) = @_;
    if ( $tmp_file && $extension ) {
        # Get temp file's actual mime-type.
        my $mime = qx/file --mime-type -b '${tmp_file}'/;
        $mime =~ s/^\s+|\s+$//g;
        # Get valid mime-types matching this extension.
        my $dbh = JobTracker::Common::dbh or die("DBH not available.");
        my $mime_types = $dbh->selectrow_array('SELECT `mime_types` FROM `valid_files` WHERE `extension` = ?', undef, substr($extension, 1));
        if ( $mime && $mime_types ) {
            if ( $mime_types !~ /,/ ) {
                # Single valid mime-type for this extension.
                if ( $mime eq $mime_types ) {
                    return 1;
                }
            } else {
                # Multiple valid mime-types for this extension.
                my %valid_mimes = map { $_ => 1 } split(/,/, $mime_types);
                if ( defined $valid_mimes{$mime} ) {
                    return 1;
                }
            }
        }
    }
    return 0;
}
It's a message from sh (not Perl). It concerns an error on line 1 of the script, which was apparently an attempt to run the file utility. But sh couldn't find it.
The code in question executes this command using
qx/file --mime-type -b '${tmp_file}'/
Install file or adjust the PATH so it can be found.
Note that this code suffers from a code injection bug. It will fail if the string in $tmp_file contains a single quote ('), possibly resulting in the unintentional execution of code.
Fixed:
use String::ShellQuote qw( shell_quote );
my $cmd = shell_quote( "file", "--mime-type", "-b", $tmp_file );
qx/$cmd/
Debian Bullseye was reading our CSV files as the wrong mime-type: the file command reported them as application/csv, despite the files obviously not being an application.
This may be an actual bug in Bullseye, because both my boss and I have scoured the internet with no luck finding anyone else with this issue. I may even report it to Bullseye's devs for further awareness.
The fix was manually adding in our own mime-types so the files were interpreted correctly.
It took us dumping the tmp directory to confirm the files existed, and triple-checking that I had my modules installed.
This was such a weird and crazy upstream issue that neither of us could have imagined it would be the file-type interpretation at an OS level in Bullseye.
I really hope this helps someone and saves them the time it took us to find this.

Running Access Macro in Powershell

I'm trying to run an Access 2010 macro in PowerShell (v4.0 Windows 8.1) with the below code:
$Access = New-Object -com Access.Application
$Access.OpenCurrentDatabase("SomePath", $False, "Password")
$Access.Run("SomeProc")
$Access.CloseCurrentDatabase()
$Access.Quit()
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($Access)
Remove-Variable Access
I get an error on the line $Access.Run("SomeProc") that there's not enough parameters specified:
Exception calling "Run" with "1" argument(s): "Invalid number of parameters. (Exception
from HRESULT: 0x8002000E (DISP_E_BADPARAMCOUNT))"
The procedure SomeProc does not require any parameters.
I've read the MSDN article on the Run method, and only one parameter is required.
I've also tried this workaround which also failed to work for an unrelated reason.
Does anyone know what the cause of the error could be and how to get the method working?
This is a driver issue where the OLEDB libraries aren't loading correctly.
I was able to reproduce your error exactly, and I was able to work around it by opening Powershell from your SysWow directory instead of System32.
Try opening this version of Powershell (you'll have to run set-executionpolicy again), and see if it'll execute your script.
%SystemRoot%\syswow64\WindowsPowerShell\v1.0\powershell.exe
Helpful link: https://social.msdn.microsoft.com/Forums/en-US/4500877f-0031-426e-869d-bda33d9fe254/microsoftaceoledb120-provider-cannot-be-found-it-may-not-be-properly-installed?forum=adodotnetdataproviders
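If you're not sure which flavor of PowerShell a given session is, a quick check (just a diagnostic aside, not part of the fix) is:
# True in the default 64-bit PowerShell; False in the SysWOW64 copy above.
[Environment]::Is64BitProcess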
The C# signature is something like this:
public object Run(string Procedure, ref object Arg1, ... ref object Arg30) ...
It means that the Arg arguments, which are optional in COM, are not optional from .NET because they are explicitly marked as ref. You need to supply all 31 parameters (the procedure name plus the 30 Arg slots) even if you don't use them.
Assuming you have the following VBA code:
Public Sub Greeting(ByVal strName As String)
    MsgBox "Hello, " & strName & "!", vbInformation, "Greetings"
End Sub
You can call it like this:
$Access = New-Object -com Access.Application
$Access.OpenCurrentDatabase("Database1.accdb")
$runArgs = @([System.Reflection.Missing]::Value) * 31
$runArgs[0] = "Greeting" #Method Name
$runArgs[1] = "Jeno" #First Arg
$Access.GetType().GetMethod("Run").Invoke($Access, $runArgs)
In your case it will be:
$runArgs = @([System.Reflection.Missing]::Value) * 31
$runArgs[0] = "SomeProc"
$Access.GetType().GetMethod("Run").Invoke($Access, $runArgs)
I would probably try to add a helper to the access object:
Add-Member -InputObject $Access -MemberType ScriptMethod -Name "Run2" -Value {
    $runArgs = @([System.Reflection.Missing]::Value) * 31
    for ($i = 0; $i -lt $args.Length; $i++) { $runArgs[$i] = $args[$i] }
    $this.GetType().GetMethod("Run").Invoke($this, $runArgs)
}
Then you can use Run2 as you would expect:
$Access.Run2("Greeting", "Jeno")
$Access.Run2("SomeProc")

Sed : Add a line at the starting of each TCL proc

I have a Tcl proc like this and want to add a line (the puts " entered myproc" line) right after the start of the proc body:
proc myproc { {filename "input.txt"}
{var1 "x"}
{var2 "y"}
{var3 "z"}
{var4 ""}
{var5 "0"}
{var6 "0"}
{var7 0}
} {
puts " entered myproc"
Can you help?
It should also work for:
proc myproc2 { N val } {
    puts " entered myproc"
    # comment line
    set ret {}
    for { set i 0 } { $i < $N } { incr i } { lappend ret $val }
    return $ret
}
If all you want to do is get an execution trace of your code, such as a call stack dump etc, then you don't need to modify your source code at all. You can use tcl itself to do it for you.
Tcl has no reserved keywords, none at all. Not even proc is reserved. You can therefore redefine it:
rename proc _proc
# Now proc no longer exists but we have _proc instead.
# Use it to redefine "proc":
_proc proc {name arguments body} {
    set body "puts \"entered $name\";$body"
    _proc $name $arguments $body
}
Just do that before running any of your own code and you'll find that every proc prints out when it's being entered on each call.
This is how a lot of Tcl debuggers and profilers work - using Tcl to redefine itself.
From your comments it looks like you're trying to also print how deep the stack is with each call. To do that you need to add more code to each proc definition. The most straightforward way is of course something like this:
_proc proc {name arguments body} {
    set preamble "set dist2top \[info level\];puts \"\$dist2top entered $name\""
    set body "$preamble;$body"
    _proc $name $arguments $body
}
But as you can see, writing code inside strings can quickly become unmanageable. There are several tricks you can use to make it more manageable. One of the more common is to split $body by line and use list commands to manipulate the code; that removes at least one level of quoting hell. My favorite is to use a templating technique similar to how you'd write HTML templates in MVC frameworks. I usually use string map for this:
_proc proc {name arguments body} {
    _proc $name $arguments [string map [list %NAME% $name %BODY% $body] {
        set dist2top [info level]
        puts "$dist2top entered: %NAME%"
        %BODY%
    }]
}
The last argument in the _proc definition is just a string but it looks like a code block which makes it easier to read. No nasty quoting hell with this technique.
Using awk you can do:
awk '/^ *proc/ {$0 = $0 "\nputs \" entered myproc\""} 1' RS= proc-file.tcl
Gives this file:
proc myproc { {filename "input.txt"}
{var1 "x"}
{var2 "y"}
{var3 "z"}
{var4 ""}
{var5 "0"}
{var6 "0"}
{var7 0}
} {
puts " entered myproc"

Calling a local function from a dot sourced file

I have a main script that I am running. What it does is read through a directory filled with other powershell scripts, dot includes them all and runs a predefined method in each made up of the first portion of the dot delimited file name. Example:
Run master.ps1
Master.ps1 dot sources .\resource\sub.ps1
Sub.ps1 has defined a function called 'dosub'
Master.ps1 runs 'dosub' using Invoke-Expression
Also defined in sub.ps1 is the function 'saysomething'. Implemented in 'dosub' is a call to 'saysomething'.
My problem is I keep getting the error:
The term 'saysomething' is not recognized as the name of a cmdlet,
function, script file, or operable program. Check the spelling of the
name, or if a path was included, verify that the path is correct and
try again.
Why can't the method 'dosub' find the method 'saysomething' which is defined in the same file?
master.ps1:
$handlersDir = "handlers"
$handlers = @(Get-ChildItem $handlersDir)
foreach ( $handler in $handlers ) {
    . .\$handlersDir\$handler
    $fnParts = $handler.Name.split(".")
    $exp = "do" + $fnParts[0]
    Invoke-Expression $exp
}
sub.ps1:
function saysomething() {
    Write-Host "I'm here to say something!"
}
function dosub() {
    saysomething
    Write-Host "In dosub!"
}
Your code works on my system. However you can simplify it a bit:
$handlersDir = "handlers"
$handlers = @(Get-ChildItem $handlersDir)
foreach ( $handler in $handlers )
{
    . .\$handlersDir\$handler
    $exp = "do" + $handler.BaseName
    Write-Host "Calling $exp"
    & $exp
}
Note the availability of the BaseName property. You also don't need to use Invoke-Expression; you can just invoke the named command using the call operator (&).
What you have given works as needed. You probably don't have the directory structure set up properly on your machine, or you are running something else and posting different (working!) code here.
You can also make the following corrections:
. .\$handlersDir\$handler
Instead of the above you can do:
. $handler.fullname
Instead of splitting the filename you can do:
$exp = "do" + $handler.basename

Output redirection still with colors in PowerShell

Suppose I run msbuild like this:
function Clean-Sln {
    param($sln)
    MSBuild.exe $sln /target:Clean
}
Clean-Sln c:\temp\SO.sln
In the PowerShell console the output is colored. That's pretty handy: you can spot things just from the colors as the output scrolls by, and unimportant messages, for example, are grey.
Question
I'd like to add the ability to redirect it somewhere, like this (simplified example):
function Clean-Sln {
    param($sln)
    MSBuild.exe $sln /target:Clean | Redirect-AccordingToRedirectionVariable
}
$global:Redirection = 'Console'
Clean-Sln c:\temp\SO.sln
$global:Redirection = 'TempFile'
Clean-Sln c:\temp\Another.sln
If I use 'Console', the cmdlet/function Redirect-AccordingToRedirectionVariable should output the msbuild messages with colors, the same way as if the output were not piped. In other words, it should leave the output as it is.
If I use 'TempFile', Redirect-AccordingToRedirectionVariable will store the output in a temp file.
Is it even possible? I guess it is not :|
Or do you have any advice how to achieve the goal?
Possible solution:
if ($Redirection -eq 'Console') {
    MSBuild.exe $sln /target:Clean | Redirect-AccordingToRedirectionVariable
} else {
    MSBuild.exe $sln /target:Clean | Out-File c:\temp.txt
}
But if you imagine there can be many, many msbuild calls, it's not ideal.
Don't be shy to tell me any new suggestion for how to cope with it ;)
Any background info about redirection/coloring/output is welcome as well.
(The problem is not msbuild specific; it touches any application that writes colored output.)
Yeah I would avoid piping colored output. At that point, AFAICT, all color info is lost.
I would recommend using the /filelogger and /noconsolelogger parameters on MSBuild e.g.:
function Invoke-MSBuild($project, [string[]]$targets, [switch]$logToFile) {
    $OFS = ';'
    $targetArg = if ($targets) {"/t:$targets"} else {''}
    if ($logToFile) {
        msbuild.exe $project $targetArg /filelogger /noconsolelogger
    }
    else {
        msbuild.exe $project $targetArg
    }
}
or you could do something even simpler like this:
function Invoke-MSBuild($project, [string[]]$targets, $logFile) {
    $OFS = ';'
    $targetArg = if ($targets) {"/t:$targets"} else {''}
    if ($logFile) {
        msbuild.exe $project $targetArg > $logFile
    }
    else {
        msbuild.exe $project $targetArg
    }
}
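For example, with the second variant defined, usage along the lines of the question might look like this (the log path just reuses the c:\temp.txt name from your sample):
Invoke-MSBuild c:\temp\SO.sln -targets Clean                      # colored console output
Invoke-MSBuild c:\temp\SO.sln -targets Clean -logFile c:\temp.txt # redirected to a file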