In my scenario, I have a PowerShell script that receives standard input. What I would like to do is start a subprocess using an arbitrary command line and redirect the standard input of the PowerShell script to that subprocess. In other words, I simply want to pass the standard input down to the subprocess.
I have a few ideas on how to do this with loops, but is there a more elegant way?
Not a full answer, but you might have some trouble with this in certain cases, because PowerShell.exe waits for all of the input before it even begins executing your script. I ran into an issue where a process calling PowerShell didn't close its stream so PowerShell waited forever.
The solution was to use an undocumented option (powershell.exe -InputFormat None) and then read the input manually byte by byte; all methods that read more than 1 byte at a time are susceptible to blocking forever, at least as far as I could tell.
You could probably work around that with asynchronous methods.
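The byte-by-byte workaround can be sketched in Python (which the wider context below also uses); this is an illustration of the idea, not the original PowerShell script, and the child command here is a hypothetical stand-in:

```python
import io
import subprocess
import sys

def forward_stdin(cmd, source):
    """Forward *source* (a binary file object) to *cmd*, one byte at a time.

    Reading a single byte per call mirrors the workaround above: any read
    of more than one byte can block until the peer closes its end.
    Returns the child's exit code.
    """
    child = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    while True:
        b = source.read(1)          # b'' at end of stream
        if not b:
            break
        child.stdin.write(b)
    child.stdin.close()             # signal EOF to the child
    return child.wait()

# In the real script the source would be sys.stdin.buffer; a BytesIO
# stands in for it here so the snippet is self-contained.
code = forward_stdin(
    [sys.executable, "-c", "import sys; sys.stdout.write(sys.stdin.read())"],
    io.BytesIO(b"hello\n"),
)
```

In practice you would pass `sys.stdin.buffer` as the source so the child sees exactly what the wrapper received.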
The task is to close the stdout handle a while before the process exits. With WinAPI functions, it'd be this:
CloseHandle(GetStdHandle(STD_OUTPUT_HANDLE))
I know I can do DllImport with Add-Type but I believe there must be a nicer way.
What's the simplest way to accomplish the same with PowerShell?
The wider task is to test a piece of a Python library that starts and flexibly interacts with local (via the subprocess and _winapi modules) or remote (via WinRM) processes on Windows. One of the tests runs a program that closes its ends of the stdout and stderr pipes a while before it exits. (There was a similar issue on Linux.) Therefore, the script must close stdout and stderr so that the calling code is signalled by the OS that they're closed. The only way I found is to call CloseHandle on the stdout and stderr handles. Calling .Close or .Dispose on the stream objects doesn't help: they seem to be closed only internally to the called process. The script should be in some "native" language that needs no additional compilers or interpreters, so it's either cmd, VBScript or PowerShell, and only the last can call WinAPI functions. (At the time of this update I have already written scripts in both Python, which works perfectly but needs an interpreter to be installed, and PowerShell, which works without any additional installations but is a bit cumbersome and very slow.)
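Since the question mentions a working Python version, here is a minimal sketch of the same idea in Python: close the OS-level descriptors (not just the stream wrappers) well before exit, so the parent sees EOF while the process is still alive. The two-second sleep is an arbitrary stand-in for "a while before it exits":

```python
import subprocess
import sys
import time

# Child: writes a line, closes the OS-level stdout/stderr descriptors,
# then keeps running.  Closing only the Python file objects would not
# signal the parent; os.close() releases the actual descriptor.
CHILD = (
    "import os, sys, time\n"
    "sys.stdout.write('done writing\\n')\n"
    "sys.stdout.flush()\n"
    "os.close(sys.stdout.fileno())\n"
    "os.close(sys.stderr.fileno())\n"
    "time.sleep(2)\n"
)

proc = subprocess.Popen([sys.executable, "-c", CHILD],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out = proc.stdout.read()        # returns at EOF, well before the child exits
eof_time = time.monotonic()
proc.wait()
exit_time = time.monotonic()    # roughly two seconds after the EOF
```

The parent's `read()` returning before `wait()` is exactly the signal the test wants to observe.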
I have a PowerShell script that I am converting from a foreground infinite while loop to a scheduled task. The biggest problem is that I would still like to maintain a log. Start-Transcript did the logging previously, but that doesn't work in a background task.
These links (1, 2) show similar questions, but they only state that Start-Transcript won't work; they don't give any indication of how it could be done.
Basically you can do two things:
Add logging routines to your script (see for instance here).
Run the script like this:
powershell.exe -Command "&{your.ps1 *> your.log; exit $LASTEXITCODE}"
Personally I'd prefer the former, but it'd require more changes to your code.
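The second option's contract (redirect all output to a log file, preserve the exit code) can be sketched in Python for comparison; `run_logged` is a hypothetical helper name, not part of any library:

```python
import subprocess

def run_logged(cmd, log_path):
    """Run *cmd* appending everything it prints (stdout and stderr)
    to *log_path*, and return its exit code -- the same contract as
    `your.ps1 *> your.log; exit $LASTEXITCODE` above."""
    with open(log_path, "ab") as log:
        return subprocess.call(cmd, stdout=log, stderr=subprocess.STDOUT)
```

Appending (`"ab"`) rather than truncating keeps logs across repeated scheduled runs.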
Before you answer, I'm not looking for the functionality of ; to suppress command line printing.
I have a set of scripts which are not mine and I do not have the ability to change. However, in my scripts I make a call to these other scripts through evalin('base', 'scriptName'). Unfortunately, these other scripts do a lot of unnecessary and ugly printing to the command window that I don't want to see. Without being able to edit these other scripts, I would like a way to suppress output to the command line for the time that these other scripts are executing.
One potential answer was to use evalc, but when I try evalc(evalin('base', 'scriptName')) MATLAB throws an error complaining that it cannot execute a script as a function. I'm hoping there's something like the ability to disable command window printing or else redirecting all output to some null file much like /dev/null in unix.
I think you just need to turn the argument in your evalc example into a string:
evalc('evalin(''base'', ''scriptName'')');
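For comparison, the same capture-instead-of-print idea looks like this in Python; `call_silenced` is an illustrative helper, not a standard function:

```python
import contextlib
import io

def call_silenced(fn, *args, **kwargs):
    """Call *fn* with everything it prints captured into a string,
    the way evalc captures a command's console output in MATLAB.
    Returns (result, captured_text)."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        result = fn(*args, **kwargs)
    return result, buf.getvalue()
```

As with evalc, the captured text is returned rather than shown, so noisy code runs quietly.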
Have you tried this solution?
echo off;
I don't know if it will fit your needs, but another solution is to open a new MATLAB session with -nodesktop (just the command window) and keep it minimized. You can run the annoying scripts there and work in the main session as usual.
The problem is that the sessions can't be synchronized, so if you need to work with the results of the scripts all the time it'll be a little complicated. Maybe you can save the results to disk, then load them from the main session...
But it mainly depends on your workflow with those scripts.
I have an old, third party, command line, proprietary program which I'm calling from PowerShell.
Once started, this program accepts commands typed in followed by enter (like any other program), but it's very basic. It doesn't have flags, doesn't accept piped in arguments, etc. You have to start the program, type your command, hit enter and parse the results.
Is there a way I can use PowerShell to type in a command and get the resulting output? Right now the best solution I have is to call SendKeys.Send in a background job, but I'm not sure this will work.
Is there a better way?
Check out this to see if it would work for you: http://wasp.codeplex.com/
Legacy programs are hard to predict, however; this works with standard Windows programs.
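An alternative to simulating keystrokes is to talk to the program over its standard streams, sketched here in Python. This only works if the legacy tool reads commands from stdin; a program that reads the console directly won't see piped input. The child below is a hypothetical stand-in for the real tool:

```python
import subprocess
import sys

# Stand-in for the legacy tool: reads a command per line and prints
# one line of results.  The real program would replace this.
CHILD = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    print('RESULT:', line.strip())\n"
    "    sys.stdout.flush()\n"
)

proc = subprocess.Popen([sys.executable, "-c", CHILD],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                        text=True, bufsize=1)
proc.stdin.write("status\n")        # "type" the command and press enter
proc.stdin.flush()
reply = proc.stdout.readline()      # read the resulting line to parse
proc.stdin.close()                  # EOF ends the session
proc.wait()
```

Line buffering (`bufsize=1` with `text=True`) keeps each command/response round trip prompt.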
In Perl, without using the Thread library, what is the simplest way to spawn off a system call so that it is non-blocking? Can you do this while avoiding fork() as well?
EDIT
Clarification. I want to avoid an explicit and messy call to fork.
Do you mean like this?
system('my_command_which_will_not_block &');
As Chris Kloberdanz points out, this will call fork() implicitly -- there's really no other way for perl to do it; especially if you want the perl interpreter to continue running while the command executes.
The & character in the command is a shell metacharacter -- perl sees it and passes the argument to system() to the shell (/bin/sh, which is often bash) for execution, rather than running it directly with an execv() call. & tells the shell to fork again, run the command in the background, and exit immediately, returning control to perl while the command continues to execute.
The post above says "there's no other way for perl to do it", which is not true.
Since you mentioned file deletion, take a look at IO::AIO. This performs the system calls in another thread (a POSIX thread, not a Perl pseudo-thread); you schedule the request with aio_rmtree and when it's done, the module calls a function in your program. In the meantime, your program can do anything else it wants.
Doing things in another POSIX thread is actually a generally useful technique. (A special hacked version of) Coro uses it to preempt coroutines (time slicing), and EV::Loop::Async uses it to deliver event notifications even when Perl is doing something other than waiting for events.
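For contrast, the "spawn and keep going" behaviour the question asks for looks like this in Python; as with Perl's system("... &"), the fork/exec still happens, just hidden inside the library:

```python
import subprocess
import sys

# Popen starts the child and returns immediately, so the interpreter
# keeps running while the command executes -- no explicit fork in the
# caller's code.
proc = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(1)"])
launched_running = proc.poll() is None   # child has not finished yet
proc.wait()                              # collect the exit status later
```

The call is non-blocking at launch; the blocking, if any, is deferred to wherever you choose to wait().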