Python Windows how to get STDOUT data in real time? - subprocess

I have a Windows executable that I want to run over and over. The problem is that sometimes there's an error about one second in, but the program doesn't exit. So what I would like to do is grab the contents of stdout, recognize there is an error, then kill the subprocess and start it over.
When I run this executable directly, its output prints to the screen just fine. But when I wrap it in a subprocess from Python, nothing shows up on stdout until the program terminates.
I've tried basically everything posted here with no luck:
Constantly print Subprocess output while process is running
Here's my current code; I replaced the executable with a second Python program just to remove any other weird variables:
parent_program.py:
import subprocess, os, sys
program = "python "+os.path.dirname(os.path.abspath(__file__)) + "/child_program.py"
with subprocess.Popen(program, shell=True, stdout=subprocess.PIPE, bufsize=1, universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='')
child_program.py:
from time import sleep
for i in range(0,10):
    print(i)
    sleep(1)
What I would expect is to see 0, 1, 2, 3, ... printed one number per second, as if I had just run python child_program.py, but instead I get nothing for 10 seconds and then get all the output at once.
I also thought about running the program from the CMD prompt and redirecting stdout to a file, python child_program.py > output.txt 2>&1, and then having Python read that file, but it's the same problem: the file doesn't get written until the program terminates.
Is there any way to fix this on Windows?
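
For this reproduction, the delay is in the child, not the parent: like most C-runtime programs, Python fully buffers its stdout when it is connected to a pipe instead of a console. Below is a minimal sketch of a fix plus the kill-and-restart loop the question describes, assuming the child really is a Python script (-u and PYTHONUNBUFFERED are Python-specific; the "error" substring and the child path are hypothetical placeholders):

import subprocess
import sys

# Sketch only: -u (equivalently, PYTHONUNBUFFERED=1 in the child's
# environment) forces the child Python to write stdout unbuffered.
# A native .exe buffers inside its own C runtime, so -u cannot help there.
cmd = [sys.executable, "-u", "child_program.py"]

while True:  # restart the child whenever we kill it
    with subprocess.Popen(cmd, stdout=subprocess.PIPE,
                          stderr=subprocess.STDOUT,
                          universal_newlines=True, bufsize=1) as p:
        for line in p.stdout:
            print(line, end="")
            if "error" in line.lower():  # hypothetical failure marker
                p.kill()                 # stop this run; the outer loop restarts it
                break
        else:
            break  # child finished cleanly, so stop restarting

For the real executable, the buffering happens inside its own runtime, so the options are a flush in its source or making it believe stdout is a console; the next related question is about exactly that.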

Related

How to force subprocess to refresh stdout buffer?

Platform: Windows 8.1
IDE: VS2013
Language: C/C++
Process A reads the stdout of a subprocess through a pipe redirect, but the subprocess doesn't invoke fflush after printf, so process A can't read anything from the pipe until the subprocess runs to completion.
PS: I have the source code of the subprocess, but it is very hard to modify.
So, can process A force the subprocess to flush its stdout buffer, so that something can be read before the subprocess runs to completion (the same effect as fflush)?
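
There is no supported way to reach into another process and flush its buffers for it. The usual workaround is to make the child believe its stdout is a console, because the C runtime fully buffers a pipe but line-buffers (or leaves unbuffered) a terminal. Here is a minimal sketch of that technique using a pseudo-terminal, in Python for brevity; it is POSIX-only as written (on Windows 8.1 you would need a helper such as winpty, since ConPTY only appeared in Windows 10), and ./child is a hypothetical program:

import os
import pty
import subprocess

# POSIX-only sketch: run the child on a pseudo-terminal so its C runtime
# treats stdout as a console and stops fully buffering it.
master_fd, slave_fd = pty.openpty()
proc = subprocess.Popen(["./child"],  # hypothetical child that never flushes
                        stdout=slave_fd, stderr=slave_fd, close_fds=True)
os.close(slave_fd)  # keep only the master end in this process

while True:
    try:
        data = os.read(master_fd, 1024)  # arrives as the child prints, not at exit
    except OSError:  # Linux raises EIO on the master once the child exits
        break
    if not data:
        break
    print(data.decode(errors="replace"), end="")
proc.wait()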

can't find perl error log

I have a Perl file (e.g. test.pl) which does some DB operations.
While testing, it works fine.
I execute this file as a background process by using the command
perl test.pl &
It worked properly for some days.
But after some days, the file's execution stopped.
How can I find the reason or view the error?
I checked the log file "/var/log/httpd/error_log", but can't find anything.
I keep the Perl file on a server, which runs CentOS.
Does anyone have an idea?
There is no 'Perl error log', but you can define a destination for output to be saved to. Just run your script like this:
perl test.pl >> /var/log/some-log-file.log 2>&1 &
This will redirect STDOUT (normal shell output) and STDERR (error output) to /var/log/some-log-file.log instead of to the terminal.
You may also wish to use nohup in order to have the script ignore HANGUP (logout) signals, which could be causing your unexpected terminations:
nohup perl test.pl >> /var/log/some-log-file.log 2>&1 &
Obviously, whichever user you run the script as will need to have write access to the log file.

Prints in another window in cmd and closes quickly

I am trying to run a Perl script from command prompt.
The script contains one line:
print "Hello World!\n"
I type in the cmd: Perl hello.pl
The line is printed in a new window, which quickly closes.
It should all be happening in the cmd! Has anyone had this kind of problem?
I know Perl is working because I tried running a script that creates an Excel file and it worked.
The only problem is that it doesn't print in the same window as it is supposed to; instead it opens a new window, prints there, and closes it. (I tried adding a while loop at the end and it didn't help.)
I was able to solve this.
In Windows there is an option called "Open command prompt as Administrator". A new window does not open up in that case.
The cmd window closes as soon as the command that it runs has exited. You can either
… start a cmd.exe of your own, and launch your script via
> perl C:\path\to\script.pl
instead of double-clicking the perl file (or whatever you are doing to start it). This should not start a new window.
… or you could have the script wait until you have read the message. Just wait for user input of some sort before exiting, e.g. like
<>; # read and discard a line to exit
at the bottom of your script.
You can also use the pause program for this, which you can execute like system('pause').

How to invoke Unix "script" command in Perl?

I'm sure this is an easy fix, but I need to use "script" (and not collect standard in/out/error) for my project. I'm somewhat new to Perl, so please bear with me.
I have a Perl script that works fine. When I run it I generally type script > filename before I run Perl.
$ script > file.log
bash-3.2$ perl foobar.pl
This runs fine, and when I'm done I type exit or Ctrl-D to stop the script and save the file. All I'd like to do is incorporate the script command in Perl and then automatically capture the file when the program stops running (12-16 hours). The problem I have is that if I call system("script > file.log"); and then call system("perl foobar.pl"); it hangs at the bash-3.2$ prompt. The only way to get Perl to work is Ctrl-D or exit, which stops the script function.
Does anyone have any idea how to fix this? While it's easy to start script before invoking Perl, if I'm absent-minded and forget, I have to rerun the program (which takes a long time).
Have you considered using system("script -c 'perl foobar.pl' file.log")? With -c, script runs the given command itself and exits when that command finishes, so it never starts the interactive shell that your back-to-back system() calls were hanging on.

Odd behavior with Perl system() command

Note that I'm aware this is probably not the best way to do this, but I've run into it somewhere before and I'm curious as to the answer.
I have a Perl script, called from an init script, that runs and occasionally dies. To quickly debug this, I put together a quick wrapper Perl script that basically consists of:
# $path set from library call.
while (1) {
    system("$path/command.pl " . join(" ", @ARGV) . " >>/var/log/outlog 2>&1");
    sleep 30; # Added this one later. See below...
}
Fire this up from the command line and it runs fine, as expected: command.pl is called, and the script basically halts there until the child process dies, then goes around again.
However, when called from a start script (actually via start-stop-daemon), the system command returns immediately, leaving command.pl running. Then it goes around for another go, and again and again. (This was not fun without the sleep call.) ps reveals the parent of the (many) command.pl processes to be 1, rather than the PID of the wrapper script (which it is when I run from the command line).
Anyone know what's occurring?
Maybe the command.pl is not being run successfully. Maybe the file doesn't have execute permission (do you need to say perl command.pl?). Maybe you are running the command from a different directory than you thought, and the command.pl file isn't found.
There are at least three things you can check:
the standard error output of your command. For now you are redirecting it into the log with 2>&1; remove that part and observe what errors the system command produces.
the return value of system. In Perl, system returns the command's wait status: if it is 0, the command was successful; if it is nonzero, the program's actual exit code is $? >> 8 (and -1 means the command could not be started at all).
Perl's error variable $!. If there was a problem, Perl will set $!, which may or may not be helpful.
To summarize, try:
my $ec = system("command.pl >> /var/log/outlog");
if ($ec != 0) {
    warn "exit code was $ec, \$! is $!";
}
Update: if multiple instances of the command keep showing up in your ps output, then it sounds like the program is forking and running itself in the background. If that is indeed what the command is supposed to do, then what you do NOT want to do is run it in an endless loop.
Perhaps when run from a daemon the system command is using a different shell than the one used when you are running as yourself. Maybe the shell used by the daemon does not recognize the >& construct.
Instead of system("..."), try the exec("...") function, if that works for you.