STDOUT redirected externally and no output seen at the console - perl

I have a program which reads the output of an external application. The external app produces a set of output lines. My program reads that output with while ($line = <handle to external app>) and prints it to STDOUT. But print STDOUT $line prints only some of the lines: once the error has occurred, printing to STDOUT stops working, yet my other logging statement, push @arr, $line, has stored the complete output from the external app. From this I concluded that STDOUT stops working properly when the error happens.
Eg:
if external app output is like:
Starting command
First command executed successfully
Error:123 :next command failed
Program terminated
Here STDOUT prints only:
Starting command
First command executed successfully
But if I check the array, it has the complete output, including the error details. So I guessed STDOUT had been redirected or lost.
So I tried saving STDOUT at the beginning of the program to $old_handle using open, and then restoring it before the print statement using select($old_handle) (thinking that something redirects STDOUT when the error happens).
But I was not successful, and I don't know what is wrong here. Please help me.

It's possible the output is being buffered. Try setting
$| = 1;
at the start of your program. This will cause the output to be displayed straight away, rather than being buffered for later.
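For example, a minimal sketch of the reading loop with autoflush enabled (the external command and variable names are placeholders, not from the original program):

$| = 1;   # autoflush STDOUT so each line appears immediately

my @arr;
open my $app, '-|', 'external_app arg1 arg2' or die "Can't run external_app: $!";
while (my $line = <$app>) {
    print STDOUT $line;   # no longer held back in a buffer
    push @arr, $line;     # the logging you already have
}
close $app;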

Just a guess: maybe the error output doesn't go to STDOUT. Use a redirect:
first_program |& perl_program
or
first_program 2>&1 | perl_program
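If your Perl program itself starts the external app, you can get the same effect from inside Perl by merging the child's STDERR into the pipe; a sketch, with the program name as a placeholder:

# The 2>&1 inside the command makes the shell send the app's STDERR
# down the same pipe as its STDOUT, so the loop sees both streams.
open my $app, '-|', 'first_program 2>&1' or die "Can't run first_program: $!";
while (my $line = <$app>) {
    print $line;
}
close $app;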

Related

Can I redirect stdout to a file and the console at the same time?

I have a Linux application running on an embedded board. I want to redirect stdout to a text file.
So I use the function below:
file = freopen("logs.txt","w",stdout);
After capturing, I restore the console with:
freopen("/dev/ttyAM0", "w", stdout);
My question is:
If I want to see stdout on the console while capturing/redirecting it to the file, is it possible, and if yes, how?
I also tried the dup2() API but did not succeed.

Profiling a Perl CGI script that times out

I have a Perl CGI application that sometimes times out, causing it to be killed by Apache and a 504 Gateway Time-out error to be sent to the browser. I am trying to profile this application using NYTProf; however, I cannot read the profile data:
$ nytprofhtml -f www/cgi-local/nytprof.out
Reading www/cgi-local/nytprof.out
Profile data incomplete, inflate error -5 ((null)) at end of input file, perhaps the process didn't exit cleanly or the file has been truncated (refer to TROUBLESHOOTING in the documentation)
I am using the sigexit=1 NYTProf option. Here's a minimal CGI script that reproduces the problem:
#!/usr/bin/perl -d:NYTProf
sleep 1 while 1;
Setting sigexit=1 tells NYTProf to catch the following signals:
INT HUP PIPE BUS SEGV
However, when your CGI script times out, Apache sends SIGTERM. You need to catch SIGTERM:
sigexit=term
To catch SIGTERM in addition to the default signals, use:
sigexit=int,hup,pipe,bus,segv,term
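NYTProf reads its options from the NYTPROF environment variable, so one way to try this from the shell (the script name here is a placeholder) is:

NYTPROF=sigexit=int,hup,pipe,bus,segv,term perl -d:NYTProf your_script.cgi

Under Apache you would set the same environment variable in the server configuration, for example with mod_env's SetEnv directive.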
CGI.pm has a debug mode, which you can use to run your program from the command line and pass your CGI parameters as key/value pairs.
It has another feature that lets you save your params to a file and then read that file back in later.
What I've done is add code to save the params to a file, then run my program via a browser. This also lets me ensure that the browser is sending the correct data.
Then I change the code to read the params from the file, and run the script as often as I need until I have everything else debugged.
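A rough sketch of that save/restore pattern, using CGI.pm's save method and its filehandle constructor (the file path here is just an example):

use CGI;

# Web-facing run: save the browser-supplied params for later replay.
my $q = CGI->new;
open my $out, '>', '/tmp/cgi-params.txt' or die "Can't save params: $!";
$q->save($out);
close $out;

# Later, command-line runs: rebuild the query object from the saved file.
open my $in, '<', '/tmp/cgi-params.txt' or die "Can't read params: $!";
my $saved_q = CGI->new($in);
close $in;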
Once you've got the program running to your satisfaction from the command line, you can run it via nytprof to figure out what is taking all the time.

Can't find Perl error log

I have a Perl file (e.g. test.pl) which does some DB operations.
While testing, it works fine.
I execute this file as a background process with the command
perl test.pl &
It works properly for some days.
But after some days, the execution stops.
How can I find the reason or view the error?
I checked the log file /var/log/httpd/error_log, but couldn't find anything.
I keep the Perl file on a server running CentOS.
Does anyone have an idea?
There is no 'perl error log'.
But you can define a destination for the output to be saved to; just run your script like this:
perl test.pl >> /var/log/some-log-file.log 2>&1 &
This will redirect STDOUT (normal shell output) and STDERR (error output) to /var/log/some-log-file.log instead of to the terminal.
You may also wish to use nohup in order to have the script ignore HANGUP (logout) signals, which could be causing your unexpected terminations:
nohup perl test.pl >> /var/log/some-log-file.log 2>&1 &
Obviously, whichever user you run the script as will need to have write access to the log file.
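You can also make the script do its own logging by reopening its handles near the top of test.pl; a sketch, with the log path as an example only:

# Send everything the script prints, including die() messages,
# to a log file even if the shell redirect is forgotten.
open STDOUT, '>>', '/var/log/test-pl.log' or die "Can't reopen STDOUT: $!";
open STDERR, '>&', \*STDOUT or die "Can't reopen STDERR: $!";
$| = 1;   # flush as we go, so the log is current if the process dies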

When I terminate a dying Perl test run using prove -v, why are results not saved to a text file via STDOUT?

I'm driving Selenium RC and a page hangs, so I hit Ctrl-C to stop testing and address the issue. If I pass a txt file via the command line like so:
prove -v some.t :: data.csv > testresults.txt
...I either get nothing or "Terminate batch job (Y/N)?" in the text file. Note that if I don't pass a file for output via the command line, the results scroll by as expected.
Does prove have an option to write to a file via stdout that isn't -v?
This issue persists when I add $| = 1; in any of its forms to either prove.pm or some.t. Is there something in prove or Test::Harness that is overriding my autoflush setting?
I've also tried this variation:
prove -v some.t > testresults.txt :: data.csv
Sounds like you are suffering from buffering. The output is buffered (when not sent to a terminal), and those buffers aren't flushed when you kill the application with Ctrl-C.
If the output comes from a Perl script, try adding $| = 1; to it.
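For instance, at the top of some.t (two equivalent ways; the IO::Handle form also lets you flush STDERR):

$| = 1;   # autoflush the currently selected handle, normally STDOUT

# or the same thing via IO::Handle:
use IO::Handle;
STDOUT->autoflush(1);
STDERR->autoflush(1);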

Redirecting the stdout of an executable called from a .bat file to a file

I have a batch file that calls an executable program. The program (compiled C code) writes some output to stdout, and the batch file echoes some output as well. When running the .bat, I use redirection (>) to capture the text in a file.
mybat.bat contains:
myprog.exe arg1 %1 arg3
echo Done
then, at the console:
C:\> mybat arg2 > log.txt
The problem is that in log.txt I get only the output of the echo Done command and not the output of myprog.exe. Without the redirection, I get the expected output on the screen.
Note: Under Windows XP
Update: this gets even weirder. When running myprog.exe from the command prompt, I get the expected output to the console. But when redirecting its output to log.txt, the file is empty! The printing is done using fprintf(stdout, "...") or fprintf(ofp, "..."), where ofp is assigned: FILE *ofp = stdout;.
Further investigation: it seems the fprintf(stdout, ...) lines were redirected, while the fprintf(ofp, ...) lines were not (yes, the pointer is assigned correctly). I also found that the program crashes at some point (at a call to feof()). So my conclusion is that, due to the abnormal termination of the program, the standard output buffers were never flushed to the file. HOWEVER, this happened only for the lines that used the pointer; I guess those lines have shorter output, so the flush frequency was lower (I used the stdout line to print a deliberate help message).
Once the crash was fixed, the data was redirected to the log file as expected. Thanks for your help.
Try redirecting both stdout and stderr to the log file:
mybat arg2 > log.txt 2>&1
Try this:
cmd /c "mybat.bat arg2" > log.txt