Redirecting the output of Expect to a logfile in Perl

I am using Expect in Perl. I want to redirect all the output that appears on the stdout console to a log file so that I can debug it later. Currently I am using
$exp->log_stdout(0);
Instead of just suppressing the output, can I redirect it to a log file? If so, how do I do it?

If you look at the documentation for Expect, you will find information about logging a session to a file:
$object->log_file("filename" | $filehandle | \&coderef | undef)
All characters sent to or received from the spawned process are written to the file. Normally it appends to the logfile, but you can pass an additional mode of "w" to truncate the file upon open():
$object->log_file("filename", "w");
That means you should use the log_file method instead of log_stdout; that will solve your problem.
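For instance, a minimal sketch (the spawned command, the prompt, and the file names here are placeholder assumptions, not from the question):
use strict;
use warnings;
use Expect;
# Hypothetical command and prompt, for illustration only.
my $exp = Expect->spawn("telnet", "somehost")
    or die "Cannot spawn: $!";
$exp->log_stdout(0);                  # keep the console quiet
$exp->log_file("session.log", "w");   # truncate and log the whole session
$exp->expect(10, "login:");
$exp->send("myuser\n");
$exp->log_file(undef);                # stop logging
$exp->soft_close();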

Related

Output stdin, stdout to file and console using Perl

I am trying out a simple questionnaire in Perl. I want to record the responses in a log file as and when the user enters them. I'm having a problem redirecting the stdin to a file.
Below is the code I implemented.
open my $tee, "|-", "tee some_file.out";
print $tee "DO you want to continue?(y/n)\n";
$var = <STDIN>;
$var =~ s/[\n\r\f\t]//g;
if ($var eq "y") {
    print $tee "Enter\n";
}
close $tee;
The output I'm getting now is below; the question is printed only after the user input is provided.
#in console
y
DO you want to continue?(y/n)
Enter
#some_file.out
DO you want to continue?(y/n)
Enter
Below is the expected output:
#in console
DO you want to continue?(y/n)
y
Enter
#some_file.out
DO you want to continue?(y/n)
y
Enter
I also found Duplicate stdin to stdout but couldn't achieve what I want with it.
Am I missing something?
Is there a cleaner solution available?
First of all, never use the phrase "redirecting the stdin to..." because stdin is input. It doesn't go to anything. It comes from somewhere.
It seems that what you expected is to have a copy of $var appear in your log file. Since you never printed $var to $tee there's no way that could happen.
So why did you think $var would appear in the log file? From the way you have shown us a copy of the log file next to a copy of what you see on the terminal, I guess that your reasoning went something like this:
1. The tee put all of the output into the log file.
2. The tee also put all of the output on the terminal.
3. My program didn't output anything else besides what went into the tee.
4. The screen contents should match the log file.
But there's a hidden assumption that's required to reach the conclusion:
3a. Nothing else was written to the terminal besides my program's output
And that's the part which is incorrect. When you type y into the terminal while your program is running, the terminal itself echoes what you type. It prints a copy in the terminal window, and also sends the character to your program's stdin. The y that you see on screen is not part of your program's output at all.
Since the echoing is done by the terminal and not by your program, you can't instruct it to also send a copy to your log file. You need to explicitly print it there if you want it to be logged.
You can ask the terminal to stop echoing, and then you take responsibility for printing the characters as they are typed so the user can see what they're typing. If you want to try that, see the Term::ReadKey module.
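A minimal sketch of that approach, reusing the $tee handle from the question and reading a whole line rather than single characters:
use Term::ReadKey;
ReadMode('noecho');       # the terminal stops echoing keystrokes
my $var = ReadLine(0);    # blocking read of one line
ReadMode('restore');      # always put the terminal back how you found it
chomp $var;
print $tee "$var\n";      # the answer now goes through the tee, so the
                          # screen and the log file agree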
Or if what you really want is a complete record of everything that appeared on the terminal during the run of your program, maybe you should run it in the standard unix tool script, which is made for exactly that purpose.
(Side note: Did you know about the IO::Tee module? You can have teed output without an external process.)
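Putting the pieces together, here is a minimal sketch of the questionnaire using IO::Tee, with the response logged explicitly (file name as in the question):
use strict;
use warnings;
use IO::Handle;
use IO::Tee;
open my $log, '>', 'some_file.out' or die "Cannot open log: $!";
STDOUT->autoflush(1);    # flush prompts immediately, even without a newline
$log->autoflush(1);
my $tee = IO::Tee->new(\*STDOUT, $log);   # one print, two destinations
print $tee "DO you want to continue?(y/n)\n";
my $var = <STDIN>;
$var =~ s/[\n\r\f\t]//g;
print $log "$var\n";    # log the reply ourselves; the terminal already echoed it
if ($var eq "y") {
    print $tee "Enter\n";
}
close $log;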

Redirect stdout to a file in tcl

I know this has been asked before.
But hear me out once.
I'm working on a Cisco router (IOS).
I have to write a script which executes some commands and redirects their output to a file. But instead of using tee for every command, I want to open a file, then run the commands whose output will be redirected to the file, and then close the file.
Even the redirection operator > is not working, nor is the following answer:
How can I redirect stdout into a file in tcl
The fact that the solutions in that question aren't working for you is informative: the real issue that you are experiencing is that Tcl commands do not normally write to stdout anyway (except for puts, which has that as its main job, and parray, which is a procedure that uses puts internally). The “write to stdout” that you are used to is a feature of the interactive Tcl shell only.
To capture the result of all commands while running a script, you need a wrapper like this:
trace add execution source leavestep log_result
proc log_result {cmd code result op} {
    puts stdout $result
}
source theRealScript.tcl
You'll find that it produces a lot of output. Cutting the output down is a rather useful thing, so this reduces it to just the immediately-executed commands (rather than everything they call):
trace add execution source enterstep log_enter
trace add execution source leavestep log_result
proc log_enter {args} {
    global log_depth
    incr log_depth
}
proc log_result {cmd code result op} {
    global log_depth
    if {[incr log_depth -1] < 1 && $result ne ""} {
        puts stdout $result
    }
}
source theRealScript.tcl
You'll probably still get far more output than you want…

How to run a local program with user input in Perl

I'm trying to get user input from a web page written in Perl and send it to a local program (blastp), then display the results.
This is what I have right now:
(input code)
print $q->p, "Your database: $bd",
      $q->p, "Your protein is: $prot",
      $q->p, "Executing...";
print $q->p, system("blastp", "-db $bd", "-query $prot", "-out results.out");
Now, I've done a little research, but I can't quite grasp how you're supposed to do things like this in Perl. I've tried opening a file, writing to it, and sending it over to blastp as an input, but I was unsuccessful.
For reference, this line produces a successful output file:
kold#sazabi ~/BLAST/pataa $ blastp -db pataa -query ../teste.fs -out results.out
I may need to force the bd to load from an absolute path, but that shouldn't be difficult.
Edit: Yeah, the DBs have an environment variable; that's fixed. OK, all I need is to get the input into a file, pass it to the command, and then print the output file to the CGI page.
Edit 2: For clarification:
I am receiving user input in $prot. I want to pass it to blastp via -query, have blastp execute, and then print the results.out file back to the user (or just link to it, since blastp can output HTML).
EDIT:
All right, I fixed everything I needed to fix. The big problem was me not seeing what was going wrong: I had to install Capture::Tiny and print out stderr, which was when I realized the environment variable wasn't getting set correctly, so BLAST wasn't finding my databases. Thanks for all the help!
Step 1: Write $prot to a file. Assuming you need to write it as-is, without processing the text to split it or something:
For a fixed file name (may be problematic):
use File::Slurp;
write_file("../teste.fs", $prot, "\n") or print_error_to_web();
# Implement the latter to print the error in a nice HTML format
For a temp file (better):
use File::Temp qw(tempfile);
my $template = "blast_XXXXXX";   # hypothetical template; trailing Xs become random characters
my ($fh, $filename) = tempfile($template, DIR => "..", UNLINK => 1);
# You can also create a temp directory, which is even better, via tempdir()
print $fh "$prot\n";
close $fh;
Step 2: Run your command as you indicated:
my $rc = system("$BLASTP_PATH/blastp", "-db", "pataa",
                "-query", "../teste.fs", "-out", "results.out");
# Process $rc for errors
# Use qx[] instead of system() if you want to capture
# the standard output of the command
Step 3: Read the output file in:
use File::Slurp;
my $out_file_text = read_file("results.out");
Step 4: Send it back to the web client:
print $q->p, $out_file_text;
The above code has multiple issues (e.g. you need better file/directory paths, more error handling, etc.), but it should start you on the right track.

Line buffered reading in Perl

I have a perl script, say "process_output.pl", which is used in the following context:
long_running_command | "process_output.pl"
The process_output.pl script needs to behave like the unix "tee" command: it should dump the output of "long_running_command" to the terminal as it is generated, capture that output to a text file, and, at the end of "long_running_command", fork another process with the text file as its input.
The behavior I am currently seeing is that the output of "long_running_command" gets dumped to the terminal only when it completes, instead of as it is generated. Do I need to do something special to fix this?
Based on my reading of a few other Stack Exchange posts, I tried the following in "process_output.pl", without much success:
select(STDOUT); $| = 1;
select(STDIN);  $| = 1;   # Not sure if this is even needed
use FileHandle; STDOUT->autoflush(1);
stdbuf -oL -eL long_running_command | "process_output.pl"
Any pointers on how to proceed further?
This is more likely an issue with the output of the first process being buffered, rather than the input of your script. The easiest solution would be to try using the unbuffer command (I believe it's part of the expect package), something like
unbuffer long_running_command | "process_output.pl"
The unbuffer command will disable the buffering that happens normally when output is directed to a non-interactive place.
This will be the output buffering of long_running_command. More than likely it is using stdio, which checks what the output file descriptor is connected to before it does any output. If it is a terminal (tty), then it will generally output line by line, but in the above case it will notice it is writing to a pipe and will therefore buffer the output into larger chunks.
You can control the buffering in your own process, as you showed, by using
select(STDOUT); $| = 1;
This means that the things your process prints to STDOUT are not buffered. It makes no sense doing this for input, as you control how much buffering is done there: if you use sysread(), then you are reading unbuffered; if you use a construct like <$fh>, then perl waits until it has a "whole line" (it actually reads up to the next input record separator, as defined in the variable $/, which is newline by default) before it returns data to you.
unbuffer can be used to "disable" the output buffering. What it actually does is make the writing process think that it is talking to a tty (by using a pseudo-tty), so that process does not buffer its output.
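Putting both answers together, a minimal sketch of what process_output.pl could look like; the log file name and the follow-up command ("post_process") are placeholder assumptions:
#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;
STDOUT->autoflush(1);    # don't let our own copy sit in a buffer
open my $log, '>', 'capture.txt' or die "Cannot open capture.txt: $!";
$log->autoflush(1);
# <STDIN> returns as soon as a complete line arrives on the pipe; the
# delay in the question comes from the producer's buffering, which is
# why unbuffer/stdbuf on the left side of the pipe matters.
while (my $line = <STDIN>) {
    print $line;         # copy to the terminal
    print $log $line;    # ...and to the text file
}
close $log;
# At EOF the long-running command has finished: hand the capture
# to the next stage ("post_process" is a placeholder).
exec 'post_process', 'capture.txt' or die "Cannot exec: $!";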

What are the possible situations where one should prefer unbuffered output?

From the discussion in my previous question, I came to know that Perl buffers its output by default.
$| = 0; # for buffered output (by default)
If you want unbuffered output, set the special variable $| to 1, i.e.
$| = 1; # for unbuffered output
Now I want to know: what are the possible situations where one should prefer unbuffered output?
You want unbuffered output for interactive tasks. By that, I mean you don't want output stuck in some buffer when you expect someone or something else to respond to the output.
For example, you wouldn't want user prompts sent to STDOUT to be buffered. (That's why STDOUT is never fully buffered when attached to a terminal. It is only line buffered, and the buffer is flushed by attempts to read from STDIN.)
For example, you'd want requests sent over pipes and sockets to not get stuck in some buffer, as the other end of the connection would never see them.
The only other reason I can think of is when you don't want important data to be stuck in a buffer in the event of an unrecoverable error such as a panic or death by signal.
For example, you might want to keep a log file unbuffered in order to be able to diagnose serious problems. (This is why STDERR isn't buffered by default.)
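For the log-file case, a minimal sketch (debug.log is a placeholder name):
use IO::Handle;
open my $log, '>>', 'debug.log' or die "Cannot open debug.log: $!";
$log->autoflush(1);    # every print reaches the file immediately
print $log "entering risky section\n";
# Even if the process is killed on the next statement, the line
# above has already left the process's buffer.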
Here's a small sample of Perl users from StackOverflow who have benefited from learning to set $| = 1:
STDOUT redirected externally and no output seen at the console
Perl Print function does not work properly when Sleep() is used
can't write to file using print
perl appending issues
Unclear perl script execution
Perl: Running a "Daemon" and printing
Redirecting STDOUT of a pipe in Perl
Why doesn't my parent process see the child's output until it exits?
Why does adding or removing a newline change the way this perl for loop functions?
Perl not printing properly
In Perl, how do I process input as soon as it arrives, instead of waiting for newline?
What is the simple way to keep the output stream exactly as it shown out on the screen (while interactive data used)?
Is it possible to print from a perl CGI before the process exits?
Why doesn't print output anything on each iteration of a loop when I use sleep?
Perl Daemon Not Working with Sleep()
It can be useful when writing to another program over a socket or pipe. It can also be useful when you are writing debugging information to STDOUT to watch the state of your program live.
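A quick way to see the difference for yourself: run the snippet below with and without the $| line. The dots are partial lines, so even line buffering holds them back until the final newline.
use strict;
use warnings;
$| = 1;               # comment this out to see the buffered behavior
for my $i (1 .. 5) {
    print ".";        # with $| = 1, a dot appears every second;
    sleep 1;          # without it, all five arrive at the end
}
print "\n";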