I'm trying a simple questionnaire using Perl. I want to record the responses in a log file as and when the user inputs them. I'm having a problem in redirecting the stdin to a file.
Below is the code I implemented:
open my $tee, "|-", "tee some_file.out";
print $tee "DO you want to continue?(y/n)\n";
$var=<STDIN>;
$var =~ s/[\n\r\f\t]//g;
if($var eq "y"){
print $tee "Enter\n";
}
close $tee;
The output I'm getting now is that the question is printed only after the user input is provided.
#in console
y
DO you want to continue?(y/n)
Enter
#some_file.out
DO you want to continue?(y/n)
Enter
Below is the expected output:
#in console
DO you want to continue?(y/n)
y
Enter
#some_file.out
DO you want to continue?(y/n)
y
Enter
I also found "Duplicate stdin to stdout" but couldn't achieve what I want with it.
Am I missing something?!
Is there any cleaner solution available?
First of all, never use the phrase "redirecting the stdin to..." because stdin is input. It doesn't go to anything. It comes from somewhere.
It seems that what you expected is to have a copy of $var appear in your log file. Since you never printed $var to $tee there's no way that could happen.
So why did you think $var would appear in the log file? From the way you have shown us a copy of the log file next to a copy of what you see on the terminal, I guess that your reasoning went something like this:
1. The tee put all of the output into the log file.
2. The tee also put all of the output on the terminal.
3. My program didn't output anything else besides what went into the tee.
4. Therefore, the screen contents should match the log file.
But there's a hidden assumption that's required to reach the conclusion:
3a. Nothing else was written to the terminal besides my program's output
And that's the part which is incorrect. When you type y into the terminal while your program is running, the terminal itself echoes what you type. It prints a copy in the terminal window, and also sends the character to your program's stdin. The y that you see on screen is not part of your program's output at all.
Since the echoing is done by the terminal and not by your program, you can't instruct it to also send a copy to your log file. You need to explicitly print it there if you want it to be logged.
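For example, here's a minimal sketch that skips tee entirely and writes the log file directly: prompts are copied to both the terminal and the log, while the user's input is copied only to the log, because the terminal already echoed it (the prompt helper is just an illustration):
use strict;
use warnings;
use IO::Handle;

open my $log, '>', 'some_file.out' or die "Can't open log: $!";
$log->autoflush(1);              # log lines appear as they happen

sub prompt {
    my ($msg) = @_;
    print $msg;                  # show on the terminal
    print $log $msg;             # and record in the log
    my $answer = <STDIN>;
    print $log $answer;          # the terminal echoed it; we log a copy
    chomp $answer;
    return $answer;
}

my $var = prompt("Do you want to continue? (y/n)\n");
if ($var eq 'y') {
    print "Enter\n";
    print $log "Enter\n";
}
close $log;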
You can ask the terminal to stop echoing, and then you take responsibility for printing the characters as they are typed so the user can see what they're typing. If you want to try that, see the Term::ReadKey module.
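If you go that route, a rough sketch with Term::ReadKey might look like the following. Note the user types "blind" until Enter (echoing key-by-key would need cbreak mode and a read loop), and the autoflush call is there so the prompt leaves Perl's pipe buffer right away:
use IO::Handle;
use Term::ReadKey;

$tee->autoflush(1);          # don't let the prompt sit in the pipe buffer
print $tee "Do you want to continue? (y/n)\n";
ReadMode('noecho');          # terminal stops echoing keystrokes
my $var = ReadLine(0);       # read a whole line, invisibly
ReadMode('restore');         # always put the terminal back
print $tee $var;             # echo it ourselves, through the tee, so it
                             # reaches both the screen and the log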
Or if what you really want is a complete record of everything that appeared on the terminal during the run of your program, maybe you should run it under the standard Unix tool script, which is made for exactly that purpose.
(Side note: did you know about the IO::Tee module? You can have teed output without an external process.)
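A minimal sketch of that, keeping a handle on the underlying log file so input can still be logged explicitly:
use IO::Tee;

open my $log, '>', 'some_file.out' or die "Can't open log: $!";
my $tee = IO::Tee->new(\*STDOUT, $log);   # one print, two destinations
print $tee "Do you want to continue? (y/n)\n";
my $var = <STDIN>;
print $log $var;   # log the input by hand, to the file only:
                   # the terminal already echoed it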
Related
I'm trying to get user input from a web page written in Perl and send it to a local program (blastp), then display the results.
This is what I have right now:
(input code)
print $q->p, "Your database: $bd",
$q->p, "Your protein is: $prot",
$q->p, "Executing...";
print $q->p, system("blastp","-db $bd","-query $prot","-out results.out");
Now, I've done a little research, but I can't quite grasp how you're supposed to do things like this in Perl. I've tried opening a file, writing to it, and sending it over to blastp as input, but I was unsuccessful.
For reference, this line produces a successful output file:
kold#sazabi ~/BLAST/pataa $ blastp -db pataa -query ../teste.fs -out results.out
I may need to force the bd to load from an absolute path, but that shouldn't be difficult.
edit: Yeah, the DBs have an environment variable; that's fixed. OK, all I need is to get the input into a file, pass it to the command, and then print the output file to the CGI page.
edit2: for clarification:
I am receiving user input in $prot, I want to pass it over to blastp in -query, have the program blastp execute, and then print out to the user the results.out file (or just have a link to it, since blastp can output in HTML)
EDIT:
All right, fixed everything I needed to fix. The big problem was me not seeing what was going wrong: I had to install Capture::Tiny and print out stderr, which was when I realized the environment variable wasn't getting set correctly, so BLAST wasn't finding my databases. Thanks for all the help!
Step 1: Write $prot to the file. Assuming you need to do it as-is, without processing the text to split it or something:
For a fixed file name (may be problematic):
use File::Slurp;
# err_mode => 'quiet' makes write_file return false on failure instead
# of croaking, so the "or" fallback actually gets a chance to run
write_file("../teste.fs", { err_mode => 'quiet' }, $prot, "\n")
    or print_error_to_web();
# Implement the latter to print the error in a nice HTML format
For a temp file (better):
use File::Temp qw(tempfile);
# The trailing X's in the template are replaced to make a unique name;
# UNLINK => 1 removes the file when the program exits
my ($fh, $filename) = tempfile("blast_query_XXXXXX", DIR => "..", UNLINK => 1);
# You can also create a temp directory, which is even better, via tempdir()
print $fh "$prot\n";
close $fh;
Step 2: Run your command as you indicated:
my $rc = system("$BLASTP_PATH/blastp", "-db", "pataa",
                "-query", "../teste.fs", "-out", "results.out");
# Process $rc for errors (see the sketch below)
# Use qx[] instead of system() if you want to capture
# the command's standard output
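For the "process $rc" comment, a minimal sketch of the usual checks from perldoc -f system:
if ($rc == -1) {
    print $q->p, "failed to execute blastp: $!";
}
elsif ($rc & 127) {
    print $q->p, "blastp died with signal ", $rc & 127;
}
elsif ($rc >> 8) {
    print $q->p, "blastp exited with status ", $rc >> 8;
}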
Step 3: Read the output file in:
use File::Slurp;
my $out_file_text = read_file("results.out");
Step 4: Send it back to the browser:
print $q->p, $out_file_text;
The above code has multiple issues (e.g. you need better file/directory paths, more error handling, etc.), but it should start you on the right track.
I have a Perl script, say "process_output.pl", which is used in the following context:
long_running_command | "process_output.pl"
The process_output.pl script needs to behave like the Unix tee command: it should dump the output of long_running_command to the terminal as it gets generated, capture that output to a text file, and at the end of long_running_command fork another process with the text file as input.
The behavior I am currently seeing is that the output of long_running_command gets dumped to the terminal only once it has completed, instead of being dumped as it is generated. Do I need to do something special to fix this?
Based on my reading of a few other Stack Exchange posts, I tried the following in process_output.pl, without much help:
select(STDOUT); $| =1;
select(STDIN); $| =1; # Not sure even if this is needed
use FileHandle; STDOUT->autoflush(1);
stdbuf -oL -eL long_running_command | "process_output.pl"
Any pointers on how to proceed further?
This is more likely an issue with the output of the first process being buffered, rather than the input of your script. The easiest solution would be to try using the unbuffer command (I believe it's part of the expect package), something like
unbuffer long_running_command | "process_output.pl"
The unbuffer command will disable the buffering that happens normally when output is directed to a non-interactive place.
This will be down to the output processing of long_running_command. More than likely it is using stdio, which looks at what the output file descriptor is connected to before it does any output. If it is a terminal (tty), then it will generally output line by line, but in the case above it will notice that it is writing to a pipe and will therefore buffer the output into larger chunks.
You can control the buffering in your own process by using, as you showed
select(STDOUT); $| =1;
This means that what your process prints to STDOUT is not buffered. It makes no sense doing this for input, as you already control how much buffering is done there: if you use sysread() then you are reading unbuffered, whereas with a construct like <$fh> Perl waits until it has a whole line (it actually reads up to the next input record separator, as defined in the variable $/, which is a newline by default) before it returns data to you.
unbuffer can be used to "disable" the output buffering. What it actually does is make the outputting process think that it is talking to a tty (by using a pseudo-tty), so the output process does not buffer.
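Putting those pieces together, here's a minimal sketch of a tee-like process_output.pl; the capture file name and the follow-up command are made up for illustration, and it only shows lines promptly if the upstream command is unbuffered as described above:
#!/usr/bin/perl
use strict;
use warnings;

$| = 1;    # don't buffer our own STDOUT either

open my $log, '>', 'captured.txt' or die "Can't open captured.txt: $!";
while (my $line = <STDIN>) {    # returns as soon as a full line arrives
    print $line;                # copy to the terminal
    print $log $line;           # and capture to the text file
}
close $log;

# long_running_command is done; hand the capture to the next stage
system('postprocess', 'captured.txt') == 0
    or warn "postprocess failed: $?";    # 'postprocess' is hypothetical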
I basically want to reopen STDERR/STDOUT so they write to one logfile with both the stream and the timestamp included on every line. So print STDERR "Hello World" prints STDERR: 20130215123456: Hello World. I don't want to rewrite all my print statements into function calls, also some of the output will be coming from external processes via system() calls anyway which I won't be able to rewrite.
I also need for the output to be placed in the file "live", i.e. not only written when the process completes.
(p.s. I'm not asking particularly for details of how to generate timestamps, just how to redirect to a file and prepend a string)
I've worked out the following code, but it's messy:
my $mode = ">>";
my $file = "outerr.txt";
open(STDOUT, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDOUT: \$\_"; }'));
open(STDERR, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDERR: \$\_"; }'));
(The above doesn't add dates, but that should be trivial to add)
I'm looking for a cleaner solution, one that doesn't require quoting perl code and passing it on the command line, or at least module that hides some of the complexity. Looking at the code for Capture::Tiny it doesn't look like it can handle writing a part of output, though I'm not sure about that. annotate-output only works on an external command sadly, I need this to work on both external commands and ordinary perl printing.
The child launched via system doesn't write to your Perl file handle STDOUT; it has no access to the variables in your program. That means any approach that runs code when a Perl file handle is written to (e.g. tie) won't work for its output.
Write another script that runs your script with STDOUT and STDERR replaced with pipes. Read from those pipes and print out the modified output. I suggest using IPC::Run to do this, because it'll save you from using select. You can get away without it if you combine STDOUT and STDERR in one stream.
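Here's a rough sketch of that wrapper using IPC::Run; the wrapped script name is a placeholder, and note that the chunks handed to the callbacks aren't guaranteed to end on line boundaries, so a real version would carry partial lines over between calls:
#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;
use IPC::Run qw(run);
use POSIX qw(strftime);

open my $log, '>>', 'outerr.txt' or die "Can't open outerr.txt: $!";
$log->autoflush(1);    # "live" logging, not only at process exit

sub make_logger {
    my ($label) = @_;
    return sub {
        my ($chunk) = @_;
        my $ts = strftime('%Y%m%d%H%M%S', localtime);
        print {$log} map { "$label: $ts: $_" } split /^/m, $chunk;
    };
}

run ['perl', 'your_script.pl'],    # placeholder for the real script
    '>',  make_logger('STDOUT'),
    '2>', make_logger('STDERR')
    or die "child exited non-zero: $?";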
In my script I am dealing with opening and writing to files. I found that there is something wrong with a file I try to open: the file exists, it is not empty, and I am passing the right path to the file handle.
I know that my question might sound weird, but while I was debugging my code I put the following command in my script to check some files:
system ("ls");
Then my script worked well; when it's removed, it does not work correctly anymore.
my @unique = ("test1","test2");
open(unique_fh,">orfs");
print unique_fh @unique;
open(ORF,"orfs") or die("file does not exist");
system("ls");
while (<ORF>) {
    split;
}
@neworfs = @_;
print @neworfs;
Perl buffers the output when you print to a file. In other words, it doesn't actually write to the file every time you say print; it saves up a bunch of data and writes it all at once. This is faster.
In your case, you couldn't see anything you had written to the file because Perl hadn't actually written anything yet. Adding the system("ls") call, however, caused Perl to flush your output first (the interpreter is smart enough to do this, because you might want to use the system() call to do something with the file you just created).
How do you get around this? You can close the file before you open it again to read it, as choroba suggested. Or you can disable buffering for that file handle. Put this code just after you open the file:
my $fh = select(unique_fh);  # make unique_fh the default output handle,
                             # remembering the previous default
$| = 1;                      # enable autoflush on the selected handle
select($fh);                 # restore the previous default handle
Then any time you print to the file, it will be written immediately ($| is a special variable that controls output buffering for the currently selected handle).
Closing the file first is probably a better idea, although it is possible to have a filehandle for reading and writing open at the same time.
You did not close the filehandle before trying to read from the same file.
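A minimal corrected sketch of the code above, with lexical handles and a close before the reopen:
my @unique = ("test1", "test2");

open(my $out, '>', 'orfs') or die "Can't write orfs: $!";
print $out @unique;
close $out;    # flushes the buffer, so the data is really in the file

open(my $orf, '<', 'orfs') or die "file does not exist: $!";
while (my $line = <$orf>) {
    my @fields = split ' ', $line;   # keep the split results explicitly
    print @fields;
}
close $orf;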
Let's say I have the following simple script:
print "ID : ";
$ID = <>;
system (`comp program $ID`);
exec "task --shell";
When I use:
perl foo.pl | tee log.txt
The problem showing up is that I get a blinking cursor on screen (waiting for the ID to be entered) before I even see the "ID : " prompt from the print statement.
I need to keep a log file of the script's entire output (which is very long); note that at the start of the run there is an interactive part that also needs to be kept.
Is there a way, inside the Perl script or outside it, to keep the output stream exactly as it is shown on the screen? What do you think is the simplest and most efficient way to do it?
I've noticed the IO::Tee and File::Tee modules and Log4perl; can someone help me find the best way to use them?
The best approach I have seen is to use the script program.
You run script and it gives you a shell. In this shell you run anything you want. When you're done, you exit the shell, and everything that appeared on your screen will be in the typescript file, including all control codes.
There is also a bonus: you can replay a saved session with scriptreplay.
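For example, a typical session (file names are just illustrations):
script run.log    # starts a new shell; everything shown on screen is recorded
perl foo.pl       # run the interactive program, use it as usual
exit              # leave the shell; run.log now holds the whole session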
It's possible that your script is asking for user input before printing the "ID: " prompt because you don't have autoflush turned on. Try adding the following:
$|++;    # turn on autoflush for the currently selected handle (STDOUT)
print "ID : ";
$ID = <>;
system (`comp program $ID`);
exec "task --shell";
If this works then you might look at the Perl documentation about output buffering. perldoc -q buffer will explain what's happening in more detail as well as show some alternate ways of turning on autoflush.
If this doesn't solve the problem then it's possible that the tee command is buffering your output. In this case I would use script or screen, as suggested by depesz or Manni.
You could run everything inside a screen session with logging enabled.
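For example (assuming a screen build where -L enables logging, which writes to screenlog.0 by default):
screen -L         # start a logged screen session
perl foo.pl       # run the program inside it
exit              # end the session; the output is in screenlog.0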