Perl script interacting with another program's STDIN - perl

I have a Perl script that calls another program with backticks, and checks the output for certain strings. This is running fine.
The problem I have is when the other program fails on what it is doing and waits for user input. It requires the user to press enter twice before quitting the program.
How do I tell my Perl script to press enter twice on this program?

The command started with backticks inherits your script's STDIN and STDERR; only its STDOUT is piped back to your script.
You could just close your STDIN before running the command, so there is no input source; reading from STDIN will then fail and the called command will exit:
close STDIN;
my @slines = `$command`;
This will also void any chance of console input to your script.
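A softer variant of the same idea is to reopen STDIN from /dev/null instead of closing it, so reads return EOF immediately rather than erroring. This is a sketch, assuming a Unix-like system (on Windows the device is NUL); `cat` is just a stand-in for the real command:

```perl
# Reattach STDIN to /dev/null: the child inherits it and sees EOF
# immediately on any read, instead of a read error.
open STDIN, '<', '/dev/null' or die "Cannot reopen STDIN: $!";
my $command = 'cat';          # stand-in for your real command
my @slines  = `$command`;     # cat sees EOF at once and returns nothing
```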
Another approach would use IPC::Open2 which allows your script to control STDIN and STDOUT of the command at the same time:
use IPC::Open2;
my $pid = open2(my $chld_out, my $chld_in, 'some cmd and args');
print $chld_in "\n\n";
close $chld_in;
my @slines = <$chld_out>;
close $chld_out;
This feeds the command the two newlines it needs and then reads the command's output.
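Here is a complete, runnable sketch of the same pattern, with sort(1) as a harmless stand-in for the failing program, plus the waitpid call that prevents leaving a zombie process behind:

```perl
use strict;
use warnings;
use IPC::Open2;

# "sort" stands in for the real command: we feed it lines on stdin
# and read its output back, just like the "\n\n" case above.
my $pid = open2(my $chld_out, my $chld_in, 'sort');
print $chld_in "banana\napple\n";
close $chld_in;                  # EOF tells the child we are done writing
my @slines = <$chld_out>;
close $chld_out;
waitpid $pid, 0;                 # reap the child to avoid a zombie
```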

You could just pipe them in:
printf '\n\n' | yourcommand
(printf is used here because plain echo does not interpret \n escapes portably; in bash you would need echo -e.)

Related

CMD operations through Perl

I am trying to run some commands through Perl. One of the commands requires Enter to be pressed in the middle to complete.
I first tried this in Java but failed, so I thought it would be possible in Perl, but I'm not getting through.
$dir = "C:\\bip_autochain\\scripts";
chdir($dir) or die("Can't change to dir \n");
system("lcm_cli.bat -lcmproperty C:\\pl\\LCMBiar_Import.property");
sleep(5);
system("\n");
The system command above requires Enter to be pressed after some time, say 5 seconds.
My code doesn't accomplish this.
If you want to send data from your Perl script to a command launched in a subprocess, you need to open a filehandle piped into the program when launching it. Then wait the required time and send the data using print (or printf).
There is one huge caveat. If the external program opens the console terminal directly for input and does not read from stdin (i.e. to prompt for a password) you may not be able to send the data to the program.
For the standard case where the program reads from stdin:
$dir = "C:\\bip_autochain\\scripts";
chdir($dir) or die("Can't change to dir \n");
open(CMD, "|lcm_cli.bat -lcmproperty C:\\pl\\LCMBiar_Import.property");
# ^
# vertical bar, aka "pipe" symbol
sleep(5);
print CMD "\n";
...
close(CMD);  # when you are done sending data
The pipe symbol at the beginning of the command is a special form of open that connects the CMD filehandle to the command's stdin. This is described in the documentation for open.
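The same pattern can be written with a lexical filehandle and error checking, which is the more modern style. This is a sketch, not the original answer's code; `cat > piped_out.txt` is a stand-in (Unix-only) that records whatever we send, playing the role of lcm_cli.bat:

```perl
use strict;
use warnings;

# '|-' opens a handle piped to the command's stdin, like "|cmd" above.
open my $cmd_fh, '|-', 'cat > piped_out.txt'
    or die "Cannot start command: $!";
sleep 1;              # wait as long as the program needs
print $cmd_fh "\n";   # the simulated Enter key press
close $cmd_fh         # close also waits for the command to finish
    or warn "Command exited with status $?";
```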

Simulate pressing enter key in Perl

I have following perl script which saves the output from the command into a textfile
#!/usr/bin/perl
use warnings;
use strict;
use Term::ANSIColor;
my $cmd_backupset=`ssh user\@ip 'dsmadmc -id=username -password=passwd "q backupset"' >> output.txt`;
open CMD, "|$cmd_backupset" or die "Can not run command $!\n";
print CMD "\n";
close CMD;
The output of output.txt is this:
Text Text Text
...
more... (<ENTER> to continue, 'C' to cancel)
The script is still running in the terminal and when I press enter, the output.txt file gets the extra information. However, I must press enter more than 30 times to get the complete output. Is there a way to automate the script so when the last line in output.txt is more..., it simulates pressing enter?
I have tried with Expect (couldn't get it installed) and with echo -ne '\n'
Most interactive commands, like the one you are using, accept some flag or some command to disable pagination. Sometimes, connecting their stdin stream to something that is not a TTY (e.g. /dev/null) also works.
Just glancing over IBM dsmadmc docs, I see it accepts the option -outfile=save.out. Using it instead of standard shell redirection would probably work in your case.
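If the pager cannot be disabled, another Unix-only workaround (separate from the -outfile suggestion, and assuming the program reads its prompts from stdin) is to pipe an endless stream of empty lines into it, effectively pressing Enter as often as needed. Here head(1) stands in for the paginating command:

```perl
# yes '' prints empty lines forever; the pager consumes one per prompt.
# head -n 3 plays the role of a command that keeps asking for Enter.
my $output = `yes '' | head -n 3`;
# $output now holds three newlines, one per simulated key press
```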

How to execute perl program as well as give user input via same Windows batch file

My batch file looks like:
perl program.pl
input.txt
My Perl program asks for the input file via user input. But when I run my batch file, it just starts the Perl program and does not automatically supply "input.txt" on the next line as user input.
How can I get this done?
"Execute" your Batch file this way:
cmd < theFile.bat
For further details, see this post.
echo input.txt|perl program.pl
In your batch file you have two commands, one per line: one to start the Perl program and one to open the input.txt file.
If you want the batch file to send the string input.txt into the Perl program, use the echo command to send the string to stdout, and pipe that stdout into the stdin of the Perl program.
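For completeness, here is a sketch of what the input-reading side of program.pl might look like (the prompt text is invented; the point is that <STDIN> reads the piped line exactly as it would read a typed one):

```perl
#!/usr/bin/perl
use strict;
use warnings;

print "Enter input file name: ";   # prompt still prints, harmlessly
my $filename = <STDIN>;            # receives "input.txt\n" from echo
chomp $filename;                   # strip the trailing newline
print "Using file: $filename\n";
```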

Perl -- command executing inside a script hangs

When I run the following script, it does exactly what I want it to do and exits:
setDisplay.sh:
#!/bin/bash
Xvfb -fp /usr/share/fonts/X11/misc/ :22 -screen 0 1024x768x16 2>&1 &
export DISPLAY=:22
When I run ./setDisplay.sh, everything works fine.
OK, here's where the fun starts...
I have a Perl script that calls setDisplay...
Here is the eamorr.pl script:
#!/usr/bin/perl
use strict;
use warnings;
my $homeDir="/home/eamorr/Dropbox/site/";
my $cmd;
my $result;
print "-----Setting display...\n";
$cmd="sh $homeDir/setDisplay.sh";
print $cmd."\n";
$result=`$cmd`;
print $result;
It just hangs when I run ./eamorr.pl
I'm totally stuck...
When you do this:
$result=`$cmd`;
a pipe is created connecting the perl process to the external command, and perl reads from that pipe until EOF.
Your external command creates a background process which still has the pipe on its stdout (and also its stderr since you did 2>&1). There will be no EOF on that pipe until the background process exits or closes its stdout and stderr or redirects them elsewhere.
If you intend to collect the stdout and stderr of Xvfb into the perl variable $result, you'll naturally have to wait for it to finish. If you didn't intend that, I can't guess what you were trying to do with the 2>&1.
Also a script that ends with an export command is suspect. It can only modify its own environment, and then it immediately exits so there's no noticeable effect. Usually that's a sign that someone is trying to modify the parent process's environment, which is not possible.
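The fix follows from those two points: redirect the background process's output away from the pipe, and set DISPLAY in the Perl process itself. Here is a demonstration of the principle with a backgrounded sleep(1) standing in for Xvfb (a sketch, assuming a Unix-like shell):

```perl
# Without the >/dev/null 2>&1 on the background job, these backticks
# would block for the full 5 seconds waiting for EOF, because the
# backgrounded sleep would keep the pipe's write end open.
my $result = `sh -c 'sleep 5 >/dev/null 2>&1 & echo started'`;
# The export in setDisplay.sh cannot reach us; set DISPLAY here instead.
$ENV{DISPLAY} = ':22';
print $result;     # "started" appears immediately, no hang
```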

Calling command from Perl, need to see output

I need to call some shell commands from perl. Those commands take quite some time to finish so I'd like to see their output while waiting for completion.
The system function does not give me any output until it is completed.
The exec function gives output; however, it never returns, replacing the Perl process from that point, which is not what I wanted.
I am on Windows. Is there a way to accomplish this?
Backticks, or the qx operator, run a command in a separate process and return its output:
print `$command`;
print qx($command);
If you wish to see intermediate output, use open to create a handle to the command's output stream and read from it.
open my $cmd_fh, "$command |";  # the trailing | pipes the command's
                                # output into the handle
while (<$cmd_fh>) {
print "A line of output from the command is: $_";
}
close $cmd_fh;
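A self-contained variant of that loop, with error handling added: the perl -e one-liner stands in for a slow command that writes to both stdout and stderr, and the 2>&1 (sh-style redirection, assumed here) merges its stderr into the stream so error lines are seen live as well.

```perl
use strict;
use warnings;

# Stand-in for your real command: prints one line to stdout, one to stderr.
my $command = q{perl -e "print qq(out\n); warn qq(err\n)"};
$| = 1;                              # flush our own prints immediately
open my $cmd_fh, "$command 2>&1 |"   # merge stderr into the piped stream
    or die "Cannot start command: $!";
while (<$cmd_fh>) {
    print "A line of output from the command is: $_";
}
close $cmd_fh;
```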