Passing PostgreSQL psql error messages to unix shell variables - postgresql

I am calling a PostgreSQL function from a Unix shell script.
Can anyone tell me how to capture the PostgreSQL error messages returned by the function in Unix shell variables?
I used the method below, but it does not capture the messages:
#!/bin/ksh
function crt_views
{
#Call the PostgreSQL pkg which creates dynamic views
echo "Calling PostgreSQL function FN_CRT_ACTNET_VWS..." >> $LOGFILENAME
pg_msg=`psql --echo-all -U<uname> << EOF
set client_min_messages='NOTICE';
SELECT FUNC_CRT_VWS();
EOF`
}
crt_views
echo "PRINTING - $pg_msg"
In the output I'm not seeing the errors returned from the PostgreSQL function. Please help.

You need to redirect stderr to stdout, since backticks capture only stdout.
This is because output from both RAISE NOTICE and RAISE EXCEPTION (among other things) is printed to stderr by psql.
Note: I also switched from backticks to dollar-paren style command substitution; with backticks, redirecting stderr this way doesn't work. (My testing was under sh rather than ksh, but I suspect the behavior is similar.) The 2>&1 must appear on the psql command line itself; placed on its own line after the here-document terminator, it is parsed as a separate, empty command and redirects nothing.
Also, set -e makes the script fail on error, which may be desirable (depending on your needs).
So that leads to something like this:
#!/bin/ksh
set -e
function crt_views
{
#Call the Postgres pkg which created dynamic views
echo "Calling Postgres function FN_CRT_ACTNET_VWS..." >> $LOGFILENAME
pg_msg=$(psql --echo-all -U<uname> 2>&1 << EOF
set client_min_messages='NOTICE';
SELECT FUNC_CRT_VWS();
EOF
)
}
crt_views
echo "PRINTING - $pg_msg"

Related

From Perl, spawn a shell, configure it, and fork the STDOUT

I use a Perl script to configure and spawn a compiled program that needs a subshell configured a certain way, so I use $returncode = system("ulimit -s unlimited; sg ourgroup 'MyExecutable.exe'");
I want to capture and parse the STDOUT from that, but I need it forked, so that the output can be checked while the job is still running. This question comes close:
How can I send Perl output to both STDOUT and a variable? The highest-rated answer describes a function called backtick() that creates a child process, captures STDOUT, and runs a command in it with exec().
But the calls I have require multiple lines to configure the shell. One solution would be to create a disposable shell script:
#!/bin/sh
# disposable.sh
ulimit -s unlimited
sg ourgroup 'MyExecutable.exe'
I could then get what I need either with backtick("disposable.sh") or open(PROCESS, 'disposable.sh |').
But I'd really rather not make a scratch file for this. system() happily accepts multi-line command strings. How can I get exec() or open() to do the same?
If you want to use the shell's power (that includes loops and variables, but also multiple command execution), you have to invoke the shell explicitly (open(..., 'xxx|') doesn't do that).
You can pass your shell script to the shell with its -c option (another possibility would be to pipe the commands to the shell, but that's more difficult, IMHO).
That means calling the backtick function from the other answer like this:
backtick("sh", "-c", "ulimit -s unlimited; sg ourgroup 'MyExecutable.exe'");
Using the system tee with backticks will do this, no?
my $output = `(ulimit -s unlimited; sg ourgroup 'MyExecutable.exe') | tee /dev/tty`;
or modify Alnitak's backticks (so it does use a subshell)?
my $cmd = "ulimit -s unlimited; sg ourgroup 'MyExecutable.exe'";
my $pid = open(CMD, "($cmd) |");
my $output;
while (<CMD>) {
print STDOUT $_;
$output .= $_;
}
close CMD;
Expect should be used, since you are interacting with your program: http://metacpan.org/pod/Expect
Assuming the /bin/bash prompt on your *nix matches something like bash-3.2$, the program below can be used to launch a number of commands on the bash console with $exp->send; the output of each command can then be parsed for further action.
#!/usr/bin/perl
use Expect;

my $command = "/bin/bash";
my $exp = Expect->new;
$exp->raw_pty(1);
$exp->spawn($command);
$exp->expect(5, '-re', 'bash.*$');

$exp->send("who \n");
$exp->expect(10, '-re', 'bash.*$');
my @output = $exp->before();
print "Output of who command is @output \n";

$exp->send("ls -lt \n");
$exp->expect(10, '-re', 'bash.*$');
@output = $exp->before();
print "Output of ls command is @output \n";

How to get return value of a subroutine in a perl script to bash

I have written a shell script that captures the output of a .jar file and assigns it to a variable.
rewards_generator.sh
APP_ROOT=/home/testApps/
JAR=${APP_ROOT}ClusterGenerator/generator.jar
#get clusters
clusters=$(loadClusters $1)
for i in `echo $clusters | sed 's/,/ /g'`
do
#pull cluster records from database and save query return status to $x
x=$(/usr/IBM/WebSphere/ProcServer/java/bin/java ${JAR} ${i/-/_} 2>&1)
done
Basically, what I've done in the Java app is System.out.print the query return status, then used 2>&1 in bash to capture that output stream and assign the value to a shell script variable.
Now how can I get the return value of a Perl script and assign it to a shell script variable? Is it the same as what I've done above, or is there another approach?
You can use backticks to record the output of an external command in a bash script.
Here's a simple example:
#!/bin/bash
# Execute the script, recording output to a variable
x=`/path/to/script.pl`
# Display or act on the output some time later
echo "script output: $x"
Now how can I get the return value of a perl script and assign it to a shell script variable?
#! /bin/bash
perl script.pl
return_value=$?
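For completeness, a minimal sketch of what such a script.pl might look like; the printed value and exit status are made up for illustration:
#!/usr/bin/perl
use strict;
use warnings;

print "42\n";   # stdout: captured by x=`/path/to/script.pl`
exit 3;         # exit status: visible to the shell as $?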

troubles while redirecting stderr in csh

I'm writing a Perl script that should execute commands in a shell and parse their output. As the shell, I intend to use csh. I started with this:
my $out = `cmd`;
but it doesn't capture STDERR, which I need too. Running sh with output redirection does nothing:
my $out = `sh -c "cmd 2>&1"`
still captures only STDOUT, not STDERR.
Even redirecting to file in csh doesn't work for me
tcsh$ cmd >& logfile.log
still captures STDOUT only %)
The command I'm trying to execute is actually an sh script, and some commands in this script print to STDERR; I want to capture that output. If I execute sh -c "cmd 2>/dev/null", STDERR actually goes to /dev/null and only STDOUT is printed in the terminal.
Could anyone help me with this?
I believe there is something you are not telling us. Are you on Cygwin? Or Windows? Do you have a PERL5SHELL environment variable set?
There is something you are not telling us, because both of these work fine on the five platforms I can easily test on:
% perl -le '$out = `sh -c "grep missing /dev/nowhere 2>&1" | cat -n`; chomp $out; print "got <<<$out>>>"'
got <<< 1 grep: /dev/nowhere: No such file or directory>>>
But in fact, there is no reason to call sh(1) explicitly for shelling out. That’s because Perl always calls sh(1) for all its backtick, pipe-open, and system() shell-outs:
% perl -le '$out = `grep missing /dev/nowhere 2>&1 | cat -n`; chomp $out; print "got <<<$out>>>"'
got <<< 1 grep: /dev/nowhere: No such file or directory>>>
The only exception to this I can think of occurs on non-Unix systems where, because they have no /bin/sh, something else is defined.
But under no circumstances will simple shell-outs be calling tcsh(1) behind your back. You’d’ve had to’ve seriously hacked the perl(1) source to get that to happen. I also rather doubt you could (easily) hack the binary, since the string "/bin/tcsh" is going to be longer than "/bin/sh", and it isn’t very often going to be found in /bin/ anyway.
That you can’t get stderr redirection working even from the shell says something pretty weird is going on. I think we need more information.
Here, you are capturing the STDOUT of sh, which is not the STDERR of cmd:
my $out = `sh -c "cmd 2>&1"`;
Can you just run cmd directly?
my $out = `cmd 2>&1`;
Backquotes capture STDOUT, not STDERR.
system() will send both stdout and stderr to wherever the parent's are pointing.
If you want to capture STDERR, you need something like IPC::Open3:
Extremely similar to open2(), open3() spawns the given $cmd and connects CHLD_OUT for reading from the child, CHLD_IN for writing to the child, and CHLD_ERR for errors. If CHLD_ERR is false, …
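A minimal sketch of reading both streams this way; cmd is a stand-in for the real command, and note that draining one stream completely before the other can deadlock if the child writes heavily to both:
use strict;
use warnings;
use IPC::Open3;
use Symbol qw(gensym);

my $err = gensym;   # CHLD_ERR must be a real glob, not an autovivified lexical
my $pid = open3(my $in, my $out, $err, 'cmd');
close $in;          # nothing to send to the child

my @stdout = <$out>;   # child's STDOUT
my @stderr = <$err>;   # child's STDERR
waitpid($pid, 0);      # reap the child; its exit status lands in $?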
You said that running the command cmd >& logfile.log in tcsh sends only cmd's stdout to the log file, not its stderr. That doesn't make sense.
Try replacing cmd with the following script:
#!/bin/sh
echo stdout
echo STDERR 1>&2
Both "stdout" and "STDERR" should show up in logfile.log.
If so, then perhaps your "cmd" is doing something odd. My best guess is that cmd is writing to /dev/tty, not to either stdout or stderr; that wouldn't be affected by redirection.
To see what I mean, add this line to the above script:
echo tty > /dev/tty
I don't really have time to mock up an example as I normally would, nor even to test one. I am thinking that you might try Capture::Tiny to see if that helps.
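A minimal sketch of what that might look like (untested, per the caveat above; cmd is a stand-in for the real command):
use strict;
use warnings;
use Capture::Tiny qw(capture);

# Run the command once; stdout, stderr, and the block's return value
# (here, system's status) come back as separate values.
my ($stdout, $stderr, $status) = capture {
    system('cmd');
};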

How can I suppress system output when using nohup from Perl?

In Perl I am starting a process using the nohup command. The command is below:
system("nohup myproc pe88 &");
This works fine and the process starts as expected. However, I would like to suppress the following output of this command:
Sending output to nohup.out
I must have this process redirect all of its output to nohup.out; I just don't want that message displayed when I run my Perl program. I want to print my own user-friendly message instead. I've tried a few variants, but nothing has worked for me yet.
"Sending output to nohup.out" message is sent to STDERR, so you can catch the STDERR via the usual methods
either via shell: system("nohup myproc pe88 2> /tmp/error_log.txt &");
Use /dev/null instead of /tmp/error_log.txt if you don't need stderr at all; and add "> /tmp/myout.txt" to redirect stdout.
Or by capturing via Perl (don't use system() call, instead use IPC::Open3 or capture command from IPC::System::Simple)
How about:
system("nohup myproc pe88 >nohup.out 2>&1 &");
The man page for nohup says:
If standard output is a terminal, append output to 'nohup.out' if possible, '$HOME/nohup.out' otherwise. If standard error is a terminal, redirect it to standard output. To save output to FILE, use `nohup COMMAND > FILE'.
So if you explicitly redirect STDOUT and STDERR to nohup.out, nohup doesn't print that message. Granted, you don't get the automatic fallback to $HOME/nohup.out if nohup.out is unwritable, but you can check for that first if it's an issue.
Note that if you redirect just STDOUT, nohup prints a "redirecting stderr to stdout" message.
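Putting that together with the goal of a friendlier message, a minimal sketch (the message text is made up):
# Print our own message instead of letting nohup announce itself.
print "Starting myproc pe88 in the background; output goes to nohup.out\n";

# With both streams explicitly redirected, nohup stays quiet.
system("nohup myproc pe88 >nohup.out 2>&1 &");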

Missing output when running system command in perl/cgi file

I need to write a CGI program and it will display the output of a system command:
script.sh
echo "++++++"
VAR=$(expect -c "
  spawn ssh -o StrictHostKeyChecking=no $USER@$HOST $CMD
  match_max 100000
  expect \"*?assword:*\"
  send -- \"$PASS\r\"
  send -- \"\r\"
  expect eof
")
echo $VAR
echo "++++++"
In CGI file:
my $command = "ksh ../cgi-bin/script.sh";
my @output = `$command`;
print @output;
Finally, when I run the CGI file from the Unix command line, $VAR is a very long string including \n and some delimiters. However, when I run it on the web server, the output is:
++++++
++++++
So $VAR goes missing when the page is served through the web interface/browser.
I suspect the problem is that $VAR is a very long string.
Anyway, is there any way to solve this problem other than writing the output to a file and then retrieving it from the browser?
Thanks if you are interested in my question.
script.sh uses several environment variables: $USER, $HOST, $CMD and $PASS. The CGI environment will have different environment variables set than a login shell. You may need to set these variables from your CGI script before calling script.sh.
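In Perl that means assigning to %ENV before shelling out; a minimal sketch with hypothetical values:
# Hypothetical values; a real script should pull these from configuration,
# and hard-coding passwords in a CGI script is risky.
$ENV{USER} = 'appuser';
$ENV{HOST} = 'db01.example.com';
$ENV{CMD}  = 'uptime';
$ENV{PASS} = 'secret';

my @output = `ksh ../cgi-bin/script.sh 2>&1`;
print @output;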
Try finding where the commands like expect and ssh that you are calling live on your system, and add their directories to the PATH used by your script.
I.e., if
which expect
returns /usr/bin/expect, then add the line:
PATH=$PATH:/usr/bin && export PATH
near the beginning of the ksh script. While debugging you may also want to redirect stderr to a file by appending 2>/tmp/errors.txt to the end of your command, since stderr is not shown in the browser:
my $command= "ksh ../cgi-bin/script.sh 2>/tmp/errors.txt";