STDERR output in Asterisk CLI - perl

How can I see STDERR output in the Asterisk CLI? I found that the STDERR output is visible in the original Asterisk terminal, but it cannot be seen in the CLI obtained with asterisk -cvvvvvvvvvr. I want to see the error messages from my Perl AGI script (warn "text").

You can't see it.
The reason: STDERR is sent to the stderr handle of the Asterisk process. When you connect to the Asterisk console, you are talking to a different process, which has its own stderr handle.
So if you want to see the errors, you need to set up your Asterisk startup script to store them in a file, or edit the default script /usr/sbin/safe_asterisk to suit your needs.
Actually, if you read the AGI specification you will see that your script should send error messages to STDOUT, preferably using the AGI VERBOSE command. That can be achieved by redirecting STDERR to STDOUT in the script, or by writing a special handler/wrapper, as sketched below.
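For example, a minimal sketch of such a wrapper, assuming the Asterisk::AGI module from CPAN is installed: it routes Perl warnings to the Asterisk console through the AGI VERBOSE command, so they become visible in a remote CLI session.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Asterisk::AGI;

    my $agi = Asterisk::AGI->new;
    $agi->ReadParse();

    # Send warn() output to the Asterisk console instead of the
    # process's stderr, so it shows up under asterisk -r.
    $SIG{__WARN__} = sub {
        my $msg = shift;
        chomp $msg;
        $agi->verbose($msg, 1);
    };

    warn "text";   # now appears in the CLI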

Related

Can I redirect stdout to a file and the console at the same time

I have a Linux application running on an embedded board. I want to redirect stdout to a text file,
so I use the function below:
file = freopen("logs.txt","w",stdout);
After capturing, I restore the console with:
freopen("/dev/ttyAM0", "w", stdout);
My question is:
if I want to see my stdout on the console while capturing/redirecting it to the file, is that possible, and if so, how?
I also tried the dup2() API but did not succeed.
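One common answer is to pipe the output through tee(1), which copies its input to both a file and the console. A minimal sketch of the idea, written in Perl to match the rest of the thread (from C the same thing is available via popen("tee logs.txt", "w")):

    # Reopen STDOUT as a pipe into tee, which writes to the file
    # and echoes to the console at the same time.
    open(STDOUT, '|-', 'tee', 'logs.txt') or die "cannot fork tee: $!";
    $| = 1;   # flush each print immediately
    print "this line goes to both logs.txt and the console\n";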

How to send stderr in an email from a shell script (ash)

I wrote a shell script that I use under ash, and I redirect stderr and stdout to a log file. I would like that log file to be emailed to me only if stderr is not empty.
I tried:
exec >mylog.log 2>&1
# Perform various find commands
if [TEST_IF_STDERR_NOT_EMPTY]; then
/usr/bin/mail -s "mylog" email@mydomain.com < mylog.log
fi
My question is twofold:
1- I get a -sh: /usr/bin/mail: not found error. It seems the mail command doesn't exist under ash (or at least on my Linux box, which is a Synology NAS). What would be the alternative? Worst case, Perl is available, but I would prefer to use standard sh commands.
2- How do I test that stderr is not empty?
Thanks
How to check if file is empty in bash
As for the first question: in your code you are calling mail, but lower in the post you refer to email. Check your code and make sure it is mail.
Use which mail to get the full path; maybe it is not installed in /usr/bin/.
Use find to locate mail.
If you can switch to another shell, run it and then execute which mail to get the full path of mail, in case the path is set up in that alternative shell.
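For the second question: send stderr to its own file, then test whether that file is non-empty. A sketch in Perl, since the poster notes it is available; the find command, the sendmail path, and the address are placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Run the find commands with stdout going to the log
    # and stderr going to a separate file.
    system('find /some/dir -name "*.tmp" >mylog.log 2>myerr.log');

    # -s is true when a file exists and has non-zero size.
    if (-s 'myerr.log') {
        open my $mail, '|-', '/usr/sbin/sendmail -t'
            or die "cannot run sendmail: $!";
        print $mail "To: email\@mydomain.com\n";
        print $mail "Subject: mylog\n\n";
        open my $log, '<', 'mylog.log' or die "cannot read log: $!";
        print $mail $_ while <$log>;
        close $log;
        close $mail;
    }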

Profiling a Perl CGI script that times-out

I have a Perl CGI application that sometimes times out, causing it to be killed by Apache and a 504 Gateway Time-out error to be sent to the browser. I am trying to profile this application using NYTProf; however, I cannot read the profile data:
$ nytprofhtml -f www/cgi-local/nytprof.out
Reading www/cgi-local/nytprof.out
Profile data incomplete, inflate error -5 ((null)) at end of input file, perhaps the process didn't exit cleanly or the file has been truncated (refer to TROUBLESHOOTING in the documentation)
I am using the sigexit=1 NYTProf option. Here's a minimal CGI script that reproduces the problem:
#!/usr/bin/perl -d:NYTProf
sleep 1 while 1;
Setting sigexit=1 tells NYTProf to catch the following signals:
INT HUP PIPE BUS SEGV
However, when your CGI script times out, Apache sends SIGTERM. You need to catch SIGTERM:
sigexit=term
To catch SIGTERM in addition to the default signals, use:
sigexit=int,hup,pipe,bus,segv,term
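These options are passed through the NYTPROF environment variable, where they are separated by colons. A sketch of a command-line test run, assuming the script is named my_script.cgi:

    NYTPROF=sigexit=int,hup,pipe,bus,segv,term perl -d:NYTProf my_script.cgi
    nytprofhtml -f nytprof.out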
CGI.pm has a debug mode, which you can use to run your program from the command line, and pass your CGI parameters as key/value pairs.
It has another feature that you can use to save your params to a file, and then read that file back in later.
What I've done is add code to save the params to a file, then run my program via a browser. This also lets me make sure that the browser is sending the correct data.
Then I change the code to read the params from the file, and run it as often as I need until I have everything else debugged.
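A sketch of that save-and-restore pattern using CGI.pm's documented save method (the file path and field name are arbitrary):

    use strict;
    use warnings;
    use CGI;

    # First run (via the browser): save the incoming params to a file.
    my $q = CGI->new;
    open my $out, '>', '/tmp/params.txt' or die "cannot save params: $!";
    $q->save($out);
    close $out;

    # Later runs (from the command line): rebuild the query from the file.
    open my $in, '<', '/tmp/params.txt' or die "cannot read params: $!";
    my $saved = CGI->new($in);
    close $in;
    print $saved->param('some_field');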
Once you've got the program running to your satisfaction from the command line, you can run it via nytprof to figure out what is taking all the time.

Automating an FTP session

I have the following excerpt from a Perl script that automates an FTP session; I'm hoping someone can explain how it works.
system("rsh some_server ftp -in ftp.something.com << !
user anonymous someone\@somewhere.org
some ftp commands
bye");
The background: this Perl script runs on a Linux machine and remotes into a Solaris machine. The FTP session must be executed from the Solaris machine because the FTP site performs IP address checking.
Formerly this script ran on the Solaris machine directly (i.e. it didn't use rsh). I hacked it around and came up with this, which seems to work. However, I have little idea how; in particular, I don't understand the << ! bit at the end of the first line. It looks a little like a here-document, but I'm not really sure.
Any explanations welcome.
You are right, << is a heredoc, which is made clear by the following warning (which I get when I take out the rsh command):
sh: line 2: warning: here-document at line 0 delimited by end-of-file (wanted `!')
The construct
<< HEREDOC
reads everything from the marker up to a line containing only HEREDOC (or up to end-of-file) as standard input. When you put this after a command, it is equivalent to
command < file
where file contains the text of the heredoc. In your case the delimiter is ! instead of HEREDOC, so the ! itself is not passed to ftp but everything after it is. This is equivalent to
$ cat file
user anonymous someone@somewhere.org
some ftp commands
bye
$ ftp -in ftp.something.com < file
rsh takes that entire command and runs it on your remote host.
As illustrated by user1146334's answer, this command does not follow the principle of least surprise. At the very least, make it less confusing by changing it to
system("rsh some_server ftp -in ftp.something.com << HEREDOC
user anonymous someone\@somewhere.org
some ftp commands
bye
HEREDOC");
Or even better, as mpapec mentioned in the comments, use Net::FTP and Net::SSH2.
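A minimal Net::FTP sketch of the same session (the remote directory and file name are hypothetical, and it would still need to run on the Solaris host, since the FTP site checks the client's IP address):

    use strict;
    use warnings;
    use Net::FTP;

    my $ftp = Net::FTP->new('ftp.something.com', Passive => 1)
        or die "cannot connect: $@";
    $ftp->login('anonymous', 'someone@somewhere.org')
        or die "cannot login: ", $ftp->message;
    $ftp->cwd('/pub')           # hypothetical directory
        or die "cannot cwd: ", $ftp->message;
    $ftp->get('somefile.txt')   # hypothetical file
        or die "get failed: ", $ftp->message;
    $ftp->quit;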
Did you look at the man page?
-i Turns off interactive prompting during multiple file transfers.
-n Restrains ftp from attempting "auto-login" upon initial connection. If auto-login is enabled, ftp will check the .netrc (see netrc(5)) file in the user's home directory for an entry describing an account on the remote machine. If no entry exists, ftp will prompt for the remote machine login name (default is the user identity on the local machine) and, if necessary, prompt for a password and an account with which to login.
The client host and an optional port number with which ftp is to communicate may be specified on the command line. If this is done, ftp will immediately attempt to establish a connection to an FTP server on that host; otherwise, ftp will enter its command interpreter and await instructions from the user. When ftp is awaiting commands from the user, the prompt ‘ftp>’ is provided to the user. The following commands are recognized by ftp:
! [command [args]]
Invoke an interactive shell on the local machine. If there are arguments, the first is taken to be a command to execute directly, with the rest of the arguments as its arguments.
So essentially you're ftp'ing in and providing one command per line inline instead of from a file.

STDOUT redirected externally and no output seen at the console

I have a program that reads the output of an external application. The external app produces a set of output lines. My program reads from the external app with while ($line = <handle to external app>) and prints each line to STDOUT. But print STDOUT $line prints only some of the lines: once an error occurs, printing to STDOUT stops working, yet my additional logging statement push @arr, $line has stored the complete output from the external app. From this I concluded that STDOUT is not working properly when an error happens.
E.g., if the external app's output is:
Starting command
First command executed successfully
Error:123 :next command failed
Program terminated
Here, STDOUT prints only:
Starting command
First command executed successfully
But if I check the array, it has the complete output including the error details. So I guessed STDOUT had been redirected or lost.
So I tried storing STDOUT in $old_handle at the beginning of the program using open, and then tried to restore it before the print statement using select($old_handle) (thinking that something redirects STDOUT when the error happens).
But I was not successful. I don't know what is wrong here. Please help me.
It's possible the output is being buffered. Try setting
$| = 1;
at the start of your program. This will cause the output to be displayed straight away, rather than being buffered for later.
Just a guess: maybe the error output doesn't go to STDOUT. Use a redirect:
first_program |& perl_program
or
first_program 2>&1 | perl_program
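The same merge can also be done from inside the Perl program when it opens the pipe itself. A sketch, with external_app standing in for the real command:

    use strict;
    use warnings;

    $| = 1;   # flush STDOUT after every print

    my @arr;
    # The 2>&1 merges the child's stderr into the pipe we read from.
    open my $app, '-|', 'external_app 2>&1'
        or die "cannot run external_app: $!";
    while (my $line = <$app>) {
        print STDOUT $line;   # error lines are now included
        push @arr, $line;
    }
    close $app;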