How to pass command line arguments to gnuplot?

I want to use gnuplot to draw a figure from a data file, say foo.data. Currently, I hardcode the data file name in the command file, say foo.plg, and run gnuplot foo.plg to plot the data. However, I want to pass the data file name as a command line argument, e.g. running gnuplot foo.plg foo.data. How can I parse command line arguments in a gnuplot script file? Thanks.

You can pass variables in via the -e switch:
$ gnuplot -e "filename='foo.data'" foo.plg
In foo.plg you can then use that variable
$ cat foo.plg
plot filename
pause -1
To make "foo.plg" a bit more generic, use a conditional:
if (!exists("filename")) filename='default.dat'
plot filename
pause -1
Note that -e has to precede the script file name, otherwise the file runs before the -e statements take effect. In particular, running a script that has a gnuplot shebang (#!/usr/bin/env gnuplot) as ./foo.plg -e ... will not use the -e arguments provided on the command line.
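For example, a minimal sketch of the ordering difference (using the same foo.plg as above):
$ gnuplot -e "filename='foo.data'" foo.plg   # filename is defined before foo.plg runs
$ gnuplot foo.plg -e "filename='foo.data'"   # foo.plg runs first, so filename is still undefined inside it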

Since version 5.0, you can pass arguments to a gnuplot script with the flag -c. These arguments are accessed through the variables ARG0 to ARG9, where ARG0 is the script name and ARG1 to ARG9 are string variables holding the arguments. The number of arguments is given by ARGC.
For example, the following script ("script.gp")
#!/usr/local/bin/gnuplot --persist
THIRD=ARG3
print "script name : ", ARG0
print "first argument : ", ARG1
print "third argument : ", THIRD
print "number of arguments: ", ARGC
can be called as:
$ gnuplot -c script.gp one two three four five
script name : script.gp
first argument : one
third argument : three
number of arguments: 5
or within gnuplot as
gnuplot> call 'script.gp' one two three four five
script name : script.gp
first argument : one
third argument : three
number of arguments: 5
In gnuplot 4.6.6 and earlier, there exists a call mechanism with a different (now deprecated) syntax. The arguments are accessed through $#, $0,...,$9. For example, the same script above looks like:
#!/usr/bin/gnuplot --persist
THIRD="$2"
print "first argument : ", "$0"
print "second argument : ", "$1"
print "third argument : ", THIRD
print "number of arguments: ", "$#"
and it is called within gnuplot as (remember, version 4.6.6 or earlier):
gnuplot> call 'script4.gp' one two three four five
first argument : one
second argument : two
third argument : three
number of arguments: 5
Notice there is no variable for the script name, so $0 is the first argument, and the variables are referenced inside quoted strings. There is no way to use this directly from the command line, only through tricks like the one suggested by @con-fu-se.
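One such trick (a sketch of my own, not the exact wording of that suggestion) is to feed a call line to gnuplot on its standard input from the shell:
$ echo "call 'script4.gp' one two three four five" | gnuplot -persist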

You can also pass information in through the environment as is suggested here. The example by Ismail Amin is repeated here:
In the shell:
export name=plot_data_file
In a Gnuplot script:
#! /usr/bin/gnuplot
name=system("echo $name")
set title name
plot name using ($16 * 8):20 with linespoints notitle
pause -1
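If you prefer not to export the variable into your whole shell session, most shells also let you set it for a single invocation; a sketch, assuming the script above is saved as plot.gp:
$ name=plot_data_file gnuplot plot.gp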

The answer by Jari Laamanen is the best solution. I just want to explain how to use more than one input parameter with shell variables:
output=test1.png
data=foo.data
gnuplot -e "datafile='${data}'; outputname='${output}'" foo.plg
and foo.plg:
set terminal png
set output outputname
f(x) = sin(x)
plot datafile
As you can see, multiple parameters are passed separated by semicolons (like in bash scripts), but string variables NEED to be enclosed in ' ' (gnuplot syntax, NOT bash syntax).

You may use a trick in a unix/linux environment:
In the gnuplot program: plot "/dev/stdin" ...
On the command line: gnuplot program.plot < data.dat
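A minimal sketch of that trick (the file names and columns here are just placeholders):
$ cat program.plot
set terminal png
set output 'out.png'
plot "/dev/stdin" using 1:2 with lines
$ gnuplot program.plot < data.dat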

This question is well answered but I think I can find a niche to fill here regardless, if only to reduce the workload on somebody googling this like I did. The answer from vagoberto gave me what I needed to solve my version of this problem and so I'll share my solution here.
I developed a plot script in an up-to-date environment which allowed me to do:
#!/usr/bin/gnuplot -c
set terminal png truecolor transparent crop
set output ARG1
set size 1, 0.2
rrLower = ARG2
rrUpper = ARG3
rrSD = ARG4
resultx = ARG5+0 # Type coercion required for data series
resulty = 0.02 # fixed
# etc.
This executes perfectly well from command-line in an environment with a recent gnuplot (5.0.3 in my case).
$ ./plotStuff.gp 'output.png' 2.3 6.7 4.3 7
When uploaded to my server and executed, it failed because the server version was 4.6.4 (current on Ubuntu 14.04 LTS).
The below shim solved this problem without requiring any change to the original script.
#!/bin/bash
# gnuplot versions before 5.0 don't support direct command line arguments (-c / ARG1..ARGn).
# This script backfills the functionality transparently.
SCRIPT="plotStuff.gp"
ARG1=$1
ARG2=$2
ARG3=$3
ARG4=$4
ARG5=$5
ARG6=$6
gnuplot -e "ARG1='${ARG1}'; ARG2='${ARG2}'; ARG3='${ARG3}'; ARG4='${ARG4}'; ARG5='${ARG5}'; ARG6='${ARG6}'" $SCRIPT
The combination of these two scripts allows parameters to be passed from bash to gnuplot scripts without regard to the gnuplot version and in basically any *nix.
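For example, assuming the shim above is saved as plotStuff.sh next to plotStuff.gp (the file name is my assumption) and made executable, it is invoked exactly like the gnuplot 5 version:
$ chmod +x plotStuff.sh
$ ./plotStuff.sh 'output.png' 2.3 6.7 4.3 7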

You could even do some shell magic, e.g. like this:
#!/bin/bash
inputfile="${1}" #you could even do some getopt magic here...
################################################################################
## generate a gnuplotscript, strip off bash header
gnuplotscript=$(mktemp /tmp/gnuplot_cmd_$(basename "${0}").XXXXXX.gnuplot)
firstline=$(grep -m 1 -n "^#!/usr/bin/gnuplot" "${0}")
firstline=${firstline%%:*} #remove everything after the colon
sed -e "1,${firstline}d" < "${0}" > "${gnuplotscript}"
################################################################################
## run gnuplot
/usr/bin/gnuplot -e "inputfile=\"${inputfile}\"" "${gnuplotscript}"
status=$?
if [[ ${status} -ne 0 ]] ; then
echo "ERROR: gnuplot returned with exit status $?"
fi
################################################################################
## cleanup and exit
rm -f "${gnuplotscript}"
exit ${status}
#!/usr/bin/gnuplot
plot inputfile using 1:4 with linespoints
#... or whatever you want
My implementation is a bit more complex (e.g. replacing some magic tokens in the sed call, while I am already at it...), but I simplified this example for better understanding.
You could also make it even simpler.... YMMV.
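Usage then looks like this (a sketch, assuming the combined file above is saved as plot.sh and data.dat has at least four columns):
$ chmod +x plot.sh
$ ./plot.sh data.dat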

In the shell, write
gnuplot -persist -e "plot 'filename1.dat', 'filename2.dat'"
listing consecutively the files you want (note that the file names must be quoted inside the plot command).
-persist is used to make the gnuplot window stay open until the user closes it manually.

@vagoberto's answer seems the best IMHO if you need positional arguments, and I have a small improvement to add.
vagoberto's suggestion:
#!/usr/local/bin/gnuplot --persist
THIRD=ARG3
print "script name : ", ARG0
print "first argument : ", ARG1
print "third argument : ", THIRD
print "number of arguments: ", ARGC
which gets called by:
$ gnuplot -c script.gp one two three four five
script name : script.gp
first argument : one
third argument : three
number of arguments: 5
For lazy typers like myself, one can make the script executable (chmod 755 script.gp)
and then use the following:
#!/usr/bin/env gnuplot -c
THIRD=ARG3
print "script name : ", ARG0
print "first argument : ", ARG1
print "third argument : ", THIRD
print "number of arguments: ", ARGC
and execute it as:
$ ./imb.plot a b c d
script name : ./imb.plot
first argument : a
third argument : c
number of arguments: 4

Yet another way is this:
You have a gnuplot script named scriptname.gp:
#!/usr/bin/gnuplot -p
# This code is in the file 'scriptname.gp'
EPATH = $0
FILENAME = $1
plot FILENAME
Now you can call the gnuplot script scriptname.gp with this convoluted piece of syntax:
echo "call \"scriptname.gp\" \"'.'\" \"'data.dat'\"" | gnuplot


Passing a gnuplot variable to sed in gnuplot script

I have a gnuplot script which accepts 3 command line arguments
Here is an example of a command:
gnuplot> call 'BudgetRowStacked.gnu' "Fonctionnement" "2017" "545000"
From the arguments I build a file name:
file="Dépenses".ARG1.ARG2.".dat"
At the moment I use the hardcoded filename in commands such as
values="`sed -n -e 4p DépensesFonctionnement2017.dat`"
values2="`sed -n -e 2p DépensesFonctionnement2017.dat`"
I would like to use the file variable instead of the hardcoded names in the sed commands. How can I do this?
One option would be to put something like this into BudgetRowStacked.gnu:
file="Dépenses".ARG1.ARG2.".dat"
getData(fName, row)=system(sprintf("sed -n -e %dp %s", row, fName))
values = getData(file, 4)
values2 = getData(file, 2)
Here, sprintf first constructs the command of interest as a string and passes this to system which executes it and returns its output.
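To make that concrete, this is roughly the shell command that getData(file, 4) ends up running with the example arguments above (ARG1 = "Fonctionnement", ARG2 = "2017"):
sed -n -e 4p DépensesFonctionnement2017.dat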

Perl: 3 or 4 line commands via backticks

How can I use this code to run more commands? The current version does two. How can I do three, four, or more?
my $startprocess = `(echo "y" | nohup myprocess) &`;
The original question answered by user DVK:
Can I execute a multiline command in Perl's backticks?
Edit: thanks for the reply, Sebastian.
I have to run everything in one line because I'm running a program within the terminal and I want to issue progressive commands.
E.g. command 1 starts the program, command 2 navigates me to the menu, command 3 lets me change a setting, and command 4 lets me issue a command that prompts a response I can only get under the condition of that new setting.
Running each command separately would keep me trapped in step one.
The line you quoted contains a single command line with processes piped together; that's not the same as running multiple separate commands.
Did you consider using open?
my $pid = open my $fh, '|-', 'myprocess';
print $fh "y\n";
There is no need to run multiple commands in one (backtick) line; why not just use multiple ones?
$first = `whoami`;
$second = `uptime`;
$third = `date`;
Backticks are used to capture the output of the command, system just runs the command and returns the exit state:
system '(echo "y" | nohup myprocess) &';
All of these solutions allow multiple commands piped together, since piping is a shell feature: each of them just passes the command string to the shell (unless it's simple enough to be handled without a shell):
Bash:
$ ps axu|grep '0:00'|sort|uniq -c|wc
Perl:
system "ps axu|grep '0:00'|sort|uniq -c|wc";
$result = `ps axu|grep '0:00'|sort|uniq -c|wc`;
open my $fh, '|-', "ps axu|grep '0:00'|sort|uniq -c|wc";
Always watch the quotation marks: system "foo "bar" baz"; won't pass "bar" baz as arguments to the foo command.
Lots of common stuff in this answer: Please be more detailed in your question to get a better reply.

Shell script to build CLI args for a Perl script

I have a Jenkins job, triggered as a parameterized build. It accepts an optional String parameter (HOSTNAMES) that can be a comma separated list of hostnames.
I need to pass this comma separated list of hostnames as a command line argument to a Perl script (within an Execute shell build step).
Here is how I process the input parameter and construct the command line argument within the execute shell build step:
cmd_options=''
echo hostnames is $HOSTNAMES
if [ "$HOSTNAMES" != "" ]
then
cmd_options+=" --hostnames \"$HOSTNAMES\""
fi
perl myscript.pl $cmd_options
In the console output of the build though, I see the argument being passed incorrectly. Here is the console output:
+ cmd_options=
+ echo hostnames is host1, host2
hostnames is host1, host2
+ '[' 'host1, host2' '!=' '' ']'
+ cmd_options+=' --hostnames "host1, host2"'
+ perl myscript.pl --hostnames '"host1,' 'host2"'
I want myscript.pl to be called this way:
perl myscript.pl --hostnames "host1, host2"
I have tried various ways of manipulating $cmd_options using single quotes and double quotes, but have been unsuccessful so far in getting it right. Any pointers at where I am going wrong?
When you build a command, delay the interpolation and use eval to execute it.
HOSTNAMES='host1, host2'
cmd_options=''
if [ "$HOSTNAMES" != "" ]; then
cmd_options+='--hostnames "$HOSTNAMES"'
fi
eval "prog $cmd_options"
A better solution is to use an array.
HOSTNAMES='host1, host2'
cmd_options=()
if [ "$HOSTNAMES" != "" ]; then
cmd_options+=(--hostnames "$HOSTNAMES")
fi
prog "${cmd_options[#]}"
If prog is the following program:
#!/usr/bin/perl
use feature qw( say );
say 0+@ARGV; say for @ARGV;
Both snippets output the following:
2
--hostnames
host1, host2
It looks like you will not be able to embed the quoted list inside $cmd_options, as that prevents you from using the quotation marks properly: escaping them with a backslash (\") turns them into regular " characters rather than delimiters, so they simply end up concatenated to the first and last items of the $HOSTNAMES list.
Suggest you drop this line:
cmd_options+=" --hostnames \"$HOSTNAMES\""
and, instead, use the following two lines, as needed
(this assumes you still need $cmd_options for passing other parameters)
perl myscript.pl $cmd_options --hostnames "$HOSTNAMES"
perl myscript.pl $cmd_options
Wrapped in an if statement, it should look like this:
if [ "$HOSTNAMES" != "" ]
then
perl myscript.pl $cmd_options --hostnames "$HOSTNAMES"
else
perl myscript.pl $cmd_options
fi
Another option is to make sure there are no spaces in the $HOSTNAMES list: that allows the list to be passed as a single parameter, and the quotation marks are no longer required.
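A sketch of that variant (the parameter expansion to strip the spaces is my addition, not from the original answer):
HOSTNAMES='host1, host2'
HOSTNAMES=${HOSTNAMES//, /,}   # becomes "host1,host2", a single shell word
perl myscript.pl $cmd_options --hostnames $HOSTNAMES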
Assuming you don't need the script's positional parameters any more, you can set them yourself. (This will work in any POSIX shell, even where arrays are unavailable.)
# Save any positional parameters first, if necessary;
# we're going to overwrite them.
first_arg=$1
second_arg=$2
# etc.
set -- --hostnames "$HOSTNAMES"
perl myscript.pl "$@"

Handling Perl command line arguments with spaces from a bash script?

This has been driving me nuts for hours now.
Consider the following test script in perl:
(hello.pl)
#!/usr/bin/perl
print "----------------------------------\n";
$numArgs = $#ARGV + 1;
print "thanks, you gave me $numArgs command-line arguments:\n";
foreach $argnum (0 .. $#ARGV) {
print "$ARGV[$argnum]\n";
}
Ok, it simply prints out the command line arguments given to the script.
For instance:
$ ./hello.pl apple pie
----------------------------------
thanks, you gave me 2 command-line arguments:
apple
pie
I can give the script a single argument with a space by surrounding the words with double quotes:
$ ./hello.pl "apple pie"
----------------------------------
thanks, you gave me 1 command-line arguments:
apple pie
Now I want to use this script in a shell script. I've set up the shell script like this:
#!/bin/bash
PARAM="apple pie"
COMMAND="./hello.pl \"$PARAM\""
echo "(command is $COMMAND)"
$COMMAND
I am calling the hello.pl with the same params and escaped quotes.
This script returns:
$ ./test.sh
(command is ./hello.pl "apple pie")
----------------------------------
thanks, you gave me 2 command-line arguments:
"apple
pie"
Even though the $COMMAND variable echoes the command exactly as I typed it when running the perl script by hand the second time, this time it does not see "apple pie" as a single argument.
Why not?
This looks like the problem described in the Bash FAQ as: I'm trying to put a command in a variable, but the complex cases always fail!
The answer to that FAQ suggests a number of possible solutions - I hope that's of use.
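One of the fixes that FAQ recommends is to build the command as an array instead of a string; here is a sketch adapted to this example (my adaptation, not the original test.sh):
#!/bin/bash
PARAM="apple pie"
COMMAND=(./hello.pl "$PARAM")     # each array element is exactly one argument
echo "(command is ${COMMAND[*]})"
"${COMMAND[@]}"                   # runs ./hello.pl with "apple pie" kept as a single argument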
The issue of the 2 command-line arguments
"apple
pie"
is due to shell word splitting of the unquoted expansion, governed by the IFS shell variable, which contains a space by default.
printf '%q\n' "$IFS" # show value of IFS variable
You may use xargs and sh -c '...code...' to mimic / re-enable ordinary parameter parsing.
PARAM="'apple pie'"
printf '%s' "$PARAM" | xargs sh -c './hello.pl "$#"' argv0
Another option may be to write a few lines of C (like in shebang.c)!
http://www.semicomplete.com/blog/geekery/shebang-fix.html
You should try eval $COMMAND instead of simply $COMMAND.

Is there any use in providing arguments as separate parameters to a system call using Perl?

On Unix, all three of these generate the same result:
system("top -H -p $pid -n 1"); #ver1
system("top", "H", "p $pid", "n 1"); #ver2
system("top", "-H", "-p $pid", "-n 1"); #ver3
What is the difference between ver2 and ver3?
Is there any reason I should use ver2 and ver3, and not ver1?
They do not even seem to support piping the results; for example, are there any ver2 and ver3 equivalents of the following call?
system("top -H -p $pid -n 1 | grep myprocess | wc -l");
Even though it looks the same, it is not the same:
$ perl -e 'system("./test.pl -H -p $$ -n 1");system("./test.pl", "H", "p $$", "n 1");system("./test.pl", "-H", "-p $$", "-n 1");'
-H,-p,10497,-n,1
H,p 10497,n 1
-H,-p 10497,-n 1
$ cat ./test.pl
#!/usr/bin/perl
$\="\n";
$,=",";
print @ARGV;
It is only thanks to the top implementation that these happen to work the same. Other applications may not.
Quoth perlfunc for system:
Note that argument processing varies depending on the number of arguments. If there is more than one argument in LIST, or if LIST is an array with more than one value, starts the program given by the first element of the list with arguments given by the rest of the list. If there is only one scalar argument, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is /bin/sh -c on Unix platforms, but varies on other platforms). If there are no shell metacharacters in the argument, it is split into words and passed directly to execvp(), which is more efficient.
So if $pid is just digits, all are equivalent.
To interpolate results of an arbitrary shell command including pipes use qx and friends.
As a practical reason for using LIST, sometimes your command-line arguments contain spaces or other characters that would confuse your shell.
system("mplayer.exe", "--volume", "75",
q[C:/Program Files/My Music Player/Music Library/The "Music" Song.mp3]);
What is the difference between ver2 and ver3?
Just in what arguments you're passing to top. I don't know of a version of top that will take switches without dashes like some versions of ps do, so you should use version 3.
Is there any reason I should use ver2 and ver3, and not ver1?
If you pass a single string to system it will run it via your shell. This means it will be shell interpreted. Any stray spaces or shell meta characters (quotes, dollar signs, etc...) in the arguments would be interpreted and possibly mess things up. It's also a potential security hole.
For example, if $pid was something like '10; echo pwnd; echo ' then you'd run top -H -p 10, then echo pwnd, then echo -n 1.
So for both safety and security, unless you need shell processing (see below) you should pass system a list.
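A hedged sketch of the difference, driven from the shell (the perl -e one-liners are mine, and echo stands in for top so the example is harmless to run):
pid='10; echo pwnd'
perl -e 'system("echo -H -p $ARGV[0] -n 1")' "$pid"                # single string: the shell also runs the injected echo
perl -e 'system("echo", "-H", "-p", $ARGV[0], "-n", "1")' "$pid"   # list form: no shell, the whole value stays one argument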
Are there any ver2 and ver3 equivalents which allow pipes?
No, piping and redirection is done by the shell. You have to use something other than system. You can do it with open, but it's a pain in the ass. Easiest way is to use IPC::Run.
use IPC::Run;
my $out;
run ["echo", "foo\nbar\nbaz"], "|",
["grep", "ba"], "|",
["wc", "-l"],
\$out;
print $out; # 2
But really if you're just grepping and counting a handful of lines, use Perl.
my $out;
run ["echo", "foo\nbar\nbaz"], '>', \$out;
my $count = grep { /ba/ } split /\n/, $out;
print $count;