Ignoring variables in shell script while using cat>file.sh<<EOF ... EOF syntax - perl

I have a question regarding embedding script files within a shell script. I often need to create a single shell script that unpacks other scripts, but I really dislike having to escape all of the embedded script's variables with backslashes. Example of the contents of my shell script:
echo "Hello world"
pwd
cat>embedded_perl_script<<EOF
#!/usr/bin/perl -w
\$input = \$ARGV[0];
my \$argc;
\$argc = @ARGV;
print \$input
EOF
perl embedded_perl_script
echo "Finished!"
This code works fine, but I would really like a way to avoid escaping all of the embedded Perl script's variables. Any suggestions?

Try this:
echo "Hello world"
pwd
cat>embedded_perl_script<<'EOF'
#!/usr/bin/perl -w
$input = $ARGV[0];
my $argc;
$argc = @ARGV;
print $input
EOF
perl embedded_perl_script
echo "Finished!"
Note that the EOF has changed to 'EOF' =)
Note: this technique is called a here-document (heredoc). Quoting the delimiter ('EOF') tells the shell not to expand variables inside the document, so the embedded Perl variables can be written without backslashes.
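For illustration, here is a minimal sketch of the difference between an unquoted and a quoted delimiter (the file names are just placeholders):
# unquoted delimiter: the shell expands $HOME before writing the file
cat > expanded.txt <<EOF
my home is $HOME
EOF
# quoted delimiter: the text is written literally, $HOME stays as-is
cat > literal.txt <<'EOF'
my home is $HOME
EOF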

Related

Running a system command in a perl script that has "@" character

I have a system command like this:
unix_command "@output_file path_to_file"
Now when I try the exec or system commands in a Perl script I get this error:
Getting a string when expecting an operator.
Can you please help me how to do it in Perl.
Appreciate your help.
Thanks a ton!
Rakesh
system is really two different functions.
You can use it to launch a program.
The following forms are used to launch a program:
system($prog, @one_or_more_args)
system({ $prog }, $arg0, @args)
Using one of these forms, all strings passed as arguments are passed untouched to the child program.
Example usage:
system('perl', '-e', 'my @a = "foo"; print "@a\n";');
You can use it to execute a shell command.
The following form is used to execute a shell command:
system($shell_cmd)
The above is short for
system('/bin/sh', '-c', $shell_cmd)
You must provide a valid shell command. If you are building the command, you will need to take care to properly escape anything that needs escaping.
Example usage:
use String::ShellQuote qw( shell_quote );
my $cmd = shell_quote('perl', '-e', 'my @a = "foo"; print "@a\n";');
system($cmd);
A bit more specifically to your case, the shell command
program @file1 file2
can be executed as follows:
system('program', '@'.$file1, $file2);
If you actually need to construct a shell command (e.g. because you want to redirect output), you can use the following:
use String::ShellQuote qw( shell_quote );
my $cmd = shell_quote('program', '@'.$file1, $file2) . ' >output.txt 2>&1';
system($cmd);
If you don't need interpolation, use single quotes.
system 'echo @a';
If you do, use backslash.
system "echo \#a";

How to Call Perl script from tcl script

I have a file with 4 Perl commands,
and I want to open the file from Tcl and execute each Perl command.
TCL script
runcmds $file
proc runcmds {file} {
    set fileid [open $file "r"]
    set options [read $fileid]
    close $fileid
    set options [split $options "\n"]  ;# separating each command with a newline
    foreach line $options {
        exec perl $line
    }
}
When executing the above script,
I get the error "can't open the perl script /../../ : No Such file or directory " Use -S to search $PATH for it.
tl;dr: You were missing -e, causing your script to be interpreted as a filename.
To run a perl command from inside Tcl:
proc perl {script args} {
    exec perl -e $script {*}$args
    # or in 8.4: eval [list perl -e $script] $args
}
Then you can do:
puts [perl {
    print "Hello ";
    print "World\n";
}]
That's right, an arbitrary Perl script inside Tcl. You can even pass in other arguments as necessary; access them from Perl via @ARGV. (You'll need to add other options like -p explicitly.)
Note that this can pass whole scripts; you don't need to split them up (and probably shouldn't; you can do lots with one-liners but they tend to be awful to maintain and there's no technical reason to require it).
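For example, a hypothetical call that passes extra arguments through to the embedded script (the argument values here are made up):
puts [perl {
    print "got: @ARGV\n";
} foo bar baz]
This should print "got: foo bar baz", since exec hands the trailing arguments straight to perl and they show up in @ARGV.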

How Perl can execute a command in the same shell with it?

I am not sure whether the title really makes sense for this problem. My problem is simple: I want to write a Perl script to change my current directory, and I hope the result can be kept after calling the Perl script. The script looks like this:
if ($#ARGV != 0) {
    print "usage: mycd <dir symbol>";
    exit -1;
}
my $dn = shift @ARGV;
if ($dn eq "kite") {
    my $cl = `cd ./private`;
    print $cl."\n";
}
else {
    print "unknown directory symbol";
    exit -1;
}
However, my current directory doesn't change after calling the script. What is the reason? How can I resolve it?
No, the Perl script will be run in a subprocess so it will not be able to affect the environment of the process that called it.
There are various tricks you can use such as sourcing shell scripts (in the context of the current shell rather than a sub-process), or using bash functions and aliases, but they won't work here.
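A quick illustration of the point about subprocesses (the paths are just examples):
pwd            # /some/dir
( cd /tmp )    # the cd happens in a subshell, i.e. a child process...
pwd            # ...so the parent shell is still in /some/dir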
How Perl can execute a command in the same shell with it?
Unless you have a very atypical shell, shells can only receive commands via STDIN, via their command line, and possibly via a command evaluation builtin.
The first two are out unless the Perl script is the parent of the shell, but you could use the third one indirectly as in the following example.
script.pl:
#!/usr/bin/perl
print "chdir 'private'\n";
bash script:
echo "$PWD" # /some/dir
eval "$( script.pl )"
echo "$PWD" # /some/dir/private
Of course, if you use bash, you could hide the details in a shell function.
mycd () {
    eval "$( mycd.pl "$@" )"
}
Allowing you to use
mycd
or even
mycd foo
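A minimal sketch of what mycd.pl could look like, reusing the "kite" symbol from the question (the symbol-to-directory mapping is an assumption):
#!/usr/bin/perl
use strict;
use warnings;

# hypothetical mapping from directory symbols to paths
my %dirs = ( kite => './private' );

my $dn = shift(@ARGV) // '';
if (my $dir = $dirs{$dn}) {
    # print a shell command; the calling shell function evals it
    print "cd '$dir'\n";
}
else {
    # anything on STDOUT would be eval'ed, so complain on STDERR instead
    print STDERR "unknown directory symbol\n";
    exit 1;
}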

Export variable from a shell script into a perl script

Perl Code
`. /home/chronicles/logon.sh `;
print "DATA : $ENV{ID}\n";
In logon.sh, we export the variable "ID" (the shell script is sourced).
Manual run
$> . /home/chronicles/logon.sh
$> echo $LOG
When I run this manually in a terminal (not from the script), I get the output. (But it does not work from the script.)
I followed this post:
How to export a shell variable within a Perl script?
But it didn't solve the problem.
Note
I am not allowed to change the "logon.sh" script.
The script inside the backticks is executed in a child process. While environment variables are inherited from parent processes, the parent can't access the environment of child processes.
However, you could return the contents of the child environment variable and put it into a Perl variable like
use strict; use warnings; use feature 'say';
my $var = `ID=42; echo \$ID`;
chomp $var;
say "DATA: $var";
output:
DATA: 42
Here an example shell session:
$ cat test_script
echo foo
export test_var=42
$ perl -E'my $cmd = q(test_var=0; . test_script >/dev/null; echo $test_var); my $var = qx($cmd); chomp $var; say "DATA: $var"'
DATA: 42
The normal output is redirected into /dev/null, so only the echo $test_var shows.
It won't work.
An environment variable can't be inherited from a child process.
The environment variable can be updated in your "manual run" because that happens in the same bash process.
The source command just runs every command in logon.sh in the current shell.
For more info, see: can we source a shell script in perl script
You could do something like:
#!/usr/bin/perl
use strict;
use warnings;

# source the script in a child shell, then dump that shell's environment
chomp(my @values = `. myscript.sh; env`);
foreach my $value (@values) {
    # split on the first '=' only, so values containing '=' survive intact
    my ($k, $v) = split /=/, $value, 2;
    $ENV{$k} = $v;
}
foreach my $key (keys %ENV) {
    print "$key => $ENV{$key}\n";
}
Well, I've found a solution that seems nice to me. It looks robust, as it uses a widely tested mechanism to bind the shell environment to Perl (running perl) and a robust library to export it in Perl variable syntax for re-injection into the calling Perl session.
The export COLOR tty ID line was useful to ask bash to export the newer variables... This seems to work fine.
#!/usr/bin/perl -w
my $perldumpenv='perl -MData::Dumper -e '."'".
'\$Data::Dumper::Terse=1;print Dumper(\%ENV);'."'";
eval '%ENV=('.$1.')' if `bash -c "
. /home/chronicles/logon.sh;
export COLOR tty ID;
$perldumpenv"`
=~ /^\s*\{(.*)\}\s*$/mxs;
# map { printf "%-30s::%s\n",$_,$ENV{$_} } keys %ENV;
printf "%s\n", $ENV{'ID'};
Anyway, if you don't have access to logon.sh, you have to trust it before running such a solution.
Old...
Here is my first post... kept for history purposes; don't look further.
The only way is to parse the command's output, after asking the command to dump its environment:
my @lines = split("\n", `. /home/chronicles/logon.sh; set`);
map { $ENV{$1} = $2 if /^([^=]+)=(.*)$/; } @lines;
This can now be done with the Env::Modify module
use Env::Modify 'source';
source("/home/chronicles/logon.sh");
... environment setup in logon.sh is now available to Perl ...
Your Perl process is the parent of the shell process, so it won't inherit environment variables from it. Inheritance works the other way, from parent to child.
But when you run the script with backticks, as shown, the standard output of the script is returned to the Perl script. So, either modify the shell script to end with the echo $LOG statement you show, or create a new shell script that sources logon.sh and then does echo $LOG. Your Perl script would then be:
my $value = `./myscript.sh`;
print $value;
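A sketch of what that wrapper (called myscript.sh in the snippet above) might contain, assuming the variable of interest is ID as in the original question:
#!/bin/sh
# source logon.sh in this child shell, then print the value we want on STDOUT
. /home/chronicles/logon.sh
echo "$ID"
The Perl side then captures that single line of output via the backticks.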

Can I run a Perl script from stdin?

Suppose I have a Perl script, namely mytest.pl. Can I run it by something like cat mytest.pl | perl -e?
The reason I want to do this is that I have an encrypted Perl script which I can decrypt in my C program, and I want to run it from that C program. I don't want to write the decrypted script back to the hard disk due to security concerns, so I need to run this Perl script on the fly, all in memory.
This question has nothing to do with the cat command; I just want to know how to feed a Perl script to stdin and let the perl interpreter run it.
perl < mytest.pl
should do the trick in any shell. It invokes perl and feeds the script in via the shell redirection operator <.
As pointed out, though, it seems a little unnecessary. Why not start the script with
#!/usr/bin/perl
or perhaps
#!/usr/bin/env perl
? (modified to reflect your Perl and/or env path)
Note the Useless Use of Cat Award. Whenever I use cat I stop and think whether the shell can provide this functionality for me instead.
Sometimes one needs to execute a Perl script and pass it an argument. The STDIN construction perl input_file.txt < script.pl won't work, because perl takes input_file.txt as the program to run. Using the tip from How to assign a heredoc value to a variable in Bash, we can overcome this by using a "here-script":
#!/bin/bash
read -r -d '' SCRIPT <<'EOS'
$total = 0;
while (<>) {
    chomp;
    @line = split "\t";
    $total++;
}
print "Total: $total\n";
EOS
perl -e "$SCRIPT" input_file.txt
perl mytest.pl
should be the correct way. Why do something unnecessary?
cat mytest.pl | perl
…is all you need. The -e switch expects the script as a command line argument.
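In other words (a quick illustration, assuming mytest.pl exists in the current directory):
perl -e 'print "hello\n"'   # -e takes the program text itself as the argument
cat mytest.pl | perl        # with no program argument, perl reads the program from STDIN
perl < mytest.pl            # same thing, without the extra cat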
perl will read the program from STDIN if you don't give it any arguments.
So you could theoretically read an encrypted file, decrypt it, and run it, without saving the file anywhere.
Here is a sample program:
#! /usr/bin/perl
use strict;
use warnings;
use 5.10.1;
use Crypt::CBC;
my $encrypted = do {
    open my $encrypted_file, '<', 'perl_program.encrypted'
        or die "Can't open encrypted program: $!";
    local $/ = undef;
    <$encrypted_file>;
};
my $key = pack("H16", "0123456789ABCDEF");
my $cipher = Crypt::CBC->new(
    '-key'    => $key,
    '-cipher' => 'Blowfish'
);
my $plaintext = $cipher->decrypt($encrypted);
use IPC::Run qw'run';
run [$^X], \$plaintext;
To test this program, I first ran this:
perl -MCrypt::CBC -e'
my $a = qq[print "Hello World\n"];
my $key = pack("H16", "0123456789ABCDEF");
my $cipher = Crypt::CBC->new(-key=>$key,-cipher=>"Blowfish");
my $encrypted = $cipher->encrypt($a);
print $encrypted;
' > perl_program.encrypted
This still won't stop dedicated hackers, but it will prevent most users from looking at the unencrypted program.