I'm using ActiveState Perl on Windows 7.
print `cd \\\\ `; # does nothing, says nothing
Same with qx()
print `dir \\\\ `; # correctly prints the root directory
Other commands also seem to work fine.
cd works fine from the command line or in a batch file.
Has anyone else seen this? Is there a workaround?
You may be looking for chdir. A shell command in backticks is not going to have a lasting effect: when you run a backtick command, Perl spawns a new shell, executes the command in it, and returns the standard output to your script. Then that shell exits, and any changes made to it are lost.
perldoc -q changed
I {changed directory, modified my environment} in a perl script. How come the change disappeared when I exited the script? How do I get my changes to be visible?
In the strictest sense, it can't be done--the script executes as a different process from the shell it was started from. Changes to a process are not reflected in its parent--only in any children created after the change.
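A minimal sketch of the chdir approach, with a made-up directory: the change applies to the Perl process itself, so any backtick command run afterwards starts in the new directory, but the shell you launched the script from is unaffected.
#!/usr/bin/perl
use strict;
use warnings;

# chdir changes the working directory of this Perl process (and of any
# children spawned later); it cannot change the shell that started us.
my $dir = 'C:\\Temp';    # hypothetical directory
chdir $dir or die "Cannot chdir to $dir: $!";

print `dir`;             # this child shell inherits the new directory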
Related
I have a command
my $output = `somecommand parm1 parm2`;
When I try to run this Perl script I get the message
Can't exec "somecommand" at .....
It seems it is not seeing anything past the first space in between the backticks. I have a friend who runs this in a different environment and it runs fine.
What could I have in my environment that would cause this? I am running Perl v5.20 but so is my friend.
Perl isn't ignoring the command parameters; it's mentioning only the part of the command that it has a problem with: it can't find somecommand.
Whatever your somecommand is, it's not a shell command and it's not in a directory listed in your PATH variable.
Change PATH to add its location to the end and it will work for you. You can do that system-wide, or you can modify it temporarily in your Perl code by manipulating $ENV{PATH} before you run the command.
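A hedged sketch of the temporary approach; the tool's directory is made up, and the ':' separator assumes Unix (use ';' on Windows):
use strict;
use warnings;

# Make somecommand findable for this process and its children only.
$ENV{PATH} .= ':/opt/sometool/bin';    # hypothetical location

my $output = `somecommand parm1 parm2`;
die "somecommand failed with status $?" if $?;
print $output;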
I'm sure this is an easy fix, but I need to use "script" (and not collect standard in/out/error) for my project. I'm somewhat new to Perl, so please bear with me.
I have a Perl script that works fine. When I run it I generally type script > filename before I run Perl.
$ script > file.log
bash-3.2$ perl foobar.pl
This runs fine, and when I'm done I type exit or Control-D to stop script and save the file. All I'd like to do is incorporate the script command in Perl and then automatically capture the file when the program stops running (12-16 hours). The problem I have is that if I call system("script > file.log"); and then call system("perl foobar.pl"); it hangs at the bash-3.2$ prompt. The only way to get Perl to work is Control-D or exit, which stops the script function.
Anyone have any idea how to fix this? While it's easy to start script before invoking Perl, if I forget, I have to rerun the program (which takes a long time).
Have you considered using system("script -c 'perl foobar.pl' file.log")?
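If that fits, a minimal wrapper might look like the sketch below; script -c is the util-linux form of the command, and foobar.pl and file.log are the names from the question.
#!/usr/bin/perl
use strict;
use warnings;

# Let script(1) run the long job and capture everything it prints.
# The list form of system avoids an extra layer of shell quoting.
my $status = system('script', '-c', 'perl foobar.pl', 'file.log');
die "script exited with status $?" if $status != 0;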
I am using mr on Windows and it allows running arbitrary commands before/after any repository action. As far as I can see this is done simply by invoking perl's system function. However something seems very wrong with my setup: when making mr run the following batch file, located in d:
@echo off
copy /Y foo.bat bar.bat
I get errors on the most basic Windows commands:
d:/foo.bat: line 1: @echo: command not found
d:/foo.bat: line 2: copy: command not found
To make sure mr isn't the problem, I ran perl -e 'system( "d:/foo.bat" )' but the output is the same.
Using xcopy instead of copy, it seems the xcopy command is found since the output is now
d:/foo.bat: line 1: @echo: command not found
Invalid number of parameters
However I have no idea what could be wrong with the parameters. I figured maybe the problem is the batch file hasn't full access to the standard command environment so I tried running it explicitly via perl -e 'system( "cmd /c d:\foo.bat" )' but that just starts cmd and does not run the command (I have to exit the command line to get back to the one where I was).
What is wrong here? A detailed explanation would be great. Also, how do I solve this? I prefer a solution that leaves the batch file as is.
The echo directive is executed directly by the running command-prompt instance, but Perl launches your command as a new process. You need to run the script inside a cmd instance for those built-in commands to work.
Your cmd /c approach should work; check whether there are spaces in the path you are supplying to it.
You can use the list form to pass the arguments:
my @array = ('/c', 'path/to/xyz.bat');
system('cmd.exe', @array);
The echo directive is not an executable, hence it errors out. The same is true of copy: it is a cmd built-in, not an executable, while xcopy.exe is.
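So a hedged way to run the batch file unchanged is to hand it to cmd.exe explicitly, using the list form of system so the path is not re-parsed by a shell:
use strict;
use warnings;

# d:/foo.bat is the batch file from the question; cmd.exe /c runs it
# with the built-ins (echo, copy, ...) available, then exits.
my @cmd = ('cmd.exe', '/c', 'd:\\foo.bat');
system(@cmd) == 0 or die "cmd.exe exited with status $?";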
Note that I'm aware this is probably not the best way to do this, but I've run into it somewhere before and I'm curious as to the answer.
I have a Perl script that is called from an init script; it runs and occasionally dies. To debug this quickly, I put together a wrapper Perl script that basically consists of:
# $path set from library call.
while (1) {
    system("$path/command.pl " . join(" ", @ARGV) . " >>/var/log/outlog 2>&1");
    sleep 30;    # Added this one later. See below...
}
Fire this up from the command line and it runs fine and as expected. command.pl is called and the script basically halts there until the child process dies then goes around again.
However, when called from a start script (actually via start-stop-daemon), the system command returns immediately, leaving command.pl running. Then it goes around for another go. And again and again. (This was not fun without the sleep command.). ps reveals the parent of (the many) command.pl to be 1 rather than the id of the wrapper script (which it is when I run from the command line).
Anyone know what's occurring?
Maybe the command.pl is not being run successfully. Maybe the file doesn't have execute permission (do you need to say perl command.pl?). Maybe you are running the command from a different directory than you thought, and the command.pl file isn't found.
There are at least three things you can check:
The standard error output of your command. At the moment you are redirecting it into the log with 2>&1; remove that part temporarily and watch what errors the command produces.
The return value of system. system returns the exit status of the command: if it returns 0, you know the command exited successfully; a non-zero value tells you something went wrong.
Perl's error variable $!. If there was a problem, Perl will set $!, which may or may not be helpful.
To summarize, try:
my $ec = system("command.pl >> /var/log/outlog");
if ($ec != 0) {
    warn "exit code was $ec, \$! is $!";
}
Update: if multiple instances of the command keep showing up in your ps output, then it sounds like the program is forking and running itself in the background. If that is indeed what the command is supposed to do, then what you do NOT want to do is run this command in an endless loop.
Perhaps when run from a daemon the system command is using a different shell than the one used when you are running as yourself. Maybe the shell used by the daemon does not recognize the >& construct.
Instead of system("..."), try the exec("...") function, if that works for you.
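For reference, a sketch of the exec variant using the names from the wrapper in the question; exec replaces the calling process with the command, so nothing after it ever runs, and the shell string keeps the original redirection.
#!/usr/bin/perl
use strict;
use warnings;

my $path = '/opt/myapp';    # stand-in for the value set from the library call

# exec never returns on success; this process *becomes* command.pl.
exec("$path/command.pl " . join(" ", @ARGV) . " >>/var/log/outlog 2>&1")
    or die "exec failed: $!";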
I am looking for a nice way to get the following done:
I have a Python script that I need to run on Unix, called from a Perl script that is, in turn, called from my Excel VBA macro on Windows using Plink. The Python script, due to dependency issues, has to run in either csh or bash, and I need to use export/setenv to add a few libraries before running it. However, Perl runs its commands in sh by default, so there is no way for me to add in all the dependencies and get the Python script to run.
So I am just wondering if there is EITHER: 1. a way for me to add the dependencies to the sh shell in the Perl script, OR 2. a way to force my Perl script to use csh (preferred, since for some reason .bashrc for the account runs into permission issues).
Thanks a lot!
How about "3. Set the appropriate environment variable in the Perl or Python scripts"?
$ENV{'PATH'} = ...
...
os.environ['PATH'] = os.pathsep.join(newpaths + os.environ['PATH'].split(os.pathsep))
(dunno how to get the path separator in Perl, sorz)
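For what it's worth, Perl does expose the platform's separator through the Config module; a minimal sketch, with the added directory purely illustrative:
use Config;

# $Config{path_sep} is ':' on Unix and ';' on Windows.
$ENV{PATH} = join $Config{path_sep}, '/opt/newlibs/bin', $ENV{PATH};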
To force the shell to csh, try the following in Perl:
`/bin/csh -c "command_name"`;
Edit:
You can set an environment variable for the command like this:
$s = `/bin/bash -c 'VAR_FOO=753; echo \$VAR_FOO'`;
print $s;
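Since the question prefers csh, the same idea with setenv; every path and variable name below is a placeholder.
use strict;
use warnings;

# Run the Python script inside csh after exporting what it needs.
my $out = `/bin/csh -c 'setenv LD_LIBRARY_PATH /opt/mylibs; /usr/bin/python /path/to/script.py'`;
print $out;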
I ended up just changing the .cshrc script; the addition to PATH, for some reason, did not work for me. After that, everything ran smoothly by putting it all on one line.
So basically it looks something like this:
/path/to/.cshrc && /python/path/to/python
Hope that helps!