Getting error while running perl module from command line - perl

I have a custom Perl module (Modulehere) that takes an xls sheet and parses it.
I tried to run it from the command line like this:
perl -I /home/suser/modules -e "use Modulehere;Modulehere::load_it('/tmp/test.xls')"
But it gives an error like:
Can't open perl script "–e": No such file or directory
Please help!

It works on my machines (OS X and Linux), but looking at the documentation (man perlrun):
-Idirectory
Directories specified by -I are prepended to the search path for modules (@INC).
There is no space between -I and the directory. Maybe your Perl version is being too strict and treating everything after the space as a script file.
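If that is the cause, removing the space should fix it. It is also worth retyping the command by hand, since the error message quotes "–e" with a non-ASCII dash, which suggests the option may have been mangled when the command was copied from somewhere. Something like:

perl -I/home/suser/modules -e "use Modulehere; Modulehere::load_it('/tmp/test.xls')"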

Related

Perl command to change file EOL and saving the file with the same name

I am using this Perl command on my Debian machine to change my file EOL:
perl -p -e 's/\n/\r\n/' < ~/scripts/bite/EOL/*.csv > ~/scripts/bite/sent/samefilename.csv
Every day there will be a new file in the "EOL" directory with a different name, and it always has only one file in it, so I am using "*" to take whatever file is there.
But I need to save the new file with the same name as the file I chose to change, without manually entering the file name in the command. Eventually this goes into my cronjob, so everything should be automatic.
EDIT: Fixed my problem by using "unix2dos"
Linux has the commands unix2dos and dos2unix for converting line endings between the MS Windows/DOS and Unix formats. It is perhaps the easiest solution for the described problem.
I would use the unix2dos utility, but you can also use
perl -pe's/\n/\r\n/' -i file.csv
See Specifying file to process to Perl one-liner.
Your program and this one only work on Unix systems.
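If you want to stay with Perl and write the converted file into the "sent" directory under its original name, a small script like this sketch would do it (assuming, as described, that the EOL directory holds exactly one CSV at a time; the paths are the ones from the question):

use strict;
use warnings;
use File::Basename;

# Pick up the single daily file, whatever it is called.
my ($in) = glob "$ENV{HOME}/scripts/bite/EOL/*.csv";
die "No input file found\n" unless defined $in;

my $out = "$ENV{HOME}/scripts/bite/sent/" . basename($in);

open my $ifh, '<', $in  or die "Can't read $in: $!";
open my $ofh, '>', $out or die "Can't write $out: $!";
while (my $line = <$ifh>) {
    $line =~ s/\n/\r\n/;    # same substitution as the one-liner above
    print {$ofh} $line;
}
close $ofh or die "Can't close $out: $!";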

checking to see if files are executable perl

I have a program that checks to see if the files in my directory are readable, writable, and executable.
I have it set up so it looks like
if (-e $file) {
    print "exists";
}
if (-x $file) {
    print "executable";
}
and so on
but my issue is that when I run it, it shows the text files are executable too: plain text files with one word in them. I feel like there is an error. What did I do wrong? I am a complete Perl noob, so forgive me.
It is quite possible for a text file to be executable. It might not be particularly useful in many cases, but it's certainly possible.
In Unix (and your Mac is running a Unix-like operating system) the "executable" setting is just a flag in the file's metadata. That flag can be set on or off for any file.
There are actually three of these permissions, which record whether you can read, write, or execute a file. You can see these permissions by using the ls -l command in a terminal window (see man ls for more details on what the various ls options mean). There are probably ways to view these permissions in the Finder too (perhaps a "properties" menu item or something like that; I don't have a Mac handy to check).
You can change these permissions with the chmod ("change mode") command. See man chmod for details.
For more information about Unix file modes, see this Wikipedia article.
But whether or not a file is executable has nothing at all to do with its contents.
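To see what Perl itself reports, a short sketch like this prints the three permissions for every file in the current directory, using the same file test operators as the question:

for my $file (glob '*') {
    next unless -f $file;    # skip directories
    printf "%s: %s%s%s\n", $file,
        -r $file ? 'r' : '-',
        -w $file ? 'w' : '-',
        -x $file ? 'x' : '-';
}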
The statement if (-x $file) does not check whether a file is an executable, but whether your user has execute privileges on it.
For checking whether a file is executable or not, I'm afraid there isn't a magic method. You may try to use:
if (-T $file) for checking if the file has an ASCII or UTF-8 encoding.
if (-B $file) for checking if the file is binary.
If this is unsuitable for your case, consider the following:
Assuming you are in a Linux environment, note that every file can be executed. The question here is: is the execution of, e.g., test.txt going to throw an error on standard error (STDERR)?
Most likely, it will.
If the test.txt file contains:
Some text
and you launch it from your Perl script with system("./test.txt");, you will get a STDERR message like:
./test.txt: line 1: Some: command not found
If for some reason you are looking to run all the files in your directory (in a for loop, for instance), be warned that this is pretty dangerous, since you will launch all your files and you may not want to do so. Especially if the Perl script is in the same directory that you are checking (this will lead to undesirable script behaviour). A safer sketch that combines the tests above instead of executing anything is shown below.
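This flags files that carry the execute bit, then uses -B/-T to guess whether each one is a real (binary) executable or just a text file with the bit set:

for my $file (glob '*') {
    next unless -f $file && -x $file;
    if (-B $file) {
        print "$file looks like a real (binary) executable\n";
    }
    elsif (-T $file) {
        print "$file has the execute bit but is plain text\n";
    }
}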
Hope it helps ;)

Cwd::abs_path broken on msys, cygwin

I'm trying to get the absolute path to a file provided as a Windows path, in both Cygwin and MSys (Git Bash) Perl. I would like solutions that also work when the supplied path is a "native" Cygwin/MSys path.
I tried using Cwd::abs_path, but that seems subtly broken. Here is a test:
user@MYPC MINGW64 /f/Temp
$ perl
use Cwd;
print Cwd::abs_path("F:\\") . "\n";
print Cwd::abs_path("F:\\test.txt") . "\n";
print Cwd::abs_path("..\\test.txt") . "\n";
/f
/f/Temp/F:/test.txt
/f/Temp/../test.txt
Directories work; relative paths "work" but don't give the result I'd expect (i.e. .. is not eliminated); but when I add a filename to an absolute path, the result is wrong. I had hoped that Cwd would do the path translation for me.
I need to extract parts of the path later (using the functions from File::Spec) and also want to open the file. To keep working with the extracted parts, the path should be native to the Perl version used. I want to avoid using cygpath, since I'd like the script to also work with ActivePerl, which understands Windows paths only. I could of course add some ifs to only call cygpath for the unix-y Perls.
You do not have an absolute path. msys and cygwin are unix emulation environments, and in unix, absolute paths start with /. F:\ is a valid relative path and file name in unix.
Linux$ touch 'F:\'
Linux$ ls
F:\
In cygwin, /cygdrive/f/ refers to your F:. You can use the command-line utility cygpath to convert between native and Windows paths.
cygwin$ cygpath -w /cygdrive/c/
C:\
cygwin$ cygpath -u 'C:\'
/cygdrive/c/
msys should also have a way of accessing the Windows drive through its virtual unix file system.
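If you do end up special-casing the unix-y perls (the ifs the question already considers), a wrapper along these lines is one way to keep a single code path. The function name is mine, and it assumes cygpath is on the PATH under Cygwin/MSys:

use strict;
use warnings;
use Cwd qw(abs_path);

# Convert a Windows-style path with cygpath when running under Cygwin/MSys;
# other perls (e.g. ActivePerl) get the path through unchanged.
sub native_abs_path {
    my ($path) = @_;
    if ($^O =~ /cygwin|msys/ && $path =~ /^[A-Za-z]:[\\\/]/) {
        chomp(my $unix = qx(cygpath -u "$path"));
        $path = $unix if length $unix;
    }
    return abs_path($path);
}

print native_abs_path("F:\\test.txt"), "\n";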

Can't exec No such file or directory

Merry Christmas to everybody. I'm having a dilemma with a Perl script. In my script, I call another program with a system call, but I get this error:
Can't exec "./Classificador/svm_classify": No such file or directory at Analise_de_Sentimentos_mudanca.pl line 463.
I don't know if there is a problem in having my program in a different directory than the called program.
Another curious thing is that this script used to run normally on Ubuntu 10.10, but now I've changed to Mint 14. Is it missing some library?
Best wishes,
Thiago
The relative pathname ./Classificador/svm_classify is interpreted relative to the user's current directory, not the directory containing the perl script. You need to do one of the following:
The user must cd to the directory containing the perl script before running it.
The perl script should call chdir() to set the current directory to the directory where it's stored.
Put the absolute pathname in the script, instead of ./.
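A sketch of the last option that avoids hard-coding the path, using the core FindBin module so the script finds the program relative to its own location (the relative path is the one from the error message):

use strict;
use warnings;
use FindBin qw($Bin);    # $Bin is the directory containing this script

my $classifier = "$Bin/Classificador/svm_classify";
die "Can't find or execute $classifier\n" unless -x $classifier;

system($classifier) == 0
    or die "svm_classify failed: $?\n";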
Does "./Classificador/svm_classify" exist?
Check the following:
1) go to the directory where Analise_de_Sentimentos_mudanca.pl lives
2) run:
ls -l Classificador/svm_classify
3) show us the results

Using grep in eshell on NTemacs

I have been trying to do a recursive grep command on files in sub folders using grep in NTemacs and Cygwin. So far the "best" results have been using grep in eshell. When I use this:
grep "t" -r *
I get a list of all file names containing the letter t in all sub folders one layer down, but nothing else. In Cygwin I get nothing. I'm working in a directory that is not in the Cygwin install; I don't know if that matters or not.
What I want is to match a more complex string against the content of all files (not just the file names), and in all sub directories.
I would like to use eshell from Emacs, but I'm open to suggestions, apart from using Linux. This is a work PC and I don't want to do the whole setup of a Linux install.
I just wrote a very similar answer to another question, but I suspect it's the same root problem:
My first thought is that your files have Windows line endings (CRLF) as opposed to Unix/Linux line endings (LF), and that is messing with grep's ability to parse the file. Try running this:
dos2unix filename
on each file you need to search, then try your grep statement again.
If you need to convert many files across several directories, I suggest using dos2unix with the -exec action of find:
find . -exec dos2unix {} \;
(add whatever other options you need to find before running that, of course)
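If dos2unix isn't available in your Cygwin install, a Perl one-liner does the same conversion in place (a sketch; add a -name pattern to the find so you only touch the text files you actually search):

find . -type f -exec perl -i -pe 's/\r$//' {} \;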