Does Perl have an interactive shell? [duplicate]

This question already has answers here:
How can I start an interactive console for Perl?
(24 answers)
Closed 9 years ago.
Languages like Ruby, Python, Lua, PHP, Node.js, etc. have a simple "shell" where you can type simple one-liners and see the result. Does Perl have something similar? I'm not looking for something fancy that does pretty printing (I'll use print()) or accepts multiline input.
The story is this:
I need to experiment with some regexps. I know I can do perl -e "..." but then I need to shell-escape the code and this complicates matters. If I had a Perl shell I wouldn't need to worry about escaping.

Have you ever tried Reply.pm? I'm using it.
https://metacpan.org/release/Reply
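If you want to give it a go, a minimal session looks something like this (assuming a working cpanm/cpan client and that the bundled reply command ends up on your PATH):
$ cpanm Reply          # or: cpan Reply
$ reply                # starts the REPL; type Perl expressions and see their results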

Give this CPAN module a try:
Perl::Shell

Related

Perl: console / command-line tool for interactive code evaluation and testing

Python offers an interactive interpreter allowing the evaluation of little code snippets by submitting a couple of lines of code to the console. I was wondering if a tool with similar functionality (e.g. including a history accessible with the arrow keys) also exists for Perl?
There seem to be all kinds of solutions out there, but I can't seem to find any good recommendations. I.e. lots of tools are mentioned, but I'm interested in which tools people actually use and why. So, do you have any good recommendations, excluding the standard perl debugging (perl -d -e 1)?
Here are some interesting pages I've had a look at:
a question in the official Perl FAQ
another Stack Overflow question, where the answer is mostly the Perl debugger and several links are broken
Perl Console
Perl Shell
perl -d -e 1
is perfectly suitable; I've been using it for years and years. But if you just can't stand it, then you can check out Devel::REPL.
If your problem with perl -d -e 1 is that it lacks command line history, then you should install Term::ReadLine::Perl which the debugger will use when installed.
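If it isn't installed already, it's a quick install from CPAN (assuming a working cpan client):
$ cpan Term::ReadLine::Perl
$ perl -d -e 1        # the debugger now picks up the readline support automatically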
Even though this question has plenty of answers, I'll add my two cents on the topic. My approach to the problem is easy if you are a Vim user, but I guess it can be done from other editors as well:
Open Vim and type your code. You don't need to save it to any file.
:w !perl for evaluation (:w !COMMAND pipes the buffer to the process obtained by running COMMAND. In this case the mighty perl interpreter!)
Take a look at the output
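For example, with something like this in an unsaved buffer (purely illustrative):
my $s = 'order 123, item 7';
my @nums = $s =~ /(\d+)/g;
print "@nums\n";
:w !perl then prints 123 7 without anything ever touching the disk.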
This approach is good for any interpreted language, not just for Perl.
In the case of Perl it is extremely convenient when you are writing your own modules, since in my experience the perl interpreter will refuse to reload a module (even when loading was attempted and failed). On the minus side, you will lose all your context every time, so if you are doing some heavy or slow operation you need to save some intermediate results (whereas the Perl console approach preserves the previously computed data).
If you just need to evaluate an expression - which is the other use case for a Perl console program - another good alternative is checking the output of a perl -e command. It's fast to launch, but you have to deal with escaping (the $'...' syntax of Bash does the job pretty well here).
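For instance (Bash-specific: inside $'...' a \' becomes a literal single quote and \\ becomes a backslash before Perl ever sees the code):
$ perl -e $'print "it\'s easy to embed quotes this way\\n"'
it's easy to embed quotes this way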
To get history and arrow keys, just use:
rlwrap perl -de1

Using Haskell to extend Perl?

Has anyone ever written a Haskell extension to Perl? Maybe something simple, like a function that calculates the Fibonacci sequence? I'm interested in using Haskell, and I see some overlap between the Haskell and Perl communities. Any pointers to Haskell / Perl projects, or cool things that manage to use both of these? I've seen Language::Haskell (which is only an interpreter), but it seems poorly documented, six years old, and mostly broken.
Is it possible to build extensions to Perl using ghci, comparable to using XS (something I don't claim to know anything about)? I realize this question is probably all kinds of wrong, and badly worded. I'm attempting two things that I know little about - Haskell and extending Perl (both of which have always interested me). Feel free to edit this.
The closest work was Inline::Haskell, I think, from the Pugs / Perl 6 era.
You can also embed Perl5 in a Haskell program: http://hackage.haskell.org/package/HsPerl5
The Haskell FFI happily supports calling into Haskell from other languages, but I'm not sure this is sensible in the larger scheme of things. Sounds like you're doing it wrong.
It's perhaps worth noting here that you can write shell scripts in Haskell as well using runhaskell:
#! /usr/bin/env runhaskell
There's HSH for mixing shell expressions into Haskell programs.
And the Simple UNIX Tools Haskell wiki page is full of ideas too.
Nothing Perl-specific, but more about scripting in Haskell.

How can I convert a shell script into a Perl script?

How can I convert a shell script into a Perl script?
I have a 10k-line shell script and want to convert it into Perl. Is there any tool that does that, or any other way to do it?
There is no simple converter. From the Perl FAQ, "How can I convert my shell script to Perl?":
Learn Perl and rewrite it. Seriously, there's no simple converter. Things that are awkward to do in the shell are easy to do in Perl, and this very awkwardness is what would make a shell->perl converter nigh-on impossible to write. By rewriting it, you'll think about what you're really trying to do, and hopefully will escape the shell's pipeline datastream paradigm, which while convenient for some matters, causes many inefficiencies.
Learn whichever shell language the script is written in
Learn Perl
Translate the code
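As a tiny, hypothetical illustration of what that translation looks like (the log format and file name are made up), here is a common shell pipeline and a Perl rewrite that does the same work in a single process:
# shell: grep ERROR app.log | awk '{print $3}' | sort | uniq -c
use strict;
use warnings;

my %count;
open my $fh, '<', 'app.log' or die "app.log: $!";
while (<$fh>) {
    next unless /ERROR/;                 # the grep stage
    my $field = (split ' ')[2];          # awk '{print $3}'
    $count{$field}++ if defined $field;  # sort | uniq -c
}
print "$count{$_} $_\n" for sort keys %count;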

Is it possible to write a shell script which is faster than the equivalent script in Perl? [closed]

I wrote multiple scripts in Perl and shell and I have compared the real execution time. In all cases, the Perl script was more than 10 times faster than the shell script.
So I wondered: is it possible to write a shell script which is faster than the same script in Perl? And why is Perl faster than the shell even though I use the system function in my Perl script?
There are a few ways to make your shell scripts (e.g. Bash) execute faster.
Try to use fewer external commands if Bash's internals can do the task for you, e.g. avoid excessive use of sed, grep, awk etc. for string/text manipulation.
If you are manipulating relatively big files, don't use Bash's while-read loop; use awk. If you are manipulating really big files, you can use grep to search for the patterns you want and then pass them to awk to "edit". grep's searching algorithm is very good and fast. If you only want the front or end of the file, use head and tail.
What file-manipulation tools such as sed, cut, grep, wc, etc. do can mostly be done with one awk script, or with Bash internals if the task is not complicated. Therefore, try to cut down on the use of these tools where their functions overlap.
Unix pipes/chaining is excellent, but using too many of them, e.g. command|grep|grep|cut|sed, makes your code slow. Each pipe is an overhead. For this example, just one awk does them all:
command | awk '{do everything here}'
The closest tool you can use which can match Perl's speed for certain tasks, e.g. string manipulation or maths, is awk. Here's a fun benchmark for this solution. There are around 9 million numbers in the file.
Output
$ head -5 file
1
2
3
34
42
$ wc -l <file
8999987
$ time perl -nle '$sum += $_ } END { print $sum' file
290980117
real 0m13.532s
user 0m11.454s
sys 0m0.624s
$ time awk '{ sum += $1 } END { print sum }' file
290980117
real 0m9.271s
user 0m7.754s
sys 0m0.415s
$ time perl -nle '$sum += $_ } END { print $sum' file
290980117
real 0m13.158s
user 0m11.537s
sys 0m0.586s
$ time awk '{ sum += $1 } END { print sum }' file
290980117
real 0m9.028s
user 0m7.627s
sys 0m0.414s
For each try, awk is faster than Perl.
Lastly, try to learn awk beyond what it can do as a one-liner.
This might fall dangerously close to arm-chair optimization, but here are some ideas that might rationalize your results:
Fork/exec: almost anything useful that is done by a shell script is done via a shell-out, that is, starting a new shell and running a command such as sed, awk, cat etc. More often than not, more than one process is executed, and data is moved via pipes.
Data structures: Perl's data structures are more sophisticated than Bash's or csh's. This typically forces the shell programmer to be creative with data storage. This can take the form of:
using non-optimal data structures (arrays instead of hashes)
storing data in textual form (for example, integers as strings) that needs to be reinterpreted every time
saving data in a file and re-parsing it again and again
etc.
Non-optimized implementation: some shell constructs might not be designed with optimization in mind, but with user convenience. For example, I have reason to believe that the Bash implementation of parameter expansion, in particular ${foo//search/replace}, is sub-optimal relative to the same operation in sed. This is typically not a problem for day-to-day tasks.
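For reference, the two operations being compared there look roughly like this (variable and pattern names are just placeholders):
# in-process Bash parameter expansion:
msg=${line//search/replace}
# the same substitution shelled out to sed (a fork/exec per call):
msg=$(printf '%s\n' "$line" | sed 's/search/replace/g')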
Okay, I know I'm asking for it by opening up a can of worms closed two years ago, but I'm not 100% happy with any of the answers.
The right answer is YES. But most new coders will still go to Perl and Python and write code that struggles mightily to WRAP CALLS TO EXTERNAL EXECUTABLES because they lack the mentoring or experience required to know when to use which tools.
The Korn Shell (ksh) has fast built-in math, and a fully capable and speedy regex engine that, gasp, can handle Perl-style regexes. It also has associative arrays. It can even load external .so libraries. And it was a finished and mature product 10 years ago. It's even already installed on your Mac.
No, I think it is impossible:
Bash is a purely interpreted language, whereas Perl programs are compiled (to an internal op tree) before execution.
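You can see that compile phase at work with B::Deparse, which prints what Perl has already done at compile time (e.g. constant folding) before any code runs; the exact output formatting may vary by Perl version:
$ perl -MO=Deparse -e 'print 2 * 3 + 4'
print 10;
-e syntax OK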
Certain shell commands can run faster than Perl, in some situations. I once benchmarked a simple sed script against the equivalent in Perl, and sed won. But when the requirements became more complex, the Perl version started beating the sed version. So the answer is: it depends. But for other reasons (simplicity, maintainability, etc.) I'd lean toward doing things in Perl anyway, unless the requirements are very simple and I expect them to stay that way.
Yes. C code is going to be faster than Perl code for the same thing, so a script which uses a compiled executable for doing a lot of work is going to be faster than a perl program doing the same thing.
Of course, the Perl program could be rewritten to use the executable, in which case it would probably be faster again.

Is there an autoexpect for Perl's Expect?

I would like to generate Perl Expect code automatically. Does something like autoexpect exist for Perl's Expect?
This is not a good answer, but will have to do until a good answer comes along.
I ran the Tcl autoexpect and it created a script file; I then wrote a couple of lines of Perl code that parse the lines with "send" and "expect" tags and then use the Perl Expect module to run them, along with some other actions.
This hybrid approach gets me by, but I am still hoping for a better answer to come along.
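A rough sketch of that kind of glue (the autoexpect line format, host name, and file names below are assumptions, not taken from a real session):
#!/usr/bin/perl
# Replay the send/expect pairs from an autoexpect-generated Tcl script
# using Perl's Expect module. Real autoexpect output needs more careful
# parsing than these two regexes.
use strict;
use warnings;
use Expect;

my $exp = Expect->spawn('ssh', 'example.host') or die "spawn failed: $!";

open my $fh, '<', 'session.exp' or die "session.exp: $!";
while (my $line = <$fh>) {
    if ($line =~ /^expect\s+-exact\s+"(.*)"/) {
        $exp->expect(10, $1);            # wait up to 10s for that exact text
    }
    elsif ($line =~ /^send\s+--\s+"(.*)"/) {
        (my $text = $1) =~ s/\\r/\r/g;   # turn a literal \r back into a carriage return
        $exp->send($text);
    }
}
$exp->soft_close;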