Can aptitude for learning Programming paradigms be influenced by culture or native language's grammar? [closed] - perl

It is well known that different people have different aptitudes for various programming paradigms: some people have trouble learning non-procedural languages, especially functional ones; some have trouble understanding pointers (see Joel Spolsky's blog for musings on that); some have trouble grasping recursion.
I was recently reading about a study that looked at how the grammar of someone's native language affected their speed of learning math. I can't find that article now, but a quick googling found this reference.
That led me to wondering whether someone's native culture or first language might affect their aptitude for various programming paradigms. I'm more curious about positive influences, e.g. some trait that makes it easier or faster for someone to learn a particular paradigm, such as a native language grammar that is very recursion-oriented.
To be clear, I'm looking at how culture or language grammar may affect the differences in one person's aptitude across various paradigms, as opposed to how it affects overall programming aptitude between different people.
Important: the only answers I'm interested in are either references to scientific studies, or personal observations from someone intimately familiar with a particular culture/language, including from their own experience.
E.g. I'm not interested in your opinion of how Chinese being your first language affects anything unless you speak Chinese or have worked extensively with a very large set of native Chinese-speaking programmers.
I'm OK with your guesstimates not based on scientific studies, but please be sure to supply your reasoning about plausible causes of your observation.
I'm not interested in culture-bashing (any such comments will be deleted or flagged for deletion).
I'm also not particularly interested in culture-building - we all know Linus is from Finland and Tetris was written in Russia and Larry Wall is an American. Any culture/nation can produce a brilliant mind in any discipline. I'm interested in averages.

Disclaimer: I was a Cultural Anthropologist before I got into programming, so you know I'm going to be on a high horse, here.
Obviously, a person's history will have an impact on their aptitude for any particular task, but I think this has less to do with the structure or grammar of a person's language than it does with the particular material conditions of the culture in which that language is spoken.
For example, a pair of Anthropologists in the 60's went to various African communities and tested people's susceptibility to various optical illusions. Here is a classic one:
In this illusion, the bottom line looks longer, because the angled lines connecting it make it appear to be off in the distance.
These Anthropologists found that in many African cultures, the illusion doesn't work at all - people consider the lines to be the same length. By refining their study, they found that the only people who were susceptible to the illusion were people who had grown up in an urban environment. They hypothesized that the illusion did not work on people from remote jungle environments, because these people had little or no experience with right angles and seeing things at very long distances.
My point with this is that even if you successfully found a correlation between programmers' native languages and their abilities with certain aspects of programming, you couldn't be sure that the correlation wasn't spurious. For example, you might think that Asians tend to be bad drivers, and you might even be able to demonstrate this statistically. If you then concluded, however, that "bad driving" is some sort of fundamental characteristic of Asian-ness, you would be ignoring the fact that Asians are more likely to be from Asia, and thus to have had much less experience driving cars (or even being in cars) while growing up than Westerners (and especially Americans) have had.
With programming, we might think that a particular language inhibits programming ability, and not take note of the fact that the society in which that language is spoken has much less access to computers, and thus people growing up with that language appear to have less programming aptitude or ability to understand certain programming concepts.
In short, I wouldn't give much credence to the idea that language inhibits anyone's ability to understand anything in particular. The human mind is much too flexible and adaptable for that to be true.

This seems analogous to the Sapir-Whorf Hypothesis - that the facilities of a language affect the ease with which one can cogitate about certain subjects, or in the words of the Wikipedia article:
"The linguistic relativity principle (also known as the Sapir-Whorf Hypothesis) is the idea that the varying cultural concepts and categories inherent in different languages affect the cognitive classification of the experienced world in such a way that speakers of different languages think and behave differently because of it."
( http://en.wikipedia.org/wiki/Linguistic_relativity )
While there appears to be little definitive information here, the discussions appear to be relevant to the question, and perhaps worthy of further exploration.

Just a few random thoughts. I think such influences are generally very weak and can usually be neglected, but they do exist and sometimes they make themselves felt.
In Chinese grammar, for example, we don't quite distinguish between plural and singular forms, but I wouldn't think we Chinese have any noticeable difficulty understanding the concepts of scalar and array in Perl. The reason might be this: although we generally don't need particular suffixes or changes in form to indicate whether something is singular or plural, we do have the concepts of plural and singular, and we mostly depend upon context to tell them apart. Grammar-wise, context in Chinese may well be more important than in the languages of the Indo-European family. We omit a lot of things, sometimes because they have already been mentioned and sometimes because we presume they will be implicitly understood by the listener. In either case, we don't need indefinite and definite articles (a, an, the) or relative pronouns like that, which and who to indicate whether something is being mentioned for the first time or yet again.
Maybe that's partially why I feel very comfortable with Perl's default variable $_: print, chomp and split all act upon $_, which is never explicitly mentioned. But this is quite subjective.
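For instance, here is a minimal sketch (my own, just to illustrate the point; the sample data is made up) of how several Perl built-ins quietly fall back on the implicit topic variable $_ without it ever being named:
#!/usr/bin/perl
use strict;
use warnings;

# The loop variable, chomp, the pattern match and split all default to $_.
while (<DATA>) {        # each line is read into $_
    chomp;              # chomp strips the newline from $_
    next unless /\S/;   # the match also defaults to $_
    my @words = split;  # split with no arguments splits $_ on whitespace
    print scalar(@words), " words: $_\n";
}

__DATA__
we omit what the listener
already understands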
I think the Chinese language is more characterized by implicitness and fuzziness than the Indo-European languages. For example, we never pay attention to subject-verb agreement and we never conjugate verbs to denote tenses. This could mean that the Chinese are inclined to use a not quite so logical mode of thinking. One of my teachers once used an example to try to generalize (or maybe over-generalize) the difference between the Chinese non-logical mode of thinking and the American logical mode of thinking.
If the American version of quarrelling should be this:
“I can lick you.”
“No, you can’t.”
“Yes, I can.”
“No, you can’t.”
“I can.”
“you can’t.”
“Can!”
“Can’t!”
The Chinese version (translated in English) would be something like this:
I can lick you.
How dare you!
What if I dare?
Then you try.
Try? Hm, you wait and see.
Wait and see? I’m not afraid.
Not afraid? OK. You don’t run away.
Who runs away? Come on and lick
Well, I agree that there may be some differences between the Chinese way of thinking and that of other countries, but the example looks like a stereotype, because the Chinese may easily switch to the American version. Back to the question: I think language and culture may indeed influence a programmer's learning process in one way or another, but this influence is definitely not decisively noticeable. Maybe the culture you're exposed to makes you feel a little bit uncomfortable getting used to some notions in some programming language, recursion or whatever, but time will solve it.

I was recently reading about a study that looked at how the grammar of someone's native language affected their speed of learning math. ... Important: the only answers I'm interested in are either references to scientific studies, or personal observations from someone intimately familiar with a particular culture/language, including from their own experience.
I learned a lot of maths before I started programming (enough to count as "intimately familiar"), and IMO programming is relatively easy: more tangible.
Sometimes I've wondered whether it's beneficial to know more than one human language: if you only know one language, then you might think of the words "cat" and "dog" as being values, i.e. synonymous with cat and dog objects; but if you're fluent in more than one language, then "cat" and "dog" become pointers, because for example the French words "chat" and "chien" refer/point to the same objects as "cat" and "dog", so clearly there's a distinction between the word and the object.
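To put a Perl-ish spin on that analogy (my own toy sketch, not from any study; the hash and its fields are invented), two different "words" can simply be two references to one and the same underlying thing:
#!/usr/bin/perl
use strict;
use warnings;

# Two names, one object: both references point at the same hash,
# much as "cat" and "chat" both point at the same animal.
my %animal = ( species => 'Felis catus', sound => 'meow' );

my $cat  = \%animal;   # the English "word"
my $chat = \%animal;   # the French "word" for the same object

$cat->{sound} = 'purr';                  # change it through one name...
print $chat->{sound}, "\n";              # ...and the other name sees it: purr
print "same object\n" if $cat == $chat;  # references compare equal numerically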
It's disappointing that you posted the question without linking to the article which inspired it. I thought of "reverse Polish notation" and wondered whether that was at all the kind of difference in "grammar" that was considered in the original study.

The reference you cite seems to rest on the assumption that making things easier helps with learning. In my understanding, there is a countereffect: without enough challenge, you're not learning enough.
There are theories/studies (anyone with a link?) that the development of language created crucial pressure on expanding the cerebral cortex and thus "made us human" (in very Darwinian terms: more grey matter ==> better language capabilities ==> better teamwork ==> better survival as a group). So language complexity can't be all bad for learning.
(My only qualification is being an eager follower of The Frontal Cortex blog, so take this with a grain of salt.)
In German we have a strange ordering of numbers: the 10^0 and 10^1 positions are switched, but the others are normal (e.g. 25 is 'five and twenty', 125 is 'one hundred five and twenty'). It's been claimed that this makes learning numbers harder, and thus that German should adopt a more intuitive ordering.
I guess that it helps a lot with doing additions in your head - at least if you stay below 100 or 200 - you can first add the 10^0 position and already say it / write it down while taking any carry into account for the 10^1 position.
(That doesn't continue for 10^2, I guess that would be done in writing by the majority anyway)
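As a rough illustration of that units-first habit (my own toy sketch, not from any study), here is the carry trick spelled out for 47 + 38, which in German word order comes out as "five and eighty":
#!/usr/bin/perl
use strict;
use warnings;

# Speak the sum units-first, the way German number words are ordered.
my ( $x, $y ) = ( 47, 38 );

my $ones  = ( $x % 10 + $y % 10 ) % 10;          # 5 -- can be said right away
my $carry = int( ( $x % 10 + $y % 10 ) / 10 );   # 1 carried into the tens
my $tens  = int( $x / 10 ) + int( $y / 10 ) + $carry;   # 8

print "$tens$ones\n";   # 85, i.e. "fünfundachtzig" ("five and eighty")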
Also: abstractions. There are languages where numbers aren't abstracted from objects, "two coconuts" and "two sabretooth tigers" don't share a common "two" word / concept. Such a language would probably be very bad for developing math skills. Here the abstraction (separating number and object) in language is important.
Generally, I'd say the language has a strong effect on shaping a developing mind, and I see no reason why this should not extend to culture.
Of course it's still open what would be the "right kind of complexity" - for what, and how particular language features affect general improvement vs. establishment of an elite (i.e. "sharpening the skills of the gifted, while hampering the rest").
Interesting Question, no doubt - looking forward to other replies.

Related

Examples of excellent Common Lisp code?

I've learned enough Common Lisp to be able to muddle my way through writing an application. I've read Seibel's Practical Common Lisp.
What libraries or programs should I be reading to understand the idioms, the Tao, of Common Lisp?
CL-PPCRE is often cited as a good example, for good reason. Actually, probably any of Edi Weitz's libraries will make good reading, but CL-PPCRE is particularly clever and it's a useful and impressive library. Beyond that a lot of CL implementations are written mostly in CL. It can be pretty productive to pick some part of CL that's usually implemented in CL and compare how different implementations handle it. In particular, some of the best examples of large useful macro systems are implementations of things in the standard. Loop is an interesting read, or if you're really ambitious you could compare a few implementations of CLOS.
If there's some area of computing you are particularly interested in it might be worth mentioning that, so people can tailor recommendations to that.
It's another book, so it may not be precisely what you're looking for, but Peter Norvig's Paradigms of Artificial Intelligence Programming contains a lot of well-written, smallish Common Lisp programs. It's not perfectly natural code, especially in the first few chapters, because, like the code in Practical Common Lisp, it focuses on teaching you how to program in CL, but it's still very much worth a read. It also contains some excellent examples of ways you can build other languages on top of Common Lisp, and it has some valuable advice on how to improve the performance of CL programs.
The other recommendations (PAIP and CL-PPCRE) are excellent. I would also suggest becoming acquainted with Alexandria's code and also taking a look at GBBopen.
The Art of the Metaobject Protocol is a book with the most beautiful code ever written.
LiSP (Lisp in Small Pieces) is a neat book; it shows Closette (a small object system) and some compiler stuff. Without doubt, Norvig's book is awesome.
I really like "Building Problem Solvers" as well, but the code is a bit rough. I'm not used to binding dynamically-scoped variables in the parameter list. But it made for many fun improvements, trying to "pre-compile" the discrimination net. This book gives other approaches to some of Norvig's code; I still think that Norvig's code is much "neater and cleaner" (easier to read and understand, and still creative).
"Building Problem Solvers" should come with a warning label like "you're gonna need some aspirin". It felt like a bolt of lightning struck me in the head when I finally "understood" RMS and Sussman's dependency-directed backtracking. Compiling the pdis was also brilliant. Excellent stuff. I just wish I could remember it all ...
I like the SBCL code.
I like the SBCL code.
The only thing I would offer is to program. That is what I did.
I did two things. First, I tackled a problem that I was familiar with, a unit testing framework, and expanded it to include test suites, to get an understanding of macro writing.
The second thing I did was play around with basic objects in CL. Macros, closures, and style.
Also don't forget about getting feedback from Lispers about your code.
(defun ugly-lisp-code? () ())

Why is the Lisp community so fragmented? [closed]

To begin, not only are there two main dialects of the language (Common Lisp and Scheme), but each of the dialects has many individual implementations. For example, Chicken Scheme, Bigloo, etc... each with slight differences.
From a modern point of view this is strange, as languages these days tend to have definitive implementations/specs. Think Java, C#, Python, Ruby, etc, where each has a single definitive site you can go to for API docs, downloads, and such. Of course Lisp predates all of these languages. But then again, even C/C++ are standardized (more or less).
Is the fragmentation of this community due to the age of Lisp? Or perhaps different implementations/dialects are intended to solve different problems? I understand there are good reasons why Lisp will never be as united as languages that have grown up around a single definitive implementation, but at this point is there any good reason why the Lisp community should not move in this direction?
The Lisp community is fragmented, but everything else is too.
Why are there so many Linux distributions?
Why are there so many BSD variants? OpenBSD, NetBSD, FreeBSD, ... even Mac OS X.
Why are there so many scripting languages? Ruby, Python, Rebol, TCL, PHP, and countless others.
Why are there so many Unix shells? sh, csh, bash, ksh, ...?
Why are there so many implementations of Logo (>100), Basic (>100), C (countless), ...
Why are there so many variants of Ruby? Ruby MRI, JRuby, YARV, MacRuby, HotRuby?
Python may have a main site, but there are several slightly different implementations: CPython, IronPython, Jython, Python for S60, PyPy, Unladen Swallow, CL-Python, ...
Why is there C (Clang, GCC, MSVC, Turbo C, Watcom C, ...), C++, C#, Cilk, Objective-C, D, BCPL, ... ?
Just let some of them turn fifty and see how many dialects and implementations they have then.
I guess Lisp is diverse, because every language is diverse or gets diverse. Some start with a single implementation (McCarthy's Lisp) and after fifty years you got a zoo. Common Lisp even started with multiple implementations (for different machine types, operating systems, with different compiler technology, ...).
Nowadays Lisp is a family of languages, not a single language. There is not even consensus what languages belong to that family or not. There might be some criteria to check (s-expressions, functions, lists, ...), but not every Lisp dialect supports all these criteria. The language designers have experimented with different features and we got many, more or less, Lisp-like languages.
If you look at Common Lisp, there are about three or four different active commercial vendors. Try to get them behind one offering! Won't work. Then you have a bunch of active open source implementations with different goals: one compiles to C, another one is written in C, one tries to have a fast optimizing compiler, one tries to have some middle ground with native compilation, one is targeting the JVM ... and so on. Try to tell the maintainers to drop their implementations!
Scheme has around 100 implementations. Many are dead, or mostly dead. At least ten to twenty are active. Some are hobby projects. Some are university projects, some are projects by companies. The users have diverse needs. One needs a real-time GC for a game, another one needs embedding in C, one needs only barebones constructs for educational purposes, and so on. How do you tell the developers to stop hacking on their implementations?
Then there are some who don't like Common Lisp (too big, too old, not functional enough, not object oriented enough, too fast, not fast enough, ...). Some don't like Scheme (too academic, too small, does not scale, too functional, not functional enough, no modules, the wrong modules, not the right macros, ...).
Then somebody needs a Lisp combined with Objective-C, then you get Nu. Somebody hacks some Lisp for .net. Then you get some Lisp with concurrency and fresh ideas, then you have Clojure.
It's language evolution at work. It is like the Cambrian explosion (when lots of new animals appeared). Some will die, others will live on, some new ones will appear. At some point in time some dialects appear that pick up the state of the art (Scheme for everything with functional programming in Lisp in the 70s/80s, and Common Lisp for everything MacLisp-like in the 80s) - that causes some dialects to mostly disappear (namely Standard Lisp, InterLisp, and others).
Common Lisp is the alligator of Lisp dialects. It is a very old design (hundred million years) with little changes, looks a little bit frightening, and from time to time it eats some young...
If you want to know more, The Evolution of Lisp (and the corresponding slides) is a very good start!
I think it is because "Lisp" is such a broad description of a language. The only things common to all the Lisps that I know are that most things are in brackets and that they use prefix function notation. E.g.
(fun (+ 3 4))
However nearly everything else can vary between implementations. Scheme and CL are completely different languages, and should be considered like that.
I think calling the Lisp community fragmented is like calling the "C-like" community fragmented. It has C, C++, D, Java, C#, Go, JavaScript, Python and many other languages which I can't think of.
In summary: Lisp is more of a language property (like garbage collection, static typing) than an actual language implementation, so it is completely normal that there are many languages that have the Lisp like property, just like many languages have garbage collection.
I think it's because Lisp was born out of, and maintains the spirit of, the hacker culture. The hacker culture is to take something and make it "better" according to your belief in "better".
So when you have a bunch of opinionated hackers and a culture of modification, fragmentation happens. You get Scheme, Common Lisp, ELISP, Arc. These are all pretty different languages, but they're all "Lisp" at the same time.
Now why is the community fragmented? Well, I'll blame time and maturity on that. The language is 50 years old! :-)
Scheme and Common Lisp are standardized. SBCL seems like the de facto open source Common Lisp, and there are plenty of examples out there on how to use it. It's fast and free. ClozureCL also looks pretty darn good.
PLT Scheme seems like the de facto open source Scheme, and there are plenty of examples out there on how to use it. It's fast and free.
The CL HyperSpec seems as good as the JavaDoc to me.
As far as community fragmentation goes, I think this has little to do with standards or resources. I think it has far more to do with what has been a relatively small community until recently.
Clojure I think has a good chance to become The Lisp for the new generation of coders.
Perhaps my point is a very popular implementation is all that is required to give the illusion of a cohesive community.
LISP is not nearly as fragmented as BASIC.
There are so many dialects and versions of BASIC out there I have lost count.
Even the most commonly used implementation (MS VB) is different between versions.
The fact that there are many implementations of Common LISP should be considered a good thing. In fact, that there are roughly as many free implementations of Common LISP as there are free implementations of C++ is remarkable, considering the relative popularity of the two languages.
Free Common LISP implementations include CMU CL, SBCL, OpenMCL / Clozure CL, CLISP, GCL and ECL.
Free C++ implementations include G++ (with Cygwin and MinGW32 variants), Digital Mars, Open Watcom, Borland C++ (legacy?) and CINT (interpreter). There are also various STL implementations for C++.
With regards to Scheme and Common LISP, although admittedly, an inaccurate analogy, there are times when I would consider Scheme is to Common LISP what C is to C++, i.e. while Scheme and C are small and elegant, Common LISP and C++ are large and (arguably) more suited for larger applications.
Having many implementations is beneficial, because each implementation is optimal in unique places. And modern mainstream languages don't have one implementation anyway. Think about Python: its main implementation is CPython, but thanks to Jython you can run Python on the JVM too; thanks to Stackless Python you can have massive concurrency thanks to microthreads; etc. Such implementations will be incompatible in some ways: Jython integrates seamlessly with Java, whilst CPython doesn't. Same for Ruby.
What you don't want is having many implementations which are incompatible to the bone. That's the case with Scheme, where you can't share libraries among implementations without rewriting a lot of code, because Schemers can't agree on how to import/export libraries. Common Lisp libraries, OTOH, because of standardization in core areas, are more likely to be portable, and facilities exist to conditionally write code handling each implementation's peculiarities. Actually, nowadays you may say that Common Lisp is defined by its implementations (think about the ASDF package installation library), just like mainstream languages.
Two possible contributing factors:
Lisp languages aren't hugely popular in comparison to other languages like C/C++/Ruby and so on - that alone may give the illusion of a fragmented community. There may be equal fragmentation in the other language communities, but a larger community will have larger fragments.
Lisp languages are easier than most to implement. I've seen many, many "toy" Lisp implementations people have made for fun, and many "proper" Lisp implementations built to solve specific tasks. There are far more Lisp implementations than there are, say, Python interpreters (I'm aware of about 5, most of which are generally interchangeable).
There are promising projects like Clojure, which is a new language, with a clear goal (concurrency), without much "historical baggage", easy to install/setup, can piggyback on Java's library "ecosystem", has a good site with documentation/libraries, and has an official mailing list. This pretty much checks off every issue I encountered while trying to learn Common Lisp a while ago, and encourages a more centralised community.
My point of view is that Lisp is a small language, so it is easy to implement (compared to Java, C#, C, ...).
Note: as many have commented that it is indeed not that small, that misses my point. Let me try to be more precise: this boils down to the entry price. Building a VM that compiles some well-known mainstream language is quite hard compared to building a VM that deals with LISP. That makes it easy to build a small community around a small project. The libraries and spec may or may not be fully implemented, but the value proposition is still there. Clojure is a typical example, where something like the R5RS is certainly not in scope.

Is learning LISP useful at all these days? [closed]

I picked up a LISP book at a garage sale the other day and was just wondering if it was worth spending some time on.
Yes. I'll stick to Common Lisp, here, though Scheme is also a superb language that has a lot to recommend it.
In Common Lisp, you have a largish multi-paradigm language that provides some things that either don't exist widely outside the Lisp family of languages, or are limited to CL and even more obscure/niche languages.
The first feature, which you can get in one way or another from CL, Scheme and quite a few other dialects, is a real macro system.
I say "real" because the system is much more complete, flexible and reliable than, say, C preprocessor macros. It's extremely difficult to get CPP macros to do even simple things (like swapping the values of two variables, or making a foreach construct) in a reliable fashion, but these are trivial with Lisp macros. This turns out to be a very powerful tool for introducing new abstractions and dispensing with "boilerplate" code.
The second feature, which is effectively limited to Common Lisp, is CLOS, the Common Lisp Object System. Despite the name, it's not a conventional OO system like that of Java with methods being part of a class's definition. Instead, it provides polymorphism through "generic functions" which are what methods are attached to, and by default allow you to do multiple dispatch.
I vastly prefer CLOS to the more usual approach to object orientation, as it makes a number of "patterns" (like the Visitor pattern) completely unnecessary and because extension of existing generic functions is so easy; others loathe it because it takes an extremely cavalier approach to encapsulation and because extension of generic functions becomes arguably too easy. Either way, CLOS is different enough that I think it's worth learning just for the different perspective it provides.
The third feature, which is available outside of Lisp but still fantastic if you've never experienced it before, is dynamic, interactive programming. CL debuggers tend to be extremely powerful tools, and CL provides for dynamic definition and redefinition of functions, classes and methods, all of which dramatically improve one's ability to explore a problem, test solutions of that problem and its subproblems, and finally put together a program that works correctly and efficiently.
Lastly, for a lot of classes of problems, Lisp is a great practical language. It provides good performance (usually not as fast as C, but dramatically faster than most "scripting languages"), safety, automatic memory management, a decent "standard library" of functions and tremendous opportunities for easy extension.
It is worth learning for "mind-expansion" purposes but not so popular for building apps these days.
However, it is powerful, and mature, and there are fast and free compilers out there. So there is no reason not to choose it for a program if you like.
The way in which Lisp treats data structures and program structures the same offers amazing power which is worth understanding.
Its history is fascinating and it has shaped the world of computer science.
Be sure to check out Abelson and Sussman's Structure and Interpretation of Computer Programs at MIT OpenCourseWare
Clojure: a Lisp on .NET and Java VMs
GNU CLISP.
CMU Common Lisp
Depends on the book. Which book?
Common Lisp is worth learning today because it's one of the few languages that pretty much "does everything". If there's some mainstream or obscure programming idiom or technique, odds are Common Lisp has it already in some form. About the only thing CL lacks is continuations (many argue it doesn't need them, but that's not helpful if you want to explore them).
Anyone spending any serious time writing in Common Lisp will come out Changed in some way, typically for the better, IMHO.
Even if you can't carry all of the Lispy concepts you learn and use in to other environments, knowing about them and how they work is still useful.
Good programmers expose themselves to as many different programming paradigms as they can - not as many programming languages as they can.
The LISP family (there are several variants) is very worthwhile getting to know. Your objective should be getting your head wrapped around functional programming and the lambda calculus - the paradigm that LISP is based on. Focus less on becoming an "ace" LISP programmer (that could take years).
If you find functional programming "flips your switch", try having a look at PROLOG too - here the paradigm is based on evaluation of Horn Clauses (predicate logic).
I may have spent the last 20 years earning a living as a COBOL programmer (OMG - they still have those!), but I think I am a better programmer because of the time spent learning what LISP, and a number of other programming languages were really all about.
Have a blast...
Here's a Google Tech talk worth watching about a company currently producing large, complex software for the airline industry in Common Lisp:
Lisp for High-Performance Transaction Processing (The video is now unavailable. Here's some note by Zach Beane.)
Other topics are mentioned in that video, including Clojure, a new variant of Lisp for the JVM (some work is now being done to develop Clojure for the CLR too, but that is not as far along), which is worth checking out for the way it addresses concurrency issues. See the Clojure site at:
http://clojure.org/
and, in particular at first, check out the link in the upper right on that page to some excellent Screencasts with overviews of the concurrency issues and Clojure features.
If you get interested in Lisp, and the book you found is not that great (I have an old one myself that didn't do much for me), Paul Graham's book On Lisp is available free at http://www.paulgraham.com/onlisp.html and is very good. The general Lisp idea is the same for Common Lisp or Scheme or Emacs Lisp or Clojure, but the specifics will be different - so keep that in mind if reading Graham's book, which focuses mostly on Common Lisp (with some mentions of Scheme specifics.) On Lisp is probably not the best beginner book, but it's worth going through it and just skimming over specifics you're not ready to follow in detail yet to see what is there, particularly with regard to macros, which On Lisp really explores.
One benefit of Lisp is that you develop an appreciation for prefix notation and wonder why everyone else in the world doesn't use it too (like with LaTeX or vim):
+ 1 2 3 4 5 6 7 8 9
is much easier to code/edit/paste than
1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9
See this question, and especially the third answer, the one that explains that Lisp is good for solving complicated problems, those problems that are hard to decompose into more manageable modules if you were to try to solve them in another language. In other words, if you are keen on exploring ways to solve convoluted problems, that involve, let's say Natural Language Processing, or Knowledge Aggregation... then, yes, Lisp might just be useful to you.
Learn Lisp to learn about macros, and that code is data, and to learn that you can reach enlightenment w/o the self-flagellation of C++.
Learn Common Lisp to learn about reader macros and compiler macros. I don't know any other language that has them.
Learn Scheme for continuations.
Learn Clojure because it's going to make Java obsolete :-)
It (Common Lisp) is still heavily used by academics working in Artificial Intelligence. Scheme is a Lisp-like language used by many (most?) CS departments as well. Personally I think learning Lisp is worthwhile whether or not you end up using it. It's a classic language that we've learned a great deal from over time.
Learning LISP is a good way to learn functional programming effectively, and it is often used as an introductory language for undergraduate students. Many people feel that Structure and Interpretation of Computer Programs, which uses the Scheme dialect of Lisp, is a book that should be on every programmer's shelf.
Paul Graham has been a big proponent of Lisp, and in his book Hackers and Painters he describes how he used the power of Lisp to dominate the competition in building Viaweb, which became Yahoo! Store.
Elsewhere, I've seen Lisp dialects used prominently in the aerospace industry, as scripting tools for integration frameworks like Comet, and AML. Lisp will always be tied to the early AI experiments in the 1950's.
As others have alluded to, Lisp isn't that popular anymore for general programming, and it definitely (IMO) has some major problems for writing real systems in, but of course others disagree (for instance much of ITA's software is written in Lisp, and they make crazy bank).
Even if you never write a 'real' program in Lisp, it is absolutely worth learning. There are many programming techniques originally pioneered in Lisp that, knowing them, will help you write better code in Python, Perl, Ruby, ML, Haskell, and even C++. For instance, check out Higher Order Perl, which shows how to do all kinds of amazing tricks in Perl; to quote the introduction "Instead of telling you how wonderful Lisp is, I will tell you how wonderful Perl is, and in the end you will not have to know any Lisp, but you will know a lot more about Perl. [...] Then you can stop writing C programs in Perl. I think you will find it to be a nice change. Perl is much better at being Perl than it is at being a slow version of C."
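In that spirit, here is a tiny sketch of my own (not from the book) of the kind of Lisp-inspired, higher-order trick that Perl handles comfortably: a function that manufactures closures.
#!/usr/bin/perl
use strict;
use warnings;

# make_counter returns a new anonymous sub each time; each sub closes
# over its own private $start, i.e. it carries its own state around.
sub make_counter {
    my ($start) = @_;
    return sub { return $start++ };
}

my $small = make_counter(1);
my $big   = make_counter(100);

print $small->(), $small->(), $small->(), "\n";   # 123
print $big->(), "\n";                             # 100 -- independent state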
And there are some great books out there that use Lisp, and it will be easier to understand them if you know the language - SICP, Norvig's Paradigms of Artificial Intelligence Programming, and The Reasoned Schemer all come to mind as must-reads.
Some people here are recommending Clojure. While I would say that Lisp in general is good, I would caution against Clojure, at least for beginners. This is not out of any difficulty using Clojure, just in the fact that Clojure is gratuitously inconsistent with Common Lisp and Scheme. If you want JVM integration, use Armed Bear Common Lisp: https://common-lisp.net/project/armedbear/ , though for general use, I would recommend Steel Bank Common Lisp: http://www.sbcl.org/
This is like asking "is it worth learning C in these days of web programming?" Only you can decide if it's worth it. What you have to ask yourself is: what am I trying to achieve by reading the book?
Learning a new language sometimes isn't useful in the practical sense of things (maybe you aren't going to use LISP ever in your life), but in the long term it's going to be useful because of the knowledge acquired from paradigms you aren't too familiar with - and you could use some of what you learned in what you already use today.
It's been awhile since university, but after I took a 3rd year CS course that required learning Lisp and writing Lisp programs, writing and thinking recursively was a snap. Not that I had problems with recursion before the course, but afterwards, it was 2nd nature. I also used CLOS in the course (University of Toronto), but it was so long ago, I barely remember what I did.

Plural form of word "mutex" [closed]

What is the correct plural form of the portmanteau mutex? Is it mutexes or mutices?
From a purely linguistic point of view, the correct usage is mutexes, because the word mutex is not Latin in origin. Prescriptivists would wail in anguish if mutices were to enter regular usage.
The -ices usage (e.g., the plurals of index and vertex) is falling out of favor. Indexes and vertexes are both correct usage, for example.
Let their common usage decide...
GoogleFight
Everyone knows that the correct answer is Mutii.
Mutexes. It's correct in a de facto manner--- the vast majority of people (in my experience, certainly) call them mutexes, not mutices, and English is a language that's defined by use. :)
As mutex is short for "mutual exclusion", I would only imagine that "mutual exclusions" would become mutexes. Mutices would be confusing. Better to be unambiguous.
As a side note: it's not a portmanteau, or it would be a mutsion.
There's no official correct form because 'mutex' hasn't gained wide enough circulation to enter any of the major English dictionaries. Thus, the most correct term is whatever is used most by people. And I think that Google hits are a pretty good indicator of (relative) usage frequency, as great_lama has pointed out.
Other English nouns that end in -ex or -ix:
Affix
Annex
Apex
Appendix
Cervix
Circumflex
Complex
Cortex
Crucifix
Duplex
Helix
Ibex
Index
Infix
Latex
Matrix
Phoenix
Prefix
Postfix
Reflex
Remix
Suffix
Vertex
Vortex
And lots more less common words. If you look up these in the dictionary, you'll find that most of them have both plurals shown as acceptable. Several have only the -exes/-ixes form, but few or none (depending on the dictionary you use) have only the -ices form.
In conclusion, I believe mutexes to be the correct plural form of mutex.
Either/or. I've seen both (though mutexes is considerably more common).
Mutex is not in any real dictionary I know of, so there's no "official answer."
Index can be pluralized to indexes or indices, though, so it makes sense that mutex could follow suit.
Since the word apex can be pluralized as either apexes or apices, I'd say you can pronounce it either mutexes or mutices. Whatever suits you.
I think that the hysterical raisins (in this case the fact that "mutex" is a portmanteau) should not be given too much weight in resolving such issues.
Perhaps it would be more useful to consider similar words and their usage; reflex -> reflexes for example.
Or, use the simplest choice: most pluralizations in English use -s or -es (the -es typically after a sibilant ending, as with the x here); in this case -es.
I guess I can't see any reason to use the alternative, except as some sort of tribute to Latin, once thought to be the noblest of all languages. :)
Maybe it is like sheep? Singular and plural?

The future of Perl? (Perl 6, employability)

I've found a few related questions, like Python vs. Perl (now deleted) and Is Perl Worth it? (now deleted), but I can't seem to find anything that directly addresses this question.
Is there a legitimate future in Perl? I work in a Perl shop right now, and I came from PHP so I see some of the advantages of an arguably "lower" level language when doing things on the server-level, but it seems to me a lot of the tasks in Perl can be performed more quickly in PHP, and SOME ARGUE (subjective, not my opinion) that Python does these tasks in a more explicit way that's easier to maintain.
Is having this job on my resume ultimately going to make me less employable, especially if the language no longer grows?
A few notes:
I love Perl, so don't think I'm bashing the language. It's fun to use and we use a fairly verbose syntax that is relatively easy to maintain.
I realize that "Vaporware" is a buzzword that isn't necessarily applicable to this situation, because Perl doesn't have a marketing department and they're not "promising" Perl 6 by any date.
I realize that CPAN keeps the community going, so whether Perl 6 comes out or not people continue to build modules that increase possibilities in the language, but that doesn't mean that industry shops realize this, and they switch to "more supported" languages that keep coming out with revised versions, like Python and (especially) PHP.
EDIT {CLARIFICATION}
Cade Roux and Telemachus both brought up good points about whether or not your future can be defined by your resume.
To be honest, this was brought up when one of my former employers said "I don't hire anyone with Perl as their last job. That's OLD technology." This was a PHP shop, so take all that with a grain of salt.
Now without defaming my former employer, she's not a tech person AT ALL, so she was really expressing an opinion of a layperson, and in this case my question was more along the lines of "Is there a stigma on this particular technology placed on it by people who don't utilize it?", specifically more along the lines of people who may have had past experience with similar employers. I'm not asking you to look into the future with a magic glass to assume what the next "hot" language would be, but rather if this particular language (which is accused of stunted growth, again by laypeople) has negative connotations placed upon it.
I hope that makes a little more sense.
Plenty of shops - including on Wall Street - heavily use Perl and will continue to do so.
However, I have never seen PHP or Python used in this industry (not saying they are not used, just that I never encountered them; purely personal anecdote). Nor have I EVER heard any conversation of "Perl can not do X that Python can, let's use Python".
Perl 6 is irrelevant to the job picture.
Many shops are still on 5.8 or, G-d forbid, 5.6.
More importantly, Perl 5 continues to evolve, including with features/ideas from Perl 6. See Perl 5.10 and 5.11.
Plus, that evolution includes really cool frameworks like Moose, etc.
I can probably come up with more bullets later, but the summary is that no, having a Perl job will in no way negatively affect your career prospects.
However, knowing nothing but Perl may affect it negatively, so make sure you know Java, C#, C++ or something besides dynamic interpreted languages. Not many shops would hire a "Perl only" developer, even if they gladly hire "Perl + other stuff" ones.
See Tim Bunce's Perl Myths slides on slide share.
In short, Perl is not dead and has lots of jobs available.
Anyone who actually watches the development of Perl would know that there has perhaps been more work on the Perl language in the past decade than in the previous one.
This has been spurred on by the introduction of Perl 6, which also spurred the now deeply ingrained testing culture.
Just look at how much the Rakudo implementation of Perl 6 is tested:
Rakudo Progress http://rakudo.de/progress.png
There has also been a lot of back-porting of Perl 6 features into Perl 5.
For example, the Perl 6 "switch" statement:
#!/usr/bin/perl
use strict;
use warnings;
use 5.10.1;   # or: use feature qw'switch say';

my $str = "testing 123";

given( $str ){
    when( /(\d+)/ ){
        say $1;
    }
    when( [0..10] ){
        say $_, ' is equal to some number between 0 and 10';
        # given() sets the current topic "$_"
    }
}
There are few languages I would tie my career to. Perl will always be there, and it will always be the best tool for certain kinds of jobs. But this is true for many languages. However, there are also languages which have more competition in some of the spaces where they are used. Perl is one language that has a lot of strong niches.
Still, you wouldn't restrict yourself to using just one language for your entire life - or even in one project if there are better options to solve a problem.
Career-wise, there are basic technologies which are fairly universally used, and of these I think a few of the most valuable are: relational database concepts and SQL, XML/HTML/HTTP/DOM, regular expressions. These are all basically independent of any particular vendor or language, and if you are strong in these areas, choice of language and platform are going to be informed by the problem being addressed.
Perl is, and always will be, a practical language for manipulating large amounts of data. I work in an industry where moving, converting, and parsing large amounts of text and image data is what we do, and I couldn't live without Perl.
Likewise, if you're a sysadmin (especially a Unix one), Perl is a necessary tool. There are tons of places where you need to be able to whip up a quick and dirty application that runs right along with the shell functions.
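For example, here is a made-up quick hack of the kind I mean (the log format is assumed, as is the idea that the client host is the first field): tally hits per host from a web log piped in on STDIN.
#!/usr/bin/perl
use strict;
use warnings;

# Quick-and-dirty one-off: count lines per leading field, then print a tally.
my %hits;
while (<STDIN>) {
    my ($host) = split ' ';        # first whitespace-separated field of $_
    $hits{$host}++ if defined $host;
}

printf "%6d  %s\n", $hits{$_}, $_
    for sort { $hits{$b} <=> $hits{$a} } keys %hits;

Run it as something like: perl tally.pl < access.log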
Languages have niches. Perl has a big stable niche, in many ways much more stable than fad-driven web languages. PHP, for example, is a nice little web language, but its saving grace is that it's quick and easy to develop in, not that it is a particularly great language. I'll tend to use PHP over Perl for web applications (though I use Python over PHP, if I have time), but 90% of the stuff I do in my day-to-day would be nearly impossible in PHP, and is flat trivial in Perl.
#Nate: I love Python. LOVE it. I actually worry that I love it too much, and I'm being irrational about it. PHP is a nice tool, but when your main selling point is "Quick and Easy" then you're running a risk. That was the big push behind original Visual Basic, and we all know how that worked out.
I'd discourage you from putting Perl on your resume - there's already too many people in the perl market and we don't want any more! ... just kidding.
The past is supposedly no guide to the future, but, despite having plenty of C (etc.) and Java in my 'skills toolbag' I've seen more gainful employ from my Perl than anything else over the last decade.
I suspect that offshore-perl-new-build may not be the biggest market in the future, but there's certainly active development in the city and media industries in the UK.
Otherwise, I'd just agree with the points above. Technicians with diverse skills are more able to pick the right tools, and less inclined to 'get religious' about language choice.
If you're looking at a post where the non-technical management have a strong point of view about what technology should and shouldn't be used - I'd place that one in the 'avoid' pile.
To add another separate answer - as you have noted - there is a very real danger when dealing with recruiters and others that your resume will be interpreted and things inferred that are not necessarily how you see yourself, and you might get pigeon-holed.
This WILL happen both ways - too much variation and you aren't an expert in anything OR too little variation and you are only good at one thing.
I don't have a simple answer for combatting that, except to ensure that you emphasize portable skills and also achievements which are independent of technology - making the company more money, landing new business, making new markets, etc.
Perl is another tool in your toolbox. If I have an opening and one person is narrowly focused on a specific technology, and another has a broad range of skills, I would be more inclined to hire the one with the wider range of skills even if they might not be quite as deeply knowledgeable. Someone who has a wide range of skills on a range of platforms is someone who can think, innovate and adapt.
I don't understand the point of this question. You have a job and you already know Perl. You can ask whether or not to learn new languages and which ones to learn (please don't, but you could), but none of us can or should predict whether or not you're going to get another job using Perl.
You ask, "Is having this job on my resume ultimately going to make me less employable, especially if the language no longer grows?"
Well, it's better than a blank resume, and you can't change your past, so really what are we talking about here?