Is LISP a compiled or interpreted language? - lisp

I know there is no such thing, strictly speaking, as a compiled or interpreted language.
But, generally speaking, is LISP used to write scripts, like Python, Bash scripts, and batch scripts?
Or is it a general-purpose programming language like C++, Java, and C#?
Can anyone explain this in simple terms?

Early versions of the Lisp programming language and Dartmouth BASIC would be examples of interpreted languages (the implementation parses the source code and performs its behavior directly). However, Common Lisp (the current standard) is a compiled language.
Note that most Lisp compilers are not just-in-time compilers. You as a programmer can invoke the compiler explicitly, for example in Common Lisp with the functions COMPILE and COMPILE-FILE; the Lisp code is compiled at that point.
Additionally most Lisp systems with both a compiler and an interpreter allow the execution of interpreted and compiled code to be freely mixed.
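A minimal sketch of invoking the compiler from a running Common Lisp image (whether a definition starts out interpreted depends on the implementation; SBCL, for instance, compiles almost everything by default):

(defun square (x) (* x x))        ; may initially be interpreted
(compile 'square)                 ; compile the existing definition
(compiled-function-p #'square)    ; => T

;; Whole files can be compiled and loaded as well:
;; (compile-file "foo.lisp") writes a compiled (fasl) file,
;; which (load ...) can then load.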

In its modern use, Lisp is a compiled, general-purpose language.
To clarify:
“LISP” is nowadays understood as “Common Lisp”
Common Lisp is an ANSI Standard
There are several implementations of Common Lisp, both free and commercial
Code is usually compiled, then loaded into an image. The order in which the individual parts/files of an entire system are compiled and loaded is usually defined through a system definition facility (which nowadays mostly means ASDF; see the sketch below).
Most implementations also provide a means for loading source code when started. Example:
sbcl --load 'foo.lisp'
This also makes it possible to use Lisp source files as “scripts”, even though they will very likely be compiled before execution.
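To make the system-definition step concrete, here is a minimal, hypothetical ASDF system definition; the system name "my-app" and its component files are made up for the example:

(asdf:defsystem "my-app"
  :components ((:file "package")
               (:file "main" :depends-on ("package"))))

;; (asdf:load-system "my-app") compiles and loads the component
;; files in dependency order.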

Traditionally, LISP can be interpreted or compiled -- with some of each running at the same time. Compilation, in some cases, targets a virtual machine, as with Java.
LISP is a general purpose programming language, but rarely used as such anymore. In the days of microcoded LISP machines, the entire operating system, including things like network, graphics, and printer drivers, was written in LISP itself. The very first IMAP mail client, for example, was written entirely in LISP.
The unusual syntax likely makes other programming languages, like Python, more attractive. But if you look carefully, you can find LISP-inspired elements in popular languages like Perl.

Related

Discovering the "Core" Entities and Macros of Common Lisp

While reading Peter Seibel's "Practical Common Lisp", I learned that aside from the core parts of the language like list processing and evaluating, there are macros like loop, do, etc that were written using those core constructs.
My question is two-fold. First is what exactly is the "core" of Lisp? What is the bare minimum from which other things can be re-created, if needed? The second part is where can one look at the code of the macros which come as part of Common Lisp, but were actually written in Lisp? As a side question, when one writes a Lisp implementation, in what language does he do it?
what exactly is the "core" of Lisp? What is the bare minimum from which other things can be re-created, if needed?
The minimal set of syntactic operators was called "special forms" in CLtL, which has 24 of them; this is well explained in the CLtL section "Special Forms". The term was renamed to "special operators" in ANSI CL, which has 25.
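One way to see how other constructs reduce to those operators is to macroexpand a standard macro; a sketch (the exact expansion is implementation-dependent, and the forms inside WHEN are placeholders):

(macroexpand-1 '(when test (do-this) (do-that)))
;; => (IF TEST (PROGN (DO-THIS) (DO-THAT)))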
where can one look at the code of the macros which come as part of Common Lisp, but were actually written in Lisp?
Many Common Lisp implementations are free software; you can look at their source code, for example in the SBCL or GNU CLISP repositories.
when one writes a Lisp implementation, in what language does he do it?
Usually a Lisp implementation consists of
a lower-level part, written in a system programming language. This part includes the implementation of the function call mechanism and of the runtime part of the 24 special forms. And
a higher-level part, for which Lisp itself is used because it would be too tedious to write everything in the system programming language. This usually includes the macros and the compiler.
The choice of the system programming language depends. For implementations that are built on top of the Java VM, for example, the natural choice is Java. For implementations that include their own memory management, it is often C, or some Lisp extension with semantics similar to C (i.e. where you have fixed-width integer types, explicit pointers, etc.).
First is what exactly is the "core" of Lisp? What is the bare minimum from which other things can be re-created, if needed?
Most Lisps have a core of primitive constructs, which is usually written in C (or maybe assembly). The usual reason for choosing those languages is performance. The bare minimum from which other things can be re-created depends on how bare-minimum you want to go; you don't need much to be Turing-complete. You really only need lambda for your language to have a bare minimum from which other things can be created. Typically, though, people also include defmacro, cond, defun, etc. Those things aren't strictly necessary, but are probably what you mean by "bare minimum" and what people usually include as primitive language constructs.
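To illustrate the "lambda is enough" point, here is a sketch of pairs built from closures alone; KONS, KAR, and KDR are made-up names, and DEFUN is used only to give the closures names (a thought experiment, not a practical technique):

(defun kons (a b) (lambda (selector) (funcall selector a b)))
(defun kar (pair) (funcall pair (lambda (a b) (declare (ignore b)) a)))
(defun kdr (pair) (funcall pair (lambda (a b) (declare (ignore a)) b)))

;; (kar (kons 1 2)) => 1
;; (kdr (kons 1 2)) => 2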
The second part is where can one look at the code of the macros which come as part of Common Lisp, but were actually written in Lisp?
Typically, you look in the Lisp sources of the language. Sometimes, though, your macro is not a genuine macro but a primitive language construct. For such things, you may also need to look in the C sources to see how these really primitive things are implemented.
Of course, if your Lisp implementation is not open-source, you need to disassemble its binary files and look at them piece-by-piece in order to understand how primitives are implemented.
As a side question, when one writes a Lisp implementation, in what language does he do it?
As I said above, C is a common choice, and assembly used to be more common. Though, there are Lisps written in high-level languages like Ruby, Python, Haskell, and even Lisp itself. The trade-off here is performance vs. readability and comprehensibility.
If you want a more-or-less canonical example of a Lisp to look at which is totally open-source, check out Emacs' source code. Of course, this isn't Common Lisp, although there is a cl package in the Emacs core which implements a fairly large subset of Common Lisp.

The concept of Self-Hosting

So I'm developing a small programming language, and am trying to grasp the concept of "Self-Hosting".
Wikipedia states:
The first self-hosting compiler (excluding assemblers) was written for Lisp by Hart and Levin at MIT in 1962. They wrote a Lisp compiler in Lisp, testing it inside an existing Lisp interpreter. Once they had improved the compiler to the point where it could compile its own source code, it was self-hosting.
From this, I understand that someone had a Lisp interpreter (let's say in Python).
The Python program then reads a Lisp program which in turn can also read Lisp programs.
Surely the term "Self-Hosting" can't mean the Python program can cease to be of use, because removing it would remove the ability to run the Lisp program which reads other Lisp programs!
So by this, how does a program become able to host itself directly on the OS? Maybe I'm just not understanding it correctly.
In this case, the term self-hosting applies to the Lisp compiler they wrote, not the interpreter.
The Python Lisp interpreter (as in your example) would take Lisp source as input, and execute it directly.
The Lisp compiler (written in Lisp) can take any Lisp source as input and generate a native machine binary[1] as output (which could then run without an interpreter).
With those two pieces, eliminating Python becomes feasible. The process would go as follows:
python.exe lispinterpret.py lispcompiler.lisp -i lispcompiler.lisp -o lispcompiler.exe
We ask Python to run the Lisp interpreter (lispinterpret.py) on lispcompiler.lisp, and we pass lispcompiler.lisp itself to it as input. The interpreted lispcompiler.lisp then produces lispcompiler.exe, a native machine binary (which doesn't depend on Python).
The next time you want to compile the compiler, the command is:
lispcompiler.exe -i lispcompiler.lisp -o lispcompiler2.exe
And you will have a new compiler without the use of Python.
[1] Or you could generate assembly code, which is passed to an assembler.

Communicate from Lisp to other runtimes

Short version:
Is there a way to allow other programs to call Lisp functions of a Lisp program?
Long version:
I'm contemplating a graph database project :) Not to be started immediately, I'm only probing the ground so far. I've tried a couple of graph databases, and my biggest gripe about them is that they are all written in Java (some are in C++, which isn't going to cut it either...). Java has no good way of communicating outwards; it may only be embedded inside another Java program. C++ is just hard to embed / I'm dubious that embedding was even planned.
So, I would obviously want to write it in CL, but I'm considering other options too. So, if you believe that CL simply won't do it, but you have reasons to believe that some other language will, then that's an interesting answer! My requirements for the "other language" would be that it must support parallel computing in some way. Obviously, high performance. And, as mentioned, extensibility.
I see multiple ways to call Lisp from other languages:
The simplest way, and one that should work with all implementations, would be to just maintain a bidirectional stream to the REPL. So you could send commands to the REPL and receive the REPL's response. One drawback would of course be that everything is converted to strings. (A minimal sketch of this idea appears at the end of this answer.)
You could mirror the way SLIME communicates with SWANK. In that case, you either use SWANK directly on the Lisp side and communicate through the same protocol SLIME uses, or write your own version of such a library.
Finally, there are Lisp implementations designed with embeddability in mind. I'm thinking particularly of Embeddable Common Lisp (ECL) here, which has a C API. Its manual explains, for example, how to call functions by getting hold of the function's symbol with ecl_make_symbol and then calling it with cl_funcall or cl_apply.
As alternatives to Common Lisp, other Lisp languages might be worthwhile to consider. Various Scheme implementations are designed to be embeddable; Racket, for example, documents a C API. It seems you prefer the native-code side of the runtime world over the JVM, but otherwise, Clojure is also interesting for being embeddable within Java.
For the host language there are few limits because most languages should support "pipes" (i.e. streams to other processes) or have a C FFI to call some Lisp's C API.
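To make the first option concrete, here is a minimal sketch of a pipe-driven evaluation loop in standard Common Lisp: another process writes forms to this process's standard input and reads the printed results from its standard output. There is no error handling and no framing protocol; a real setup would want both, which is roughly what SWANK provides.

(loop for form = (read *standard-input* nil :eof)
      until (eq form :eof)
      do (print (eval form))
         (finish-output))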

LISP 1.5: How is Lisp like a machine language?

I wish that John McCarthy was still alive, but...
From LISP 1.5 Programmer's Manual :
LISP can interpret and execute programs written in the form of S-expressions. Thus, like machine language, and unlike most other higher level languages, it can be used to generate programs for further execution.
I need more clarification about how machine language can be used to generate programs, and how Lisp can do it.
All that is saying is that machine code can directly write machine instructions to memory and jump to those instructions to execute them; this is the basis of many attack vectors to break into software, in fact.
The point is, when you're writing machine code, it's easy to generate machine code. But when you're writing in a compiled language like C, you can't just generate C code at run time and then execute it - unless your program includes a C compiler.
Lisp - and, these days, many other languages, especially "scripting languages" like Perl, Python, Ruby, Tcl, Javascript, and command shells - have the ability to execute code that is generated at runtime. In Lisp, since code and data have the same structure, this is usually less work than it is in the other languages, where the code to be evaluated is generally a string that has to be parsed. (Though Perl has the ability to eval a block instead of a string, which lets the compiler do the parsing ahead of time for literal code.)
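For example, in Common Lisp a piece of code built at run time as ordinary list data can be handed straight to the compiler; a minimal sketch:

(let ((source (list 'lambda (list 'x) (list '* 'x 'x))))  ; build (LAMBDA (X) (* X X)) as data
  (funcall (compile nil source) 7))                       ; compile it, then call it => 49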
A machine language program can alter itself while running. The last assembly programming I did was for MS-DOS: a resident program that I used to run before testing other programs. When my program misbehaved, a keystroke switched to the resident program, which could peek into the running program and alter it directly before resuming. It was quite handy, since I didn't have a debugger.
LISP had this from the very beginning, since it was originally interpreted. You could change the definition of a function while you were running, and the whole language was always available at runtime, even eval and define. When it started getting compiled, it wasn't compiled like Algol, but partially, allowing interpreted and compiled code to intermix at the same time. The fact that its code structure was list structure and that symbols are a data type contributed to this.
In the last interview I saw with McCarthy, he was asked what he thought of modern programming languages (not the LISP family, but the Algol-family language Ruby, which is said to be influenced by LISP). Before answering, he asked whether it could represent code as data (like list structure). Since it couldn't, Ruby in his opinion is still behind what LISP was in the 60s.
Many new programming languages are emerging in the Algol family and some of the most promising ones, like Perl6 and Nemerle, are getting closer to the features LISP had in the 60s.
Machine language programs can fill memory regions with arbitrary bytes. Then they can just jump to the start of such region which will thus get executed right away.
Lisp language programs can easily create arbitrary S-expressions in memory, using cons. Then they can just call eval on these S-expressions to evaluate (interpret) them.
High-level language programs can easily fill memory regions with characters representing new code in the language's syntax. But they cannot run such code.
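A minimal sketch of the Lisp case, using nothing beyond list construction and eval:

(let ((form (cons '+ (list 1 2))))   ; build the S-expression (+ 1 2)
  (eval form))                       ; => 3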

Lisp as a meta environment

I'm working towards my Ph.D. on better software reuse by integrating different types of computer languages. Due to performance and safety issues, I am not considering integrating them with foreign function calls and/or web services.
Lisp is my favorite vehicle, because of interactive development, macros, doing modifications at runtime, code as data (the usual things one would imagine when hearing the word Lisp), and others.
There are some approaches to port different types of Lisp to virtual machines like the JVM (clojure, kawa, SISC, ABCL, etc.) or .NET (clojure .NET, DotLisp, IronLisp). This is quite interesting, but one is restricted to the "universe" of the respective virtual machine.
Does anybody know of approaches the other way round, i.e. running Java or C# on a Lisp system? I have found what is left of cloak; it seems to be more or less a dead project. To me it would be much more sensible to have Lisp as a common abstraction, hosting other languages like Java and C#.
What obstacles do you see to overcoming this lack of a generic and extensible "language environment" integrating languages like Java or C# (without foreign function calls or (web) services)? Is it because no Lisp system runs on a kind of virtual machine, like LLVM for instance, or is it something else?
Best regards, Ingmar
Lisp is a good platform for this kind of language hosting because of its macro capabilities. However, you want many more language features to do it well: modules, reader macros, high-level macro specification, and so on. Racket is one Lisp variant that's going forward in this direction. You can already use Algol 60, a variant of Prolog, a typed sister language, and so on. Guile is also moving in this direction with an ECMAScript implementation.
As far as implementing Java or C# on Lisp, it is possible in theory but it would require a massive amount of work to support these languages at a practical level (Racket used to implement a small portion of Java). It's also not clear that you would really gain anything considering that the CLR and JVM are both multi-language platforms now. What is more interesting is harnessing Lisp macros to define better custom languages (DSLs), defining useful dialects of your Lisp, or implementing another language specifically to bootstrap a useful tool (e.g., Guile implementing Emacs Lisp).
Well, "it depends", as always, right?
How much of Lisp do you want to expose to Java, if any? For example, if you port the JVM to Lisp, do you somehow mate the JVM's need for a garbage collector to the actual underlying GC of the Lisp implementation, or do you simply write your own that GCs the JVM objects within the JVM heap?
It may very well be impractical to mate the two, for several reasons. The Lisp GC is pretty much hidden from the actual implementation, much like Java's GC. That may be too hidden to work with a JVM implementation.
There's no reason you can't build a JVM in Lisp, it's just a bunch of byte codes. Lisp handles bytes just fine.
There have been implementations of the JVM in JavaScript; it's not much different from a Lisp at its core.
But beyond having a lispy command line to interact with the JVM, the JVM itself wouldn't be very "lispy". How could it be? It's a Java VM. The IMPLEMENTATION can be "lispy", but, ideally, none of that lisp-ness would bubble out to the JVM itself.
Beyond any advantages Lisp has in developing ANY program, I don't think Lisp lends itself specifically to being "better" at developing a virtual machine.
Lisp is great at developing other languages, particularly other S-expression-based languages. But a VM is a VM: a monster case statement, or some other mechanism that dispatches on numeric values.
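For illustration, a sketch of that kind of dispatch loop for a made-up toy stack machine; the opcodes :push, :add, and :halt are invented for this example:

(defun run (code)
  (let ((stack '())
        (pc 0))
    (loop while (< pc (length code))
          do (let ((op (aref code pc)))
               (incf pc)
               (case op
                 (:push (push (aref code pc) stack) (incf pc))  ; next element is the operand
                 (:add  (push (+ (pop stack) (pop stack)) stack))
                 (:halt (return (pop stack))))))))

;; (run #(:push 1 :push 2 :add :halt)) => 3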
Lisp is a perfect host language for such a meta-platform, but it is not necessarily an ideal target language for compiling something low-level and imperative. Fortunately, nothing stops you from generating, say, assembly code within your Lisp environment.