How can I register a collections directory without clobbering Racket's defaults?

So, basically, I'm trying to register a local collects/ collections directory containing modules that I don't want to be global (i.e. not in /usr/share/racket/collects) for the interpreter to use.
However, in doing this, I think I've run into a problem: I can set the path using something like scheme_set_collects_path(scheme_make_path("/hardcoded/absolute/path/to/collects")); but then it appears to clobber the system-wide Racket collections (the error is encountered when it tries to interpret the module header #lang racket, I think).
I can't give an exact rundown, because I've since scrapped major contextual elements of this, so all I can really ask is: is it even possible to embed Racket such that you can interpret local source files?
I've looked at the header files and seen scheme_init_collection_paths_post, which lets you specify pre and post paths for collections. However, I don't know whether these paths completely clobber the defaults, or how to pass an empty list of pre paths.
Currently my idea is to register a local collections path in the same folder as my executable and dynamically require functions from local collections. So I must register a path without messing up the defaults (since the short form #lang racket requires the "racket" collection).
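Something like the following is roughly what I have in mind, though it's only a sketch: I'm assuming scheme_null is how you pass an empty list, that env is the Scheme_Env from the usual embedding boilerplate (scheme_basic_env), and that the extra paths get added alongside the defaults rather than replacing them (the path itself is made up).

#include "scheme.h"

/* Sketch: keep the default collection paths and search my local collects/
 * directory in front of them. */
static void add_local_collects(Scheme_Env *env)
{
  Scheme_Object *pre[1];
  pre[0] = scheme_make_path("/path/next/to/my/executable/collects"); /* made-up path */
  scheme_init_collection_paths_post(env,
                                    scheme_build_list(1, pre), /* searched before the defaults */
                                    scheme_null);              /* nothing extra after them */
}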
I'm trying to achieve this:
((dynamic-require ''mypackage/mymodule 'myfunction))
where 'myfunction is provided by 'mymodule.
I've not quite wrapped my head around how you actually use the scheme_* functions, so I've done this:
Scheme_Object* a[2];
a[0] = scheme_intern_symbol("mypackage/mymodule");
a[1] = scheme_intern_symbol("myfunction");
Scheme_Object* func = scheme_dynamic_require(2, a);
scheme_apply(func, 0, NULL);
I don't believe that's actually how you'd go about dynamically requiring functions properly, but I vaguely copied it from the dynamic-require Racket REPL example.
The error I'm getting now is:
standard-module-name-resolver: collection not found
for module path: mypackage/mymodule
collection: "mypackage"
in collection directories:
context...:
show-collection-err
standard-module-name-resolver
SIGSEGV MAPERR si_code 1 fault on addr 0xd8
[1] 7102 abort (core dumped) ./main
Ignore the fact that I've called the collection "mypackage". This is the most recent error. Previously, it could resolve the local collections (I've basically been doing raco pkg install --link mypackage and then copying that into a local collects/ folder, since I know installing packages compiles them), but it wouldn't be able to load any "racket" modules (such as the one required by #lang racket).
So, how do you register an extra, local collections directory for Racket's interpreter?

Related

Common Lisp - name clash I thought the package system was supposed to protect me against

Over the weekend, I had a name clash that was very hard to track down, but I managed to boil it down to a short example. The thing is, I thought the package system was supposed to protect me from this, so I'm wondering how it can in the future.
If I do this:
(ql:quickload "cl-irc")
(defpackage #:clash-demo
  (:use #:cl
        #:cl-irc))

(in-package #:clash-demo)

;; This is the name that clashes - I get a warning about this if I compile
;; this interactively (i.e. from slime) but not if I quickload the whole project.
(defun server-name (server)
  (format nil "server-name ~a" server))

;; This needs an IRC server to work - if you have docker
;; then this will do the trick:
;;
;; docker run -d --rm --name ircd -p 6667:6667 inspircd/inspircd-docker
(defparameter *connection*
  (cl-irc:connect :nickname "clash-demo"
                  :server "localhost"
                  :port 6667
                  :connection-security :none
                  :username "username"))
After the above, I get the following warning when defining server-name:
WARNING: redefining CL-IRC:SERVER-NAME in DEFUN
And the following error if I try and print *connection* (in my more full-fledged project I got a missing slot in a class that I'd defined - I think the root cause of both the problem I found and the minimal example above is the same though):
Control stack guard page temporarily disabled: proceed with caution
While I get the warning if I define things interactively, in practice I had moved a bunch of code into a quickproject:make-project'd project and ql:quickload-ed it, which I think silenced the warning since it always loaded cleanly; hence why it took me so long to track down the name clash.
My questions are:
Isn't the package system supposed to protect me from this? I think I can sort of see why the above happens - the reader's already seen the symbol server-name so it thinks I'm referring to the already defined cl-irc:server-name, and re-uses that - but surely the package system should somehow allow me to work around this?
I'm assuming the warning when I quickload-ed the project was silenced because quickload assumes I don't want to see warnings from projects. Is there a way I can make this more forceful when I load projects I'm working on, so that it raises an error, or at least warns me about these name clashes? For all I know there are a bunch more that just haven't caused me a problem yet.
I was expecting either (i) the names not to clash (i.e. my file would define the symbol clash-demo:server-name, not re-use cl-irc:server-name and cause it to be redefined) or (ii) this to be an error, or at least a warning when I quickload the project.
Thanks very much in advance for any advice!
Suppose you really like packages A and B and want to :use both of them in your package MINE. They have a hundred external symbols each and you don't want to have to type any package prefixes. However, they both export a symbol with the same name, a:frob and b:frob. If you simply :use them, you will get a symbol conflict.
To resolve the conflict, there are three options to decide what to do when you're in package MINE and you refer to the unqualified symbol frob:
Prefer A so it refers to the symbol a:frob: (defpackage mine (:use a b) (:shadowing-import-from a frob))
Prefer B so it refers to the symbol b:frob: (defpackage mine (:use a b) (:shadowing-import-from b frob))
Prefer neither so it refers to mine::frob: (defpackage mine (:use a b) (:shadow frob)) - then, to use one from A or B you must write a:frob or b:frob explicitly
Any of these three cases may be preferable depending on your situation. Common Lisp will not automatically choose one for you. I think this is a reasonable design choice.
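Applied to the clash-demo example from the question, the third option would look something like this (a sketch reusing the names from the question):

;; SERVER-NAME is shadowed, so an unqualified SERVER-NAME inside CLASH-DEMO
;; names the local CLASH-DEMO::SERVER-NAME, and CL-IRC:SERVER-NAME is left
;; untouched; refer to it explicitly as cl-irc:server-name when needed.
(defpackage #:clash-demo
  (:use #:cl #:cl-irc)
  (:shadow #:server-name))

(in-package #:clash-demo)

(defun server-name (server)
  (format nil "server-name ~a" server))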
Briefly, no: the package system is not meant to protect you against this. If you use a package CL-IRC then you're saying that you want, for instance, to get the symbol CL-IRC:SERVER-NAME when you type server-name. What you are not saying is whether you should be allowed to modify the values associated with that symbol in any way, which is an orthogonal question. The package system is just about names, not values.
In the case of functions, then it's very often (but not always! consider loading patches) a mistake to define a function with a given name in multiple contexts. In the case of variables that's slightly less clear: it's probably a mistake if there are multiple def* forms for a given variable in different contexts, but simply assigning to the variable might well be fine.
So what is needed is a way for defining macros (defun etc) to be able to detect this and complain about it.
CL does not provide such a mechanism. Many implementations do however, either by detecting 'different contexts' (which I have been vague about) or by providing a way of saying that certain packages are sacred and redefinitions should not be allowed, or both.
In this case, the implementation has warned you about the redefinition, but Quicklisp may have suppressed that. I am not sure how to stop Quicklisp from suppressing warnings like this.
In summary the answer is that the problem of controlling and limiting redefinition is orthogonal to what the package system does, and unfortunately CL does not provide a standard solution to this second problem.
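As a rough starting point (not the shim mentioned below, and with no guarantee that Quicklisp or ASDF won't still muffle some warnings internally), you can promote warnings to errors around the load:

;; Signal an ERROR for any WARNING raised while the system is loaded.
;; Whether this catches redefinition warnings depends on how the
;; implementation and ASDF report them.
(handler-bind ((warning (lambda (c)
                          (error "Warning treated as error: ~A" c))))
  (ql:quickload "clash-demo"))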
If you are interested I have a little shim which uses the condition system to make very sure warnings are treated as errors in contexts like this. I could append it to this answer, but not until after Christmas probably.

CoqIDE is not properly connecting files in the project and is not compiling

I'm new to using Coq/CoqIDE and I'm not computer savvy either, so I don't know what's wrong or what to call the issue. I was trying to go through the Software Foundations book, but CoqIDE isn't working right. I'm using the latest Windows 10 and CoqIDE 8.10.2.
The first issue is that when I go to Compile -> Compile buffer in Basics.v, CoqIDE doesn't create a .vo or a .glob file. None of the other buttons worked either. Running CoqIDE as admin didn't help, but I figured out I can get around this by manually dragging Basics.v onto the coqc executable.
I had no issues with Coq during the first lesson, but in the next lesson we're meant to import the definitions from Basics.v into Induction.v. When I run what they say:
From LF Require Export Basics.
I get the error The file C:\Users\...\Coq Files\Tutorial\lf\Basics.vo contains library Basics and not library LF.Basics even though the _CoqProject file contains "-Q . LF" as it should.
I can get around this error too by just writing "Require Export Basics."
which properly loads, up until I actually try calling a definition from Basics
Running
Require Export Basics.
Example example: evenb 2 = true.
works until I get to evenb, and then gives the error
The reference evenb was not found in the current environment. even though it's in Basics.v
If I get even more anal and try
Add LoadPath "C:\Users\...\Coq Files\Tutorial\lf".
From LF Require Export Basics.
I get the error
Cannot find a physical path bound to logical path matching suffix <> and prefix LF.
And then finally if I try
Add LoadPath "C:\Users\...\Coq Files\Tutorial\lf".
Require Export Basics.
Example example: evenb 2 = true.
it loads properly.
So I'm wondering: how should I fix the load path so that the project works without putting that junk in every file, and how do I make the Compile tab work?
Some people were talking about "hitting make in the top-level", but I have no idea what that means. I tried it anyway and ran the Makefile as a .bat, even though I had already downloaded it properly so there shouldn't have been any need, but the Makefile didn't change anything anyway.
I don't think I'm forgetting anything, thanks in advance.
Instead of choosing Compile > Compile buffer, try Compile > Make instead (when browsing any one of the vernacular files in Logical Foundations) - I think that is what others meant by "hitting make in the top-level". But first, you may want to remove the workarounds you added in e.g. Induction.v, and save a trivial change in Basics.v such as adding/removing a newline somewhere in order to convince make to recompile it.

Finding out the list of modules required by a module in Racket

I want to keep a list of the modules required by a particular module (let's say the current module).
I feel like there are other options that could be tried (such as parsing the module?), but I started playing with the idea of shadowing require and adding the required items to a hash table keyed by the module name. The problem is I cannot figure out how to write a syntax definition for it.
Although not working, a function definition equivalent would be like below:
(define require-list (make-hash))

(define require
  (lambda vals
    ; add vals to hash-table with key (current-namespace)
    (let ([cn (current-namespace)])
      (hash-set! require-list cn
                 (append vals (hash-ref require-list cn))))
    (require vals)))
… it seems the last call should be modified to call the original require as well?
A correct version, a pointer to how to do it, or any other way of achieving the original goal would be highly appreciated.
If you just want to get a list of imports for a particular module, there is a convenient built-in called module->imports that will do what you want. It will return a mapping between phase levels and module imports—phase levels higher than 0 indicate imports used at compile-time for use in macro expansion.
> (require racket/async-channel)
> (module->imports 'racket/async-channel)
'((0
   #<module-path-index:(racket/base)>
   #<module-path-index:(racket/contract/base)>
   #<module-path-index:(racket/contract/combinator)>
   #<module-path-index:(racket/generic)>))
Note that the module in question must be included into the current namespace in order for module->imports to work, which require or dynamic-require will both do.
This will inspect the information known by the compiler, so it will find all static imports for a particular module. However, the caveats about dynamic requires mentioned by John Clements still apply: those can be dynamically performed at runtime and therefore will not be detected by module->imports.
Short short version:
Have you tried turning on the module browser?
Short version:
You're going to need a macro for this, and
It's not going to be a complete solution
The existing require is not a function; it's a language form, implemented as a macro. This is because the compiler needs to collect the same information you do, and therefore the required modules must be known at compile time.
The right way to do this--as you suggest--is definitely to leverage the existing parsing. If you expand the module and then walk the resulting tree, you should be able to find
everything you need. The tree will be extremely large, but will contain (many instances of) a relatively small number of primitives, so writing this traversal shouldn't be too hard. There will however be a lot of fiddling involved in setting up namespace anchors etc. in order to get the expansion to happen in the first place.
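A very rough sketch of that expand-and-walk idea, with the namespace fiddling reduced to a bare make-base-namespace and only the raw #%require specs reported, might look like this:

#lang racket
;; Sketch: read a module from a file, expand it, and collect the specs of
;; every #%require form in the fully expanded tree. Only static requires
;; are visible this way, and the traversal is deliberately naive.
(define (module-requires file)
  (define stx
    (parameterize ([read-accept-reader #t]) ; allow a #lang line
      (with-input-from-file file
        (lambda () (read-syntax file)))))
  (parameterize ([current-namespace (make-base-namespace)])
    (let walk ([form (syntax->datum (expand stx))])
      (match form
        [(list '#%require specs ...) specs]
        [(list items ...) (append-map walk items)]
        [_ '()]))))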
Regarding your original idea: you can definitely create a macro that shadows require. You're going to want to define it in another file and rename it on the way out so that your macro can refer to the original require. Also, the require form has a bunch of interesting subforms, and coming up with a macro that tries to handle all of these subforms will be tricky. If you're looking at writing a macro, though, you're already thinking about an 80% solution, so maybe this won't bother you.
Finally: there are forms that perform dynamic module evaluation, and so you can't ever know for sure all of the modules that might be required, though you could potentially annotate these forms (or maybe shadow the dynamic module-loading function) to see these as they happen.
Also, it's worth mentioning that you may get more precise answers on the Racket mailing list.

Structuring large Lisp applications

I am currently trying to wrap my head around packages, systems & co.
I have now read Packages, systems, modules, libraries - WTF? a few times, and I think I'm still having difficulties getting it right.
If I simply want to split a Lisp source file into two files, where one shall "use" the other - how do I do that? Do I need to build a system for this? Should I use a module? …? I'm coming from a Node.js background, and there you can simply say
var foo = require('./foo');
to get a reference to what's exported in file foo.js. What is the closest equivalent to this in Lisp?
I understand that ASDF is for systems, and that it is bundled as part of Quicklisp, at least according to its documentation:
ASDF comes bundled with all recent releases of active Common Lisp implementations as well as with quicklisp […]
Okay, Quicklisp is for libraries, but what is their relationship? Is Quicklisp something like a "package manager" in other languages? And if so, what exactly does ASDF provide?
Sorry for these many questions, but I think it just shows the trouble I have to understand how to structure Lisp applications. Any help would be greatly appreciated :-)
System
For structuring a large system, use a system-management tool. A 'free' one is ASDF.
You need a system declaration, which lists the parts of your library or application. Typically it goes into its own file. Then you load or compile the system. There should be tutorials on how to do that.
A simple Lisp system might have the following files:
a system file describing the system, its parts and whatever other stuff is needed (other systems)
a package file which describes the namespaces used
a basic tools file (for example, functions used by the macros)
a macro file which lists the macros (used so that they get compiled/loaded before the rest of the software)
one or more other files with functionality.
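For example, a minimal system declaration covering that layout might look like this (system and file names are made up):

;; my-app.asd -- sketch of a minimal ASDF system declaration
(asdf:defsystem "my-app"
  :serial t                        ; compile/load the components in order
  :components ((:file "package")   ; the package file (namespaces)
               (:file "tools")     ; basic tools used by the macros
               (:file "macros")    ; macros, before the rest of the software
               (:file "main")))    ; the actual functionality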
Quicklisp is independent of that. It's a software distribution tool.
Quick hack to compile and load files
But you can also compile and load files the old-fashioned way, without a system tool:
(defparameter *files*
  '("/yourdir/foo.lisp" "/yourdir/bar.lisp"))

(defun compile-foobar ()
  (mapc #'compile-file *files*))

(defun load-foobar ()
  (mapc #'load *files*))

(defun compile-and-load ()
  (mapc (lambda (file)
          (load (compile-file file)))
        *files*))
In reality there might be more to it, but often this is enough. It should be easy to write your own build tool. A typical system tool will provide many more features for building more complex software in a structured way. Many of the ideas for these tools go back at least 35 years. See for example the Lisp Machine manual (here, the 1984 edition), chapter "Maintaining Large Systems".
The role of files
Note that in plain Common Lisp the role of files and their semantics are not very complex.
A file is not a namespace, it is not associated with a class/subclass or an object, and it is not a module. You mix Lisp constructs in a file as you like. Files can be arbitrarily large (for example, one complex library has a version delivered as a single source file of 30,000 lines). The only real place in the standard semantics where a file plays a role is when compiling a file: what side effects does compiling a file have, and what optimizations can the compiler do?
Other than that, it is assumed that the development environment provides services like loading and compiling groups of files (aka systems), providing overviews of compilation errors, recording source locations of definitions, locating definitions, and more. A tool like ASDF handles the system part.
There is a require function in Common Lisp, but it is deprecated. If you simply want to split your code into one or more pieces to use interactively in the REPL, you can put the code in different files and then load each of them. If instead you want to write a full Lisp project, I have found the quickproject package very useful; it provides a simple starting point for the creation of new projects.
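For the two-file case from the question, the load-based approach might look like this sketch (file names are made up):

;; foo.lisp -- defines something the other file wants to use
(defun greet (name)
  (format t "Hello, ~a!~%" name))

;; main.lisp -- loads foo.lisp (relative to main.lisp, assuming main.lisp is
;; itself being LOADed) and then uses what it defines
(load (merge-pathnames "foo.lisp" *load-truename*))
(greet "world")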

Is it better to put the defpackage in a separate file when creating packages

The example below is given in Paul Graham's ANSI Common Lisp as an example of doing encapsulation:
(defpackage "CTR"
(:use "COMMON-LISP")
(:export "COUNTER" "INCREMENT" "CLEAR"))
(in-package ctr)
;function definitions here
However, in Peter Seibel's Practical Common Lisp (link here), he says:
Because packages are used by the reader, a package must be defined
before you can LOAD or COMPILE-FILE a file that contains an IN-PACKAGE
expression switching to that package. Packages also must be defined
before other DEFPACKAGE forms can refer to them...
The best first step toward making sure packages exist when they need
to is to put all your DEFPACKAGEs in files separate from the code that
needs to be read in those packages
So he recommends creating two files for every package, one for the defpackage and one for the code. The files containing defpackages should start with (in-package "COMMON-LISP-USER").
To me it seems like putting the defpackage in the same file, before the in-package and code, is a good way to ensure that the package is defined before used. So the first method, collecting everything into one file seems easier. Are there any problems with using this method for package creation?
I think that using a separate file for defpackage is a good habit
because:
You don't « pollute » your files with defpackage.
It makes it easier to find the exported/shadowed/... symbols, you
know you just have to look at package.lisp.
You don't have to worry about the order when you use ASDF.
(defsystem :your-system
  :components ((:file "package")
               ... the rest ...))
Peter Seibel says so ;)
EDIT:
I forgot to mention quickproject which facilitates the creation of
new CL projects.
REPL> (quickproject:make-project "~/src/lisp/my-wonderful-project/"
                                 :depends-on '(drakma cl-ppcre local-time))
This command will create a directory "~/src/lisp/my-wonderful-project/"
and the following files:
package.lisp
my-wonderful-project.asd (filled)
my-wonderful-project.lisp
README.txt
And thus, I think it's good to use the same convention.
I tend to use multiple source code files, a single "packages.lisp" file, and a single "project.asd" system definition file for most of my projects. If the project requires multiple packages, they're all defined in "packages.lisp", with the relevant exports in place.
There is one reason for putting DEFPACKAGE in its own file: if you have a large package, you might have several groups of related functions and want a separate source file per function group. Then all the source files would have their own IN-PACKAGE at the top, but they would all "share" the external DEFPACKAGE. As long as the DEFPACKAGE is loaded first, the order in which you load the other source files doesn't matter.
An example I'm currently working on has multiple classes in the package, and the source files are broken up to be per class, each having a class definition and the related generic function and method definitions.
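A sketch of that layout, with made-up names:

;; packages.lisp -- the single shared DEFPACKAGE, loaded first
(defpackage #:shapes
  (:use #:cl)
  (:export #:circle #:square #:area))

;; circle.lisp -- one class per file; each file only needs IN-PACKAGE
(in-package #:shapes)
(defclass circle ()
  ((radius :initarg :radius :reader radius)))
(defmethod area ((c circle))       ; implicitly creates the AREA generic function
  (* pi (radius c) (radius c)))

;; square.lisp -- another per-class file sharing the same package
(in-package #:shapes)
(defclass square ()
  ((side :initarg :side :reader side)))
(defmethod area ((s square))
  (* (side s) (side s)))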