Using `qlot` from a Lisp REPL

I'm interested in using the qlot library from inside of a Lisp image to manage multiple local instances of quicklisp.
There doesn't seem to be any documentation on how to use it, except through a non-Lisp CLI interface, and the obvious
(qlot:with-local-quicklisp (#P"/a/path/here/") (qlot:install :skippy))
or
(qlot:with-local-quicklisp (#P"/a/path/here/") (qlot:quickload :skippy))
give me
Component "skippy" not found
[Condition of type ASDF/FIND-SYSTEM:MISSING-COMPONENT]
What I'm looking for is a way to install a particular library by name. Basically, exactly how one would use ql:quickload, but targeting a specific, local directory instead of ~/quicklisp. What am I doing wrong?

It looks like the intent is for with-local-quicklisp to rebind the dynamically scoped variables so that ql:quickload can be used directly.
So
(qlot:with-local-quicklisp (#P"/a/path/to/some/quicklisp/")
(qlot/util:with-package-functions :ql (quickload)
(quickload :skippy)))
will result in skippy being installed in the quicklisp instance at #P"/a/path/to/some/quicklisp/" instead of the default location.
This leaves me a bit perplexed as to what qlot:quickload is for; its describe output doesn't shed additional light.

Possibility of a multilanguage 'source' name with Twincat Eventlogger

Roald has written an excellent guide for the Twincat Eventlogger.
https://roald87.github.io/twincat/2020/11/03/twincat-eventlogger-plc-part.html
https://roald87.github.io/twincat/2021/01/20/twincat-eventlogger-hmi-part.html
For us this is exactly what we want; there is however one thing I haven't figured out: how to get the source name of the alarm in multiple languages in the HMI. params::sourceName gives the path in the software (example: MAIN.fbConveyor1.Cylinder1). This path can be customized when initializing the alarm (as Roald has shown), but that doesn't work in my case, since I would like to define a generic alarm (example: "Cylinder not retracted within maximum time") that is instantiated multiple times.
I was thinking of using the source as a way to show the operator where the alarm occurs. We already use this kind of path for saving machine settings, among other things. The machines we build are installed all over the world, so multilanguage support is a must.
Beckhoff does support multilanguage alarm names (when they are defined), but the source is not predefined; it is generated dynamically.
Anyone have an idea how this problem can be solved?
If I understand your question correctly, being able to parameterize the event text with information about the source of the problem should help you out.
If you define the event text as 'Cylinder {0} has not retracted in time.', you can add the arguments for that text at runtime.
IF bRaiseAlarm THEN
    bRaiseAlarm := FALSE;
    fbAlarm.ipArguments.Clear().AddString('Alice');
    fbAlarm.Raise(0);
END_IF
However, since this is also covered in the articles you mentioned, I am unsure whether it solves your problem.
'Alice' in this example can be hard to localize. The following options come to mind.
The string can be based on an ENUM. Enums can have textlist support, so if you add your translations there, that should allow multilingual output. However... this does require a lot of setup, placing translations inside your code, and making sure the PLC application is aware of the language that the parameter should use.
Use tags to mark the source device, as tags can be language invariant. It is not the most user-friendly method, but it could work for you. It would become something like: "Cylinder 'AA.1123' did not retract in time." 'AA.1123' as a tag would have to be stored inside your PLC code as a string (sketched below). You will have to trust that your operator can relate the tag back to the actual source.
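To sketch the tag option with the same building blocks as the snippet above (sTag is a hypothetical STRING variable holding the device tag, e.g. 'AA.1123', and fbAlarm is the alarm instance from before):
(* sketch: sTag := 'AA.1123' is kept per instance, e.g. alongside the machine settings *)
IF bRaiseAlarm THEN
    bRaiseAlarm := FALSE;
    fbAlarm.ipArguments.Clear().AddString(sTag);
    fbAlarm.Raise(0);
END_IF
Since the tag is just a string stored in the PLC, it stays the same in every HMI language while the surrounding event text is translated.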
Hopefully this helps; otherwise, please help me understand the problem better.

How does Server.call work in elixir-mongo?

I'm learning Elixir and attempting to use the elixir-mongo library. During the auth/1 command, the function uses Server.call, piping in the MongoDB request string. Looking at the Mongo.Server class, it does not appear to be an actual GenServer, nor does it have a method matching call/1. How is this working?
With high probability it doesn't work. The Mongo.Server module doesn't export a call function, and there are no macros that generate it magically. My guess is that the master branch is currently broken. If you are using the library and want to dig into the sources, make sure you are looking at the same tag as the version you are using in your project.
Also, there are no classes and methods in Elixir. There are modules and functions :)
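For comparison, a module that really did back Server.call/1 would normally be a GenServer and export its own thin wrapper around GenServer.call/2. A minimal sketch (hypothetical module and names, not elixir-mongo's actual code):
defmodule Example.Server do
  use GenServer

  # Client API: a wrapper like this is what would make Example.Server.call(request) resolve.
  def start_link(opts \\ []), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)
  def call(request), do: GenServer.call(__MODULE__, {:request, request})

  # Server callbacks
  @impl true
  def init(opts), do: {:ok, opts}

  @impl true
  def handle_call({:request, request}, _from, state), do: {:reply, {:ok, request}, state}
end
If no such wrapper (and no use GenServer) exists in the module you are reading, the call simply cannot succeed on that revision of the code.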

Documentation of OCaml code in Eclipse

I'm using Eclipse with the OcaIDE plugin to write my OCaml project.
I have written several OCaml functions that I want to document (comment, return values, and parameters).
I've created my documentation in the .ml files as described in this link: http://caml.inria.fr/pub/docs/manual-ocaml/ocamldoc.html
Here is an example of one function:
(** sorting tuples where first element is key *)
let my_comp x y = (*Some code*)
Unfortunately, my comments don't show up when I press F2 on one of the functions; it only shows the name and the file it is contained in.
When writing comments in an .mli file, it works as expected, but I also want to document "private" functions that are not accessible from the outside. Can I define functions in the .mli that are NOT accessible from the outside, just for the documentation?
How can I make Eclipse show my documentation?
Well, as you said, you would like to show the documentation but not export the function out of the module. That, sadly, won't work.
I guess OcaIDE can be considered incomplete, but it doesn't look like something people care much about (I don't know a single person working on OcaIDE). If you like having autocompletion etc., maybe try programming in Emacs and install Merlin (look, I found the perfect post for you: here).
As for the suggestion of defining a function in the .mli that is not accessible from the outside: that is completely opposed to what .mli files are for, so don't expect it to be possible. ;-)
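For reference, the usual split looks something like this (a sketch; my_comp is the function from the question, helper is a hypothetical internal function that is simply left out of the .mli and therefore stays private):
(* my_module.mli -- only what is listed here is visible outside *)
(** sorting tuples where first element is key *)
val my_comp : 'a * 'b -> 'a * 'c -> int

(* my_module.ml *)
let helper x = x                              (* not in the .mli, so private *)
let my_comp (kx, _) (ky, _) = compare (helper kx) (helper ky)
Anything you want documented in the F2 hover has to appear in the .mli, which by design also makes it part of the module's public interface.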
I hope I've been able to help you.

How do I find all the modules used by a Perl script?

I'm getting ready to try to deploy some code to multiple machines. As far as I know, using a Makefile.PL to track dependencies is the best way to ensure they are installed everywhere. The problem I have is that I'm not sure our Makefile.PL has been kept up to date as this application has passed through a few different developers.
Is there any way to automatically parse through either my source or a few full runs of my program to determine exactly what versions of what modules my application is depending on? On top of that, is there any way to filter it based on CPAN packages? (So that I only depend on Moose instead of every single module that comes with Moose.)
A third related question is, if you depend on a version of a module that is not the latest, what is the best way to have someone else install it? Should I start including entire localized Perl installations with my application?
Just to be clear: you cannot generically get a list of the modules an app depends on by code analysis alone. E.g. if your app does eval { require $module; $module->import() }, where $module is passed via the command line, then this can ONLY be detected by actually running that specific command line with ALL the possible module values.
If you do wish to do this, you can figure out every module used by a combination of runs via:
Devel::Cover. Coverage reports would list 100% of the modules used, but you don't get version numbers.
Print %INC at every possible exit point in the code, as slu's answer said. This should probably be done in an END {} block as well as in a __DIE__ handler to cover all exit points, and even then it may not be fully covering in the generic case if your __DIE__ handler gets overwritten somewhere within the program (see the sketch after this list).
Devel::Modlist (also mentioned by slu's answer) - the downside compared to Devel::Cover is that it does NOT seem to be able to aggregate a database across multiple sample runs like Devel::Cover does. On the plus side, it's purpose-built, so has a lot of very useful options (CPAN paths, versions).
Please note that the other module (Module::ScanDeps) does NOT seem to allow runtime analysis based on arbitrary command-line arguments (at first glance it seems to only allow executing the program with no arguments), and if that's true, it is inferior to the three methods above for any code that may load modules dynamically.
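A sketch of option 2 above: dump module names and versions from %INC at program exit. It assumes loaded modules follow the usual $VERSION convention; install a __DIE__ handler that calls the same sub if you need to cover abnormal exits as well.
sub dump_loaded_modules {
    for my $file (sort keys %INC) {
        (my $mod = $file) =~ s{\.pm\z}{};   # Foo/Bar.pm -> Foo/Bar
        $mod =~ s{/}{::}g;                  # Foo/Bar    -> Foo::Bar
        my $ver = eval { $mod->VERSION } || 'unknown';
        print STDERR "$mod $ver\n";
    }
}
END { dump_loaded_modules() }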
Module::ScanDeps - Recursively scan Perl code for dependencies
It does both static and runtime scanning, but only reports modules; I don't know of an exact way of verifying which versions come from which distributions. You could get old packages from BackPAN, or just package your entire chain of local dependencies up with PAR.
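For what it's worth, a minimal Module::ScanDeps run looks roughly like this (a sketch; yourscript.pl is a placeholder, and execute => 1 asks it to also run the script so that modules loaded at run time are picked up):
use Module::ScanDeps;

my $deps = scan_deps(
    files   => ['yourscript.pl'],
    recurse => 1,
    execute => 1,
);
print "$_\n" for sort keys %$deps;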
You could look at %INC, see http://www.perlmonks.org/?node_id=681911 which also mentions Devel::Modlist
I would definitely use Devel::TraceUse, which also shows a tree of the modules, so it's easy to guess where they are being loaded.
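Devel::TraceUse is normally run through the debugger hook, with yourscript.pl again being a placeholder:
perl -d:TraceUse yourscript.pl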

Parsing Unix/iPhone/Mac OS X version of PE headers

This is a little convoluted, but let's try:
I'm integrating LUA scripting into my game engine, and I've done this in the past on win32 in an elegant way. On win32 all I did was to mark all of the functions I wanted to expose to LUA as export functions. Then, to integrate them into LUA, I'd parse the PE header of the executable, unmangle the names, parse the parameters and such, then register them with my LUA runtime. This allowed me to avoid manually registering every function individually just to expose them to LUA.
Now, flash forward to today where I'm working on the iPhone. I've looked through some Unix stuff and I've gotten very close to taking a similar approach, however I'm not sure it will actually work.
I'm not entirely familiar with Unix, but here is what I have so far on iPhone:
Step 1: Query for the executable path through objective-C and get the path of my app
Step 2: Use dlopen to get a handle to my app using: `dlopen(path, RTLD_NOW)`
Step 3: Use `dlsym( libraryHandle, objectName )` to attempt to get the address of a known symbol.
The above steps won't actually get me to where I want to be, but even that doesn't work. Does anyone have any experience doing this type of thing on Unix? Are there any headers or functions I can google to put me on the right track?
Thanks;)
iPhone does not support dynamic linking after the initial application launch. While what you want to do does not actually require linking in any new application TEXT, it would not shock me to find out that some of the dl* functions do not behave as expected.
You may be able to write some platform-specific code, but I recommend using a technique developed by the various BSDs called linker sets. Basically, you annotate the functions you want to do something with (just like you currently mark them for export). Through some preprocessor magic the annotations are stored, sometimes in an extra segment in the binary image, and then some code grabs that data and enumerates it. So you simply add all the functions you want into the linker set, then walk through the linker set and register everything in it with Lua.
I know people have gotten this up and running on Windows and Linux; I have used it on Mac OS X and various *BSDs. I am linking to the FreeBSD linker_set implementation, but I have not personally seen the Windows implementation.
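Not the FreeBSD macros themselves, but a minimal GCC/Clang-on-ELF flavoured sketch of the idea (FOR_LUA, lua_export_t, and the lua_set section name are all made up for the example); on Mach-O, i.e. on the iPhone, the section attribute syntax and the way you obtain the section bounds differ, so treat this purely as an illustration:
#include <stdio.h>

typedef struct {
    const char *name;
    int       (*fn)(void *L);   /* whatever signature you expose to Lua */
} lua_export_t;

/* Annotating a function drops one descriptor into the custom "lua_set" section. */
#define FOR_LUA(f) \
    static const lua_export_t f##_desc \
        __attribute__((used, section("lua_set"))) = { #f, f }

static int spawn_enemy(void *L) { (void)L; return 0; }
FOR_LUA(spawn_enemy);

/* For a section whose name is a valid C identifier, GNU linkers provide these bounds. */
extern const lua_export_t __start_lua_set[];
extern const lua_export_t __stop_lua_set[];

static void register_all_with_lua(void *L)
{
    for (const lua_export_t *e = __start_lua_set; e < __stop_lua_set; ++e) {
        printf("registering %s\n", e->name);   /* call lua_register(...) here */
        (void)L;
    }
}

int main(void)
{
    register_all_with_lua(NULL);
    return 0;
}
The point is that the set of exposed functions is assembled by the toolchain at link time, so registration code never has to be updated by hand.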
You need to pass --export-dynamic to the linker (via -Wl,--export-dynamic).
Note: This is for Linux, but could be a starting point for your search.
References:
http://sourceware.org/binutils/docs/ld/Options.html
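As a small illustration of what --export-dynamic buys you on Linux (a sketch; build with something like cc main.c -ldl -Wl,--export-dynamic): dlopen(NULL, ...) returns a handle to the main executable itself, and dlsym can then look up symbols the linker kept exported.
#include <dlfcn.h>
#include <stdio.h>

void spawn_enemy(void) { }               /* a function we want to find by name */

int main(void)
{
    void *self = dlopen(NULL, RTLD_NOW); /* handle to the running executable */
    if (!self) { fprintf(stderr, "%s\n", dlerror()); return 1; }

    void *sym = dlsym(self, "spawn_enemy");
    printf("spawn_enemy at %p\n", sym);

    dlclose(self);
    return 0;
}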
If static linking is an option, integrate that into the linker script. Before linking, do "nm" on all object files, extract the global symbols, and generate a C file containing a (preferably sorted/hashed) mapping of all symbol names to symbol values:
struct symbol { char *name; void *value; } symbols[] = {
    {"foo", foo},
    {"bar", bar},
    ...
    {0, 0}
};
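The extraction step could be a small shell helper along these lines (a sketch; it assumes GNU nm output and plain C symbols without name mangling, and writes rows you could #include between the braces above):
nm -g --defined-only *.o \
  | awk '$2 == "T" { printf("    {\"%s\", %s},\n", $3, $3) }' > lua_symbols.inc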
If you want to be selective in what you expose, it might be easiest to implement a naming schema, e.g. prefixing all functions/methods with Lua_.
Alternatively, you can create a trivial macro,
#define ForLua(X) X
and then grep the sources for ForLua, to select the symbols that you want to incorporate.
You could just generate a mapfile and use that instead, no?