I have a Swift library with a core module plus optional bonus modules. I would like to use the following directory layout, mapping to exported Swift package names as shown:
Taco/
    Source/
        Core/        → import Taco
        Toppings/    → import TacoToppings
        SideDishes/  → import TacoSideDishes
To my eyes, that’s a sensible-looking project layout. However, if I’m reading the docs right, this will pollute the global module namespace with unhelpful names like “Core”. It seems that SwiftPM will only export a module whose name is identical to the directory name, and thus I have to do this:
Taco/
    Source/
        Taco/
        TacoToppings/
        TacoSideDishes/
Is there a way to configure Package.swift to use the tidier directory layout above and still export the desired module names?
Alternatively, is it possible to make the Core, Toppings, and SideDishes modules internal to the project, and export them all to the world as one big Taco module?
There is not currently a clean way to do this, but it seems like a reasonable request. I recommend filing an enhancement request at http://bugs.swift.org for it.
There is one "hacky" way you can do this:
Create your sources in the desired internal layout:
Sources/Core
Sources/Toppings
Add additional symbolic links for the desired module names:
ln -s Core Sources/Taco
ln -s Toppings Sources/TacoToppings
Add an exclude directive to the manifest to ignore the non-desired module names:
import PackageDescription

let package = Package(
    name: "Taco",
    exclude: ["Sources/Core", "Sources/Toppings"]
)
is it possible to make the Core, Toppings, and SideDishes modules internal to the project, and export them all to the world as one big Taco module?
No, unfortunately there is no way to do this currently, and it requires substantial compiler work to be able to support.
What causes this issue
Ansible supports a user-defined module_utils folder; we can add the following line in ansible.cfg:
module_utils = /xxx/lib/module_utils
Then, when the playbook runs, Ansible will combine both /usr/local/lib/python3.6/dist-packages/ansible/module_utils and /xxx/lib/module_utils.
So we can import module utilities in a user-defined Ansible module, like:
import ansible.module_utils.my_utils
But pylint doesn't read the ansible.cfg file and combine the user-defined utility folder with the system one. So it can't find my_utils in /usr/local/lib/python3.6/dist-packages/ansible/module_utils, which causes this issue.
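To make the setup concrete, here is a minimal sketch of a custom module that relies on the combined module_utils path (my_utils is the user-defined utility from above; the do_something helper and the file name are made up for illustration):

#!/usr/bin/python
# library/my_module.py -- minimal sketch of a custom Ansible module
from ansible.module_utils.basic import AnsibleModule
# Resolved at runtime via the combined module_utils search path configured in ansible.cfg:
from ansible.module_utils import my_utils


def main():
    module = AnsibleModule(argument_spec=dict())
    # do_something() stands in for whatever my_utils actually provides
    result = my_utils.do_something()
    module.exit_json(changed=False, result=result)


if __name__ == '__main__':
    main()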
My question
Is there any way to make pylint 'see' the modules in the user-defined folder?
BTW, adding an additional search path to the pylint configuration like below won't fix this issue:
init-hook='import sys; sys.path.append("/xxx/lib/module_utils")'
because in the Ansible module we use the ansible.module_utils namespace:
import ansible.module_utils.my_utils
not
import my_utils
I'm trying to build a Swift Package Manager system package (a module.modulemap) making available two system C libraries, where one includes the other. That is, one (say libcurl) is a base module, and the other C library includes it (like so: #include "libcurl.h"). On the regular C side this works, because the makefiles pass in the proper -I flags and all is good (and I could presumably do the same in SPM, but I'd like to avoid extra flags to SPM).
So what I came up with is this module map:
module CBase [system] {
    header "/usr/include/curl.h"
    link "curl"
    export *
}

module CMyLib [system] {
    use CBase
    header "/usr/include/mylib.h"
    link "mylib"
    export *
}
I got importing CBase in a Swift package working fine.
But when I try to import CMyLib, the compiler complains:
error: 'curl.h' file not found
Which is kinda understandable, because the compiler doesn't know where to look (though I assumed that use CBase would help). Is there a way to get this to work without having to add -Xcc -I flags to the build process?
Update 1: To a degree this is covered in Swift SR-145 and SE-0063: SwiftPM System Module Search Paths.
The recommendation is to use the Package.swift pkgConfig setting. This seems to work OK for my specific setup. However, it is a chicken-and-egg problem if there is no .pc file. I tried embedding my own .pc file in the package, but the system package directory isn't added to the PKG_CONFIG_PATH (and hence won't be considered during the compilation of a dependent module). So the question stands: how to accomplish this in an environment where the libs are installed, but without a .pc file (just headers and libs)?
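For reference, the pkgConfig route I tried looks roughly like this in the system-package manifest (a sketch; the .pc name "mylib" and the provider package names are made up for illustration):

import PackageDescription

let package = Package(
    name: "CMyLib",
    pkgConfig: "mylib",            // name of the .pc file pkg-config should resolve
    providers: [
        .Brew("mylib"),            // install hint for Homebrew users
        .Apt("libmylib-dev")       // install hint for apt users
    ]
)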
I'm a little confused by the various terminology used in the SystemJS configuration. It talks about module, location, package, etc.
Isn't a module in JS a single file, and a package a collection of modules or files? If so, how can a module be an alias for a package?
This is from the documentation page:
The map option is similar to paths, but acts very early in the normalization process. It allows you to map a module alias to a location or package:
Yes, a module is a single file; in JavaScript it's just the file name (with an assumed .js extension) in quotes after the from keyword in
import ... from 'some-module';
In the SystemJS config file, paths and map can be used to define what actual file or URL some-module refers to.
packages in the config file allows you to apply a set of configuration parameters (default extension, module format, custom loader, etc.) to all modules in or below a particular location (the key in the packages object).
One of the settings in packages is main, which is similar to main in package.json in node (except that its default value is empty, not index.js): it determines which file is loaded when the package location itself appears after from in an import statement.
So, I think "how a module can be an alias to a package?" question about this
The map option is similar to paths, but acts very early in the
normalization process. It allows you to map a module alias to a
location or package:
can be explained on this example:
paths: {
    'npm:': 'node_modules/'
},
map: {
    'some-module': 'npm:some-module'
},
packages: {
    'some-module': {
        main: './index.js'
    }
}
When these map, packages, and paths settings are applied by SystemJS to
import something from 'some-module';
they will cause SystemJS to load a module from node_modules/some-module/index.js under baseURL.
and
import something from 'some-module/subcomponent';
is mapped to node_modules/some-module/subcomponent.js.
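For completeness, these settings would normally be passed to System.config (a sketch in the 0.19-style configuration, reusing the names from the example above):

System.config({
  paths: {
    'npm:': 'node_modules/'
  },
  map: {
    'some-module': 'npm:some-module'
  },
  packages: {
    'some-module': {
      main: './index.js'
    }
  }
});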
Note: this is based on my experience with SystemJS 0.19. I haven't tried 0.20 yet.
I have a Coq project with its libraries organised into subdirectories, something like:
…/MyProj/Auxiliary/Aux.v
…/MyProj/Main/Main.v (imports Auxiliary/Aux.v)
When I compile the files, I expect to do so from working directory MyProj (via a makefile). But I also want to work on the files using Proof General/Coqtop, in which case the working directory is by default the directory in which the file lives.
But this means that the LoadPath is different between the two contexts, and so the logical path needed for the library import is different. How do I set up the coqc invocation, the LoadPath, and the import declarations so that they work in both contexts?
With each approach I have tried, something goes wrong. For instance, if I compile Aux.v by invoking
coqc -R "." "MyProj" Auxiliary/Aux.v
and import it in Main.v as
Require Import MyProj.Auxiliary.Aux.
then this works when I compile Main.v with
coqc -R "." "MyProj" Main/Main.v
but fails in Coqtop, with Error: Cannot find library MyProj.Auxiliary.Aux in loadpath. On the other hand, if before the Require Import I add
Add LoadPath ".." as MyProj.
then this works in Coqtop, but fails under coqc -R "." "MyProj" Main/Main.v, with
Error: The file […]/MyProj/Auxiliary/Aux.vo contains library MyProj.Auxiliary.Aux and not library MyProj.MyProj.Auxiliary.Aux
I’m looking for a solution that’s robust for a library that’s shared with collaborators (and hopefully eventually with users), so in particular it can’t use absolute file paths. The best I have found for now is to add emacs local variables to set the LoadPath up when Proof General invokes Coqtop:
((coq-mode . ((coq-prog-args . ("-R" ".." "MyProj" "-emacs")))))
but this (a) seems a little hacky, and (b) only works for Proof General, not in CoqIDE or plain Coqtop. Is there a better solution?
Allow me to side-step your question by suggesting an alternative process, hinted at by Tiago.
Assuming that your project's tree looks like this:
MyProj/Auxiliary/Aux.v
MyProj/Main/Main.v
In MyProj, write a _CoqProject file listing all your Coq files
-R . ProjectName
Auxiliary/Aux.v
Main/Main.v
When you open one of these Coq files, emacs will look for the _CoqProject and do-the-right-thing (tm).
As shown by Tiago, coq_makefile will also give you a Makefile for free.
I know you explicitly asked for something that works across different platforms, but there's already a Proof-General-specific solution that is less hacky than the one you have. Proof General has a special variable called coq-load-path that you can set with local variables, much like you did for coq-prog-args. The advantage is that you don't have to worry about any other arguments that need to be passed to coqtop (such as -emacs in your example). Thus, your .dir-locals.el file could have a line like this:
((coq-mode . ((coq-load-path . ((".." "MyProj"))))))
Unfortunately, I am not aware of anything that works across platforms, although I'm pretty sure that something specific for CoqIDE must exist. If this is the case, maybe you could set up a script to keep these configuration files updated across different platforms?
If you use coq_makefile you can install the library on your system.
Without OPAM
To initialize your project:
coq_makefile -f _CoqProject -o Makefile
Share your library with other projects:
make install
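Other projects can then refer to the installed library by its logical name (a sketch, assuming the -R mapping in _CoqProject is -R . ProjectName as above):

Require Import ProjectName.Auxiliary.Aux.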
With OPAM
Assuming you have OPAM installed, you can use coq-shell to help you take care of dependencies.
To setup your project:
coq_shell_url="https://raw.githubusercontent.com/gares/opam-coq-shell/master/src/opam-coq"
curl -s "${coq_shell_url}" | bash /dev/stdin init 8.4 # Install Coq and its dependencies
eval `opam config env --switch=coq-shell-8.4` # Setup the environment
coq_makefile -f _CoqProject -o Makefile # Generates the makefile
opam pin add coq:YOURLIBRARY . # Add your library to OPAM
When you update your library you should do:
opam upgrade coq:YOURLIBRARY
Here is an example of a project that uses the OPAM method:
https://bitbucket.org/cogumbreiro/aniceto-coq/src
I can't use functions from custom subdirectories.
My Code Organization
Under "src" I have a path hierarchy like
a/b
with all my directories and Go files (it is the "root" of my project). The directories contain no subdirectories, and it works fine. So the deepest path is "a/b/c". E.g. I have
a/b/c
and
a/b/d
with some Go files. Importing "a/b/d" and calling a function with "d.DoSomething()" from a file in "a/b/c" works fine.
Problem description
Now I want to reorganize "a/b/d". I move some files from "a/b/d" to
a/b/d/e
and the rest of the files to
a/b/d/f
If I try to import "a/b/d/e" with the import statement
import ( "a/b/d/e" )
from the same file in "a/b/c" and want to call "e.DoSomething()" (this is where the file with the DoSomething function moved to), I get an error at the line where I call "e.DoSomething()": "undefined: e".
While searching for a solution, I have nowhere seen examples with deeper path hierarchies. Is it generally not possible to import subdirectories, or what is the problem?
Go version I used: go1.2.2 linux/amd64
Thanks for any advice
Your approach is completely wrong. Go has absolutely no concept of importing files or directories; all you can import in Go are packages. It just happens that the name of a package is its path relative to GOPATH, and you import packages by that name. But the identifier under which an imported package is available in the importing code depends on the package declaration of that package. You cannot simply "move" files between directories without changing the package declaration, because each directory is (for the go tool) a single package.
You can have package x under path a/b/c. When you import package x with import ( "a/b/c" ), all the exported stuff from package x is available to you as x.ExportedName.
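As an illustration, here is a minimal sketch based on the directory names from the question (the file names and return values are made up; the point is that the files moved to a/b/d/e must declare package e):

// File a/b/d/e/util.go: the moved files must declare the new package name.
package e

func DoSomething() string {
    return "done"
}

// File a/b/c/caller.go: import the package by its path, refer to it by its declared name.
package c

import "a/b/d/e"

func Use() string {
    return e.DoSomething()
}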
Please read http://blog.golang.org/organizing-go-code carefully.
Try and do a go build in a/b/d/e first, before trying to build in a/b: that will generate the compiled package you want to import.