Suppose I want to find out in which package a class is defined, e.g. say (defclass x ()()) is defined in p1. One way could be to get the package via (symbol-package 'x). The problem with this solution is that x is exported from a different package p2. Any other suggestions?
As Rainer Joswig said, classes aren't defined in packages; symbols have packages and the name of a class is a symbol.
If you want to know the value of *PACKAGE* at the time the class definition was read, compiled, or loaded (which could conceivably be three different values), I do not believe there is any way to retrieve it unless you write code to store it at that time.
Furthermore, it doesn't seem like a meaningful piece of information to have. A package is simply a namespace for symbols, and there's no reason the package that was current at the time the class definition was read, compiled, or loaded should have anything to do with the class itself.
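If you really do need that information, you would have to record it yourself at definition time. A minimal sketch of one way to do that with a small wrapper macro (the macro and hash-table names here are made up for illustration, not part of any library):

(defvar *class-definition-packages* (make-hash-table)
  "Maps a class name to the name of the package current when it was defined.")

(defmacro defclass/recorded (name supers slots &rest options)
  ;; capture *PACKAGE* at macroexpansion time and store its name
  `(progn
     (setf (gethash ',name *class-definition-packages*)
           ,(package-name *package*))
     (defclass ,name ,supers ,slots ,@options)))

;; (defclass/recorded x () ())
;; (gethash 'x *class-definition-packages*)  ; => "P1" (or whatever was current)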
However, if what you really want is for the name of the class x to reside in package p1 while p2 exports an x of its own, you may be interested in adding x to the :shadow list of p1 in its defpackage form (or calling shadow afterwards).
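A minimal sketch of what that could look like (package names taken from the question):

(defpackage :p2
  (:use :cl)
  (:export #:x))

(defpackage :p1
  (:use :cl :p2)
  (:shadow #:x))        ; p1::x now shadows the inherited p2:x

(in-package :p1)
(defclass x () ())      ; the class name is now the symbol p1::x
(symbol-package 'x)     ; => #<PACKAGE "P1">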
I want to build up a tests library and keep it separated from the libraries under development. My first thought is to go for a structure like the following:
PensLib
--Variants
----BallPoint
----FountainPen
----Tests
------TB_BallPoint
HammocksLib
--Variants
----SingleHammock
----DoubleHammock
----Tests
------TB_DoubleHammock
--Systems
----IndoorWalls
----OutdoorWallAndTree
----CoconutPalms
----Tests
------TB_IndoorWalls
Tests
--PensLib
----Variants
------Test_BallPoint // extends PensLib.Variants.Tests.TB_BallPoint
--HammocksLib
----Variants
------Test_DoubleHammock // extends HammocksLib.Variants.Tests.TB_DoubleHammock
----Systems
------Test_IndoorWalls // extends HammocksLib.Systems.Tests.TB_IndoorWalls
For now let's assume that the way I structure my libraries makes sense (which it most likely doesn't). I will soon ask more questions on good practices for setting up the testing environment in Dymola and with the Testing Library.
My question is about the correct way to handle relative and absolute paths within models, if possible at all.
The model PensLib.Variants.Tests.TB_BallPoint is used for developing the variant BallPoint
The model Tests.PensLib.Variants.Test_BallPoint is used for automated testing
I want the model Test_BallPoint to extend the model TB_BallPoint, but I cannot link them. I guess the absolute path PensLib.Variants.Tests.TB_BallPoint is treated as a relative one, since PensLib is found "on the way out" of the Tests library, and from there the lookup continues with the rest of the path. Is there perhaps a way to control the path, something like ..\..\..\PensLib\Variants\Tests\TB_BallPoint?
As you already noted, such a setup causes trouble. There are ways around it, namely global name lookup and imports, which I explain briefly further below.
Both solutions are fine when you only need them in a few places. But if you have to use them all the time, they make your setup unnecessarily complicated.
Hence, I suggest making your life easier by changing your package structure:
Either create a dedicated test library for every library, maybe PensLib_Tests and HammocksLib_Tests
Or rename the packages in the Tests library and don't use the exact library names
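For example, with the second option the Tests library could be structured like this (the renamed package names are just an illustration):

Tests
--PensLib_T
----Variants
------Test_BallPoint // extends PensLib.Variants.Tests.TB_BallPoint

Since the test packages no longer shadow the library names, the extends clauses resolve as expected.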
Global name lookup
You can use absolute class paths. They are denoted with a leading ., so this should work:
extends .PensLib.Variants.Tests.TB_BallPoint;
See Modelica Specification chapter 5: Scoping, Name Lookup, and Flattening for details, especially 5.3.3 Global Name Lookup
Importing
You can simply import the library. Lookup of imports is always performed globally.
import PensLib;
extends PensLib.Variants.Tests.TB_BallPoint;
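Putting it together, a sketch of what the automated-test model could look like, here using the global name lookup from above (the model and package names follow the structure in the question):

within Tests.PensLib.Variants;
model Test_BallPoint
  // the leading dot forces lookup to start at the top level,
  // so the top-level PensLib is found instead of Tests.PensLib
  extends .PensLib.Variants.Tests.TB_BallPoint;
end Test_BallPoint;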
I was trying to create a DataMatrix variable by calling the DataMatrix() function.
But that function doesn't exist. If I type this:
>> DataMatrix
I get this error message:
Undefined function or variable 'DataMatrix'.
I did install the Bioinformatics Toolbox, and my version is R2016b on Mac.
Any ideas?
As @Andras mentioned in the comments, the procedure to import and use this class is already mentioned in the documentation for the class (though you might be forgiven if you missed it, as it's not on the top part of the page dealing with syntax).
The tl;dr version is that you should either access the class constructor as, e.g.:
D = bioma.data.DataMatrix(...);
or, import the class from the package / namespace first, and then use it directly, i.e.:
import bioma.data.DataMatrix;
D = DataMatrix(...);
Explanation
The reason you need this step in the first place is that this class is enclosed inside a "package" (a.k.a. a "namespace"). Read the section called "Packages Create Namespaces" in the matlab documentation to find out more about what this means.
In essence, it boils down to the fact that if you have a folder whose name has a + prefix, then this folder acts as a namespace for the functions contained within it.
So, if you have a folder called +MyPackage whose parent folder is on your path, and it contains a function m-file called myfunction.m (which is therefore not directly on your path itself), then you can access this function in the matlab terminal by typing MyPackage.myfunction().
Or, you can import MyPackage.myfunction from that package / namespace and then use myfunction directly.
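For example, a minimal sketch (MyPackage and myfunction are made-up names, as above):

% folder layout:  somefolder\+MyPackage\myfunction.m, with somefolder on the path

MyPackage.myfunction()        % fully qualified call

import MyPackage.myfunction   % or import it once ...
myfunction()                  % ... and call it unqualified afterwards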
So, going back to DataMatrix, you will see that if you search where the class definition is located in your matlab folder, you'll find it here:
./toolbox/bioinfo/microarray/+bioma/+data/@DataMatrix/DataMatrix.m
and presumably ./toolbox/bioinfo/microarray is already in your path.
I.e. the bioma package/namespace is in your path, and you can access the data package/namespace below it, and then the class definition for DataMatrix by doing bioma.data.DataMatrix.
PS: Furthermore, the "@" prefix in front of a folder name denotes a class folder, containing the constructor and class methods. If this "@folder" is in your path (or imported, etc.), then this means you have access to the underlying constructor. Class folders date from matlab's old object-oriented style, before the classdef keyword was introduced, but they are still supported. You can read more about class directories here if you're interested.
I have a Fortran project with some name conflicts (from doxygen's point of view). Sometimes a local variable in a procedure may have the same name as a subroutine or function. For compilation/linking there are no problems, as the different definitions live separate lives, for instance:
progA/main.f defines and uses the variable delta.
libB/delta.f defines a function named delta.
progB/main.f uses the function delta defined in libB.
progB is linked with libB, progA is not linked with libB.
In this case, when generating call/caller graphs, or linked source code, the variable delta in progA/main.f will be identified as the function delta. Is there some combination of doxygen settings I can use to inform it that progA is not supposed to use definitions in libB, or something similar?
Another issue is that I may have functions/subroutines with the same name in different subdirectories. Again, as long as they are not linked together this does not present a problem for compilation, but doxygen cannot identify which of them is meant in links, calls, etc. Is there some way to work around this (without renaming procedures, that is)?
I have some code for solving a puzzle game called nurikabe. Recently I've been rewriting it in an OOP style (still learning) and have the following structure:
# CNurikabe.py
from includes import Board, Validation, Heuristics
class CNurikabe(object):
...
# CValidation.py
from includes import Board, Heuristics
class CValidation(object):
...
# CHeuristics.py
from includes import Board
class CHeuristics(object):
...
# CBoard.py
class CBoard(object):
    def __init__(self, filename):
        # Vars shared by every class
        self.x, self.y, self.z, self.t = self.parseData(filename)
# run.py
from CNurikabe import CNurikabe
nurikabe = CNurikabe()
nurikabe.solve('output')
# includes.py
from CBoard import CBoard
Board = CBoard('data.dat')
from CHeuristics import CHeuristics
Heuristics = CHeuristics()
from CValidation import CValidation
Validation = CValidation()
The CBoard class has info which has to be shared among all the other classes (such as board dimensions, number coordinates, etc.). I also want it to be instantiated only once, and if possible I'd like to avoid dependency injection (unnecessarily passing the filename to each class's init method, for example).
The classes need access to the following:
CValidation class use: CBoard and CHeuristics
CHeuristics class use: CBoard
CNurikabe class use: CBoard, CValidation and CHeuristics
The code I have works just as expected. I can call other classes' methods from within each class just the way I want, for example:
# CNurikabe.py:
class CNurikabe(object):
    def someFunc(self):
        for i in range(Board.dimensionx):
            Heuristics.doSomeStuff()
            Validation.doSomeMore()
But I've read maybe too much about how globals are evil. Also the code inside includes.py is a bit hackish, because if I change the order of the imports the program won't run, complaining about being unable to import some names.
I also tried another way: only instantiating the CBoard class globally and then, for the other classes, creating an instance of the classes I need. But I felt that was kind of repetitive, creating a separate instance of CHeuristics inside each class, for example, and it still wouldn't solve the CBoard global problem.
I also thought about creating an instance inside each class's init, but then the code would be very verbose, having to call for example: self.Heuristics.doSomeStuff()
So my question is: what would be a better approach to structuring this? I've read about singleton patterns (which may be overkill, since it's a small project) and endless ways of doing this in languages like C++ and PHP. The way I'm doing it actually resembles the "extern Class instance;" style in C++; a long time ago I worked on a C++ project with that style and liked it, and didn't see any problems with it, although the class instances were global.
Globals are evil. However, you probably need a singleton pattern that encapsulates some things together. My experience from C++ and Python is that you can nicely use the hybrid character of the language and use a module in the role of a singleton. (If you think about it, module variables play the role of a singleton's member variables, and plain functions in the module resemble the methods of a singleton.)
So I suggest putting the board functionality into board.py, the heuristic functionality into heuristic.py, and so on; convert the methods into plain functions and self.variable into module-level variable, and use:
import board
import heuristic
import validation
...
class CNurikabe:  # the explicit (object) base class is not needed in Python 3
    def someFunc(self):
        for i in range(board.dimensionx):
            heuristic.doSomeStuff()
            validation.doSomeMore()
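A minimal sketch of what board.py and heuristic.py could look like under this approach (parse_data and the variable names are placeholders following the question's code):

# board.py -- module-level variables play the role of the singleton's members
def parse_data(filename):
    # placeholder for the parsing done in CBoard.parseData
    return 0, 0, 0, 0

x, y, z, t = parse_data('data.dat')
dimensionx = x   # placeholder; use whatever the real parsed dimension is

# heuristic.py -- plain functions instead of CHeuristics methods
import board

def doSomeStuff():
    # shared board state is read directly from the board module
    return board.dimensionx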
Think about import board as getting the reference to the singleton instance -- which it actually is, because there is exactly one instance of the module object. And it will be syntactically almost the same as your older code -- except that getting the instance (the module) is easier.
Update: Once you need more instances, you should think in classes again. However, passing objects around is a very cheap operation in Python -- you only copy reference values (technically an address, i.e. 4 or 8 bytes). Once you have the reference value, you can easily assign it to a local variable. (Every assignment in Python means copying the reference value, hence sharing access to the assigned object.) This way, there is actually no need for global variables, and no excuse to use them.
Using local variables, the syntax again remains the same.
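For illustration, a tiny sketch of the class-based alternative from the update, with the shared board passed in explicitly (the names follow the question; the attribute is a placeholder):

class CBoard:
    def __init__(self, filename):
        self.dimensionx = 0            # placeholder for the parsed data

class CHeuristics:
    def __init__(self, board):
        self.board = board             # only a reference value is copied -- cheap

board = CBoard('data.dat')
heuristics = CHeuristics(board)        # the single board instance is shared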
I have a directory with some helper functions that should be put into a package. Step one is obviously naming the directory something like +mypackage\ so I can call functions with mypackage.somefunction. The problem is, some functions depend on one another, and apparently MATLAB requires package functions to call functions in the very same package still by explicitly stating the package name, so I'd have to rewrite all function calls. Even worse, should I decide to rename the package, all function calls would have to be rewritten as well. These functions don't even work correctly anymore when I cd into the directory as soon as its name starts with a +.
Is there an easier solution than rewriting a lot? Or at least something self-referential like import this.* to facilitate future package renaming?
Edit: I noticed the same goes for classes and static methods, which is why I put the self-referential part into this separate question.
In truth, I don't know that you should really be renaming your packages often. It seems to me that the whole idea behind a package in MATLAB is to organize a set of related functions and classes into a single collection that you could easily use or distribute as a "toolbox" without having to worry about name collisions.
As such, placing functions and classes into packages is like a final step that you perform to make a nice polished collection of tools, so you really shouldn't have much reason to rename your packages. Furthermore, you should only have to go through the functions once to prepend the package name to package function calls.
... (pausing to think if what I'm about to suggest is a good idea ;) ) ...
However, if you really want to avoid having to go through your package and prepend your function calls with a new package name, one approach would be to use the function mfilename to get the full file path for the currently running package function, parse the path string to find the parent package directories (which start with "+"), then pass the result to the import function to import the parent packages. You could even place these steps in a separate function packagename (requiring that you also use the function evalin):
function name = packagename
% Get full path of calling function:
callerPath = evalin('caller', 'mfilename(''fullpath'')');
% Parse the path string to get package directories:
name = regexp(callerPath, '\+(\w+)', 'tokens');
% Format the output:
name = strcat([name{:}], [repmat({'.'}, 1, numel(name)-1) {''}]);
name = [name{:}];
end
And you could then place this at the very beginning of your package functions to automatically have them include their parent package namespace:
import([packagename '.*']);
Is this a good idea? Well, I'm not sure what the computational impacts will be if you're doing this every time you call a package function. Also, if you have packages nested within packages you will get output from packagename that looks like this:
'mainpack.subpack.subsubpack'
And the call to import will only include the immediate parent package subsubpack. If you also want to include the other parent packages, you would have to sequentially remove the last package from the above string and import the remainder of the string.
In short, this isn't a very clean solution, but it is possible to make your package a little easier to rename in this way. However, I would still suggest that it's better to view the creation of a package as a final step in the process of creating a core set of tools, in which case renaming should be an unlikely scenario and prepending package function calls with the package name would only have to be done once.
I have been exploring answers to the same question and I have found that combining a package with private folders can allow most or all of the code to be used without modification.
Say you have
+mypackage\intfc1.m
+mypackage\intfc2.m
+mypackage\private\foo1.m
+mypackage\private\foo2.m
+mypackage\private\foo3.m
Then from intfc1, foo1, foo2, and foo3 are all reachable without any package qualifiers or import statements, and foo1, foo2, and foo3 can also call each other without any package qualifiers or import statements. If foo1, foo2, or foo3 needs to call intfc1 or intfc2, then that needs qualification as mypackage.intfc1 or an import statement.
In the case that you have a large set of mutually interdependent functions and a small number of entry points, this reduces the burden of adding qualifiers or import statements.
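To illustrate the call patterns described above (the function names are the placeholders from the listing, and the bodies are made up):

% +mypackage\intfc1.m
function out = intfc1(x)
out = foo1(x);             % private function: no qualifier or import needed
end

% +mypackage\private\foo1.m
function out = foo1(x)
out = foo2(x);             % private-to-private call: also unqualified
% calling back up into a package function requires the package qualifier:
out = out + mypackage.intfc2(x);
end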
To go even further, you could create new wrapper functions at the package level with the same name as private functions
+mypackage\foo1.m <--- new interface layer wraps private foo1
+mypackage\private\foo1.m <--- original function
where for example +mypackage\foo1.m might be:
function answer = foo1(some_parameter)
answer = foo1(some_parameter); % calls private function, not itself
end
This way, unlike the intfc1 example above, all the private code can run without modification. In particular there is no need for package qualifiers when calling any other function, regardless of whether it is exposed by a wrapper at the package level.
With this configuration, all functions including the package-level wrappers are oblivious of the package name, so renaming the package is nothing more than renaming the folder.