I'm a relative newbie to SystemVerilog.
I have a package with class A defined in it. This class uses a virtual interface, since
it's a driver (BFM) in a testbench. I'm using a package so I can use the same BFM in
other designs.
In my testbench, I import class A and pass it an instance of the interface (via its virtual interface handle).
However, when a task in the class tries to assign a value to a signal in the interface, I'm getting a compilation error.
What am I doing wrong?
How can one package a BFM with a virtual interface?
Thanks,
Ran
SystemVerilog packages cannot contain interface definitions, so your interface needs to be compiled alongside your package source, not inside it. The classes you define reside in the package, while the interface definition resides in the global scope where modules live.
Classes within packages can reference virtual interfaces, but you need to make sure the interface is compiled and visible separately from the package source.
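A minimal sketch of the pattern, with illustrative names (bfm_if, bfm_pkg, and the signals are not from the original question):

// Interface compiled as a design element, outside the package.
interface bfm_if(input logic clk);
  logic       valid;
  logic [7:0] data;
endinterface

// The package only references the interface type through a virtual interface handle.
package bfm_pkg;
  class A;
    virtual bfm_if vif;              // handle, not a hierarchical reference

    function new(virtual bfm_if vif);
      this.vif = vif;
    endfunction

    task drive(input logic [7:0] d);
      @(posedge vif.clk);
      vif.data  <= d;
      vif.valid <= 1'b1;
    endtask
  endclass
endpackage

// The testbench instantiates the real interface and hands it to the class.
module tb;
  import bfm_pkg::*;

  logic clk = 0;
  always #5 clk = ~clk;

  bfm_if u_if(clk);
  A drv;

  initial begin
    drv = new(u_if);                 // bind the instance to the virtual handle
    drv.drive(8'hA5);
  end
endmodule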
Strictly according to the spec, I don't think this is possible since it adds an implicit external dependency:
Items within packages are generally type definitions, tasks, and functions. Items within packages shall not have hierarchical references to identifiers except those created within the package or made visible by import of another package. A package shall not refer to items defined in the compilation unit scope.
It doesn't say anything about the design element namespace, which is where interface declarations live, but accessing any member of an interface requires a hierarchical reference.
You should consider packages to be completely self-contained, other than pre-processor directives and import.
Generally, when a class is used before its declaration, the situation is resolved with a SystemVerilog typedef (a forward declaration). For example, if class A uses class B and class B uses class A, a typedef breaks the circular dependency.
When packages are involved in this scenario, both class A and class B must be in the same package; if they are not, the compile won't go through.
The reason is that the SystemVerilog parser needs the definitions of the forward-declared classes by the time it reaches the end of the package; if they live elsewhere, this fails.
The point to watch out for is that a forward typedef does not apply across packages.
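A small sketch of the forward typedef, assuming both classes sit in the same package (names are illustrative):

package ab_pkg;
  typedef class B;   // forward declaration: B is fully defined later in this package

  class A;
    B b_h;           // legal only because of the typedef above
  endclass

  class B;
    A a_h;
  endclass
endpackage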
I can either declare a package or import a package in Modelica models, but I am not sure whether there is any difference between them. I tried the following code, and both approaches work fine.
My question is :
Is there anything I should pay attention to when using these two methods?
partial model A
package SI1=Modelica.SIunits;
import SI2=Modelica.SIunits;
SI1.Voltage u1;
SI2.Voltage u2;
end A;
You are doing two fundamentally different things here, which both work for this case:
package SI1=Modelica.SIunits; is called a short class definition.
You create a new package named SI1, which inherits everything from Modelica.SIunits.
Short class definitions are basically the same as writing
package SI1
extends Modelica.SIunits;
end SI1;
See chapter 4.5.1 Short Class Definitions in the Modelica spec for details.
import SI2=Modelica.SIunits, on the other hand, simply influences where the Modelica tool looks for class definitions, so no new class is defined here. The Modelica spec explains this in chapter 13.2.1.1, Lookup of Imported Names.
If you just want to use the package, import it. That's what import was designed for. Declaring a new package only makes sense if you want to add functionality or change anything (which is very limited though, if you are using the short class definition).
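A rough sketch of that guideline, using the long form for the extending package so something can actually be added (the model, package, and added type are made up, not part of the standard library):

model UseImport
  import SI = Modelica.SIunits;   // lookup alias only, no new class is created
  SI.Voltage v;
end UseImport;

package MyUnits
  extends Modelica.SIunits;       // new package inheriting everything from SIunits
  type Fraction = Real(unit="1", min=0, max=1);   // the kind of addition that justifies a new package
end MyUnits;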
Only the import clause seems to trigger the lookup on a package that is not already loaded. Using for example the Modelica_LinearSystems2 library:
import: it checks, and Modelica_LinearSystems2 is loaded
partial model A
import ls2=Modelica_LinearSystems2;
end A;
package: it checks, but Modelica_LinearSystems2 is not loaded
partial model B
package ls=Modelica_LinearSystems2;
end B;
I guess that can break your models if not all of their dependencies are loaded when trying to simulate.
It is nevertheless interesting to see how Dymola (or even Modelica, since pedantic check doesn't throw any error) does not seem to care much about the use of package instead of import, when it comes to packages already loaded. I wasn't expecting the following model to work:
model C
package SI1=Modelica.SIunits;
SI1.Voltage u1;
parameter SI1.Current R=1;
equation
u1=2*R;
end C;
It turns out that even auto-completion (Ctrl+Space) works.
EDIT: got it shorter.
We created three modules following the Prism docs and our requirements.
We sliced horizontally with modules:
SharedServices
BusinessLogic
UserInterface
In the UserInterface module we are using Syncfusion components and other packages. It would be great to put everything in the UserInterface module, but how can we reference NuGet assemblies from that module in the shell (to apply theming, for example) and avoid having references in each module and the shell?
Should we add the NuGet package to each module and the shell (is that bad?), or is it possible to have one module that defines base classes referencing the external assemblies, which would be themeable (with a ResourceDictionary) and usable across the whole solution (shell and other modules)?
Thanks.
Very broad question, and it might well be closed, but I'll try to give you a few guiding thoughts:
Generally, you either slice horizontally (as you did, UI-module with all the views plus logic-module with all the services) or vertically (as your Product-module suggests: views, view models, services for the product in one module, those for the user in another).
You can do both, but then you should "slice through", so one module for product-ui, one for user-ui, one for product-services, one for user-services... you get the idea. That means a lot of modules, though.
Also, when creating your modules, have an idea of what you want to achieve. Modules can encapsulate components to be reused in another app. Or they can encapsulate exchangeable components, so you could create a car-sharing app today and tomorrow swap out the car-module for a bike-module and have a bike-sharing app. Or they can be used to enforce segregation of code based on risk analysis in a regulated environment. What I'm trying to convey: don't create modules just to have modules, make each module have a defined purpose.
Also, define the interfaces for the modules. I don't like modules to reference each other, as it effectively destroys all the segregation that would otherwise be there. Create separate non-module assemblies that only contain public interfaces, then make your modules contain the implementations as internal types. In an ideal world, no module assembly contains a public type. The interface-assemblies can be either per module, per consumer, or per link between modules (those checked boxes in your N2-chart; you have one, don't you?).
You want to keep the number of modules reasonable, as well as the dependencies between them (not as in "assembly references" but through interface-assembly).
how can we reference nuget assemblies from that module in the shell (to apply theming for example) to avoid having references in each modules & the shell ?
You should separate the "interface" part (e.g. base classes or DTOs, not part of the module) from the actual services part (that is the module). Example: Unity has a NuGet package for the interfaces (Unity.Abstractions) and one that contains the container implementation (Unity.Container). There's nothing wrong with everyone referencing the interface package; basically, that's saying "I want to use that interface".
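A rough sketch of that split applied to the theming question (all names here are made up for illustration; the actual Syncfusion calls are left as a comment because they depend on the packages you use):

// Contracts assembly - referenced by the shell and by every module.
namespace MyApp.Contracts
{
    public interface IThemeService
    {
        void ApplyTheme(string themeName);
    }
}

// UserInterface module assembly - the only project referencing the Syncfusion NuGet packages.
namespace MyApp.Modules.UserInterface
{
    internal sealed class SyncfusionThemeService : MyApp.Contracts.IThemeService
    {
        public void ApplyTheme(string themeName)
        {
            // Call the Syncfusion theming API here; no other assembly needs the Syncfusion references.
        }
    }
}

// Shell - only knows the contract and gets the implementation from the container.
namespace MyApp.Shell
{
    public sealed class ThemeSwitcher
    {
        private readonly MyApp.Contracts.IThemeService _themes;

        public ThemeSwitcher(MyApp.Contracts.IThemeService themes) => _themes = themes;

        public void UseDark() => _themes.ApplyTheme("Dark");
    }
}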
"The ambiguity, is in the box" - Monty Python.
Autofac is having a problem resolving an interface. See attached solution.
The Interface, IAmbiguous, is defined in project ACommon. It is implemented in project AInjectable. The AInjectable project does not / cannot reference ACommon. The AInjectable project defines IAmbiguous as an existing item brought in with a file link.
The UI project calls ACommon Inject and attempts to register the AInjectable assembly. IAmbiguous is not ambiguous initially but after a builder.RegisterAssemblyTypes command it becomes "ambiguous in the namespace." There is no error thrown when the container is built but the registration is not there.
Registration can be done "AsImplementedInterfaces" if Named and Keyed is not used. But then there is no way to Resolve the registration because the service IAmbiguous is "ambiguous in the namespace."
This question was double-posted as an issue on Autofac, but it is not an Autofac problem. I will copy/paste the answer from the issue here; for future readers, if you want to see the repro solution, go check out the full issue.
What you're doing by including the same interface in two different assemblies isn't something you should be doing. Note that by doing that, your AInjectable class is not implementing the interface from the ACommon project. It's implementing a different but identically named interface.
This sort of thing is a problem - having the same type (interface, class, whatever) name in two different assemblies. We even had a problem (#782) where we had a System.SerializableAttribute in Autofac as a shim for .NET Core. You really just can't do that.
You'll also see the same thing if you try to make a static extension method class that has the same namespace and name as some other static extension method class. Ambiguous references.
Without doing Reflection.Emit style code generation, you won't be able to declare an interface in one assembly ("Assembly A") and implement that interface in a different assembly ("Assembly B") without having Assembly B reference Assembly A. That's just how .NET works. What you're seeing is a manifestation of that when you use Autofac, but it's not caused by Autofac. It's caused by you doing something you shouldn't be doing in .NET.
The fix is to define your interfaces in a separate assembly that everyone implementing the interfaces can reference. (Or you can try to dynamically generate code using Reflection.Emit or Roslyn or something, but that's waaaay harder.)
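A minimal sketch of that fix, reusing the project names from the question (the implementation class and the registration code are illustrative, not the original repro):

// ACommon project - the single, shared definition of the interface.
namespace ACommon
{
    public interface IAmbiguous
    {
        void DoWork();
    }
}

// AInjectable project - add a project reference to ACommon instead of re-including the interface source as a linked file.
namespace AInjectable
{
    public class AmbiguousImpl : ACommon.IAmbiguous
    {
        public void DoWork() { }
    }
}

// UI project - assembly scanning now finds exactly one IAmbiguous type.
namespace Ui
{
    using Autofac;

    public static class Bootstrapper
    {
        public static ACommon.IAmbiguous ResolveAmbiguous()
        {
            var builder = new ContainerBuilder();
            builder.RegisterAssemblyTypes(typeof(AInjectable.AmbiguousImpl).Assembly)
                   .AsImplementedInterfaces();
            var container = builder.Build();
            return container.Resolve<ACommon.IAmbiguous>();
        }
    }
}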
Usually I create a Scala object that holds all my global constants.
I have been told that it is better to use "package object" in order to keep constants.
I have never used "package object" previously so my questions are:
What are the best practices to hold constants in Scala and why?
Why do I need "package object"?
You don't need a package object.
However, it allows you to make code available at the package level without declaring another class or object.
It's a convenience.
package foo.bar
package object dem {
// things here will be available in `foo.bar` package and all subpackages
// without the need of an import statement.
}
The only convention specified for constants concerns casing: constants should be PascalCase.
Keep in mind that declaring constants in the package object might make them available in places where you don't want them to be available.
I'll leave you with the naming convention for package objects:
The standard naming convention is to place the definition above in a file named package.scala that's located in the directory corresponding to package pp.
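A small sketch contrasting the two homes for constants (package and value names are invented for illustration):

// foo/bar/package.scala - visible everywhere under foo.bar without an import
package foo

package object bar {
  val MaxRetries: Int = 3
  val DefaultTimeoutMillis: Long = 5000L
}

// foo/bar/Constants.scala - must be referenced or imported explicitly
package foo.bar

object Constants {
  val MaxBatchSize: Int = 100
}

// Anywhere inside foo.bar, MaxRetries is already in scope;
// Constants.MaxBatchSize needs the Constants prefix or an import.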
What is the use of physical packages and the package declaration in Drools?
For example:
I have a rule file, Myrule.drl, in the physical package com.mycompany.
As I understand it, the package declaration in Drools doesn't depend on the actual physical package the file lies in.
So in Myrule.drl I can write:
package com.sample;
rule "my rule"
when
then
end
Can somebody help me to understand what is the relation between the physical package/folder that the drl file lies and the package declaration in drl file?
Your understanding is correct. There is no relation between the package declaration in your DRL and the place where the DRL file is.
All the Java classes generated from either the RHS of the rules or the type declarations are placed by the Drools compiler in the package defined in the DRL file where they appear. This allows you to have rules, or declare types, with the same name but in different packages.
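For example, a sketch with two DRL files in arbitrary folders (file names, package names, and rule bodies are illustrative):

// rules/billing.drl
package com.mycompany.billing;

rule "Validate order"
when
    // conditions...
then
    // actions...
end

// other/folder/shipping.drl
package com.mycompany.shipping;

rule "Validate order"   // same rule name, no clash: it lives in a different namespace
when
    // conditions...
then
    // actions...
end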
Hope it helps,
Why don't you just read the documentation, Drools "Expert" manual, Section 7.5?
A package is a collection of rules and other related constructs, such as imports and globals. The package members are typically related to each other - perhaps HR rules, for instance. A package represents a namespace, which ideally is kept unique for a given grouping of rules. The package name itself is the namespace, and is not related to files or folders in any way.
There's more...
Can somebody help me to understand what is the relation between the physical package/folder that the drl file lies and the package declaration in drl file?
There is no relationship (directly). You have to think in terms of what happens when you run the rules engine. When you run your application, these rules are turned into virtual objects placed in the declared namespace. So, for example:
DRL snippet
package com.mystuff
import java.util.logging.Logger;
import java.util.logging.Level;
rule "Hello World"
when
some condition...
then
Logger.log(Level.INFO, "Hello World");
end
When this rule is activated, given that you have logging turned on to the right levels, you will notice an entry similar to this one:
INFO com.mystuff.Rule_HelloWorld592399015 defaultConsequence Hello World
proving that the rule was turned into a virtual object (you won't find it as a .class file in your project) located in the com.mystuff namespace, as Esteban Aliverti mentioned in this post.
This is not that important if you are only operating with a single rule file. But, as you can imagine, if you have multiple files with similar rules, they will be resolved by the namespace included in the DRL file.