I wanted to print out a structure used in an ovm_sequence_item. Since the structure is long, I plan to override the table printer knobs using tbl_printer.knobs.value_width = 100;
Here is the code snippet:
virtual function void do_print(ovm_printer printer);
  ovm_table_printer tbl_printer;
  super.do_print(printer); // print all other fields
  $cast(tbl_printer, printer);
  tbl_printer.knobs.value_width = 100;
  tbl_printer.print_generic("ppid", "CppPpid_t", $bits(CppPpid_t),
    $psprintf("A=%0b,B=%0b,C=%0d,D=%0d,E=%0d,F=%0x",
      my_struct.A,
      my_struct.B,
      my_struct.C,
      my_struct.D,
      my_struct.E,
      my_struct.F)
  );
endfunction: do_print
I am getting this casting error.
Error-[DCF] Dynamic cast failed
*.sv, 58
Casting of source class type 'SIP_SHARED_LIB.ovm_pkg.ovm_tree_printer' to
destination class type 'SIP_SHARED_LIB.ovm_pkg.ovm_table_printer' failed due
to type mismatch.
Please ensure matching types for dynamic cast
Can someone help me understand what I am doing wrong? How is it getting an ovm_tree_printer when I am trying to use ovm_printer?
ovm_printer is just the base class that declares the API for printers. What gets passed around are concrete classes, like ovm_table_printer or ovm_tree_printer. Both of these can be stored in a variable of type ovm_printer.
You're probably passing a tree printer in the print(...) call on your object. If you don't specify any printer, the default printer is used. By default this is the table printer, but it's possible that this was changed. Look for ovm_default_printer in the package scope.
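For example, if you control the call site, you can request a table print explicitly. A minimal sketch, assuming the global ovm_default_table_printer instance from ovm_pkg and a handle my_item to your sequence item:
// Force a table print for this call, regardless of what ovm_default_printer points to
my_item.print(ovm_default_table_printer);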
If someone explicitly wants you to do a tree print, then you can't magically change that into a table print. The best you can do is check whether you're doing a table print and, if so, change the knobs:
if (!$cast(tbl_printer, printer))
  return;
tbl_printer.knobs.value_width = 100;
I often get this warning message:
UVM_WARNING # 0: reporter [TPRGED] Type name 'packet2mem_comp_Str' already registered with factory. No string-based lookup support for multiple types with the same type name.
I did not register any class with the same name, except perhaps the parent one, which, I suppose, should not present any problem.
My class is an inherited parameterized class, declared as follows:
class packet2mem_comp #(string S = "MEM") extends mem_comp;
  typedef packet2mem_comp #(S) packet2mem_comp_Str;
  `uvm_object_utils(packet2mem_comp_Str)

  function new (string name = "packet2mem_comp");
    super.new(name);
  endfunction : new

  ... // rest of my code
endclass: packet2mem_comp
Does anyone know how to fix this warning?
There are special versions of the macros for parameterised classes. Instead of
`uvm_object_utils(packet2mem_comp_Str)
try
`uvm_object_param_utils(packet2mem_comp_Str)
or perhaps
`uvm_object_param_utils(packet2mem_comp #(S))
You have not posted an MCVE, so I have not tested this.
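For reference, applied to the class above the first variant would look roughly like this (equally untested):
class packet2mem_comp #(string S = "MEM") extends mem_comp;
  typedef packet2mem_comp #(S) packet2mem_comp_Str;
  // param_utils skips the string-based type-name registration that triggers TPRGED
  `uvm_object_param_utils(packet2mem_comp_Str)

  function new (string name = "packet2mem_comp");
    super.new(name);
  endfunction : new
endclass : packet2mem_comp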
I have a class whose main purpose revolves around a temp-table. I want to make a constructor that takes an identical temp-table as input.
So far the compiler chokes on any attempt to pass a temp-table as an input parameter. If I use a table handle instead, it works. But I'd rather not copy from a dynamic table to a static one.
Progress wants the tables to match up at compile time, but I know they'll be the same - they're defined in a .i file.
Is there an easy way to line up the tables, or am I stuck parsing it out one field at a time?
Works like a charm for me.
CLASS Test.TTOO.TempTableWrapper:

    {Test/TTOO/ttCustomer.i}

    CONSTRUCTOR PUBLIC TempTableWrapper (TABLE ttCustomer):

        FOR EACH ttCustomer:
            DISPLAY ttCustomer.CustNum ttCustomer.Name.
        END.

    END CONSTRUCTOR.

END CLASS.
and the caller:
ROUTINE-LEVEL ON ERROR UNDO, THROW.

USING Test.TTOO.* FROM PROPATH.

DEFINE VARIABLE oWrapper AS TempTableWrapper NO-UNDO.

{Test/TTOO/ttCustomer.i}

/* *************************** Main Block *************************** */

CREATE ttCustomer.
ASSIGN ttCustomer.CustNum = 42
       ttCustomer.Name    = "It works".

oWrapper = NEW TempTableWrapper(TABLE ttCustomer).
You can also pass the temp-table by-ref:
oWrapper = NEW TempTableWrapper(TABLE ttCustomer BY-REFERENCE) .
However, the temp-table data is then only available during the constructor, because BY-REFERENCE calls "overlap" the temp-table within the callee only for the duration of that call.
For a permanent "BY-REFERENCE", use the BIND keyword on both the call and the parameter - in which case the callee must define the temp-table as REFERENCE-ONLY.
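Roughly, the BIND variant would look like this (untested sketch; it assumes the ttCustomer.i include can be made REFERENCE-ONLY, e.g. via a preprocessor argument, which is a common convention but not shown in the question):

/* In the class: the temp-table must be defined REFERENCE-ONLY */
{Test/TTOO/ttCustomer.i &REFERENCE-ONLY=REFERENCE-ONLY}

CONSTRUCTOR PUBLIC TempTableWrapper (TABLE ttCustomer BIND):
    /* ttCustomer now stays bound to the caller's table for the lifetime of the object */
END CONSTRUCTOR.

/* In the caller: */
oWrapper = NEW TempTableWrapper(TABLE ttCustomer BIND).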
Note, it's not required (although recommended at least by me) to define the temp-tables in include files. At runtime and compile time, the schemas just need to match.
When the compiler does not like your call, delete the class's r-code and recompile.
I have a problem with Lua and I don't know if I'm going in the right direction. In C++ I have a dictionary that I use to pass parameters to a resource manager. This dictionary is essentially a map from hash to string.
In Lua I want to access these resources, so I need a representation of the hashes. The hashes must also be unique, because they are used as indices in a table. Our hash function is 64-bit and I'm working in a 32-bit environment (PS3).
In C++ I have something like this:
paramMap.insert(std::make_pair(hash64("vehicleId"), std::string("004")));
resourceManager.createResource(ResourceType("Car"), paramMap);
In Lua I want to use these resources to create a factory for other userdata.
I do stuff like:
function findBike(carId)
    bikeParam = { vehicleId = carId }
    return ResourceManager.findResource('car', bikeParam)
end
So sometimes the parameters are created by Lua, and sometimes they are created by C++.
Because my hash key ('vehicleId') is used as a table index, it needs to be unique.
I have used lightuserdata to implement uint64_t, but because I'm in a 32-bit environment I can't simply store an int64 in the pointer. :(
I have to keep a table that stores all the int64 values used by the program and save a reference in the userdata.
void pushUInt64(lua_State *L, GEM::GUInt64 v)
{
    Int64Ref::Handle handle = Int64Ref::getInstance().allocateSlot(v);
    lua_pushlightuserdata(L, reinterpret_cast<void*>(handle));
    luaL_setmetatable(L, s_UInt64LuaName);
}
But light userdata are never garbage collected, so my int64 entries are never released and the table will grow forever.
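The obvious alternative seems to be a full userdata that stores the 64-bit value inline, roughly like this sketch (same metatable name as above), which at least would be collectable:

void pushUInt64Full(lua_State *L, GEM::GUInt64 v)
{
    // The value lives inside the userdata block itself, so Lua can collect it normally
    GEM::GUInt64 *p = static_cast<GEM::GUInt64 *>(lua_newuserdata(L, sizeof(GEM::GUInt64)));
    *p = v;
    luaL_setmetatable(L, s_UInt64LuaName); // per-value metatable, unlike lightuserdata
}

But then two userdata wrapping the same value would be distinct table keys, so I'm not sure that's the right direction either.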
Also, lightuserdata don't keep a per-value reference to their metatable, so they interfere with other lightuserdata. Checking the implementation, the metatable is stored in L->G_->mt_[2].
doing that
a = createLightUserDataType1()
b = createLightUserDataType2()
a:someFunction()
will use the metatable of b.
I thought metatables were bound to the type.
I'm pretty confused; with the current implementation, lightuserdata seem to have a really limited use case.
In Python you have a hash metafunction that is called any time the type is used as a dictionary key. Is it possible to do something similar here?
Sorry for my English, I'm from Italy. :-/
I'm building a MEF-based, plugin-centric WPF application and I'm facing an issue with GetExports; maybe it's just my ignorance, but I'm seeing odd behaviour. I have a number of exported parts, all derived from two different interfaces (let's name them A and B), but all marked with the same metadata attribute X. So I have code like:
[Export(typeof(A))]
[TheXAttributeHere...]
public class SomePart1 : A { ... }
for each part, and the same for classes implementing B:
[Export(typeof(B))]
[TheXAttributeHere...]
public class SomePart2 : B { ... }
Now, when I try getting all the parts implementing A and decorated by attribute X with some values, MEF returns not only the A-implementing parts, but ALSO the B-implementing parts. So, when I expect to deal with A objects I get a B, hence a cast exception.
In the real world, the interfaces are named IItemPartEditorViewModel and IItemPartEditorView, while their common attribute is named ItemPartEditorAttribute and exposes a PartType string property on which I do some filtering. My code to get parts is, for example:
var p = (from l in container.GetExports<IItemPartEditorViewModel, IItemPartEditorMetadata>()
         where l.Metadata.PartType == sPartType
         select l).FirstOrDefault();
When looking for an IItemPartEditorViewModel whose PartType is equal to some value, I get the object implementing IItemPartEditorView instead of the one implementing IItemPartEditorViewModel. If I comment out the attribute on the IItemPartEditorView object instead, I correctly get the IItemPartEditorViewModel-implementing object.
Update: the suggested "templated" method was in fact what I used, but I mistyped it here because I forgot to change the less-than and greater-than characters into entities. Anyway, reviewing the code I noticed that in the attribute I had "ViewModel" instead of "View" for the interface type, so this was the problem. Shame on me, sorry for bothering :)!
I think I'd need to see more of the code to know for sure what's going on. However, I'd suggest you call GetExports like this:
// Get exports of type A
container.GetExports<A>();
// Get exports of type B
container.GetExports<B>();
Then do your filtering on the list returned. This will probably fix the cast issues you are having. I'd also be interested in seeing the code for the custom metadata attribute. If it derives from ExportAttribute for example, that might be part of the problem.
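For instance, if the attribute were declared along these lines (a purely hypothetical sketch, not the actual attribute from the question), every class carrying it would get an extra export of IItemPartEditorViewModel regardless of which interface it implements, and views would then show up in a view-model query:

using System;
using System.ComponentModel.Composition;

// Hypothetical: a metadata attribute that also derives from ExportAttribute.
[MetadataAttribute]
[AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
public class ItemPartEditorAttribute : ExportAttribute
{
    public ItemPartEditorAttribute(string partType)
        : base(typeof(IItemPartEditorViewModel)) // exports this contract for every decorated class
    {
        PartType = partType;
    }

    public string PartType { get; private set; }
}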
I'm trying to integrate NHibernate.Validator with ASP.NET MVC client-side validation, and the only problem I've found is that I simply can't convert the non-interpolated message into a human-readable one. I thought this would be an easy task, but it turned out to be the hardest part of the client-side validation. The main problem is that because it's not server-side, I only have the validation attributes that are being used, and I don't actually have an instance or anything else at hand.
Here are some excerpts from what I've been already trying:
// Get the default Message Interpolator from the Engine
IMessageInterpolator interp = _engine.Interpolator;
if (interp == null)
{
    // It is null?? Oh, try to create a new one
    interp = new NHibernate.Validator.Interpolator.DefaultMessageInterpolator();
}

// We need an instance of the object that needs to be validated, so we have to create one
object instance = Activator.CreateInstance(Metadata.ContainerType);

// We enumerate all attributes of the property. For example, we have found a PatternAttribute
var a = attr as PatternAttribute;

// It seems that the default message interpolator doesn't work unless initialized
if (interp is NHibernate.Validator.Interpolator.DefaultMessageInterpolator)
{
    (interp as NHibernate.Validator.Interpolator.DefaultMessageInterpolator).Initialize(a);
}

// But even after it is initialized, the following will throw a NullReferenceException,
// although all of the parameters are specified and they are not null (except for the
// properties of the instance, which are all null, but this can't be changed)
var message = interp.Interpolate(new InterpolationInfo(Metadata.ContainerType, instance, PropertyName, a, interp, a.Message));
I know the above is fairly complex code for a seemingly simple question, but I'm still stuck without a solution. Is there any way to get the interpolated string out of NHValidator?
Ok, so I know this is an old question, but I stumbled across this when trying to do the same thing, and it helped me get started - so I thought I would provide an answer.
I think the code in the question was on the right track but there are a couple of problems. The interpolator was not completely initialised with the ResourceManager and Culture details, and it doesn't seem to allow for the fact that you can only have one DefaultMessageInterpolator per validation attribute. Also, you don't need an instance of the object you are validating to get an interpolated message.
In the code in the question, where you are initialising the interpolator with the attribute value, you also need to initialise the interpolator with details of the ResourceManager to be used.
This can be done using the overloaded Initialize method on DefaultMessageInterpolator which has the following signature:
public void Initialize(ResourceManager messageBundle,
                       ResourceManager defaultMessageBundle,
                       CultureInfo culture)
The first parameter is a user-defined ResourceManager, in case you want to use your own resource file for error messages; you can pass null if you just want to use the default ResourceManager. The second parameter is the default ResourceManager - you can pass
new ResourceManager(
    NHibernate.Validator.Cfg.Environment.BaseNameOfMessageResource,
    Assembly.GetExecutingAssembly());
for this. The last parameter is the culture to use (NHibernate.Validator comes with resource files containing validation messages in several languages); if you pass null it will just use CultureInfo.CurrentCulture.
Lastly, you can only have one DefaultMessageInterpolator per attribute, so you will need to create a new DefaultMessageInterpolator for each validation attribute. You could make use of the DefaultMessageInterpolatorAggregator to handle this, or just roll your own.
I hope this helps someone.
Thanks for your help all - I'd upvote if I could. I just wanted to add that in addition to the first Initialize call on the DefaultMessageInterpolator that Stank illustrates, I also had to make a second, different Initialize call to fully initialize it (I was getting some NullReferenceExceptions using only the first call). My code is as follows:
string interpolatedMessage = "";
DefaultMessageInterpolator interpolator = new DefaultMessageInterpolator();
interpolator.Initialize(null,
    new ResourceManager(
        NHibernate.Validator.Cfg.Environment.BaseNameOfMessageResource,
        Assembly.Load("NHibernate.Validator")),
    CultureInfo.CurrentCulture);
interpolator.Initialize(attribute as Attribute);
if (attribute is IValidator && attribute is IRuleArgs)
{
IValidator validator = attribute as IValidator;
IRuleArgs ruleArgs = attribute as IRuleArgs;
InterpolationInfo interpolationInfo = new InterpolationInfo(
validatableType,
null,
propertyName,
validator,
interpolator,
ruleArgs.Message);
interpolatedMessage = interpolator.Interpolate(interpolationInfo);
}