I am using jOOQ to access the database, and I have run into a problem. There is a polymorphic class OrderEntry:
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.EXISTING_PROPERTY, property = "type", visible = true)
@JsonSubTypes(value = {
    @JsonSubTypes.Type(value = ReissueOrderEntry.class, name = "reissue"),
    @JsonSubTypes.Type(value = RawOrderEntry.class, name = "raw"),
    @JsonSubTypes.Type(value = FreebieOrderEntry.class, name = "freebie"),
    @JsonSubTypes.Type(value = ReplaceOrderEntry.class, name = "replace")
})
public class OrderEntry extends OrderObject {
    String type;
}
Jackson deserializes it into different subclasses according to the field 'type'. But in jOOQ's deserialization it is only ever deserialized as OrderEntry.
How can I solve this?
I'm assuming you're trying to use the built-in ConverterProvider logic that makes use of Jackson, e.g. when writing things like:
record.into(OrderEntry.class);
jOOQ loads Jackson from the classpath without any additional modules / plugins loaded. If you wish to use additional plugins, then you'll have to roll your own ConverterProvider, which implements loading additional plugins.
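As a rough sketch of what such a ConverterProvider could look like, assuming the column value is fetched as org.jooq.JSON and everything else is delegated to jOOQ's DefaultConverterProvider (the class name and registration details below are an assumption against the jOOQ 3.x SPI, not a verified drop-in for your setup):

```java
import java.io.IOException;

import org.jooq.Converter;
import org.jooq.ConverterProvider;
import org.jooq.JSON;
import org.jooq.exception.DataTypeException;
import org.jooq.impl.DefaultConverterProvider;

import com.fasterxml.jackson.databind.ObjectMapper;

// Sketch: a ConverterProvider that routes OrderEntry conversions through a
// Jackson ObjectMapper, which honours the @JsonTypeInfo / @JsonSubTypes
// annotations and thus instantiates the correct subclass.
public class PolymorphicConverterProvider implements ConverterProvider {

    private final ConverterProvider delegate = new DefaultConverterProvider();
    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public <T, U> Converter<T, U> provide(Class<T> tType, Class<U> uType) {
        if (tType == JSON.class && OrderEntry.class.isAssignableFrom(uType)) {
            return Converter.ofNullable(tType, uType,
                t -> {
                    try {
                        // Jackson dispatches on the "type" property here
                        return mapper.readValue(((JSON) t).data(), uType);
                    }
                    catch (IOException e) {
                        throw new DataTypeException("JSON mapping error", e);
                    }
                },
                u -> {
                    try {
                        return tType.cast(JSON.valueOf(mapper.writeValueAsString(u)));
                    }
                    catch (IOException e) {
                        throw new DataTypeException("JSON mapping error", e);
                    }
                });
        }

        // Everything else: fall back to the default conversion logic
        return delegate.provide(tType, uType);
    }
}
```

The provider would then be registered on the Configuration, e.g. `configuration.set(new PolymorphicConverterProvider())`, before calling `record.into(OrderEntry.class)`.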
I've been looking for how to avoid returning a list with the lazyLoader attribute. I want to keep using the lazy loader, but I don't want to return the attribute when I return the whole list of my entity from my controller.
I'm working with .NET Core.
[
{
"lazyLoader": {},
"id": "id1",
"name": "name"
},
{
"lazyLoader": {},
"id": "id2",
"name": "name2"
}
]
You can do a select on your collection, retrieving only the data you need.
That way your objects will not have the navigation property at all.
db.YourCollection.Where(your condition).Select(x => new { id = x.id, name = x.name });
In Entity Framework, if you have an object where one or more of its properties use lazy loading, check its runtime type name using GetType().Name. For an object of a Car class, for example, you will notice that the runtime type is actually something called CarProxy, which is a temporary in-memory type created by Entity Framework using reflection. This "fake" proxy class's base type is Car, and has all the original Car properties, but includes an extra one called LazyLoader for properties that may need it.
If you do further checking on this "fake" CarProxy type, you will also see that Assembly.IsDynamic = true, which is indicative that the class was created dynamically using reflection (see documentation):
var TheCar = DBContext.Cars.Find(1);
Console.WriteLine(TheCar.GetType().Assembly.IsDynamic.ToString()); //will echo "true"
Luckily, Newtonsoft.Json has an overload of the JsonConvert.SerializeObject() method that allows us to provide a base type, so that the resulting JSON doesn't contain properties that don't exist in that type. So, to eliminate the LazyLoader property, just provide the object's BaseType as the type parameter:
var TheCar = DBContext.Cars.Find(1);
var TheJSON = Newtonsoft.Json.JsonConvert.SerializeObject(TheCar, TheCar.GetType().BaseType);
To make sure you don't get any circular reference loops when serializing (a very high probability when using lazy loading), call the serializer with the following setting:
var TheCar = DBContext.Cars.Find(1);
var Settings = new Newtonsoft.Json.JsonSerializerSettings
{
ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore
};
var TheJSON = Newtonsoft.Json.JsonConvert.SerializeObject(TheCar, TheCar.GetType().BaseType, Settings);
Note: This may only work on the first level deep when the serializer travels through the object. If there are yet more lazy-loading child properties of the object you provide to the serializer, the "LazyLoader" property may appear again. I haven't tested it so I can't say for sure.
I know this is old, but add
public bool ShouldSerializeLazyLoader() { return false; }
to all the classes down the tree of the ones you want to serialize, and you will get a LazyLoader-free JSON.
Ref.: https://www.newtonsoft.com/json/help/html/ConditionalProperties.htm
The accepted answer for this question only works for the root object; if we have many nested lazy-loaded objects, it will not work.
Although @Marcello-Barbiani's answer is correct, it is not a good way to add this function to all the entities we have.
The best way is to create a new ContractResolver derived from DefaultContractResolver and skip the property if it is the LazyLoader, as below:
public class NonLazyloaderContractResolver : DefaultContractResolver
{
public new static readonly NonLazyloaderContractResolver Instance = new NonLazyloaderContractResolver();
protected override JsonProperty CreateProperty(MemberInfo member, MemberSerialization memberSerialization)
{
JsonProperty property = base.CreateProperty(member, memberSerialization);
if (property.PropertyName == "LazyLoader")
{
property.ShouldSerialize = i => false;
}
return property;
}
}
After adding the above class, pass it through JsonSerializerSettings when serializing the object:
var json = JsonConvert.SerializeObject(newProduct, new JsonSerializerSettings() {
ContractResolver = new NonLazyloaderContractResolver(),
ReferenceLoopHandling = ReferenceLoopHandling.Ignore,
DefaultValueHandling = DefaultValueHandling.Ignore });
Finally, if you are using ASP.NET Core or ASP.NET Core Web API, register this contract resolver as the default in Startup.cs:
services.AddMvc()
.SetCompatibilityVersion(CompatibilityVersion.Version_2_1)
.AddJsonOptions(options =>
{
options.SerializerSettings.ContractResolver = new NonLazyloaderContractResolver();
options.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
});
I have a list List<Payment> which I'd like to map to another list List<PaymentPlan>. These types look like this:
public class Payment {
    @XmlElement(name = "Installment")
    @JsonProperty("Installment")
    private List<Installment> installments = new ArrayList<>();

    @XmlElement(name = "OriginalAmount")
    @JsonProperty("OriginalAmount")
    private BigDecimal originalAmount;

    //getters, setters, more attributes
}
and....
public class PaymentPlan {
    //(Installment in different package)
    private List<Installment> installments;

    @XmlElement(name = "OriginalAmount")
    @JsonProperty("OriginalAmount")
    private BigDecimal originalAmount;

    //getters, setters, more attributes
}
I expected that something like this would work...
@Mappings({
    @Mapping(//other mappings...),
    @Mapping(source = "payments", target = "paymentInformation.paymentPlans")
})
ResultResponse originalResponseToResultResponse(OrigResponse originalResponse);
...but I get:
Can't map property java.util.List<Payment> to java.util.List<PaymentPlan>.
Consider to declare/implement a mapping method java.util.List<PaymentPlan> map(java.util.List<Payment> value);
I don't know how to apply this information. First I thought I needed to declare some extra mapping (in the same mapper class) for the lists, so MapStruct would know how to map each field of the list's element types, like this:
@Mappings({
    @Mapping(source = "payment.originalAmount", target = "paymentInformation.paymentPlan.originalAmount")
})
List<PaymentPlan> paymentToPaymentPlan(List<Payment> payment);
...but I get error messages like
The type of parameter "payment" has no property named "originalAmount".
Obviously I'm doing something completely wrong, since it sounds like it doesn't even recognize the element types of the lists.
How can I basically map from one List to another similar List? Obviously I somehow need to combine different mapping strategies.
btw: I know how to do it with expression mapping, like...
#Mapping(target = "paymentPlans",expression="java(Helper.mapManually(payments))")
but I guess MapStruct can handle this by itself.
I presume you are using version 1.1.0.Final. Your extra mapping is correct; the only difference is that you need to define a mapping without the lists. MapStruct will then use that method to map the lists (the error message is a bit misleading for collections).
PaymentPlan paymentToPaymentPlan(Payment payment);
You don't even need the @Mappings annotation, as those properties would be mapped automatically. You might also need to define a method for Installment (as the two classes are in different packages).
If you switch to 1.2.0.CR2 then MapStruct can automatically generate the methods for you.
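Put differently, a minimal mapper for the question's classes could look like the sketch below (the package names for the two Installment classes are made up for illustration):

```java
@Mapper
public interface PaymentMapper {

    // Element mapping: same-named properties such as originalAmount
    // and installments are mapped automatically.
    PaymentPlan paymentToPaymentPlan(Payment payment);

    // Collection mapping: the generated implementation iterates the
    // list and calls paymentToPaymentPlan for each element.
    List<PaymentPlan> paymentsToPaymentPlans(List<Payment> payments);

    // Because the two Installment types live in different packages,
    // an element method between them may be needed as well.
    com.example.target.Installment installmentToInstallment(com.example.source.Installment installment);
}
```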
I am trying to map an object A to a list of object B. I have a mapper which maps from object A to object B.
I have tried a number of different ways. For example, I tried to create a list with one object A using 'expression' and 'qualifiedByName', but this does not work because, I think, when you use expression/qualifiedByName you cannot use custom mappers (I could be wrong here?).
I also tried to call the mapper from the @AfterMapping method, using Mappers.getMapper to get a handle on the target mapper, but I found that the Spring beans used in the mapper were not being populated. Mapping in the after-mapping makes sense in that I can call the target mapper with the source and then add the target to the list. So I am hoping that there is another way to get a handle on the mapper component from my mapper.
All my mappers use @Mapper(componentModel = "spring", ...).
All suggestions are welcome. Below is a code sample showing the problem.
Regards,
Declan
// SPECIESTOLOGSPECY.JAVA
// From this mapper I want to call the SpecyToLogDeclarationMapperApi mapper
// to map 'species' to 'logdeclarations', which is a list of Logdeclaration.
// You can see what I am trying to do in the @AfterMapping method,
// where I map species to a Logdeclaration and then put it into a list.
// Unfortunately I need other mapping components (ConfigMapperFromCode and SpecyToFishDeclarationMapperApi)
// to be autowired into SpecyToLogDeclarationMapperApi, and this is not happening.
// Is there another way to get a handle to the Spring-managed
// SpecyToLogDeclarationMapperApi mapper?
@Mapper(componentModel = "spring",
    uses = {
        ConfigMapperFromCode.class,
        GeoInfoMapper.class,
        SpecyToLogDeclarationMapperApi.class
    })
public interface SpeciesToLogSpecy {

    SpecyToLogDeclarationMapperApi MAPPER = Mappers.getMapper(SpecyToLogDeclarationMapperApi.class);

    @Mappings({
        @Mapping(target = "createdate", expression = "java(java.sql.Timestamp.valueOf(java.time.LocalDateTime.now()))"),
        @Mapping(target = "speciesid", qualifiedByName = {"ConfigMapperFromCode", "speciesIdFromCodeAsDecimal"}, source = "species.speciesCode"),
        @Mapping(target = "unitweight", constant = "1"),
        @Mapping(target = "inactiveind", constant = "N"),
        @Mapping(target = "unitdefaultind", constant = "Y"),
        @Mapping(target = "sectiontypeid", expression = "java(new BigDecimal(ie.gov.agriculture.fisheries.logsheet.mapping.constants.MappingConstants.LOG_SPECIES_SECTION_TYPE_SHEETDECLARATION))"),
        @Mapping(target = "unituomid", expression = "java(new BigDecimal(ie.gov.agriculture.fisheries.logsheet.mapping.constants.MappingConstants.LOGSHEET_CATCHUNITS_KG))"),
        @Mapping(target = "catchtypeid", qualifiedByName = {"ConfigMapperFromCode", "returnCatchTypeId"}, source = "species.spType"),
        @Mapping(target = "legalfishsizetypeid", qualifiedByName = {"ConfigMapperFromCode", "legalFishSizeTypeIdFromCode"}, source = "species.fishSizeClass"),
        @Mapping(target = "presenttypeid", qualifiedByName = {"ConfigMapperFromCode", "presentationTypeIdFromCode"}, source = "species.presentation.presentationType")
        //@Mapping(target = "logdeclarations", source = "species")
    })
    Logspecy speciesToLogspecy(Species species, @Context ExtraFields extraFields);

    @AfterMapping
    default void afterMap(@MappingTarget Logspecy logspecy, Species species, @Context ExtraFields extraFields) {
        Logdeclaration logDeclaration = MAPPER.SpeciesToLogDeclarations(species, extraFields);
        List<Logdeclaration> logdeclarations = new ArrayList<Logdeclaration>();
        logdeclarations.add(logDeclaration);
        logspecy.setLogdeclarations(logdeclarations);
    }
}
// SPECYTOLOGDECLARATIONMAPPERAPI.JAVA
@Mapper(componentModel = "spring",
    uses = {
        ConfigMapperFromCode.class,
        SpecyToFishDeclarationMapperApi.class
    })
public interface SpecyToLogDeclarationMapperApi {

    @Mappings({
        @Mapping(target = "createdate", expression = "java(java.sql.Timestamp.valueOf(java.time.LocalDateTime.now()))"),
        @Mapping(target = "geartypeid", qualifiedByName = {"ConfigMapperFromCode", "gearIdFromCode"}, source = "species.gearType"),
        @Mapping(target = "fishcount", source = "species.qty"),
        @Mapping(target = "inactiveind", constant = "N"),
        @Mapping(target = "packagetypeid", qualifiedByName = {"ConfigMapperFromCode", "packagingTypeIdFromCode"}, source = "species.presentation.packaging"),
        @Mapping(target = "packagecount", source = "species.presentation.pkgunit"),
        @Mapping(target = "avgpackageweight", source = "species.presentation.pkgUnitWeight"),
        @Mapping(target = "conversionfactor", source = "species.presentation.convFactor"),
        @Mapping(target = "fishdeclaration", source = "species.geoInfo")
    })
    Logdeclaration SpeciesToLogDeclarations(Species species, @Context ExtraFields extraFields);
}
The problem is that you are trying to map Species to List<Logdeclaration>, and MapStruct cannot find such a mapping method. In order to make it work, you can add the following method to your SpecyToLogDeclarationMapperApi:
default List<Logdeclaration> SpeciesToLogDeclarationsToList(Species species, @Context ExtraFields extraFields) {
if ( species == null ) {
return null;
}
Logdeclaration logDeclaration = SpeciesToLogDeclarations(species, extraFields);
List<Logdeclaration> logdeclarations = new ArrayList<Logdeclaration>();
logdeclarations.add(logDeclaration);
return logdeclarations;
}
These are some additional things that I think you can do to improve your code and use MapStruct "more correctly":
You will need to remove the usage of Mappers#getMapper(Class) when the component model is not the default
If you don't want to use FQN in your expressions, you can use Mapper#imports and MapStruct will import them in the implementation.
When you have a single source parameter, you don't have to use its name in the mapping,
e.g.
#Mapping(target="fishcount", source = "species.qty")
can be
#Mapping(target="fishcount", source = "qty")
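For example, the Mapper#imports suggestion could look like the following sketch, reusing the MappingConstants class from the question (unrelated attributes omitted):

```java
@Mapper(componentModel = "spring",
        imports = { ie.gov.agriculture.fisheries.logsheet.mapping.constants.MappingConstants.class, java.math.BigDecimal.class })
public interface SpeciesToLogSpecy {

    // The expression can now use the simple class names, because the
    // generated implementation imports them.
    @Mapping(target = "sectiontypeid",
             expression = "java(new BigDecimal(MappingConstants.LOG_SPECIES_SECTION_TYPE_SHEETDECLARATION))")
    Logspecy speciesToLogspecy(Species species, @Context ExtraFields extraFields);
}
```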
We are using Kryo to communicate between a Scala application and a Java application. Since the class definitions have to be used from Java (and we don't want to include the Scala library as a dependency in the Java application), we are using JavaBeans to define the transfer objects.
However, using JavaBeans directly in Scala is a bit of a hassle. No pattern matching, having to use new, etc. What we're doing right now is defining extractors and apply methods in separate objects on the Scala side to make it nicer to work with these classes.
Since most of what we need is boilerplate, we are wondering if there would be a way of doing this automatically. For example, we have this JavaBean (there are about 20+ different message types):
public class HandshakeRequest extends Request {
private String gatewayId;
private String requestId;
private Date timestamp = new Date();
public HandshakeRequest(String gatewayId, String requestId) {
this.gatewayId = gatewayId;
this.requestId = requestId;
}
public String getGatewayId() { return gatewayId; }
public String getRequestId() { return requestId; }
public Date getTimestamp() { return timestamp; }
private HandshakeRequest() { /* For Kryo */ }
}
This is an example of the object we use to bridge to Scala:
object Handshake {
def unapply(msg: HandshakeRequest): Option[ (DateTime, String, String) ] = {
(
new DateTime(msg.getTimestamp.getTime),
msg.getRequestId,
msg.getGatewayId
).some
}
def apply(gatewayId: String, requestId: String) = new HandshakeRequest(gatewayId, requestId)
}
Since all of our objects have the timestamp, it is also part of the boilerplate. We'd like some way (perhaps a macro?) to automatically generate the unapply and apply methods (and ideally, the whole object itself).
Does anyone know of an easy way to accomplish this?
Since there were no answers, I came up with this: https://github.com/yetu/scala-beanutils.
I've started a project, https://github.com/limansky/beanpuree, which allows conversions from beans to case classes. I'm going to add some more features, like automatic type conversion between Java number classes and Scala classes.
I would like to have an embedded document referred to by a map (as in 'class A' below). The environment is Grails + GORM + MongoDB.
Is that possible, and if so, how?
class A { // fails with IllegalArgumentException occurred when processing request: can't serialize class X in line 234 of org.bson.BasicBSONEncoder
static mapWith = "mongo"
Map<String, X> map = new HashMap<String, X>()
}
class B { // works
static mapWith = "mongo"
List<X> list = new ArrayList<X>()
}
class C { // works with primitive type values
static mapWith = "mongo"
Map<String, String> map = new HashMap<String, String>()
}
class X {
String data
public X(String data) {
this.data = data
}
}
The embedding works perfectly, as Art Hanzel advised.
However, your problem comes from the fact that you try to use the map's generic types as a sort of constraint:
Map<String, X>
The problem is that Grails doesn't cope well with this syntax, first because Groovy doesn't enforce generics.
However, the MongoDB plugin offers a very powerful feature that lets you define custom types as domain class properties: see here.
In your case you could have
class A {
static mapWith = "mongo"
MyClass map = new MyClass()
}
Then, in your src/java folder, you could for example implement:
class MyClass extends HashMap<String, X> { }
Then, of course, you have to define a special AbstractMappingAwareCustomTypeMarshaller to specify how to read and write the property in the DB.
An additional step could also be to add a custom validator to class A to check the validity of data...
The MongoDB Grails plugin documentation describes how to make embedded documents:
class Foo {
Address address
List otherAddresses
static embedded = ['address', 'otherAddresses']
}
Off the top of my head, you should be able to access these via the object graph. I don't see any reason why you shouldn't.
myFoo.address.myAddressProperty...