Morphia converter calling other converters - mongodb

I want to convert Optional<BigDecimal> in Morphia. I created a BigDecimalConverter, and it works fine. Now I want to create an OptionalConverter.
Optional can hold any object type. In my OptionalConverter.encode method I can extract the underlying object, and I'd like to pass it to the default Mongo conversion, so that if there is a String I just get the String, and if there is one of my entities I get the encoded entity. How can I do this?

There are two questions:
1. How to call other converters?
2. How to create a converter for a generic class whose type parameters are not statically known?
The first one is possible by creating the MappingMongoConverter and the custom converter together:
@Configuration
public class CustomConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        // ...
    }

    @Override
    @Bean
    public Mongo mongo() throws Exception {
        // ...
    }

    @Override
    @Bean
    public MappingMongoConverter mappingMongoConverter() throws Exception {
        MappingMongoConverter mmc = new MappingMongoConverter(
                mongoDbFactory(), mongoMappingContext());
        mmc.setCustomConversions(new CustomConversions(CustomConverters.create(mmc)));
        return mmc;
    }
}
public class FooConverter implements Converter<Foo, DBObject> {

    private MappingMongoConverter mmc;

    public FooConverter(MappingMongoConverter mmc) {
        this.mmc = mmc;
    }

    public DBObject convert(Foo foo) {
        // ...
    }
}
public class CustomConverters {

    public static List<Object> create(MappingMongoConverter mmc) {
        List<Object> list = new ArrayList<>();
        list.add(new FooConverter(mmc));
        return list;
    }
}
The second one is much more difficult due to type erasure. I've tried to create a converter for Scala's Map but haven't found a way: I could not get the exact type information for the source Map when writing, or for the target Map when reading.
For very simple cases, e.g. if you don't need to handle all possible parameter types and there is no ambiguity while reading, it may be possible though; see the sketch below.
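As a sketch of such a simple case, the write side of the original Optional question can reuse the same delegation pattern: inject the MappingMongoConverter and hand the unwrapped value to it. This is only an illustration, not code from the answer above; it assumes the convertToMongoType method of Spring Data MongoDB's converter API and deliberately leaves the (ambiguous) read side open.

import java.util.Optional;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;

// Sketch only: unwraps the Optional and delegates the contained value to the
// default conversion, so a String stays a String and an entity becomes its
// encoded form. Reading back is where type erasure bites, so no reading
// converter is shown here.
public class OptionalConverter implements Converter<Optional<?>, Object> {

    private final MappingMongoConverter mmc;

    public OptionalConverter(MappingMongoConverter mmc) {
        this.mmc = mmc;
    }

    @Override
    public Object convert(Optional<?> source) {
        return source.map(mmc::convertToMongoType).orElse(null);
    }
}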

Related

Mapping a field using existing target value (MapStruct)

I have a custom case: some of my DTOs have a field of type X, and I need to map this class to Y by using a Spring service method call (I do a transactional DB operation and return an instance of Y). But in this scenario I need to use the existing value of the Y field. Let me explain it by example.
// DTO
public class AnnualLeaveRequest {
    private FileInfoDTO annualLeaveFile;
}

// ENTITY
public class AnnualLeave {
    @OneToOne
    private FileReference annualLeaveFile;
}
@Mapper
public abstract class FileMapper {
    @Autowired
    private FileReferenceService fileReferenceService;

    public FileReference toFileReference(@MappingTarget FileReference fileReference, FileInfoDTO fileInfoDTO) {
        return fileReferenceService.updateFile(fileInfoDTO, fileReference);
    }
}

// ACTUAL ENTITY MAPPER
@Mapper(uses = {FileMapper.class})
public interface AnnualLeaveMapper {
    void updateEntity(@MappingTarget AnnualLeave entity, AnnualLeaveRequest dto);
}
// WHAT I'M TRYING TO ACHIEVE
@Component
public class MazeretIzinMapperImpl implements tr.gov.hmb.ikys.personel.izinbilgisi.mazeretizin.mapper.MazeretIzinMapper {
    @Autowired
    private FileMapper fileMapper;

    @Override
    public void updateEntity(AnnualLeave entity, AnnualLeaveUpdateRequest dto) {
        entity.setAnnualLeaveFile(fileMapper.toFileReference(dto.getAnnualLeaveFile(), entity.getAnnualLeaveFile()));
    }
}
But MapStruct ignores the result of "FileReference toFileReference(@MappingTarget FileReference fileReference, FileInfoDTO fileInfoDTO)" and does not map its result to the actual entity's FileReference field. Do you have any idea for resolving this problem?
Question
How do I replace the annualLeaveFile property while updating the AnnualLeave entity?
Answer
You can use an expression to get this result. For example:
@Autowired
FileMapper fileMapper;

@Mapping( target = "annualLeaveFile", expression = "java(fileMapper.toFileReference(entity.getAnnualLeaveFile(), dto.getAnnualLeaveFile()))" )
abstract void updateEntity(@MappingTarget AnnualLeave entity, AnnualLeaveRequest dto);
MapStruct does not support this without expression usage. See the end of the Old analysis for why.
Alternative without expression
Instead of fixing it in the location where FileMapper is used, we fix it inside the FileMapper itself.
@Mapper
public abstract class FileMapper {
    @Autowired
    private FileReferenceService fileReferenceService;

    public void toFileReference(@MappingTarget FileReference fileReference, FileInfoDTO fileInfoDTO) {
        FileReference wanted = fileReferenceService.updateFile(fileInfoDTO, fileReference);
        updateFileReference(fileReference, wanted);
    }

    // used to copy the content of the service one to the MapStruct one.
    abstract void updateFileReference(@MappingTarget FileReference fileReferenceTarget, FileReference fileReferenceFromService);
}
Old analysis
The following is what I notice:
(Optional) Your FileMapper class is not really a MapStruct mapper. It can just be a normal class annotated with @Component, since it does not have any unimplemented abstract methods. (This does not affect code generation of the MazeretIzinMapper implementation.)
(Optional, since you may have this configured project-wide) You do not have componentModel="spring" in your @Mapper definition; maybe you have this configured project-wide, but that is not mentioned. (It is required for the @Autowired annotation and for @Component on the implementations.)
Without changing anything I already get a working result as you want it to be, but for non-update methods (not listed in your question, but visible on the Gitter page where you also requested help) the FileMapper as-is will not be used. It requires an additional method that takes only one argument: public FileReference toFileReference(FileInfoDTO fileInfoDTO)
(Edit) To get rid of the else statement with null-value handling you can add nullValuePropertyMappingStrategy = NullValuePropertyMappingStrategy.IGNORE to the @Mapper annotation.
I've run a test with both 1.5.0.Beta2 and 1.4.2.Final and get the following result with the FileMapper and MazeretIzinMapper classes listed after it.
Generated mapper implementation
@Generated(
    value = "org.mapstruct.ap.MappingProcessor",
    date = "2022-03-11T18:01:30+0100",
    comments = "version: 1.4.2.Final, compiler: Eclipse JDT (IDE) 1.4.50.v20210914-1429, environment: Java 17.0.1 (Azul Systems, Inc.)"
)
@Component
public class MazeretIzinMapperImpl implements MazeretIzinMapper {

    @Autowired
    private FileMapper fileMapper;

    @Override
    public AnnualLeave toEntity(AnnualLeaveRequest dto) {
        if ( dto == null ) {
            return null;
        }
        AnnualLeave annualLeave = new AnnualLeave();
        annualLeave.setAnnualLeaveFile( fileMapper.toFileReference( dto.getAnnualLeaveFile() ) );
        return annualLeave;
    }

    @Override
    public void updateEntity(AnnualLeave entity, AnnualLeaveRequest dto) {
        if ( dto == null ) {
            return;
        }
        if ( dto.getAnnualLeaveFile() != null ) {
            if ( entity.getAnnualLeaveFile() == null ) {
                entity.setAnnualLeaveFile( new FileReference() );
            }
            fileMapper.toFileReference( entity.getAnnualLeaveFile(), dto.getAnnualLeaveFile() );
        }
    }
}
Source classes
Mapper
@Mapper( componentModel = "spring", uses = { FileMapper.class }, nullValuePropertyMappingStrategy = NullValuePropertyMappingStrategy.IGNORE )
public interface MazeretIzinMapper {
    AnnualLeave toEntity(AnnualLeaveRequest dto);
    void updateEntity(@MappingTarget AnnualLeave entity, AnnualLeaveRequest dto);
}
FileMapper component
@Mapper
public abstract class FileMapper {
    @Autowired
    private FileReferenceService fileReferenceService;

    public FileReference toFileReference(@MappingTarget FileReference fileReference, FileInfoDTO fileInfoDTO) {
        return fileReferenceService.updateFile( fileInfoDTO, fileReference );
    }

    public FileReference toFileReference(FileInfoDTO fileInfoDTO) {
        return toFileReference( new FileReference(), fileInfoDTO );
    }

    // other abstract methods for MapStruct mapper generation.
}
Why the exact wanted code will not be generated
When generating the mapping code MapStruct will use the most generic way to do this.
An update mapper has the following criteria:
The @MappingTarget annotated argument will always be updated.
It is allowed to have no return type.
The generic way to update a field is then as follows:
// Check if the source has the value.
if (source.getProperty() != null) {
    // Since it is allowed to have a void method for update mappings, the following steps are needed:
    // check if the property exists in the target.
    if (target.getProperty() == null) {
        // If it does not have the value, then create it.
        target.setProperty( new TypeOfProperty() );
    }
    // Now we know that the target has the property, so we can call the update method.
    propertyUpdateMappingMethod( target.getProperty(), source.getProperty() );
    // The arguments match the order specified in the other update method; in this case the
    // @MappingTarget annotated argument is the first one.
} else {
    // Default behavior is to set the target property to null; you can influence this with
    // nullValuePropertyMappingStrategy.
    target.setProperty( null );
}

Overriding array(list) type conversions in Spring Data R2DBC

I'm using Postgres as my datasource and I've created a custom Spring converter for a property that holds a list of my custom objects:
@Slf4j
@WritingConverter
@AllArgsConstructor
public class CustomObjectListToStringConverter implements Converter<List<CustomObject>, String> {

    private final ObjectMapper objectMapper;

    @Override
    public String convert(@Nonnull List<CustomObject> source) {
        try {
            return objectMapper.writeValueAsString(source);
        } catch (JsonProcessingException e) {
            log.error("Error occurred while serializing list of CustomObject to JSON.", e);
        }
        return "[]";
    }
}
Conversion goes smoothly, but an IllegalArgumentException is raised in the getArrayType method of the PostgresArrayColumns class because my custom type is not a simple type.
Is there a way to circumvent this guard for some property?
Currently, there is no override possible because DatabaseClient considers collection-typed values as values for Postgres' array fields. Please file a ticket at https://github.com/spring-projects/spring-data-r2dbc/ to fix the issue.
It is intentionally not supported, as stated in the documentation:
Please note that converters get applied on singular properties. Collection properties (e.g. Collection) are iterated and converted element-wise. Collection converters (e.g. Converter<List>, OutboundRow) are not supported.
Source: spring-data-r2dbc mapping reference
Solution:
Create a wrapper class (complex type) as follows:
class CustomObjectList {
    List<CustomObject> customObjects;
}
Then, you apply a converter Converter<CustomObjectList, String> and vice versa.
public class CustomObjectListToStringConverter implements Converter<CustomObjectList, String> {
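Spelled out, that converter pair might look like the sketch below. It assumes Jackson is used for the JSON handling (as in the question) and that the CustomObjectList wrapper exposes its list through a constructor and getter; the reading-side class name is illustrative, not from the original answer.

import java.io.IOException;
import java.util.Collections;
import java.util.List;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;

// Sketch only: write the wrapped list as a JSON string...
@WritingConverter
class CustomObjectListToStringConverter implements Converter<CustomObjectList, String> {

    private final ObjectMapper objectMapper;

    CustomObjectListToStringConverter(ObjectMapper objectMapper) {
        this.objectMapper = objectMapper;
    }

    @Override
    public String convert(CustomObjectList source) {
        try {
            // Serialize the wrapped list, not the wrapper, so the column keeps a plain JSON array.
            return objectMapper.writeValueAsString(source.getCustomObjects());
        } catch (JsonProcessingException e) {
            return "[]";
        }
    }
}

// ...and read it back into the wrapper.
@ReadingConverter
class StringToCustomObjectListConverter implements Converter<String, CustomObjectList> {

    private final ObjectMapper objectMapper;

    StringToCustomObjectListConverter(ObjectMapper objectMapper) {
        this.objectMapper = objectMapper;
    }

    @Override
    public CustomObjectList convert(String source) {
        try {
            List<CustomObject> values = objectMapper.readValue(source, new TypeReference<List<CustomObject>>() {});
            return new CustomObjectList(values);
        } catch (IOException e) {
            return new CustomObjectList(Collections.emptyList());
        }
    }
}

Both converters would then be registered the same way as the original single converter, e.g. through R2dbcCustomConversions.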

Conditional naming of POJO classes

Our problem is that our service GET /services/v1/myobject returns an object named Xyz. This representation is used by multiple existing clients.
The new service GET /services/v2/myobject needs to expose the exact same object but with a different name, say XyzLmn.
One obvious solution would be to create two classes, Xyz and XyzLmn, copy Xyz into XyzLmn, and expose XyzLmn in v2.
What I am looking for is: how can I keep the same Java POJO class Xyz and conditionally serialize it as either XyzLmn or Xyz?
Have you tried adding @JsonIgnoreProperties(ignoreUnknown = true) to your domain object?
One solution is:
Write a custom serializer that emits XyzLmn
Register the custom serializer conditionally (a registration sketch follows the example below)
public class XyzWrapperSerializer extends StdSerializer<XyzWrapper> {

    public XyzWrapperSerializer() {
        this(null);
    }

    public XyzWrapperSerializer(Class<XyzWrapper> t) {
        super(t);
    }

    @Override
    public void serialize(
            XyzWrapper value, JsonGenerator jgen, SerializerProvider provider)
            throws IOException, JsonProcessingException {
        jgen.writeStartObject();
        jgen.writeNumberField("XyzLmn", value.id);
        jgen.writeEndObject();
    }
}
XyzWrapper myItem = new XyzWrapper(1, "theItem", new User(2, "theUser"));
ObjectMapper mapper = new ObjectMapper();
SimpleModule module = new SimpleModule();
module.addSerializer(XyzWrapper.class, new XyzWrapperSerializer());
mapper.registerModule(module);
String serialized = mapper.writeValueAsString(myItem);
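To make that registration conditional per API version, one option (a sketch, not part of the original answer; the XyzMappers holder class is illustrative) is to keep two pre-configured ObjectMapper instances and let the endpoint pick one: the default mapper for v1, and a mapper with the module registered for v2.

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;

// Sketch only: which mapper serializes the response is decided by the endpoint (v1 vs v2).
public final class XyzMappers {

    // v1: default serialization, keeps the existing Xyz representation.
    public static final ObjectMapper V1 = new ObjectMapper();

    // v2: same POJO, but the custom serializer renames the output to XyzLmn.
    public static final ObjectMapper V2 = new ObjectMapper()
            .registerModule(new SimpleModule()
                    .addSerializer(XyzWrapper.class, new XyzWrapperSerializer()));

    private XyzMappers() {
    }
}

A v2 endpoint would then call XyzMappers.V2.writeValueAsString(myItem), while v1 code keeps using the default mapper.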

Problems when using EntityFilteringFeature and SelectableEntityFilteringFeature with Jersey 2

I'm new to Jersey 2 and JAX-RS, so I'm probably missing something.
What I'm trying to do is a test program to define a coding style for developing REST services.
The test is written in Java and uses Jersey 2.22.2, JDK 1.8.31, and MOXy as the JSON provider.
I defined a resource with GET methods to support LIST/DETAIL. Due to the size of my POJO, I used some filters and everything was fine.
// 1) First of all I defined the annotation.
@Target({ElementType.TYPE, ElementType.METHOD, ElementType.FIELD})
@Retention(RetentionPolicy.RUNTIME)
@Documented
@EntityFiltering
public @interface MyDetailView {

    public static class Factory extends AnnotationLiteral<MyDetailView>
            implements MyDetailView {

        private Factory() {
        }

        public static MyDetailView get() {
            return new Factory();
        }
    }
}
// 2) Once the annotation was defined, I used it to
// programmatically exclude the list of subItems from the response...
@XmlRootElement
public class MyPojo {
    ...

    // *** THIS SHOULD BE FILTERED OUT IF THE ANNOTATION IS NOT SPECIFIED IN THE RESPONSE ***
    @MyDetailView
    private List<SubItem> subItems = new ArrayList<SubItem>();

    public List<SubItem> getSubItems() {
        return subItems;
    }

    public void setSubItems(List<SubItem> subItems) {
        this.subItems = subItems;
    }
}
// 3) I registered the EntityFilteringFeature
public class ApplicationConfig extends ResourceConfig {

    public ApplicationConfig() {
        // ...
        register(EntityFilteringFeature.class);
    }
}
// 4) Finally, I wrote the code to include/exclude the subItems
/*
   The resource class has getCollection() and getItem() methods...
   getCollection() adds the annotation only if filterStyle="detail"
   getItem() always adds the annotation
*/
@Path(....)
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.APPLICATION_JSON)
public class MyResource extends SecuredResource {

    // filterStyle -> "detail" means MyDetailAnnotation
    @GET
    public Response getCollection(
            @QueryParam("filterStyle") String filterStyle,
            @Context UriInfo uriInfo) {

        // THIS CODE AFFECTS THE RESPONSE
        boolean detailedResponse = "detail".equals(filterStyle);
        Annotation[] responseAnnotations = detailedResponse
                ? new Annotation[0]
                : new Annotation[]{MyDetailView.Factory.get()};

        // pojo collection...
        MyPagedCollection myCollection = new MyPagedCollection();
        // .....
        ResponseBuilder builder = Response.ok();
        return builder.entity(myCollection, responseAnnotations).build();
    }

    @GET
    @Path("/{id}")
    public Response getItem(@PathParam("id") String idS, @Context UriInfo uriInfo) {
        MyPOJO pojo = ...
        Annotation[] responseAnnotations = new Annotation[]{MyDetailView.Factory.get()};
        return Response.ok().entity(pojo, responseAnnotations).build();
    }
}
After the first test, I tried to use the SelectableEntityFilteringFeature to allow the client to ask for specific fields in the detail, so I changed the ApplicationConfig
public class ApplicationConfig extends ResourceConfig {

    public ApplicationConfig() {
        // ...
        register(EntityFilteringFeature.class);
        register(SelectableEntityFilteringFeature.class);
        property(SelectableEntityFilteringFeature.QUERY_PARAM_NAME, "fields");
    }
}
and I've added the "fields" QueryParam to the resource's getItem() method...
@GET
@Path("/{id}")
public Response getDetail(@PathParam("id") String id,
        @QueryParam("fields") String fields,
        @Context UriInfo uriInfo) {
    ....
But as soon as I registered the SelectableEntityFilteringFeature class, the EntityFilteringFeature stopped working. I tried adding the "fields" parameter to one of the resource methods and it worked perfectly, but the MyDetailAnnotation was completely useless.
I tried to register it using a DynamicFeature
public class MyDynamicFeature implements DynamicFeature {

    @Override
    public void configure(ResourceInfo resourceInfo, FeatureContext context) {
        if ("MyResource".equals(resourceInfo.getResourceClass().getSimpleName())
                && "getItem".equals(resourceInfo.getResourceMethod().getName())) {
            // *** IS THIS THE CORRECT WAY TO BIND A FEATURE TO A METHOD? ***
            context.register(SelectableEntityFilteringFeature.class);
            context.property(SelectableEntityFilteringFeature.QUERY_PARAM_NAME, "fields");
        }
    }
}
Now the questions:
1) Why does registering the SelectableEntityFilteringFeature as well break the EntityFilteringFeature?
2) What is the correct way to bind a feature to a method with the DynamicFeature interface?
Thanks in advance.
This is my first post to Stack Overflow; I hope it complies with the rules.
Short answer: you can't. It appears to be a bug as of 2.25.1 and up to 2.26 (which I tested with): https://github.com/jersey/jersey/issues/3523
SelectableEntityFilteringFeature implicitly registers EntityFilteringFeature (as mentioned here), so there is no need to add that one as well.
Since you need annotation-based filtering, you can skip registering SelectableEntityFilteringFeature.
You can just do,
// Set entity-filtering scope via configuration.
.property(EntityFilteringFeature.ENTITY_FILTERING_SCOPE, new Annotation[] {MyDetailView.Factory.get()})
// Register the EntityFilteringFeature.
.register(EntityFilteringFeature.class)
// Further configuration of ResourceConfig.
You can refer to this example for usage and this example for registering the filter.
So you can remove SelectableEntityFilteringFeature and try just the above-mentioned way of registering it.
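Applied to the question's ApplicationConfig, a minimal sketch of that registration (reusing the MyDetailView annotation from the question; this illustrates the configuration above rather than quoting the answer) could look like this:

import java.lang.annotation.Annotation;

import org.glassfish.jersey.message.filtering.EntityFilteringFeature;
import org.glassfish.jersey.server.ResourceConfig;

// Sketch only: annotation-based entity filtering with a global filtering scope,
// and without SelectableEntityFilteringFeature.
public class ApplicationConfig extends ResourceConfig {

    public ApplicationConfig() {
        // Set the entity-filtering scope via configuration.
        property(EntityFilteringFeature.ENTITY_FILTERING_SCOPE,
                new Annotation[] { MyDetailView.Factory.get() });
        // Register only the EntityFilteringFeature.
        register(EntityFilteringFeature.class);
    }
}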

How to read the new XStreamConverter parameters?

Since version 1.4.2 of XStream, the XStreamConverter annotation takes additional parameters (a very good feature and just what I need).
@XStreamConverter(value = CustomXStreamConverter.class, strings = {"xyz"})
private List<String> phones;
But how can I read this value ("xyz") in my custom converter?
public class CustomXStreamConverter implements Converter {
//?
}
I figured out the solution: just override the class constructor in order to receive the parameter.
public class CustomXStreamConverter implements Converter {

    private String alias;

    public CustomXStreamConverter(String alias) {
        super();
        this.alias = alias; // "xyz"
    }

    // ...
}
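For completeness, here is a sketch of how the rest of such a converter might use the injected parameter. The method signatures follow XStream's Converter interface; the marshalling logic itself (one child node per list element, named after the alias) is purely illustrative and not part of the original answer.

import java.util.ArrayList;
import java.util.List;

import com.thoughtworks.xstream.converters.Converter;
import com.thoughtworks.xstream.converters.MarshallingContext;
import com.thoughtworks.xstream.converters.UnmarshallingContext;
import com.thoughtworks.xstream.io.HierarchicalStreamReader;
import com.thoughtworks.xstream.io.HierarchicalStreamWriter;

public class CustomXStreamConverter implements Converter {

    private final String alias;

    // XStream passes the values of the annotation's "strings" attribute to the constructor.
    public CustomXStreamConverter(String alias) {
        this.alias = alias; // "xyz"
    }

    @Override
    public boolean canConvert(Class type) {
        return List.class.isAssignableFrom(type);
    }

    @Override
    public void marshal(Object source, HierarchicalStreamWriter writer, MarshallingContext context) {
        // Illustrative: write each list element under a node named after the alias.
        for (Object value : (List<?>) source) {
            writer.startNode(alias);
            writer.setValue(String.valueOf(value));
            writer.endNode();
        }
    }

    @Override
    public Object unmarshal(HierarchicalStreamReader reader, UnmarshallingContext context) {
        // Illustrative: read the nodes back into a list of strings.
        List<String> values = new ArrayList<>();
        while (reader.hasMoreChildren()) {
            reader.moveDown();
            values.add(reader.getValue());
            reader.moveUp();
        }
        return values;
    }
}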