I have two MapperConfig interfaces:
@MapperConfig(
    uses = {
        StringTypeMapper.class,
        ExtensionMapper.class
    }
)
public interface ElementMapperConfig extends GenericMapperConfig {

    @Mapping(target = "id", source = "idElement")
    @Mapping(target = "extension", source = "extension")
    Element mapElement(org.hl7.fhir.r4.model.Element fhir);
}
And GenericMapperConfig:
@MapperConfig(
    componentModel = "spring",
    injectionStrategy = InjectionStrategy.CONSTRUCTOR,
    nullValueCheckStrategy = NullValueCheckStrategy.ALWAYS,
    nullValueMappingStrategy = NullValueMappingStrategy.RETURN_NULL
)
public interface GenericMapperConfig {
}
As you can see, I'm using the Spring component model. Nevertheless, the generated mapper implementation obtains the mappers it needs via Mappers.getMapper(...).
ElementMapperConfig extends GenericMapperConfig, but the configuration from GenericMapperConfig seems to be ignored.
Generated Mapper example:
@Generated(
    value = "org.mapstruct.ap.MappingProcessor"
)
public class StringTypeMapperImpl implements StringTypeMapper {

    private final ExtensionMapper extensionMapper = Mappers.getMapper( ExtensionMapper.class );
}
StringTypeMapper is:
@Mapper(
    config = ElementMapperConfig.class
)
public interface StringTypeMapper {

    @InheritConfiguration(name = "mapElement")
    StringType fhirToMpi(org.hl7.fhir.r4.model.StringType stringType);
}
I can't figure out why the configuration from GenericMapperConfig is not picked up; in other words, why componentModel = "spring" is ignored in the generated mapper implementation.
The documentation does not mention this way of composing multiple mapper configurations. It also does not mention the following alternative, but it does work: extend the mappers instead of the configurations.
Introduce a base mapper with generic configuration:
@Mapper(
    config = GenericMapperConfig.class
)
public interface BaseMapper {
}
Base your concrete mapper on the base one and configure it using the specific configuration:
@Mapper(
    config = ElementMapperConfig.class
)
public interface StringTypeMapper extends BaseMapper {

    @InheritConfiguration(name = "mapElement")
    StringType fhirToMpi(org.hl7.fhir.r4.model.StringType stringType);
}
Finally, make ElementMapperConfig no longer extend GenericMapperConfig:
@MapperConfig(
    uses = {
        StringTypeMapper.class,
        ExtensionMapper.class
    }
)
public interface ElementMapperConfig {

    @Mapping(target = "id", source = "idElement")
    @Mapping(target = "extension", source = "extension")
    Element mapElement(org.hl7.fhir.r4.model.Element fhir);
}
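With the mappers composed this way, the generated implementation should pick up the Spring component model. As a rough sketch (the exact output depends on your MapStruct version), the generated class then looks something like this instead of calling Mappers.getMapper(...):

@Generated(
    value = "org.mapstruct.ap.MappingProcessor"
)
@Component
public class StringTypeMapperImpl implements StringTypeMapper {

    // The collaborating mapper is now constructor-injected by Spring
    // rather than looked up via Mappers.getMapper(...).
    private final ExtensionMapper extensionMapper;

    @Autowired
    public StringTypeMapperImpl(ExtensionMapper extensionMapper) {
        this.extensionMapper = extensionMapper;
    }

    // ... generated mapping methods ...
}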
I would like to add a custom extension to the Patient->telecom class by extending HAPI's ContactPoint class and adding a new extension. This is the standard boilerplate code I took from https://hapifhir.io/hapi-fhir/docs/model/custom_structures.html:
@DatatypeDef(name = "MyContactPoint")
public class MyContactPoint extends ContactPoint implements ICompositeType {

    private static final long serialVersionUID = 1L;

    @Override
    protected MyContactPoint typedCopy() {
        MyContactPoint retVal = new MyContactPoint();
        super.copyValues(retVal);
        return retVal;
    }

    @Child(name = "my-ext-value")
    @Extension(definedLocally = false, isModifier = false, url = "http://someurl")
    private BooleanType valueMyExtValue;

    public Boolean getMyExtValue() {
        return valueMyExtValue.getValue();
    }

    public MyContactPoint setMyExtValue(Boolean theValue) {
        this.valueMyExtValue = new BooleanType(theValue);
        return this;
    }
}
And it works fine when I generate XML:
Patient data = new Patient();
MyContactPoint telecom = new MyContactPoint();
telecom.setMyExtValue(true);
data.addTelecom(telecom);
However, I cannot wrap my head around how to LOAD this data back from XML and get a strongly-typed MyContactPoint object. I know I could call telecom.getExtensionByUrl() and so on to read it manually (see the sketch below), but I was hoping there is a way to outsource this legwork to HAPI FHIR by declaring the extensions.
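For reference, a minimal sketch of that manual fallback; the extension URL "http://someurl" comes from the class above, and the variable names (xml, etc.) are assumptions:

// Hedged sketch: parse the XML, then read the extension by URL manually,
// because the parsed telecom comes back as a plain ContactPoint, not MyContactPoint.
var fhirContext = FhirContext.forR4Cached();
var patient = (Patient) fhirContext.newXmlParser().parseResource(xml);
var telecom = patient.getTelecomFirstRep();
var ext = telecom.getExtensionByUrl("http://someurl");
Boolean myExtValue = ext == null ? null : ((BooleanType) ext.getValue()).getValue();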
I know I can do it easily in a "parent" profile like this:
@ResourceDef()
public class MyTask extends Task {

    @Child(name = "my-ext-value")
    @Extension(definedLocally = false, isModifier = false, url = "http://someurl")
    private BooleanType valueMyExtValue;

    public Boolean getMyExtValue() {
        return valueMyExtValue.getValue();
    }

    public MyTask setMyExtValue(Boolean theValue) {
        this.valueMyExtValue = new BooleanType(theValue);
        return this;
    }
}
and then I load it as follows:
var fhirContext = FhirContext.forR4Cached();
var fhirParser = fhirContext.newXmlParser();
fhirParser.setPreferTypes(List.of(MyTask.class));
var res = fhirParser.parseResource(xml);
var myTask = (MyTask)res;
var theValue = myTask.getMyExtValue();
So that was easy for IBaseResource->custom extension flow. But how do I do this for IBaseResource->Custom DataType (overriding existing field!)->custom extension?
I want to create a custom Sling Model that can be adapted from com.adobe.cq.dam.cfm.ContentFragment, like below:
import com.adobe.cq.dam.cfm.ContentFragment;

@Model(adaptables = ContentFragment.class, adapters = EventInfo.class)
public class EventInfoImpl implements EventInfo {

    @Self
    ContentFragment cf;

    @Override
    public String getTitle() {
        return cf.getElement("title").getContent();
    }
}
but in a caller class,
EventInfo e = contentFragment.adaptTo(EventInfo.class);
adaptTo() returns null (the variable "e" is null).
Why does adaptTo() return null, and how do I adapt correctly in this case?
Sling models can only be adapted from Resource or SlingHttpServletRequest. For anything else you need a classic AdapterFactory.
https://sling.apache.org/apidocs/sling11/org/apache/sling/api/adapter/AdapterFactory.html
How to implement a custom AdapterFactory for Sling Resource?
You can see it in the Sling source code of the ModelAdapterFactory. There is the method createModel:
<ModelType> ModelType createModel(@NotNull Object adaptable, @NotNull Class<ModelType> type)
https://github.com/apache/sling-org-apache-sling-models-impl/blob/master/src/main/java/org/apache/sling/models/impl/ModelAdapterFactory.java
If you dig down, the real filtering happens in the helper class AdapterImplementations (lines 258-268):
https://github.com/apache/sling-org-apache-sling-models-impl/blob/master/src/main/java/org/apache/sling/models/impl/AdapterImplementations.java
if (adaptableType == Resource.class) {
    map = resourceTypeMappingsForResources;
    resourceTypeRemovalLists = resourceTypeRemovalListsForResources;
} else if (adaptableType == SlingHttpServletRequest.class) {
    map = resourceTypeMappingsForRequests;
    resourceTypeRemovalLists = resourceTypeRemovalListsForRequests;
} else {
    log.warn("Found model class {} with resource type {} for adaptable {}. Unsupported type for resourceType binding.",
            new Object[] { clazz, resourceType, adaptableType });
    return;
}
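A minimal sketch of such a classic AdapterFactory for the ContentFragment case from the question follows; the OSGi wiring, the com.example package name, and the EventInfoImpl constructor are assumptions, not part of the original code:

import org.apache.sling.api.adapter.AdapterFactory;
import org.osgi.service.component.annotations.Component;

import com.adobe.cq.dam.cfm.ContentFragment;

// Hedged sketch: registers an adapter from ContentFragment to EventInfo.
@Component(
    service = AdapterFactory.class,
    property = {
        AdapterFactory.ADAPTABLE_CLASSES + "=com.adobe.cq.dam.cfm.ContentFragment",
        AdapterFactory.ADAPTER_CLASSES + "=com.example.EventInfo" // hypothetical package
    })
public class EventInfoAdapterFactory implements AdapterFactory {

    @Override
    public <AdapterType> AdapterType getAdapter(Object adaptable, Class<AdapterType> type) {
        if (adaptable instanceof ContentFragment && type == EventInfo.class) {
            // Assumes EventInfoImpl is given a constructor taking the fragment,
            // instead of relying on the @Self injection shown in the question.
            return type.cast(new EventInfoImpl((ContentFragment) adaptable));
        }
        return null;
    }
}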
I have an existing mapping between two objects, ExpertJpa and ExpertDto, that needs another parameter to filter ExpertJpa.
This mapping works properly. Now I am trying to convert a List of ExpertJpa to a List of ExpertDto, so I added this second parameter.
@Mappings({
    @Mapping(target = "status", ignore = true),
    @Mapping(target = "profile", source = "input.expertProfile"),
    @Mapping(target = "engagementId", expression = "java(new MapperHelper().ReturnExpertEngagementIdByApiKey(input,identity))"),
    @Mapping(target = "campaignId", expression = "java(new MapperHelper().ReturnExpertCampaignIdByApiKey(input,identity))"),
})
Expert ExpertJpaToExpert(com.consumer.expert.dbaccessor.entities.Expert input, Identity identity);

List<Expert> ListExpertsJpaToListExperts(List<com.consumer.expert.dbaccessor.entities.Expert> input, Identity identity);
On build, I get an error message saying that List is an interface and cannot be instantiated:
Error:(53, 18) java: The return type java.util.List is an abstract class or interface. Provide a non abstract / non interface result type or a factory method.
MapStruct can do this automatically for you. However, it cannot handle methods with multiple source arguments (in principle it maps one source to one target).
Having said that, if you rewrite your code a little bit, you can get rid of the expressions and have a fully type-safe solution.
So:
class IdentityContext {

    private final Identity id;
    private final MapperHelper mapperHelper;

    public IdentityContext(Identity id) {
        this.id = id;
        this.mapperHelper = new MapperHelper();
    }

    @AfterMapping
    public void setIds(com.consumer.expert.dbaccessor.entities.Expert input, @MappingTarget Expert expertOut) {
        expertOut.setEngagementId( mapperHelper.ReturnExpertEngagementIdByApiKey(input, id) );
        expertOut.setCampaignId( mapperHelper.ReturnExpertCampaignIdByApiKey(input, id) );
    }
}
Now define your mapper as follows:
@Mappings({
    @Mapping(target = "status", ignore = true),
    @Mapping(target = "profile", source = "input.expertProfile")
})
Expert ExpertJpaToExpert(com.consumer.expert.dbaccessor.entities.Expert input, @Context IdentityContext ctx);

List<Expert> ListExpertsJpaToListExperts(List<com.consumer.expert.dbaccessor.entities.Expert> input, @Context IdentityContext ctx);
Note: MapStruct will now recognise the list mapping because IdentityContext is marked with @Context (it is only passed through from the calling method and is not part of the source-target mapping itself).
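A short usage sketch (the mapper interface name ExpertMapper and the local variables are assumed, not from the question), showing how the context is passed at the call site:

// The same context instance is shared by the list mapping and every
// per-element mapping it delegates to.
ExpertMapper mapper = Mappers.getMapper(ExpertMapper.class); // or injected, depending on the component model
List<Expert> experts = mapper.ListExpertsJpaToListExperts(jpaExperts, new IdentityContext(identity));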
I want to create my own SonarQube Plugin for the RPG language. I have the following problem.
I started by creating the RpgLanguage class, which extends AbstractLanguage. In this class, I define my new language, "Rpg". You can see my class in the following code:
public class RpgLanguage extends AbstractLanguage {

    public static final String KEY = "rpg";

    private Settings settings;

    public RpgLanguage(Settings settings) {
        super(KEY, "Rpg");
        this.settings = settings;
    }

    public String[] getFileSuffixes() {
        String[] suffixes = settings.getStringArray("");
        if (suffixes == null || suffixes.length == 0) {
            suffixes = StringUtils.split(".RPG", ",");
        }
        return suffixes;
    }
}
After that, I created my RpgRulesDefinition class, which implements RulesDefinition. In this class, I create a new repository for the RPG language and want to add a rule (an empty rule) to this repository. The code is as follows:
public static final String REPOSITORY_KEY = "rpg_repository_mkoza";

public void define(Context context) {
    NewRepository repo = context.createRepository(REPOSITORY_KEY, "rpg");
    repo.setName("Mkoza Analyser rules RPG");

    // We could use an XML or JSON file to load all rule metadata, but
    // we prefer to use annotations in order to have all information in a single place
    RulesDefinitionAnnotationLoader annotationLoader = new RulesDefinitionAnnotationLoader();
    annotationLoader.load(repo, RpgFileCheckRegistrar.checkClasses());

    repo.done();
}
My RpgFileCheckRegistrar class, which registers my rules:
/**
 * Register the classes that will be used to instantiate checks during analysis.
 */
public void register(RegistrarContext registrarContext) {
    // Call registerClassesForRepository to associate the classes with the correct repository key
    registrarContext.registerClassesForRepository(RpgRulesDefinition.REPOSITORY_KEY, Arrays.asList(checkClasses()), Arrays.asList(testCheckClasses()));
}

/**
 * Lists all the checks provided by the plugin
 */
public static Class<? extends JavaCheck>[] checkClasses() {
    return new Class[] {
        RulesExampleCheck.class
    };
}

/**
 * Lists all the test checks provided by the plugin
 */
public static Class<? extends JavaCheck>[] testCheckClasses() {
    return new Class[] {};
}
My Rule class (still empty):
@Rule(
    key = "Rule1",
    name = "Rule that makes nothing",
    priority = Priority.MAJOR,
    tags = {"example"}
)
public class RulesExampleCheck extends BaseTreeVisitor {
    /**
     * Write your rule logic in Java code here.
     */
}
And the SonarPlugin class that declares all these extensions:
public final class RpgSonarPlugin extends SonarPlugin {

    // This is where you declare all your Sonar extensions
    public List getExtensions() {
        return Arrays.asList(
            RpgLanguage.class,
            RpgRulesDefinition.class,
            RpgFileCheckRegistrar.class
        );
    }
}
The problem is that when I start the SonarQube server, I get this error stack:
Exception sending context initialized event to listener instance of class org.sonar.server.platform.PlatformServletContextListener
java.lang.IllegalStateException: One of HTML description or Markdown description must be defined for rule [repository=rpg_repository_mkoza, key=Rule1]
I have tried different things, but I don't understand why this error occurs.
Of course, I want my repository "rpg_repository_mkoza" to show up as an RPG repository in SonarQube with the rule RulesExampleCheck.
My sonar-plugin version is 3.7.1.
I found my problem: the 'description' field needs to be added to @Rule.
For example:
@Rule(
    key = "Rule1",
    name = "RuleExampleCheck",
    description = "This rule does nothing",
    priority = Priority.INFO,
    tags = {"try"}
)
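Alternatively, as a hedged sketch not taken from the original answer, the description can be set programmatically on the NewRule inside define(), which also satisfies the "HTML description or Markdown description" requirement:

// In RpgRulesDefinition.define(Context context), after the annotation loader
// has run and before repo.done():
NewRule rule = repo.rule("Rule1");
if (rule != null) {
    rule.setHtmlDescription("Rule that does nothing; used as an example.");
}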
I have a custom Ocean workstep in Petrel, but I cannot succeed in persisting my arguments package. My package contains PillarGrid objects (Grid, Property, Zone), as shown below:
[Archivable(Version = 1, Release = "2013.6")]
public class MyArguments : DescribedArgumentsByReflection, IIdentifiable
{
    [Archived]
    private Grid grid = Grid.NullObject;
    [Archived]
    private int seedNumber;
    [Archived]
    private int numberRealizations = 1;
    [Archived]
    private Zone regionZone = Zone.NullObject;
    [Archived]
    private Property property = Property.NullObject;
    [Archived]
    private Droid droid = Droid.Empty;

    ...

    public MyArguments()
        : base()
    {
        // adding this instance to the data source
        IDataSourceManager dsManager = DataManager.DataSourceManager;
        string dataSourceId = MyDataSourceFactory.DataSourceId;
        StructuredArchiveDataSource dataSource = dsManager.GetSource(dataSourceId) as StructuredArchiveDataSource;
        this.droid = dataSource.GenerateDroid();
        dataSource.AddItem(this.droid, this);
    }
}
I created a DataSourceFactory based on a StructuredArchiveDataSource:
public class MyDataSourceFactory : DataSourceFactory
{
    public static string DataSourceId = "MyArgsPackId";

    public override Slb.Ocean.Core.IDataSource GetDataSource()
    {
        return new StructuredArchiveDataSource(DataSourceId, new[] { typeof(MyArguments) });
    }
}
I registered this DataSourceFactory in the module's "Integrate" method.
When I try to save my project in Petrel, I get the following error message: "System.Exception: MyArgsPackId: Not an archivable type 'Slb.Ocean.Petrel.DomainObject.PillarGrid.Property'"
How can I manage this persistence, please?
PillarGrid and other complex types cannot be persisted in this manner.
For IIdentifiable objects you should persist their DROID.
Later you can then retrieve the object from said DROID using DataManager.Resolve.
Chippy