MapStruct: how to generate mapping source/target in a txt file at build time - annotations

I need to generate, at build time, all the source/target pairs for each mapper (perhaps in a file such as "target/generated/annotations/..../MyMapper.txt"), so that I can then read those txt files at runtime.
Example:
@Mapper
public interface MyMapper {
@Mapping(target = "a", source = "source.x.y.z")
@Mapping(target = "b", source = "source.r.s.t")
@Mapping(target = "c", source = "source.o.p.q")
MyObject map(MySource source);
}
Content of the generated file : target/generated/annotations/MyMapper.txt
mypackage.MyObject.a=mypackage.MySource.x.y.z
mypackage.MyObject.b=mypackage.MySource.r.s.t
mypackage.MyObject.c=mypackage.MySource.o.p.q
How can I do that?
Thank you in advance for your help.

MapStruct cannot do this out of the box.
You'll need to write your own annotation processor that will use the @Mapper annotation and generate your own text file.

Thank you @Filip for your advice.
I am posting a solution here, which may help others.
@Filip: what do you think about this solution?
@SupportedAnnotationTypes({ "org.mapstruct.Mapping", "org.mapstruct.Mappings" })
@SupportedSourceVersion(SourceVersion.RELEASE_8)
@AutoService(Processor.class)
public class MapperProcessor extends AbstractProcessor {
private static final String SOURCE = "source";
private static final String TARGET = "target";
private static final Pattern PATTERN_TARGET = Pattern.compile(".*" + TARGET + "=\"([^\"]*)\".*");
private static final Pattern PATTERN_SOURCE = Pattern.compile(".*" + SOURCE + "=\"([^\"]*)\".*");
private static final String DEST_PATH = "META-INF/mapstruct/";
@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
processingEnv.getMessager().printMessage(Diagnostic.Kind.NOTE, " MapperProcessor : creating metadata from mapstruct annotation");
// init param
Map<String, Map<String, String>> globalMapping = new HashMap<>();
/////////// annotation : @Mappings
for (Element element : roundEnv.getElementsAnnotatedWith(Mappings.class)) {
final String className = ((TypeElement) element.getEnclosingElement()).getQualifiedName().toString();
Map<? extends ExecutableElement, ? extends AnnotationValue> elementValues = element.getAnnotationMirrors().get(0).getElementValues();
elementValues.values().forEach(value -> {
Map<String, String> mapTargetSource = getTargetSourceValue((List<?>) value.getValue());
// merge rather than overwrite, so that several annotated methods in the same mapper class are all kept
globalMapping.computeIfAbsent(className, key -> new HashMap<>()).putAll(mapTargetSource);
});
}
/////////// annotation : @Mapping
for (Element element : roundEnv.getElementsAnnotatedWith(Mapping.class)) {
final String className = ((TypeElement) element.getEnclosingElement()).getQualifiedName().toString();
List<? extends AnnotationMirror> annotationMirrors = element.getAnnotationMirrors();
Map<? extends ExecutableElement, ? extends AnnotationValue> elementValues = annotationMirrors.get(0).getElementValues();
String target = null;
String source = null;
for (Entry<? extends ExecutableElement, ? extends AnnotationValue> entrySet : elementValues.entrySet()) {
final ExecutableElement key = entrySet.getKey();
final AnnotationValue value = entrySet.getValue();
if (StringUtils.equals(key.getSimpleName(), TARGET)) {
target = value.getValue().toString();
}
if (StringUtils.equals(key.getSimpleName(), SOURCE)) {
source = value.getValue().toString();
}
if (StringUtils.isNoneBlank(target, source)) {
break;
}
}
// merge into any mappings already collected for this mapper class (rather than overwriting them)
globalMapping.computeIfAbsent(className, key -> new HashMap<>()).put(target, source);
}
writeMetaData(globalMapping);
return true;
}
///////////////// private methods
private Map<String, String> getTargetSourceValue(List<?> listValue) {
final Map<String, String> result = new HashMap<>();
listValue.forEach(objValueBrut -> {
String target = getValue(PATTERN_TARGET, objValueBrut.toString());
String source = getValue(PATTERN_SOURCE, objValueBrut.toString());
if (StringUtils.isNoneBlank(target, source)) {
result.put(target, source);
}
});
return result;
}
private static String getValue(Pattern pattern, String valueBrut) {
Matcher matcher = pattern.matcher(valueBrut);
if (matcher.matches()) {
return matcher.group(1);
}
return null;
}
private void writeMetaData(Map<String, Map<String, String>> globalMapping) {
for (Entry<String, Map<String, String>> entrySet : globalMapping.entrySet()) {
String className = entrySet.getKey();
Map<String, String> mapping = entrySet.getValue();
if (mapping.isEmpty()) {
continue;
}
try {
FileObject resource = processingEnv.getFiler().createResource(StandardLocation.CLASS_OUTPUT, "", DEST_PATH + className + ".txt");
try (PrintWriter out = new PrintWriter(resource.openWriter())) {
mapping.forEach((k, v) -> out.println(k + "=" + v));
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
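To read the generated files back at runtime (the second part of the question), something along these lines should work, assuming the CLASS_OUTPUT resources end up on the classpath (e.g. packaged in the jar under META-INF/mapstruct/). MappingMetadataReader is just an illustrative name, not part of the processor above:
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class MappingMetadataReader {

    /** Loads the target=source pairs written by MapperProcessor for the given mapper interface. */
    public static Properties loadMapping(Class<?> mapperClass) throws IOException {
        String resource = "META-INF/mapstruct/" + mapperClass.getName() + ".txt";
        Properties mapping = new Properties();
        try (InputStream in = Thread.currentThread().getContextClassLoader().getResourceAsStream(resource)) {
            if (in != null) {
                // each line has the form "target=source", which Properties can parse directly
                mapping.load(in);
            }
        }
        return mapping;
    }
}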

Related

TableView, TableColumns vanish off the right edge when resizing

Using some custom resizing behaviour I'm losing columns off the right side of the TableView. I have to use UNCONSTRAINED_RESIZE_POLICY (or maybe write a custom POLICY) so that I can size some of the columns to their content.
I have some custom behaviour for the resizing of columns in the TableViews I use in my application.
I use reflection to autoresize some columns to their content when the data first populates. The remaining columns' widths are set to a proportion of the remaining width (if there are 3 columns not being autoresized, then each gets remaining width / 3).
I also have a column width listener which listens for a user dragging column widths or double-clicking the header divider to size a column to its content. I also listen to the width of the table itself and assign any new extra width to the last column.
The above works OK, but the issue is that when a user resizes one or more columns to the point where the last column is as small as it can go, columns start getting pushed off the right side of the TableView. It makes sense that it does this, as I have my policy set to UNCONSTRAINED. I obviously can't use CONSTRAINED_RESIZE_POLICY or the above logic won't work.
Is there a custom policy out there that will reduce the rightmost columns one by one as the user increases a column's width, so the rightmost column first until it's as small as it can be, then the next rightmost, and so on? Or do I need to write this behaviour myself? I did come across a Kotlin-based policy in TornadoFX that looked interesting, but I'd rather stay pure Java.
Basically, the outcome I want is what I have now, but any user resizing just reduces the right-most column to a minimum size, then the next right-most, and so on until all the columns to the right of the column being resized are at minimum size but still visible on the TableView. If there are no columns to the right that can be reduced, then the user shouldn't be able to resize their column without first resizing a column to the left.
Columns should never disappear off the right side of the TableView.
I've written a test class that mimics this behaviour; it's slightly verbose in places and would be refactored in the real application.
package application;
import java.io.Serializable;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.List;
import java.util.ResourceBundle;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javafx.application.Application;
import javafx.application.Platform;
import javafx.beans.property.ReadOnlyObjectWrapper;
import javafx.beans.value.ChangeListener;
import javafx.beans.value.ObservableValue;
import javafx.collections.FXCollections;
import javafx.collections.ListChangeListener;
import javafx.collections.ObservableList;
import javafx.concurrent.Task;
import javafx.scene.Group;
import javafx.scene.Scene;
import javafx.scene.control.Skin;
import javafx.scene.control.TableColumn;
import javafx.scene.control.TableColumn.CellDataFeatures;
import javafx.scene.control.TableColumnBase;
import javafx.scene.control.TableView;
import javafx.scene.control.cell.PropertyValueFactory;
import javafx.scene.layout.HBox;
import javafx.scene.layout.Priority;
import javafx.stage.Stage;
import javafx.util.Callback;
public class TableViewSample extends Application {
private TableView<TableData> table = new TableView<TableData>();
private ObservableList<TableData> data = FXCollections.observableArrayList();
private boolean columnResizeOperationPerformed = false;
private String resizeThreeColumn = "";
private String resizeFourColumn = "";
private String resizeSixColumn = "";
private final ExecutorService executorService = Executors.newFixedThreadPool(1);
public static void main(String[] args) {
launch(args);
}
@Override
public void start(Stage stage) {
Scene scene = new Scene(new Group());
stage.setWidth(1300);
stage.setHeight(600);
TableColumn<TableData, String> oneColumn = new TableColumn<>("One");
TableColumn<TableData, String> twoColumn = new TableColumn<>("Two");
TableColumn<TableData, String> threeColumn = new TableColumn<>("Three");
TableColumn<TableData, String> fourColumn = new TableColumn<>("Four");
TableColumn<TableData, String> fiveColumn = new TableColumn<>("Five");
TableColumn<TableData, String> sixColumn = new TableColumn<>("Six");
TableColumn<TableData, String> sevenColumn = new TableColumn<>("");
TableColumn<TableData, String> eightColumn = new TableColumn<>("");
TableColumn<TableData, String> nineColumn = new TableColumn<>("Nine");
TableColumn<TableData, String> tenColumn = new TableColumn<>("Ten");
TableColumn<TableData, String> elevenColumn = new TableColumn<>("Eleven");
TableColumn<TableData, String> twelveColumn = new TableColumn<>("Twelve");
TableColumn<TableData, String> thirteenColumn = new TableColumn<>("Thirteen");
TableColumn<TableData, String> lastColumn = new TableColumn<>("Last");
table.setEditable(false);
table.setPrefWidth(1100);
table.setMaxWidth(1100);
table.setItems(data);
table.getColumns().addAll(oneColumn, twoColumn, threeColumn, fourColumn, fiveColumn, sixColumn, sevenColumn, eightColumn, nineColumn, tenColumn, elevenColumn, twelveColumn, thirteenColumn, lastColumn);
table.setFixedCellSize(25.0);
// This cellValueFactory code could be refactored in the real application
oneColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("oneColumn"));
oneColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getOneColumn());
}
});
twoColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("twoColumn"));
twoColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getTwoColumn());
}
});
threeColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("threeColumn"));
threeColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getThreeColumn());
}
});
fourColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("fourColumn"));
fourColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getFourColumn());
}
});
fiveColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("fiveColumn"));
fiveColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getFiveColumn());
}
});
sixColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("sixColumn"));
sixColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getSixColumn());
}
});
sevenColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("sevenColumn"));
sevenColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getSevenColumn());
}
});
eightColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("eightColumn"));
eightColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getEightColumn());
}
});
nineColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("nineColumn"));
nineColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getNineColumn());
}
});
tenColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("tenColumn"));
tenColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getTenColumn());
}
});
elevenColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("elevenColumn"));
elevenColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getElevenColumn());
}
});
twelveColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("twelveColumn"));
twelveColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getTwelveColumn());
}
});
thirteenColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("thirteenColumn"));
thirteenColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getThirteenColumn());
}
});
lastColumn.setCellValueFactory(new PropertyValueFactory<TableData, String>("lastColumn"));
lastColumn.setCellValueFactory(new Callback<CellDataFeatures<TableData, String>, ObservableValue<String>>() {
public ObservableValue<String> call(CellDataFeatures<TableData, String> p) {
return new ReadOnlyObjectWrapper(p.getValue().getLastColumn());
}
});
// using CONSTRAINED_RESIZE_POLICY will cause all kinds of odd behaviour because of the autoresize and then the columnWidthListener below.
table.setColumnResizePolicy(TableView.UNCONSTRAINED_RESIZE_POLICY);
table.getItems().addListener(new ListChangeListener<TableData>() {
@Override
public void onChanged(Change<? extends TableData> c) {
// check to see if any of the data coming in has column 3 or 4 values that columns can be resized with
if (!columnResizeOperationPerformed) {
boolean outerBreak = false;
while (c.next() && !outerBreak) {
List<? extends TableData> addedSubList = c.getAddedSubList();
if (!addedSubList.isEmpty()) {
for (TableData data : addedSubList) {
outerBreak = checkForColThreeOrFourData(data);
}
}
}
}
// resize some columns to fit contents, other columns to take up remaining space
if (!columnResizeOperationPerformed && !table.getItems().isEmpty()) {
// only prevent future column resizing if the threeColumn has some valid data to size on
if (resizeThreeColumn.length() > 0) {
columnResizeOperationPerformed = true;
}
double totalWidth = 0;
totalWidth = autosizeColumn(oneColumn);
totalWidth += autosizeColumn(threeColumn);
totalWidth += autosizeColumn(fourColumn);
totalWidth += autosizeColumn(sixColumn);
totalWidth += autosizeColumn(sevenColumn);
totalWidth += autosizeColumn(eightColumn);
totalWidth += autosizeColumn(nineColumn);
totalWidth += autosizeColumn(tenColumn);
totalWidth += autosizeColumn(elevenColumn);
totalWidth += autosizeColumn(twelveColumn);
totalWidth += autosizeColumn(lastColumn);
double remainingWidth = table.getWidth() - totalWidth;
sizeColumn(twoColumn, remainingWidth / 4.0);
sizeColumn(fiveColumn, remainingWidth / 4.0);
sizeColumn(thirteenColumn, remainingWidth / 4.0);
table.requestLayout();
}
}
});
ChangeListener<? super Number> columnWidthListener = (obs, ov, nv) -> {
double totalWidth = table.getColumns().stream()
.filter(tc -> !tc.equals(lastColumn))
.mapToDouble(TableColumnBase::getWidth)
.sum();
sizeColumn(lastColumn, table.getWidth() - totalWidth);
};
// listen for any column resizing or table width changes and assign extra width to the lastColumn above
table.getColumns().stream()
.filter(tc -> !tc.equals(lastColumn)).forEach(tc -> {
tc.widthProperty().addListener(columnWidthListener);
});
table.widthProperty().addListener(columnWidthListener);
HBox hBox = new HBox(table);
HBox.setHgrow(table, Priority.ALWAYS);
((Group) scene.getRoot()).getChildren().addAll(hBox);
stage.setScene(scene);
stage.show();
// create Task to update the table data after the UI is constructed so that the column autoresizing code above is applied as data is populated.
Task task = new Task() {
@Override
protected Object call() {
try {
Thread.sleep(100);
Platform.runLater(() -> {
updateTableData();
});
}
catch (Exception ex) {}
return null;
}
};
executorService.submit(task);
}
/**
* A test version of a check from the real application to make sure resizing of columns happens when column data of specific columns is valid
*
* @param tableData
* @return
*/
private boolean checkForColThreeOrFourData(TableData tableData) {
if (resizeThreeColumn.length() == 0) {
resizeThreeColumn = tableData.getThreeColumn();
}
if (resizeFourColumn.length() == 0) {
resizeFourColumn = tableData.getFourColumn();
}
if (resizeSixColumn.length() == 0) {
resizeSixColumn = tableData.getSixColumn();
}
if ((resizeThreeColumn.length() > 0) && resizeFourColumn.length() > 0 && resizeSixColumn.length() > 0) { return true; }
return false;
}
public void sizeColumn(TableColumn<?, ?> column, double width) {
column.setPrefWidth(width);
}
public static double autosizeColumn(TableColumn<?, ?> column) {
final TableView<?> tableView = column.getTableView();
final Skin<?> skin = tableView.getSkin();
final int rowsToMeasure = -1;
try {
Method method = skin.getClass().getDeclaredMethod("resizeColumnToFitContent", TableColumn.class, int.class);
method.setAccessible(true);
method.invoke(skin, column, rowsToMeasure);
}
catch (Exception e) {
e.printStackTrace();
}
return column.getWidth();
}
private void updateTableData() {
data.setAll(Arrays.asList(new TableData("Manufacturer1", "User 1", "value12345", "desc12345", "defaultName", "17:04:49 15/05/19", "200", "0", "0", "3", "12", "2", "16-15-14", "80"),
new TableData("Manufacturer2", "User 2", "value67890", "desc67890", "", "17:06:38 15/05/19", "100", "0", "0", "3", "11", "2", "16-15-14", "82")));
}
class TableData implements Serializable {
private static final long serialVersionUID = 1L;
private String oneColumn;
private String twoColumn;
private String threeColumn;
private String fourColumn;
private String fiveColumn;
private String sixColumn;
private String sevenColumn;
private String eightColumn;
private String nineColumn;
private String tenColumn;
private String elevenColumn;
private String twelveColumn;
private String thirteenColumn;
private String lastColumn;
public TableData(String oneColumn, String twoColumn, String threeColumn, String fourColumn, String fiveColumn, String sixColumn, String sevenColumn, String eightColumn, String nineColumn, String tenColumn, String elevenColumn,
String twelveColumn, String thirteenColumn, String lastColumn) {
this.oneColumn = oneColumn;
this.twoColumn = twoColumn;
this.threeColumn = threeColumn;
this.fourColumn = fourColumn;
this.fiveColumn = fiveColumn;
this.sixColumn = sixColumn;
this.sevenColumn = sevenColumn;
this.eightColumn = eightColumn;
this.nineColumn = nineColumn;
this.tenColumn = tenColumn;
this.elevenColumn = elevenColumn;
this.twelveColumn = twelveColumn;
this.thirteenColumn = thirteenColumn;
this.lastColumn = lastColumn;
}
public String getOneColumn() {
return oneColumn;
}
public String getTwoColumn() {
return twoColumn;
}
public String getThreeColumn() {
return threeColumn;
}
public String getFourColumn() {
return fourColumn;
}
public String getFiveColumn() {
return fiveColumn;
}
public String getSixColumn() {
return sixColumn;
}
public String getSevenColumn() {
return sevenColumn;
}
public String getEightColumn() {
return eightColumn;
}
public String getNineColumn() {
return nineColumn;
}
public String getTenColumn() {
return tenColumn;
}
public String getElevenColumn() {
return elevenColumn;
}
public String getTwelveColumn() {
return twelveColumn;
}
public String getThirteenColumn() {
return thirteenColumn;
}
public String getLastColumn() {
return lastColumn;
}
}
}
As mentioned above this code will auto resize the selected columns and assign the remaining width equally to the other columns.
It will listen to user column width adjustments correctly.
What it won't do is prevent columns vanishing off the right edge of the view. I would like the columns on the right to be reduced in width to accommodate the user's column width increase, in the order described above: right-most first, then continuing in from the right as columns reach their minimum. A rough sketch of the behaviour I mean follows below.
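To make that concrete, this is roughly the shape of the logic I'm imagining. It's an untested sketch, not code from the application; it would live in the class above (so the JavaFX imports are already there), and how to cap the resized column is only hinted at in a comment:
// Called from a column width listener with the amount the user just grew a column by.
// Walks the columns from the right-most towards the resized one, taking the width back
// without letting any column go below its minWidth.
private void shrinkColumnsToTheRight(TableView<?> table, TableColumn<?, ?> resized, double delta) {
    List<? extends TableColumn<?, ?>> columns = table.getColumns();
    int resizedIndex = columns.indexOf(resized);
    double remaining = delta;
    for (int i = columns.size() - 1; i > resizedIndex && remaining > 0; i--) {
        TableColumn<?, ?> col = columns.get(i);
        double reducible = col.getWidth() - col.getMinWidth();
        double take = Math.min(reducible, remaining);
        if (take > 0) {
            col.setPrefWidth(col.getWidth() - take);
            remaining -= take;
        }
    }
    // if remaining > 0 here, everything to the right is already at minWidth,
    // so the resized column's growth would have to be capped (undo the excess)
}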
Thanks for any help.

Sling Forward with SyntheticResource

I'm trying to build a Sling servlet that returns a modified value of a resource from the JCR. I don't want to change the original resource, so I create a SyntheticResource and make my manipulations on it. I then return it using the RequestDispatcher.
The following code doesn't return the modified content as expected, and I don't see any errors in the log either. Can you tell me what I'm doing wrong here?
@SlingServlet(methods = "GET", resourceTypes = "sling/components/test", selectors = "test")
public class TestServlet extends SlingSafeMethodsServlet {
/**
*
*/
private static final long serialVersionUID = 4078524820231933974L;
private final Logger log = LoggerFactory.getLogger(getClass());
@Reference
ResourceResolverFactory resolverFactory;
protected void doGet(SlingHttpServletRequest request, SlingHttpServletResponse response) throws IOException {
Map<String, Object> param = new HashMap<String, Object>();
ResourceResolver resolver = null;
response.setContentType("text/html");
StringWriterResponse writerResponse = new StringWriterResponse(response);
PrintWriter writer = response.getWriter();
try {
param.put(ResourceResolverFactory.SUBSERVICE, "testService");
final String path = request.getRequestPathInfo().getResourcePath();
resolver = resolverFactory.getServiceResourceResolver(param);
final Resource resource = resolver.getResource(path);
String resourceType = resource.getResourceType();
Resource testResource = new SyntheticResource(resolver,
path, resourceType) {
public <T> T adaptTo(Class<T> type) {
if (type == ValueMap.class) {
ModifiableValueMap map = resource
.adaptTo(ModifiableValueMap.class);
map.put("jcr:title", "Modified Title");
return (T)map;
}
return super.adaptTo(type);
}
};
RequestDispatcherOptions requestDispatcherOptions = new RequestDispatcherOptions();
requestDispatcherOptions.setReplaceSelectors("");
final RequestDispatcher requestDispatcher = request.getRequestDispatcher(testResource, requestDispatcherOptions);
requestDispatcher.forward(request, writerResponse);
// log.debug( writerResponse.getString() );
writer.println(writerResponse.getString());
response.setStatus(HttpServletResponse.SC_OK );
} catch (Exception e) {
log.error("Exception: ", e);
} finally {
if( resolver != null) {
resolver.close();
}
if( writer != null ){
writer.close();
}
if (writerResponse != null) {
writerResponse.clearWriter();
}
}
}
}
Using a ResourceDecorator would be simpler, it can return a ResourceWrapper that implements the required changes. Just be careful to keep the decorator's decorate method efficient when it's called for a Resource that it doesn't want to decorate, as it will be called for all Resources.
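For illustration, a minimal sketch of that ResourceDecorator approach could look roughly like the following. The resource type check and the jcr:title change are taken from the question; the class name and the SCR wiring are assumptions, not a definitive implementation:
import java.util.HashMap;
import java.util.Map;
import javax.servlet.http.HttpServletRequest;
import org.apache.felix.scr.annotations.Component;
import org.apache.felix.scr.annotations.Service;
import org.apache.sling.api.resource.Resource;
import org.apache.sling.api.resource.ResourceDecorator;
import org.apache.sling.api.resource.ResourceWrapper;
import org.apache.sling.api.resource.ValueMap;
import org.apache.sling.api.wrappers.ValueMapDecorator;

@Component
@Service(ResourceDecorator.class)
public class TitleModifyingDecorator implements ResourceDecorator {

    @Override
    public Resource decorate(Resource resource) {
        // this is called for every resource, so bail out cheaply when it's not ours
        if (resource == null || !resource.isResourceType("sling/components/test")) {
            return resource;
        }
        return new ResourceWrapper(resource) {
            @Override
            public <T> T adaptTo(Class<T> type) {
                if (type == ValueMap.class) {
                    // copy the original properties and override only what we need
                    Map<String, Object> props = new HashMap<String, Object>(getResource().getValueMap());
                    props.put("jcr:title", "Modified Title");
                    return type.cast(new ValueMapDecorator(props));
                }
                return super.adaptTo(type);
            }
        };
    }

    @Override
    public Resource decorate(Resource resource, HttpServletRequest request) {
        return decorate(resource);
    }
}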

CQ5 multifield configuration service

I'm trying to create a CQ5 service with a multifield configuration interface. It would be something like this, but at the click of the PLUS button it would add not just a new row but a group of N rows.
Property
Field1 +-
Field2
....
FieldN
Any advice?
As far as I know there is no such possibility in Apache Felix.
Depending on your actual requirement, I would consider decomposing the configuration. Try moving all the fieldsets (the groups of fields that you'd like to add through the plus button) into a separate configuration. Then, similarly to the slf4j.Logger configuration, you would have a Configuration Factory approach.
A simple configuration factory can look like the following:
@Component(immediate = true, configurationFactory = true, metatype = true, policy = ConfigurationPolicy.OPTIONAL, name = "com.foo.bar.MyConfigurationProvider", label = "Multiple Configuration Provider")
@Service(serviceFactory = false, value = { MyConfigurationProvider.class })
@Properties({
@Property(name = "propertyA", label = "Value for property A"),
@Property(name = "propertyB", label = "Value for property B") })
public class MyConfigurationProvider {
private String propertyA;
private String propertyB;
@Activate
protected void activate(final Map<String, Object> properties, final ComponentContext componentContext) {
// PropertiesUtil.toString(...) because propertyA/propertyB are plain Strings (the original toStringArray call would not compile)
propertyA = PropertiesUtil.toString(properties.get("propertyA"), "");
propertyB = PropertiesUtil.toString(properties.get("propertyB"), "");
}
}
Using it is as simple as adding a reference in any @Component
@Reference(cardinality = ReferenceCardinality.OPTIONAL_MULTIPLE, referenceInterface = MyConfigurationProvider.class, policy = ReferencePolicy.DYNAMIC)
private final List<MyConfigurationProvider> providers = new LinkedList<MyConfigurationProvider>();
protected void bindProviders(MyConfigurationProvider provider) {
providers.add(provider);
}
protected void unbindProviders(MyConfigurationProvider provider) {
providers.remove(provider);
}
This is one way of doing it.
@Component(label = "My Service", metatype = true, immediate = true)
@Service(MyService.class)
@Properties({
@Property(name = "my.property", description = "Provide details Eg: url=http://www.google.com|size=10|path=/content/project", value = "", unbounded = PropertyUnbounded.ARRAY) })
public class MyService {
// logger was missing in the original snippet; needed for the log.error(...) call below
private static final Logger log = LoggerFactory.getLogger(MyService.class);
private String[] myPropertyDetails;
@Activate
protected void activate(ComponentContext ctx) {
this.myPropertyDetails = getPropertyAsArray(ctx.getProperties().get("my.property"));
try {
if (null != myPropertyDetails && myPropertyDetails.length > 0) {
for(String myPropertyDetail : myPropertyDetails) {
Map<String, String> map = new HashMap<String, String>();
// split the current entry (not the whole array) and escape '|', which is a regex metacharacter
String[] propertyDetails = myPropertyDetail.split("\\|");
for (String keyValuePair : propertyDetails) {
String[] keyValue = keyValuePair.split("=");
if (null != keyValue && keyValue.length > 1) {
map.put(keyValue[0], keyValue[1]);
}
}
/* the map now has all the properties in the form of key value pairs for single field
use this for logic execution. when there are no multiple properties in the row,
you can skip the logic to split and add in the map */
}
}
} catch (Exception e) {
log.error("Exception ", e);
}
}
private String[] getPropertyAsArray(Object obj) {
String[] paths = { "" };
if (obj != null) {
if (obj instanceof String[]) {
paths = (String[]) obj;
} else {
paths = new String[1];
paths[0] = (String) obj;
}
}
return paths;
}
}

Creating custom plugin for chinese tokenization

I'm working towards properly integrating the Stanford segmenter within Solr for Chinese tokenization.
This plugin involves loading other jar files and model files. I've got it working in a crude manner by hardcoding the complete paths to the files.
I'm looking for a way to build the plugin so that the paths need not be hardcoded, and so that the plugin conforms to the Solr plugin architecture. Please let me know if there are any recommended sites or tutorials for this.
I've added my code below:
public class ChineseTokenizerFactory extends TokenizerFactory {
/** Creates a new ChineseTokenizerFactory */
public ChineseTokenizerFactory(Map<String,String> args) {
super(args);
assureMatchVersion();
if (!args.isEmpty()) {
throw new IllegalArgumentException("Unknown parameters: " + args);
}
}
@Override
public ChineseTokenizer create(AttributeFactory factory, Reader input) {
Reader processedStringReader = new ProcessedStringReader(input);
return new ChineseTokenizer(luceneMatchVersion, factory, processedStringReader);
}
}
public class ProcessedStringReader extends java.io.Reader {
private static final int BUFFER_SIZE = 1024 * 8;
//private static TextProcess m_textProcess = null;
private static final String basedir = "/home/praveen/PDS_Meetup/solr-4.9.0/custom_plugins/";
static Properties props = null;
static CRFClassifier<CoreLabel> segmenter = null;
private char[] m_inputData = null;
private int m_offset = 0;
private int m_length = 0;
public ProcessedStringReader(Reader input){
char[] arr = new char[BUFFER_SIZE];
StringBuffer buf = new StringBuffer();
int numChars;
if(segmenter == null)
{
segmenter = new CRFClassifier<CoreLabel>(getProperties());
segmenter.loadClassifierNoExceptions(basedir + "ctb.gz", getProperties());
}
try {
while ((numChars = input.read(arr, 0, arr.length)) > 0) {
buf.append(arr, 0, numChars);
}
} catch (IOException e) {
e.printStackTrace();
}
m_inputData = processText(buf.toString()).toCharArray();
m_offset = 0;
m_length = m_inputData.length;
}
@Override
public int read(char[] cbuf, int off, int len) throws IOException {
int charNumber = 0;
for(int i = m_offset + off;i<m_length && charNumber< len; i++){
cbuf[charNumber] = m_inputData[i];
m_offset ++;
charNumber++;
}
if(charNumber == 0){
return -1;
}
return charNumber;
}
@Override
public void close() throws IOException {
m_inputData = null;
m_offset = 0;
m_length = 0;
}
public String processText(String inputText)
{
List<String> segmented = segmenter.segmentString(inputText);
String output = "";
if(segmented.size() > 0)
{
output = segmented.get(0);
for(int i=1;i<segmented.size();i++)
{
output = output + " " +segmented.get(i);
}
}
System.out.println(output);
return output;
}
static Properties getProperties()
{
if (props == null) {
props = new Properties();
props.setProperty("sighanCorporaDict", basedir);
// props.setProperty("NormalizationTable", "data/norm.simp.utf8");
// props.setProperty("normTableEncoding", "UTF-8");
// below is needed because CTBSegDocumentIteratorFactory accesses it
props.setProperty("serDictionary",basedir+"dict-chris6.ser.gz");
props.setProperty("inputEncoding", "UTF-8");
props.setProperty("sighanPostProcessing", "true");
}
return props;
}
}
public final class ChineseTokenizer extends CharTokenizer {
public ChineseTokenizer(Version matchVersion, Reader in) {
super(matchVersion, in);
}
public ChineseTokenizer(Version matchVersion, AttributeFactory factory, Reader in) {
super(matchVersion, factory, in);
}
/** Collects only characters which do not satisfy
* {@link Character#isWhitespace(int)}.*/
@Override
protected boolean isTokenChar(int c) {
return !Character.isWhitespace(c);
}
}
You can pass the argument through the Factory's args parameter.
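For example, a rough sketch of that idea follows. The attribute name basedir and the two-argument ProcessedStringReader constructor are assumptions, not existing code; the value would come from the field type definition in the schema, e.g. <tokenizer class="my.pkg.ChineseTokenizerFactory" basedir="/path/to/models"/>:
public class ChineseTokenizerFactory extends TokenizerFactory {

    private final String basedir;

    public ChineseTokenizerFactory(Map<String, String> args) {
        super(args);
        assureMatchVersion();
        // consume the custom attribute before the "unknown parameters" check below
        basedir = require(args, "basedir");
        if (!args.isEmpty()) {
            throw new IllegalArgumentException("Unknown parameters: " + args);
        }
    }

    @Override
    public ChineseTokenizer create(AttributeFactory factory, Reader input) {
        // hand the configured path down instead of the hardcoded constant
        return new ChineseTokenizer(luceneMatchVersion, factory, new ProcessedStringReader(input, basedir));
    }
}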

Spring Data MongoDB converter

I am using Spring Data MongoDB 1.4.1.RELEASE.
My entity 'Event' has a getter method whose value is calculated from other properties:
public int getStatus() {
return (getMainEventId() == null) ? (elapseTimeInMin() < MINIMUM_TIME ? CANDIDATE :
VALID) : POINTER;
}
I wanted the property 'status' to be persisted only through the getter, so I wrote converters:
@WritingConverter
public class EventWriteConverter implements Converter<Event ,BasicDBObject > {
static final Logger logger = LoggerFactory.getLogger(EventWriteConverter.class.getCanonicalName());
public BasicDBObject convert(Event event) {
logger.info("converting " +event );
if (event.getMainEventId() != null)
return new BasicDBObject("mainEventId", event.getMainEventId() );
BasicDBObject doc = new BasicDBObject("status",event.getStatus()).
append("updated_date",new Date()).
append("start",event.getS0()).
append("end",event.getS1()).
append("location",event.getLocation());
doc.append("access_points",event.getHotPoints());
return doc;
}
}
@ReadingConverter
public class EventReadConverter implements Converter<BasicDBObject, Event> {
@Inject
HotPointRepositry hotRepositry;
static final Logger logger = LoggerFactory.getLogger(EventReadConverter.class.getCanonicalName());
public Event convert(BasicDBObject doc) {
logger.info(" converting ");
Event event = new Event();
event.setId(doc.getObjectId("_id"));
event.setS0(doc.getDate("start"));
event.setS1(doc.getDate("end"));
BasicDBList dblist = (BasicDBList) doc.get("hot_points");
if (dblist != null) {
for (Object obj : dblist) {
ObjectId hotspotId = ((BasicDBObject) obj).getObjectId("_id");
event.addHot(hotRepositry.findOne(hotspotId));
}
}
dblist = (BasicDBList) doc.get("devices");
if (dblist != null) {
for (Object obj : dblist)
event.addDevice(obj.toString());
}
event.setMainEventId(doc.getObjectId("mainEventId"));
return event;
}
}
My test mongo configuration is
@Profile("test")
@Configuration
@EnableMongoRepositories(basePackages = "com.echo.spring.data.mongo")
@ComponentScan(basePackages = "com.echo.spring.data.mongo")
public class MongoDbTestConfig extends AbstractMongoConfiguration {
static final Logger logger = LoggerFactory.getLogger(MongoDbTestConfig.class.getCanonicalName());
@Override
protected String getDatabaseName() {
return "echo";
}
@Override
public Mongo mongo() {
return new Fongo("echo-test").getMongo();
}
@Override
protected String getMappingBasePackage() {
return "com.echo.spring.data.mongo";
}
@Bean
@Override
public CustomConversions customConversions() {
logger.info("loading custom converters");
List<Converter<?, ?>> converterList = new ArrayList<Converter<?, ?>>();
converterList.add(new EventReadConverter());
converterList.add(new EventWriteConverter());
return new CustomConversions(converterList);
}
}
And my test (using fongo) is
@ActiveProfiles("test")
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = MongoDbTestConfig.class)
public class SampleMongoApplicationTests {
@Test
@ShouldMatchDataSet(location = "/MongoJsonData/events.json")
public void shouldSaveEvent() throws IOException {
URL url = Resources.getResource("MongoJsonData/events.json");
List<String> lines = Resources.readLines(url,Charsets.UTF_8);
for (String line : lines) {
Event event = objectMapper.readValue(line.getBytes(),Event.class);
eventRepository.save(event);
}
}
}
I can see the converters are loaded when the configuration's customConversions() method is called.
I added logging and breakpoints in the convert methods, but they do not seem to be called when I run or debug, even though they are loaded.
What am I doing wrong?
I had a similar situation. I followed Spring - MongoDB storing/retrieving enums as int not string,
and I needed both the Converter AND the ConverterFactory wired to get it working.
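For reference, a rough sketch of what that wiring can look like in the customConversions() bean. The enum factory below is only illustrative, following the linked question; it is not the original poster's converters, and the names are made up:
@Bean
@Override
public CustomConversions customConversions() {
    List<Object> converters = new ArrayList<Object>();
    converters.add(new EventReadConverter());            // Converter<BasicDBObject, Event>
    converters.add(new EventWriteConverter());           // Converter<Event, BasicDBObject>
    converters.add(new IntegerToEnumConverterFactory()); // a ConverterFactory, registered the same way
    return new CustomConversions(converters);
}

// A ConverterFactory covers a whole family of target types with one registration,
// e.g. converting a stored int back into any enum type:
public class IntegerToEnumConverterFactory implements ConverterFactory<Integer, Enum> {
    @Override
    public <T extends Enum> Converter<Integer, T> getConverter(final Class<T> targetType) {
        return new Converter<Integer, T>() {
            @Override
            public T convert(Integer source) {
                return targetType.getEnumConstants()[source];
            }
        };
    }
}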