How to support embedded maps (with custom value types) in MongoDB GORM? - mongodb

I would like to have an embedded document referred to by a map (as in class A below). The environment is Grails + GORM + MongoDB.
Is that possible, and if yes, how?
class A { // fails with "IllegalArgumentException occurred when processing request: can't serialize class X" at line 234 of org.bson.BasicBSONEncoder
    static mapWith = "mongo"
    Map<String, X> map = new HashMap<String, X>()
}

class B { // works
    static mapWith = "mongo"
    List<X> list = new ArrayList<X>()
}

class C { // works with primitive type values
    static mapWith = "mongo"
    Map<String, String> map = new HashMap<String, String>()
}

class X {
    String data
    public X(String data) {
        this.data = data
    }
}

The embedding works perfectly, as Art Hanzel advised.
However, your problem comes from the fact that you try to use the map's generic type as a sort of constraint:
Map<String, X>
The problem is that Grails doesn't cope well with this syntax, not least because Groovy doesn't enforce generics at runtime.
However, the MongoDB plugin offers a very powerful feature that lets you define custom types as domain class properties: see here.
In your case you could have
class A {
    static mapWith = "mongo"
    MyClass map = new MyClass()
}
Then, in src/java for example, you could implement a
class MyClass extends HashMap<String,X> { }
Then, of course, you have to define a special AbstractMappingAwareCustomTypeMarshaller to specify how to read and write the property in the DB.
An additional step could also be to add a custom validator to class A to check the validity of data...
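For illustration, here is a minimal sketch of such a marshaller, modeled on the Birthday example in the plugin's custom-types documentation. The exact method signatures (and whether the native type is DBObject or Document) should be checked against your plugin version; MyClassMarshaller is a hypothetical name, and X must be visible to the Java compiler (e.g. also live in src/java):
// src/java/MyClassMarshaller.java -- a sketch only, not a drop-in implementation
import java.util.Map;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.grails.datastore.mapping.engine.types.AbstractMappingAwareCustomTypeMarshaller;
import org.grails.datastore.mapping.model.PersistentProperty;

public class MyClassMarshaller extends AbstractMappingAwareCustomTypeMarshaller<MyClass, DBObject, DBObject> {

    public MyClassMarshaller() {
        super(MyClass.class);
    }

    @Override
    protected Object writeInternal(PersistentProperty property, String key, MyClass value, DBObject nativeTarget) {
        // Store each X as a small sub-document so BSON can serialize it.
        BasicDBObject converted = new BasicDBObject();
        for (Map.Entry<String, X> entry : value.entrySet()) {
            converted.put(entry.getKey(), new BasicDBObject("data", entry.getValue().getData()));
        }
        nativeTarget.put(key, converted);
        return converted;
    }

    @Override
    protected MyClass readInternal(PersistentProperty property, String key, DBObject nativeSource) {
        // Rebuild the map, turning each stored sub-document back into an X.
        DBObject stored = (DBObject) nativeSource.get(key);
        MyClass result = new MyClass();
        if (stored != null) {
            for (String k : stored.keySet()) {
                result.put(k, new X((String) ((DBObject) stored.get(k)).get("data")));
            }
        }
        return result;
    }
}
The marshaller then has to be registered with the Mongo datastore, as described in the plugin's custom-types documentation.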

The MongoDB Grails plugin documentation describes how to make embedded documents:
class Foo {
    Address address
    List otherAddresses
    static embedded = ['address', 'otherAddresses']
}
Off the top of my head, you should be able to access these via the object graph. I don't see any reason why you shouldn't.
myFoo.address.myAddressProperty...


Expected default behavior for Grails RESTful mapping to Nested Resources

I have my Grails domain classes annotated with @Resource, with the URI specifications in UrlMappings where I declare the resource nesting. According to https://docs.grails.org/latest/guide/theWebLayer.html#restfulMappings, just declaring this the right way should give me the behavior I want: a URL pattern such as /nesting/1/nested should list the nested domain objects that belong to the nesting domain instance with ID 1. The observed behavior, however, is that it just lists all nested domain objects.
My workaround is to implement a controller that overrides listAllResources to filter the nested domain by the nesting domain. But what's weird to me is why I have to do that at all. The documentation says it defaults to the index action, but said index action seems to behave as if it were the index() of the nested resource, without taking the nesting into account.
My domain entities are WeightSensor:
@Resource(formats = ['json', 'xml'])
class WeightSensor extends Sensor<WeightData>
{
    Set<WeightData> data
    static constraints = {
    }
}
its superclass Sensor
@Resource(formats = ['json', 'xml'])
class Sensor<T extends SensorData>
{
    Set<T> data
    static hasMany = [data: SensorData]
    String name
    static constraints = {
        name unique: true
    }
}
and WeightData
class WeightData extends SensorData
{
    Float weight
    static constraints = {
        weight nullable: false
    }
}
and its superclass SensorData
class SensorData
{
    @BindingFormat('yyyy-MM-dd HH:mm:ss.S') // e.g. 2019-07-11 22:00:28.909
    Date timestamp
    static belongsTo = [sensor: Sensor]
    static constraints = {
        timestamp nullable: false
    }
}
In my UrlMappings I have the following:
"/sensor/weight"(resources: 'weightSensor') {
"/data"(resources: "weightData")
}
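For reference, this nesting is conventionally expected to produce URL patterns along the following lines (a sketch based on the documented resource-mapping conventions, not verified output of grails url-mappings-report):
GET /sensor/weight                               -> weightSensor index (list all weight sensors)
GET /sensor/weight/${weightSensorId}/data        -> weightData index (expected: data for that sensor only)
GET /sensor/weight/${weightSensorId}/data/${id}  -> weightData show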
My WeightDataController extends from a SensorDataController:
class WeightDataController extends SensorDataController<WeightSensor, WeightData>
{
    @SuppressWarnings("GroovyUnusedDeclaration")
    static responseFormats = ['json', 'xml']

    WeightDataController()
    {
        super(WeightData, WeightSensor, "weightSensorId")
    }
}
And SensorDataController in turn extends RestfulController, and overrides the listAllResources method as below.
import grails.rest.RestfulController

class SensorDataController<S extends Sensor, T extends SensorData> extends RestfulController<T>
{
    String idProperty
    Class<S> sensorType

    @SuppressWarnings("GroovyUnusedDeclaration")
    static responseFormats = ['json', 'xml']

    protected SensorDataController(Class<T> dataType, Class<S> sensorType, String idProperty)
    {
        super(dataType)
        this.idProperty = idProperty
        this.sensorType = sensorType
    }

    @Override
    protected List<T> listAllResources(Map params)
    {
        Long sensorId = params.get(idProperty) as Long
        if (sensorId)
        {
            return resource.withCriteria {
                eq 'sensor.id', sensorId
                maxResults((params.max ?: 10) as Integer)   // params values arrive as strings
                firstResult((params.offset ?: 0) as Integer)
            } as List<T>
        }
        return super.listAllResources(params)
    }
}
Note: in order for my WeightDataController class to be used at all, I needed to remove the @Resource annotation from the WeightData domain entity above, another little gem of wisdom I had to discover by trial and error.
I can probably blame this on the fact that the documentation for nested resources seems a bit open to interpretation. But when the documentation shows a URL like GET books/${bookId}/authors, doesn't that look like it should return the list of Author objects that belong to the Book instance identified by bookId?
I know that I'm not alone, as I found someone online asking the same question - https://gist.github.com/mnellemann/7cfff1c721ef32f0be6c63574795f795 - with no answers either. I also came across another SO post, nested RESTful resources, that was abandoned 5 years ago as well.
But three people having the same question and no one responding usefully (I asked mine on the Grails Slack community), just because a workaround exists, is not acceptable. At the risk of having my question taken down for a slew of different reasons, I question the usefulness of even having the Grails nested-resource URL mapping in the first place, because I could have done everything manually without having to "declare" such a nesting in UrlMappings.
In closing, what I'm trying to find out is whether there's more "configuration" I need to do to get Grails nested resources to behave the way I expected, which is how the documentation painted it. Just doing what is described doesn't get me that.

Neo4j 3.0.3 Stored procedures in Scala

Is there any sample Scala code available for creating stored procedures in Neo4j 3.0.3?
I have been trying to create one simple Scala-based stored procedure. Below is the error message I get when I copy my Scala jar file to the neo4j plugins directory and start the Neo4j server:
=================
Caused by: org.neo4j.kernel.lifecycle.LifecycleException: Component 'org.neo4j.kernel.impl.proc.Procedures@1ac0223' was successfully initialized, but failed to start. Please see attached cause exception.
    at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:444)
    at org.neo4j.kernel.lifecycle.LifeSupport.start(LifeSupport.java:107)
    at org.neo4j.kernel.impl.factory.GraphDatabaseFacadeFactory.newFacade(GraphDatabaseFacadeFactory.java:140)
    ... 10 more
Caused by: org.neo4j.kernel.api.exceptions.ProcedureException: Unable to find a usable public no-argument constructor in the class `neoscala`. Please add a valid, public constructor, recompile the class and try again.
=================
The Scala class that I have used is:
package neoproc

import org.neo4j.graphdb.GraphDatabaseService
import org.neo4j.procedure.Procedure
import javax.ws.rs.core.{Context, Response}

class neoscala(@Context db: GraphDatabaseService) {
  @Procedure
  def alice(): String = {
    String.valueOf(db.execute("MATCH (n:User) return n"))
  }
}
Your Scala class declares a constructor with a GraphDatabaseService argument, and the exception tells you that it only wants a no-argument constructor.
It's documented in both
the user documentation:
Only static fields and #Context-annotated fields are allowed in Procedure classes.
the Javadoc:
The procedure method itself can contain arbitrary Java code - but in order to work with the underlying graph, it must have access to the graph API. This is done by declaring fields in the procedure class, and annotating them with the Context annotation. Fields declared this way are automatically injected with the requested resource. This is how procedures gain access to APIs to do work with.
All fields in the class containing the procedure declaration must either be static; or it must be public, non-final and annotated with Context.
Apparently it's not possible to create a class with a public field in Scala, so you'll have to create a parent Java class with the public field, and extend it with your Scala class:
// ProcedureAdapter.java
public abstract class ScalaProcedureAdapter {
    @Context
    public GraphDatabaseService db;
}

// neoscala.scala
class neoscala extends ScalaProcedureAdapter {
    // ...
}
Here is the solution for this.
First, we create the class in Scala:
import java.util.stream.Stream

import org.neo4j.graphdb.Node
import org.neo4j.procedure.{Name, PerformsWrites, Procedure}

class FullTextIndex extends JavaHelper {

  @Procedure("example.search")
  @PerformsWrites
  def search(@Name("label") label: String,
             @Name("query") query: String): Stream[SearchHit] = {
    // look up the index for this label and stream the matching nodes
    val nodes: Stream[Node] = db.index.forNodes(indexName(label)).query(query).stream
    val newFunction: java.util.function.Function[Node, SearchHit] = (node: Node) => new SearchHit(node)
    nodes.map {
      newFunction
    }
  }

  private def indexName(label: String): String = {
    "label-" + label
  }
}
Procedures in Neo4j always return their results as a Stream, which is a Java 8 feature, so we also use a Java class to define the final result type and the public @Context fields.
We create the Java class for the result:
// JavaHelper.java
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.logging.Log;
import org.neo4j.procedure.Context;

public class JavaHelper {

    @Context
    public GraphDatabaseService db;

    @Context
    public Log log;

    public static class SearchHit {
        // your result code here
    }
}
You can refer to the Knoldus blog post on Neo4j user-defined procedures for creating and storing a Neo4j procedure with Scala. There you will also find sample code in a GitHub repository.
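Once the jar is in the plugins directory and the server has been restarted, a procedure like the one above is invoked from Cypher with a CALL statement, for example (illustrative only, assuming the node index exists):
CALL example.search('User', 'name:Brook*')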

How to use BeanWrapperFieldSetMapper to map a subset of fields?

I have a Spring Batch application where a BeanWrapperFieldSetMapper is used to map fields using a prototype object. However, the CSV file being read (via a FlatFileItemReader) contains one (indicator) field that determines the mapping of another field. If the indicator field has a value of Y, then the value of the other field should be mapped to property foo; otherwise it should be mapped to property bar.
I know that I can use a custom FieldSetMapper to do this, but then I have to code the mapping of all the other fields (of which there are quite a few). Alternatively, I could do this after reading, via an ItemProcessor, but then my domain (prototype) object would need a property representing the indicator field (which I prefer not to add since it is not really part of the business domain).
Is it possible to use a custom FieldSetMapper to map only these custom fields and delegate the other mappings to BeanWrapperFieldSetMapper? Or is there some other, better way to solve this?
Here is my current attempt to use a custom FieldSetMapper and delegate to BeanWrapperFieldSetMapper:
public class DelegatedFieldSetMapper extends BeanWrapperFieldSetMapper<MyProtoClass> {

    @Override
    public MyProtoClass mapFieldSet(FieldSet fieldSet) throws BindException {
        String indicator = fieldSet.readString("indicator");
        Properties fieldProperties = fieldSet.getProperties();
        if (indicator.equalsIgnoreCase("y")) {
            fieldProperties.put("test.foo", fieldSet.readString("value"));
        } else {
            fieldProperties.put("test.bar", fieldSet.readString("value"));
        }
        fieldProperties.remove("indicator");
        Set<Object> keys = fieldProperties.keySet();
        List<String> names = new ArrayList<String>();
        List<String> values = new ArrayList<String>();
        for (Object key : keys) {
            names.add((String) key);
            values.add(fieldProperties.getProperty((String) key));
        }
        DefaultFieldSet domainObjectFieldSet = new DefaultFieldSet(names.toArray(new String[names.size()]), values.toArray(new String[values.size()]));
        return super.mapFieldSet(domainObjectFieldSet);
    }
}
However, a FlatFileParseException is thrown. The relevant parts of the batch configuration class are as follows:
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Value("${file}")
    private File file;

    @Bean
    @Scope("prototype")
    public MyProtoClass myProtoClass() {
        return new MyProtoClass();
    }

    @Bean
    public ItemReader<MyProtoClass> reader(LineMapper<MyProtoClass> lineMapper) {
        FlatFileItemReader<MyProtoClass> flatFileItemReader = new FlatFileItemReader<MyProtoClass>();
        flatFileItemReader.setResource(new FileSystemResource(file));
        final int NUMBER_OF_HEADER_LINES = 1;
        flatFileItemReader.setLinesToSkip(NUMBER_OF_HEADER_LINES);
        flatFileItemReader.setLineMapper(lineMapper);
        return flatFileItemReader;
    }

    @Bean
    public LineMapper<MyProtoClass> lineMapper(LineTokenizer lineTokenizer, FieldSetMapper<MyProtoClass> fieldSetMapper) {
        DefaultLineMapper<MyProtoClass> lineMapper = new DefaultLineMapper<MyProtoClass>();
        lineMapper.setLineTokenizer(lineTokenizer);
        lineMapper.setFieldSetMapper(fieldSetMapper);
        return lineMapper;
    }

    @Bean
    public LineTokenizer lineTokenizer() {
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setNames(new String[] {"value", "test.bar", "test.foo", "indicator"});
        return lineTokenizer;
    }

    @Bean
    public FieldSetMapper<MyProtoClass> fieldSetMapper(PropertyEditor emptyStringToNullPropertyEditor) {
        BeanWrapperFieldSetMapper<MyProtoClass> fieldSetMapper = new DelegatedFieldSetMapper();
        fieldSetMapper.setPrototypeBeanName("myProtoClass");
        Map<Class<?>, PropertyEditor> customEditors = new HashMap<Class<?>, PropertyEditor>();
        customEditors.put(String.class, emptyStringToNullPropertyEditor);
        fieldSetMapper.setCustomEditors(customEditors);
        return fieldSetMapper;
    }
}
Finally, the CSV flat file looks like this:
value,bar,foo,indicator
abc,,,y
xyz,,,n
Let's say that BatchWorkObject is the class to be mapped.
Here's sample code in Spring Boot style that needs only your custom logic to be added:
FieldSetMapper<BatchWorkObject> mapper = new BeanWrapperFieldSetMapper<BatchWorkObject>() {
    {
        this.setTargetType(BatchWorkObject.class);
    }

    @Override
    public BatchWorkObject mapFieldSet(FieldSet fs)
            throws BindException {
        BatchWorkObject tmp = super.mapFieldSet(fs);
        // your custom code here
        return tmp;
    }
};
The code actually accomplishes what is desired, except for one issue that results in the FlatFileParseException. The DelegatedFieldSetMapper contains the issue:
DefaultFieldSet domainObjectFieldSet = new DefaultFieldSet(names.toArray(new String[names.size()]), values.toArray(new String[values.size()]));
To resolve it, swap the arguments, since the DefaultFieldSet(String[] tokens, String[] names) constructor expects the values first and the names second:
DefaultFieldSet domainObjectFieldSet = new DefaultFieldSet(values.toArray(new String[values.size()]), names.toArray(new String[names.size()]));
Write your own FieldSetMapper with a set of prepared delegates inside.
Those delegates are pre-built, one for each different kind of field mapping.
In your mapper, route to the correct delegate based on the indicator field (with a Classifier, for example).
I can't see any other way, but this solution is quite easy and straightforward to maintain; see the sketch below.
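As a rough illustration of that idea, here is a minimal sketch (MyProtoClass is the class from the question; fooMapper and barMapper are hypothetical pre-configured BeanWrapperFieldSetMapper delegates, one per mapping variant):
import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.classify.Classifier;
import org.springframework.validation.BindException;

public class IndicatorDelegatingFieldSetMapper implements FieldSetMapper<MyProtoClass> {

    private final Classifier<FieldSet, FieldSetMapper<MyProtoClass>> classifier;

    public IndicatorDelegatingFieldSetMapper(final FieldSetMapper<MyProtoClass> fooMapper,
                                             final FieldSetMapper<MyProtoClass> barMapper) {
        // Route each line to a delegate based on the indicator column.
        this.classifier = fieldSet ->
                "y".equalsIgnoreCase(fieldSet.readString("indicator")) ? fooMapper : barMapper;
    }

    @Override
    public MyProtoClass mapFieldSet(FieldSet fieldSet) throws BindException {
        return classifier.classify(fieldSet).mapFieldSet(fieldSet);
    }
}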
Processing based on the input format/data can also be done using a custom implementation of ItemProcessor, which either changes values on the same entity (the one populated by the ItemReader) or creates a new output entity.

Persisting dynamic groovy properties with GORM MongoDB

I am currently trying to persist the following class with the GORM MongoDB plugin for Grails:
class Result {

    String url

    Result() {
    }

    static constraints = {
    }

    static mapWith = "mongo"

    static mapping = {
        collection "results"
        database "crawl"
    }
}
The code I'm running to persist this class is the following:
class ResultIntegrationTests {

    @Before
    void setUp() {
    }

    @After
    void tearDown() {
    }

    @Test
    void testSomething() {
        Result r = new Result()
        r.setUrl("http://heise.de")
        r.getMetaClass().setProperty("title", "This is how it ends!")
        println(r.getTitle())
        r.save(flush: true)
    }
}
This is the result in MongoDB:
{ "_id" : NumberLong(1), "url" : "http://heise.de", "version" : 0 }#
Now the url is properly persisted in MongoDB, but the dynamic property is somehow not seen by the mapper, although println(r.getTitle()) works perfectly fine.
I am new to Groovy, so I thought that someone with a little more experience could help me out with this problem. Is there a way to make this dynamically added property visible to the mapping facility? If yes, how can I do that?
Thanks a lot for any advice.
Rather than adding random properties to the metaClass and hoping that Grails will both scan the metaClass looking for your random properties and then persist them, why not just add a Map to your domain class (or a new key/value domain class that Result can hasMany), so you can add arbitrary extra properties to it as you want.
Try this, from the docs:
@Test
void testSomething() {
    Result r = new Result()
    r.url = "http://heise.de"
    r['title'] = "This is how it ends!" // edit: forgot the subscript
    println r['title']
    r.save(flush: true)
}
By the way, instead of using GORM or Hibernate you can always use the Java API / GMongo directly.

Can't insert new entry into deserialized AutoBean Map

When I try to insert a new entry into a deserialized Map instance, I get no exception but the Map is not modified. The following EntryPoint code demonstrates it. Am I doing anything wrong?
public class Test2 implements EntryPoint {

    public interface SomeProxy {
        Map<String, List<Integer>> getStringKeyMap();
        void setStringKeyMap(Map<String, List<Integer>> value);
    }

    public interface BeanFactory extends AutoBeanFactory {
        BeanFactory INSTANCE = GWT.create(BeanFactory.class);
        AutoBean<SomeProxy> someProxy();
    }

    @Override
    public void onModuleLoad() {
        SomeProxy proxy = BeanFactory.INSTANCE.someProxy().as();
        proxy.setStringKeyMap(new HashMap<String, List<Integer>>());
        proxy.getStringKeyMap().put("k1", new ArrayList<Integer>());
        proxy.getStringKeyMap().put("k2", new ArrayList<Integer>());

        String payload = AutoBeanCodex.encode(AutoBeanUtils.getAutoBean(proxy)).toString();
        proxy = AutoBeanCodex.decode(BeanFactory.INSTANCE, SomeProxy.class, payload).as();

        // insert a new entry into the deserialized map
        proxy.getStringKeyMap().put("k3", new ArrayList<Integer>());
        System.out.println(proxy.getStringKeyMap().keySet()); // the keySet is [k1, k2] :-( where is k3?
    }
}
Shouldn't AutoBeanCodex.encode(AutoBeanUtils.getAutoBean(proxy)).toString() be getPayload()?
I'll check the code later, and I don't know if that is causing the issue, but it did stand out as different from my typical approach.
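In case the decoded map is a read-only view over the payload rather than a live collection, a possible workaround (a sketch only, reusing the proxy from the question, and not a confirmed explanation of the behavior) is to copy it, mutate the copy, and set it back:
// Copy the decoded map into a plain HashMap, modify the copy, then
// write it back through the setter so the AutoBean sees the change.
Map<String, List<Integer>> copy =
        new HashMap<String, List<Integer>>(proxy.getStringKeyMap());
copy.put("k3", new ArrayList<Integer>());
proxy.setStringKeyMap(copy);
System.out.println(proxy.getStringKeyMap().keySet()); // hoped-for result: [k1, k2, k3]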
Collection classes such as java.util.Set and java.util.List are tricky because they operate in terms of Object instances. To make collections serializable, you should specify the particular type of objects they are expected to contain through normal type parameters (for example, Map<Foo,Bar> rather than just Map). If you use raw collections or maps you will get bloated code and be vulnerable to denial of service attacks.
Source: http://www.gwtproject.org/doc/latest/DevGuideServerCommunication.html#DevGuideSerializableTypes