Grails 3 MongoDB: Method on class [] was used outside of a Grails application

I am using the following:
Grails Version: 3.0.1
Groovy Version: 2.4.3
JVM Version: 1.8.0_05
mongodb: 3.0.3
I have two domain objects that look like this:
class PhoneNumber {
    String country
    String numberString

    static constraints = {
        country nullable: false, size: 2..2
        numberString nullable: false, blank: false, size: 1..16
    }
}
and
class Contact {
    String name

    static hasMany = [phoneNumber: PhoneNumber]
    static embedded = ['phoneNumber']

    static constraints = { }
}
I have a controller that looks like this:
class ContactController extends RestfulController {
    static responseFormats = ['json', 'xml']

    ContactController() { super(Contact) }

    @Transactional
    def save(Contact contact) {
        println contact
        response.status = 201
        def result = [:]
        result.id = 1
        render result as JSON
    }
}
When I POST to the controller via:
curl -XPOST "http://localhost:8080/contact" -d "@contact.json"
I get a response of {"id":1}. However if I add the following line to my Contact and PhoneNumber domain objects:
static mapWith = 'mongo'
I get the following error:
ERROR org.grails.web.errors.GrailsExceptionResolver - IllegalStateException occurred when processing request: [POST] /contact - parameters:
{"id":null,"name":"Full Name","phoneNumber":[{"country":"ca","numberString":"18095551212"},{"country":"ca","numberString":"16135551212"}]}:
Method on class [xxx.Contact] was used outside of a Grails application. If running in the context of a test using the mocking API or bootstrap Grails correctly.. Stacktrace follows:
java.lang.IllegalStateException: Method on class [demo.Contact] was used outside of a Grails application. If running in the context of a test using the mocking API or bootstrap Grails correctly.
at grails.transaction.GrailsTransactionTemplate$2.doInTransaction(GrailsTransactionTemplate.groovy:93) ~[grails-core-3.0.1.jar:3.0.1]
at grails.transaction.GrailsTransactionTemplate.execute(GrailsTransactionTemplate.groovy:90) ~[grails-core-3.0.1.jar:3.0.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[na:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[na:1.8.0_05]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_05]
What else needs to be done to get mongodb domain objects marshalled on a POST?

Explicitly define the id field as an ObjectId when using MongoDB.
domain/com/example/Book.groovy
package com.example

import org.bson.types.ObjectId

class Book {
    ObjectId id
    String title
    String author

    static mapWith = "mongo"
}
BSON ObjectIds are not simple long numbers; they are made up of four parts, including a creation timestamp. When converted to a String (for example, book.id as String), the ID will be 24 hexadecimal characters long and look something like: "556a7299aa2437211f8e4e73"
See:
https://docs.mongodb.com/manual/reference/method/ObjectId
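For intuition, the leading 8 hex characters of that 24-character string encode the document's creation time in seconds since the Unix epoch. A plain-Java sketch that decodes it without the MongoDB driver (the example ID is the hypothetical one shown above):

```java
import java.time.Instant;

public class ObjectIdTimestamp {
    // Decode the creation time embedded in a 24-character ObjectId hex string.
    static Instant creationTime(String objectIdHex) {
        // The first 4 bytes (8 hex characters) are a big-endian Unix timestamp in seconds.
        long seconds = Long.parseLong(objectIdHex.substring(0, 8), 16);
        return Instant.ofEpochSecond(seconds);
    }

    public static void main(String[] args) {
        String id = "556a7299aa2437211f8e4e73";
        System.out.println(creationTime(id)); // an Instant in late May 2015
    }
}
```

This is also why sorting by ObjectId roughly sorts documents by creation time.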

Related

How to use "Kafka Streams Binder" with "Functional Style" and DI?

https://cloud.spring.io/spring-cloud-static/spring-cloud-stream-binder-kafka/3.0.0.M3/reference/html/spring-cloud-stream-binder-kafka.html#_programming_model shows an example where the input topic can be set using the property spring.cloud.stream.bindings.process_in.destination.
Now I want to use dependency injection, e.g.
@Bean
public java.util.function.Consumer<KStream<Object, String>> process(JavaMailSender mailSender) { ... }
When starting the application (based on Spring Boot) the property spring.cloud.stream.bindings.process_in.destination is ignored, and instead the input topic input is subscribed.
EDIT: Here is the Kotlin code (without imports)
Mailer.kt:
@Configuration
class Mailer {
    @Bean
    fun sendMail(/* mailSender: JavaMailSender */) = Consumer<KStream<Any, Mail>> { input ->
        input.foreach { _, mail -> println("mail = $mail") }
    }
}
Mail.kt:
data class Mail(var from: String = "", var to: String = "", var subject: String = "", var body: String = "")
Application.kt:
@SpringBootApplication
class Application

fun main(args: Array<String>) {
    runApplication<Application>(*args)
}
application.yml:
spring.cloud.stream:
  bindings.sendMail_in.destination: mail
  kafka.binder.configuration.listeners: PLAINTEXT://localhost:9092
There were a few issues in the binder that prevented beans provided to a function/consumer bean from being autowired properly. The latest snapshot solves those problems, though. Please make sure that you are using the latest snapshot (3.0.0.BUILD-SNAPSHOT). Here is a sample application that works with the same scenario that you provided.
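Stripped of the Spring wiring, the pattern is just a factory method whose parameters are injected dependencies and whose return value is the function the binder binds to the configured destination. A plain-Java sketch of that shape, with a hypothetical Mailer interface standing in for JavaMailSender:

```java
import java.util.function.Consumer;

public class ConsumerBeanSketch {
    // Hypothetical stand-in for an injected dependency such as JavaMailSender.
    interface Mailer {
        void send(String message);
    }

    // The "bean method": the container would call this, supplying the dependency,
    // and bind the returned Consumer to the configured input destination.
    static Consumer<String> sendMail(Mailer mailer) {
        return payload -> mailer.send("mail = " + payload);
    }

    public static void main(String[] args) {
        // Simulate injection by hand: collect sent mail in a StringBuilder.
        StringBuilder outbox = new StringBuilder();
        Consumer<String> process = sendMail(outbox::append);
        process.accept("hello");
        System.out.println(outbox); // prints "mail = hello"
    }
}
```

The binder derives the binding name from the bean method's name, which is why the property key must match the function name (`sendMail_in` here, not `process_in`).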

Neo4j 3.0.3 Stored procedures in Scala

Is there any sample Scala code available for creating stored procedures in Neo4j 3.0.3?
I have been trying to create one simple Scala-based stored procedure. Below is the error message I get when I copy my Scala jar file to the neo4j plugins directory and start the Neo4j server:
=================
Caused by: org.neo4j.kernel.lifecycle.LifecycleException: Component 'org.neo4j.kernel.impl.proc.Procedures#1ac0223' was successfully initialized, but failed to start. Please see attached cause exception.
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:444)
at org.neo4j.kernel.lifecycle.LifeSupport.start(LifeSupport.java:107)
at org.neo4j.kernel.impl.factory.GraphDatabaseFacadeFactory.newFacade(GraphDatabaseFacadeFactory.java:140)
... 10 more
Caused by: org.neo4j.kernel.api.exceptions.ProcedureException: Unable to find a usable public no-argument constructor in the class `neoscala`. Please add a valid, public constructor, recompile the class and try again.
=================
The scala class that I have used is :
package neoproc

import org.neo4j.graphdb.GraphDatabaseService
import org.neo4j.procedure.Procedure
import javax.ws.rs.core.{Context, Response}

class neoscala(@Context db: GraphDatabaseService) {
  @Procedure
  def alice(): String = {
    String.valueOf(db.execute("MATCH (n:User) return n"))
  }
}
Your Scala class declares a constructor with a GraphDatabaseService argument, and the exception tells you that Neo4j only wants a no-argument constructor.
It's documented in both
the user documentation:
Only static fields and @Context-annotated fields are allowed in Procedure classes.
the Javadoc:
The procedure method itself can contain arbitrary Java code - but in order to work with the underlying graph, it must have access to the graph API. This is done by declaring fields in the procedure class, and annotating them with the Context annotation. Fields declared this way are automatically injected with the requested resource. This is how procedures gain access to APIs to do work with.
All fields in the class containing the procedure declaration must either be static; or it must be public, non-final and annotated with Context.
Apparently it's not possible to create a class with a public field in Scala, so you'll have to create a parent Java class with the public field and extend it with your Scala class:
// ProcedureAdapter.java
public abstract class ScalaProcedureAdapter {
    @Context
    public GraphDatabaseService db;
}
// neoscala.scala
class neoscala extends ScalaProcedureAdapter {
// ...
}
Here is the solution for this:
We will create a class in Scala:
class FullTextIndex extends JavaHelper {
  @Procedure("example.search")
  @PerformsWrites
  def search(@Name("label") label: String,
             @Name("query") query: String): Stream[SearchHit] = {
    val index = indexName(label)
    val nodes: Stream[Node] = db.index.forNodes(index).query(query).stream
    val newFunction: java.util.function.Function[Node, SearchHit] = (node: Node) => new SearchHit(node)
    nodes.map(newFunction)
  }

  private def indexName(label: String): String = {
    "label-" + label
  }
}
Neo4j procedures always return their results as a Stream, which is a Java 8 feature, so we also use a Java class to define the result type and the public injected fields.
We will create a Java class for the result:
public class JavaHelper {
    @Context
    public GraphDatabaseService db;
    @Context
    public Log log;

    public static class SearchHit {
        // your result code here
    }
}
You can refer to the Knoldus blog post on Neo4j user-defined procedures for creating and storing a Neo4j procedure with Scala. There you will also find sample code in a GitHub repository.
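The nodes.map(newFunction) step above is ordinary java.util.stream machinery; a self-contained sketch with placeholder Node and SearchHit types (hypothetical, standing in for the Neo4j API classes):

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class StreamMappingSketch {
    // Placeholder types standing in for org.neo4j.graphdb.Node and the procedure's result class.
    record Node(long id) {}
    record SearchHit(long nodeId) {}

    static Stream<SearchHit> toHits(Stream<Node> nodes) {
        // Same shape as nodes.map(newFunction) in the Scala procedure:
        // wrap each graph node in a result object, lazily, as the stream is consumed.
        Function<Node, SearchHit> toHit = node -> new SearchHit(node.id());
        return nodes.map(toHit);
    }

    public static void main(String[] args) {
        List<SearchHit> hits = toHits(Stream.of(new Node(1), new Node(2)))
                .collect(Collectors.toList());
        System.out.println(hits);
    }
}
```

Returning the Stream unconsumed matters: Neo4j pulls results from it lazily, so the procedure should not collect it itself.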

Spring data elastic search findAll with OrderBy

I am using spring data's elastic search module, but I am having trouble building a query. It is a very easy query, though.
My document looks as follows:
@Document(indexName = "triber-sensor", type = "event")
public class EventDocument implements Event {

    @Id
    private String id;

    @Field(type = FieldType.String)
    private EventMode eventMode;

    @Field(type = FieldType.String)
    private EventSubject eventSubject;

    @Field(type = FieldType.String)
    private String eventId;

    @Field(type = FieldType.Date)
    private Date creationDate;
}
And the spring data repository looks like:
public interface EventJpaRepository extends ElasticsearchRepository<EventDocument, String> {
    List<EventDocument> findAllOrderByCreationDateDesc(Pageable pageable);
}
So I am trying to get all events ordered by creationDate, with the newest event first. However, when I run the code I get an exception (also in STS):
Caused by: org.springframework.data.mapping.PropertyReferenceException: No property desc found for type Date! Traversed path: EventDocument.creationDate.
So it seems that it is not picking up the 'OrderBy' part? However, a query with a findBy clause (e.g. findByCreationDateOrderByCreationDateDesc) seems to be okay. A findAll without ordering also works.
Does this mean that the elastic search module of spring data doesn't allow findAll with ordering?
Try adding By to the method name:
findAllByOrderByCreationDateDesc
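The Desc suffix in the derived method name asks for newest-first ordering on creationDate. For intuition only, the in-memory equivalent of what the repository query requests (a sketch, not Spring Data itself):

```java
import java.util.Comparator;
import java.util.Date;
import java.util.List;
import java.util.stream.Collectors;

public class OrderByCreationDateDesc {
    // Minimal stand-in for the EventDocument from the question.
    record Event(String id, Date creationDate) {}

    static List<Event> newestFirst(List<Event> events) {
        // Same ordering as ...OrderByCreationDateDesc: descending by creationDate.
        return events.stream()
                .sorted(Comparator.comparing(Event::creationDate).reversed())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Event> sorted = newestFirst(List.of(
                new Event("older", new Date(1_000L)),
                new Event("newer", new Date(2_000L))));
        System.out.println(sorted.get(0).id()); // prints "newer"
    }
}
```

In the repository itself the sort is executed by Elasticsearch, not in memory; the method name is just a declarative way to request it.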

Project reference elements Morphia

Is it possible to project the reference elements of attributes in Morphia?
I have an structure similar to this:
@Entity
public class Event {
    @Embedded
    private List<Edition> editions;
}

public class Edition {
    @Reference
    private List<Lecture> lectures;
}
When I try to project some attributes of the reference element, like this:
final MorphiaIterator<Event, Event> aggregate = this.basicDAO.getDs().
        <Event, Event>createAggregation(Event.class).
        match(query).
        project(
                Projection.projection("editions.address"),
                Projection.projection("editions.ageLimit"),
                Projection.projection("editions.bannerURL"),
                Projection.projection("editions.description"),
                Projection.projection("editions.endsIn"),
                Projection.projection("editions.establishment"),
                Projection.projection("editions.iconURL"),
                Projection.projection("editions.id"),
                Projection.projection("editions.observation"),
                Projection.projection("editions.startsIn"),
                Projection.projection("editions.lectures.endsIn"),
                Projection.projection("editions.lectures.name"),
                Projection.projection("editions.lectures.startsIn"),
                Projection.projection("editions.position"),
                Projection.projection("name")).
        aggregate(Event.class);
return aggregate.next();
I receive the following message:
java.lang.RuntimeException: java.lang.ClassCastException: com.mongodb.BasicDBObject cannot be cast to com.mongodb.DBRef
Actually, this is just a question, based on the fact that I could execute another query to retrieve just the attributes that I want and then complete my object.

How to support embedded maps (with custom value types) in MongoDB GORM?

I would like to have an embedded document referred to by a map (as in class A below). The environment is Grails + GORM + MongoDB.
Is that possible, and if yes, how?
class A { // fails with: IllegalArgumentException occurred when processing request: can't serialize class X in line 234 of org.bson.BasicBSONEncoder
    static mapWith = "mongo"
    Map<String, X> map = new HashMap<String, X>()
}

class B { // works
    static mapWith = "mongo"
    List<X> list = new ArrayList<X>()
}

class C { // works with primitive type values
    static mapWith = "mongo"
    Map<String, String> map = new HashMap<String, String>()
}

class X {
    String data

    public X(String data) {
        this.data = data
    }
}
The embedding works perfectly, as Art Hanzel advised.
However, your problem comes from the fact that you try to use the map's generic type parameters as a sort of constraint:

Map<String, X>

The problem is that Grails doesn't cope well with this syntax, because Groovy does not enforce generic type parameters at runtime.
However, the MongoDB plugin offers a very powerful feature that lets you define custom types as domain class properties: see here.
In your case you could have:

class A {
    static mapWith = "mongo"
    MyClass map = new MyClass()
}

Then, in your src/java directory for example, you could implement:

class MyClass extends HashMap<String, X> { }

Then, of course, you have to define a custom AbstractMappingAwareCustomTypeMarshaller to specify how to read and write the property in the DB.
An additional step could also be to add a custom validator to class A to check the validity of the data...
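The wrapper-class idea in plain Java terms: give the map a concrete named type, so a custom type marshaller can be registered against that class rather than the erased Map<String, X>. A sketch only; the GORM marshaller itself is still required, and X here mirrors the question's class:

```java
import java.util.HashMap;

public class CustomMapTypeSketch {
    // Mirrors the value type from the question.
    static class X {
        final String data;
        X(String data) { this.data = data; }
    }

    // A concrete named map type. At runtime this class carries its identity,
    // whereas Map<String, X> erases to a plain Map, which is why the
    // marshaller needs a named class to target.
    static class XMap extends HashMap<String, X> {}

    public static void main(String[] args) {
        XMap map = new XMap();
        map.put("home", new X("some data"));
        System.out.println(map.get("home").data); // prints "some data"
    }
}
```

Behaviorally XMap is still just a HashMap; the subclass exists purely to give the mapping layer a distinct type to hook into.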
The MongoDB Grails plugin documentation describes how to make embedded documents:
class Foo {
    Address address
    List otherAddresses

    static embedded = ['address', 'otherAddresses']
}
Off the top of my head, you should be able to access these via the object graph. I don't see any reason why you shouldn't.
myFoo.address.myAddressProperty...