Memory Leak in Grails with MongoDB

I've found a strange issue when saving or updating several objects in Grails with MongoDB. Currently I'm using Grails 2.2.3 and MongoDB plugin 1.3.0.
The problem seems to be that instances of MiUsuario are never garbage-collected, not even when I trigger GC manually. In our main application we don't do batch updates, but under load tests (with JMeter, monitoring the JVM with Java VisualVM) this problem fills up the heap and Tomcat stops responding.
I've created a small new application to show the problem.
A simple domain object:
import org.bson.types.ObjectId

class MiUsuario {
    ObjectId id
    String nickName
}
My controller:
import pruebasrendimiento.Prueba

class MiUsuarioController {

    def doLogin(String privateKey, String id) {
        MiUsuario user = MiUsuario.get(id)
        user.nickName = new Random().nextInt().toString()
        user.save(failOnError: true)
        render 'ok'
    }
}
My BuildConfig (Just the dependencies and plugins part):
dependencies {
}

plugins {
    // runtime ":hibernate:$grailsVersion"
    runtime ":jquery:1.8.3"
    runtime ":resources:1.2"

    build ":tomcat:$grailsVersion"

    // runtime ":database-migration:1.3.2"
    // compile ':cache:1.0.1'

    runtime ":mongodb:1.3.0"
}
I've also tried something Burt suggested a long time ago (http://burtbeckwith.com/blog/?p=73), but DomainClassGrailsPlugin.PROPERTY_INSTANCE_MAP.get().clear() doesn't make any difference. The other option mentioned on that page, RequestContextHolder.resetRequestAttributes(), gives me an exception.
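For reference, here is roughly how I wired that call in - a sketch only; the CleanupFilters class and its placement are just my illustration:
import org.codehaus.groovy.grails.plugins.DomainClassGrailsPlugin

class CleanupFilters {
    def filters = {
        all(controller: '*', action: '*') {
            afterView = { Exception e ->
                // clear the per-request domain instance map, as suggested in Burt's post
                DomainClassGrailsPlugin.PROPERTY_INSTANCE_MAP.get().clear()
            }
        }
    }
}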

I had a similar problem and it was solved by upgrading to Grails 2.3.1. Try it.
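For a Grails 2.x project the version bump itself is just a change in application.properties (followed by a grails clean and a dependency refresh); plugin versions in BuildConfig.groovy may need bumping too, so treat this as a sketch:
# application.properties
app.grails.version=2.3.1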

Related

Embedded Kafka in micronaut app not finding beans

I'm using the embedded Kafka server in my test, as described here: https://micronaut-projects.github.io/micronaut-kafka/latest/guide/#kafkaEmbedded. The problem is that when I run the test I get io.micronaut.context.exceptions.BeanContextException: Error processing bean [Definition: org.app.messaging.TestConsumer] method definition [void receive(String msg)]: Failed to inject value for parameter [testService] of method [setTestService] of class: org.app.messaging.TestConsumer. Any ideas how to fix this?
Here's what the test looks like:
void "test run kafka embedded server"() {
given:
ApplicationContext applicationContext = ApplicationContext.run(
Collections.singletonMap(
AbstractKafkaConfiguration.EMBEDDED, true
)
)
when:
AbstractKafkaConsumerConfiguration config = applicationContext.getBean(AbstractKafkaConsumerConfiguration)
Properties props = config.getConfig()
then:
props[ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG] == 9091
when:
KafkaEmbedded kafkaEmbedded = applicationContext.getBean(KafkaEmbedded)
then:
kafkaEmbedded.kafkaServer.isPresent()
kafkaEmbedded.zkPort.isPresent()
cleanup:
applicationContext.close()
}
Placing a test anywhere other than the root package seems to cause multiple "bean definition not found" issues. There's no ComponentScan support in the framework, so the only thing that worked for me was to move the test file to the root package. There are some ideas here: https://github.com/micronaut-projects/micronaut-core/issues/511 if you're experiencing similar issues with a CLI app. However, that didn't work when using the embedded server and embedded Kafka.
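For example, with the classes from the question living under org.app, declaring the spec directly in that root package (instead of org.app.messaging) is what made the beans resolvable for me. A sketch, with a hypothetical file location:
// src/test/groovy/org/app/KafkaEmbeddedSpec.groovy (hypothetical path)
package org.app

import spock.lang.Specification

class KafkaEmbeddedSpec extends Specification {
    // ... the "test run kafka embedded server" feature method from above ...
}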

Grails afterInsert hook issue with Hibernate 5.1 and PostgreSQL

In an existing Grails 3.1.15 application, recently upgraded to Hibernate 5.1, an odd issue with afterInsert (and other) hooks started to appear. After some hours of testing, I could track it down to the Hibernate 5.1 + PostgreSQL combination - the issue is not reproducible with H2. To reproduce, create a simple application consisting of two domain classes - an audit trail and a user - as shown here:
class Audit {
    Date dateCreated
    String auditData
}

class User {
    String name
    String email

    def afterInsert() {
        new Audit(auditData: "User created: $this").save()
    }
}
The code above works OK with Hibernate 4; however, if the application is upgraded to the Hibernate5 plugin + Hibernate 5.1.x (tested with 5.1.0.Final and 5.1.5.Final), the above scenario always leads to a ConcurrentModificationException when you attempt to save a new User instance. You can use a scaffolded controller to reproduce. Note that this only happens with PostgreSQL as the data source - with H2 it still works OK.
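A minimal way to trigger it (a hypothetical controller action, equivalent to what a scaffolded save does):
class UserController {
    def save() {
        // with the hibernate5 plugin + PostgreSQL this throws ConcurrentModificationException;
        // with Hibernate 4, or with H2 as the data source, it saves fine
        new User(name: 'john', email: 'john@example.com').save(flush: true, failOnError: true)
        redirect action: 'index'
    }
}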
Now, according to GORM Docs (see chapter 8.1.3) one should use a new session when attempting to save other objects in beforeUpdate or afterInsert hooks anyway:
def afterInsert() {
    Audit.withNewSession {
        new Audit(auditData: "User created: $this").save()
        /* no exception logged, but Audit instance not preserved */
    }
}
But this doesn't really resolve the issue with PostgreSQL. The exception is gone and the User instance is persisted, but the Audit instance is not saved. Again, this code works OK with H2. To really avoid the issue with PostgreSQL, you have to manually flush the session in afterInsert:
def afterInsert() {
    Audit.withNewSession { session ->
        new Audit(auditData: "User created: $this").save()
        session.flush()
        /* finally no exceptions, both User and Audit saved */
    }
}
Question: is this a bug, or is it expected behaviour? I find it a bit suspicious that a manual flush is required for the Audit instance to be persisted - and even more so that it works without a flush on H2 and only seems to affect PostgreSQL. I couldn't really find any reports of this - any pointers are appreciated!
For the sake of completeness, I tested with the following JDBC driver versions for PostgreSQL:
runtime 'org.postgresql:postgresql:9.3-1101-jdbc41'
runtime 'org.postgresql:postgresql:9.4.1208.jre7'
runtime 'org.postgresql:postgresql:42.0.0'
And for the upgrade to Hibernate 5.1, the following dependencies were used:
classpath "org.grails.plugins:hibernate5:5.0.13"
...
compile "org.grails.plugins:hibernate5:5.0.13"
compile "org.hibernate:hibernate-core:5.1.5.Final"
compile "org.hibernate:hibernate-ehcache:5.1.5.Final"

Importing domain classes from GORM-standalone module into Grails

I have two pieces in my puzzle:
1) a non-Grails project named core-module using standalone GORM:
dependencies {
    compile 'org.grails:grails-datastore-gorm-mongodb:6.0.4.RELEASE'
    compile 'org.grails:grails-validation:3.2.3'
}
and domain classes like:
import grails.gorm.annotation.Entity

@Entity
class Module {
    String id
    String tags
}
GORM is initialized with:
Map config = new Yaml().load(this.class.getResource('/application.yml').text)
new MongoDatastore(config, Module, UserAccount)
and the domain classes are working as they would in a Grails app.
2) a Grails 3.2.3 app:
dependencies {
    // default grails dependencies
    compile project(':core-module')
    compile 'org.grails.plugins:mongodb:6.0.4'
    compile 'org.grails.plugins:spring-security-core:3.1.1'
    // etc
}
GORM is initialized like this:
def config = config.grails.mongodb
log.info "using $config"
new MongoDatastore( config, Module, UserAccount )
and it prints the following to the log file:
g.app.init.com.mozaiq.Application - using [host:localhost, port:27017, databaseName:theDB, username:theUN, pooled:true, mongoOptions:[connectionsPerHost:100, autoConnectRetry:true, connectTimeout:3000]]
The problem is that the property grailsApplication.domainClasses is empty, and even though Module.collection is not null, Module.count() returns 0 despite the collection not being empty.
Also, in my mongo client I see that upon app start a new database named test is created, with an empty collection named after one of my domain classes. If I insert some documents into it, .count() still returns 0 and the CRUD list remains empty.
Grails only scans packages within the application by default, for performance reasons. Override limitScanningToApplication() to return false in your Application class, and define the packages you wish to scan by overriding packageNames() there as well.
Grails will then automatically discover the Mongo GORM classes.
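A sketch of what that looks like in grails-app/init/Application.groovy - the package name com.example.core is just a placeholder for wherever the core-module domain classes actually live:
import grails.boot.GrailsApp
import grails.boot.config.GrailsAutoConfiguration

class Application extends GrailsAutoConfiguration {

    static void main(String[] args) {
        GrailsApp.run(Application, args)
    }

    @Override
    boolean limitScanningToApplication() {
        false
    }

    @Override
    Collection<String> packageNames() {
        // scan the application packages plus the package of the standalone GORM entities
        super.packageNames() + ['com.example.core']
    }
}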

Code coverage on Play! project

I have a Play! project where I would like to add some code coverage information. So far I have tried JaCoCo and scct. The former has the problem that it is based on bytecode, so it seems to give warnings about missing tests for methods that are autogenerated by the Scala compiler, such as copy or canEqual. scct seems a better option, but either way I get many errors during the tests.
Let me stick with scct. I essentially get errors for every test that tries to connect to the database. Many of my tests load some fixtures into an H2 database in memory and then make some assertions. My Global.scala contains
override def onStart(app: Application) {
  SessionFactory.concreteFactory = Some(() => connection)

  def connection() = {
    Session.create(DB.getConnection()(app), new MySQLInnoDBAdapter)
  }
}
while the tests are usually enclosed in a block like
class MySpec extends Specification {
  def app = FakeApplication(additionalConfiguration = inMemoryDatabase())

  "The models" should {
    "be five" in running(app) {
      Fixtures.load()
      MyModels.all.size should be_==(5)
    }
  }
}
The line running(app) allows me to run a test in the context of a working application connected to an in-memory database, at least usually. But when I run code coverage tasks, such as scct coverage:doc, I get a lot of errors related to connecting to the database.
What is even weirder is that there are at least four different errors, such as:
ObjectExistsException: Cache play already exists
SQLException: Attempting to obtain a connection from a pool that has already been shutdown
Configuration error [Cannot connect to database [default]]
No suitable driver found for jdbc:h2:mem:play-test--410454547
Why is it that launching tests in the default configuration can connect to the database, while running under scct (or JaCoCo) fails to initialize the cache and the DB?
specs2 tests run in parallel by default. Play disables parallel execution for the standard unit test configuration, but scct uses a different configuration so it doesn't know not to run in parallel.
Try adding this to your Build.scala:
.settings(parallelExecution in ScctPlugin.ScctTest := false)
Alternatively, you can add sequential to the beginning of your test classes to force all possible run configurations to run sequentially. I've got both in my files still, as I think I had some problems with the Build.scala solution at one point when I was using an early release candidate of Play.
A better option for Scala code coverage is Scoverage which gives statement line coverage.
https://github.com/scoverage/scalac-scoverage-plugin
Add to project/plugins.sbt:
addSbtPlugin("com.sksamuel.scoverage" % "sbt-scoverage" % "1.0.1")
Then run SBT with
sbt clean coverage test
You need to add sequential in the beginning of your Specification.
class MySpec extends Specification {
  sequential

  "MyApp" should {
    //...//
  }
}

Has anyone successfully deployed a GWT app on Heroku?

Heroku recently began supporting Java apps. Looking through the docs, it seems to resemble the Java Servlet Standard. Does anyone know of an instance where a GWT app has been successfully deployed on Heroku? If so, are there any limitations?
Yes, I've got a successful deployment using the getting started with Java instructions here:
http://devcenter.heroku.com/articles/java
I used the Maven-project-with-appassembler-plugin approach, but added the gwt-maven-plugin to compile the GWT app during the build.
When you push to Heroku you can see the GWT compile process running; it runs on only one thread, so it's quite slow, but it works fine.
The embedded Jetty instance is configured to serve static resources at /static from src/main/resources/static, and I copy the compiled GWT app to this location during the build and then reference the .nocache.js as normal.
What else do you want to know?
You've got a choice: either build the JavaScript representation of your GWT app locally, commit it into your Maven project and serve it from your app, or generate it inside Heroku via the gwt-maven-plugin as I mentioned.
The code to serve up files from a static location inside your jar via embedded Jetty is something like this inside a Guice ServletModule:
(See my other answer below for a simpler and less Guice-driven way to do this.)
protected void configureServlets() {
    bind(DefaultServlet.class).in(Singleton.class);

    Map<String, String> initParams = new HashMap<String, String>();
    initParams.put("pathInfoOnly", "true");
    initParams.put("resourceBase", staticResourceBase());

    serve("/static/*").with(DefaultServlet.class, initParams);
}

private String staticResourceBase() {
    try {
        return WebServletModule.class.getResource("/static").toURI().toString();
    }
    catch (URISyntaxException e) {
        e.printStackTrace();
        return "couldn't resolve real path to static/";
    }
}
There are a few other tricks to getting embedded Jetty working with guice-servlet - let me know if this isn't enough.
My first answer to this turned out to have problems when GWT tried to read its serialization policy. In the end I went for a simpler approach that was less Guice-based. I had to step through the Jetty code to understand why setBaseResource() was the way to go - it's not immediately obvious from the Javadoc.
Here's my server class - the one with the main() method that you point Heroku at via your app-assembler plugin as per the Heroku docs.
public class MyServer {

    public static void main(String[] args) throws Exception {
        if (args.length > 0) {
            new MyServer().start(Integer.valueOf(args[0]));
        }
        else {
            new MyServer().start(Integer.valueOf(System.getenv("PORT")));
        }
    }

    public void start(int port) throws Exception {
        Server server = new Server(port);

        ServletContextHandler context = new ServletContextHandler(ServletContextHandler.SESSIONS);
        context.setBaseResource(createResourceForStatics());
        context.setContextPath("/");
        context.addEventListener(new AppConfig());
        context.addFilter(GuiceFilter.class, "/*", null);
        context.addServlet(DefaultServlet.class, "/");

        server.setHandler(context);
        server.start();
        server.join();
    }

    private Resource createResourceForStatics() throws MalformedURLException, IOException {
        String staticDir = getClass().getClassLoader().getResource("static/").toExternalForm();
        Resource staticResource = Resource.newResource(staticDir);
        return staticResource;
    }
}
AppConfig.java is a GuiceServletContextListener.
You then put your static resources under src/main/resources/static/.
In theory, one should be able to run GWT using the embedded versions of Jetty or Tomcat, and bootstrap the server in main as described in the Heroku Java docs.