The Grails audit-logging plugin is not working with MongoDB, even though the documentation mentions that it works with MongoDB:
Compatibility issues
Users of Grails 1.2.x and below should use version 0.5.3 of this plugin. Users of Grails 1.3.x and above should use version 0.5.5.3 of this plugin. If you use Grails >= 2.3, we recommend using 1.0.0 or above.
Starting with version 1.0.0, this plugin is ORM mapper agnostic, so you can use it with the ORM mapper of your choice (Hibernate3, Hibernate4, MongoDB, etc.).
I have the following plugin configuration in BuildConfig.groovy:
compile ":mongodb:3.0.3"
compile ":audit-logging:1.0.5"
NOTE: I am not using any Hibernate plugin.
The error I get on starting the application is:
Configuring Spring Security Core ...
... finished configuring Spring Security Core
2015-08-03 20:30:48,774 +0530 ERROR GrailsContextLoaderListener:213 - Error initializing the application: Cannot get property 'datastores' on null object
java.lang.NullPointerException: Cannot get property 'datastores' on null object
at org.codehaus.groovy.runtime.NullObject.getProperty(NullObject.java:57)
at org.codehaus.groovy.runtime.InvokerHelper.getProperty(InvokerHelper.java:168)
at org.codehaus.groovy.runtime.callsite.NullCallSite.getProperty(NullCallSite.java:44)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGetProperty(AbstractCallSite.java:227)
at AuditLoggingGrailsPlugin$_closure1.doCall(AuditLoggingGrailsPlugin.groovy:106)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springsource.loaded.ri.ReflectiveInterceptor.jlrMethodInvoke(ReflectiveInterceptor.java:1270)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207)
at groovy.lang.ExpandoMetaClass.invokeMethod(ExpandoMetaClass.java:1110)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1016)
at groovy.lang.Closure.call(Closure.java:423)
at AuditLoggingGrailsPlugin$_closure1.call(AuditLoggingGrailsPlugin.groovy)
at org.codehaus.groovy.grails.plugins.DefaultGrailsPlugin.doWithApplicationContext(DefaultGrailsPlugin.java:488)
at org.codehaus.groovy.grails.plugins.AbstractGrailsPluginManager.doPostProcessing(AbstractGrailsPluginManager.java:176)
at org.codehaus.groovy.grails.commons.spring.GrailsRuntimeConfigurator.performPostProcessing(GrailsRuntimeConfigurator.java:240)
at org.codehaus.groovy.grails.commons.spring.GrailsRuntimeConfigurator.configure(GrailsRuntimeConfigurator.java:176)
at org.codehaus.groovy.grails.commons.spring.GrailsRuntimeConfigurator.configure(GrailsRuntimeConfigurator.java:127)
at org.codehaus.groovy.grails.web.context.GrailsConfigUtils.configureWebApplicationContext(GrailsConfigUtils.java:126)
at org.codehaus.groovy.grails.web.context.GrailsContextLoaderListener.initWebApplicationContext(GrailsContextLoaderListener.java:109)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:106)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4992)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5490)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1575)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1565)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Disconnected from the target VM, address: '127.0.0.1:35642', transport: 'socket'
I also found a similar question, which points to this link:
http://stackoverflow.com/questions/23470095/grails-audit-logging-plugin-for-mongodb-is-not-working
Please try the latest 1.0.6-SNAPSHOT version. See issue https://github.com/robertoschwald/grails-audit-logging-plugin/issues/91
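If you want to try the snapshot, here is a minimal BuildConfig.groovy sketch; the mavenRepo URL is an assumption, so point it at wherever the 1.0.6-SNAPSHOT artifact is actually published:

grails.project.dependency.resolution = {
    repositories {
        grailsCentral()
        mavenCentral()
        // assumed snapshot repository; adjust to the actual location
        mavenRepo "https://oss.sonatype.org/content/repositories/snapshots/"
    }
    plugins {
        compile ":mongodb:3.0.3"
        compile ":audit-logging:1.0.6-SNAPSHOT"
    }
}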
Related
I'm using a spark-shell instance to test the pulling of data from a client's kafka source. To launch the instance I am using the command spark-shell --jars spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar (all jars are present in the working dir).
However, when I run the command val df = spark.read.format("kafka")..........., after a few seconds it crashes with the following:
java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/StreamingWriteSupportProvider
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:344)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:533)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:304)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.StreamingWriteSupportProvider
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 79 more
HOWEVER, if I change the order of the jars in the spark-shell command to spark-shell --jars kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar, spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, it instead crashes with:
java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArrayDeserializer
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<init>(KafkaSourceProvider.scala:376)
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<clinit>(KafkaSourceProvider.scala)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.validateBatchOptions(KafkaSourceProvider.scala:330)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.createRelation(KafkaSourceProvider.scala:113)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:309)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 55 more
I am developing behind a very strict proxy managed by our client and am unable to use --packages instead, and I am at a bit of a loss here. Am I unable to load all 3 dependencies at the launch of the shell? Am I missing another step somewhere?
In the Structured Streaming + Kafka Integration Guide it says:
For experimenting on spark-shell, you need to add this above library and its dependencies too when invoking spark-shell.
The library you are using seems to be customized and not publicly available in the Maven Central repository, which means I cannot look into its dependencies.
However, looking at the latest stable version, 2.4.5, its dependency according to Maven Central is kafka-clients version 2.0.0.
You are also trying to mix multiple Scala versions (2.11 and 2.12) across the different libraries.
Please use the same Scala version for all libraries; see below for how to load them into spark-shell:
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.5,org.apache.kafka:kafka_2.11:2.4.1,org.apache.kafka:kafka-clients:2.4.1
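If the proxy blocks --packages, the same approach should work with --jars and locally downloaded files, as long as the list is comma-separated with no spaces (with spaces, the shell passes each jar as a separate argument) and all artifacts are built for the same Scala version. A sketch, assuming the three jars above have been downloaded into the working dir:

spark-shell --jars spark-sql-kafka-0-10_2.11-2.4.5.jar,kafka_2.11-2.4.1.jar,kafka-clients-2.4.1.jar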
One occasionally disruptive issue is dealing with dependency conflicts in cases where a user application and Spark itself both depend on the same library. This comes up relatively rarely, but when it does, it can be vexing for users. Typically, this will manifest itself when a NoSuchMethodError, a ClassNotFoundException, or some other JVM exception related to class loading is thrown during the execution of a Spark job.
There are two solutions to this problem. The first is to modify your application to depend on the same version of the third-party library that Spark does. The second is to modify the packaging of your application using a procedure that is often called “shading.” The Maven build tool supports shading through advanced configuration of the plug-in shown in Example 7-5 (in fact, the shading capability is why the plugin is named maven-shade-plugin). Shading allows you to make a second copy of the conflicting package under a different namespace and rewrites your application’s code to use the renamed version. This somewhat brute-force technique is quite effective at resolving runtime dependency conflicts. For specific instructions on how to shade dependencies, see the documentation for your build tool.
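For reference, a minimal maven-shade-plugin relocation sketch; the pattern and shadedPattern package names are illustrative, not taken from this question:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- copies the conflicting package under a new namespace and
               rewrites your application's references to it -->
          <relocation>
            <pattern>org.apache.kafka</pattern>
            <shadedPattern>myapp.shaded.org.apache.kafka</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>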
I would first check the Scala version of the spark-shell, because this can be a Scala version issue:
scala> util.Properties.versionString
res3: String = version 2.11.8
If that is not it, then check which Spark version you are using and which third-party library versions you are using as dependencies, because there may well be one that is newer or older than your Spark version supports.
I hope this helps.
When I try to access the Services, Validate, and Administration links at the URL http://localhost:82/SOAPDemo/, where my Axis2 web application is deployed, I get the following error:
Servlet.init() for servlet AxisServlet threw exception
On the back end, while Apache Tomcat 7 is starting in Eclipse, it shows the following warning:
[WARN] Unable to instantiate deployer org.apache.axis2.deployment.ServiceDeployer;
I solved this by adding axis2-jaxws.jar to the classpath.
You can integrate it with Maven: http://mvnrepository.com/artifact/org.apache.axis2/axis2-jaxws
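For example, a dependency entry along these lines (the version is illustrative; pick the one matching your Axis2 installation):

<dependency>
  <groupId>org.apache.axis2</groupId>
  <artifactId>axis2-jaxws</artifactId>
  <version>1.7.9</version>
</dependency>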
I got the same issue when using Axis2 1.7.9. First I tried to follow
"I solved this by adding axis2-jaxws.jar to the classpath. You can integrate it with Maven: http://mvnrepository.com/artifact/org.apache.axis2/axis2-jaxws", the solution by Pierre-Yves Le Dévéhat,
but it didn't work. Then I tried
"I probably hit the same issue, here is the call stack:" ->
"I fixed it by installing the old Axis2 1.6.4 instead of 1.7.1 and creating a new project", the solution by user6140506, edited by TeWu,
and then the error was fixed.
Axis2 1.6.4 is working for me...
I probably hit the same issue, here is the call stack:
[WARN] Unable to instantiate deployer org.apache.axis2.deployment.ServiceDeployer; see debug logs for more details
avr. 28, 2016 6:27:14 PM org.apache.catalina.core.ApplicationContext log
GRAVE: StandardWrapper.Throwable
java.lang.NoClassDefFoundError: org/apache/ws/commons/schema/resolver/URIResolver
at org.apache.axis2.deployment.ModuleDeployer.deploy(ModuleDeployer.java:128)
at org.apache.axis2.deployment.repository.util.DeploymentFileData.deploy(DeploymentFileData.java:144)
at org.apache.axis2.deployment.DeploymentEngine.doDeploy(DeploymentEngine.java:585)
at org.apache.axis2.deployment.RepositoryListener.init(RepositoryListener.java:264)
at org.apache.axis2.deployment.RepositoryListener.init2(RepositoryListener.java:66)
at org.apache.axis2.deployment.RepositoryListener.<init>(RepositoryListener.java:61)
at org.apache.axis2.deployment.DeploymentEngine.loadRepository(DeploymentEngine.java:152)
at org.apache.axis2.deployment.WarBasedAxisConfigurator.getAxisConfiguration(WarBasedAxisConfigurator.java:233)
at org.apache.axis2.context.ConfigurationContextFactory.createConfigurationContext(ConfigurationContextFactory.java:64)
at org.apache.axis2.transport.http.AxisServlet.initConfigContext(AxisServlet.java:620)
at org.apache.axis2.transport.http.AxisServlet.init(AxisServlet.java:471)
at org.apache.axis2.webapp.AxisAdminServlet.init(AxisAdminServlet.java:60)
at org.apache.catalina.core.StandardWrapper.initServlet(StandardWrapper.java:1238)
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1151)
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:1038)
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4996)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5285)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1408)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1398)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.ClassNotFoundException: org.apache.ws.commons.schema.resolver.URIResolver
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1305)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1139)
... 24 more
I fixed it by installing the old Axis2 1.6.4 instead of 1.7.1 and creating a new project.
I want to use DataNucleus JDO version 3.2.8 with App Engine to avoid the RDBMS string-to-bigint problem (see https://stackoverflow.com/questions/21588107/datanucleus-jdo-map-string-to-mysql-type-bigint-in-app-engine).
I downloaded the DataNucleus App Engine plugin from http://www.datanucleus.org/products/accessplatform/datastores/appengine.html and created a folder 'v3' in [AppEngine SDK]/lib/opt/tools/datanucleus and [AppEngine SDK]/lib/opt/user/datanucleus, where I put the plugin, datanucleus-core, datanucleus-rdbms, and JDO API jars in version 3.2.8, as well as a "jdo-api-3.0.1.jar", which was also in the v2 folder.
I also switched from v2 to v3 in the project properties, and the project's WEB-INF/lib/ contains the new jars.
When I try to enhance classes, DataNucleus Enhancer 3.2.8 throws an unexpected exception with the following log:
java.lang.RuntimeException: Unexpected exception
at com.google.appengine.tools.enhancer.Enhancer.execute(Enhancer.java:76)
at com.google.appengine.tools.enhancer.Enhance.<init>(Enhance.java:71)
at com.google.appengine.tools.enhancer.Enhance.main(Enhance.java:51)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.google.appengine.tools.enhancer.Enhancer.execute(Enhancer.java:74)
... 2 more
Caused by: java.lang.NoSuchFieldError: updateLock
at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.getMetaDataForClassInternal(JDOMetaDataManager.java:440)
at org.datanucleus.metadata.MetaDataManager.getMetaDataForClass(MetaDataManager.java:1488)
at org.datanucleus.metadata.MetaDataManager.loadClasses(MetaDataManager.java:545)
at org.datanucleus.enhancer.DataNucleusEnhancer.getFileMetadataForInput(DataNucleusEnhancer.java:737)
at org.datanucleus.enhancer.DataNucleusEnhancer.enhance(DataNucleusEnhancer.java:513)
at org.datanucleus.enhancer.DataNucleusEnhancer.main(DataNucleusEnhancer.java:1281)
... 7 more
What am I missing to make it work? Thank you very much for your help.
The problem is due to you having inconsistent/incompatible versions of the DataNucleus jars on the CLASSPATH. The message says that the field "updateLock" is not present in the MetaDataManager. Hence you have some old version of datanucleus-core.jar present on the CLASSPATH somewhere, either in your project or in the config of this "GAE Eclipse Plugin".
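A quick way to hunt for stray copies is to list every datanucleus-core jar the build could pick up; the paths below are illustrative, following the SDK layout described in the question:

# check both the SDK's datanucleus folders and the project's lib dir
find "[AppEngine SDK]/lib" -name "datanucleus-core*.jar"
find war/WEB-INF/lib -name "datanucleus-core*.jar"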
After updating Sonar from version 3.5.1 to 3.7, local analysis in Eclipse no longer works. An upgrade from 3.7.0 to 3.7.2 didn't change the situation. The error message in Eclipse Indigo (Sonar Java Analyser and Sonar m2e Connector version 3.2.0.20130627-1142-RELEASE) is:
INFO: Sonar Server 3.7.2
11:34:08.423 INFO - Load batch settings
11:34:08.619 INFO - User cache: C:\Users\xxxx\.sonar\cache
11:34:08.627 INFO - Install plugins
11:34:08.661 INFO - Exclude plugins: devcockpit, jira, pdfreport, views, report, scmactivity
11:34:11.218 INFO - Dry run
Exception in thread "main" org.sonar.runner.impl.RunnerException: Unable to execute Sonar
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:79)
at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:57)
at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
at org.sonar.runner.impl.BatchLauncherMain.execute(BatchLauncherMain.java:41)
at org.sonar.runner.impl.BatchLauncherMain.main(BatchLauncherMain.java:59)
Caused by: org.sonar.api.utils.HttpDownloader$HttpException: Fail to download [http://sonar-server:9000/batch_bootstrap/db?project=com.project:xxx.yyy]. Response code: 500
at org.sonar.api.utils.HttpDownloader$BaseHttpDownloader$HttpInputSupplier.getInput(HttpDownloader.java:279)
at org.sonar.api.utils.HttpDownloader$BaseHttpDownloader$HttpInputSupplier.getInput(HttpDownloader.java:235)
at com.google.common.io.ByteStreams.copy(ByteStreams.java:116)
at com.google.common.io.Files.copy(Files.java:231)
at org.sonar.batch.bootstrap.ServerClient.download(ServerClient.java:69)
at org.sonar.batch.bootstrap.DryRunDatabase.downloadDatabase(DryRunDatabase.java:88)
at org.sonar.batch.bootstrap.DryRunDatabase.start(DryRunDatabase.java:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:110)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.start(ReflectionLifecycleStrategy.java:89)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.start(AbstractInjectionFactory.java:84)
at org.picocontainer.behaviors.AbstractBehavior.start(AbstractBehavior.java:169)
at org.picocontainer.behaviors.Stored$RealComponentLifecycle.start(Stored.java:132)
at org.picocontainer.behaviors.Stored.start(Stored.java:110)
at org.picocontainer.DefaultPicoContainer.potentiallyStartAdapter(DefaultPicoContainer.java:1015)
at org.picocontainer.DefaultPicoContainer.startAdapters(DefaultPicoContainer.java:1008)
at org.picocontainer.DefaultPicoContainer.start(DefaultPicoContainer.java:766)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:91)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.bootstrapper.Batch.startBatch(Batch.java:92)
at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:74)
at org.sonar.runner.batch.IsolatedLauncher.execute(IsolatedLauncher.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:75)
... 6 more
In the sonar.log file on our Sonar server I found the following stack trace:
ERROR o.s.c.p.DbTemplate Fail to copy table rules
org.h2.jdbc.JdbcBatchUpdateException:
Unique index or primary key violation: "RULES_PLUGIN_KEY_AND_NAME ON PUBLIC.RULES(PLUGIN_RULE_KEY, PLUGIN_NAME)"; SQL statement:
INSERT INTO rules(ID,PLUGIN_RULE_KEY,PLUGIN_NAME,DESCRIPTION,PRIORITY,CARDINALITY,PARENT_ID,PLUGIN_CONFIG_KEY,NAME,STATUS,LANGUAGE,CREATED_AT,UPDATED_AT) VALUES(?,?,?,?,?,?,?,?,?,?,?,?,?) [23505-172]
at org.h2.jdbc.JdbcPreparedStatement.executeBatch(JdbcPreparedStatement.java:1167) ~[h2-1.3.172.jar:1.3.172]
at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:297) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:297) ~[commons-dbcp-1.4.jar:1.4]
at org.sonar.core.persistence.DbTemplate.copyTable(DbTemplate.java:85) [sonar-core-3.7.2.jar:na]
at org.sonar.core.persistence.DbTemplate.copyTable(DbTemplate.java:46) [sonar-core-3.7.2.jar:na]
at org.sonar.core.persistence.DryRunDatabaseFactory.copy(DryRunDatabaseFactory.java:83) [sonar-core-3.7.2.jar:na]
at org.sonar.core.persistence.DryRunDatabaseFactory.createNewDatabaseForDryRun(DryRunDatabaseFactory.java:60) [sonar-core-3.7.2.jar:na]
at org.sonar.core.dryrun.DryRunCache.generateNewDB(DryRunCache.java:121) [sonar-core-3.7.2.jar:na]
at org.sonar.core.dryrun.DryRunCache.getDatabaseForDryRun(DryRunCache.java:81) [sonar-core-3.7.2.jar:na]
at org.sonar.server.ui.JRubyFacade.createDatabaseForDryRun(JRubyFacade.java:500) [classes/:na]
I tried to figure out what the problem is by looking at the Sonar source code, but without deeper knowledge of what Sonar is trying to do I am not able to work out how to solve it. Does anyone have a hint?
I had this issue, and it prevents analysis from Eclipse (because Sonar creates an H2 copy of the database for the dry run).
The issue was that I had the same user login twice in the database; once I deleted one of them, it worked.
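To check whether you are affected, a query along these lines against the Sonar database should reveal duplicate logins (the users/login names follow the SonarQube 3.x schema; verify against your own database before deleting anything):

SELECT login, COUNT(*) AS cnt
FROM users
GROUP BY login
HAVING COUNT(*) > 1;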
I am using Grails 2.1.4 with Maven integration.
I created a POM file for the Grails project. I am running a goal like this from Eclipse:
mvn -Dgrails.env=test package
I am getting an error like this:
Fatal error forking Grails JVM: java.lang.reflect.InvocationTargetException
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.grails.launcher.GrailsLauncher.launch(GrailsLauncher.java:150)
at org.grails.maven.plugin.tools.ForkedGrailsRuntime.main(ForkedGrailsRuntime.java:168)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.grails.launcher.GrailsLauncher.launch(GrailsLauncher.java:144)
... 1 more
Caused by: java.lang.IllegalStateException: User input is not enabled, cannot obtain input stream
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:102)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:54)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:182)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:194)
at gant.Gant$_dispatch_closure5.doCall(Gant.groovy:391)
The error that you're seeing may happen once in a while, especially in Grails 2.1.x.
Run the command (mvn -Dgrails.env=test package) a couple of times and you'll notice that your problem is intermittent; it cannot be reproduced every time.
Setting the fork option to false in your pom.xml seems to 'fix' it in most cases. It's probably a Grails bug in the 2.1.x releases.
I haven't seen the issue in newer Grails versions.
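A sketch of what that could look like in the pom.xml; the fork parameter name follows the answer above, and the plugin version should match your Grails version, so confirm both against the grails-maven-plugin documentation:

<plugin>
  <groupId>org.grails</groupId>
  <artifactId>grails-maven-plugin</artifactId>
  <version>2.1.4</version>
  <configuration>
    <!-- run Grails in the Maven JVM instead of forking a new one -->
    <fork>false</fork>
  </configuration>
</plugin>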
I just encountered the same error after upgrading Grails from 2.2.1 to 2.2.4.
The issue was that the plugins directory was not cleared out and was clashing with the new plugins coming in.
The solution is to delete all of the plugins in the project's root/plugins directory and then run any Grails or Maven command to reinstall the new set of plugins.
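A sketch of that cleanup, run from the project root (the package goal is just the one from the question above; any Grails or Maven command that resolves plugins should do):

# remove the stale plugins left over from the old Grails version
rm -rf plugins/
# the next build reinstalls the new set of plugins
mvn -Dgrails.env=test package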