How do I add the G1GC parameter in JBoss EAP 7.1.0?

We are using JBoss EAP 7.1.0 and have written a Puppet module for it. We have separated the Hiera data from the code. The max and min heap sizes are set as below:
wildfly::java_xmx: '2048m'
wildfly::java_xms: '2048m'
Can anyone tell me how I can add -XX:+UseG1GC (the G1 garbage collector) and -XX:+HeapDumpOnOutOfMemoryError?

wildfly::java_opts: '-XX:+UseG1GC -XX:+HeapDumpOnOutOfMemoryError'
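Putting it together, a minimal sketch of the Hiera data with the extra flags added (this assumes your version of the wildfly module exposes a wildfly::java_opts parameter, as in the line above; check the module's manifests if the key is named differently):
# heap sizes plus extra JVM flags passed through java_opts
wildfly::java_xmx: '2048m'
wildfly::java_xms: '2048m'
wildfly::java_opts: '-XX:+UseG1GC -XX:+HeapDumpOnOutOfMemoryError'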

Related

In Eclipse: How to set default VM arguments for a new server runtime?

I quite often have to update the server runtime in Eclipse's Servers View. For this, I import the standalone JBoss installation into the Servers View via:
New -> Server -> Red Hat JBoss EAP 7.x ...
Now, this server does not have enough heap space configured, so I have to open the Launch Configuration where the default VM arguments are defined. Then I change the heap from -Xmx512m to something like -Xmx4g.
I would have expected this configuration to be in the JBoss installation somewhere, but I have changed every occurrence of -Xmx512m in every file to -Xmx4g without luck.
How can I change the default value to be -Xmx4g without changing it manually every time I have to import a new server runtime?
In JBoss, the JVM configuration is placed under "JBoss directory/bin/standalone.conf" (if you are running in standalone mode).
Inside standalone.conf, you can find the JVM startup config:
#
# Specify options to pass to the Java VM.
#
if [ "x$JAVA_OPTS" = "x" ]; then
   JAVA_OPTS="-Xms64m -Xmx512m -XX:MetaspaceSize=96M -XX:MaxMetaspaceSize=256m -Djava.net.preferIPv4Stack=true"
   JAVA_OPTS="$JAVA_OPTS -Djboss.modules.system.pkgs=$JBOSS_MODULES_SYSTEM_PKGS -Djava.awt.headless=true"
else
   echo "JAVA_OPTS already set in environment; overriding default settings with values: $JAVA_OPTS"
fi
You need to modify the JAVA_OPTS="-Xms64m -Xmx512m ..." line to match your requirements.
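For the -Xmx4g case from the question, a sketch of what the block ends up looking like (only the max heap is changed here; the remaining defaults are kept as they ship):
if [ "x$JAVA_OPTS" = "x" ]; then
   # Max heap bumped from the 512m default to 4g; other defaults unchanged
   JAVA_OPTS="-Xms64m -Xmx4g -XX:MetaspaceSize=96M -XX:MaxMetaspaceSize=256m -Djava.net.preferIPv4Stack=true"
   JAVA_OPTS="$JAVA_OPTS -Djboss.modules.system.pkgs=$JBOSS_MODULES_SYSTEM_PKGS -Djava.awt.headless=true"
fi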
Hope this suits your need. Thanks.

Failed to start Neo4j service

I am using neo4j enterprise 3.0.3 for Windows. Following the operations manual 3.0, I installed the neo4j service with bin\neo4j install-service, but I can't start it with bin\neo4j start. It said:
Invoke-Neo4j : Failed to start service 'Neo4j Graph Database - neo4j (neo4j)'.
And I can't start the neo4j service from Windows Services either. Has anyone encountered this case before?
I had the same problem: I am using neo4j community 3.1.2 for Windows and installed the service with the neo4j.bat file without any problems. Then I wanted to start the service with neo4j.bat and got the same error as you.
I found a solution that worked for me. My neo4j files were in a folder whose path contained spaces (C:\Program Files\Neo4j), so I moved the folder one level up (C:\Neo4j).
After that I could start the service without problems.
Maybe this solution helps.
I am running neo4j on Windows, and in my case the crux of the issue was an incompatibility between the installed Java version (32-bit) and the OS version (64-bit). The biggest clue that led me to this was the following set of lines in the neo4j-service.2018-08-03 log file:
[2018-08-03 14:55:42] [info] [ 1432] Starting service...
[2018-08-03 14:55:42] [error] [ 1432] %1 is not a valid Win32 application.
[2018-08-03 14:55:42] [error] [ 1432] Failed creating java C:\JavaNew\bin\server\jvm.dll
[2018-08-03 14:55:42] [error] [ 1432] %1 is not a valid Win32 application.
[2018-08-03 14:55:42] [error] [ 1432] ServiceStart returned 1
There are a fair number of potential issues, and I have attempted to compile them all here.
Windows services cannot deal with service paths that contain spaces, especially if there is another folder whose name matches the part of the path before the space.
For example, C:\Program Files... will have issues if C:\Program\Something... also exists.
To work around this, I put Neo4j in the root folder C:\Neo4j.
Get-Java.ps1 (under the ..\bin\Neo4j-Management folder) looks in the environment for 'JAVA_HOME' (usually found in *nix environments). If it does not find it there, it keeps looking in the registry, and finally gives up.
To deal with this, I simply set the environment variable. For good measure, I uninstalled Java and re-installed it in the root folder C:\JavaNew.
In retrospect, this step was probably not part of the problem and can be ignored, but I am leaving it here for completeness.
Invoke-Neo4j.ps1 (also under the ..\bin\Neo4j-Management folder) has code that determines whether the OS is 32-bit or 64-bit. Based on this it decides whether it should run prunsrv-i386.exe (32-bit) or prunsrv-amd64.exe (64-bit).
This has to match the Java version installed.
Upon running java -XshowSettings:all and inspecting the sun.arch.data.model value (32, in my case), I realized that my OS is 64-bit but the installed Java is 32-bit.
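For reference, the kind of check I mean (the findstr filter is just a convenience; -XshowSettings prints to stderr, hence the redirect):
java -XshowSettings:all 2>&1 | findstr "sun.arch.data.model os.arch"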
To deal with this, I put in some code (very kludgey!). I am sure there are much better ways to get to the same outcome, but this is what I used:
switch ( (Get-WMIObject -Class Win32_Processor | Select-Object -First 1).AddressWidth ) {
    32 { $PrunSrvName = 'prunsrv-i386.exe' }   # 4 bytes = 32-bit
    #64 { $PrunSrvName = 'prunsrv-amd64.exe' } # 8 bytes = 64-bit -- COMMENTED as a workaround!!!
    64 { $PrunSrvName = 'prunsrv-i386.exe' }   # 8 bytes = 64-bit OS, but forced to the 32-bit binary to match the 32-bit JVM
}
Now uninstall the Neo4j service, install it again, and start it.
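A sketch of that last step with the 3.x Windows commands (the same install/start commands used in the question, plus uninstall-service):
bin\neo4j uninstall-service
bin\neo4j install-service
bin\neo4j start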
Hope this works for you.
neo4j console
Posting for the latest versions (> 4.x):
I had the same issue using neo4j start; neo4j console is the command I was looking for. It runs the server in the foreground, so the startup errors show up directly in the terminal.
I had the same problem: after Neo4j had worked for a few weeks it stopped working (without any change on my part).
I set JAVA_HOME, uninstalled and reinstalled, and now it works.
neo4j-enterprise-3.3.4
I was also having a weird issue: there was no error, but the Neo4j service did not start.
[xx#ss1 bin]$ ./neo4j console
[xx#ss1 bin]$ .
The problem was with the permissions on the Java directory, so I tried:
chmod -R 777 jdk_directory
and the problem was solved.
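If you would rather not open the directory up completely, a more restrictive mode should also be enough (my suggestion, not part of the original fix):
chmod -R 755 jdk_directory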

CXF DOSGI ZOOKEEPER

Good morning,
I'm looking for help please; I'm only a beginner.
I am using cxf-dosgi (from the DOSGi Apache Karaf feature distribution).
I want to make transparent use of services between two remote machines, so I have a Karaf container on each of the two machines.
I tested this example: first with two Karaf containers hosted on the same machine, then I changed the configuration to test with two containers hosted on two different remote machines. And it works great!
So I want to do the same thing to export my web services. I am using Spring DM, so I do this on the server side:
<osgi:service id="osgi-service" ref="myservice" interface="org.apache.camel.Endpoint">
  <osgi:service-properties>
    <entry key="name" value="service"/>
    <entry key="service.exported.interfaces" value="*"/>
  </osgi:service-properties>
</osgi:service>
I did the installation as in the tutorial, with cxf-dosgi version 1.6, but I get this error:
16:25:53,256 | ERROR | pool-21-thread-3 | w.service.RemoteServiceAdminCore 193 | 184 - cxf-dosgi-ri-dsw-cxf - 1.6.0 | failed to create server for interface org.apache.camel.Endpoint
java.lang.NullPointerException
at java.lang.reflect.Array.newArray(Native Method)[:1.7.0_79]
at java.lang.reflect.Array.newInstance(Array.java:70)[:1.7.0_79]
at org.apache.cxf.aegis.type.TypeUtil.getTypeRelatedClass(TypeUtil.java:259)
at org.apache.cxf.aegis.type.AbstractTypeCreator.createTypeForClass(AbstractTypeCreator.java:108)
at org.apache.cxf.aegis.type.AbstractTypeCreator.createType(AbstractTypeCreator.java:402)
at org.apache.cxf.aegis.type.basic.BeanTypeInfo.getType(BeanTypeInfo.java:192)
at org.apache.cxf.aegis.type.basic.BeanType.getDependencies(BeanType.java:547)
at org.apache.cxf.aegis.databinding.AegisDatabinding.addDependencies(AegisDatabinding.java:394)
at org.apache.cxf.aegis.databinding.AegisDatabinding.addDependencies(AegisDatabinding.java:399)
at org.apache.cxf.aegis.databinding.AegisDatabinding.addDependencies(AegisDatabinding.java:399)
at org.apache.cxf.aegis.databinding.AegisDatabinding.initializeMessage(AegisDatabinding.java:371)
at org.apache.cxf.aegis.databinding.AegisDatabinding.initializeOperation(AegisDatabinding.java:283)
at org.apache.cxf.aegis.databinding.AegisDatabinding.initialize(AegisDatabinding.java:242)
at org.apache.cxf.service.factory.AbstractServiceFactoryBean.initializeDataBindings(AbstractServiceFactoryBean.java:86)
at org.apache.cxf.service.factory.ReflectionServiceFactoryBean.buildServiceFromClass(ReflectionServiceFactoryBean.java:490)
at org.apache.cxf.service.factory.ReflectionServiceFactoryBean.initializeServiceModel(ReflectionServiceFactoryBean.java:550)
at org.apache.cxf.service.factory.ReflectionServiceFactoryBean.create(ReflectionServiceFactoryBean.java:265)
at org.apache.cxf.frontend.AbstractWSDLBasedEndpointFactory.createEndpoint(AbstractWSDLBasedEndpointFactory.java:102)
at org.apache.cxf.frontend.ServerFactoryBean.create(ServerFactoryBean.java:159)
at org.apache.cxf.dosgi.dsw.handlers.AbstractPojoConfigurationTypeHandler.createServerFromFactory(AbstractPojoConfigurationTypeHandler.java:191)
at org.apache.cxf.dosgi.dsw.handlers.PojoConfigurationTypeHandler.createServer(PojoConfigurationTypeHandler.java:119)
at org.apache.cxf.dosgi.dsw.service.RemoteServiceAdminCore.exportInterfaces(RemoteServiceAdminCore.java:184)
at org.apache.cxf.dosgi.dsw.service.RemoteServiceAdminCore.exportService(RemoteServiceAdminCore.java:140)
at org.apache.cxf.dosgi.dsw.service.RemoteServiceAdminInstance$1.run(RemoteServiceAdminInstance.java:59)
at org.apache.cxf.dosgi.dsw.service.RemoteServiceAdminInstance$1.run(RemoteServiceAdminInstance.java:57)
at java.security.AccessController.doPrivileged(Native Method)[:1.7.0_79]
at org.apache.cxf.dosgi.dsw.service.RemoteServiceAdminInstance.exportService(RemoteServiceAdminInstance.java:57)[184:cxf-dosgi-ri-dsw-cxf:1.6.0]
at org.apache.cxf.dosgi.dsw.service.RemoteServiceAdminInstance.exportService(RemoteServiceAdminInstance.java:41)[184:cxf-dosgi-ri-dsw-cxf:1.6.0]
at org.apache.cxf.dosgi.topologymanager.exporter.TopologyManagerExport.exportServiceUsingRemoteServiceAdmin(TopologyManagerExport.java:185)[183:cxf-dosgi-ri-topology-manager:1.6.0]
at org.apache.cxf.dosgi.topologymanager.exporter.TopologyManagerExport.doExportService(TopologyManagerExport.java:168)[183:cxf-dosgi-ri-topology-manager:1.6.0]
at org.apache.cxf.dosgi.topologymanager.exporter.TopologyManagerExport$3.run(TopologyManagerExport.java:143)[183:cxf-dosgi-ri-topology-manager:1.6.0]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)[:1.7.0_79]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)[:1.7.0_79]
at java.lang.Thread.run(Thread.java:745)[:1.7.0_79]
Do you have any idea what is wrong?
Ouch... what are you doing with 1.4-SNAPSHOT? First, it is not a release, and second, it is quite old.
Another thing that looks suspicious is service.exported.interfaces=myInterface. It should be a fully qualified Java interface name; to start with, try service.exported.interfaces=* instead.
You should start with my CXF DOSGi tutorial. The code there should work out of the box, and you can then add your changes to the config. That is easier than starting completely from scratch.
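As a minimal sketch of an export declaration along those lines (the interface and bean names are hypothetical; service.exported.configs and org.apache.cxf.ws.address are the standard Remote Services / CXF DOSGi property names as I understand them, so double-check them against the version you run):
<osgi:service id="greeter-service" ref="greeterImpl" interface="com.example.GreeterService">
  <osgi:service-properties>
    <!-- export the application's own, fully qualified interface -->
    <entry key="service.exported.interfaces" value="com.example.GreeterService"/>
    <entry key="service.exported.configs" value="org.apache.cxf.ws"/>
    <entry key="org.apache.cxf.ws.address" value="http://0.0.0.0:9090/greeter"/>
  </osgi:service-properties>
</osgi:service>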

Error running hadoop application in Eclipse on Windows

I'm trying to set up an Eclipse environment for developing and debugging Hadoop. I'm following Tom White's Hadoop: The Definitive Guide, 3rd ed. What I would like to do is get the MaxTemperature app working locally on Windows within Eclipse before moving it to my Hortonworks sandbox VM. The comment on page 158 about using the local job runner seems to be what I want. I don't want to set up a full Hadoop installation on Windows; I'm hoping that with the right config params I can convince it to run as a Java application inside Eclipse.
Windows: 7
Eclipse: Luna
Hadoop: 2.4.0
JDK: 7
When I set the Run configuration for MaxTemperatureDriver (source code on page 157) to
inputfile outputdir foo (a deliberately bogus third parameter)
I get the usage message, so I know I'm running my program with those params.
If I remove the bogus third param, I get:
Exception in thread "main" java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1255)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1251)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1250)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1279)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
at mark.MaxTemperatureDriver.run(MaxTemperatureDriver.java:52)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at mark.MaxTemperatureDriver.main(MaxTemperatureDriver.java:56)
I've tried inserting -conf, but it seems to be ignored; there is no error message if I specify a nonexistent path.
I've tried inserting -fs file:/// -jt local, but it makes no difference.
I've tried inserting -D mapreduce.framework.name=local.
I've tried specifying the input and output with the file: format.
Note: I'm not asking how to configure Eclipse to connect to a remote Hadoop installation. I want the application to run within Eclipse.
Is this possible? Any ideas?
Additional info:
I turned on debugging. I saw:
582 [main] DEBUG org.apache.hadoop.mapreduce.Cluster - Trying ClientProtocolProvider : org.apache.hadoop.mapred.YarnClientProtocolProvider
583 [main] DEBUG org.apache.hadoop.mapreduce.Cluster - Cannot pick org.apache.hadoop.mapred.YarnClientProtocolProvider as the ClientProtocolProvider - returned null protocol
I'm wondering not why YarnClientProtocolProvider failed, but why it didn't try LocalClientProtocolProvider.
New info:
It seems that this is an issue with Hadoop 2.4.0. I recreated my environment with Hadoop 1.2.1, followed the instructions in
http://gerrymcnicol.com/index.php/2014/01/02/hadoop-and-cassandra-part-4-writing-your-first-mapreduce-job/
added the Windows hack from
http://bigdatanerd.wordpress.com/2013/11/14/mapreduce-running-mapreduce-in-windows-file-system-debug-mapreduce-in-eclipse
and it all started working.
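For anyone who wants to stay on Hadoop 2.x, a sketch of how I would expect the local job runner to be forced from a small launcher class (assuming a Tool-based driver like the book's mark.MaxTemperatureDriver; these are the same property names the question already tried on the command line, and the Windows winutils hack from the link above is still needed):
// Force the local job runner and the local filesystem before handing
// the configuration to the Tool-based driver.
import mark.MaxTemperatureDriver;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.util.ToolRunner;

public class LocalRunnerLauncher {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("mapreduce.framework.name", "local"); // pick LocalClientProtocolProvider
        conf.set("fs.defaultFS", "file:///");          // read and write the local filesystem
        int exitCode = ToolRunner.run(conf, new MaxTemperatureDriver(), args);
        System.exit(exitCode);
    }
}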
The following blog will be useful:
Running MapReduce in the Windows filesystem

set CLASSPATH in Jruby 1.6.4

How do I set the class path in JRuby 1.6.4?
I set my class path as:
set CLASSPATH=
%<project_name>%\lib\java\ant-1.6.5.jar;
%<project_name>%\lib\java\ant-antlr-1.6.5.jar;
%<project_name>%\lib\java\ant-junit-1.6.5.jar;
%<project_name>%\lib\java\ant-launcher-1.6.5.jar;
%<project_name>%\lib\java\ant-swing-1.6.5.jar;
%<project_name>%\lib\java\antlr-2.7.6.jar;
%<project_name>%\lib\java\asm-attrs.jar;
%<project_name>%\lib\java\asm.jar;
%<project_name>%\lib\java\auriga-cryptolib.jar;
%<project_name>%\lib\java\bcprov-jdk12-137.jar;
%<project_name>%\lib\java\c3p0-0.9.1.jar;
%<project_name>%\lib\java\cglib-nodep-2.1_3.jar;
%<project_name>%\lib\java\checkstyle-all.jar;
%<project_name>%\lib\java\cleanimports.jar;
%<project_name>%\lib\java\commons-collections-2.1.1.jar;
%<project_name>%\lib\java\commons-logging-1.0.4.jar;
%<project_name>%\lib\java\concurrent-1.3.2.jar;
%<project_name>%\lib\java\dom4j-1.6.1.jar;
%<project_name>%\lib\java\ehcache-1.2.3.jar;
%<project_name>%\lib\java\hibernate-tools.jar;
%<project_name>%\lib\java\hibernate3.jar;
%<project_name>%\lib\java\jaas.jar;
%<project_name>%\lib\java\jacc-1_0-fr.jar;
%<project_name>%\lib\java\javassist.jar;
%<project_name>%\lib\java\jaxen-1.1-beta-7.jar;
%<project_name>%\lib\java\jboss-cache.jar;
%<project_name>%\lib\java\jboss-common.jar;
%<project_name>%\lib\java\jboss-jmx.jar;
%<project_name>%\lib\java\jboss-system.jar;
%<project_name>%\lib\java\trg-dao-hibernate-0.5.0.jar;
%<project_name>%\lib\java\trg-search-0.5.0.jar;
%<project_name>%\lib\java\trg-search-hibernate-0.5.0.jar;
%<project_name>%\lib\java\trilead-ssh2-build213.jar;
%<project_name>%\lib\java\versioncheck.jar;
%<project_name>%\lib\java\xerces-2.6.2.jar;
%<project_name>%\lib\java\xml-apis.jar;
%<project_name>%\lib\java\jasperreports-3.7.6.jar;
%<project_name>%\lib\java\commons-digester-1.7.jar;
%<project_name>%\lib\java\iText-2.1.7.jar;
%<project_name>%\lib\java\jfreechart-1.0.12.jar;
%<project_name>%\lib\java\jcommon-1.0.15.jar;
%<project_name>%\classes\;
But Windows cannot load these classes. I even tried setting the class path in the Windows environment variables, and that works, but I can't put that many entries there, and moreover it's bad practice.
Actually, I'm migrating an app from JRuby 1.4. In 1.4 this CLASSPATH works perfectly.
Any help would be greatly appreciated.
At last I found the answer: you need to put your project's local classes on the class path first, and then load the other classes.
In my case I load this entry first:
%<project_name>%\classes\;
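In other words, a sketch of the reordered CLASSPATH with the classes directory moved to the front (abbreviated here; the rest of the jar list stays exactly as in the question):
set CLASSPATH=%<project_name>%\classes\;%<project_name>%\lib\java\ant-1.6.5.jar;%<project_name>%\lib\java\ant-antlr-1.6.5.jar;...remaining jars as listed above...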