I am using Akka STM, and when my application starts it prints the following to stderr:
Okt 20, 2011 10:17:10 AM org.multiverse.api.GlobalStmInstance <clinit>
Information: Initializing GlobalStmInstance using factoryMethod 'org.multiverse.stms.alpha.AlphaStm.createFast'.
Okt 20, 2011 10:17:10 AM org.multiverse.stms.alpha.AlphaStm <init>
Information: Created a new AlphaStm instance
Okt 20, 2011 10:17:10 AM org.multiverse.api.GlobalStmInstance <clinit>
Information: Successfully initialized GlobalStmInstance using factoryMethod 'org.multiverse.stms.alpha.AlphaStm.createFast'.
How can I disable this logging?
Although I'm sure there is a better solution via configuration files, this is a quick fix that worked for me:
import java.util.logging.{Logger, Level}

object DisableLogging {
  Logger.getLogger("org.multiverse.api.GlobalStmInstance").setLevel(Level.OFF)
  Logger.getLogger("org.multiverse.stms.alpha.AlphaStm").setLevel(Level.OFF)
}
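If other Multiverse classes turn out to log as well, silencing the parent package logger should cover all of them. Here is a minimal Java sketch of that idea (the helper class name is made up; call disable() once at startup):

import java.util.logging.Level;
import java.util.logging.Logger;

public final class DisableMultiverseLogging {
  // Keep a strong reference: java.util.logging holds loggers only weakly,
  // so a configured logger that gets garbage-collected can lose its level.
  private static final Logger MULTIVERSE = Logger.getLogger("org.multiverse");

  public static void disable() {
    MULTIVERSE.setLevel(Level.OFF);
  }
}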
I am trying to consume Kafka messages using Apache Beam in Dataflow. I wrote a simple pipeline using Apache Beam version 2.1.0, shown below:
public static void main(String[] args) {
    DrainOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().as(DrainOptions.class);
    options.setStreaming(true);

    Pipeline p = Pipeline.create(options);

    Map<String, Object> props = new HashMap<>();
    props.put("auto.offset.reset", "latest");
    props.put("group.id", "test-group");

    p.apply(KafkaIO.readBytes()
            .updateConsumerProperties(props)
            .withTopic(options.getTopic())
            .withBootstrapServers(options.getBootstrapServer()))
        .apply(ParDo.of(new GetValue()))
        .apply("ToString", ParDo.of(new ToString()))
        .apply("FixedWindow", Window.<String>into(FixedWindows.of(Duration.standardSeconds(30))))
        .apply(TextIO.write().to(options.getOutput()).withWindowedWrites().withNumShards(1));

    PipelineResult pipelineResult = p.run();
    pipelineResult.waitUntilFinish();
}
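GetValue and ToString are custom DoFns and are not shown here. DrainOptions is a custom options interface; a rough sketch of it (the getter names match the --topic, --bootstrapServer and --output flags passed on the command line; the @Description texts are placeholders):

import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.StreamingOptions;

public interface DrainOptions extends PipelineOptions, StreamingOptions {
    @Description("Kafka topic to read from")
    String getTopic();
    void setTopic(String value);

    @Description("Kafka bootstrap server, host:port")
    String getBootstrapServer();
    void setBootstrapServer(String value);

    @Description("Output location for TextIO.write()")
    String getOutput();
    void setOutput(String value);
}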
When I tried to run it using the Dataflow runner:
mvn compile exec:java -Dexec.mainClass=com.test.beamexample.Drain -Dexec.args="--project=my-project --gcpTempLocation=gs://my_bucket/tmp/drain --streaming=true --stagingLocation=gs://my_bucket/staging/drain --output=gs://my_bucket/output/staging/drainresult --bootstrapServer=kafka-broker:9092 --topic=test --runner=DataflowRunner" -Pdataflow-runner
The pipeline is successfully built and uploaded to the staging location, but before the Dataflow runner runs the pipeline it gets executed locally, so no Dataflow job is created, just as when we use the direct runner:
Nov 14, 2017 2:14:52 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 111 files. Enable logging at DEBUG level to see which files will be staged.
Nov 14, 2017 2:14:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 14, 2017 2:14:53 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 111 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 14, 2017 2:14:59 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 111 files cached, 0 files newly uploaded
Nov 14, 2017 2:15:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding KafkaIO.Read/Read(UnboundedKafkaSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 14, 2017 2:15:00 PM org.apache.kafka.common.config.AbstractConfig logAll
INFO: ConsumerConfig values:
auto.commit.interval.ms = 5000
auto.offset.reset = latest
...
Is there something missing?
I am sending a REST POST request to the FIWARE CEP and expecting the output event in a file, but nothing appears in the file.
REST POST (Producer) -> CEP -> File Consumer
http://194.28.122.118:8080/ProtonOnWebServer/rest/events
{"Name":"TrafficReport", "volume":"9000"}
Catalina.out
Apr 3, 2015 4:54:19 PM com.ibm.hrl.proton.webapp.providers.EventJSONMessageReader readFrom
INFO: started event message body reader
Apr 3, 2015 4:54:19 PM com.ibm.hrl.proton.webapp.providers.EventJSONMessageReader readFrom
INFO: name value: TrafficReport looking for: Name
Apr 3, 2015 4:54:19 PM com.ibm.hrl.proton.webapp.providers.EventJSONMessageReader readFrom
INFO: finished event message body reader
Apr 3, 2015 4:54:19 PM com.ibm.hrl.proton.webapp.resources.EventResource submitNewEvent
INFO: starting submitNewEvent
Apr 3, 2015 4:54:19 PM com.ibm.hrl.proton.router.EventRouter routeTimedObject
INFO: routeTimedObject: forwarding event TrafficReport; Name=TrafficReport; Certainty=0.0; Cost=0.0; EventSource=; OccurrenceTime=null; Annotation=; Duration=0.0; volume=100000; EventId=f4aee2d0-2d4b-4c0c-a24f-ae452896fa75; ExpirationTime=null; Chronon=null; DetectionTime=1428072859603; to consumer...
Apr 3, 2015 4:54:19 PM com.ibm.hrl.proton.webapp.resources.EventResource submitNewEvent
INFO: events sent to proton runtime...
The reason might be that the path you specified as the Consumer's output file does not exist, or that Tomcat has no permission to write to that path or to the file you specified.
Look at the log file (logs/catalina.out) and check whether you see a warning like:
WARNING: initializeAdapters: failed to initialize adapter Output adapter for consumer: DoSAttackTRConsumer, reason: No such file or directory
I would also recommend using an absolute path rather than a relative path for the output file, since the Tomcat "current" directory can differ between operating systems.
You don't need to create the file, but you do need to create the directory and make sure Tomcat has permission to write to that directory (or, if the file already exists, to write to that file).
So here are my recommendations:
Stop Tomcat.
Delete catalina.out.
Start Tomcat again.
In the CEP web UI, change the paths of the Consumers to absolute paths, save the project, and export it to the repository.
Make sure the path you specified for the Consumers exists and that Tomcat has permission to write to that directory (and, if the file already exists, to that file); see the small check after this list.
Change the status of the CEP engine to Stop.
Change the status of the CEP engine to Start.
Send an input event.
Make sure you don't see the warning listed above in catalina.out.
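As a quick way to run that permission check, a small Java snippet like the following (executed as the same user Tomcat runs under; the path is a placeholder for the Consumer's output directory) reports whether the directory exists and is writable:

import java.io.File;

public class CheckConsumerOutputDir {
    public static void main(String[] args) {
        // Placeholder path: use the absolute output directory configured for the Consumer.
        File dir = new File("/opt/cep/output");
        System.out.println("exists: " + dir.exists()
                + ", directory: " + dir.isDirectory()
                + ", writable: " + dir.canWrite());
    }
}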
I've created a simple Sinatra app, but can't get sessions to work when running it as an executable war.
I've verified that it works when run via "jruby -S rackup", but when run with "java -jar myapp.war", I find that the session is reset on each request:
INFO: Winstone Servlet Engine v0.9.10 running: controlPort=disabled
session: {"session_id"=>"75936d3d21367f5c1896e749ba401d7715e41a5fd01317484faa44d80c8afaea", "csrf"=>"60367cb6c5ead39b2669668ed28db3a1", "tracking"=>{"
HTTP_USER_AGENT"=>"9f3d63482f1fb48a317c5c9e2de6196f9cd239cc", "HTTP_ACCEPT_LANGUAGE"=>"66eae971492938c2dcc2fb1ddc8d7ec3196037da"}}
Jul 20, 2014 8:00:20 PM winstone.Logger logInternal
INFO: 0:0:0:0:0:0:0:1 - [20/Jul/2014 20:00:20] "GET / " 200 765 0.1670
session: {"session_id"=>"19d266ffb8ccb29108464961e68fa9e29f1c3b45e0097806b4cbc8db156d71d7", "csrf"=>"5ac12991c2ec8d4acf22180d79c494c2", "tracking"=>{"
HTTP_USER_AGENT"=>"9f3d63482f1fb48a317c5c9e2de6196f9cd239cc", "HTTP_ACCEPT_LANGUAGE"=>"66eae971492938c2dcc2fb1ddc8d7ec3196037da"}, "name"=>"john"}
Jul 20, 2014 8:00:31 PM winstone.Logger logInternal
INFO: 0:0:0:0:0:0:0:1 - [20/Jul/2014 20:00:31] "GET /login/john " 200 9 0.0240
session: {"session_id"=>"60f161941822b4f0fae9085db58fe9ea30e86d56dc16fff2ea5859bb4008c58f", "csrf"=>"7dd3977bef9fca9c7ed9b77fdc774657", "tracking"=>{"
HTTP_USER_AGENT"=>"9f3d63482f1fb48a317c5c9e2de6196f9cd239cc", "HTTP_ACCEPT_LANGUAGE"=>"66eae971492938c2dcc2fb1ddc8d7ec3196037da"}}
Jul 20, 2014 8:00:40 PM winstone.Logger logInternal
Other than enabling sessions, is there any special setup needed to have sessions work when the app is packaged with Warbler and run as an executable war?
Nothing special should be needed; I tried your sample and it worked fine.
It's probably a bug in the jruby-rack version you're using; please try >= 1.1.15.
I would also recommend trying out the Jetty web server (you'll find an option in config/warbler.rb); I'll try to make sure Jetty is the default in a future Warbler version.
I am using STS (Eclipse) and facing a weird issue with Tomcat. It was running fine, and all of a sudden it started giving trouble: first it responded very slowly, and then it did not respond at all in debug mode. It runs fine when started in Run mode, but when debugging it waits for something after:
May 15, 2013 9:03:51 PM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
May 15, 2013 9:03:51 PM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.32
May 15, 2013 9:03:53 PM org.apache.catalina.core.ApplicationContext log
INFO: Initializing Spring root WebApplicationContext
The server is stuck forever, as if it is waiting for something. I tried many things, like deleting the server and setting the whole thing up again, and even using a different Tomcat. I am sure I did not change any setup.
Make sure you don't have any breakpoints set; they can cause the server to pause during startup.
Check whether you have any breakpoints on the declaration line of a method:
public void myMethod(ParamType myParameter) { // breakpoint on this line
    ...
}
Such a breakpoint is marked as [entry] in the Breakpoints view of the Debug perspective, instead of [line: XX] as breakpoints inside a method body are shown.
Disable that breakpoint and try to start the server in debug mode again. This worked for me; I hope it helps someone.
Hi Everyone,
I have installed ATG 10.1.2, along with CRS, Search, and CSC, on my Linux machine. I'm using WebLogic as my application server. However, when I try to run ./cim.sh, I get an error saying that it is unable to find the class weblogic.utils.classloaders.ClassFinder.
I have set my environment variables as follows:
export JAVA_HOME=/home/install/mediaStore/jdk1.6.0_41
export PATH=$JAVA_HOME/bin:/home/install/software/ant/apache-ant-1.8.2/bin:/home/install/Oracle11gR2/install/product/11.2.0/dbhome_1/bin:$PATH
export ANT_HOME=/home/install/software/ant/apache-ant-1.8.2
export PATH=$PATH:$ANT_HOME/bin
export DYNAMO_ROOT=/home/install/mediaStore/ATG/ATG10.1.2
export DYNAMO_HOME=$DYNAMO_ROOT/home
export ATGJRE=$JAVA_HOME/bin/java
export CLASSPATH=/home/install/Oracle11gR2/install/product/11.2.0/dbhome_1/jdbc/lib/ojdbc6.jar:$CLASSPATH
export WEBLOGIC_HOME=/home/install/mediaStore/Weblogic
export WEBLOGIC_SERVER=$WEBLOGIC_HOME/wlserver_12.1
[install#JJPLRHEL01 bin]$ ./cim.sh
The following installed ATG components are being used to launch:
ATGPlatform version 10.1.2 installed at /home/install/mediaStore/ATG/ATG10.1.2
Error Thu Feb 28 15:21:35 IST 2013 1362045095625 / atg.nucleus.NucleusResources->cantResolveComponent : Unable to resolve component /atg/dynamo/service/validation/JavaxValidatorFactory java.lang.NoClassDefFoundError: weblogic/utils/classloaders/ClassFinder
Error Thu Feb 28 15:21:35 IST 2013 1362045095625 / at javax.validation.Validation.byProvider(Validation.java:166)
Error Thu Feb 28 15:21:35 IST 2013 1362045095625 / Caused by: java.lang.ClassNotFoundException: weblogic.utils.classloaders.ClassFinder
Error Thu Feb 28 15:21:35 IST 2013 1362045095625 / at
Any help, views, or guidance would be highly appreciated.
Thanks,
Aazim
I added the following to my classpath:
/home/install/mediaStore/Weblogic/wlserver_12.1/server/lib/wls-api.jar
Also, ATG 10.1.2 does not currently support WebLogic 12.1, so you won't be able to configure CRS with it.