I am trying out Scala code that uses the KCL library to read a Kinesis stream. I keep getting the CloudWatchException below and would like to know why:
16:16:06.629 [aws-akka-http-akka.actor.default-dispatcher-20] DEBUG software.amazon.awssdk.request - Received error response: 400
16:16:06.638 [cw-metrics-publisher] WARN software.amazon.kinesis.metrics.CloudWatchMetricsPublisher - Could not publish 16 datums to CloudWatch
software.amazon.awssdk.services.cloudwatch.model.CloudWatchException: When Content-Type:application/x-www-form-urlencoded, URL cannot include query-string parameters (after '?'): '/?Action=PutMetricData&Version=2010-08-01&Namespace=......
Any idea what's causing this? Or, as I suspect, is it perhaps a bug in the Kinesis library?
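In case a workaround is needed while the root cause is tracked down: a hedged sketch, assuming KCL 2.x's ConfigsBuilder/Scheduler API, that disables CloudWatch metrics publishing entirely so the failing PutMetricData calls stop. The stream, application and worker names are placeholders, as is recordProcessorFactory.

import software.amazon.awssdk.services.cloudwatch.CloudWatchAsyncClient
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient
import software.amazon.awssdk.services.kinesis.KinesisAsyncClient
import software.amazon.kinesis.common.ConfigsBuilder
import software.amazon.kinesis.coordinator.Scheduler
import software.amazon.kinesis.metrics.MetricsLevel
import software.amazon.kinesis.processor.ShardRecordProcessorFactory

val kinesisClient = KinesisAsyncClient.create()
val dynamoClient = DynamoDbAsyncClient.create()
val cloudWatchClient = CloudWatchAsyncClient.create()

def buildScheduler(recordProcessorFactory: ShardRecordProcessorFactory): Scheduler = {
  val configsBuilder = new ConfigsBuilder("my-stream", "my-app",
    kinesisClient, dynamoClient, cloudWatchClient, "worker-1", recordProcessorFactory)
  new Scheduler(
    configsBuilder.checkpointConfig(),
    configsBuilder.coordinatorConfig(),
    configsBuilder.leaseManagementConfig(),
    configsBuilder.lifecycleConfig(),
    configsBuilder.metricsConfig().metricsLevel(MetricsLevel.NONE), // no CloudWatch publishing
    configsBuilder.processorConfig(),
    configsBuilder.retrievalConfig()
  )
}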
We are trying to run a Spark program using NiFi. This is the basic sample we tried to follow.
We have configured the Apache Livy server at 127.0.0.1:8998.
The ExecuteSparkInteractive processor is used to run the sample Spark code.
val gdpDF = spark.read.json("gdp.json")
val gdpRDD = gdpDF.rdd
gdpRDD.count()
The LivyController is configured for 127.0.0.1, port 8998, and Session Type: spark.
When we run the processor, we get the following error:
Spark Session returned an error, sending the output JSON object as the flow file content to failure (after penalizing)
We just want to output the line count of the JSON file. How can we redirect it to the flow file?
NiFi user log:
2020-04-13 21:50:49,955 INFO [NiFi Web Server-85] org.apache.nifi.web.filter.RequestLogger Attempting request for (anonymous) GET http://localhost:9090/nifi-api/flow/controller/bulletins (source ip: 127.0.0.1)
NiFi app.log:
ERROR [Timer-Driven Process Thread-3] o.a.n.p.livy.ExecuteSparkInteractive ExecuteSparkInteractive[id=9a338053-0173-1000-fbe9-e613558ad33b] Spark Session returned an error, sending the output JSON object as the flow file content to failure (after penalizing)
I have seen several people struggling with this example. I recommend following this article from the Cloudera Community (especially note part 2):
https://community.cloudera.com/t5/Community-Articles/HDF-3-1-Executing-Apache-Spark-via-ExecuteSparkInteractive/ta-p/247772
The key points I would check:
Does your Spark installation work in general?
Does your Livy server work in general?
Is the Spark sample code itself correct? (see the sketch below)
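On the last point, a hedged tweak to the sample, assuming ExecuteSparkInteractive forwards the Livy statement output as the flow file content: print the count as a JSON string yourself, so the output already is the JSON you want.

// A hedged variant of the sample code: emit the count as a JSON string so the
// statement output (and hence the flow file content) is the JSON itself.
// "gdp.json" is assumed to be readable by the user Livy runs Spark as.
val gdpDF = spark.read.json("gdp.json")
val count = gdpDF.count()
println(s"""{"count": $count}""")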
I'm working on a REST API that uses Akka. We inherited it from a previous team, and none of us had experience with Akka before this.
Akka is used both to process the data the API returns and to act as the HTTP server.
Recently, when the API was under load, we started getting failures like this:
),HttpProtocol(HTTP/1.1)), Response: HttpResponse(500 Internal Server Error,List(),HttpEntity.Strict(text/plain; charset=UTF-8,
Error Code: 500
Type: Internal Server Error
Stack Trace:
akka.stream.StreamTcpException: The connection actor has terminated. Stopping now.
),HttpProtocol(HTTP/1.1)), Time: 6430 ms
I have no idea where in the code the above error is happening, or how to handle it appropriately when it occurs.
Can anyone suggest how to trace this down further, or how to handle and recover from these kinds of failures?
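One possible starting point, assuming the service is built on akka-http's routing DSL (a sketch, not your team's actual code; withTcpErrorHandling and routes are placeholder names):

// Wrapping the routes in an ExceptionHandler turns the opaque 500 into a
// logged, explicit 503 whenever the stream's TCP connection actor dies.
import akka.http.scaladsl.model.StatusCodes
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.{ExceptionHandler, Route}
import akka.stream.StreamTcpException

val streamTcpHandler: ExceptionHandler = ExceptionHandler {
  case e: StreamTcpException =>
    extractLog { log =>
      log.error(e, "Connection actor terminated while serving the request")
      complete(StatusCodes.ServiceUnavailable -> "Temporary connection failure, please retry")
    }
}

def withTcpErrorHandling(routes: Route): Route =
  handleExceptions(streamTcpHandler)(routes)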
Is it possible to let the server send messages to all connected clients without waiting for any action from them? Let me explain :-) I've been reading the docs/examples and I have found nothing that satisfies my needs: the flow is always the same. A client connects (e.g. a GET call to a REST API), the connection is suspended, and until a new API call is received (e.g. a POST call) the server simply waits (or at least this is what I have understood). My use case is pretty different: I want the server to send some "notifications" once new data becomes available. This would be my use case (pretty simplified):
1. Client A connects to the server
2. The connection is suspended since no new data is available at the moment
3. The server gets notified that new data is available from an external source and broadcasts it to client A
4. Go to step 2
What I have achieved so far is getting the connection successfully established. The next step is to solve this server-push issue. I must say this technology is completely new to me, so it is possible I have misunderstood how something works. If that's the case, let me know!
This is my stack:
Spring 3.2.0 RELEASE
Jersey 1.8
Atmosphere Jersey 1.0.13
Tomcat 7.0.40
Thank you all in advance!
UPDATE: After following this, I get the following warning, which I have no idea how to get rid of:
2013-06-04 09:40:36,284 WARN [org.atmosphere.cpr.AtmosphereFramework] - Failed using comet support: org.atmosphere.container.Tomcat7AsyncSupportWithWebSocket, error: Tomcat failed to detect this is a Comet application because context.xml is missing or the Http11NioProtocol Connector is not enabled.
If that's not the case, you can also remove META-INF/context.xml and WEB-INF/lib/atmosphere-compat-tomcat.jar Is the Nio or Apr Connector enabled?
2013-06-04 09:40:36,285 WARN [org.atmosphere.cpr.AtmosphereFramework] - Using org.atmosphere.container.Tomcat7BIOSupportWithWebSocket
I followed the app structure commented here, so this should not be a problem. I have noticed that changing the transport to "websocket" instead of "long-polling" shows no errors. The server finally sends data, though :)
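For reference, the first warning points at the Tomcat connector: Atmosphere's Tomcat7AsyncSupportWithWebSocket needs the NIO (or APR) connector enabled. A hedged sketch of the relevant entry in Tomcat 7's conf/server.xml (port and timeout values are placeholders):

<!-- Enable the NIO connector so Atmosphere can detect async/WebSocket support. -->
<Connector port="8080"
           protocol="org.apache.coyote.http11.Http11NioProtocol"
           connectionTimeout="20000"
           redirectPort="8443" />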
I followed your link and modified the code a little.
When you are at step 3 ("The server gets notified that new data is available from an external source and broadcasts it to client A"), you have to write a line like this:
BroadcasterFactory.getDefault().lookup("/*").broadcast(response);
At first I used the TextMessage received from my ActiveMQ queue, but I got the error below, so I used a Jackson-serializable class as the response object instead and everything worked fine.
SEVERE: A message body writer for Java class org.apache.activemq.command.ActiveMQTextMessage, and Java type class org.apache.activemq.command.ActiveMQTextMessage, and MIME media type application/json was not found
jun 03, 2014 11:32:21 AM com.sun.jersey.spi.container.ContainerResponse write
SEVERE: The registered message body writers compatible with the MIME media type are:
application/json (JSONJAXBElementProvider, JSONArrayProvider, JSONObjectProvider, JSONRootElementProvider, JSONListElementProvider, ...)
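A minimal Scala sketch of that idea against the Atmosphere 1.x API; the Notification class and the onExternalMessage hook are hypothetical names. The point is to broadcast a bean the JSON providers can serialize, not the raw ActiveMQTextMessage.

import javax.xml.bind.annotation.XmlRootElement
import org.atmosphere.cpr.BroadcasterFactory
import scala.beans.BeanProperty

@XmlRootElement
class Notification(@BeanProperty var text: String) {
  def this() = this("") // no-arg constructor required by the JAXB/JSON providers
}

// Called when the external source (e.g. an ActiveMQ listener) delivers data:
def onExternalMessage(body: String): Unit =
  BroadcasterFactory.getDefault().lookup("/*").broadcast(new Notification(body))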
I was trying to implement triggers in Cassandra.
I have been trying to use the available cassandra-triggers project: https://github.com/hmsonline/cassandra-triggers
I ported it to the latest version (1.2.3), following the GettingStarted instructions.
When I set the value for the trigger, inserted data to fire the log, and checked the logs, the following is the error I received:
java.lang.AssertionError
at org.apache.cassandra.thrift.ThriftSessionManager.currentSession(ThriftSessionManager.java:51)
at org.apache.cassandra.thrift.CassandraServer.state(CassandraServer.java:88)
at org.apache.cassandra.thrift.CassandraServer.validateLogin(CassandraServer.java:881)
at org.apache.cassandra.thrift.CassandraServer.set_keyspace(CassandraServer.java:1492)
at com.hmsonline.cassandra.triggers.dao.CassandraStore.getConnection(CassandraStore.java:42)
at com.hmsonline.cassandra.triggers.dao.ConfigurationStore.getConfiguration(ConfigurationStore.java:76)
at com.hmsonline.cassandra.triggers.dao.ConfigurationStore.isCommitLogEnabled(ConfigurationStore.java:44)
at com.hmsonline.cassandra.triggers.TriggerTask.run(TriggerTask.java:47)
at java.lang.Thread.run(Thread.java:636)
But in our tests it works fine on an older version (e.g. 1.1.2).
So is this a configuration problem, or has the Thrift API implementation changed?
Thanks.
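A hedged guess at a workaround, assuming Cassandra 1.2's ThriftSessionManager: currentSession() asserts that the calling thread has registered a client socket, which the background trigger thread never does. Registering a dummy address on that thread before opening the connection may avoid the assertion (registerTriggerThreadSession is a hypothetical helper).

import java.net.InetSocketAddress
import org.apache.cassandra.thrift.ThriftSessionManager

// Run once on the trigger thread before CassandraStore opens its connection.
def registerTriggerThreadSession(): Unit =
  ThriftSessionManager.instance.setCurrentSocket(new InetSocketAddress("127.0.0.1", 0))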
From time to time I get the following warning in the logfile of my Wicket application:
04.10.2012 14:52:08,525 WARN [org.apache.wicket.core.request.mapper.AbstractBookmarkableMapper]
Unknown listener interface 'd allow_url_include=On '
What does that mean and how do I fix it? I tried Google, but I could only find results for the PHP configuration allow_url_include.
I'm using Wicket 6.0.0.
Most likely an automated tool is trying to exploit some PHP application. Wicket can't handle such a request and prints the warning. Check the access log for the HTTP requests that hit your server at this timestamp to see which one caused the warning.
It's safe to ignore this warning in this case.