I have a Play 2.4 app with the following logger.xml:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <appender name="stdout" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d %p [%c{0}] - <%m>%n%ex</pattern>
    </encoder>
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
      <level>INFO</level>
    </filter>
  </appender>
  <root level="INFO">
    <appender-ref ref="stdout"/>
  </root>
</configuration>
When I make a request from the local machine
curl http://localhost:9000/
it executes and the access filter logs fine.
import play.api.Logger
import play.api.libs.concurrent.Execution.Implicits.defaultContext // ExecutionContext for .map
import play.api.mvc.{Filter, RequestHeader, Result}
import scala.concurrent.Future

object AccessLoggingFilter extends Filter {
  val logger: Logger = Logger(this.getClass)

  override def apply(nextFilter: (RequestHeader) => Future[Result])(requestHeader: RequestHeader): Future[Result] = {
    val startTime = System.currentTimeMillis
    nextFilter(requestHeader).map { result =>
      val requestTime = System.currentTimeMillis - startTime
      logger.info(s"${requestHeader.method} ${requestHeader.uri} took ${requestTime}ms and returned ${result.header.status}")
      result
    }
  }
}
but via the CNAME:
curl http://mycname.com/
I see the page but no logged entry at all. Also, some routes don't work via the CNAME (I see Play errors, but they aren't logged either).
Is it a logger misconfiguration or something else?
You need a port number in your 2nd curl invocation:
curl http://mycname.com:9000/
Figured out that this was an ops configuration problem: the CNAME led to an old version of the app while I was looking at the new version, so the logging is fine :)
I am coding an application with Akka v2.5.23. The application involves the actors below:
A router actor class named CalculatorRouter
A routee actor class named Calculator
I've configured a PinnedDispatcher when creating the Calculator actors and put log.info in that actor class's receive method. I expected the thread name field in the log file to contain pinned. However, the thread name field is default-dispatcher. I've searched the log file and found that the thread name for every occurrence of this log.info is default-dispatcher. Is there something wrong with my code?
Log file snippet:
09:49:25.116 [server-akka.actor.default-dispatcher-14] INFO handler.Calculator $anonfun$applyOrElse$3 92 - akka://server/user/device/$a/$a Total calc received
The code snippets follow:
import akka.actor.{Actor, ActorLogging, Props, Terminated}
import akka.routing.{ActorRefRoutee, Router, SmallestMailboxRoutingLogic}
import Calculator.Calc

class CalculatorRouter extends Actor with ActorLogging {
  var router = {
    val routees = Vector.fill(5) {
      val r = context.actorOf(Props[Calculator].withDispatcher("calc.my-pinned-dispatcher"))
      context.watch(r)
      ActorRefRoutee(r)
    }
    Router(SmallestMailboxRoutingLogic(), routees)
  }

  def receive = {
    case w: Calc => router.route(w, sender())
    case Terminated(a) =>
      // removeRoutee returns a new Router, so the result must be reassigned
      router = router.removeRoutee(a)
      val r = context.actorOf(Props[Calculator].withDispatcher("calc.my-pinned-dispatcher"))
      context.watch(r)
      router = router.addRoutee(r)
  }
}
The calc.my-pinned-dispatcher is configured as follows:
calc.my-pinned-dispatcher {
  executor = "thread-pool-executor"
  type = PinnedDispatcher
}
Source code of class Calculator:
import akka.actor.{Actor, ActorLogging}
import scala.util.{Failure, Success, Try}
import Calculator.TotalCalc

class Calculator extends Actor with ActorLogging {
  val w = new UdanRemoteCalculateTotalBalanceTime

  def receive = {
    case TotalCalc(fn, ocvFilepath, ratedCapacity, battCount) ⇒
      log.info(s"${self.path} Total calc received")
      Try {
        w.CalculateTotalBalanceTime(1, fn, ocvFilepath, ratedCapacity)
      } match {
        case Success(t) ⇒
          val v = t.getIntData
          // send the result to the original sender, with the parent (the router) set as the sender of the reply
          sender().!(Calculated(v))(context.parent)
        case Failure(e) ⇒ log.error(e.getMessage)
      }
  }
}

object Calculator {
  sealed trait Calc
  final case class TotalCalc(filename: String, ocvFilepath: String, ratedCapacity: String, batteryCount: Int) extends Calc
}
logback.xml
<configuration debug="true">
  <contextListener class="ch.qos.logback.classic.jul.LevelChangePropagator">
    <!-- reset all previous level configurations of all j.u.l. loggers -->
    <resetJUL>true</resetJUL>
  </contextListener>
  <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/var/log/app.log</file>
    <append>true</append>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <!-- daily rollover -->
      <fileNamePattern>/var/log/app.%d{yyyy-MM-dd}.log</fileNamePattern>
      <!-- keep 100 days' worth of history capped at roughly 30GB total size -->
      <maxHistory>100</maxHistory>
      <totalSizeCap>30000MB</totalSizeCap>
    </rollingPolicy>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} %M %L - %msg%n</pattern>
    </encoder>
  </appender>
  <appender name="ASYNCFILE" class="ch.qos.logback.classic.AsyncAppender">
    <appender-ref ref="FILE" />
    <queueSize>500</queueSize>
    <includeCallerData>true</includeCallerData>
  </appender>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} %M %L - %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="application" level="DEBUG"/>
  <root level="INFO">
    <appender-ref ref="ASYNCFILE"/>
  </root>
</configuration>
March 4, 2020 update:
Thanks @anand-sai. After I put akka.loggers-dispatcher = "calc.my-pinned-dispatcher" in the conf file, I got my-pinned-dispatcher-xx as the thread name in every line of the log file. I thought the thread name should indicate the thread on which actor Calculator's receive method is executing, in this case something like my-pinned-dispatcher-xx, since that thread is obtained from a pinned dispatcher per my configuration. It turns out the field indicates the thread obtained by the logger's dispatcher instead. If this is the case, how do I log the thread name for an actor's message handler code?
I think the solution is to add akka.loggers-dispatcher to your application.conf:
calc.my-pinned-dispatcher {
  executor = "thread-pool-executor"
  type = PinnedDispatcher
}
akka.loggers-dispatcher = "calc.my-pinned-dispatcher"
If you search for loggers-dispatcher in the default configuration of Akka (reference.conf), you will find the value "akka.actor.default-dispatcher", and we need to override this setting as shown above.
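For reference, the default being overridden comes from Akka's reference.conf; flattened to a single path it reads:
# default in Akka's reference.conf: the logging actors run on the default dispatcher
akka.loggers-dispatcher = "akka.actor.default-dispatcher"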
EDIT
ActorLogging is asynchronous. When you log using ActorLogging, it sends a message to a logging actor, which by default runs on the default dispatcher. Logback records the thread that called it, which is the logging actor's thread, not your actor's thread. To get at the thread your actor actually ran on, Akka provides a so-called Mapped Diagnostic Context (MDC) that captures the Akka source (the path of the actor in which the logging was performed), the source thread (the thread on which the logging was performed), and more.
As given in the documentation:
Since the logging is done asynchronously the thread in which the
logging was performed is captured in MDC with attribute name
sourceThread.
The path of the actor in which the logging was performed is available
in the MDC with attribute name akkaSource.
The actor system name in which the logging was performed is available
in the MDC with attribute name sourceActorSystem, but that is
typically also included in the akkaSource attribute.
The address of the actor system, containing host and port if the
system is using cluster, is available through akkaAddress.
For typed actors the log event timestamp is taken when the log call
was made but for Akka’s internal logging as well as the classic actor
logging is asynchronous which means that the timestamp of a log entry
is taken from when the underlying logger implementation is called,
which can be surprising at first. If you want to more accurately
output the timestamp for such loggers, use the MDC attribute
akkaTimestamp. Note that the MDC key will not have any value for a
typed actor.
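So, to answer the updated question: assuming akka.loggers = ["akka.event.slf4j.Slf4jLogger"] is configured (Akka's Slf4jLogger is what populates these MDC values), you can print the actor's real thread from the logback pattern with %X. A minimal encoder sketch; only the pattern differs from the logback.xml above:
<encoder>
  <!-- %X{sourceThread} is the thread the actor's receive ran on; %X{akkaSource} is the actor path -->
  <pattern>%d{HH:mm:ss.SSS} [%X{sourceThread}] %X{akkaSource} %-5level %logger{36} - %msg%n</pattern>
</encoder>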
Let me know if it helps!!
import org.slf4j.LoggerFactory

object add {
  private val LOGGER = LoggerFactory.getLogger(this.getClass)

  def addAll(): Unit = {
    LOGGER.info("Start addAll for add Object")
  }
}
In the logging statement I don't want to hard-code the addAll method and the add class. How can I substitute these in the logging statement while taking care of the performance overhead?
You will have to configure your logback file something like below.
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
<level>INFO</level>
</filter>
<encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %class{36}.%M %L - %msg%n</pattern>
</encoder>
</appender>
You can have a look at the Logback layout documentation. Regarding the performance concern: Logback documents the caller-data conversion words (%class, %M, %L) as slow to generate, so use them sparingly on hot code paths.
I am playing around with spray.io and I cannot make the Spray debugging directive logRequestResponse work; I don't see any output in the log.
val route: Route = {
  pathPrefix("city") {
    pathPrefix("v1") {
      path("transaction" / Segment / Segment) { (siteId: String, transactionId: String) =>
        post {
          authenticate(BasicAuth(UserPasswordAuthenticator _, realm = "bd cinema import api")) { user =>
            DebuggingDirectives.logRequestResponse("city-trans", Logging.InfoLevel) {
              val resp = "Hello"
              complete {
                resp
              }
            }
          }
        }
      }
    }
  }
}
Am I missing something here?
Do I need to enable debugging globally somewhere in the Spray configuration? I tried different places and none of them worked as expected.
Check that you have sensible values in your application.conf and logback.xml, as in the sample files in the Spray project.
Pay attention to akka.loglevel = "INFO" in application.conf:
akka {
  log-config-on-start = on
  loglevel = "INFO"
  actor.timeoutsecs = 2
  loggers = ["akka.event.slf4j.Slf4jLogger"]
}
A minimal logback.xml to display logs on stdout:
<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <target>System.out</target>
    <encoder>
      <pattern>[%d{dd/MM/yyyy HH:mm:ss.SSS}] [%level] [%thread] %logger{36} - %msg %n</pattern>
      <!--<pattern>%X{akkaTimestamp} %-5level[%thread] %logger{0} - %msg%n</pattern>-->
    </encoder>
  </appender>
  <!-- <logger name="com.vegatic" level="DEBUG"/> -->
  <root level="DEBUG">
    <appender-ref ref="CONSOLE"/>
  </root>
</configuration>
The usual suspects are logger name attributes that don't match the Scala namespaces, or log levels that aren't verbose enough; both have been commented out in the example above for clarity.
Docs link for LoggingContext
A LoggingAdapter that can always be supplied implicitly. If an implicit ActorSystem is in scope, the created LoggingContext forwards to the log of the system. If an implicit ActorContext is in scope, the created LoggingContext uses the context's ActorRef as a log source. Otherwise, i.e. if neither an ActorSystem nor an ActorContext is implicitly available, the created LoggingContext will forward to NoLogging, i.e. "/dev/null".
Why is my logback logging choking with Akka? If I leave Akka config alone but delete my logback.xml file it works w/o problems with whatever the defaults are. Is there a config error in my logback.xml file?
logback.xml
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="ERROR">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>
application.conf
akka {
  loglevel = "ERROR"
  stdout-loglevel = "ERROR"
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  actor {
    provider = akka.remote.RemoteActorRefProvider
  }
  remote {
    enabled-transports = ["akka.remote.netty.tcp"]
  }
}
Build.scala (clip):
lazy val root = project.in(file("."))
.settings(basicSettings: _*)
.settings(libraryDependencies ++=
dep_compile(
typesafe_config, logback, akka_actor, akka_remote, akka_slf4j) ++
dep_test(scalatest)
)
When something attempts to log an error I get:
error while starting up loggers
akka.ConfigurationException: Logger specified in config can't be loaded [akka.event.slf4j.Slf4jLogger] due to [akka.event.Logging$LoggerInitializationException: Logger log1-Slf4jLogger did not respond with LoggerInitialized, sent instead [TIMEOUT]]
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$1.applyOrElse(Logging.scala:116)
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$1.applyOrElse(Logging.scala:115)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185)
at scala.util.Try$.apply(Try.scala:161)
...
Caused by: akka.event.Logging$LoggerInitializationException: Logger log1-Slf4jLogger did not respond with LoggerInitialized, sent instead [TIMEOUT]
at akka.event.LoggingBus$class.akka$event$LoggingBus$$addLogger(Logging.scala:185)
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$4.apply(Logging.scala:114)
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$4.apply(Logging.scala:113)
at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
at scala.util.Try$.apply(Try.scala:161)
at scala.util.Success.map(Try.scala:206)
at akka.event.LoggingBus$$anonfun$4.apply(Logging.scala:113)
... 36 more
You are using the old logger class in config. Here is how to fix it:
Akka (2.3.0) fails to load Slf4jEventHandler class with java.lang.ClassNotFoundException
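For context, assuming Akka 2.3+, that rename is the difference between the deprecated 2.2-era settings and the current ones:
# Akka 2.2 and earlier (deprecated setting and old logger class)
akka.event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]

# Akka 2.3 and later
akka.loggers = ["akka.event.slf4j.Slf4jLogger"]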
I have a server which is implemented in Scala.
import org.apache.thrift.server.TServer
import org.apache.thrift.server.TThreadPoolServer
import org.apache.thrift.transport.TServerSocket
import org.apache.thrift.transport.TTransportException
import hms.config.demo.AppStoreConfig

object Server {
  def main(args: Array[String]): Unit = {
    start()
  }

  private def start(): Unit = {
    try {
      val serverTransport: TServerSocket = new TServerSocket(7911)
      val processor = new AppStoreConfig.Processor(new AppStoreConfigImpl)
      val server: TServer = new TThreadPoolServer(new TThreadPoolServer.Args(serverTransport).processor(processor))
      println("Starting server on port 7911 ...")
      server.serve()
    } catch {
      case e: TTransportException =>
        e.printStackTrace()
    }
  }
}
I need to get logs from the server, i.e. how it handles requests and responds to clients, like a Tomcat server does. Is there a way to achieve that with a Thrift server?
Thrift depends on slf4j, as you can see under "depends on" here. Here is the documentation on how to configure slf4j: http://slf4j.org/faq.html. Basically you start by creating a config file src/main/resources/logback.xml and putting something like this into it:
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>/var/log/mysuperapp/supername.log</file>
    <append>true</append>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="org.apache.thrift.server" level="info" />
  <root level="info">
    <appender-ref ref="FILE" />
  </root>
</configuration>
Pick the appropriate level for the desired package, and add multiple logger lines for multiple packages. This requires the logback dependency/jar to be present on the classpath.
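If you also want log lines from your own server code rather than only from Thrift's internals, the same slf4j API works there. A minimal sketch (the logger calls are illustrative additions to the Server object from the question):
import org.slf4j.LoggerFactory

object Server {
  private val logger = LoggerFactory.getLogger(getClass)

  def main(args: Array[String]): Unit = {
    logger.info("Starting server on port 7911 ...")
    // ... build the TServerSocket / TThreadPoolServer exactly as in the question ...
    // and log failures instead of printing the stack trace:
    // case e: TTransportException => logger.error("Transport error", e)
  }
}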