Merge java and scala versions of logging.md (#23332)

* Merge java and scala versions of logging.md

* Replace logging.md with a symlink

* Fix typos
This commit is contained in:
Arnout Engelen 2017-07-12 01:38:35 -07:00 committed by Konrad `ktoso` Malawski
parent c7ac498520
commit ac33af271b
2 changed files with 96 additions and 524 deletions


@@ -1,500 +0,0 @@
# Logging
Logging in Akka is not tied to a specific logging backend. By default
log messages are printed to STDOUT, but you can plug-in a SLF4J logger or
your own logger. Logging is performed asynchronously to ensure that logging
has minimal performance impact. Logging generally means IO and locks,
which could slow down the operations of your code if it were performed
synchronously.
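The queue-and-worker idea behind asynchronous logging can be sketched in plain Java. This is a minimal illustration, not Akka's actual implementation, and all names in it are made up:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Minimal illustration of asynchronous logging: the caller only enqueues the
// formatted event, and a single background thread performs the (potentially
// slow) write. All names here are made up; this is not Akka's implementation.
public class AsyncLogSketch {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    // Stands in for the slow sink (file, socket, ...) the worker writes to.
    public final List<String> written = Collections.synchronizedList(new ArrayList<>());

    public AsyncLogSketch() {
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    written.add(queue.take()); // only this thread pays the IO cost
                }
            } catch (InterruptedException ignored) {
                // shutdown: stop draining
            }
        });
        worker.setDaemon(true);
        worker.start();
    }

    // Returns immediately after the enqueue; the caller never waits on IO.
    public void info(String message) {
        queue.offer("INFO: " + message);
    }
}
```

Because the queue is unbounded, heavy logging consumes memory instead of blocking the caller, which is the same trade-off noted for Akka's event handler actor later in this document.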
## How to Log
Create a `LoggingAdapter` and use the `error`, `warning`, `info`, or `debug` methods,
as illustrated in this example:
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #imports }
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #my-actor }
The first parameter to `Logging.getLogger` could also be any
`LoggingBus`, specifically `system.eventStream()`; in the demonstrated
case, the actor system's address is included in the `akkaSource`
representation of the log source (see [Logging Thread, Akka Source and Actor System in MDC](#logging-thread-akka-source-and-actor-system-in-mdc))
while in the second case this is not automatically done. The second parameter
to `Logging.getLogger` is the source of this logging channel. The source
object is translated to a String according to the following rules:
* if it is an Actor or ActorRef, its path is used
* in case of a String it is used as is
* in case of a class an approximation of its simpleName
* and in all other cases the simpleName of its class
The log message may contain argument placeholders `{}`, which will be
substituted if the log level is enabled. Giving more arguments than
placeholders results in a warning being appended to the log statement (i.e. on
the same line with the same severity). You may pass a Java array as the only
substitution argument to have its elements be treated individually:
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #array }
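The substitution rule above (fill `{}` placeholders left to right, and append leftover arguments as a warning on the same line) can be sketched like this; it is an illustration of the documented behavior, not Akka's internal formatter:

```java
// Hypothetical sketch of the `{}` placeholder rule described above; this is
// an illustration of the documented behaviour, not Akka's internal formatter.
public class PlaceholderSketch {
    public static String format(String template, Object... args) {
        StringBuilder sb = new StringBuilder();
        int argIndex = 0, from = 0, at;
        // Substitute each `{}` with the next argument, left to right.
        while (argIndex < args.length && (at = template.indexOf("{}", from)) >= 0) {
            sb.append(template, from, at).append(args[argIndex++]);
            from = at + 2;
        }
        sb.append(template.substring(from));
        // More arguments than placeholders: note it on the same line.
        if (argIndex < args.length) {
            sb.append(" WARNING arguments left over: ");
            for (int i = argIndex; i < args.length; i++) {
                sb.append(args[i]).append(i < args.length - 1 ? "," : "");
            }
        }
        return sb.toString();
    }
}
```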
The Java `Class` of the log source is also included in the generated
`LogEvent`. In case of a simple string this is replaced with a “marker”
class `akka.event.DummyClassForStringSources` in order to allow special
treatment of this case, e.g. in the SLF4J event listener which will then use
the string instead of the class name for looking up the logger instance to
use.
### Logging of Dead Letters
By default messages sent to dead letters are logged at info level. Existence of dead letters
does not necessarily indicate a problem, but they are logged by default for the sake of caution.
After a few messages this logging is turned off, to avoid flooding the logs.
You can disable this logging completely or adjust how many dead letters are
logged. During system shutdown it is likely that you see dead letters, since pending
messages in the actor mailboxes are sent to dead letters. You can also disable logging
of dead letters during shutdown.
```ruby
akka {
log-dead-letters = 10
log-dead-letters-during-shutdown = on
}
```
To customize the logging further or take other actions for dead letters you can subscribe
to the @ref:[Event Stream](event-bus.md#event-stream).
### Auxiliary logging options
Akka has a few configuration options for very low level debugging. These make more sense in development than in production.
You almost definitely need to have logging set to DEBUG to use any of the options below:
```ruby
akka {
loglevel = "DEBUG"
}
```
This config option is very good if you want to know what config settings are loaded by Akka:
```ruby
akka {
# Log the complete configuration at INFO level when the actor system is started.
# This is useful when you are uncertain of what configuration is used.
log-config-on-start = on
}
```
If you want very detailed logging of all automatically received messages that are processed
by Actors:
```ruby
akka {
actor {
debug {
# enable DEBUG logging of all AutoReceiveMessages (Kill, PoisonPill etc.)
autoreceive = on
}
}
}
```
If you want very detailed logging of all lifecycle changes of Actors (restarts, deaths etc):
```ruby
akka {
actor {
debug {
# enable DEBUG logging of actor lifecycle changes
lifecycle = on
}
}
}
```
If you want unhandled messages logged at DEBUG:
```ruby
akka {
actor {
debug {
# enable DEBUG logging of unhandled messages
unhandled = on
}
}
}
```
If you want very detailed logging of all events, transitions and timers of FSM Actors that extend LoggingFSM:
```ruby
akka {
actor {
debug {
# enable DEBUG logging of all LoggingFSMs for events, transitions and timers
fsm = on
}
}
}
```
If you want to monitor subscriptions (subscribe/unsubscribe) on the ActorSystem.eventStream:
```ruby
akka {
actor {
debug {
# enable DEBUG logging of subscription changes on the eventStream
event-stream = on
}
}
}
```
<a id="logging-remote"></a>
### Auxiliary remote logging options
If you want to see all messages that are sent through remoting at DEBUG log level, use the following config option. Note that this logs the messages as they are sent by the transport layer, not by an actor.
```ruby
akka {
remote {
# If this is "on", Akka will log all outbound messages at DEBUG level,
# if off then they are not logged
log-sent-messages = on
}
}
```
If you want to see all messages that are received through remoting at DEBUG log level, use the following config option. Note that this logs the messages as they are received by the transport layer, not by an actor.
```ruby
akka {
remote {
# If this is "on", Akka will log all inbound messages at DEBUG level,
# if off then they are not logged
log-received-messages = on
}
}
```
If you want to see message types with payload size in bytes larger than
a specified limit at INFO log level:
```ruby
akka {
remote {
# Logging of message types with payload size in bytes larger than
# this value. Maximum detected size per message type is logged once,
# with an increase threshold of 10%.
# By default this feature is turned off. Activate it by setting the property to
# a value in bytes, such as 1000b. Note that for all messages larger than this
# limit there will be extra performance and scalability cost.
log-frame-size-exceeding = 1000b
}
}
```
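The "maximum detected size per message type, with a 10% increase threshold" rule can be sketched as follows. This is a hypothetical illustration of the documented behavior, not the actual remoting code:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the logging rule described above: a message type's
// payload size is reported only when it exceeds the configured limit AND is
// more than 10% larger than the biggest size already reported for that type.
// This illustrates the documented behaviour; it is not the actual remoting code.
public class FrameSizeSketch {
    private final long limitBytes;
    private final Map<String, Long> maxReported = new HashMap<>();

    public FrameSizeSketch(long limitBytes) {
        this.limitBytes = limitBytes;
    }

    /** Returns true when this (message type, size) pair would be logged. */
    public boolean shouldLog(String messageType, long sizeBytes) {
        if (sizeBytes < limitBytes) {
            return false; // below the configured limit: never logged
        }
        long previous = maxReported.getOrDefault(messageType, 0L);
        if (sizeBytes > previous * 1.1) { // 10% increase threshold
            maxReported.put(messageType, sizeBytes);
            return true;
        }
        return false;
    }
}
```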
Also see the logging options for TestKit: @ref:[actor.logging-java](testing.md#actor-logging).
### Turn Off Logging
To turn off logging you can configure the log levels to be `OFF` like this.
```ruby
akka {
stdout-loglevel = "OFF"
loglevel = "OFF"
}
```
The `stdout-loglevel` is only in effect during system startup and shutdown, and setting
it to `OFF` as well ensures that nothing gets logged during system startup or shutdown.
## Loggers
Logging is performed asynchronously through an event bus. Log events are processed by an event handler actor
that receives the log events in the same order they were emitted.
@@@ note
The event handler actor does not have a bounded inbox and is run on the default dispatcher. This means
that logging extreme amounts of data may affect your application badly. This can be somewhat mitigated by using an async logging backend though. (See [Using the SLF4J API directly](#slf4j-directly))
@@@
You can configure which event handlers are created at system start-up and listen to logging events. That is done using the
`loggers` element in the @ref:[configuration](general/configuration.md).
Here you can also define the log level. More fine grained filtering based on the log source
can be implemented in a custom `LoggingFilter`, which can be defined in the `logging-filter`
configuration property.
```ruby
akka {
# Loggers to register at boot time (akka.event.Logging$DefaultLogger logs
# to STDOUT)
loggers = ["akka.event.Logging$DefaultLogger"]
# Options: OFF, ERROR, WARNING, INFO, DEBUG
loglevel = "DEBUG"
}
```
The default one logs to STDOUT and is registered by default. It is not intended to be used for production. There is also an [SLF4J](#slf4j)
logger available in the 'akka-slf4j' module.
Example of creating a listener:
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #imports #imports-listener }
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #my-event-listener }
## Logging to stdout during startup and shutdown
While the actor system is starting up and shutting down the configured `loggers` are not used.
Instead log messages are printed to stdout (System.out). The default log level for this
stdout logger is `WARNING` and it can be silenced completely by setting
`akka.stdout-loglevel=OFF`.
<a id="slf4j"></a>
## SLF4J
Akka provides a logger for [SLF4J](http://www.slf4j.org/). This module is available in the 'akka-slf4j.jar'.
It has a single dependency: the slf4j-api jar. In your runtime, you also need a SLF4J backend. We recommend [Logback](http://logback.qos.ch/):
```xml
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.2.3</version>
</dependency>
```
You need to enable the Slf4jLogger in the `loggers` element in
the @ref:[configuration](general/configuration.md). Here you can also define the log level of the event bus.
More fine grained log levels can be defined in the configuration of the SLF4J backend
(e.g. logback.xml). You should also define `akka.event.slf4j.Slf4jLoggingFilter` in
the `logging-filter` configuration property. It will filter the log events using the backend
configuration (e.g. logback.xml) before they are published to the event bus.
@@@ warning
If you set the `loglevel` to a higher level than "DEBUG", any DEBUG events will be filtered
out already at the source and will never reach the logging backend, regardless of how the backend
is configured.
@@@
```ruby
akka {
loggers = ["akka.event.slf4j.Slf4jLogger"]
loglevel = "DEBUG"
logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
}
```
One gotcha is that the timestamp is attributed in the event handler, not when actually doing the logging.
The SLF4J logger selected for each log event is chosen based on the
`Class` of the log source specified when creating the
`LoggingAdapter`, unless that was given directly as a string in which
case that string is used (i.e. `LoggerFactory.getLogger(Class c)` is used in
the first case and `LoggerFactory.getLogger(String s)` in the second).
@@@ note
Beware that the actor system's name is appended to a `String` log
source if the LoggingAdapter was created giving an `ActorSystem` to
the factory. If this is not intended, give a `LoggingBus` instead as
shown below:
@@@
```java
final LoggingAdapter log = Logging.getLogger(system.eventStream(), "my.string");
```
<a id="slf4j-directly"></a>
### Using the SLF4J API directly
If you use the SLF4J API directly in your application, remember that the logging operations will block
while the underlying infrastructure writes the log statements.
This can be avoided by configuring the logging implementation to use
a non-blocking appender. Logback provides [AsyncAppender](http://logback.qos.ch/manual/appenders.html#AsyncAppender)
that does this. It also contains a feature which will drop `INFO` and `DEBUG` messages if the logging
load is high.
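A minimal Logback configuration along these lines might look like the following sketch; it assumes a console appender named `STDOUT` is already defined (as in the pattern examples in this document), and the queue size is just an example value:

```
<appender name="ASYNC" class="ch.qos.logback.classic.AsyncAppender">
  <!-- example tuning value; see the Logback manual for all options -->
  <queueSize>512</queueSize>
  <appender-ref ref="STDOUT" />
</appender>
<root level="INFO">
  <appender-ref ref="ASYNC" />
</root>
```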
### Logging Thread, Akka Source and Actor System in MDC
Since the logging is done asynchronously the thread in which the logging was performed is captured in
Mapped Diagnostic Context (MDC) with attribute name `sourceThread`.
With Logback the thread name is available with `%X{sourceThread}` specifier within the pattern layout configuration:
```
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%date{ISO8601} %-5level %logger{36} %X{sourceThread} - %msg%n</pattern>
</encoder>
</appender>
```
@@@ note
It will probably be a good idea to use the `sourceThread` MDC value also in
non-Akka parts of the application in order to have this property consistently
available in the logs.
@@@
Another helpful facility is that Akka captures the actor's address when
instantiating a logger within it, meaning that the full instance identification
is available for associating log messages e.g. with members of a router. This
information is available in the MDC with attribute name `akkaSource`:
```
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%date{ISO8601} %-5level %logger{36} %X{akkaSource} - %msg%n</pattern>
</encoder>
</appender>
```
Finally, the actor system in which the logging was performed
is available in the MDC with attribute name `sourceActorSystem`:
```
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%date{ISO8601} %-5level %logger{36} %X{sourceActorSystem} - %msg%n</pattern>
</encoder>
</appender>
```
For more details on what this attribute contains—also for non-actors—please see
[How to Log](#how-to-log).
### More accurate timestamps for log output in MDC
Akka's logging is asynchronous which means that the timestamp of a log entry is taken from
when the underlying logger implementation is called, which can be surprising at first.
If you want to more accurately output the timestamp, use the MDC attribute `akkaTimestamp`:
```
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%X{akkaTimestamp} %-5level %logger{36} %X{akkaSource} - %msg%n</pattern>
</encoder>
</appender>
```
### MDC values defined by the application
One useful feature available in SLF4J is [MDC](http://logback.qos.ch/manual/mdc.html).
Akka lets the application specify custom values; you just need to get a
specialized `LoggingAdapter`, the `DiagnosticLoggingAdapter`. To get
it, use the factory, providing an AbstractActor as logSource:
```java
// Within your AbstractActor
final DiagnosticLoggingAdapter log = Logging.getLogger(this);
```
Once you have the logger, you just need to add the custom values before you log something.
This way, the values will be put in the SLF4J MDC right before appending the log and removed after.
@@@ note
The cleanup (removal) should be done in the actor at the end,
otherwise, the next message will log with the same MDC values,
if it is not set to a new map. Use `log.clearMDC()`.
@@@
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #imports-mdc }
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #mdc-actor }
Now, the values will be available in the MDC, so you can use them in the layout pattern:
```
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>
%-5level %logger{36} [req: %X{requestId}, visitor: %X{visitorId}] - %msg%n
</pattern>
</encoder>
</appender>
```
### Using Markers
Some logging libraries allow, in addition to MDC data, attaching so-called "markers" to log statements.
These are used to filter out rare and special events; for example, you might want to mark logs that detect
some malicious activity with a `SECURITY` tag, and in your appender configuration make these
trigger emails and other notifications immediately.
Markers are available through the LoggingAdapters, when obtained via `Logging.withMarker`.
The first argument passed into all log calls should then be an `akka.event.LogMarker`.
The SLF4J bridge provided by Akka in `akka-slf4j` will automatically pick up this marker value and make it available to SLF4J.
For example you could use it like this:
```
<pattern>%date{ISO8601} [%marker][%level] [%msg]%n</pattern>
```
A more advanced (including most Akka added information) example pattern would be:
```
<pattern>%date{ISO8601} level=[%level] marker=[%marker] logger=[%logger] akkaSource=[%X{akkaSource}] sourceActorSystem=[%X{sourceActorSystem}] sourceThread=[%X{sourceThread}] mdc=[ticket-#%X{ticketNumber}: %X{ticketDesc}] - msg=[%msg]%n----%n</pattern>
```
<a id="jul"></a>
## java.util.logging
Akka includes a logger for [java.util.logging](https://docs.oracle.com/javase/8/docs/api/java/util/logging/package-summary.html#package.description).
You need to enable the `akka.event.jul.JavaLogger` in the `loggers` element in
the @ref:[configuration](general/configuration.md). Here you can also define the log level of the event bus.
More fine grained log levels can be defined in the configuration of the logging backend.
You should also define `akka.event.jul.JavaLoggingFilter` in
the `logging-filter` configuration property. It will filter the log events using the backend
configuration before they are published to the event bus.
@@@ warning
If you set the `loglevel` to a higher level than "DEBUG", any DEBUG events will be filtered
out already at the source and will never reach the logging backend, regardless of how the backend
is configured.
@@@
```ruby
akka {
loglevel = DEBUG
loggers = ["akka.event.jul.JavaLogger"]
logging-filter = "akka.event.jul.JavaLoggingFilter"
}
```
One gotcha is that the timestamp is attributed in the event handler, not when actually doing the logging.
The `java.util.logging.Logger` selected for each log event is chosen based on the
`Class` of the log source specified when creating the
`LoggingAdapter`, unless that was given directly as a string in which
case that string is used (i.e. `LoggerFactory.getLogger(Class c)` is used in
the first case and `LoggerFactory.getLogger(String s)` in the second).
@@@ note
Beware that the actor system's name is appended to a `String` log
source if the LoggingAdapter was created giving an `ActorSystem` to
the factory. If this is not intended, give a `LoggingBus` instead as
shown below:
@@@
```java
final LoggingAdapter log = Logging.getLogger(system.eventStream(), "my.string");
```
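To see the name-based lookup in action without any Akka dependency, here is a plain `java.util.logging` sketch: a logger obtained for the string source `"my.string"` keeps that exact name on every log record, which is what lets the backend configuration target it. Names here are illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

// Plain java.util.logging sketch (no Akka involved) showing the name-based
// lookup the rule above relies on: a String source is used verbatim as the
// logger name, so the backend configuration can target it by that name.
public class JulLookupSketch {
    public static List<String> capture() {
        Logger logger = Logger.getLogger("my.string"); // string source, used as-is
        logger.setUseParentHandlers(false);            // keep output out of the console
        logger.setLevel(Level.ALL);
        List<String> messages = new ArrayList<>();
        logger.addHandler(new Handler() {
            @Override public void publish(LogRecord record) {
                messages.add(record.getLoggerName() + ": " + record.getMessage());
            }
            @Override public void flush() { }
            @Override public void close() { }
        });
        logger.info("hello");
        return messages;
    }
}
```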


@@ -0,0 +1 @@
../scala/logging.md


@@ -12,7 +12,15 @@ synchronously.
Create a `LoggingAdapter` and use the `error`, `warning`, `info`, or `debug` methods,
as illustrated in this example:
Scala
: @@snip [LoggingDocSpec.scala]($code$/scala/docs/event/LoggingDocSpec.scala) { #my-actor }
Java
: @@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #imports }
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #my-actor }
@@@ div { .group-scala }
For convenience, you can mix in the `log` member into actors, instead of defining it as above. For convenience, you can mix in the `log` member into actors, instead of defining it as above.
@@ -22,14 +30,21 @@ class MyActor extends Actor with akka.actor.ActorLogging {
}
```
@@@
The first parameter to @scala[`Logging`] @java[`Logging.getLogger`] could also be any
`LoggingBus`, specifically @scala[`system.eventStream`] @java[`system.eventStream()`]; in the demonstrated
case, the actor system's address is included in the `akkaSource`
representation of the log source (see @ref[Logging Thread, Akka Source and Actor System in MDC](#logging-thread-akka-source-and-actor-system-in-mdc))
while in the second case this is not automatically done.
The second parameter to @scala[`Logging`] @java[`Logging.getLogger`] is the source of this logging channel.
The source object is translated to a String according to the following rules:
* if it is an Actor or ActorRef, its path is used
* in case of a String it is used as is
* in case of a class an approximation of its simpleName
* and in all other cases @scala[a compile error occurs unless an implicit
  `LogSource[T]` is in scope for the type in question] @java[the simpleName of its class]
The log message may contain argument placeholders `{}`, which will be
substituted if the log level is enabled. Giving more arguments than
@@ -37,7 +52,11 @@ placeholders results in a warning being appended to the log statement (i.e. on
the same line with the same severity). You may pass an array as the only
substitution argument to have its elements be treated individually:
Scala
: @@snip [LoggingDocSpec.scala]($code$/scala/docs/event/LoggingDocSpec.scala) { #array }
Java
: @@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #array }
The Java `Class` of the log source is also included in the generated
`LogEvent`. In case of a simple string this is replaced with a “marker”
@@ -51,7 +70,7 @@ use.
By default messages sent to dead letters are logged at info level. Existence of dead letters
does not necessarily indicate a problem, but they are logged by default for the sake of caution.
After a few messages this logging is turned off, to avoid flooding the logs.
You can disable this logging completely or adjust how many dead letters are
logged. During system shutdown it is likely that you see dead letters, since pending
messages in the actor mailboxes are sent to dead letters. You can also disable logging
of dead letters during shutdown.
@@ -88,6 +107,8 @@ akka {
}
```
@@@ div { .group-scala }
If you want very detailed logging of user-level messages then wrap your actors' behaviors with
`akka.event.LoggingReceive` and enable the `receive` option:
@@ -103,6 +124,8 @@ akka {
}
```
@@@
If you want very detailed logging of all automatically received messages that are processed
by Actors:
@@ -212,7 +235,9 @@ akka {
}
```
Also see the @ref[logging options for TestKit](testing.md#actor-logging).
@@@ div { .group-scala }
### Translating Log Source to String and Class
@@ -242,6 +267,8 @@ might want to do this also in case you implement your own logging adapter.
@@@
@@@
### Turn Off Logging
To turn off logging you can configure the log levels to be `OFF` like this.
@@ -270,7 +297,7 @@ using an async logging backend though. (See [Using the SLF4J API directly](#slf4
@@@
You can configure which event handlers are created at system start-up and listen to logging events. That is done using the
`loggers` element in the @ref[configuration](general/configuration.md).
Here you can also define the log level. More fine grained filtering based on the log source
can be implemented in a custom `LoggingFilter`, which can be defined in the `logging-filter`
configuration property.
@@ -286,12 +313,17 @@ akka {
```
The default one logs to STDOUT and is registered by default. It is not intended
to be used for production. There is also an @ref[SLF4J](#slf4j)
logger available in the 'akka-slf4j' module.
Example of creating a listener:
Scala
: @@snip [LoggingDocSpec.scala]($code$/scala/docs/event/LoggingDocSpec.scala) { #my-event-listener }
Java
: @@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #imports #imports-listener }
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #my-event-listener }
## Logging to stdout during startup and shutdown
@@ -306,10 +338,21 @@ stdout logger is `WARNING` and it can be silenced completely by setting
Akka provides a logger for [SLF4J](http://www.slf4j.org/). This module is available in the 'akka-slf4j.jar'.
It has a single dependency: the slf4j-api jar. In your runtime, you also need a SLF4J backend. We recommend [Logback](http://logback.qos.ch/):
sbt
: ```scala
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
```
maven
: ```xml
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.2.3</version>
</dependency>
```
You need to enable the Slf4jLogger in the `loggers` element in
the @ref:[configuration](general/configuration.md). Here you can also define the log level of the event bus.
More fine grained log levels can be defined in the configuration of the SLF4J backend
@@ -336,10 +379,10 @@ akka {
One gotcha is that the timestamp is attributed in the event handler, not when actually doing the logging.
The SLF4J logger selected for each log event is chosen based on the
@scala[`Class[_]`] @java[`Class`] of the log source specified when creating the
`LoggingAdapter`, unless that was given directly as a string in which
case that string is used (i.e. @scala[`LoggerFactory.getLogger(c: Class[_])`] @java[`LoggerFactory.getLogger(Class c)`] is used in
the first case and @scala[`LoggerFactory.getLogger(s: String)`] @java[`LoggerFactory.getLogger(String s)`] in the second).
@@@ note
@@ -350,10 +393,16 @@ shown below:
@@@
Scala
: ```scala
val log = Logging(system.eventStream, "my.nice.string")
```
Java
: ```java
final LoggingAdapter log = Logging.getLogger(system.eventStream(), "my.string");
```
<a id="slf4j-directly"></a>
### Using the SLF4J API directly
@@ -433,25 +482,39 @@ If you want to more accurately output the timestamp, use the MDC attribute `akka
One useful feature available in Slf4j is [MDC](http://logback.qos.ch/manual/mdc.html),
Akka has a way to let the application specify custom values, you just need to get a
specialized `LoggingAdapter`, the `DiagnosticLoggingAdapter`. In order to
get it you can use the factory, providing an @scala[Actor] @java[AbstractActor] as logSource:
Scala
: ```scala
// Within your Actor
val log: DiagnosticLoggingAdapter = Logging(this);
```
Java
: ```java
// Within your AbstractActor
final DiagnosticLoggingAdapter log = Logging.getLogger(this);
```
Once you have the logger, you just need to add the custom values before you log something.
This way, the values will be put in the SLF4J MDC right before appending the log and removed after.
@@@ note
The cleanup (removal) should be done in the actor at the end,
otherwise, next message will log with same MDC values,
if it is not set to a new map. Use `log.clearMDC()`.
@@@
Scala
: @@snip [LoggingDocSpec.scala]($code$/scala/docs/event/LoggingDocSpec.scala) { #mdc }
Java
: @@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #imports-mdc }
@@snip [LoggingDocTest.java]($code$/java/jdocs/event/LoggingDocTest.java) { #mdc-actor }
@@@ div { .group-scala }
For convenience, you can mix in the `log` member into actors, instead of defining it as above.
This trait also lets you override `def mdc(msg: Any): MDC` for specifying MDC values
@@ -459,6 +522,8 @@ depending on current message and lets you forget about the cleanup as well, sinc
@@snip [LoggingDocSpec.scala]($code$/scala/docs/event/LoggingDocSpec.scala) { #mdc-actor }
@@@
Now, the values will be available in the MDC, so you can use them in the layout pattern:
```
@@ -500,7 +565,7 @@ A more advanced (including most Akka added information) example pattern would be
Akka includes a logger for [java.util.logging](https://docs.oracle.com/javase/8/docs/api/java/util/logging/package-summary.html#package.description).
You need to enable the `akka.event.jul.JavaLogger` in the `loggers` element in
the @ref[configuration](general/configuration.md). Here you can also define the log level of the event bus.
More fine grained log levels can be defined in the configuration of the logging backend.
You should also define `akka.event.jul.JavaLoggingFilter` in
the `logging-filter` configuration property. It will filter the log events using the backend
@@ -525,10 +590,10 @@ akka {
One gotcha is that the timestamp is attributed in the event handler, not when actually doing the logging.
The `java.util.logging.Logger` selected for each log event is chosen based on the
@scala[`Class[_]`]@java[`Class`] of the log source specified when creating the
`LoggingAdapter`, unless that was given directly as a string in which
case that string is used (i.e. @scala[`LoggerFactory.getLogger(c: Class[_])`] @java[`LoggerFactory.getLogger(Class c)`] is used in
the first case and @scala[`LoggerFactory.getLogger(s: String)`] @java[`LoggerFactory.getLogger(String s)`] in the second).
@@@ note
@@ -539,6 +604,12 @@ shown below:
@@@
Scala
: ```scala
val log = Logging(system.eventStream, "my.nice.string")
```
Java
: ```java
final LoggingAdapter log = Logging.getLogger(system.eventStream(), "my.string");
```