alexhall edited this page Aug 8, 2011 · 1 revision

This page describes the logging framework used by Spark and Sherpa, and describes how to configure the logging framework at runtime and for testing.

General

The Spark and Sherpa libraries use the SLF4J framework for recording log messages. Libraries using SLF4J call a thin, interface-only API for recording log messages. Projects using these libraries configure them at runtime by including a 'binding' JAR on their classpath which acts as a bridge to write log messages using the project's logging system of choice. For instance, an application using Log4j would include slf4j-log4j12 on the classpath.

In other words, SLF4J specifies the 'what' to log; the underlying logging implementation specifies the 'when', 'how' and 'where' to log.
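In code, this separation means application and library classes touch only the SLF4J API. A minimal sketch, assuming slf4j-api is on the classpath (the class name and message are illustrative, not part of Spark or Sherpa):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class QueryService {
    // The logger comes from the SLF4J API; the binding JAR on the runtime
    // classpath decides where (and whether) messages are actually written.
    private static final Logger logger = LoggerFactory.getLogger(QueryService.class);

    public void runQuery(String sparql) {
        // Parameterized messages avoid the cost of string concatenation
        // when the level is disabled by the underlying implementation.
        logger.debug("Executing query: {}", sparql);
    }
}
```

With no binding on the classpath, this compiles and runs, but the messages are silently discarded by the NOP logger described above.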

Runtime Configuration

Applications that declare a dependency on any of the Spark or Sherpa libraries that use SLF4J will inherit a transitive dependency on the slf4j-api library. Running your application code without any further configuration will likely result in an error message similar to the following being written to System.err:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

Properly configuring your application to handle log messages recorded with SLF4J is a three-step process:

  1. Select your application's desired logging implementation. Often this choice has already been made for you. Common logging implementations include Log4j and java.util.logging. Note: Apache Commons Logging is a logging facade, similar to SLF4J, and not an implementation.
  2. Identify the SLF4J binding library for that implementation, and add a runtime dependency on it.
  3. Extend the configuration for your application's logging implementation of choice with the appropriate settings to log messages from Spark or Sherpa.

For example, suppose your application is already using Log4j, and you would like to use the Spark HTTP Client. You would add the following dependencies to your project's pom.xml file:

<dependencies>
  <dependency>
    <groupId>com.revelytix</groupId>
    <artifactId>spark-http-client</artifactId>
    <version>0.1.4-SNAPSHOT</version>
  </dependency>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>${slf4j-version}</version>
    <scope>runtime</scope>
  </dependency>
</dependencies>

Note: Using the 'runtime' dependency scope for the SLF4J-Log4j binding means that any projects which depend on your project will inherit a transitive runtime dependency on this binding. This will cause problems if they use a logging implementation other than Log4j. If this is the case, you should change the 'runtime' scope to 'provided'.
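In that case, the binding declaration above would become (a sketch of the same pom.xml entry, with only the scope changed):

```xml
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>${slf4j-version}</version>
  <!-- 'provided' keeps the binding off the transitive classpath
       of projects that depend on yours -->
  <scope>provided</scope>
</dependency>
```

Downstream projects are then free to supply whichever binding matches their own logging implementation.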

The remaining piece of the puzzle is to modify your application's Log4j configuration with loggers for the Spark HTTP Client. Locate your application's Log4j configuration (typically in a file called log4j.properties on the application classpath) and edit it appropriately. Here is a sample Log4j configuration:

# Log everything INFO and higher to stdout
log4j.rootLogger=INFO, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{HH:mm:ss,SSS}: %5p [%t] (%c:%L) - %m%n

# Log anything in the Spark package at DEBUG level
log4j.logger.spark=DEBUG
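Log4j logger names form a dot-separated hierarchy, so the entry above also covers every sub-package of spark. A hypothetical variant that additionally routes those messages to their own file (the appender name and file name here are assumptions, not part of the Spark distribution):

```properties
# Send Spark messages to a dedicated file in addition to stdout
log4j.appender.sparkfile=org.apache.log4j.FileAppender
log4j.appender.sparkfile.File=spark-client.log
log4j.appender.sparkfile.layout=org.apache.log4j.PatternLayout
log4j.appender.sparkfile.layout.ConversionPattern=%d{HH:mm:ss,SSS}: %5p [%t] (%c:%L) - %m%n

# Log anything in the spark package at DEBUG level; additivity means these
# messages also flow up to the root logger's stdout appender
log4j.logger.spark=DEBUG, sparkfile
```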
