Logging

The Data Processing Library uses context-logging to deliver logs enriched with a log context: a set of key/value pairs that provides additional information about the computational context at the time a log message is generated.

context-logging is based on SLF4J.

Note

context-logging has no dependencies on the Data Processing Library and can be used as a standalone component in your projects.

Log Context and Context Aware Loggers

The log context is a dynamically bound, thread local set of key/value pairs that are automatically appended to each log message delivered through a ContextAwareLogger.

The following snippet is an example of a log message augmented with the log context:

2020/10/13 22:33:54.393 INFO    Main$: Final assessment: Assessment(true,0.0) // Task=main deltaset=mapValues13 catalog=output layer=assessment partition=assessment spark.stageId=59 spark.partitionId=16 spark.attemptNumber=0 spark.taskAttemptId=400

The log context can be arbitrarily augmented via LogContext.withChild, which takes an execution block in which the new binding is in place; the binding is removed once control flow leaves the block's scope, as shown in the following code snippets:

Scala
import com.here.platform.pipeline.logging.{ContextLogging, LogContext}

object LogContextExample extends ContextLogging {
  // extending `ContextLogging` is roughly equivalent to:
  // protected val logger = new ContextAwareLogger(getClass)

  def someLoggingMethod(): Unit = logger.info("some info message")

  def main(args: Array[String]): Unit = {
    // => some info message
    someLoggingMethod()

    LogContext.withChild("key", "value") {
      // => some info message // key=value
      someLoggingMethod()

      LogContext.withChild("another-key", "another-value") {
        // => some info message // key=value another-key=another-value
        someLoggingMethod()
      }

      // => some info message // key=value
      someLoggingMethod()
    }
  }
}
Java

import com.here.platform.pipeline.logging.LogContext;
import com.here.platform.pipeline.logging.java.ContextAwareLogger;

public class LogContextExample {

  static ContextAwareLogger logger = new ContextAwareLogger(LogContextExample.class);

  public static void someLoggingMethod() {
    logger.info("some info message");
  }

  public static void main(String[] args) {

    // => some info message
    someLoggingMethod();

    LogContext.withChild("key", "value", () -> {
      // => some info message // key=value
      someLoggingMethod();

      LogContext.withChild("another-key", "another-value", () -> {
        // => some info message // key=value another-key=another-value
        someLoggingMethod();
      });

      // => some info message // key=value
      someLoggingMethod();
    });
  }
}
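To make the scoping semantics concrete, the dynamically bound, thread-local mechanism described above can be sketched in plain Java using only the standard library. This is a conceptual illustration, not the context-logging implementation; the class and method names below are invented for the example:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Conceptual sketch of a thread-local, dynamically scoped log context.
// Not the context-logging implementation; all names are illustrative only.
public class ScopedLogContext {

  // Each thread sees its own, independent set of bindings.
  private static final ThreadLocal<Map<String, String>> context =
      ThreadLocal.withInitial(LinkedHashMap::new);

  // Bind key=value for the duration of `block`, then restore the parent state.
  public static void withChild(String key, String value, Runnable block) {
    Map<String, String> parent = context.get();
    Map<String, String> child = new LinkedHashMap<>(parent);
    child.put(key, value);
    context.set(child);
    try {
      block.run();
    } finally {
      context.set(parent); // the binding disappears when the scope is left
    }
  }

  // Render "message // k1=v1 k2=v2", like the augmented log line above.
  public static String render(String message) {
    Map<String, String> ctx = context.get();
    if (ctx.isEmpty()) {
      return message;
    }
    StringBuilder sb = new StringBuilder(message).append(" //");
    ctx.forEach((k, v) -> sb.append(' ').append(k).append('=').append(v));
    return sb.toString();
  }
}
```

As with LogContext.withChild, the binding is installed only for the duration of the block and is restored in a finally clause, so it is removed even if the block throws.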

Log Context in the Data Processing Library

The Data Processing Library populates the log context around most user-defined functions passed to the framework, with information about the current driver task, the compilation phase, and the partition being processed. Therefore, always prefer a ContextAwareLogger over other logging solutions, so as not to lose this useful source of debugging information, even if you do not plan to augment the log context yourself.

Log Context in Functional Compilers

In each functional compiler method (mappingFn, resolveFn, compileInFn, compileOutFn), the log context includes at least the name of the method, the partition key being processed, and information about the current Spark stage as well as the task within the stage.

Log Context in DeltaSet Transformations

In every DeltaSet transformation, the log context includes the ID of the DeltaSet and information about the current Spark stage as well as the task within the stage. Mapping transformations also include the key being processed.
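Conceptually, the framework installs this metadata around each invocation of the user function. The following plain-Java sketch imitates that wrapping for a mapValues-like transformation; it is a hypothetical illustration, not Data Processing Library code, and all names in it are invented:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch of how a framework could bind transformation metadata
// around a user function, in the spirit of what the Data Processing Library
// does around DeltaSet transformations. Illustrative only; not library code.
public class DeltaSetContextSketch {

  static final ThreadLocal<Map<String, String>> context =
      ThreadLocal.withInitial(LinkedHashMap::new);

  // Bind the extra pairs while `fn` runs, then restore the previous context.
  static <A, B> B withBindings(Map<String, String> extra, Function<A, B> fn, A arg) {
    Map<String, String> parent = context.get();
    Map<String, String> child = new LinkedHashMap<>(parent);
    child.putAll(extra);
    context.set(child);
    try {
      return fn.apply(arg);
    } finally {
      context.set(parent);
    }
  }

  // A stand-in for a user mapping function that reads the current context.
  static String mappingFn(String value) {
    return "mapped " + value + " // " + context.get();
  }

  // Simulate what a mapValues-like transformation could do per element:
  // bind the DeltaSet id and the element key before invoking the user code.
  static String applyWithContext(String deltaSetId, String key, String value) {
    Map<String, String> extra = new LinkedHashMap<>();
    extra.put("deltaset", deltaSetId);
    extra.put("key", key);
    return withBindings(extra, DeltaSetContextSketch::mappingFn, value);
  }
}
```

In this sketch the user function never touches the context explicitly; the wrapper binds the deltaset and key entries before each call, which is why a ContextAwareLogger used inside user code picks them up automatically.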
