Logback JSON Encoder in Java: Structured Logging for Modern Applications

JSON logging provides machine-readable log output that integrates seamlessly with log aggregation systems like the ELK Stack, Splunk, and cloud monitoring services. Logback, combined with the logstash-logback-encoder library, transforms traditional text logs into structured JSON.


Dependencies and Setup

Maven Dependencies
<properties>
    <logback.version>1.4.11</logback.version>
    <logstash-logback-encoder.version>7.4</logstash-logback-encoder.version>
    <jackson.version>2.15.2</jackson.version>
</properties>

<dependencies>
    <!-- Logback Core -->
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <!-- Logstash Logback Encoder -->
    <dependency>
        <groupId>net.logstash.logback</groupId>
        <artifactId>logstash-logback-encoder</artifactId>
        <version>${logstash-logback-encoder.version}</version>
    </dependency>
    <!-- Jackson for JSON processing -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>${jackson.version}</version>
    </dependency>
</dependencies>

Basic Logback JSON Configuration

1. Simple JSON Configuration
<!-- src/main/resources/logback-spring.xml -->
<configuration>
    <appender name="JSON_CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp/>
                <logLevel/>
                <loggerName/>
                <message/>
                <threadName/>
                <context/>
                <mdc/>
            </providers>
        </encoder>
    </appender>

    <root level="INFO">
        <appender-ref ref="JSON_CONSOLE" />
    </root>
</configuration>

Sample Output:

{
    "@timestamp": "2023-10-15T14:30:45.123Z",
    "level": "INFO",
    "logger_name": "com.example.UserService",
    "message": "User created successfully",
    "thread_name": "main",
    "traceId": "abc123def456"
}
2. Advanced JSON Configuration with Custom Fields
<configuration>
    <!-- Custom fields sourced from Spring properties -->
    <springProperty scope="context" name="applicationName" source="spring.application.name" defaultValue="unknown"/>
    <springProperty scope="context" name="environment" source="spring.profiles.active" defaultValue="default"/>

    <appender name="JSON_CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <!-- Timestamp in ISO format -->
                <timestamp>
                    <timeZone>UTC</timeZone>
                    <fieldName>timestamp</fieldName>
                </timestamp>
                <!-- Log level -->
                <logLevel>
                    <fieldName>level</fieldName>
                </logLevel>
                <!-- Logger name, shortened to at most 36 characters -->
                <loggerName>
                    <fieldName>logger</fieldName>
                    <shortenedLoggerNameLength>36</shortenedLoggerNameLength>
                </loggerName>
                <!-- Message -->
                <message>
                    <fieldName>message</fieldName>
                </message>
                <!-- Thread name -->
                <threadName>
                    <fieldName>thread</fieldName>
                </threadName>
                <!-- MDC (Mapped Diagnostic Context): only the whitelisted keys are emitted -->
                <mdc>
                    <includeMdcKeyName>traceId</includeMdcKeyName>
                    <includeMdcKeyName>spanId</includeMdcKeyName>
                    <includeMdcKeyName>userId</includeMdcKeyName>
                </mdc>
                <!-- Stack trace -->
                <stackTrace>
                    <fieldName>stack_trace</fieldName>
                </stackTrace>
                <!-- Pattern provider for static custom fields -->
                <pattern>
                    <pattern>
                        {
                            "service": "${applicationName}",
                            "env": "${environment}",
                            "version": "1.0.0"
                        }
                    </pattern>
                </pattern>
                <!-- Logger context properties -->
                <context/>
            </providers>
        </encoder>
    </appender>

    <root level="INFO">
        <appender-ref ref="JSON_CONSOLE" />
    </root>
</configuration>
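
To see the MDC whitelist in action, here is a small runnable sketch (the class name is illustrative); only the keys listed via includeMdcKeyName reach the JSON output:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class MdcWhitelistDemo {

    private static final Logger log = LoggerFactory.getLogger(MdcWhitelistDemo.class);

    public static void main(String[] args) {
        MDC.put("traceId", "abc123def456");  // included
        MDC.put("spanId", "span-1");         // included
        MDC.put("requestHost", "10.0.0.5");  // silently dropped by the configuration above
        try {
            log.info("Fetching user profile");
        } finally {
            MDC.clear();
        }
    }
}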

Advanced Configuration Patterns

1. Environment-Specific Configurations

Note: <springProfile> (and the <springProperty> element shown earlier) only takes effect when the configuration file is named logback-spring.xml and loaded by Spring Boot; a plain logback.xml is parsed before Spring starts.

<configuration>
    <springProfile name="dev">
        <include resource="console-appender.xml" />
    </springProfile>

    <springProfile name="prod">
        <include resource="file-appender.xml" />
        <include resource="logstash-appender.xml" />
    </springProfile>
</configuration>
2. Console Appender for Development
<!-- console-appender.xml -->
<included>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp/>
                <logLevel/>
                <loggerName/>
                <message/>
                <mdc/>
                <pattern>
                    <pattern>
                        {
                            "environment": "development",
                            "service": "user-service"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>

    <root level="DEBUG">
        <appender-ref ref="CONSOLE" />
    </root>
</included>
3. File Appender for Production
<!-- file-appender.xml -->
<included>
    <property name="LOG_DIR" value="/var/log/myapp" />
    <property name="LOG_FILE" value="${LOG_DIR}/application.log" />

    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${LOG_FILE}</file>
        <!-- SizeAndTimeBasedRollingPolicy combines daily rollover with a size cap per file -->
        <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
            <fileNamePattern>${LOG_DIR}/application.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
            <maxFileSize>100MB</maxFileSize>
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp/>
                <logLevel/>
                <loggerName/>
                <message/>
                <threadName/>
                <mdc/>
                <stackTrace/>
                <pattern>
                    <pattern>
                        {
                            "environment": "production",
                            "service": "user-service"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>
</included>
4. Logstash Appender for Centralized Logging
<!-- logstash-appender.xml -->
<included>
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>logstash-server:5000</destination>
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp/>
                <version/>
                <logLevel/>
                <loggerName/>
                <message/>
                <threadName/>
                <mdc/>
                <stackTrace/>
                <context/>
                <pattern>
                    <pattern>
                        {
                            "type": "application",
                            "service": "user-service",
                            "environment": "production"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
        <!-- Enable TLS for the TCP connection -->
        <ssl/>
    </appender>
</included>
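
If you do not need per-provider control, the library also ships a preconfigured LogstashEncoder that emits the standard Logstash fields in a single element; static fields are supplied as inline JSON via customFields (the values below are illustrative):

<appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>logstash-server:5000</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder">
        <customFields>{"type":"application","service":"user-service","environment":"production"}</customFields>
    </encoder>
</appender>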

Custom JSON Providers

1. Custom Logstash Provider
One straightforward approach is to extend AbstractJsonProvider, which supplies the lifecycle methods (including a no-op prepareForDeferredProcessing), leaving only writeTo to implement:

import java.io.IOException;

import ch.qos.logback.classic.spi.ILoggingEvent;
import com.fasterxml.jackson.core.JsonGenerator;
import net.logstash.logback.composite.AbstractJsonProvider;

public class CustomLogstashProvider extends AbstractJsonProvider<ILoggingEvent> {

    @Override
    public void writeTo(JsonGenerator generator, ILoggingEvent event) throws IOException {
        // Each provider appends its fields to the JSON object built for the event
        generator.writeStringField("custom_field", "custom_value");
        generator.writeNumberField("sequence_number", System.currentTimeMillis());
    }
}
2. Customizing the JSON Factory

The composite encoder does not expose a setJsonFactory hook; the supported customization point is a JsonFactoryDecorator, which lets you adjust the Jackson JsonFactory the encoder creates:

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonGenerator;
import net.logstash.logback.decorate.JsonFactoryDecorator;

public class EscapingJsonFactoryDecorator implements JsonFactoryDecorator {

    @Override
    public JsonFactory decorate(JsonFactory factory) {
        // Escape non-ASCII characters so the output stays pure ASCII
        return factory.enable(JsonGenerator.Feature.ESCAPE_NON_ASCII);
    }
}
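
The decorator is then registered on the encoder in XML; the encoders shipped with logstash-logback-encoder accept a jsonFactoryDecorator element (class name here matches the sketch above):

<encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
    <jsonFactoryDecorator class="com.example.logging.EscapingJsonFactoryDecorator"/>
    <providers>
        <timestamp/>
        <logLevel/>
        <message/>
    </providers>
</encoder>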
3. XML Configuration for Custom Provider
<configuration>
    <appender name="JSON_CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp/>
                <logLevel/>
                <message/>
                <!-- Custom providers are registered with the generic <provider> element -->
                <provider class="com.example.logging.CustomLogstashProvider"/>
                <pattern>
                    <pattern>
                        {
                            "application": "my-service",
                            "team": "backend-team"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>
</configuration>

Java Implementation with Structured Logging

1. Basic Logging with MDC
import static net.logstash.logback.argument.StructuredArguments.kv;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

@Service
public class UserService {

    private static final Logger logger = LoggerFactory.getLogger(UserService.class);

    public User createUser(CreateUserRequest request) {
        // Set contextual information in MDC; the <mdc/> provider turns these into JSON fields
        MDC.put("userId", request.getEmail());
        MDC.put("operation", "create_user");
        MDC.put("traceId", generateTraceId());
        try {
            logger.info("Creating user with email: {}", request.getEmail());
            validateUserRequest(request);
            User user = userRepository.save(mapToUser(request));
            // kv() from logstash-logback-encoder adds each pair as a top-level JSON field
            logger.info("User created successfully",
                    kv("user_id", user.getId()),
                    kv("email", user.getEmail()));
            return user;
        } catch (Exception e) {
            logger.error("Failed to create user",
                    kv("email", request.getEmail()),
                    kv("error", e.getMessage()));
            throw e;
        } finally {
            // Clear MDC to avoid leaking context onto reused threads
            MDC.clear();
        }
    }

    // Hand-rolled alternative, shown for illustration only: it does not escape
    // quotes or backslashes in values, so prefer StructuredArguments in practice.
    private void logWithStructure(String message, Object... keyValuePairs) {
        if (logger.isInfoEnabled()) {
            StringBuilder jsonMessage = new StringBuilder();
            jsonMessage.append("{\"message\":\"").append(message).append('"');
            for (int i = 0; i + 1 < keyValuePairs.length; i += 2) {
                jsonMessage.append(", \"").append(keyValuePairs[i])
                        .append("\": \"").append(keyValuePairs[i + 1]).append('"');
            }
            jsonMessage.append('}');
            logger.info(jsonMessage.toString());
        }
    }
}
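
The kv() helper is one of several factory methods on net.logstash.logback.argument.StructuredArguments; a quick sketch of the common variants (the demo class is illustrative):

import static net.logstash.logback.argument.StructuredArguments.entries;
import static net.logstash.logback.argument.StructuredArguments.kv;
import static net.logstash.logback.argument.StructuredArguments.value;

import java.util.Map;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class StructuredArgumentsDemo {

    private static final Logger logger = LoggerFactory.getLogger(StructuredArgumentsDemo.class);

    public static void main(String[] args) {
        // kv(): adds a JSON field AND renders as key=value where {} appears in the message
        logger.info("Order placed {}", kv("order_id", "ORD-789"));
        // value(): adds a JSON field but renders only the value in the message text
        logger.info("Order placed for {}", value("customer", "alice"));
        // entries(): spreads every map entry into its own JSON field
        logger.info("Checkout finished", entries(Map.of("items", 3, "total", 99.50)));
    }
}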
2. Advanced Structured Logging Utility
import java.io.PrintWriter;
import java.io.StringWriter;
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

// Instantiated once per class (see OrderService below), so this is a plain
// object rather than a Spring @Component.
public class StructuredLogger {

    private final Logger logger;
    private final ObjectMapper objectMapper;

    public StructuredLogger(Class<?> clazz) {
        this.logger = LoggerFactory.getLogger(clazz);
        this.objectMapper = new ObjectMapper();
    }

    public void info(String message, Map<String, Object> fields) {
        log("INFO", message, fields, null);
    }

    public void error(String message, Map<String, Object> fields, Throwable throwable) {
        log("ERROR", message, fields, throwable);
    }

    public void warn(String message, Map<String, Object> fields) {
        log("WARN", message, fields, null);
    }

    public void debug(String message, Map<String, Object> fields) {
        log("DEBUG", message, fields, null);
    }

    private void log(String level, String message, Map<String, Object> fields, Throwable throwable) {
        try {
            Map<String, Object> logEntry = new HashMap<>();
            logEntry.put("timestamp", Instant.now().toString());
            logEntry.put("level", level);
            logEntry.put("message", message);
            logEntry.put("logger", logger.getName());
            logEntry.put("thread", Thread.currentThread().getName());

            // Add MDC context
            Map<String, String> mdcContext = MDC.getCopyOfContextMap();
            if (mdcContext != null) {
                logEntry.putAll(mdcContext);
            }

            // Add custom fields
            if (fields != null) {
                logEntry.putAll(fields);
            }

            // Add throwable information
            if (throwable != null) {
                logEntry.put("error", throwable.getMessage());
                logEntry.put("stack_trace", getStackTrace(throwable));
            }

            // The serialized entry becomes the log message; pair this utility with a
            // plain pattern encoder, or a JSON encoder will nest it inside "message"
            String jsonLog = objectMapper.writeValueAsString(logEntry);
            switch (level) {
                case "INFO" -> logger.info(jsonLog);
                case "ERROR" -> logger.error(jsonLog);
                case "WARN" -> logger.warn(jsonLog);
                case "DEBUG" -> logger.debug(jsonLog);
                default -> logger.info(jsonLog);
            }
        } catch (JsonProcessingException e) {
            // Fall back to plain logging
            logger.error("Failed to create structured log: {}", e.getMessage());
            logger.info(message);
        }
    }

    private String getStackTrace(Throwable throwable) {
        StringWriter sw = new StringWriter();
        throwable.printStackTrace(new PrintWriter(sw));
        return sw.toString();
    }

    // Fluent API for building log entries
    public LogBuilder info() {
        return new LogBuilder("INFO");
    }

    public LogBuilder error() {
        return new LogBuilder("ERROR");
    }

    public class LogBuilder {

        private final String level;
        private final Map<String, Object> fields = new HashMap<>();
        private String message;
        private Throwable throwable;

        public LogBuilder(String level) {
            this.level = level;
        }

        public LogBuilder message(String message) {
            this.message = message;
            return this;
        }

        public LogBuilder field(String key, Object value) {
            this.fields.put(key, value);
            return this;
        }

        public LogBuilder fields(Map<String, Object> additionalFields) {
            this.fields.putAll(additionalFields);
            return this;
        }

        public LogBuilder throwable(Throwable throwable) {
            this.throwable = throwable;
            return this;
        }

        public void log() {
            StructuredLogger.this.log(level, message, fields, throwable);
        }
    }
}
3. Usage of Structured Logger
@Service
public class OrderService {

    private final StructuredLogger logger;
    private final OrderRepository orderRepository;

    public OrderService(OrderRepository orderRepository) {
        this.logger = new StructuredLogger(OrderService.class);
        this.orderRepository = orderRepository;
    }

    public Order processOrder(OrderRequest request) {
        long startTime = System.currentTimeMillis();

        // Set trace context
        MDC.put("traceId", request.getTraceId());
        MDC.put("userId", request.getUserId());
        try {
            logger.info()
                    .message("Processing order request")
                    .field("order_id", request.getOrderId())
                    .field("amount", request.getAmount())
                    .field("currency", request.getCurrency())
                    .log();

            // Validate order
            if (!isValidOrder(request)) {
                logger.error()
                        .message("Invalid order request")
                        .field("order_id", request.getOrderId())
                        .field("validation_errors", getValidationErrors(request))
                        .log();
                throw new ValidationException("Invalid order");
            }

            Order order = createOrder(request);
            orderRepository.save(order);

            logger.info()
                    .message("Order processed successfully")
                    .field("order_id", order.getId())
                    .field("status", order.getStatus())
                    .field("processing_time_ms", System.currentTimeMillis() - startTime)
                    .log();
            return order;
        } catch (Exception e) {
            logger.error()
                    .message("Order processing failed")
                    .field("order_id", request.getOrderId())
                    .field("error_type", e.getClass().getSimpleName())
                    .throwable(e)
                    .log();
            throw e;
        } finally {
            MDC.clear();
        }
    }
}
4. Spring Boot Configuration Class
import static net.logstash.logback.argument.StructuredArguments.entries;

import java.util.Map;
import java.util.Objects;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.boot.web.servlet.FilterRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.context.event.EventListener;

@Configuration
public class LoggingConfig {

    @Bean
    @Profile("!test")
    public LoggingEventListener loggingEventListener() {
        return new LoggingEventListener();
    }

    @Bean
    public FilterRegistrationBean<MDCClearingFilter> mdcClearingFilter() {
        FilterRegistrationBean<MDCClearingFilter> registrationBean = new FilterRegistrationBean<>();
        registrationBean.setFilter(new MDCClearingFilter());
        registrationBean.addUrlPatterns("/*");
        return registrationBean;
    }
}

// Registered through the @Bean method above, so no @Component annotation
// (a duplicate registration would fire the listener twice).
class LoggingEventListener {

    private static final Logger logger = LoggerFactory.getLogger(LoggingEventListener.class);

    @EventListener
    public void handleApplicationReady(ApplicationReadyEvent event) {
        // Map.of rejects null values, so guard the (possibly absent) manifest version
        String version = Objects.requireNonNullElse(
                event.getSpringApplication().getMainApplicationClass()
                        .getPackage().getImplementationVersion(),
                "unknown");
        Map<String, Object> startupFields = Map.of(
                "event", "application_started",
                "version", version,
                "profiles", String.join(",", event.getApplicationContext().getEnvironment().getActiveProfiles()));
        // entries() spreads the map as individual JSON fields
        logger.info("Application started successfully", entries(startupFields));
    }
}
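
The MDCClearingFilter registered above is not shown in the original; here is a minimal sketch of what it could look like (the trace-id seeding is an assumption; a real setup would propagate an incoming header such as X-Trace-Id instead of always generating one):

import java.io.IOException;
import java.util.UUID;

import jakarta.servlet.Filter;
import jakarta.servlet.FilterChain;
import jakarta.servlet.ServletException;
import jakarta.servlet.ServletRequest;
import jakarta.servlet.ServletResponse;
import org.slf4j.MDC;

public class MDCClearingFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        // Hypothetical: seed a per-request trace id
        MDC.put("traceId", UUID.randomUUID().toString());
        try {
            chain.doFilter(request, response);
        } finally {
            // Guarantee cleanup so pooled threads do not leak MDC state
            MDC.clear();
        }
    }
}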

Logging Best Practices with JSON

1. Consistent Field Names
public class LogFields {

    public static final String TRACE_ID = "trace_id";
    public static final String USER_ID = "user_id";
    public static final String DURATION_MS = "duration_ms";
    public static final String ERROR_CODE = "error_code";
    public static final String HTTP_STATUS = "http_status";

    private LogFields() {
        // constants class
    }
}
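
A brief sketch of using the shared constants together with StructuredArguments so field names stay identical across services (RequestLogger is a hypothetical example class):

import static net.logstash.logback.argument.StructuredArguments.kv;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class RequestLogger {

    private static final Logger logger = LoggerFactory.getLogger(RequestLogger.class);

    public void logCompletion(String traceId, int status, long durationMs) {
        // Constants prevent drift like "traceId" vs "trace_id" between services
        logger.info("Request completed",
                kv(LogFields.TRACE_ID, traceId),
                kv(LogFields.HTTP_STATUS, status),
                kv(LogFields.DURATION_MS, durationMs));
    }
}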
2. Performance-Sensitive Logging
import static net.logstash.logback.argument.StructuredArguments.kv;

@Service
public class PerformanceAwareService {

    private static final Logger logger = LoggerFactory.getLogger(PerformanceAwareService.class);

    public void processItem(Item item) {
        long startTime = System.currentTimeMillis();
        try {
            // Processing logic
            processItemInternal(item);
        } finally {
            long duration = System.currentTimeMillis() - startTime;
            if (duration > 1000) { // Log slow operations
                logger.warn("Slow operation detected",
                        kv("operation", "process_item"),
                        kv("item_id", item.getId()),
                        kv("duration_ms", duration));
            }
            if (logger.isDebugEnabled()) {
                logger.debug("Operation completed",
                        kv("operation", "process_item"),
                        kv("item_id", item.getId()),
                        kv("duration_ms", duration));
            }
        }
    }
}
3. Security-Aware Logging
import java.util.Arrays;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.stream.Collectors;

@Component
public class SecureLogger {

    private static final List<String> SENSITIVE_FIELDS = Arrays.asList(
            "password", "token", "secret", "credit_card", "ssn");

    public Map<String, Object> sanitizeFields(Map<String, Object> fields) {
        return fields.entrySet().stream()
                .collect(Collectors.toMap(
                        Map.Entry::getKey,
                        entry -> isSensitive(entry.getKey()) ? "***REDACTED***" : entry.getValue()));
    }

    private boolean isSensitive(String fieldName) {
        return SENSITIVE_FIELDS.stream()
                .anyMatch(sensitive -> fieldName.toLowerCase(Locale.ROOT).contains(sensitive));
    }
}
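
A hypothetical wiring of SecureLogger with the StructuredLogger from earlier, to show where sanitization belongs in the call path (LoginAuditor is an illustrative class, not from the original):

import java.util.Map;

public class LoginAuditor {

    private final SecureLogger secureLogger;
    private final StructuredLogger logger = new StructuredLogger(LoginAuditor.class);

    public LoginAuditor(SecureLogger secureLogger) {
        this.secureLogger = secureLogger;
    }

    public void recordAttempt(String userId, String rawPassword) {
        // Redact sensitive keys before they ever reach the log output
        Map<String, Object> safe = secureLogger.sanitizeFields(Map.of(
                "user_id", userId,
                "password", rawPassword)); // value becomes ***REDACTED***
        logger.info("Login attempt", safe);
    }
}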

Testing JSON Logging

1. Logback Test Configuration
<!-- src/test/resources/logback-test.xml -->
<configuration>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE" />
    </root>

    <!-- Silence noisy loggers during tests -->
    <logger name="org.springframework" level="WARN"/>
    <logger name="org.hibernate" level="WARN"/>
</configuration>
2. Unit Test for Structured Logging
class StructuredLoggerTest {

    @Test
    void shouldLogStructuredData() {
        // Given
        StructuredLogger logger = new StructuredLogger(StructuredLoggerTest.class);

        // When
        logger.info("User action completed", Map.of(
                "user_id", "12345",
                "action", "login",
                "success", true,
                "duration_ms", 150));

        // Then - verify the output with a test appender (see the sketch below);
        // the log should contain structured JSON with the provided fields
    }
}
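
Here is a sketch of the test-appender verification mentioned above, using Logback's built-in ListAppender (assumes JUnit 5 and AssertJ on the test classpath):

import static org.assertj.core.api.Assertions.assertThat;

import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.read.ListAppender;
import org.junit.jupiter.api.Test;
import org.slf4j.LoggerFactory;

class ListAppenderTest {

    @Test
    void capturesLogEvents() {
        // Attach an in-memory appender to the logger under test
        Logger logger = (Logger) LoggerFactory.getLogger("com.example.UserService");
        ListAppender<ILoggingEvent> appender = new ListAppender<>();
        appender.start();
        logger.addAppender(appender);

        logger.info("User created successfully");

        assertThat(appender.list)
                .extracting(ILoggingEvent::getFormattedMessage)
                .contains("User created successfully");
    }
}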

Sample JSON Output

Info Level Log:

{
    "timestamp": "2023-10-15T14:30:45.123Z",
    "level": "INFO",
    "logger": "com.example.UserService",
    "message": "User created successfully",
    "thread": "http-nio-8080-exec-1",
    "trace_id": "abc123def456",
    "user_id": "user@example.com",
    "service": "user-service",
    "environment": "production",
    "operation": "create_user",
    "duration_ms": 45
}

Error Level Log:

{
    "timestamp": "2023-10-15T14:31:22.456Z",
    "level": "ERROR",
    "logger": "com.example.OrderService",
    "message": "Payment processing failed",
    "thread": "http-nio-8080-exec-2",
    "trace_id": "def456abc123",
    "order_id": "ORD-789",
    "error": "Insufficient funds",
    "stack_trace": "com.example.PaymentException: Insufficient funds\n\tat com.example.PaymentService.process(PaymentService.java:45)",
    "service": "order-service",
    "environment": "production"
}

Best Practices

  1. Consistent Schema: Use consistent field names across all services
  2. Avoid Sensitive Data: Never log passwords, tokens, or PII
  3. Meaningful Messages: Ensure log messages are descriptive and actionable
  4. Appropriate Levels: Use correct log levels (DEBUG, INFO, WARN, ERROR)
  5. Context Enrichment: Include relevant context in every log entry
  6. Performance: Use parameterized logging and guard conditions
  7. Structured Data: Prefer structured fields over string concatenation

Conclusion

The Logback JSON encoder provides:

  • Structured logging for better log analysis
  • Easy integration with log aggregation systems
  • Rich contextual information in every log entry
  • Consistent log format across microservices
  • Better searchability and filtering capabilities

By implementing JSON logging with Logback, you can significantly improve your application's observability and make log analysis much more efficient. The structured format enables powerful querying, alerting, and visualization in modern log management systems.

