Structured logging transforms traditional text-based logs into machine-readable JSON format, enabling better log analysis, filtering, and integration with modern observability platforms.
Core Concepts
What is Structured Logging?
- Logging data as structured key-value pairs instead of unstructured text
- Typically output in JSON format for easy parsing
- Enables powerful querying and analysis in log management systems
Benefits:
- Machine-readable: Easy to parse and process
- Better searchability: Query specific fields
- Consistent schema: Standardized log format across services
- Rich context: Include business-specific fields
- Integration: Works seamlessly with ELK Stack, Splunk, CloudWatch
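The contrast is easiest to see with a small, hand-rolled sketch (plain JDK, no logging framework; the class and field names here are illustrative): the same event as free text versus a key-value map rendered as a JSON line.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class JsonLineDemo {
    // Render a flat map as a single JSON log line (minimal escaping, demo only)
    static String toJsonLine(Map<String, Object> fields) {
        return fields.entrySet().stream()
                .map(e -> "\"" + e.getKey() + "\":" + render(e.getValue()))
                .collect(Collectors.joining(",", "{", "}"));
    }

    private static String render(Object value) {
        // Numbers and booleans are emitted bare; everything else as a quoted string
        if (value instanceof Number || value instanceof Boolean) {
            return value.toString();
        }
        return "\"" + value.toString().replace("\"", "\\\"") + "\"";
    }

    public static void main(String[] args) {
        // Unstructured: a consumer must regex-parse the text to find the user ID
        System.out.println("User u-123 logged in from 10.0.0.5 in 45ms");

        // Structured: every attribute is an addressable, typed field
        Map<String, Object> fields = new LinkedHashMap<>();
        fields.put("message", "User logged in");
        fields.put("userId", "u-123");
        fields.put("clientIp", "10.0.0.5");
        fields.put("durationMs", 45);
        System.out.println(toJsonLine(fields));
    }
}
```

A log platform can now filter on `durationMs > 40` directly instead of parsing message text.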
Dependencies and Setup
Maven Dependencies
<properties>
<logback.version>1.4.11</logback.version>
<logstash-logback-encoder.version>7.4</logstash-logback-encoder.version>
<slf4j.version>2.0.7</slf4j.version>
<jackson.version>2.15.2</jackson.version>
</properties>
<dependencies>
<!-- Logging -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>${logback.version}</version>
</dependency>
<!-- JSON Logging -->
<dependency>
<groupId>net.logstash.logback</groupId>
<artifactId>logstash-logback-encoder</artifactId>
<version>${logstash-logback-encoder.version}</version>
</dependency>
<!-- Jackson for JSON processing -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
</dependencies>
Logback Configuration
<!-- src/main/resources/logback-spring.xml -->
<configuration>
<!-- Console Appender for Development -->
<appender name="CONSOLE_JSON" class="ch.qos.logback.core.ConsoleAppender">
<encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
<providers>
<timestamp>
<timeZone>UTC</timeZone>
</timestamp>
<logLevel/>
<loggerName/>
<message/>
<mdc/>
<arguments/>
<stackTrace>
<throwableConverter class="net.logstash.logback.stacktrace.ShortenedThrowableConverter">
<maxDepthPerThrowable>30</maxDepthPerThrowable>
<maxLength>2048</maxLength>
<shortenedClassNameLength>20</shortenedClassNameLength>
<rootCauseFirst>true</rootCauseFirst>
</throwableConverter>
</stackTrace>
<context/>
<pattern>
<pattern>
{
"service": "user-service",
"environment": "${ENVIRONMENT:-local}",
"version": "${APP_VERSION:-1.0.0}"
}
</pattern>
</pattern>
</providers>
</encoder>
</appender>
<!-- File Appender for Production -->
<appender name="FILE_JSON" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>logs/application.json</file>
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>logs/application.%d{yyyy-MM-dd}.json.gz</fileNamePattern>
<maxHistory>30</maxHistory>
</rollingPolicy>
<encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
<providers>
<timestamp/>
<logLevel/>
<loggerName/>
<message/>
<mdc/>
<stackTrace/>
<context/>
</providers>
</encoder>
</appender>
<!-- Async Appender for Better Performance -->
<appender name="ASYNC_JSON" class="ch.qos.logback.classic.AsyncAppender">
<appender-ref ref="FILE_JSON" />
<queueSize>10000</queueSize>
<discardingThreshold>0</discardingThreshold>
<!-- Caller data (class/method/line) is expensive to capture; keep it off when the goal is throughput -->
<includeCallerData>false</includeCallerData>
</appender>
<!-- Development Profile - Pretty Print JSON -->
<springProfile name="dev">
<appender name="PRETTY_JSON" class="ch.qos.logback.core.ConsoleAppender">
<encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
<providers>
<timestamp/>
<logLevel/>
<loggerName/>
<message/>
<mdc/>
<stackTrace/>
</providers>
<jsonGeneratorDecorator class="net.logstash.logback.decorate.PrettyPrintingJsonGeneratorDecorator"/>
</encoder>
</appender>
<root level="INFO">
<appender-ref ref="PRETTY_JSON" />
</root>
</springProfile>
<!-- Production Profile -->
<springProfile name="prod">
<root level="INFO">
<appender-ref ref="ASYNC_JSON" />
</root>
</springProfile>
<!-- Default Profile -->
<springProfile name="!dev &amp; !prod">
<root level="INFO">
<appender-ref ref="CONSOLE_JSON" />
</root>
</springProfile>
<!-- Specific logger configurations -->
<logger name="com.example" level="DEBUG" additivity="false">
<appender-ref ref="CONSOLE_JSON" />
</logger>
</configuration>
Core Implementation
1. Structured Logger Wrapper
Note that this wrapper serializes each entry to JSON itself, so pair it with a plain pattern encoder: routed through the JSON encoder configured above, the pre-serialized JSON would simply end up escaped inside the message field.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
public class StructuredLogger {
private final Logger logger;
private final ObjectMapper objectMapper;
private final String serviceName;
public StructuredLogger(Class<?> clazz) {
this.logger = LoggerFactory.getLogger(clazz);
this.objectMapper = new ObjectMapper();
this.serviceName = "user-service"; // Could be from config
}
public void info(String message, Object... fields) {
log("INFO", message, null, fields);
}
public void warn(String message, Object... fields) {
log("WARN", message, null, fields);
}
public void error(String message, Throwable throwable, Object... fields) {
log("ERROR", message, throwable, fields);
}
public void debug(String message, Object... fields) {
log("DEBUG", message, null, fields);
}
private void log(String level, String message, Throwable throwable, Object... fields) {
try {
LogEntry logEntry = buildLogEntry(level, message, throwable, fields);
String jsonLog = objectMapper.writeValueAsString(logEntry);
switch (level) {
case "INFO" -> logger.info(jsonLog);
case "WARN" -> logger.warn(jsonLog);
case "ERROR" -> logger.error(jsonLog, throwable);
case "DEBUG" -> logger.debug(jsonLog);
default -> logger.info(jsonLog);
}
} catch (JsonProcessingException e) {
// Fallback to traditional logging
logger.error("Failed to create structured log: {}", message, e);
}
}
private LogEntry buildLogEntry(String level, String message, Throwable throwable, Object... fields) {
LogEntry entry = new LogEntry();
entry.setTimestamp(java.time.Instant.now().toString());
entry.setLevel(level);
entry.setService(serviceName);
entry.setMessage(message);
entry.setThread(Thread.currentThread().getName());
entry.setLogger(logger.getName());
// Add MDC context (slf4j returns Map<String, String>; copy it into the
// entry's Map<String, Object> field, which a direct assignment would not compile to)
java.util.Map<String, String> contextMap = MDC.getCopyOfContextMap();
entry.setMdc(contextMap == null ? null : new java.util.HashMap<>(contextMap));
// Add custom fields
if (fields != null && fields.length > 0) {
processCustomFields(entry, fields);
}
// Add exception details
if (throwable != null) {
entry.setException(throwable.getClass().getName());
entry.setErrorMessage(throwable.getMessage());
entry.setStacktrace(getStackTrace(throwable));
}
return entry;
}
protected void processCustomFields(LogEntry entry, Object[] fields) {
for (int i = 0; i < fields.length; i += 2) {
if (i + 1 < fields.length) {
String key = fields[i].toString();
Object value = fields[i + 1];
entry.addField(key, value);
}
}
}
private String getStackTrace(Throwable throwable) {
if (throwable == null) return null;
java.io.StringWriter sw = new java.io.StringWriter();
java.io.PrintWriter pw = new java.io.PrintWriter(sw);
throwable.printStackTrace(pw);
return sw.toString();
}
}
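The varargs pairing in processCustomFields is the only subtle part of the wrapper, and it can be exercised in isolation. A stdlib-only sketch of the same key-value folding (the class and method names here are illustrative, not part of the wrapper above):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FieldPairing {
    // Fold alternating key/value varargs into a map, as processCustomFields does.
    // An odd trailing element is dropped silently, matching the bounds check above.
    static Map<String, Object> pairFields(Object... fields) {
        Map<String, Object> out = new LinkedHashMap<>();
        if (fields == null) {
            return out;
        }
        for (int i = 0; i + 1 < fields.length; i += 2) {
            out.put(String.valueOf(fields[i]), fields[i + 1]);
        }
        return out;
    }

    public static void main(String[] args) {
        // "durationMs" keeps its Integer type; the dangling key is ignored
        System.out.println(pairFields("userId", "u-1", "durationMs", 45, "dangling"));
    }
}
```

The silent-drop behaviour is a deliberate trade-off: a malformed logging call should never throw in production code paths.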
2. Log Entry Model
import java.util.HashMap;
import java.util.Map;
public class LogEntry {
private String timestamp;
private String level;
private String service;
private String message;
private String thread;
private String logger;
private Map<String, Object> mdc;
private String exception;
private String errorMessage;
private String stacktrace;
private Map<String, Object> fields = new HashMap<>();
// Getters and Setters
public String getTimestamp() { return timestamp; }
public void setTimestamp(String timestamp) { this.timestamp = timestamp; }
public String getLevel() { return level; }
public void setLevel(String level) { this.level = level; }
public String getService() { return service; }
public void setService(String service) { this.service = service; }
public String getMessage() { return message; }
public void setMessage(String message) { this.message = message; }
public String getThread() { return thread; }
public void setThread(String thread) { this.thread = thread; }
public String getLogger() { return logger; }
public void setLogger(String logger) { this.logger = logger; }
public Map<String, Object> getMdc() { return mdc; }
public void setMdc(Map<String, Object> mdc) { this.mdc = mdc; }
public String getException() { return exception; }
public void setException(String exception) { this.exception = exception; }
public String getErrorMessage() { return errorMessage; }
public void setErrorMessage(String errorMessage) { this.errorMessage = errorMessage; }
public String getStacktrace() { return stacktrace; }
public void setStacktrace(String stacktrace) { this.stacktrace = stacktrace; }
public Map<String, Object> getFields() { return fields; }
public void setFields(Map<String, Object> fields) { this.fields = fields; }
public void addField(String key, Object value) {
this.fields.put(key, value);
}
}
3. MDC (Mapped Diagnostic Context) Manager
import org.slf4j.MDC;
import org.springframework.stereotype.Component;
import java.util.Map;
import java.util.UUID;
@Component
public class LogContext {
public static final String TRACE_ID = "traceId";
public static final String SPAN_ID = "spanId";
public static final String USER_ID = "userId";
public static final String SESSION_ID = "sessionId";
public static final String CORRELATION_ID = "correlationId";
public static final String REQUEST_PATH = "requestPath";
public static final String HTTP_METHOD = "httpMethod";
public void put(String key, String value) {
if (value != null) {
MDC.put(key, value);
}
}
public String get(String key) {
return MDC.get(key);
}
public void remove(String key) {
MDC.remove(key);
}
public void clear() {
MDC.clear();
}
public void initializeContext() {
if (MDC.get(TRACE_ID) == null) {
MDC.put(TRACE_ID, generateId());
}
if (MDC.get(CORRELATION_ID) == null) {
MDC.put(CORRELATION_ID, generateId());
}
}
public void setRequestContext(String method, String path, String userId) {
put(HTTP_METHOD, method);
put(REQUEST_PATH, path);
put(USER_ID, userId);
initializeContext();
}
public Map<String, String> getCopyOfContextMap() {
return MDC.getCopyOfContextMap();
}
private String generateId() {
return UUID.randomUUID().toString().replace("-", "").substring(0, 16);
}
// AutoCloseable for try-with-resources
public static class CloseableContext implements AutoCloseable {
private final Map<String, String> previousContext;
public CloseableContext(Map<String, String> newContext) {
this.previousContext = MDC.getCopyOfContextMap();
MDC.setContextMap(newContext);
}
@Override
public void close() {
// getCopyOfContextMap() returns null when the MDC was empty, and
// Logback's setContextMap rejects null, so restore defensively
if (previousContext != null) {
MDC.setContextMap(previousContext);
} else {
MDC.clear();
}
}
}
}
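The save/restore behaviour of CloseableContext can be verified against a stand-in for MDC (a plain static map here rather than slf4j's thread-local store; all names in this sketch are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class ScopedContextDemo {
    // Stand-in for MDC: a mutable static map instead of slf4j's thread-local store
    static Map<String, String> store = new HashMap<>();

    // Same shape as LogContext.CloseableContext: snapshot and swap the context
    // in the constructor, restore the snapshot in close()
    static class Scope implements AutoCloseable {
        private final Map<String, String> previous;

        Scope(Map<String, String> next) {
            this.previous = new HashMap<>(store);
            store = new HashMap<>(next);
        }

        @Override
        public void close() {
            store = previous;
        }
    }

    public static void main(String[] args) {
        store.put("traceId", "request-abc");
        try (Scope batch = new Scope(Map.of("traceId", "batch-001"))) {
            // Inside the scope, log entries would carry the batch trace ID
            System.out.println(store.get("traceId"));
        }
        // After the scope closes, the original request context is back
        System.out.println(store.get("traceId"));
    }
}
```

This is exactly why the try-with-resources form is safer than manual MDC.put/MDC.remove pairs: the restore runs even when the scoped work throws.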
Spring Boot Integration
1. HTTP Filter for Request Context
@Component
public class LoggingFilter implements Filter {
// Reuse a single logger; constructing a StructuredLogger (and its
// ObjectMapper) per request would be needlessly expensive
private final StructuredLogger logger = new StructuredLogger(LoggingFilter.class);
private final LogContext logContext;
public LoggingFilter(LogContext logContext) {
this.logContext = logContext;
}
@Override
public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
throws IOException, ServletException {
HttpServletRequest httpRequest = (HttpServletRequest) request;
HttpServletResponse httpResponse = (HttpServletResponse) response;
try {
// Initialize logging context
initializeLogContext(httpRequest);
// Log request
logRequest(httpRequest);
long startTime = System.currentTimeMillis();
chain.doFilter(request, response);
long duration = System.currentTimeMillis() - startTime;
// Log response
logResponse(httpRequest, httpResponse, duration);
} finally {
logContext.clear();
}
}
private void initializeLogContext(HttpServletRequest request) {
String userId = extractUserId(request);
logContext.setRequestContext(
request.getMethod(),
request.getRequestURI(),
userId
);
}
private void logRequest(HttpServletRequest request) {
logger.info("Incoming HTTP request",
"method", request.getMethod(),
"path", request.getRequestURI(),
"query", request.getQueryString(),
"userAgent", request.getHeader("User-Agent"),
"clientIp", getClientIp(request));
}
private void logResponse(HttpServletRequest request, HttpServletResponse response, long duration) {
logger.info("HTTP request completed",
"method", request.getMethod(),
"path", request.getRequestURI(),
"status", response.getStatus(),
"durationMs", duration,
"responseSize", response.getHeader("Content-Length"));
}
private String extractUserId(HttpServletRequest request) {
// Extract from JWT, session, or header
return request.getHeader("X-User-Id");
}
private String getClientIp(HttpServletRequest request) {
String xForwardedFor = request.getHeader("X-Forwarded-For");
if (xForwardedFor != null && !xForwardedFor.isEmpty()) {
// First hop is the original client; trim in case of "ip, ip" spacing
return xForwardedFor.split(",")[0].trim();
}
return request.getRemoteAddr();
}
}
2. Spring AOP for Method Logging
@Aspect
@Component
public class MethodLoggingAspect {
private final StructuredLogger logger = new StructuredLogger(MethodLoggingAspect.class);
// Bind the annotation instance so the LogMethod type is resolved from the
// method signature (a bare simple name in the expression may not resolve)
@Around("@annotation(logMethod)")
public Object logMethodExecution(ProceedingJoinPoint joinPoint, LogMethod logMethod) throws Throwable {
String methodName = joinPoint.getSignature().toShortString();
Object[] args = joinPoint.getArgs();
logger.debug("Method execution started",
"method", methodName,
"args", Arrays.toString(args));
long startTime = System.currentTimeMillis();
try {
Object result = joinPoint.proceed();
long duration = System.currentTimeMillis() - startTime;
logger.debug("Method execution completed",
"method", methodName,
"durationMs", duration,
"result", sanitizeResult(result));
return result;
} catch (Exception e) {
long duration = System.currentTimeMillis() - startTime;
logger.error("Method execution failed",
e,
"method", methodName,
"durationMs", duration,
"errorType", e.getClass().getSimpleName());
throw e;
}
}
private Object sanitizeResult(Object result) {
if (result == null) return null;
// Don't log large objects or sensitive data
if (result instanceof String stringResult) {
return stringResult.length() > 100 ?
stringResult.substring(0, 100) + "..." : stringResult;
}
// For collections, log only size
if (result instanceof Collection<?> collection) {
return "Collection[size=" + collection.size() + "]";
}
if (result instanceof Map<?, ?> map) {
return "Map[size=" + map.size() + "]";
}
return result.toString();
}
}
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface LogMethod {
Level value() default Level.DEBUG;
enum Level {
DEBUG, INFO, WARN
}
}
3. Service Implementation Examples
@Service
public class UserService {
private final StructuredLogger logger = new StructuredLogger(UserService.class);
private final LogContext logContext;
private final UserRepository userRepository; // used by searchUsers below
public UserService(LogContext logContext, UserRepository userRepository) {
this.logContext = logContext;
this.userRepository = userRepository;
}
@LogMethod
public User createUser(CreateUserRequest request) {
logger.info("Creating new user",
"email", request.getEmail(),
"username", request.getUsername(),
"requestSource", request.getSource());
try {
// Validate user
validateUser(request);
// Create user in database
User user = saveUser(request);
logger.info("User created successfully",
"userId", user.getId(),
"email", user.getEmail());
return user;
} catch (ValidationException e) {
logger.warn("User creation failed validation",
"email", request.getEmail(),
"validationError", e.getMessage());
throw e;
}
}
@LogMethod(LogMethod.Level.INFO)
public Page<User> searchUsers(UserSearchCriteria criteria) {
logger.info("Searching users",
"criteria", criteria.toString(),
"page", criteria.getPage(),
"size", criteria.getSize());
Page<User> results = userRepository.search(criteria);
logger.info("User search completed",
"resultCount", results.getTotalElements(),
"pageCount", results.getTotalPages());
return results;
}
public void processBatchUsers(List<User> users) {
try (LogContext.CloseableContext ctx =
new LogContext.CloseableContext(createBatchContext())) {
logger.info("Starting batch user processing",
"batchSize", users.size(),
"processingType", "BATCH_UPDATE");
int successCount = 0;
int failureCount = 0;
// Use an index rather than indexOf(): indexOf is O(n) per lookup and
// reports the wrong position when the batch contains duplicates
for (int i = 0; i < users.size(); i++) {
User user = users.get(i);
try {
processSingleUser(user);
successCount++;
} catch (Exception e) {
failureCount++;
logger.error("Failed to process user in batch",
e,
"userId", user.getId(),
"batchPosition", i);
}
}
logger.info("Batch processing completed",
"totalProcessed", users.size(),
"successCount", successCount,
"failureCount", failureCount,
"successRate", calculateSuccessRate(successCount, users.size()));
}
}
private Map<String, String> createBatchContext() {
Map<String, String> context = new HashMap<>();
context.put(LogContext.TRACE_ID, "batch-" + System.currentTimeMillis());
context.put("processingMode", "BATCH");
return context;
}
private double calculateSuccessRate(int success, int total) {
return total > 0 ? (double) success / total * 100 : 0;
}
// ... other methods
}
4. REST Controller with Structured Logging
@RestController
@RequestMapping("/api/users")
public class UserController {
private final StructuredLogger logger = new StructuredLogger(UserController.class);
private final UserService userService;
private final LogContext logContext;
public UserController(UserService userService, LogContext logContext) {
this.userService = userService;
this.logContext = logContext;
}
@PostMapping
public ResponseEntity<UserResponse> createUser(@Valid @RequestBody CreateUserRequest request) {
logger.info("Received create user request",
"email", request.getEmail(),
"username", request.getUsername());
try {
User user = userService.createUser(request);
UserResponse response = UserResponse.from(user);
logger.info("User creation API completed",
"userId", user.getId(),
"status", "SUCCESS");
return ResponseEntity.status(HttpStatus.CREATED).body(response);
} catch (ValidationException e) {
logger.warn("User creation API validation failed",
"email", request.getEmail(),
"error", e.getMessage());
return ResponseEntity.badRequest().build();
}
}
@GetMapping("/{userId}")
public ResponseEntity<UserResponse> getUser(@PathVariable String userId) {
logger.debug("Fetching user by ID",
"userId", userId);
try {
User user = userService.findById(userId);
if (user == null) {
logger.warn("User not found",
"userId", userId);
return ResponseEntity.notFound().build();
}
return ResponseEntity.ok(UserResponse.from(user));
} catch (Exception e) {
logger.error("Failed to fetch user",
e,
"userId", userId,
"operation", "GET_USER");
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
}
}
@GetMapping
public ResponseEntity<PageResponse<UserResponse>> searchUsers(
@ModelAttribute UserSearchRequest request) {
logger.info("Searching users with criteria",
"filters", request.getFilters(),
"page", request.getPage(),
"size", request.getSize(),
"sortBy", request.getSortBy());
UserSearchCriteria criteria = request.toCriteria();
Page<User> users = userService.searchUsers(criteria);
PageResponse<UserResponse> response = PageResponse.from(users, UserResponse::from);
logger.info("User search API completed",
"totalResults", users.getTotalElements(),
"returnedResults", users.getContent().size());
return ResponseEntity.ok(response);
}
}
5. Exception Handler with Structured Logging
@ControllerAdvice
public class GlobalExceptionHandler {
private final StructuredLogger logger = new StructuredLogger(GlobalExceptionHandler.class);
@ExceptionHandler(Exception.class)
public ResponseEntity<ErrorResponse> handleGenericException(Exception e, WebRequest request) {
String path = ((ServletWebRequest) request).getRequest().getRequestURI();
logger.error("Unhandled exception in API",
e,
"path", path,
"exceptionType", e.getClass().getSimpleName(),
"handledBy", "GlobalExceptionHandler");
ErrorResponse error = ErrorResponse.builder()
.errorCode("INTERNAL_ERROR")
.message("An unexpected error occurred")
.timestamp(Instant.now())
.path(path)
.build();
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).body(error);
}
@ExceptionHandler(ValidationException.class)
public ResponseEntity<ErrorResponse> handleValidationException(ValidationException e, WebRequest request) {
String path = ((ServletWebRequest) request).getRequest().getRequestURI();
logger.warn("Validation exception in API",
"path", path,
"validationErrors", e.getErrors(),
"exceptionType", e.getClass().getSimpleName());
ErrorResponse error = ErrorResponse.builder()
.errorCode("VALIDATION_ERROR")
.message("Request validation failed")
.details(e.getErrors())
.timestamp(Instant.now())
.path(path)
.build();
return ResponseEntity.badRequest().body(error);
}
@ExceptionHandler(ResourceNotFoundException.class)
public ResponseEntity<ErrorResponse> handleResourceNotFound(ResourceNotFoundException e, WebRequest request) {
String path = ((ServletWebRequest) request).getRequest().getRequestURI();
logger.info("Resource not found",
"path", path,
"resourceType", e.getResourceType(),
"resourceId", e.getResourceId());
ErrorResponse error = ErrorResponse.builder()
.errorCode("NOT_FOUND")
.message(e.getMessage())
.timestamp(Instant.now())
.path(path)
.build();
return ResponseEntity.status(HttpStatus.NOT_FOUND).body(error);
}
}
Configuration and Customization
1. Application Properties
# application.yml
logging:
  level:
    com.example: DEBUG
    org.springframework: INFO
    org.hibernate: WARN
  config: classpath:logback-spring.xml

app:
  logging:
    service-name: user-service
    environment: ${ENVIRONMENT:local}
    version: ${APP_VERSION:1.0.0}
    enable-json: true
    include-mdc: true
    pretty-print: ${PRETTY_LOGS:false}
2. Custom JSON Layout
// Referenced from logback-spring.xml (e.g. via a LayoutWrappingEncoder),
// not registered as a Spring bean -- Logback instantiates layouts itself.
// Setter names mirror the encoder's configuration properties; exact
// availability varies by logstash-logback-encoder version.
public class CustomJsonLayout extends LogstashLayout {
public CustomJsonLayout() {
setTimestampPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
setIncludeMdc(true);
setIncludeContext(true);
}
@Override
public String doLayout(ILoggingEvent event) {
// Post-process the rendered JSON: add custom fields or rewrite existing
// ones before the line is handed to the appender
String json = super.doLayout(event);
return json;
}
}
Testing Structured Logging
1. Log Capture in Tests
@ExtendWith(SpringExtension.class)
@SpringBootTest
@TestPropertySource(properties = {
"app.logging.enable-json=true",
"app.logging.pretty-print=false"
})
public class StructuredLoggingTest {
@Autowired
private UserService userService;
private TestLogAppender testAppender;
@BeforeEach
void setUp() {
// Attach the capturing appender (defined below) to the service logger
ch.qos.logback.classic.Logger serviceLogger =
(ch.qos.logback.classic.Logger) LoggerFactory.getLogger(UserService.class);
testAppender = new TestLogAppender();
testAppender.start();
serviceLogger.addAppender(testAppender);
}
@Test
void shouldLogStructuredJson() throws JSONException {
CreateUserRequest request = new CreateUserRequest("john@example.com", "john_doe");
userService.createUser(request);
// Verify logs were written and parse the first captured entry
assertThat(testAppender.getEvents()).isNotEmpty();
String logMessage = testAppender.getEvents().get(0).getFormattedMessage();
JSONObject jsonLog = new JSONObject(logMessage);
assertThat(jsonLog.getString("level")).isEqualTo("INFO");
assertThat(jsonLog.getString("message")).contains("Creating new user");
// Custom fields live under "fields", MDC values under "mdc" (see LogEntry)
assertThat(jsonLog.getJSONObject("fields").getString("email")).isEqualTo("john@example.com");
assertThat(jsonLog.has("timestamp")).isTrue();
}
}
// Test Appender for capturing logs
class TestLogAppender extends AppenderBase<ILoggingEvent> {
private final List<ILoggingEvent> events = new ArrayList<>();
@Override
protected void append(ILoggingEvent event) {
events.add(event);
}
public List<ILoggingEvent> getEvents() {
return events;
}
public void clear() {
events.clear();
}
}
Best Practices
- Consistent Field Names: Use standardized field names across services
- Sensitive Data: Never log passwords, tokens, or PII
- Field Cardinality: Avoid unbounded, dynamic field names (such as an ID used as a key), which bloat index mappings in log stores
- Performance: Use async appenders in production
- Structured Exceptions: Log exceptions with proper context
- Correlation IDs: Include trace IDs for distributed tracing
// Example of safe logging practices
public class SafeStructuredLogger extends StructuredLogger {
// StructuredLogger defines no no-arg constructor, so delegate explicitly
public SafeStructuredLogger(Class<?> clazz) {
super(clazz);
}
@Override
protected void processCustomFields(LogEntry entry, Object[] fields) {
for (int i = 0; i < fields.length; i += 2) {
if (i + 1 < fields.length) {
String key = fields[i].toString();
Object value = fields[i + 1];
if (!isSensitiveField(key)) {
entry.addField(key, sanitizeValue(key, value));
}
}
}
}
private boolean isSensitiveField(String key) {
List<String> sensitivePatterns = List.of("password", "token", "secret", "credit", "ssn");
return sensitivePatterns.stream()
.anyMatch(pattern -> key.toLowerCase().contains(pattern));
}
private Object sanitizeValue(String key, Object value) {
if (value instanceof String stringValue) {
if (isSensitiveField(key)) {
return "***REDACTED***";
}
if (key.toLowerCase().contains("email")) {
return maskEmail(stringValue);
}
}
return value;
}
private String maskEmail(String email) {
// Basic email masking for logs
int atIndex = email.indexOf('@');
if (atIndex > 2) {
return email.substring(0, 2) + "***" + email.substring(atIndex);
}
return "***@***";
}
}
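The masking rule is worth checking against concrete inputs. The same method body, lifted into a small standalone harness (the class name here is illustrative):

```java
public class EmailMaskDemo {
    // Same rule as SafeStructuredLogger.maskEmail: keep the first two
    // characters and the domain, redact the rest of the local part
    static String maskEmail(String email) {
        int atIndex = email.indexOf('@');
        if (atIndex > 2) {
            return email.substring(0, 2) + "***" + email.substring(atIndex);
        }
        return "***@***";
    }

    public static void main(String[] args) {
        System.out.println(maskEmail("john@example.com")); // jo***@example.com
        // Local parts of two characters or fewer are fully redacted
        System.out.println(maskEmail("jo@example.com"));   // ***@***
    }
}
```

The first output matches the "jo***@example.com" value in the sample log output, confirming the pipeline from field sanitization to emitted JSON.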
Sample Log Output
{
"timestamp": "2023-10-15T14:30:45.123Z",
"level": "INFO",
"service": "user-service",
"message": "User created successfully",
"thread": "http-nio-8080-exec-1",
"logger": "com.example.service.UserService",
"mdc": {
"traceId": "abc123def4567890",
"userId": "user-123",
"requestPath": "/api/users"
},
"fields": {
"userId": "user-123",
"email": "jo***@example.com",
"operation": "CREATE_USER",
"durationMs": 45
}
}
{
"timestamp": "2023-10-15T14:30:46.234Z",
"level": "ERROR",
"service": "user-service",
"message": "Failed to process user in batch",
"thread": "batch-processor-1",
"logger": "com.example.service.UserService",
"exception": "com.example.exception.DatabaseException",
"errorMessage": "Connection timeout",
"stacktrace": "com.example.exception.DatabaseException: Connection timeout...",
"mdc": {
"traceId": "batch-1697380245123",
"processingMode": "BATCH"
},
"fields": {
"userId": "user-456",
"batchPosition": 15,
"batchId": "batch-123"
}
}
Conclusion
Structured JSON logging provides:
- Machine-readable logs for automated processing
- Enhanced searchability and filtering capabilities
- Better integration with modern observability platforms
- Rich context for debugging and monitoring
- Consistent format across microservices
By implementing structured logging with JSON, you transform your logs from simple text messages into valuable data streams that can be efficiently analyzed, correlated, and used for both debugging and business intelligence purposes. The combination of proper MDC management, consistent field naming, and thoughtful log enrichment creates a powerful foundation for application observability.
Java Observability, Logging Intelligence & AI-Driven Monitoring (APM, Tracing, Logs & Anomaly Detection)
https://macronepal.com/blog/beyond-metrics-observing-serverless-and-traditional-java-applications-with-thundra-apm/
Explains using Thundra APM to observe both serverless and traditional Java applications by combining tracing, metrics, and logs into a unified observability platform for faster debugging and performance insights.
https://macronepal.com/blog/dynatrace-oneagent-in-java-2/
Explains Dynatrace OneAgent for Java, which automatically instruments JVM applications to capture metrics, traces, and logs, enabling full-stack monitoring and root-cause analysis with minimal configuration.
https://macronepal.com/blog/lightstep-java-sdk-distributed-tracing-and-observability-implementation/
Explains Lightstep Java SDK for distributed tracing, helping developers track requests across microservices and identify latency issues using OpenTelemetry-based observability.
https://macronepal.com/blog/honeycomb-io-beeline-for-java-complete-guide-2/
Explains Honeycomb Beeline for Java, which provides high-cardinality observability and deep query capabilities to understand complex system behavior and debug distributed systems efficiently.
https://macronepal.com/blog/lumigo-for-serverless-in-java-complete-distributed-tracing-guide-2/
Explains Lumigo for Java serverless applications, offering automatic distributed tracing, log correlation, and error tracking to simplify debugging in cloud-native environments. (Lumigo Docs)
https://macronepal.com/blog/from-noise-to-signals-implementing-log-anomaly-detection-in-java-applications/
Explains how to detect anomalies in Java logs using behavioral patterns and machine learning techniques to separate meaningful incidents from noisy log data and improve incident response.
https://macronepal.com/blog/ai-powered-log-analysis-in-java-from-reactive-debugging-to-proactive-insights/
Explains AI-driven log analysis for Java applications, shifting from manual debugging to predictive insights that identify issues early and improve system reliability using intelligent log processing.
https://macronepal.com/blog/titliel-java-logging-best-practices/
Explains best practices for Java logging, focusing on structured logs, proper log levels, performance optimization, and ensuring logs are useful for debugging and observability systems.
https://macronepal.com/blog/seeking-a-loguru-for-java-the-quest-for-elegant-and-simple-logging/
Explains the search for simpler, more elegant logging frameworks in Java, comparing modern logging approaches that aim to reduce complexity while improving readability and developer experience.