CloudWatch Embedded Metrics in Java: Structured Logging for AWS Monitoring

CloudWatch Embedded Metric Format (EMF) enables Java applications to generate structured logs that CloudWatch automatically parses into custom metrics. This approach eliminates separate PutMetricData API calls while providing rich, contextual monitoring data, letting Java developers build detailed observability with minimal overhead.

Understanding CloudWatch EMF Architecture

Key Components:

  • Structured JSON logs with embedded metric definitions
  • Automatic metric extraction by CloudWatch Logs
  • Custom dimensions for metric segmentation
  • High-cardinality data in log context
  • Asynchronous processing without blocking application code
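Concretely, an EMF log event is a single-line JSON document (shown pretty-printed here for readability only; CloudWatch Logs treats each line as a separate event). The _aws member declares the metrics, and every declared metric name and dimension key must also appear as a top-level member carrying its value. The names and values below are illustrative:

```json
{
  "_aws": {
    "Timestamp": 1700000000000,
    "CloudWatchMetrics": [
      {
        "Namespace": "ApplicationMetrics",
        "Dimensions": [["Service", "Environment"]],
        "Metrics": [
          { "Name": "http.request.duration", "Unit": "Milliseconds" }
        ]
      }
    ]
  },
  "Service": "order-service",
  "Environment": "production",
  "http.request.duration": 42,
  "user_id": "u-123"
}
```

Any top-level member not referenced in CloudWatchMetrics (such as user_id above) is stored as a queryable log property without becoming a metric, which is how EMF carries high-cardinality context cheaply.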

Core Implementation Patterns

1. Project Setup and Dependencies

Configure the JSON-processing and logging dependencies. Note that EMF itself only requires that log output reach CloudWatch Logs; the AWS SDK artifacts below are needed only if you also call CloudWatch APIs directly.

Maven Configuration:

<dependencies>
<!-- AWS SDK for CloudWatch -->
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>cloudwatch</artifactId>
<version>2.20.0</version>
</dependency>
<!-- AWS SDK for CloudWatch Logs -->
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>cloudwatchlogs</artifactId>
<version>2.20.0</version>
</dependency>
<!-- JSON Processing -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.15.2</version>
</dependency>
<!-- Logging Framework -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>2.0.7</version>
</dependency>
<!-- Micrometer for additional metrics -->
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-core</artifactId>
<version>1.11.5</version>
</dependency>
</dependencies>
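Hand-rolling EMF documents, as this article does, keeps the pipeline transparent; for production use you may prefer AWS's official EMF client library for Java, which handles serialization and flushing for you. A sketch of the dependency (look up the current release on Maven Central):

```xml
<!-- Optional alternative: AWS's official EMF client for Java -->
<dependency>
<groupId>software.amazon.cloudwatchlogs</groupId>
<artifactId>aws-embedded-metrics</artifactId>
<version><!-- check Maven Central for the current release --></version>
</dependency>
```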

2. EMF Domain Models

Create comprehensive models for EMF data structures.

Core Domain Models:

@Getter // @Data would also generate a @RequiredArgsConstructor that duplicates the explicit four-argument constructor below
public class EmbeddedMetric {
private final String name;
private final double value;
private final Unit unit;
private final Map<String, String> dimensions;
public EmbeddedMetric(String name, double value, Unit unit) {
this(name, value, unit, new HashMap<>());
}
public EmbeddedMetric(String name, double value, Unit unit, Map<String, String> dimensions) {
this.name = name;
this.value = value;
this.unit = unit;
this.dimensions = new HashMap<>(dimensions);
}
public EmbeddedMetric addDimension(String key, String value) {
this.dimensions.put(key, value);
return this;
}
public enum Unit {
SECONDS("Seconds"),
MICROSECONDS("Microseconds"),
MILLISECONDS("Milliseconds"),
BYTES("Bytes"),
KILOBYTES("Kilobytes"),
MEGABYTES("Megabytes"),
GIGABYTES("Gigabytes"),
TERABYTES("Terabytes"),
BITS("Bits"),
KILOBITS("Kilobits"),
MEGABITS("Megabits"),
GIGABITS("Gigabits"),
TERABITS("Terabits"),
PERCENT("Percent"),
COUNT("Count"),
BYTES_PER_SECOND("Bytes/Second"),
KILOBYTES_PER_SECOND("Kilobytes/Second"),
MEGABYTES_PER_SECOND("Megabytes/Second"),
GIGABYTES_PER_SECOND("Gigabytes/Second"),
TERABYTES_PER_SECOND("Terabytes/Second"),
BITS_PER_SECOND("Bits/Second"),
KILOBITS_PER_SECOND("Kilobits/Second"),
MEGABITS_PER_SECOND("Megabits/Second"),
GIGABITS_PER_SECOND("Gigabits/Second"),
TERABITS_PER_SECOND("Terabits/Second"),
COUNT_PER_SECOND("Count/Second"),
NONE("None");
private final String value;
Unit(String value) {
this.value = value;
}
public String getValue() {
return value;
}
}
}
@Data
public class MetricDocument {
private String namespace;
private List<Map<String, Object>> metrics;
private Map<String, String> dimensions;
private Map<String, Object> properties;
private long timestamp;
public MetricDocument(String namespace) {
this.namespace = namespace;
this.metrics = new ArrayList<>();
this.dimensions = new HashMap<>();
this.properties = new HashMap<>();
this.timestamp = System.currentTimeMillis();
}
public MetricDocument addMetric(EmbeddedMetric metric) {
Map<String, Object> metricMap = new HashMap<>();
metricMap.put("Name", metric.getName());
metricMap.put("Unit", metric.getUnit().getValue());
// Emit whole numbers without a fractional part (use long, not int: byte counts can exceed 2^31)
if (metric.getValue() == Math.floor(metric.getValue()) && !Double.isInfinite(metric.getValue())) {
metricMap.put("Value", (long) metric.getValue());
} else {
metricMap.put("Value", metric.getValue());
}
this.metrics.add(metricMap);
return this;
}
public MetricDocument addDimension(String key, String value) {
this.dimensions.put(key, value);
return this;
}
public MetricDocument addProperty(String key, Object value) {
this.properties.put(key, value);
return this;
}
public String toJsonString() {
try {
ObjectMapper mapper = new ObjectMapper();
Map<String, Object> document = new HashMap<>();
// Add AWS EMF structure
document.put("_aws", createAwsStructure());
// EMF requires every dimension key and metric name referenced in
// CloudWatchMetrics to exist as a top-level member holding its value
dimensions.forEach(document::put);
for (Map<String, Object> metric : metrics) {
document.put((String) metric.get("Name"), metric.get("Value"));
}
// Add custom properties
document.putAll(properties);
// EMF documents must be a single line: CloudWatch Logs treats each line
// as a separate log event, so pretty-printing would break metric extraction
return mapper.writeValueAsString(document);
} catch (Exception e) {
throw new RuntimeException("Failed to serialize metric document to JSON", e);
}
}
private Map<String, Object> createAwsStructure() {
Map<String, Object> aws = new HashMap<>();
aws.put("Timestamp", timestamp);
aws.put("CloudWatchMetrics", createCloudWatchMetricsStructure());
return aws;
}
private List<Map<String, Object>> createCloudWatchMetricsStructure() {
Map<String, Object> metricsDefinition = new HashMap<>();
metricsDefinition.put("Namespace", namespace);
metricsDefinition.put("Dimensions", List.of(new ArrayList<>(dimensions.keySet())));
metricsDefinition.put("Metrics", metrics.stream()
.map(m -> {
Map<String, Object> metricDef = new HashMap<>();
metricDef.put("Name", m.get("Name"));
metricDef.put("Unit", m.get("Unit"));
return metricDef;
})
.collect(Collectors.toList()));
return List.of(metricsDefinition);
}
}
@Data
public class MetricBatch {
private final List<MetricDocument> documents;
private final String logGroupName;
private final String logStreamName;
public MetricBatch(String logGroupName, String logStreamName) {
this.documents = new ArrayList<>();
this.logGroupName = logGroupName;
this.logStreamName = logStreamName;
}
public void addDocument(MetricDocument document) {
documents.add(document);
}
public boolean isEmpty() {
return documents.isEmpty();
}
public int size() {
return documents.size();
}
}
@Data
public class MetricContext {
private String namespace;
private Map<String, String> defaultDimensions;
private Map<String, Object> defaultProperties;
private String serviceName;
private String environment;
public MetricContext(String namespace, String serviceName, String environment) {
this.namespace = namespace;
this.serviceName = serviceName;
this.environment = environment;
this.defaultDimensions = new HashMap<>();
this.defaultProperties = new HashMap<>();
// Set default dimensions
defaultDimensions.put("Service", serviceName);
defaultDimensions.put("Environment", environment);
}
public MetricContext addDefaultDimension(String key, String value) {
defaultDimensions.put(key, value);
return this;
}
public MetricContext addDefaultProperty(String key, Object value) {
defaultProperties.put(key, value);
return this;
}
}
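A subtle point in addMetric above is deciding whether a value should be serialized as an integer or as a double. The check can be sketched standalone (EmfValueDemo and normalize are hypothetical names used only for this illustration):

```java
public class EmfValueDemo {
    // Emit whole doubles as longs so the serialized JSON shows e.g. 42
    // rather than 42.0; use long rather than int because values such as
    // byte counts can exceed Integer.MAX_VALUE.
    public static Object normalize(double value) {
        if (value == Math.floor(value) && !Double.isInfinite(value)) {
            return (long) value;
        }
        return value;
    }
}
```

The infinity guard matters because Math.floor(Double.POSITIVE_INFINITY) equals the input, so without it an infinite value would be cast to a long.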

3. CloudWatch EMF Service

Implement core EMF logging functionality.

EMF Service:

@Service
@Slf4j
public class CloudWatchEMFService {
private final ObjectMapper objectMapper;
private final MetricContext defaultContext;
private final boolean enabled;
private final String logGroupName;
public CloudWatchEMFService(
@Value("${app.metrics.namespace:ApplicationMetrics}") String namespace,
@Value("${app.metrics.service:UnknownService}") String serviceName,
@Value("${app.metrics.environment:Unknown}") String environment,
@Value("${app.metrics.enabled:true}") boolean enabled) {
this.objectMapper = new ObjectMapper();
this.defaultContext = new MetricContext(namespace, serviceName, environment);
this.enabled = enabled;
this.logGroupName = String.format("/aws/metrics/%s/%s", serviceName, environment);
log.info("CloudWatch EMF Service initialized - Namespace: {}, Enabled: {}", 
namespace, enabled);
}
public void recordMetric(EmbeddedMetric metric) {
recordMetric(metric, defaultContext, new HashMap<>());
}
public void recordMetric(EmbeddedMetric metric, Map<String, Object> properties) {
recordMetric(metric, defaultContext, properties);
}
public void recordMetric(EmbeddedMetric metric, MetricContext context, 
Map<String, Object> properties) {
if (!enabled) {
log.debug("Metrics disabled, skipping: {}", metric.getName());
return;
}
try {
MetricDocument document = createMetricDocument(metric, context, properties);
String emfJson = document.toJsonString();
// Log the EMF formatted JSON
log.info(emfJson);
log.debug("Recorded EMF metric: {} = {}", metric.getName(), metric.getValue());
} catch (Exception e) {
log.error("Failed to record EMF metric: {}", metric.getName(), e);
}
}
public void recordMetrics(List<EmbeddedMetric> metrics) {
recordMetrics(metrics, defaultContext, new HashMap<>());
}
public void recordMetrics(List<EmbeddedMetric> metrics, Map<String, Object> properties) {
recordMetrics(metrics, defaultContext, properties);
}
public void recordMetrics(List<EmbeddedMetric> metrics, MetricContext context,
Map<String, Object> properties) {
if (!enabled || metrics.isEmpty()) {
return;
}
try {
MetricDocument document = createMetricDocument(context, properties);
for (EmbeddedMetric metric : metrics) {
document.addMetric(metric);
}
String emfJson = document.toJsonString();
log.info(emfJson);
log.debug("Recorded {} EMF metrics", metrics.size());
} catch (Exception e) {
log.error("Failed to record EMF metrics batch", e);
}
}
public void recordTiming(String metricName, long durationMs, Map<String, String> dimensions) {
recordTiming(metricName, durationMs, dimensions, new HashMap<>());
}
public void recordTiming(String metricName, long durationMs, Map<String, String> dimensions,
Map<String, Object> properties) {
EmbeddedMetric metric = new EmbeddedMetric(metricName, durationMs, 
EmbeddedMetric.Unit.MILLISECONDS, dimensions);
recordMetric(metric, properties);
}
public void recordCount(String metricName, double count, Map<String, String> dimensions) {
recordCount(metricName, count, dimensions, new HashMap<>());
}
public void recordCount(String metricName, double count, Map<String, String> dimensions,
Map<String, Object> properties) {
EmbeddedMetric metric = new EmbeddedMetric(metricName, count, 
EmbeddedMetric.Unit.COUNT, dimensions);
recordMetric(metric, properties);
}
public void recordError(String errorType, String source, Map<String, String> dimensions) {
Map<String, Object> properties = new HashMap<>();
properties.put("error_type", errorType);
properties.put("error_source", source);
properties.put("stack_trace", getStackTrace());
EmbeddedMetric metric = new EmbeddedMetric("Errors", 1, 
EmbeddedMetric.Unit.COUNT, dimensions);
recordMetric(metric, properties);
}
public MetricContext createContext(String namespace, String serviceName, String environment) {
return new MetricContext(namespace, serviceName, environment);
}
private MetricDocument createMetricDocument(EmbeddedMetric metric, MetricContext context,
Map<String, Object> properties) {
MetricDocument document = createMetricDocument(context, properties);
document.addMetric(metric);
return document;
}
private MetricDocument createMetricDocument(MetricContext context, 
Map<String, Object> properties) {
MetricDocument document = new MetricDocument(context.getNamespace());
document.setTimestamp(System.currentTimeMillis());
// Add default dimensions
context.getDefaultDimensions().forEach(document::addDimension);
// Add default properties
context.getDefaultProperties().forEach(document::addProperty);
// Add custom properties
properties.forEach(document::addProperty);
// Add metadata
document.addProperty("timestamp", Instant.ofEpochMilli(document.getTimestamp()).toString());
document.addProperty("service", context.getServiceName());
document.addProperty("environment", context.getEnvironment());
return document;
}
private String getStackTrace() {
StringWriter sw = new StringWriter();
PrintWriter pw = new PrintWriter(sw);
new Throwable().printStackTrace(pw);
return sw.toString();
}
public boolean isEnabled() {
return enabled;
}
}
@Component
@Slf4j
public class EMFLogger {
private final CloudWatchEMFService emfService;
private final Logger jsonLogger;
public EMFLogger(CloudWatchEMFService emfService) {
this.emfService = emfService;
this.jsonLogger = LoggerFactory.getLogger("CLOUDWATCH_EMF");
}
public void logMetricDocument(MetricDocument document) {
if (!emfService.isEnabled()) {
return;
}
try {
String emfJson = document.toJsonString();
jsonLogger.info(emfJson);
} catch (Exception e) {
log.error("Failed to log EMF document", e);
}
}
public void logMetricsBatch(List<MetricDocument> documents) {
if (!emfService.isEnabled() || documents.isEmpty()) {
return;
}
for (MetricDocument document : documents) {
logMetricDocument(document);
}
log.debug("Logged batch of {} EMF documents", documents.size());
}
}

4. Metrics Collection and Aggregation

Implement metrics collection with buffering and batching.

Metrics Collector:

@Service
@Slf4j
public class MetricsCollector {
private final CloudWatchEMFService emfService;
private final Map<String, AtomicLong> counters;
// AtomicDouble is Guava's (com.google.common.util.concurrent.AtomicDouble);
// add the Guava dependency or substitute your own thread-safe holder
private final Map<String, AtomicDouble> gauges;
private final Map<String, Histogram> histograms;
private final ScheduledExecutorService scheduler;
public MetricsCollector(CloudWatchEMFService emfService) {
this.emfService = emfService;
this.counters = new ConcurrentHashMap<>();
this.gauges = new ConcurrentHashMap<>();
this.histograms = new ConcurrentHashMap<>();
this.scheduler = Executors.newSingleThreadScheduledExecutor();
startPeriodicFlush();
}
public void incrementCounter(String name, Map<String, String> dimensions) {
incrementCounter(name, 1, dimensions);
}
public void incrementCounter(String name, double value, Map<String, String> dimensions) {
String key = createKey(name, dimensions);
AtomicLong counter = counters.computeIfAbsent(key, k -> new AtomicLong(0));
counter.addAndGet((long) value); // fractional increments are truncated; counters hold whole counts
}
public void setGauge(String name, double value, Map<String, String> dimensions) {
String key = createKey(name, dimensions);
AtomicDouble gauge = gauges.computeIfAbsent(key, k -> new AtomicDouble(value));
gauge.set(value);
}
public void recordHistogram(String name, double value, Map<String, String> dimensions) {
String key = createKey(name, dimensions);
Histogram histogram = histograms.computeIfAbsent(key, k -> new Histogram());
histogram.record(value);
}
public void recordTiming(String name, long durationMs, Map<String, String> dimensions) {
recordHistogram(name + ".duration", durationMs, dimensions);
// Also record as a separate timing metric
Map<String, Object> properties = new HashMap<>();
properties.put("metric_type", "timing");
properties.put("operation", name);
emfService.recordTiming(name, durationMs, dimensions, properties);
}
public void recordError(String errorType, String source, Map<String, String> dimensions) {
incrementCounter("errors", Map.of("type", errorType, "source", source));
emfService.recordError(errorType, source, dimensions);
}
public void flush() {
flushCounters();
flushGauges();
flushHistograms();
}
public void flushCounters() {
if (counters.isEmpty()) {
return;
}
List<EmbeddedMetric> metrics = new ArrayList<>();
Map<String, AtomicLong> countersSnapshot = new HashMap<>(counters);
counters.clear();
for (Map.Entry<String, AtomicLong> entry : countersSnapshot.entrySet()) {
String key = entry.getKey();
long value = entry.getValue().get();
ParsedKey parsedKey = parseKey(key);
EmbeddedMetric metric = new EmbeddedMetric(
parsedKey.getName(), 
value, 
EmbeddedMetric.Unit.COUNT,
parsedKey.getDimensions()
);
metrics.add(metric);
}
if (!metrics.isEmpty()) {
emfService.recordMetrics(metrics);
log.debug("Flushed {} counter metrics", metrics.size());
}
}
public void flushGauges() {
if (gauges.isEmpty()) {
return;
}
List<EmbeddedMetric> metrics = new ArrayList<>();
Map<String, AtomicDouble> gaugesSnapshot = new HashMap<>(gauges);
for (Map.Entry<String, AtomicDouble> entry : gaugesSnapshot.entrySet()) {
String key = entry.getKey();
double value = entry.getValue().get();
ParsedKey parsedKey = parseKey(key);
EmbeddedMetric metric = new EmbeddedMetric(
parsedKey.getName(), 
value, 
EmbeddedMetric.Unit.NONE,
parsedKey.getDimensions()
);
metrics.add(metric);
}
if (!metrics.isEmpty()) {
emfService.recordMetrics(metrics);
log.debug("Flushed {} gauge metrics", metrics.size());
}
}
public void flushHistograms() {
if (histograms.isEmpty()) {
return;
}
List<EmbeddedMetric> metrics = new ArrayList<>();
Map<String, Histogram> histogramsSnapshot = new HashMap<>(histograms);
histograms.clear();
for (Map.Entry<String, Histogram> entry : histogramsSnapshot.entrySet()) {
String key = entry.getKey();
Histogram histogram = entry.getValue();
ParsedKey parsedKey = parseKey(key);
// Record multiple histogram metrics
metrics.add(new EmbeddedMetric(
parsedKey.getName() + ".count",
histogram.getCount(),
EmbeddedMetric.Unit.COUNT,
parsedKey.getDimensions()
));
metrics.add(new EmbeddedMetric(
parsedKey.getName() + ".avg",
histogram.getAverage(),
EmbeddedMetric.Unit.NONE,
parsedKey.getDimensions()
));
metrics.add(new EmbeddedMetric(
parsedKey.getName() + ".min",
histogram.getMin(),
EmbeddedMetric.Unit.NONE,
parsedKey.getDimensions()
));
metrics.add(new EmbeddedMetric(
parsedKey.getName() + ".max",
histogram.getMax(),
EmbeddedMetric.Unit.NONE,
parsedKey.getDimensions()
));
metrics.add(new EmbeddedMetric(
parsedKey.getName() + ".p95",
histogram.getPercentile(95),
EmbeddedMetric.Unit.NONE,
parsedKey.getDimensions()
));
}
if (!metrics.isEmpty()) {
emfService.recordMetrics(metrics);
log.debug("Flushed {} histogram metrics", metrics.size());
}
}
private String createKey(String name, Map<String, String> dimensions) {
String dimensionString = dimensions.entrySet().stream()
.sorted(Map.Entry.comparingByKey())
.map(entry -> entry.getKey() + "=" + entry.getValue())
.collect(Collectors.joining(";"));
return name + "|" + dimensionString;
}
private ParsedKey parseKey(String key) {
String[] parts = key.split("\\|", 2);
String name = parts[0];
Map<String, String> dimensions = new HashMap<>();
if (parts.length > 1 && !parts[1].isEmpty()) {
String[] dimensionPairs = parts[1].split(";");
for (String pair : dimensionPairs) {
String[] keyValue = pair.split("=", 2);
if (keyValue.length == 2) {
dimensions.put(keyValue[0], keyValue[1]);
}
}
}
return new ParsedKey(name, dimensions);
}
private void startPeriodicFlush() {
scheduler.scheduleAtFixedRate(() -> {
try {
flush();
} catch (Exception e) {
log.error("Failed to flush metrics", e);
}
}, 60, 60, TimeUnit.SECONDS); // Flush every 60 seconds
}
@PreDestroy
public void shutdown() {
scheduler.shutdown();
try {
// Final flush before shutdown
flush();
} catch (Exception e) {
log.error("Failed to flush metrics during shutdown", e);
}
}
@Data
private static class ParsedKey {
private final String name;
private final Map<String, String> dimensions;
}
@Data
private static class Histogram {
private final List<Double> values = new ArrayList<>();
private double sum = 0;
private double min = Double.POSITIVE_INFINITY;
private double max = Double.NEGATIVE_INFINITY; // not Double.MIN_VALUE, which is the smallest *positive* double
public void record(double value) {
synchronized (values) {
values.add(value);
sum += value;
min = Math.min(min, value);
max = Math.max(max, value);
}
}
public long getCount() {
return values.size();
}
public double getAverage() {
return getCount() > 0 ? sum / getCount() : 0;
}
public double getMin() {
return getCount() > 0 ? min : 0;
}
public double getMax() {
return getCount() > 0 ? max : 0;
}
public double getPercentile(double percentile) {
List<Double> sorted;
synchronized (values) { // copy under the same lock record() uses
if (values.isEmpty()) {
return 0;
}
sorted = new ArrayList<>(values);
}
Collections.sort(sorted);
int index = (int) Math.ceil(percentile / 100.0 * sorted.size()) - 1;
index = Math.max(0, Math.min(index, sorted.size() - 1));
return sorted.get(index);
}
}
}
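The createKey/parseKey pair above folds a metric name and its dimensions into a single map key so that identical dimension sets aggregate together. A self-contained sketch of that round trip (MetricKeyDemo is a hypothetical name; note the scheme assumes dimension keys and values contain none of the delimiter characters |, ; or =):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

public class MetricKeyDemo {
    // Encode name + dimensions; sorting keys guarantees that equal
    // dimension sets always produce the same key.
    public static String createKey(String name, Map<String, String> dimensions) {
        String dims = new TreeMap<>(dimensions).entrySet().stream()
            .map(e -> e.getKey() + "=" + e.getValue())
            .collect(Collectors.joining(";"));
        return name + "|" + dims;
    }

    // Invert createKey back into the dimension map.
    public static Map<String, String> parseDimensions(String key) {
        Map<String, String> dimensions = new HashMap<>();
        String[] parts = key.split("\\|", 2);
        if (parts.length > 1 && !parts[1].isEmpty()) {
            for (String pair : parts[1].split(";")) {
                String[] kv = pair.split("=", 2);
                if (kv.length == 2) dimensions.put(kv[0], kv[1]);
            }
        }
        return dimensions;
    }
}
```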

5. Application Metrics Integration

Integrate EMF with application business logic.

Application Metrics Service:

@Service
@Slf4j
public class ApplicationMetricsService {
private final CloudWatchEMFService emfService;
private final MetricsCollector metricsCollector;
public ApplicationMetricsService(CloudWatchEMFService emfService,
MetricsCollector metricsCollector) {
this.emfService = emfService;
this.metricsCollector = metricsCollector;
}
public void recordHttpRequest(String method, String path, int statusCode, 
long durationMs, String userId) {
Map<String, String> dimensions = new HashMap<>();
dimensions.put("http_method", method);
dimensions.put("http_path", path);
dimensions.put("http_status", String.valueOf(statusCode));
dimensions.put("status_category", getStatusCategory(statusCode));
Map<String, Object> properties = new HashMap<>();
properties.put("user_id", userId);
properties.put("path", path);
properties.put("method", method);
// Record timing
emfService.recordTiming("http.request.duration", durationMs, dimensions, properties);
// Record count
metricsCollector.incrementCounter("http.requests", dimensions);
// Record error if applicable
if (statusCode >= 400) {
metricsCollector.recordError("http_error", "http_request", dimensions);
}
}
public void recordDatabaseQuery(String operation, String table, long durationMs, 
boolean success, int rowCount) {
Map<String, String> dimensions = new HashMap<>();
dimensions.put("db_operation", operation);
dimensions.put("db_table", table);
dimensions.put("success", String.valueOf(success));
Map<String, Object> properties = new HashMap<>();
properties.put("row_count", rowCount);
properties.put("operation", operation);
emfService.recordTiming("db.query.duration", durationMs, dimensions, properties);
metricsCollector.incrementCounter("db.queries", dimensions);
if (!success) {
metricsCollector.recordError("db_error", "database_query", dimensions);
}
}
public void recordBusinessEvent(String eventType, String userId, 
Map<String, Object> eventData) {
Map<String, String> dimensions = new HashMap<>();
dimensions.put("event_type", eventType);
dimensions.put("user_id", userId);
Map<String, Object> properties = new HashMap<>();
properties.put("event_type", eventType);
properties.put("user_id", userId);
properties.put("event_data", eventData);
properties.put("timestamp", Instant.now().toString());
metricsCollector.incrementCounter("business.events", dimensions);
EmbeddedMetric metric = new EmbeddedMetric("business.event", 1, 
EmbeddedMetric.Unit.COUNT, dimensions);
emfService.recordMetric(metric, properties);
}
public void recordCacheOperation(String operation, String cacheName, 
long durationMs, boolean hit) {
Map<String, String> dimensions = new HashMap<>();
dimensions.put("cache_operation", operation);
dimensions.put("cache_name", cacheName);
dimensions.put("hit", String.valueOf(hit));
emfService.recordTiming("cache.operation.duration", durationMs, dimensions);
metricsCollector.incrementCounter("cache.operations", dimensions);
if (hit) {
metricsCollector.incrementCounter("cache.hits", dimensions);
} else {
metricsCollector.incrementCounter("cache.misses", dimensions);
}
}
public void recordExternalApiCall(String service, String endpoint, 
int statusCode, long durationMs) {
Map<String, String> dimensions = new HashMap<>();
dimensions.put("external_service", service);
dimensions.put("endpoint", endpoint);
dimensions.put("status_code", String.valueOf(statusCode));
Map<String, Object> properties = new HashMap<>();
properties.put("service", service);
properties.put("endpoint", endpoint);
emfService.recordTiming("external.api.duration", durationMs, dimensions, properties);
metricsCollector.incrementCounter("external.api.calls", dimensions);
if (statusCode >= 400) {
metricsCollector.recordError("external_api_error", service, dimensions);
}
}
public void recordQueueOperation(String queueName, String operation, 
long durationMs, int messageCount) {
Map<String, String> dimensions = new HashMap<>();
dimensions.put("queue_name", queueName);
dimensions.put("operation", operation);
Map<String, Object> properties = new HashMap<>();
properties.put("message_count", messageCount);
properties.put("queue", queueName);
emfService.recordTiming("queue.operation.duration", durationMs, dimensions, properties);
metricsCollector.incrementCounter("queue.operations", dimensions);
}
public void recordCustomBusinessMetric(String metricName, double value,
Map<String, String> dimensions,
Map<String, Object> properties) {
EmbeddedMetric metric = new EmbeddedMetric(metricName, value, 
EmbeddedMetric.Unit.NONE, dimensions);
emfService.recordMetric(metric, properties);
}
public void recordJvmMetrics() {
Runtime runtime = Runtime.getRuntime();
Map<String, String> dimensions = new HashMap<>();
dimensions.put("metric_type", "jvm");
Map<String, Object> properties = new HashMap<>();
properties.put("free_memory", runtime.freeMemory());
properties.put("total_memory", runtime.totalMemory());
properties.put("max_memory", runtime.maxMemory());
properties.put("available_processors", runtime.availableProcessors());
List<EmbeddedMetric> metrics = Arrays.asList(
new EmbeddedMetric("jvm.memory.used", 
runtime.totalMemory() - runtime.freeMemory(), 
EmbeddedMetric.Unit.BYTES, dimensions),
new EmbeddedMetric("jvm.memory.free", 
runtime.freeMemory(), 
EmbeddedMetric.Unit.BYTES, dimensions),
new EmbeddedMetric("jvm.memory.max", 
runtime.maxMemory(), 
EmbeddedMetric.Unit.BYTES, dimensions),
new EmbeddedMetric("jvm.thread.count", 
Thread.activeCount(), 
EmbeddedMetric.Unit.COUNT, dimensions)
);
emfService.recordMetrics(metrics, properties);
}
private String getStatusCategory(int statusCode) {
if (statusCode < 400) return "success";
if (statusCode < 500) return "client_error";
return "server_error";
}
@Scheduled(fixedRate = 60000) // Every minute
public void recordSystemMetrics() {
recordJvmMetrics();
// Record custom application metrics
Map<String, String> dimensions = new HashMap<>();
dimensions.put("metric_type", "application");
metricsCollector.setGauge("application.uptime", 
getUptimeMinutes(), dimensions);
}
private double getUptimeMinutes() {
// This would be implemented based on your application's uptime tracking
return ManagementFactory.getRuntimeMXBean().getUptime() / 60000.0;
}
}
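The JVM figures recorded above come from Runtime and the platform RuntimeMXBean. A minimal standalone sketch of the same measurements (JvmSnapshotDemo is a hypothetical name for illustration):

```java
import java.lang.management.ManagementFactory;

public class JvmSnapshotDemo {
    // Heap currently in use: total allocated minus free within the allocation.
    public static long usedMemoryBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    // Process uptime in minutes, as reported by the runtime MXBean.
    public static double uptimeMinutes() {
        return ManagementFactory.getRuntimeMXBean().getUptime() / 60000.0;
    }
}
```

Runtime.totalMemory() is the memory currently reserved by the JVM, not the -Xmx ceiling; maxMemory() reports that ceiling, which is why the service records both.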

6. Spring Boot Integration

Integrate EMF with Spring Boot components.

Spring Boot Configuration:

@Configuration
@EnableScheduling
@EnableAspectJAutoProxy
public class EMFConfiguration {
@Bean
@ConfigurationProperties(prefix = "app.metrics")
public MetricsConfig metricsConfig() {
return new MetricsConfig();
}
// CloudWatchEMFService, MetricsCollector, ApplicationMetricsService, and
// EMFLogger are already registered through their @Service/@Component
// annotations; redefining them here with @Bean would create duplicate beans.
}
@Data
@ConfigurationProperties(prefix = "app.metrics")
public class MetricsConfig {
private String namespace = "ApplicationMetrics";
private String serviceName = "UnknownService";
private String environment = "Unknown";
private boolean enabled = true;
private int flushIntervalSeconds = 60;
private int batchSize = 100;
}
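MetricsConfig binds the app.metrics prefix via Spring's relaxed binding, so a matching application.yml might look like this (all values illustrative):

```yaml
app:
  metrics:
    namespace: OrderServiceMetrics
    service-name: order-service
    environment: production
    enabled: true
    flush-interval-seconds: 60
    batch-size: 100
```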
@Aspect
@Component
@Slf4j
public class MetricsAspect {
private final ApplicationMetricsService metricsService;
public MetricsAspect(ApplicationMetricsService metricsService) {
this.metricsService = metricsService;
}
@Around("@within(org.springframework.web.bind.annotation.RestController) || " +
"@annotation(org.springframework.web.bind.annotation.RequestMapping)")
public Object monitorRestController(ProceedingJoinPoint joinPoint) throws Throwable {
String method = extractHttpMethod(joinPoint);
String path = extractPath(joinPoint);
long startTime = System.currentTimeMillis();
Object result = null;
int statusCode = 200;
try {
result = joinPoint.proceed();
return result;
} catch (Exception e) {
statusCode = extractStatusCode(e);
throw e;
} finally {
long duration = System.currentTimeMillis() - startTime;
String userId = extractUserId();
metricsService.recordHttpRequest(method, path, statusCode, duration, userId);
}
}
@Around("execution(* org.springframework.data.repository.Repository+.*(..))")
public Object monitorRepository(ProceedingJoinPoint joinPoint) throws Throwable {
String operation = joinPoint.getSignature().getName();
String repository = joinPoint.getTarget().getClass().getSimpleName();
long startTime = System.currentTimeMillis();
boolean success = false;
int rowCount = 0;
try {
Object result = joinPoint.proceed();
success = true;
// Estimate row count for queries
if (result instanceof List) {
rowCount = ((List<?>) result).size();
} else if (result != null) {
rowCount = 1;
}
return result;
} finally {
long duration = System.currentTimeMillis() - startTime;
metricsService.recordDatabaseQuery(operation, repository, duration, success, rowCount);
}
}
private String extractHttpMethod(ProceedingJoinPoint joinPoint) {
Method method = ((MethodSignature) joinPoint.getSignature()).getMethod();
if (method.isAnnotationPresent(GetMapping.class)) return "GET";
if (method.isAnnotationPresent(PostMapping.class)) return "POST";
if (method.isAnnotationPresent(PutMapping.class)) return "PUT";
if (method.isAnnotationPresent(DeleteMapping.class)) return "DELETE";
if (method.isAnnotationPresent(PatchMapping.class)) return "PATCH";
return "UNKNOWN";
}
private String extractPath(ProceedingJoinPoint joinPoint) {
Method method = ((MethodSignature) joinPoint.getSignature()).getMethod();
// Extract from method annotations
if (method.isAnnotationPresent(RequestMapping.class)) {
RequestMapping mapping = method.getAnnotation(RequestMapping.class);
if (mapping.value().length > 0) {
return mapping.value()[0];
}
}
if (method.isAnnotationPresent(GetMapping.class)) {
GetMapping mapping = method.getAnnotation(GetMapping.class);
if (mapping.value().length > 0) {
return mapping.value()[0];
}
}
// Similar for other mapping annotations...
return "unknown";
}
private int extractStatusCode(Exception e) {
if (e instanceof HttpStatusCodeException) {
return ((HttpStatusCodeException) e).getStatusCode().value();
}
return 500;
}
private String extractUserId() {
// Extract from Spring Security context
try {
Authentication authentication = SecurityContextHolder.getContext().getAuthentication();
if (authentication != null && authentication.isAuthenticated()) {
return authentication.getName();
}
} catch (Exception e) {
// Ignore - security context not available
}
return "anonymous";
}
}
@Component
public class MetricsWebConfig implements WebMvcConfigurer {
private final MetricsInterceptor metricsInterceptor;
public MetricsWebConfig(MetricsInterceptor metricsInterceptor) {
this.metricsInterceptor = metricsInterceptor;
}
@Override
public void addInterceptors(InterceptorRegistry registry) {
registry.addInterceptor(metricsInterceptor);
}
}
@Component
@Slf4j
public class MetricsInterceptor implements HandlerInterceptor {
private final ApplicationMetricsService metricsService;
public MetricsInterceptor(ApplicationMetricsService metricsService) {
// Injected rather than instantiated with new, so afterCompletion can record metrics
this.metricsService = metricsService;
}
@Override
public boolean preHandle(HttpServletRequest request, HttpServletResponse response, 
Object handler) throws Exception {
request.setAttribute("startTime", System.currentTimeMillis());
return true;
}
@Override
public void afterCompletion(HttpServletRequest request, HttpServletResponse response,
Object handler, Exception ex) throws Exception {
Long startTime = (Long) request.getAttribute("startTime");
if (startTime != null && metricsService != null) {
long duration = System.currentTimeMillis() - startTime;
String method = request.getMethod();
String path = request.getRequestURI();
int status = response.getStatus();
String userId = getUserId(request);
metricsService.recordHttpRequest(method, path, status, duration, userId);
}
}
private String getUserId(HttpServletRequest request) {
Principal principal = request.getUserPrincipal();
return principal != null ? principal.getName() : "anonymous";
}
}

7. Logback Configuration for EMF

Configure Logback to properly format EMF logs.

logback-spring.xml:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- Appender that emits pre-formatted EMF JSON lines.
         Note: Logback appenders must be declared at the configuration level
         and referenced from loggers, not nested inside a <logger> element. -->
    <appender name="EMF_JSON" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
            <layout class="com.example.logging.EMFLayout"/>
        </encoder>
    </appender>

    <!-- EMF-specific logger; additivity="false" keeps EMF lines out of the regular log -->
    <logger name="CLOUDWATCH_EMF" level="INFO" additivity="false">
        <appender-ref ref="EMF_JSON"/>
    </logger>

    <!-- Regular application logging -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <root level="INFO">
        <appender-ref ref="STDOUT"/>
    </root>
</configuration>

Custom EMF Layout:

public class EMFLayout extends LayoutBase<ILoggingEvent> {
    @Override
    public String doLayout(ILoggingEvent event) {
        // EMF messages arrive as already-serialized JSON documents;
        // emit them unchanged, one document per line
        return event.getFormattedMessage() + "\n";
    }
}
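The layout above works because each message is expected to be a complete EMF document: a JSON object whose `_aws` block tells CloudWatch Logs which keys are metrics and which are dimensions, while any remaining keys pass through as high-cardinality context properties. The stdlib-only sketch below hand-assembles such an envelope purely to show the shape; the `MyApp` namespace, `Service` dimension, and `Latency` metric are illustrative values, not names used by this article's services:

```java
public class EmfPayloadExample {
    // Builds a minimal EMF document: the "_aws" metadata block names the
    // namespace, dimension sets, and metrics; the top-level keys carry the
    // actual dimension and metric values.
    public static String buildPayload(long timestampMillis, String service, double latencyMs) {
        return "{"
            + "\"_aws\":{"
            +   "\"Timestamp\":" + timestampMillis + ","
            +   "\"CloudWatchMetrics\":[{"
            +     "\"Namespace\":\"MyApp\","
            +     "\"Dimensions\":[[\"Service\"]],"
            +     "\"Metrics\":[{\"Name\":\"Latency\",\"Unit\":\"Milliseconds\"}]"
            +   "}]"
            + "},"
            + "\"Service\":\"" + service + "\","   // dimension value
            + "\"Latency\":" + latencyMs           // metric value
            + "}";
    }

    public static void main(String[] args) {
        // One JSON document per line, as the EMFLayout emits it
        System.out.println(buildPayload(System.currentTimeMillis(), "checkout", 87.0));
    }
}
```

In the real services above this serialization is done with Jackson rather than string concatenation; the point here is only the required envelope structure.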

8. REST API for Metrics Management

Expose REST endpoints for recording, flushing, and inspecting metrics.

Metrics Controller:

@RestController
@RequestMapping("/api/metrics")
@Slf4j
public class MetricsController {
    private final CloudWatchEMFService emfService;
    private final MetricsCollector metricsCollector;
    private final ApplicationMetricsService appMetricsService;

    public MetricsController(CloudWatchEMFService emfService,
                             MetricsCollector metricsCollector,
                             ApplicationMetricsService appMetricsService) {
        this.emfService = emfService;
        this.metricsCollector = metricsCollector;
        this.appMetricsService = appMetricsService;
    }

    @PostMapping("/custom")
    public ResponseEntity<String> recordCustomMetric(@RequestBody CustomMetricRequest request) {
        try {
            Map<String, Object> properties = new HashMap<>();
            if (request.getProperties() != null) {
                properties.putAll(request.getProperties());
            }
            EmbeddedMetric metric = new EmbeddedMetric(
                request.getName(),
                request.getValue(),
                EmbeddedMetric.Unit.valueOf(request.getUnit()),
                request.getDimensions()
            );
            emfService.recordMetric(metric, properties);
            return ResponseEntity.ok("Custom metric recorded successfully");
        } catch (Exception e) {
            log.error("Failed to record custom metric", e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                .body("Failed to record metric: " + e.getMessage());
        }
    }

    @PostMapping("/flush")
    public ResponseEntity<String> flushMetrics() {
        try {
            metricsCollector.flush();
            return ResponseEntity.ok("Metrics flushed successfully");
        } catch (Exception e) {
            log.error("Failed to flush metrics", e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                .body("Failed to flush metrics: " + e.getMessage());
        }
    }

    @GetMapping("/status")
    public ResponseEntity<Map<String, Object>> getMetricsStatus() {
        Map<String, Object> status = new HashMap<>();
        status.put("enabled", emfService.isEnabled());
        status.put("timestamp", Instant.now().toString());
        return ResponseEntity.ok(status);
    }

    @PostMapping("/business-event")
    public ResponseEntity<String> recordBusinessEvent(@RequestBody BusinessEventRequest request) {
        try {
            appMetricsService.recordBusinessEvent(
                request.getEventType(),
                request.getUserId(),
                request.getEventData()
            );
            return ResponseEntity.ok("Business event recorded successfully");
        } catch (Exception e) {
            log.error("Failed to record business event", e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                .body("Failed to record business event: " + e.getMessage());
        }
    }

    @PostMapping("/test/metrics")
    public ResponseEntity<String> testMetrics() {
        try {
            // Record a fixed sample metric for testing
            Map<String, String> dimensions = Map.of(
                "test", "true",
                "environment", "test"
            );
            Map<String, Object> properties = Map.of(
                "test_timestamp", Instant.now().toString(),
                "test_id", UUID.randomUUID().toString()
            );
            EmbeddedMetric metric = new EmbeddedMetric(
                "test.metric",
                42.0,
                EmbeddedMetric.Unit.COUNT,
                dimensions
            );
            emfService.recordMetric(metric, properties);
            return ResponseEntity.ok("Test metrics recorded successfully");
        } catch (Exception e) {
            log.error("Failed to record test metrics", e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                .body("Failed to record test metrics: " + e.getMessage());
        }
    }
}
@Data
class CustomMetricRequest {
    private String name;
    private double value;
    private String unit;
    private Map<String, String> dimensions;
    private Map<String, Object> properties;
}

@Data
class BusinessEventRequest {
    private String eventType;
    private String userId;
    private Map<String, Object> eventData;
}

Best Practices for CloudWatch EMF in Java

  1. Dimension Design: Keep dimensions low-cardinality for cost efficiency
  2. Metric Namespace: Use meaningful namespaces for different applications
  3. Batching: Aggregate metrics before sending to reduce costs
  4. Error Handling: Implement robust error handling for metric submission failures
  5. Sampling: Consider sampling for high-volume metrics
  6. Security: Ensure proper IAM roles for CloudWatch access
  7. Testing: Test EMF formatting in non-production environments first
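The sampling practice (item 5) can be sketched as a small helper that emits only a fraction of high-volume metrics while counting what it drops, so totals remain reconstructable. The `MetricSampler` class and its API are hypothetical, shown only to make the idea concrete:

```java
import java.util.concurrent.ThreadLocalRandom;
import java.util.concurrent.atomic.AtomicLong;

public class MetricSampler {
    private final double rate;                        // fraction of events to emit, e.g. 0.1
    private final AtomicLong skipped = new AtomicLong();

    public MetricSampler(double rate) {
        this.rate = rate;
    }

    // Returns true for roughly `rate` of calls; callers emit the metric
    // only when this is true
    public boolean shouldEmit() {
        if (ThreadLocalRandom.current().nextDouble() < rate) {
            return true;
        }
        skipped.incrementAndGet();
        return false;
    }

    // Exact count of suppressed events, which can itself be emitted
    // periodically as a low-volume metric
    public long skippedCount() {
        return skipped.get();
    }
}
```

A caller would wrap each hot-path `recordMetric` in `if (sampler.shouldEmit()) { ... }` and periodically publish `skippedCount()` so dashboards can scale sampled values back up.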

Conclusion: Cost-Effective Cloud Monitoring

CloudWatch Embedded Metrics Format provides a powerful, cost-effective way to implement comprehensive monitoring in Java applications. By leveraging structured logging and automatic metric extraction, developers can achieve detailed observability without the complexity and cost of custom metric APIs.

This implementation demonstrates that EMF isn't just about logging—it's about creating a cohesive monitoring strategy that connects application behavior to business outcomes through rich, contextual metrics that drive operational excellence and cost optimization.
