Mimir for Long-Term Monitoring in Java: A Complete Guide to Time-Series Data Management

Grafana Mimir is a horizontally scalable, long-term time-series database for monitoring and observability, built on the foundations of Cortex. This guide covers Mimir integration, data management, and long-term retention strategies in Java applications.


Understanding Mimir for Long-Term Storage

What is Mimir?

  • Open-source time-series database focused on long-term data retention
  • Prometheus-compatible metrics storage and querying
  • Horizontal scalability with high availability
  • Efficient compression for long-term data storage

Key Features for Long-Term Storage:

  • Downsampling: Reduce data resolution over time
  • Retention Policies: Automatic data lifecycle management
  • Compression: Efficient storage for historical data
  • Federation: Aggregate data from multiple sources
  • Multi-tenancy: Isolated data for different teams/applications
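The value of downsampling and tiered retention comes down to simple arithmetic: the number of samples stored per series is the retention window divided by the resolution. A minimal sketch (illustrative numbers only, not Mimir internals; the class and method names are hypothetical):

```java
// Rough sketch: estimate stored samples per series for a retention tier.
public class RetentionMath {

    // durationMinutes / resolutionMinutes = samples kept per series
    public static long samplesPerSeries(long durationMinutes, long resolutionMinutes) {
        return durationMinutes / resolutionMinutes;
    }
}
```

Seven days at 1-minute resolution keeps 10,080 points per series, while five years at 1-day resolution keeps only 1,825 — which is why old data is downsampled rather than kept at full resolution.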

Dependencies and Setup

Maven Dependencies
<properties>
  <micrometer.version>1.11.5</micrometer.version>
  <prometheus.version>0.16.0</prometheus.version>
  <okhttp.version>4.11.0</okhttp.version>
  <jackson.version>2.15.2</jackson.version>
  <spring-boot.version>3.1.0</spring-boot.version>
</properties>
<dependencies>
  <!-- Micrometer for metrics -->
  <dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-core</artifactId>
    <version>${micrometer.version}</version>
  </dependency>
  <dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-registry-prometheus</artifactId>
    <version>${micrometer.version}</version>
  </dependency>
  <!-- Prometheus Java Client -->
  <dependency>
    <groupId>io.prometheus</groupId>
    <artifactId>simpleclient</artifactId>
    <version>${prometheus.version}</version>
  </dependency>
  <dependency>
    <groupId>io.prometheus</groupId>
    <artifactId>simpleclient_httpserver</artifactId>
    <version>${prometheus.version}</version>
  </dependency>
  <!-- HTTP Client -->
  <dependency>
    <groupId>com.squareup.okhttp3</groupId>
    <artifactId>okhttp</artifactId>
    <version>${okhttp.version}</version>
  </dependency>
  <!-- JSON Processing -->
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>${jackson.version}</version>
  </dependency>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>${jackson.version}</version>
  </dependency>
  <!-- Spring Boot -->
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <version>${spring-boot.version}</version>
  </dependency>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
    <version>${spring-boot.version}</version>
  </dependency>
</dependencies>
Mimir Configuration
# application.yml
mimir:
  url: http://localhost:9009
  read-url: ${mimir.url}/prometheus
  write-url: ${mimir.url}/api/v1/push
  tenant-id: ${HOSTNAME:default-tenant}
  timeout: 30000
  batch-size: 1000
  retention:
    enabled: true
    policies:
      - name: "high-resolution"
        duration: "7d"
        resolution: "1m"
      - name: "medium-resolution"
        duration: "30d"
        resolution: "5m"
      - name: "low-resolution"
        duration: "1y"
        resolution: "1h"
      - name: "archive"
        duration: "5y"
        resolution: "1d"

management:
  endpoints:
    web:
      exposure:
        include: health,info,metrics,prometheus
  endpoint:
    metrics:
      enabled: true
    prometheus:
      enabled: true

Core Mimir Client Implementation

1. Mimir HTTP Client
import okhttp3.*;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.io.IOException;
import java.util.concurrent.TimeUnit;
@Component
public class MimirClient {
private final OkHttpClient httpClient;
private final ObjectMapper objectMapper;
private final String writeUrl;
private final String readUrl;
private final String tenantId;
private static final MediaType JSON_MEDIA_TYPE = MediaType.parse("application/json");
public MimirClient(@Value("${mimir.write-url}") String writeUrl,
@Value("${mimir.read-url}") String readUrl,
@Value("${mimir.tenant-id}") String tenantId,
@Value("${mimir.timeout:30000}") long timeout) {
this.writeUrl = writeUrl;
this.readUrl = readUrl;
this.tenantId = tenantId;
this.objectMapper = new ObjectMapper();
this.httpClient = new OkHttpClient.Builder()
.connectTimeout(timeout, TimeUnit.MILLISECONDS)
.readTimeout(timeout, TimeUnit.MILLISECONDS)
.writeTimeout(timeout, TimeUnit.MILLISECONDS)
.build();
}
public void pushMetrics(MetricWriteRequest request) throws IOException {
// Note: Mimir's native /api/v1/push endpoint speaks snappy-compressed Prometheus
// remote-write protobuf. Sending JSON as shown here assumes a translating gateway
// or JSON-accepting proxy in front of Mimir; for direct ingestion, use a
// Prometheus remote-write client library instead.
String jsonPayload = objectMapper.writeValueAsString(request);
RequestBody body = RequestBody.create(jsonPayload, JSON_MEDIA_TYPE);
Request httpRequest = new Request.Builder()
.url(writeUrl)
.post(body)
.addHeader("Content-Type", "application/json")
.addHeader("X-Scope-OrgID", tenantId)
.build();
try (Response response = httpClient.newCall(httpRequest).execute()) {
if (!response.isSuccessful()) {
throw new IOException("Failed to push metrics: " + response.code() + " - " + response.message());
}
}
}
public QueryResponse queryMetrics(String query) throws IOException {
HttpUrl url = HttpUrl.parse(readUrl + "/api/v1/query")
.newBuilder()
.addQueryParameter("query", query)
.build();
Request request = new Request.Builder()
.url(url)
.addHeader("X-Scope-OrgID", tenantId)
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful()) {
throw new IOException("Query failed: " + response.code() + " - " + response.message());
}
ResponseBody body = response.body();
if (body == null) {
throw new IOException("Empty response body");
}
return objectMapper.readValue(body.string(), QueryResponse.class);
}
}
public QueryResponse queryRangeMetrics(String query, long start, long end, String step) throws IOException {
HttpUrl url = HttpUrl.parse(readUrl + "/api/v1/query_range")
.newBuilder()
.addQueryParameter("query", query)
.addQueryParameter("start", String.valueOf(start))
.addQueryParameter("end", String.valueOf(end))
.addQueryParameter("step", step)
.build();
Request request = new Request.Builder()
.url(url)
.addHeader("X-Scope-OrgID", tenantId)
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful()) {
throw new IOException("Range query failed: " + response.code() + " - " + response.message());
}
ResponseBody body = response.body();
if (body == null) {
throw new IOException("Empty response body");
}
return objectMapper.readValue(body.string(), QueryResponse.class);
}
}
public SeriesResponse querySeries(String match, long start, long end) throws IOException {
HttpUrl url = HttpUrl.parse(readUrl + "/api/v1/series")
.newBuilder()
.addQueryParameter("match[]", match)
.addQueryParameter("start", String.valueOf(start))
.addQueryParameter("end", String.valueOf(end))
.build();
Request request = new Request.Builder()
.url(url)
.addHeader("X-Scope-OrgID", tenantId)
.build();
try (Response response = httpClient.newCall(request).execute()) {
if (!response.isSuccessful()) {
throw new IOException("Series query failed: " + response.code() + " - " + response.message());
}
ResponseBody body = response.body();
if (body == null) {
throw new IOException("Empty response body");
}
return objectMapper.readValue(body.string(), SeriesResponse.class);
}
}
}
2. Data Models for Mimir
import com.fasterxml.jackson.annotation.JsonProperty;
import java.util.List;
import java.util.Map;
public class MetricWriteRequest {
@JsonProperty("streams")
private List<Stream> streams;
public MetricWriteRequest() {}
public MetricWriteRequest(List<Stream> streams) {
this.streams = streams;
}
// Getters and setters
public List<Stream> getStreams() { return streams; }
public void setStreams(List<Stream> streams) { this.streams = streams; }
}
public class Stream {
@JsonProperty("stream")
private Map<String, String> labels;
@JsonProperty("values")
private List<List<String>> values; // [[timestamp, value], ...]
public Stream() {}
public Stream(Map<String, String> labels, List<List<String>> values) {
this.labels = labels;
this.values = values;
}
// Getters and setters
public Map<String, String> getLabels() { return labels; }
public void setLabels(Map<String, String> labels) { this.labels = labels; }
public List<List<String>> getValues() { return values; }
public void setValues(List<List<String>> values) { this.values = values; }
}
public class QueryResponse {
private String status;
private QueryData data;
// Getters and setters
public String getStatus() { return status; }
public void setStatus(String status) { this.status = status; }
public QueryData getData() { return data; }
public void setData(QueryData data) { this.data = data; }
}
public class QueryData {
private String resultType;
private List<QueryResult> result;
// Getters and setters
public String getResultType() { return resultType; }
public void setResultType(String resultType) { this.resultType = resultType; }
public List<QueryResult> getResult() { return result; }
public void setResult(List<QueryResult> result) { this.result = result; }
}
public class QueryResult {
private Map<String, String> metric;
private List<List<Object>> values; // [[timestamp, value], ...]
// Getters and setters
public Map<String, String> getMetric() { return metric; }
public void setMetric(Map<String, String> metric) { this.metric = metric; }
public List<List<Object>> getValues() { return values; }
public void setValues(List<List<Object>> values) { this.values = values; }
}
public class SeriesResponse {
private String status;
private List<Map<String, String>> data;
// Getters and setters
public String getStatus() { return status; }
public void setStatus(String status) { this.status = status; }
public List<Map<String, String>> getData() { return data; }
public void setData(List<Map<String, String>> data) { this.data = data; }
}
public class MetricSample {
private long timestamp; // Unix timestamp in milliseconds
private double value;
private Map<String, String> labels;
public MetricSample() {}
public MetricSample(long timestamp, double value, Map<String, String> labels) {
this.timestamp = timestamp;
this.value = value;
this.labels = labels;
}
// Getters and setters
public long getTimestamp() { return timestamp; }
public void setTimestamp(long timestamp) { this.timestamp = timestamp; }
public double getValue() { return value; }
public void setValue(double value) { this.value = value; }
public Map<String, String> getLabels() { return labels; }
public void setLabels(Map<String, String> labels) { this.labels = labels; }
public List<String> toValueList() {
return List.of(String.valueOf(timestamp), String.valueOf(value));
}
}
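Note the unit mismatch lurking in these models: MetricSample carries Unix timestamps in milliseconds, while the Prometheus-compatible query API (used later in queryRangeMetrics) takes start/end in seconds. A tiny helper keeps the conversion in one place and avoids an off-by-1000 bug (the class and method names here are hypothetical, not part of any Mimir API):

```java
import java.time.Instant;

// Sketch: convert the write path's millisecond timestamps to the
// seconds expected by the Prometheus-compatible query endpoints.
public class TimestampUnits {

    public static long millisToQuerySeconds(long epochMillis) {
        return epochMillis / 1000;
    }

    public static long nowMillis() {
        return Instant.now().toEpochMilli();
    }
}
```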

Long-Term Data Management

1. Retention Policy Manager
import org.springframework.stereotype.Component;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;
@Component
public class RetentionPolicyManager {
private final MimirClient mimirClient;
private final Map<String, RetentionPolicy> policies;
private final Map<String, DownsamplingConfig> downsamplingConfigs;
public RetentionPolicyManager(MimirClient mimirClient) {
this.mimirClient = mimirClient;
this.policies = new ConcurrentHashMap<>();
this.downsamplingConfigs = new ConcurrentHashMap<>();
initializeDefaultPolicies();
}
private void initializeDefaultPolicies() {
// High resolution - 1 minute granularity for 7 days
policies.put("high-resolution", new RetentionPolicy("high-resolution", 
"7d", "1m", 10080)); // 7 days * 24 hours * 60 minutes
// Medium resolution - 5 minute granularity for 30 days
policies.put("medium-resolution", new RetentionPolicy("medium-resolution", 
"30d", "5m", 8640)); // 30 days * 24 hours * 12 (5-min intervals)
// Low resolution - 1 hour granularity for 1 year
policies.put("low-resolution", new RetentionPolicy("low-resolution", 
"365d", "1h", 8760)); // 365 days * 24 hours
// Archive - 1 day granularity for 5 years
policies.put("archive", new RetentionPolicy("archive", 
"1825d", "1d", 1825)); // 5 years * 365 days
}
public void applyRetentionPolicy(String metricName, String policyName) {
RetentionPolicy policy = policies.get(policyName);
if (policy == null) {
throw new IllegalArgumentException("Unknown retention policy: " + policyName);
}
// In a real implementation, this would:
// 1. Query existing data for the metric
// 2. Apply downsampling if needed
// 3. Update retention settings in Mimir
// 4. Schedule periodic cleanup
System.out.printf("Applied retention policy '%s' to metric '%s'%n", policyName, metricName);
}
public void downsampleMetric(String metricName, String sourcePolicy, String targetPolicy) {
RetentionPolicy source = policies.get(sourcePolicy);
RetentionPolicy target = policies.get(targetPolicy);
if (source == null || target == null) {
throw new IllegalArgumentException("Invalid policy names");
}
long cutoffTime = Instant.now()
.minus(source.getDurationInDays(), ChronoUnit.DAYS)
.toEpochMilli() / 1000; // Convert to seconds
String downsamplingQuery = String.format(
"avg_over_time(%s[%s])", metricName, target.getResolution());
try {
// Query the downsampled data
QueryResponse response = mimirClient.queryRangeMetrics(
downsamplingQuery, 
cutoffTime - 86400, // Start 1 day before cutoff for overlap
cutoffTime, 
target.getResolution()
);
// Write downsampled data with new labels indicating resolution
writeDownsampledData(metricName, response, target);
// Delete original high-resolution data beyond retention
deleteExpiredData(metricName, cutoffTime);
} catch (Exception e) {
System.err.println("Failed to downsample metric " + metricName + ": " + e.getMessage());
}
}
private void writeDownsampledData(String metricName, QueryResponse response, RetentionPolicy policy) {
if (response.getData() == null || response.getData().getResult() == null) {
return;
}
List<Stream> streams = new ArrayList<>();
for (QueryResult result : response.getData().getResult()) {
Map<String, String> labels = new HashMap<>(result.getMetric());
labels.put("resolution", policy.getResolution());
labels.put("retention_policy", policy.getName());
List<List<String>> values = new ArrayList<>();
if (result.getValues() != null) {
for (List<Object> valuePair : result.getValues()) {
if (valuePair.size() >= 2) {
String timestamp = String.valueOf(valuePair.get(0));
String value = String.valueOf(valuePair.get(1));
values.add(List.of(timestamp, value));
}
}
}
streams.add(new Stream(labels, values));
}
if (!streams.isEmpty()) {
try {
mimirClient.pushMetrics(new MetricWriteRequest(streams));
System.out.printf("Written downsampled data for %s with policy %s%n", 
metricName, policy.getName());
} catch (Exception e) {
System.err.println("Failed to write downsampled data: " + e.getMessage());
}
}
}
private void deleteExpiredData(String metricName, long cutoffTime) {
// In Mimir, data deletion is typically handled automatically based on retention periods
// This method would be used for manual cleanup if needed
System.out.printf("Expired data cleanup for %s before %d%n", metricName, cutoffTime);
}
public Map<String, RetentionPolicy> getPolicies() {
return Collections.unmodifiableMap(policies);
}
public void addCustomPolicy(String name, String duration, String resolution) {
policies.put(name, new RetentionPolicy(name, duration, resolution, 0));
}
}
public class RetentionPolicy {
private final String name;
private final String duration;
private final String resolution;
private final int expectedDataPoints;
public RetentionPolicy(String name, String duration, String resolution, int expectedDataPoints) {
this.name = name;
this.duration = duration;
this.resolution = resolution;
this.expectedDataPoints = expectedDataPoints;
}
// Getters
public String getName() { return name; }
public String getDuration() { return duration; }
public String getResolution() { return resolution; }
public int getExpectedDataPoints() { return expectedDataPoints; }
public long getDurationInDays() {
if (duration.endsWith("d")) {
return Long.parseLong(duration.substring(0, duration.length() - 1));
} else if (duration.endsWith("y")) {
return Long.parseLong(duration.substring(0, duration.length() - 1)) * 365;
}
return 7; // Default 7 days
}
}
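The getDurationInDays method above only understands "d" and "y" suffixes and silently falls back to 7 days for anything else. A slightly more defensive parser might reject malformed input instead; a sketch (hypothetical helper, not part of the guide's API, and it deliberately ignores leap years just as the original does):

```java
// Sketch of a stricter duration parser: supports h/d/y suffixes and
// throws on malformed input rather than silently defaulting.
public class DurationParser {

    public static long toDays(String duration) {
        if (duration == null || duration.length() < 2) {
            throw new IllegalArgumentException("Bad duration: " + duration);
        }
        char unit = duration.charAt(duration.length() - 1);
        long n = Long.parseLong(duration.substring(0, duration.length() - 1));
        return switch (unit) {
            case 'h' -> n / 24;   // truncates partial days
            case 'd' -> n;
            case 'y' -> n * 365;  // ignores leap years, as the guide does
            default -> throw new IllegalArgumentException("Unknown unit: " + unit);
        };
    }
}
```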
public class DownsamplingConfig {
private String sourceMetric;
private String targetMetric;
private String aggregation;
private String interval;
private Map<String, String> labels;
// Getters and setters
public String getSourceMetric() { return sourceMetric; }
public void setSourceMetric(String sourceMetric) { this.sourceMetric = sourceMetric; }
public String getTargetMetric() { return targetMetric; }
public void setTargetMetric(String targetMetric) { this.targetMetric = targetMetric; }
public String getAggregation() { return aggregation; }
public void setAggregation(String aggregation) { this.aggregation = aggregation; }
public String getInterval() { return interval; }
public void setInterval(String interval) { this.interval = interval; }
public Map<String, String> getLabels() { return labels; }
public void setLabels(Map<String, String> labels) { this.labels = labels; }
}
2. Batch Metrics Processor
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.util.*;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
@Component
public class BatchMetricsProcessor {
private final MimirClient mimirClient;
private final BlockingQueue<MetricSample> metricsQueue;
private final int batchSize;
private final long flushIntervalMs;
private final ScheduledExecutorService scheduler;
private final List<MetricSample> currentBatch; // guarded by synchronized(this)
public BatchMetricsProcessor(MimirClient mimirClient,
@Value("${mimir.batch-size:1000}") int batchSize,
@Value("${mimir.flush-interval:5000}") long flushIntervalMs) {
this.mimirClient = mimirClient;
this.batchSize = batchSize;
this.flushIntervalMs = flushIntervalMs;
this.metricsQueue = new LinkedBlockingQueue<>();
this.currentBatch = new ArrayList<>(batchSize);
this.scheduler = Executors.newSingleThreadScheduledExecutor();
startProcessing();
}
private void startProcessing() {
// Start batch processor thread
Thread processorThread = new Thread(this::processMetrics, "metrics-processor");
processorThread.setDaemon(true);
processorThread.start();
// Schedule periodic flushes
scheduler.scheduleAtFixedRate(this::flushBatch, 
flushIntervalMs, flushIntervalMs, TimeUnit.MILLISECONDS);
}
public void submitMetric(MetricSample metric) {
try {
metricsQueue.put(metric);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
System.err.println("Failed to submit metric: " + e.getMessage());
}
}
public void submitMetrics(List<MetricSample> metrics) {
metrics.forEach(this::submitMetric);
}
private void processMetrics() {
while (!Thread.currentThread().isInterrupted()) {
try {
MetricSample metric = metricsQueue.poll(100, TimeUnit.MILLISECONDS);
if (metric != null) {
// Guard the batch: flushBatch() also runs on the scheduler thread
boolean batchFull;
synchronized (this) {
currentBatch.add(metric);
batchFull = currentBatch.size() >= batchSize;
}
if (batchFull) {
flushBatch();
}
}
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
break;
}
}
// Final flush on shutdown
flushBatch();
}
private synchronized void flushBatch() {
if (currentBatch.isEmpty()) {
return;
}
List<MetricSample> batchToSend = new ArrayList<>(currentBatch);
currentBatch.clear();
try {
sendBatchToMimir(batchToSend);
} catch (Exception e) {
System.err.println("Failed to send batch to Mimir: " + e.getMessage());
// Optionally implement retry logic or dead letter queue
}
}
private void sendBatchToMimir(List<MetricSample> batch) {
// Group metrics by their labels for efficient batching
Map<String, List<MetricSample>> groupedMetrics = new HashMap<>();
for (MetricSample sample : batch) {
String labelKey = generateLabelKey(sample.getLabels());
groupedMetrics.computeIfAbsent(labelKey, k -> new ArrayList<>()).add(sample);
}
List<Stream> streams = new ArrayList<>();
for (Map.Entry<String, List<MetricSample>> entry : groupedMetrics.entrySet()) {
List<MetricSample> samples = entry.getValue();
// Sort by timestamp for better compression
samples.sort(Comparator.comparingLong(MetricSample::getTimestamp));
List<List<String>> values = samples.stream()
.map(MetricSample::toValueList)
.toList();
// Use the labels from the first sample (they should be the same within group)
Map<String, String> labels = samples.get(0).getLabels();
streams.add(new Stream(labels, values));
}
if (!streams.isEmpty()) {
try {
mimirClient.pushMetrics(new MetricWriteRequest(streams));
System.out.printf("Successfully sent %d metrics in %d streams%n", 
batch.size(), streams.size());
} catch (Exception e) {
throw new RuntimeException("Failed to push metrics to Mimir", e);
}
}
}
private String generateLabelKey(Map<String, String> labels) {
// Create a consistent key for grouping metrics with same labels
List<String> sortedKeys = new ArrayList<>(labels.keySet());
Collections.sort(sortedKeys);
StringBuilder keyBuilder = new StringBuilder();
for (String key : sortedKeys) {
keyBuilder.append(key).append("=").append(labels.get(key)).append(";");
}
return keyBuilder.toString();
}
public void shutdown() {
scheduler.shutdown();
try {
if (!scheduler.awaitTermination(10, TimeUnit.SECONDS)) {
scheduler.shutdownNow();
}
} catch (InterruptedException e) {
scheduler.shutdownNow();
Thread.currentThread().interrupt();
}
// Final flush
flushBatch();
}
public int getQueueSize() {
return metricsQueue.size();
}
public int getCurrentBatchSize() {
return currentBatch.size();
}
}
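The generateLabelKey method sorts label names before concatenating them so that label maps with identical entries group into the same stream regardless of insertion order. A minimal stand-alone version of that idea (hypothetical class name):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;

// Sketch: build an order-independent grouping key from a label map,
// mirroring generateLabelKey in BatchMetricsProcessor.
public class LabelKeys {

    public static String labelKey(Map<String, String> labels) {
        List<String> keys = new ArrayList<>(labels.keySet());
        Collections.sort(keys); // sorting makes the key insertion-order independent
        StringBuilder sb = new StringBuilder();
        for (String k : keys) {
            sb.append(k).append('=').append(labels.get(k)).append(';');
        }
        return sb.toString();
    }
}
```

Without the sort, {job=x, instance=y} and {instance=y, job=x} would land in different streams and defeat the batching.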

Metrics Collection and Export

1. Micrometer Integration
import io.micrometer.core.instrument.*;
import io.micrometer.core.instrument.config.MeterFilter;
import io.micrometer.prometheus.PrometheusConfig;
import io.micrometer.prometheus.PrometheusMeterRegistry;
import org.springframework.stereotype.Component;
import java.util.concurrent.ConcurrentHashMap;
import java.util.Map;
import java.util.function.ToDoubleFunction;
@Component
public class MimirMetricsRegistry {
private final PrometheusMeterRegistry registry;
private final BatchMetricsProcessor batchProcessor;
private final Map<String, Meter> registeredMeters;
public MimirMetricsRegistry(BatchMetricsProcessor batchProcessor) {
this.batchProcessor = batchProcessor;
this.registeredMeters = new ConcurrentHashMap<>();
this.registry = new PrometheusMeterRegistry(PrometheusConfig.DEFAULT);
// Configure common tags
registry.config().commonTags("application", "mimir-java-client", "environment", "production");
// Add meter filters for long-term retention
configureMeterFilters();
// Register JVM metrics
registerJvmMetrics();
// Register custom metrics
registerCustomMetrics();
}
private void configureMeterFilters() {
// Cap tag cardinality on high-cardinality HTTP metrics, and drop per-GC
// meters that are too noisy for long-term retention
registry.config().meterFilter(
MeterFilter.maximumAllowableTags("http.server.requests", "uri", 100, MeterFilter.deny())
);
registry.config().meterFilter(
MeterFilter.deny(id -> id.getName().startsWith("jvm.gc"))
);
}
private void registerJvmMetrics() {
// Used memory = total - free; Runtime::totalMemory alone reports the
// committed heap, not actual usage
registerGauge("jvm.memory.used", "Used memory in bytes", 
Tags.of("area", "heap"), Runtime.getRuntime(), r -> r.totalMemory() - r.freeMemory());
registerGauge("jvm.memory.free", "Free memory in bytes", 
Tags.empty(), Runtime.getRuntime(), Runtime::freeMemory);
// Thread count
Gauge.builder("jvm.threads.live", Thread::activeCount)
.description("Live thread count")
.register(registry);
}
private void registerCustomMetrics() {
// Application-specific metrics
Counter.builder("application.events.processed")
.description("Total number of processed events")
.tag("type", "business")
.register(registry);
Gauge.builder("application.queue.size", batchProcessor, BatchMetricsProcessor::getQueueSize)
.description("Current queue size")
.tag("type", "processing")
.register(registry);
Timer.builder("application.request.duration")
.description("Request processing duration")
.publishPercentiles(0.5, 0.95, 0.99) // For long-term analysis
.register(registry);
}
public <T> void registerGauge(String name, String description, Tags tags, T obj, ToDoubleFunction<T> f) {
Gauge.builder(name, obj, f)
.description(description)
.tags(tags)
.register(registry);
}
private final Map<String, java.util.concurrent.atomic.AtomicReference<Double>> gaugeValues = new ConcurrentHashMap<>();
public void recordCustomMetric(String name, double value, Map<String, String> labels) {
// Micrometer gauges sample a referenced object, so keep a mutable holder
// per label set and register the gauge against it exactly once
String metricKey = name + "|" + labels.hashCode();
java.util.concurrent.atomic.AtomicReference<Double> holder =
gaugeValues.computeIfAbsent(metricKey, k -> {
java.util.concurrent.atomic.AtomicReference<Double> ref = new java.util.concurrent.atomic.AtomicReference<>(value);
Gauge.builder(name, ref, r -> r.get())
.description("Custom application metric")
.tags(convertToTags(labels))
.register(registry);
return ref;
});
holder.set(value);
}
private Iterable<Tag> convertToTags(Map<String, String> labels) {
return labels.entrySet().stream()
.map(entry -> Tag.of(entry.getKey(), entry.getValue()))
.toList();
}
public void exportToMimir() {
// Scrape metrics from registry and send to Mimir
String scrapeData = registry.scrape();
// Parse scrape data and convert to Mimir format
// This is a simplified version - in practice, you'd parse the Prometheus format
System.out.println("Exporting metrics to Mimir...");
// Implementation would parse the scrape data and send via BatchMetricsProcessor
}
public PrometheusMeterRegistry getRegistry() {
return registry;
}
}
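The exportToMimir method above leaves the scrape parsing as an exercise. A minimal sketch of that step, pulling the metric name, labels, and value out of one line of Prometheus text exposition format (hypothetical class; real scrapes also carry timestamps, escaped quotes, HELP/TYPE lines, and histogram buckets, which this ignores):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: parse one line of Prometheus text exposition format, e.g.
//   http_requests_total{method="get",code="200"} 1027
public class ScrapeLineParser {

    private static final Pattern LINE =
            Pattern.compile("^([a-zA-Z_:][a-zA-Z0-9_:]*)(?:\\{([^}]*)\\})?\\s+(\\S+)$");

    public record Sample(String name, Map<String, String> labels, double value) {}

    public static Sample parse(String line) {
        Matcher m = LINE.matcher(line.trim());
        if (!m.matches()) {
            throw new IllegalArgumentException("Unparseable line: " + line);
        }
        Map<String, String> labels = new LinkedHashMap<>();
        if (m.group(2) != null && !m.group(2).isEmpty()) {
            // Naive split: breaks on commas inside label values
            for (String pair : m.group(2).split(",")) {
                String[] kv = pair.split("=", 2);
                labels.put(kv[0].trim(), kv[1].trim().replaceAll("^\"|\"$", ""));
            }
        }
        return new Sample(m.group(1), labels, Double.parseDouble(m.group(3)));
    }
}
```

Each parsed Sample maps naturally onto the MetricSample model from earlier, ready to be handed to the BatchMetricsProcessor.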
2. Custom Metrics Collector
import org.springframework.stereotype.Component;
import java.lang.management.*;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicLong;
@Component
public class SystemMetricsCollector {
private final BatchMetricsProcessor batchProcessor;
private final AtomicLong collectionCount;
private final OperatingSystemMXBean osBean;
private final MemoryMXBean memoryBean;
private final ThreadMXBean threadBean;
public SystemMetricsCollector(BatchMetricsProcessor batchProcessor) {
this.batchProcessor = batchProcessor;
this.collectionCount = new AtomicLong(0);
this.osBean = ManagementFactory.getOperatingSystemMXBean();
this.memoryBean = ManagementFactory.getMemoryMXBean();
this.threadBean = ManagementFactory.getThreadMXBean();
startCollection();
}
private void startCollection() {
Thread collectionThread = new Thread(this::collectMetrics, "system-metrics-collector");
collectionThread.setDaemon(true);
collectionThread.start();
}
private void collectMetrics() {
while (!Thread.currentThread().isInterrupted()) {
try {
collectSystemMetrics();
collectJvmMetrics();
collectApplicationMetrics();
collectionCount.incrementAndGet();
Thread.sleep(60000); // Collect every minute
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
break;
} catch (Exception e) {
System.err.println("Error collecting system metrics: " + e.getMessage());
}
}
}
private void collectSystemMetrics() {
long timestamp = System.currentTimeMillis();
// CPU and physical memory are only exposed by the com.sun.management extension
if (osBean instanceof com.sun.management.OperatingSystemMXBean sunOsBean) {
// getCpuLoad()/getProcessCpuLoad() return a negative value when unavailable
double systemCpuLoad = sunOsBean.getCpuLoad();
double processCpuLoad = sunOsBean.getProcessCpuLoad();
if (systemCpuLoad >= 0) {
submitMetric("system.cpu.usage", systemCpuLoad, timestamp, Map.of("type", "system"));
}
if (processCpuLoad >= 0) {
submitMetric("system.cpu.usage", processCpuLoad, timestamp, Map.of("type", "process"));
}
// Physical memory
long totalPhysicalMemory = sunOsBean.getTotalMemorySize();
long freePhysicalMemory = sunOsBean.getFreeMemorySize();
submitMetric("system.memory.total", totalPhysicalMemory, timestamp, Map.of());
submitMetric("system.memory.used", totalPhysicalMemory - freePhysicalMemory, timestamp, Map.of());
submitMetric("system.memory.free", freePhysicalMemory, timestamp, Map.of());
}
// System load average (-1 on platforms where it is unsupported)
double loadAverage = osBean.getSystemLoadAverage();
if (loadAverage >= 0) {
submitMetric("system.load.average", loadAverage, timestamp, Map.of());
}
}
private void collectJvmMetrics() {
long timestamp = System.currentTimeMillis();
// Heap memory
MemoryUsage heapUsage = memoryBean.getHeapMemoryUsage();
submitMetric("jvm.memory.heap.used", heapUsage.getUsed(), timestamp, Map.of());
submitMetric("jvm.memory.heap.committed", heapUsage.getCommitted(), timestamp, Map.of());
submitMetric("jvm.memory.heap.max", heapUsage.getMax(), timestamp, Map.of());
// Non-heap memory
MemoryUsage nonHeapUsage = memoryBean.getNonHeapMemoryUsage();
submitMetric("jvm.memory.nonheap.used", nonHeapUsage.getUsed(), timestamp, Map.of());
submitMetric("jvm.memory.nonheap.committed", nonHeapUsage.getCommitted(), timestamp, Map.of());
// Threads
submitMetric("jvm.threads.live", threadBean.getThreadCount(), timestamp, Map.of());
submitMetric("jvm.threads.daemon", threadBean.getDaemonThreadCount(), timestamp, Map.of());
submitMetric("jvm.threads.peak", threadBean.getPeakThreadCount(), timestamp, Map.of());
// GC statistics
for (GarbageCollectorMXBean gcBean : ManagementFactory.getGarbageCollectorMXBeans()) {
Map<String, String> labels = Map.of("gc", gcBean.getName());
submitMetric("jvm.gc.count", gcBean.getCollectionCount(), timestamp, labels);
submitMetric("jvm.gc.time", gcBean.getCollectionTime(), timestamp, labels);
}
}
private void collectApplicationMetrics() {
long timestamp = System.currentTimeMillis();
// Custom application metrics
submitMetric("application.metrics.collection.count", collectionCount.get(), timestamp, Map.of());
submitMetric("application.batch.queue.size", batchProcessor.getQueueSize(), timestamp, Map.of());
submitMetric("application.batch.current.size", batchProcessor.getCurrentBatchSize(), timestamp, Map.of());
// Simulate some business metrics
double randomMetric = Math.random() * 100;
submitMetric("application.business.metric", randomMetric, timestamp, 
Map.of("source", "simulated"));
}
private void submitMetric(String name, double value, long timestamp, Map<String, String> labels) {
Map<String, String> fullLabels = new HashMap<>(labels);
fullLabels.put("__name__", name);
fullLabels.put("instance", "java-application");
fullLabels.put("job", "mimir-java-client");
MetricSample sample = new MetricSample(timestamp, value, fullLabels);
batchProcessor.submitMetric(sample);
}
private void submitMetric(String name, long value, long timestamp, Map<String, String> labels) {
submitMetric(name, (double) value, timestamp, labels);
}
}

Query and Analytics Layer

1. Mimir Query Service
import org.springframework.stereotype.Service;
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.*;
@Service
public class MimirQueryService {
private final MimirClient mimirClient;
private final DateTimeFormatter formatter;
public MimirQueryService(MimirClient mimirClient) {
this.mimirClient = mimirClient;
this.formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
.withZone(ZoneId.systemDefault());
}
public MetricQueryResult queryMetric(String metricName, long startTime, long endTime, String step) {
try {
String query = metricName;
QueryResponse response = mimirClient.queryRangeMetrics(query, startTime, endTime, step);
return convertToMetricQueryResult(response);
} catch (Exception e) {
throw new RuntimeException("Failed to query metric: " + metricName, e);
}
}
public List<MetricSeries> getMetricSeries(String metricPattern, long startTime, long endTime) {
try {
SeriesResponse response = mimirClient.querySeries(metricPattern, startTime, endTime);
List<MetricSeries> seriesList = new ArrayList<>();
for (Map<String, String> seriesData : response.getData()) {
MetricSeries series = new MetricSeries();
series.setLabels(seriesData);
seriesList.add(series);
}
return seriesList;
} catch (Exception e) {
throw new RuntimeException("Failed to get metric series", e);
}
}
public StatisticalSummary getStatisticalSummary(String metricName, long startTime, long endTime) {
try {
// Query raw data
QueryResponse response = mimirClient.queryRangeMetrics(
metricName, startTime, endTime, "1m");
return calculateStatistics(response);
} catch (Exception e) {
throw new RuntimeException("Failed to calculate statistics for: " + metricName, e);
}
}
public TrendAnalysis analyzeTrend(String metricName, long startTime, long endTime) {
try {
// Query data at appropriate resolution based on time range
String step = calculateOptimalStep(startTime, endTime);
QueryResponse response = mimirClient.queryRangeMetrics(
metricName, startTime, endTime, step);
return performTrendAnalysis(response, metricName);
} catch (Exception e) {
throw new RuntimeException("Failed to analyze trend for: " + metricName, e);
}
}
private MetricQueryResult convertToMetricQueryResult(QueryResponse response) {
MetricQueryResult result = new MetricQueryResult();
if (response.getData() != null && response.getData().getResult() != null) {
List<MetricTimeSeries> timeSeriesList = new ArrayList<>();
for (QueryResult queryResult : response.getData().getResult()) {
MetricTimeSeries timeSeries = new MetricTimeSeries();
timeSeries.setLabels(queryResult.getMetric());
List<DataPoint> dataPoints = new ArrayList<>();
if (queryResult.getValues() != null) {
for (List<Object> valuePair : queryResult.getValues()) {
if (valuePair.size() >= 2) {
long timestamp = ((Number) valuePair.get(0)).longValue();
double value = Double.parseDouble(valuePair.get(1).toString());
dataPoints.add(new DataPoint(timestamp, value));
}
}
}
timeSeries.setDataPoints(dataPoints);
timeSeriesList.add(timeSeries);
}
result.setTimeSeries(timeSeriesList);
}
return result;
}
private StatisticalSummary calculateStatistics(QueryResponse response) {
StatisticalSummary summary = new StatisticalSummary();
List<Double> allValues = new ArrayList<>();
if (response.getData() != null && response.getData().getResult() != null) {
for (QueryResult result : response.getData().getResult()) {
if (result.getValues() != null) {
for (List<Object> valuePair : result.getValues()) {
if (valuePair.size() >= 2) {
double value = Double.parseDouble(valuePair.get(1).toString());
allValues.add(value);
}
}
}
}
}
if (!allValues.isEmpty()) {
double[] values = allValues.stream().mapToDouble(Double::doubleValue).toArray();
Arrays.sort(values);
summary.setCount(values.length);
summary.setMin(values[0]);
summary.setMax(values[values.length - 1]);
summary.setMean(Arrays.stream(values).average().orElse(0.0));
summary.setMedian(values[values.length / 2]);
summary.setP95(values[(int) (values.length * 0.95)]);
summary.setP99(values[(int) (values.length * 0.99)]);
}
return summary;
}
private TrendAnalysis performTrendAnalysis(QueryResponse response, String metricName) {
TrendAnalysis analysis = new TrendAnalysis();
analysis.setMetricName(metricName);
List<Double> values = new ArrayList<>();
if (response.getData() != null && response.getData().getResult() != null) {
for (QueryResult result : response.getData().getResult()) {
if (result.getValues() != null) {
for (List<Object> valuePair : result.getValues()) {
if (valuePair.size() >= 2) {
double value = Double.parseDouble(valuePair.get(1).toString());
values.add(value);
}
}
}
}
}
if (values.size() >= 2) {
// Simple linear regression for trend
double slope = calculateSlope(values);
analysis.setSlope(slope);
analysis.setTrend(slope > 0 ? "INCREASING" : slope < 0 ? "DECREASING" : "STABLE");
analysis.setStrength(Math.abs(slope));
}
return analysis;
}
private double calculateSlope(List<Double> values) {
int n = values.size();
double sumX = 0, sumY = 0, sumXY = 0, sumX2 = 0;
for (int i = 0; i < n; i++) {
double x = i;
double y = values.get(i);
sumX += x;
sumY += y;
sumXY += x * y;
sumX2 += x * x;
}
return (n * sumXY - sumX * sumY) / (n * sumX2 - sumX * sumX);
}
private String calculateOptimalStep(long startTime, long endTime) {
long duration = endTime - startTime;
if (duration <= 3600) { // 1 hour
return "15s";
} else if (duration <= 86400) { // 1 day
return "1m";
} else if (duration <= 604800) { // 1 week
return "5m";
} else if (duration <= 2592000) { // 30 days
return "15m";
} else {
return "1h";
}
}
}
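The least-squares slope behind analyzeTrend is the part most worth verifying in isolation. The sketch below mirrors the calculateSlope arithmetic in a standalone class (the class name is illustrative) and checks it against series with a known slope:

```java
import java.util.Arrays;
import java.util.List;

public class SlopeCheck {
    // Mirrors MimirQueryService.calculateSlope: least-squares fit of
    // values against their index positions 0..n-1
    static double slope(List<Double> values) {
        int n = values.size();
        double sumX = 0, sumY = 0, sumXY = 0, sumX2 = 0;
        for (int i = 0; i < n; i++) {
            double x = i;
            double y = values.get(i);
            sumX += x;
            sumY += y;
            sumXY += x * y;
            sumX2 += x * x;
        }
        return (n * sumXY - sumX * sumY) / (n * sumX2 - sumX * sumX);
    }

    public static void main(String[] args) {
        // y = 2x + 1 -> slope is exactly 2.0
        System.out.println(slope(Arrays.asList(1.0, 3.0, 5.0, 7.0)));
        // Flat series -> slope 0.0, reported as "STABLE" by performTrendAnalysis
        System.out.println(slope(Arrays.asList(4.0, 4.0, 4.0)));
    }
}
```

For a perfectly linear series the formula is exact; on noisy metric data the sign of the slope drives the INCREASING/DECREASING/STABLE classification.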
// Supporting data classes
public class MetricQueryResult {
private List<MetricTimeSeries> timeSeries;
public List<MetricTimeSeries> getTimeSeries() { return timeSeries; }
public void setTimeSeries(List<MetricTimeSeries> timeSeries) { this.timeSeries = timeSeries; }
}
public class MetricTimeSeries {
private Map<String, String> labels;
private List<DataPoint> dataPoints;
public Map<String, String> getLabels() { return labels; }
public void setLabels(Map<String, String> labels) { this.labels = labels; }
public List<DataPoint> getDataPoints() { return dataPoints; }
public void setDataPoints(List<DataPoint> dataPoints) { this.dataPoints = dataPoints; }
}
public class DataPoint {
private long timestamp;
private double value;
public DataPoint() {}
public DataPoint(long timestamp, double value) {
this.timestamp = timestamp;
this.value = value;
}
public long getTimestamp() { return timestamp; }
public void setTimestamp(long timestamp) { this.timestamp = timestamp; }
public double getValue() { return value; }
public void setValue(double value) { this.value = value; }
}
public class StatisticalSummary {
private int count;
private double min;
private double max;
private double mean;
private double median;
private double p95;
private double p99;
// Getters and setters
public int getCount() { return count; }
public void setCount(int count) { this.count = count; }
public double getMin() { return min; }
public void setMin(double min) { this.min = min; }
public double getMax() { return max; }
public void setMax(double max) { this.max = max; }
public double getMean() { return mean; }
public void setMean(double mean) { this.mean = mean; }
public double getMedian() { return median; }
public void setMedian(double median) { this.median = median; }
public double getP95() { return p95; }
public void setP95(double p95) { this.p95 = p95; }
public double getP99() { return p99; }
public void setP99(double p99) { this.p99 = p99; }
}
public class TrendAnalysis {
private String metricName;
private double slope;
private String trend;
private double strength;
// Getters and setters
public String getMetricName() { return metricName; }
public void setMetricName(String metricName) { this.metricName = metricName; }
public double getSlope() { return slope; }
public void setSlope(double slope) { this.slope = slope; }
public String getTrend() { return trend; }
public void setTrend(String trend) { this.trend = trend; }
public double getStrength() { return strength; }
public void setStrength(double strength) { this.strength = strength; }
}
public class MetricSeries {
private Map<String, String> labels;
public Map<String, String> getLabels() { return labels; }
public void setLabels(Map<String, String> labels) { this.labels = labels; }
}
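One subtlety in calculateStatistics is the percentile index: it uses simple truncation, `values[(int) (values.length * 0.95)]`, rather than a nearest-rank formula. The standalone sketch below (class name is illustrative) shows what that yields on a sorted 100-element sample, so you can judge whether a stricter interpolating percentile is needed for your dashboards:

```java
import java.util.stream.IntStream;

public class PercentileCheck {
    // Same truncation-based indexing used in calculateStatistics;
    // assumes the input array is already sorted ascending
    static double percentile(double[] sorted, double q) {
        return sorted[(int) (sorted.length * q)];
    }

    public static void main(String[] args) {
        // Sorted sample 0.0 .. 99.0
        double[] values = IntStream.range(0, 100).asDoubleStream().toArray();
        System.out.println(percentile(values, 0.95)); // index 95 -> 95.0
        System.out.println(percentile(values, 0.99)); // index 99 -> 99.0
    }
}
```

For large samples the difference from nearest-rank percentiles is negligible, but for small result sets (short ranges, coarse steps) truncation can land one element high.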

REST API Controllers

1. Metrics Query API
import org.springframework.web.bind.annotation.*;
import org.springframework.http.ResponseEntity;
import java.time.Instant;
import java.util.List;
import java.util.Map;
@RestController
@RequestMapping("/api/metrics")
public class MetricsController {
private final MimirQueryService queryService;
private final RetentionPolicyManager retentionManager;
public MetricsController(MimirQueryService queryService, 
RetentionPolicyManager retentionManager) {
this.queryService = queryService;
this.retentionManager = retentionManager;
}
@GetMapping("/query")
public ResponseEntity<MetricQueryResult> queryMetrics(
@RequestParam String query,
@RequestParam(defaultValue = "1h") String range,
@RequestParam(defaultValue = "1m") String step) {
long endTime = Instant.now().getEpochSecond();
long startTime = calculateStartTime(endTime, range);
MetricQueryResult result = queryService.queryMetric(query, startTime, endTime, step);
return ResponseEntity.ok(result);
}
@GetMapping("/series")
public ResponseEntity<List<MetricSeries>> getMetricSeries(
@RequestParam String match,
@RequestParam(defaultValue = "1h") String range) {
long endTime = Instant.now().getEpochSecond();
long startTime = calculateStartTime(endTime, range);
List<MetricSeries> series = queryService.getMetricSeries(match, startTime, endTime);
return ResponseEntity.ok(series);
}
@GetMapping("/stats/{metricName}")
public ResponseEntity<StatisticalSummary> getMetricStatistics(
@PathVariable String metricName,
@RequestParam(defaultValue = "24h") String range) {
long endTime = Instant.now().getEpochSecond();
long startTime = calculateStartTime(endTime, range);
StatisticalSummary stats = queryService.getStatisticalSummary(metricName, startTime, endTime);
return ResponseEntity.ok(stats);
}
@GetMapping("/trend/{metricName}")
public ResponseEntity<TrendAnalysis> analyzeMetricTrend(
@PathVariable String metricName,
@RequestParam(defaultValue = "7d") String range) {
long endTime = Instant.now().getEpochSecond();
long startTime = calculateStartTime(endTime, range);
TrendAnalysis analysis = queryService.analyzeTrend(metricName, startTime, endTime);
return ResponseEntity.ok(analysis);
}
@PostMapping("/retention/{metricName}")
public ResponseEntity<String> applyRetentionPolicy(
@PathVariable String metricName,
@RequestParam String policy) {
retentionManager.applyRetentionPolicy(metricName, policy);
return ResponseEntity.ok("Retention policy applied successfully");
}
@GetMapping("/retention/policies")
public ResponseEntity<Map<String, RetentionPolicy>> getRetentionPolicies() {
return ResponseEntity.ok(retentionManager.getPolicies());
}
private long calculateStartTime(long endTime, String range) {
if (range.endsWith("h")) {
long hours = Long.parseLong(range.substring(0, range.length() - 1));
return endTime - (hours * 3600L);
} else if (range.endsWith("d")) {
long days = Long.parseLong(range.substring(0, range.length() - 1));
return endTime - (days * 86400L);
} else if (range.endsWith("w")) {
long weeks = Long.parseLong(range.substring(0, range.length() - 1));
return endTime - (weeks * 604800L);
} else {
return endTime - 3600; // Default: 1 hour
}
}
}
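The range-string handling in calculateStartTime is the kind of suffix parsing that is easy to get subtly wrong, so it is worth checking in isolation. This standalone sketch mirrors the controller's logic (the class name is illustrative):

```java
public class RangeParser {
    // Mirrors MetricsController.calculateStartTime: subtract the parsed
    // range ("2h", "7d", "1w") from the end timestamp; unknown suffixes
    // fall back to a 1-hour window
    static long startTime(long endTime, String range) {
        if (range.endsWith("h")) {
            return endTime - Long.parseLong(range.substring(0, range.length() - 1)) * 3600L;
        } else if (range.endsWith("d")) {
            return endTime - Long.parseLong(range.substring(0, range.length() - 1)) * 86400L;
        } else if (range.endsWith("w")) {
            return endTime - Long.parseLong(range.substring(0, range.length() - 1)) * 604800L;
        }
        return endTime - 3600L; // default: 1 hour
    }

    public static void main(String[] args) {
        System.out.println(startTime(1_000_000L, "2h")); // 1_000_000 - 7_200   = 992800
        System.out.println(startTime(1_000_000L, "7d")); // 1_000_000 - 604_800 = 395200
        System.out.println(startTime(1_000_000L, "xx")); // default 1h          = 996400
    }
}
```

In a production controller you would also want to reject malformed numbers (Long.parseLong throws NumberFormatException) and cap the maximum range to protect Mimir from unbounded queries.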
2. Monitoring Dashboard API
@RestController
@RequestMapping("/api/dashboard")
public class DashboardController {
private final MimirQueryService queryService;
private final BatchMetricsProcessor batchProcessor;
public DashboardController(MimirQueryService queryService, 
BatchMetricsProcessor batchProcessor) {
this.queryService = queryService;
this.batchProcessor = batchProcessor;
}
@GetMapping("/overview")
public ResponseEntity<DashboardOverview> getDashboardOverview() {
DashboardOverview overview = new DashboardOverview();
long now = Instant.now().getEpochSecond();
long oneHourAgo = now - 3600;
long oneDayAgo = now - 86400;
// System metrics
overview.setCpuUsage(getCurrentCpuUsage());
overview.setMemoryUsage(getCurrentMemoryUsage());
overview.setDiskUsage(getCurrentDiskUsage());
// Application metrics
overview.setRequestRate(getRequestRate(oneHourAgo, now));
overview.setErrorRate(getErrorRate(oneHourAgo, now));
overview.setQueueSize(batchProcessor.getQueueSize());
// Long-term trends
overview.setDailyTrend(getDailyTrend(oneDayAgo, now));
return ResponseEntity.ok(overview);
}
@GetMapping("/long-term/{metricName}")
public ResponseEntity<LongTermView> getLongTermView(
@PathVariable String metricName,
@RequestParam(defaultValue = "30d") String period) {
LongTermView view = new LongTermView();
view.setMetricName(metricName);
view.setPeriod(period);
long endTime = Instant.now().getEpochSecond();
long startTime = calculateStartTime(endTime, period);
// Get data at different resolutions for the period
String step = calculateStepForPeriod(period);
MetricQueryResult result = queryService.queryMetric(metricName, startTime, endTime, step);
view.setData(result);
// Calculate trends
TrendAnalysis trend = queryService.analyzeTrend(metricName, startTime, endTime);
view.setTrend(trend);
// Statistical summary
StatisticalSummary stats = queryService.getStatisticalSummary(metricName, startTime, endTime);
view.setStatistics(stats);
return ResponseEntity.ok(view);
}
private double getCurrentCpuUsage() {
// Implementation would query Mimir for current CPU usage
return 45.7; // Example value
}
private double getCurrentMemoryUsage() {
// Implementation would query Mimir for current memory usage
return 67.2; // Example value
}
private double getCurrentDiskUsage() {
// Implementation would query Mimir for current disk usage
return 23.1; // Example value
}
private double getRequestRate(long startTime, long endTime) {
// Implementation would query request rate from Mimir
return 1250.5; // Example value - requests per second
}
private double getErrorRate(long startTime, long endTime) {
// Implementation would query error rate from Mimir
return 2.3; // Example value - percentage
}
private String getDailyTrend(long startTime, long endTime) {
// Implementation would analyze daily trend
return "STABLE"; // Example value
}
private long calculateStartTime(long endTime, String range) {
// Simplified placeholder; a full implementation would parse the
// range string ("1h", "7d", "2w") as MetricsController does
return endTime - 3600;
}
private String calculateStepForPeriod(String period) {
if (period.endsWith("h")) {
return "1m";
} else if (period.endsWith("d")) {
int days = Integer.parseInt(period.substring(0, period.length() - 1));
return days <= 7 ? "5m" : "1h";
} else if (period.endsWith("w")) {
return "1h";
} else if (period.endsWith("y")) {
return "1d";
} else {
return "1h";
}
}
}
public class DashboardOverview {
private double cpuUsage;
private double memoryUsage;
private double diskUsage;
private double requestRate;
private double errorRate;
private int queueSize;
private String dailyTrend;
// Getters and setters
public double getCpuUsage() { return cpuUsage; }
public void setCpuUsage(double cpuUsage) { this.cpuUsage = cpuUsage; }
public double getMemoryUsage() { return memoryUsage; }
public void setMemoryUsage(double memoryUsage) { this.memoryUsage = memoryUsage; }
public double getDiskUsage() { return diskUsage; }
public void setDiskUsage(double diskUsage) { this.diskUsage = diskUsage; }
public double getRequestRate() { return requestRate; }
public void setRequestRate(double requestRate) { this.requestRate = requestRate; }
public double getErrorRate() { return errorRate; }
public void setErrorRate(double errorRate) { this.errorRate = errorRate; }
public int getQueueSize() { return queueSize; }
public void setQueueSize(int queueSize) { this.queueSize = queueSize; }
public String getDailyTrend() { return dailyTrend; }
public void setDailyTrend(String dailyTrend) { this.dailyTrend = dailyTrend; }
}
public class LongTermView {
private String metricName;
private String period;
private MetricQueryResult data;
private TrendAnalysis trend;
private StatisticalSummary statistics;
// Getters and setters
public String getMetricName() { return metricName; }
public void setMetricName(String metricName) { this.metricName = metricName; }
public String getPeriod() { return period; }
public void setPeriod(String period) { this.period = period; }
public MetricQueryResult getData() { return data; }
public void setData(MetricQueryResult data) { this.data = data; }
public TrendAnalysis getTrend() { return trend; }
public void setTrend(TrendAnalysis trend) { this.trend = trend; }
public StatisticalSummary getStatistics() { return statistics; }
public void setStatistics(StatisticalSummary statistics) { this.statistics = statistics; }
}

Configuration and Main Application

1. Spring Boot Configuration
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import com.fasterxml.jackson.databind.ObjectMapper;
@SpringBootApplication
@EnableScheduling
public class MimirLongTermApplication {
public static void main(String[] args) {
SpringApplication.run(MimirLongTermApplication.class, args);
}
@Bean
public ObjectMapper objectMapper() {
ObjectMapper mapper = new ObjectMapper();
mapper.findAndRegisterModules();
return mapper;
}
}
@Configuration
class MimirConfig {
@Bean
@ConfigurationProperties(prefix = "mimir")
public MimirProperties mimirProperties() {
return new MimirProperties();
}
@Bean
public MimirClient mimirClient(MimirProperties properties) {
return new MimirClient(
properties.getWriteUrl(),
properties.getReadUrl(),
properties.getTenantId(),
properties.getTimeout()
);
}
}
// Bound to the "mimir" prefix via the @ConfigurationProperties bean method in MimirConfig
public class MimirProperties {
private String url;
private String readUrl;
private String writeUrl;
private String tenantId;
private long timeout = 30000;
private int batchSize = 1000;
private long flushInterval = 5000;
private RetentionConfig retention;
// Getters and setters
public String getUrl() { return url; }
public void setUrl(String url) { this.url = url; }
public String getReadUrl() { return readUrl; }
public void setReadUrl(String readUrl) { this.readUrl = readUrl; }
public String getWriteUrl() { return writeUrl; }
public void setWriteUrl(String writeUrl) { this.writeUrl = writeUrl; }
public String getTenantId() { return tenantId; }
public void setTenantId(String tenantId) { this.tenantId = tenantId; }
public long getTimeout() { return timeout; }
public void setTimeout(long timeout) { this.timeout = timeout; }
public int getBatchSize() { return batchSize; }
public void setBatchSize(int batchSize) { this.batchSize = batchSize; }
public long getFlushInterval() { return flushInterval; }
public void setFlushInterval(long flushInterval) { this.flushInterval = flushInterval; }
public RetentionConfig getRetention() { return retention; }
public void setRetention(RetentionConfig retention) { this.retention = retention; }
}
public class RetentionConfig {
private boolean enabled;
private List<RetentionPolicyConfig> policies;
// Getters and setters
public boolean isEnabled() { return enabled; }
public void setEnabled(boolean enabled) { this.enabled = enabled; }
public List<RetentionPolicyConfig> getPolicies() { return policies; }
public void setPolicies(List<RetentionPolicyConfig> policies) { this.policies = policies; }
}
public class RetentionPolicyConfig {
private String name;
private String duration;
private String resolution;
// Getters and setters
public String getName() { return name; }
public void setName(String name) { this.name = name; }
public String getDuration() { return duration; }
public void setDuration(String duration) { this.duration = duration; }
public String getResolution() { return resolution; }
public void setResolution(String resolution) { this.resolution = resolution; }
}
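The property classes above bind to an application.yml along the following lines. The endpoint URLs and tenant are placeholders for your own Mimir deployment; the field names follow Spring's relaxed binding of the getters/setters defined above:

```yaml
mimir:
  url: http://mimir.example.internal:9009
  read-url: http://mimir.example.internal:9009/prometheus
  write-url: http://mimir.example.internal:9009/api/v1/push
  tenant-id: team-platform
  timeout: 30000          # ms
  batch-size: 1000
  flush-interval: 5000    # ms
  retention:
    enabled: true
    policies:
      - name: raw
        duration: 30d
        resolution: 15s
      - name: downsampled
        duration: 1y
        resolution: 1h
```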

Best Practices for Long-Term Storage

  1. Data Modeling: Use meaningful label names and avoid high cardinality
  2. Retention Strategy: Implement tiered retention with downsampling
  3. Monitoring: Monitor Mimir performance and storage usage
  4. Backup: Implement regular backups of critical metrics
  5. Compression: Leverage Mimir's efficient compression algorithms
  6. Query Optimization: Use appropriate time ranges and step parameters
// Example of efficient query patterns
public class EfficientQueryPatterns {
public void demonstrateEfficientQueries(MimirQueryService queryService) {
long now = Instant.now().getEpochSecond();
// Good: Appropriate time range and step
queryService.queryMetric("cpu_usage", now - 3600, now, "15s");
// Good: Long-term view with proper downsampling
queryService.queryMetric("memory_usage", now - 2592000, now, "1h");
// Avoid: Too high resolution for long periods
// queryService.queryMetric("network_bytes", now - 604800, now, "1s");
// Use: Statistical queries for summaries
queryService.getStatisticalSummary("request_duration", now - 86400, now);
}
}
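Best practice 1 (avoiding high cardinality) deserves an equally concrete illustration: every distinct label combination becomes a separate series that Mimir must index and retain, so unbounded label values multiply storage and query cost. The label sets below are illustrative:

```java
import java.util.Map;

public class CardinalityExamples {
    // Good: every label value comes from a small, fixed set, so the total
    // series count stays bounded (services x status codes x regions)
    static final Map<String, String> BOUNDED = Map.of(
        "service", "checkout",
        "status", "500",
        "region", "eu-west-1"
    );

    // Bad: user_id and request_id are effectively unbounded, so every new
    // user or request mints a brand-new series that lives for the full
    // retention period
    static final Map<String, String> UNBOUNDED = Map.of(
        "service", "checkout",
        "user_id", "u-48151623",
        "request_id", "9f8c0d2e"
    );

    public static void main(String[] args) {
        System.out.println("bounded label count: " + BOUNDED.size());
        System.out.println("unbounded label count: " + UNBOUNDED.size());
    }
}
```

Per-user or per-request detail belongs in logs or traces; in metrics, keep such values out of labels and aggregate instead.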

Conclusion

This comprehensive Mimir integration for long-term monitoring provides:

  • Efficient metrics collection with batching and compression
  • Flexible retention policies with automatic downsampling
  • Powerful query capabilities for historical analysis
  • Real-time and long-term monitoring dashboards
  • Scalable architecture suitable for enterprise applications

The implementation demonstrates how to build a complete observability stack using Mimir as the long-term storage backend, enabling organizations to maintain years of historical metrics data while optimizing storage costs and query performance.

Java Observability, Logging Intelligence & AI-Driven Monitoring (APM, Tracing, Logs & Anomaly Detection)

https://macronepal.com/blog/beyond-metrics-observing-serverless-and-traditional-java-applications-with-thundra-apm/
Explains using Thundra APM to observe both serverless and traditional Java applications by combining tracing, metrics, and logs into a unified observability platform for faster debugging and performance insights.

https://macronepal.com/blog/dynatrace-oneagent-in-java-2/
Explains Dynatrace OneAgent for Java, which automatically instruments JVM applications to capture metrics, traces, and logs, enabling full-stack monitoring and root-cause analysis with minimal configuration.

https://macronepal.com/blog/lightstep-java-sdk-distributed-tracing-and-observability-implementation/
Explains Lightstep Java SDK for distributed tracing, helping developers track requests across microservices and identify latency issues using OpenTelemetry-based observability.

https://macronepal.com/blog/honeycomb-io-beeline-for-java-complete-guide-2/
Explains Honeycomb Beeline for Java, which provides high-cardinality observability and deep query capabilities to understand complex system behavior and debug distributed systems efficiently.

https://macronepal.com/blog/lumigo-for-serverless-in-java-complete-distributed-tracing-guide-2/
Explains Lumigo for Java serverless applications, offering automatic distributed tracing, log correlation, and error tracking to simplify debugging in cloud-native environments.

https://macronepal.com/blog/from-noise-to-signals-implementing-log-anomaly-detection-in-java-applications/
Explains how to detect anomalies in Java logs using behavioral patterns and machine learning techniques to separate meaningful incidents from noisy log data and improve incident response.

https://macronepal.com/blog/ai-powered-log-analysis-in-java-from-reactive-debugging-to-proactive-insights/
Explains AI-driven log analysis for Java applications, shifting from manual debugging to predictive insights that identify issues early and improve system reliability using intelligent log processing.

https://macronepal.com/blog/titliel-java-logging-best-practices/
Explains best practices for Java logging, focusing on structured logs, proper log levels, performance optimization, and ensuring logs are useful for debugging and observability systems.

https://macronepal.com/blog/seeking-a-loguru-for-java-the-quest-for-elegant-and-simple-logging/
Explains the search for simpler, more elegant logging frameworks in Java, comparing modern logging approaches that aim to reduce complexity while improving readability and developer experience.
