OpenSearch Dashboards is the visualization layer for the OpenSearch ecosystem, providing powerful data exploration, dashboard creation, and real-time analytics capabilities. When combined with Java applications, it enables developers to build comprehensive monitoring, logging, and business intelligence solutions.
What is OpenSearch Dashboards?
OpenSearch Dashboards is an open-source data visualization and exploration tool that works seamlessly with OpenSearch. It provides:
- Interactive dashboards for data visualization
- Data discovery and exploration tools
- Real-time analytics and monitoring
- Alerting and notification system
- Custom plugin development framework
Java Integration Architecture
[Java Application] → [OpenSearch Client] → [OpenSearch Cluster] ← [OpenSearch Dashboards]
        |                    |                      |                       |
   Log events,          Index documents       Store and search        Visualize, analyze,
   metrics,             via REST API          data                    monitor data
   business data                                                      with dashboards
Hands-On Tutorial: Building a Complete Monitoring Solution
Let's build a Java application that sends structured data to OpenSearch and creates comprehensive dashboards for monitoring and analysis.
Step 1: Project Setup and Dependencies
Maven Dependencies (pom.xml):
<properties>
<opensearch.version>2.11.0</opensearch.version>
<spring-boot.version>3.2.0</spring-boot.version>
<jackson.version>2.16.1</jackson.version>
</properties>
<dependencies>
<!-- Spring Boot -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<version>${spring-boot.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
<version>${spring-boot.version}</version>
</dependency>
<!-- OpenSearch Java Client -->
<dependency>
<groupId>org.opensearch.client</groupId>
<artifactId>opensearch-java</artifactId>
<version>${opensearch.version}</version>
</dependency>
<!-- OpenSearch REST Client (low-level transport) -->
<dependency>
<groupId>org.opensearch.client</groupId>
<artifactId>opensearch-rest-client</artifactId>
<version>${opensearch.version}</version>
</dependency>
<!-- OpenSearch High-Level REST Client (used by the custom Logback appender below) -->
<dependency>
<groupId>org.opensearch.client</groupId>
<artifactId>opensearch-rest-high-level-client</artifactId>
<version>${opensearch.version}</version>
</dependency>
<!-- Jackson for JSON processing -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
<version>${jackson.version}</version>
</dependency>
<!-- Micrometer for metrics -->
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-core</artifactId>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-registry-prometheus</artifactId>
</dependency>
<!-- Logback with OpenSearch appender -->
<dependency>
<groupId>net.logstash.logback</groupId>
<artifactId>logstash-logback-encoder</artifactId>
<version>7.4</version>
</dependency>
</dependencies>
Step 2: OpenSearch Client Configuration
application.yml:
opensearch:
  host: localhost
  port: 9200
  protocol: http
  username: admin
  password: admin
  indices:
    application-logs: application-logs
    business-metrics: business-metrics
    system-metrics: system-metrics
    user-activity: user-activity

# Application configuration
spring:
  application:
    name: ecommerce-service

# Logging configuration
logging:
  level:
    org.opensearch: INFO
    com.example: DEBUG
Java Configuration:
@Configuration
public class OpenSearchConfig {
@Value("${opensearch.host}")
private String host;
@Value("${opensearch.port}")
private int port;
@Value("${opensearch.username}")
private String username;
@Value("${opensearch.password}")
private String password;
@Bean
public RestClient restClient() {
return RestClient.builder(new HttpHost(host, port))
.setHttpClientConfigCallback(httpClientBuilder -> {
// Add basic authentication
CredentialsProvider credentialsProvider = new BasicCredentialsProvider();
credentialsProvider.setCredentials(
AuthScope.ANY,
new UsernamePasswordCredentials(username, password)
);
return httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider);
})
.setRequestConfigCallback(requestConfigBuilder ->
requestConfigBuilder
.setConnectTimeout(5000)
.setSocketTimeout(60000)
)
.build();
}
@Bean
public OpenSearchClient openSearchClient(RestClient restClient) {
return new OpenSearchClient(restClient);
}
@Bean
public ObjectMapper objectMapper() {
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new JavaTimeModule());
mapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
return mapper;
}
}
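Under the hood, the `BasicCredentialsProvider` configured above simply results in a standard `Authorization: Basic …` header being attached to each request. A minimal, dependency-free sketch of how that header value is built (class name is illustrative) can be handy when reproducing auth failures with curl:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthSketch {
    // Builds the Authorization header value that Basic auth sends:
    // "Basic " + base64(username + ":" + password)
    static String basicAuthHeader(String username, String password) {
        String token = Base64.getEncoder()
                .encodeToString((username + ":" + password).getBytes(StandardCharsets.UTF_8));
        return "Basic " + token;
    }

    public static void main(String[] args) {
        System.out.println(basicAuthHeader("admin", "admin")); // → Basic YWRtaW46YWRtaW4=
    }
}
```

The same value is what you pass to `curl -H "Authorization: Basic …"` when testing the cluster directly.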
Step 3: OpenSearch Service Abstraction
@Service
public class OpenSearchService {
private final OpenSearchClient client;
private final ObjectMapper objectMapper;
private final String applicationLogsIndex;
private final String businessMetricsIndex;
private final String systemMetricsIndex;
private final String userActivityIndex;
private static final Logger logger = LoggerFactory.getLogger(OpenSearchService.class);
public OpenSearchService(OpenSearchClient client,
ObjectMapper objectMapper,
@Value("${opensearch.indices.application-logs}") String applicationLogsIndex,
@Value("${opensearch.indices.business-metrics}") String businessMetricsIndex,
@Value("${opensearch.indices.system-metrics}") String systemMetricsIndex,
@Value("${opensearch.indices.user-activity}") String userActivityIndex) {
this.client = client;
this.objectMapper = objectMapper;
this.applicationLogsIndex = applicationLogsIndex;
this.businessMetricsIndex = businessMetricsIndex;
this.systemMetricsIndex = systemMetricsIndex;
this.userActivityIndex = userActivityIndex;
}
/**
* Index a document with automatic timestamp
*/
public <T> void indexDocument(String index, T document) {
try {
IndexResponse response = client.index(IndexRequest.of(idx -> idx
.index(index)
.document(objectMapper.convertValue(document, Map.class))
));
logger.debug("Document indexed successfully: {}", response.id());
} catch (Exception e) {
logger.error("Failed to index document in index: {}", index, e);
throw new OpenSearchException("Indexing failed", e);
}
}
/**
* Bulk index multiple documents
*/
public <T> void bulkIndexDocuments(String index, List<T> documents) {
if (documents.isEmpty()) return;
BulkRequest.Builder bulkBuilder = new BulkRequest.Builder();
for (T document : documents) {
bulkBuilder.operations(op -> op
.index(idx -> idx
.index(index)
.document(objectMapper.convertValue(document, Map.class))
)
);
}
try {
BulkResponse response = client.bulk(bulkBuilder.build());
if (response.errors()) {
logger.warn("Some documents failed in bulk index operation");
response.items().forEach(item -> {
if (item.error() != null) {
logger.error("Document error: {}", item.error().reason());
}
});
} else {
logger.info("Bulk indexed {} documents to index: {}", documents.size(), index);
}
} catch (Exception e) {
logger.error("Bulk indexing failed for index: {}", index, e);
throw new OpenSearchException("Bulk indexing failed", e);
}
}
/**
* Search documents with query
*/
public <T> List<T> searchDocuments(String index, Query query, Class<T> type) {
try {
SearchResponse<Map> response = client.search(SearchRequest.of(sr -> sr
.index(index)
.query(query)
.size(1000)
), Map.class);
return response.hits().hits().stream()
.map(hit -> objectMapper.convertValue(hit.source(), type))
.collect(Collectors.toList());
} catch (Exception e) {
logger.error("Search failed for index: {}", index, e);
throw new OpenSearchException("Search failed", e);
}
}
/**
* Create index with mapping
*/
public void createIndexWithMapping(String index, Map<String, Property> properties) {
try {
CreateIndexResponse response = client.indices().create(CreateIndexRequest.of(cir -> cir
.index(index)
.mappings(TypeMapping.of(tm -> tm
.properties(properties)
))
));
logger.info("Index created successfully: {}", index);
} catch (Exception e) {
logger.error("Failed to create index: {}", index, e);
throw new OpenSearchException("Index creation failed", e);
}
}
/**
* Get index mapping
*/
public GetIndexResponse getIndexMapping(String index) {
try {
return client.indices().get(GetIndexRequest.of(gir -> gir
.index(index)
));
} catch (Exception e) {
logger.error("Failed to get mapping for index: {}", index, e);
throw new OpenSearchException("Failed to get mapping", e);
}
}
}
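The bulk helper above delegates serialization to the client, but it helps to know what actually goes over the wire: the `_bulk` endpoint expects newline-delimited JSON, one action line plus one source line per document, terminated by a trailing newline. A minimal sketch, assuming pre-serialized JSON documents (class and method names are illustrative):

```java
import java.util.List;

public class BulkBodySketch {
    // Builds an NDJSON _bulk request body: one action line + one source line
    // per document, with the trailing newline the _bulk API requires.
    static String buildBulkBody(String index, List<String> jsonDocs) {
        StringBuilder body = new StringBuilder();
        for (String doc : jsonDocs) {
            body.append("{\"index\":{\"_index\":\"").append(index).append("\"}}\n");
            body.append(doc).append("\n");
        }
        return body.toString();
    }

    public static void main(String[] args) {
        System.out.print(buildBulkBody("business-metrics",
                List.of("{\"metricName\":\"orders.created\",\"value\":99.99}")));
    }
}
```

Seeing this format makes bulk error messages much easier to interpret: each item in the bulk response corresponds to one action/source pair in the request body.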
Step 4: Data Models for Dashboards
Application Log Event:
public class ApplicationLogEvent {
private String id;
private String timestamp;
private String level;
private String logger;
private String message;
private String thread;
private String stackTrace;
private String application;
private String environment;
private String hostname;
private Map<String, Object> mdc; // Mapped Diagnostic Context
// Constructors
public ApplicationLogEvent() {
this.timestamp = Instant.now().toString();
this.id = UUID.randomUUID().toString();
}
public ApplicationLogEvent(String level, String logger, String message) {
this();
this.level = level;
this.logger = logger;
this.message = message;
this.application = "ecommerce-service";
this.environment = System.getenv().getOrDefault("ENVIRONMENT", "development");
}
// Getters and setters...
public String getId() { return id; }
public void setId(String id) { this.id = id; }
public String getTimestamp() { return timestamp; }
public void setTimestamp(String timestamp) { this.timestamp = timestamp; }
public String getLevel() { return level; }
public void setLevel(String level) { this.level = level; }
public String getLogger() { return logger; }
public void setLogger(String logger) { this.logger = logger; }
public String getMessage() { return message; }
public void setMessage(String message) { this.message = message; }
public String getThread() { return thread; }
public void setThread(String thread) { this.thread = thread; }
public String getStackTrace() { return stackTrace; }
public void setStackTrace(String stackTrace) { this.stackTrace = stackTrace; }
public String getApplication() { return application; }
public void setApplication(String application) { this.application = application; }
public String getEnvironment() { return environment; }
public void setEnvironment(String environment) { this.environment = environment; }
public String getHostname() { return hostname; }
public void setHostname(String hostname) { this.hostname = hostname; }
public Map<String, Object> getMdc() { return mdc; }
public void setMdc(Map<String, Object> mdc) { this.mdc = mdc; }
}
Business Metrics:
public class BusinessMetrics {
private String id;
private String timestamp;
private String metricName;
private double value;
private Map<String, String> tags;
private String application;
private String environment;
// Business-specific metrics
public static final String ORDERS_CREATED = "orders.created";
public static final String PAYMENTS_PROCESSED = "payments.processed";
public static final String REVENUE_GENERATED = "revenue.generated";
public static final String USERS_REGISTERED = "users.registered";
public static final String INVENTORY_UPDATED = "inventory.updated";
public BusinessMetrics() {
this.id = UUID.randomUUID().toString();
this.timestamp = Instant.now().toString();
this.application = "ecommerce-service";
this.environment = System.getenv().getOrDefault("ENVIRONMENT", "development");
this.tags = new HashMap<>(); // avoid NPE in withTag when this constructor is used
}
public BusinessMetrics(String metricName, double value) {
this();
this.metricName = metricName;
this.value = value;
this.tags = new HashMap<>();
}
public BusinessMetrics withTag(String key, String value) {
this.tags.put(key, value);
return this;
}
// Getters and setters...
}
User Activity Event:
public class UserActivityEvent {
private String id;
private String timestamp;
private String userId;
private String sessionId;
private String action;
private String resource;
private String httpMethod;
private int statusCode;
private long responseTime;
private String userAgent;
private String ipAddress;
private Map<String, Object> metadata;
// Common actions
public static final String ACTION_LOGIN = "LOGIN";
public static final String ACTION_LOGOUT = "LOGOUT";
public static final String ACTION_VIEW_PRODUCT = "VIEW_PRODUCT";
public static final String ACTION_ADD_TO_CART = "ADD_TO_CART";
public static final String ACTION_PURCHASE = "PURCHASE";
public static final String ACTION_SEARCH = "SEARCH";
public UserActivityEvent() {
this.id = UUID.randomUUID().toString();
this.timestamp = Instant.now().toString();
this.metadata = new HashMap<>(); // avoid NPE in withMetadata when this constructor is used
}
public UserActivityEvent(String userId, String action, String resource) {
this();
this.userId = userId;
this.action = action;
this.resource = resource;
this.metadata = new HashMap<>();
}
public UserActivityEvent withMetadata(String key, Object value) {
this.metadata.put(key, value);
return this;
}
// Getters and setters...
}
Step 5: Logback Configuration for OpenSearch
logback-spring.xml:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<include resource="org/springframework/boot/logging/logback/defaults.xml"/>
<springProperty scope="context" name="APPLICATION_NAME" source="spring.application.name"/>
<springProperty scope="context" name="ENVIRONMENT" source="spring.profiles.active" defaultValue="development"/>
<!-- OpenSearch Appender -->
<appender name="OPENSEARCH" class="com.example.logging.OpenSearchAppender">
<index>application-logs</index>
<application>${APPLICATION_NAME}</application>
<environment>${ENVIRONMENT}</environment>
<url>http://localhost:9200</url>
<username>admin</username>
<password>admin</password>
</appender>
<!-- Console Appender -->
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<!-- Async OpenSearch Appender -->
<appender name="ASYNC_OPENSEARCH" class="ch.qos.logback.classic.AsyncAppender">
<appender-ref ref="OPENSEARCH"/>
<queueSize>1000</queueSize>
<discardingThreshold>0</discardingThreshold>
<includeCallerData>true</includeCallerData>
</appender>
<root level="INFO">
<appender-ref ref="CONSOLE"/>
<appender-ref ref="ASYNC_OPENSEARCH"/>
</root>
<logger name="com.example" level="DEBUG"/>
</configuration>
Custom OpenSearch Appender:
public class OpenSearchAppender extends AppenderBase<ILoggingEvent> {
private String index;
private String application;
private String environment;
private String url;
private String username;
private String password;
private RestHighLevelClient client;
private ObjectMapper objectMapper;
@Override
public void start() {
this.objectMapper = new ObjectMapper();
this.objectMapper.registerModule(new JavaTimeModule());
// Initialize OpenSearch client
RestClientBuilder restClientBuilder = RestClient.builder(
HttpHost.create(url)
);
if (username != null && password != null) {
final CredentialsProvider credentialsProvider = new BasicCredentialsProvider();
credentialsProvider.setCredentials(
AuthScope.ANY,
new UsernamePasswordCredentials(username, password)
);
restClientBuilder.setHttpClientConfigCallback(httpClientBuilder ->
httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider)
);
}
this.client = new RestHighLevelClient(restClientBuilder);
super.start();
}
@Override
protected void append(ILoggingEvent event) {
try {
ApplicationLogEvent logEvent = createLogEvent(event);
IndexRequest request = new IndexRequest(index)
.source(objectMapper.convertValue(logEvent, Map.class));
// Async indexing to avoid blocking
client.indexAsync(request, RequestOptions.DEFAULT,
new ActionListener<IndexResponse>() {
@Override
public void onResponse(IndexResponse response) {
// Successfully indexed
}
@Override
public void onFailure(Exception e) {
System.err.println("Failed to index log event: " + e.getMessage());
}
});
} catch (Exception e) {
// Fallback to console if OpenSearch fails
System.err.println("OpenSearch logging failed: " + e.getMessage());
}
}
private ApplicationLogEvent createLogEvent(ILoggingEvent event) {
ApplicationLogEvent logEvent = new ApplicationLogEvent(
event.getLevel().toString(),
event.getLoggerName(),
event.getFormattedMessage()
);
logEvent.setThread(event.getThreadName());
logEvent.setHostname(getHostname());
if (event.getThrowableProxy() != null) {
// Capture the full stack trace, not just the exception message
logEvent.setStackTrace(ThrowableProxyUtil.asString(event.getThrowableProxy()));
}
// Add MDC context
if (event.getMDCPropertyMap() != null && !event.getMDCPropertyMap().isEmpty()) {
logEvent.setMdc(new HashMap<>(event.getMDCPropertyMap()));
}
return logEvent;
}
private String getHostname() {
try {
return InetAddress.getLocalHost().getHostName();
} catch (UnknownHostException e) {
return "unknown";
}
}
// Getters and setters for configuration properties
public void setIndex(String index) { this.index = index; }
public void setApplication(String application) { this.application = application; }
public void setEnvironment(String environment) { this.environment = environment; }
public void setUrl(String url) { this.url = url; }
public void setUsername(String username) { this.username = username; }
public void setPassword(String password) { this.password = password; }
@Override
public void stop() {
try {
if (client != null) {
client.close();
}
} catch (IOException e) {
// Ignore on shutdown
}
super.stop();
}
}
Step 6: Metrics Collection Service
@Service
public class MetricsCollectorService {
private final OpenSearchService openSearchService;
private final MeterRegistry meterRegistry;
private final Counter ordersCreatedCounter;
private final Counter paymentsProcessedCounter;
private final Timer orderProcessingTimer;
private final DistributionSummary orderValueSummary;
private static final Logger logger = LoggerFactory.getLogger(MetricsCollectorService.class);
public MetricsCollectorService(OpenSearchService openSearchService,
MeterRegistry meterRegistry) {
this.openSearchService = openSearchService;
this.meterRegistry = meterRegistry;
// Initialize Micrometer metrics
this.ordersCreatedCounter = Counter.builder("orders.created")
.description("Total number of orders created")
.register(meterRegistry);
this.paymentsProcessedCounter = Counter.builder("payments.processed")
.description("Total number of payments processed")
.register(meterRegistry);
this.orderProcessingTimer = Timer.builder("order.processing.time")
.description("Time taken to process orders")
.register(meterRegistry);
this.orderValueSummary = DistributionSummary.builder("order.value")
.description("Distribution of order values")
.baseUnit("USD")
.register(meterRegistry);
}
public Timer getOrderProcessingTimer() {
return orderProcessingTimer;
}
/**
* Record business metrics to OpenSearch
*/
public void recordOrderCreated(double orderValue, String paymentMethod, String customerTier) {
// Increment counter
ordersCreatedCounter.increment();
orderValueSummary.record(orderValue);
// Record detailed business metric
BusinessMetrics metrics = new BusinessMetrics(
BusinessMetrics.ORDERS_CREATED,
orderValue
)
.withTag("payment_method", paymentMethod)
.withTag("customer_tier", customerTier);
openSearchService.indexDocument("business-metrics", metrics);
}
public void recordPaymentProcessed(double amount, boolean success, String gateway) {
paymentsProcessedCounter.increment();
BusinessMetrics metrics = new BusinessMetrics(
BusinessMetrics.PAYMENTS_PROCESSED,
amount
)
.withTag("success", String.valueOf(success))
.withTag("gateway", gateway);
openSearchService.indexDocument("business-metrics", metrics);
}
public void recordUserActivity(UserActivityEvent event) {
openSearchService.indexDocument("user-activity", event);
}
/**
* Collect and ship system metrics periodically
*/
@Scheduled(fixedRate = 60000) // Every minute
public void collectSystemMetrics() {
try {
SystemMetrics metrics = gatherSystemMetrics();
openSearchService.indexDocument("system-metrics", metrics);
} catch (Exception e) {
logger.error("Failed to collect system metrics", e);
}
}
private SystemMetrics gatherSystemMetrics() {
SystemMetrics metrics = new SystemMetrics();
// JVM metrics
Runtime runtime = Runtime.getRuntime();
metrics.setHeapUsed(runtime.totalMemory() - runtime.freeMemory());
metrics.setHeapMax(runtime.maxMemory());
metrics.setThreadCount(Thread.activeCount());
// System metrics
metrics.setCpuUsage(getProcessCpuLoad());
metrics.setSystemLoad(getSystemLoadAverage());
return metrics;
}
private double getProcessCpuLoad() {
java.lang.management.OperatingSystemMXBean osBean = ManagementFactory.getOperatingSystemMXBean();
// getProcessCpuLoad() is exposed by the com.sun.management extension on HotSpot JVMs
if (osBean instanceof com.sun.management.OperatingSystemMXBean) {
return ((com.sun.management.OperatingSystemMXBean) osBean).getProcessCpuLoad();
}
return -1.0; // not available on this JVM
}
private double getSystemLoadAverage() {
return ManagementFactory.getOperatingSystemMXBean().getSystemLoadAverage();
}
}
Step 7: REST Controller with Activity Tracking
@RestController
@RequestMapping("/api")
public class EcommerceController {
private final MetricsCollectorService metricsCollector;
private final OpenSearchService openSearchService;
private final ProductService productService;
private static final Logger logger = LoggerFactory.getLogger(EcommerceController.class);
public EcommerceController(MetricsCollectorService metricsCollector,
OpenSearchService openSearchService,
ProductService productService) {
this.metricsCollector = metricsCollector;
this.openSearchService = openSearchService;
this.productService = productService;
}
@PostMapping("/orders")
public ResponseEntity<Order> createOrder(@RequestBody OrderRequest request,
HttpServletRequest httpRequest) {
// Track user activity
UserActivityEvent activity = new UserActivityEvent(
request.getUserId(),
UserActivityEvent.ACTION_PURCHASE,
"/api/orders"
)
.withMetadata("order_total", request.getTotalAmount())
.withMetadata("item_count", request.getItems().size());
metricsCollector.recordUserActivity(activity);
Timer.Sample sample = Timer.start();
try {
// Process order
Order order = processOrder(request);
// Record business metrics
metricsCollector.recordOrderCreated(
request.getTotalAmount().doubleValue(),
request.getPaymentMethod(),
"standard" // customer tier
);
logger.info("Order created successfully: {}", order.getId());
return ResponseEntity.ok(order);
} finally {
sample.stop(metricsCollector.getOrderProcessingTimer());
}
}
@GetMapping("/products/{id}")
public ResponseEntity<Product> getProduct(@PathVariable String id,
@RequestHeader("User-Agent") String userAgent,
HttpServletRequest request) {
// Track product view
UserActivityEvent activity = new UserActivityEvent(
"anonymous", // or get from authentication
UserActivityEvent.ACTION_VIEW_PRODUCT,
"/api/products/" + id
)
.withMetadata("product_id", id)
.withMetadata("user_agent", userAgent);
metricsCollector.recordUserActivity(activity);
Product product = productService.getProduct(id);
return ResponseEntity.ok(product);
}
@GetMapping("/analytics/dashboard-data")
public ResponseEntity<Map<String, Object>> getDashboardData() {
try {
// Query OpenSearch for dashboard data
Map<String, Object> dashboardData = new HashMap<>();
// Recent orders count
Query recentOrdersQuery = MatchQuery.of(m -> m
.field("metricName")
.query(FieldValue.of(BusinessMetrics.ORDERS_CREATED))
)._toQuery();
List<BusinessMetrics> recentOrders = openSearchService.searchDocuments(
"business-metrics", recentOrdersQuery, BusinessMetrics.class
);
dashboardData.put("recentOrdersCount", recentOrders.size());
dashboardData.put("recentOrders", recentOrders);
// User activity summary
Query userActivityQuery = RangeQuery.of(r -> r
.field("timestamp")
.gte(JsonData.of(Instant.now().minus(1, ChronoUnit.HOURS).toString()))
)._toQuery();
List<UserActivityEvent> recentActivities = openSearchService.searchDocuments(
"user-activity", userActivityQuery, UserActivityEvent.class
);
Map<String, Long> activityCounts = recentActivities.stream()
.collect(Collectors.groupingBy(UserActivityEvent::getAction, Collectors.counting()));
dashboardData.put("activitySummary", activityCounts);
return ResponseEntity.ok(dashboardData);
} catch (Exception e) {
logger.error("Failed to fetch dashboard data", e);
return ResponseEntity.status(500).build();
}
}
private Order processOrder(OrderRequest request) {
// Order processing logic
return new Order(UUID.randomUUID().toString(), request.getUserId(),
request.getTotalAmount(), "CONFIRMED");
}
}
Step 8: Index Management Service
@Service
public class IndexManagementService {
private final OpenSearchClient client;
private static final Logger logger = LoggerFactory.getLogger(IndexManagementService.class);
public IndexManagementService(OpenSearchClient client) {
this.client = client;
}
/**
* Initialize all required indices with proper mappings
*/
@PostConstruct
public void initializeIndices() {
createApplicationLogsIndex();
createBusinessMetricsIndex();
createUserActivityIndex();
createSystemMetricsIndex();
}
private void createApplicationLogsIndex() {
Map<String, Property> properties = new HashMap<>();
properties.put("timestamp", Property.of(p -> p.date(d -> d)));
properties.put("level", Property.of(p -> p.keyword(k -> k)));
properties.put("application", Property.of(p -> p.keyword(k -> k)));
properties.put("environment", Property.of(p -> p.keyword(k -> k)));
properties.put("message", Property.of(p -> p.text(t -> t)));
properties.put("logger", Property.of(p -> p.keyword(k -> k)));
try {
client.indices().create(CreateIndexRequest.of(cir -> cir
.index("application-logs")
.mappings(TypeMapping.of(tm -> tm.properties(properties)))
));
logger.info("Created application-logs index");
} catch (OpenSearchException e) {
if (e.getMessage().contains("resource_already_exists_exception")) {
logger.debug("application-logs index already exists");
} else {
throw e;
}
} catch (IOException e) {
throw new UncheckedIOException("Failed to create application-logs index", e);
}
}
private void createBusinessMetricsIndex() {
Map<String, Property> properties = new HashMap<>();
properties.put("timestamp", Property.of(p -> p.date(d -> d)));
properties.put("metricName", Property.of(p -> p.keyword(k -> k)));
properties.put("value", Property.of(p -> p.double_(d -> d)));
properties.put("application", Property.of(p -> p.keyword(k -> k)));
properties.put("tags", Property.of(p -> p.flattened(f -> f)));
try {
client.indices().create(CreateIndexRequest.of(cir -> cir
.index("business-metrics")
.mappings(TypeMapping.of(tm -> tm.properties(properties)))
));
logger.info("Created business-metrics index");
} catch (OpenSearchException e) {
if (e.getMessage().contains("resource_already_exists_exception")) {
logger.debug("business-metrics index already exists");
} else {
throw e;
}
} catch (IOException e) {
throw new UncheckedIOException("Failed to create business-metrics index", e);
}
}
private void createUserActivityIndex() {
Map<String, Property> properties = new HashMap<>();
properties.put("timestamp", Property.of(p -> p.date(d -> d)));
properties.put("userId", Property.of(p -> p.keyword(k -> k)));
properties.put("action", Property.of(p -> p.keyword(k -> k)));
properties.put("resource", Property.of(p -> p.keyword(k -> k)));
properties.put("sessionId", Property.of(p -> p.keyword(k -> k)));
properties.put("statusCode", Property.of(p -> p.integer(i -> i)));
properties.put("responseTime", Property.of(p -> p.long_(l -> l)));
properties.put("metadata", Property.of(p -> p.flattened(f -> f)));
try {
client.indices().create(CreateIndexRequest.of(cir -> cir
.index("user-activity")
.mappings(TypeMapping.of(tm -> tm.properties(properties)))
));
logger.info("Created user-activity index");
} catch (OpenSearchException e) {
if (e.getMessage().contains("resource_already_exists_exception")) {
logger.debug("user-activity index already exists");
} else {
throw e;
}
} catch (IOException e) {
throw new UncheckedIOException("Failed to create user-activity index", e);
}
}
private void createSystemMetricsIndex() {
// Similar implementation for system metrics
}
}
OpenSearch Dashboards Visualization
Sample Dashboard Queries
1. Application Logs Dashboard:
- Error rate over time
- Log level distribution
- Most frequent error messages
- Application performance correlation
2. Business Metrics Dashboard:
- Orders created per hour
- Revenue trends
- Payment success rates
- Customer activity patterns
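An "orders per hour" panel is typically backed by a date_histogram aggregation with a 1h interval; conceptually it truncates each document's ISO-8601 timestamp to the hour and counts per bucket. A plain-Java sketch of that bucketing (class and method names are illustrative):

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

public class OrdersPerHourSketch {
    // Buckets ISO-8601 timestamps (as indexed by BusinessMetrics above) into
    // hour-truncated counts, mirroring a date_histogram with a 1h interval.
    static Map<Instant, Long> ordersPerHour(List<String> timestamps) {
        return timestamps.stream()
                .map(Instant::parse)
                .map(t -> t.truncatedTo(ChronoUnit.HOURS))
                .collect(Collectors.groupingBy(t -> t, TreeMap::new, Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(ordersPerHour(List.of(
                "2024-01-01T10:05:00Z", "2024-01-01T10:59:59Z", "2024-01-01T11:00:00Z")));
    }
}
```

In production you would let OpenSearch do this server-side; the sketch is useful for sanity-checking what the panel should display.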
3. User Activity Dashboard:
- User journey analysis
- Popular products and categories
- Conversion funnel
- Geographic distribution
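A conversion funnel over the `UserActivityEvent` actions defined earlier amounts to counting distinct users who reached each stage, in order. A self-contained sketch of that computation (the `Activity` record stands in for indexed activity documents; names are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FunnelSketch {
    // Minimal stand-in for an indexed UserActivityEvent document.
    record Activity(String userId, String action) {}

    // Counts distinct users per funnel stage, preserving stage order.
    static Map<String, Long> funnel(List<Activity> events, List<String> stages) {
        Map<String, Long> counts = new LinkedHashMap<>();
        for (String stage : stages) {
            long users = events.stream()
                    .filter(e -> e.action().equals(stage))
                    .map(Activity::userId)
                    .distinct()
                    .count();
            counts.put(stage, users);
        }
        return counts;
    }
}
```

With stages VIEW_PRODUCT → ADD_TO_CART → PURCHASE, each successive count should shrink; a dashboard funnel visualization renders exactly this drop-off.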
Dashboard Configuration Example
{
"title": "Ecommerce Monitoring Dashboard",
"panels": [
{
"type": "timeseries",
"title": "Orders Created",
"metrics": [
{
"id": "orders_count",
"type": "count",
"field": "metricName",
"query": "orders.created"
}
]
},
{
"type": "metric",
"title": "Payment Success Rate",
"metrics": [
{
"id": "success_rate",
"type": "avg",
"field": "value",
"query": "metricName:payments.processed AND tags.success:true"
}
]
}
]
}
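The "Payment Success Rate" panel above averages over documents whose `tags.success` is true. The same number can be reproduced in plain Java for a quick sanity check against what the dashboard shows (class and method names are illustrative):

```java
import java.util.List;
import java.util.Map;

public class SuccessRateSketch {
    // Computes the share of payments.processed metrics whose "success" tag
    // is "true" — the value the dashboard panel displays.
    static double paymentSuccessRate(List<Map<String, String>> paymentTags) {
        if (paymentTags.isEmpty()) return 0.0;
        long ok = paymentTags.stream()
                .filter(tags -> "true".equals(tags.get("success")))
                .count();
        return (double) ok / paymentTags.size();
    }

    public static void main(String[] args) {
        System.out.println(paymentSuccessRate(List.of(
                Map.of("success", "true", "gateway", "stripe"),
                Map.of("success", "false", "gateway", "paypal"))));
    }
}
```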
Running the Solution
- Start OpenSearch and Dashboards:
docker network create opensearch-net

docker run -d --name opensearch \
  --network opensearch-net \
  -p 9200:9200 -p 9600:9600 \
  -e "discovery.type=single-node" \
  -e "plugins.security.disabled=true" \
  opensearchproject/opensearch:2.11.0

docker run -d --name opensearch-dashboards \
  --network opensearch-net \
  -p 5601:5601 \
  -e "OPENSEARCH_HOSTS=http://opensearch:9200" \
  -e "DISABLE_SECURITY_DASHBOARDS_PLUGIN=true" \
  opensearchproject/opensearch-dashboards:2.11.0
- Start your Spring Boot application
- Generate traffic and view data in Dashboards:
curl -X POST http://localhost:8080/api/orders \
-H "Content-Type: application/json" \
-d '{"userId": "user123", "totalAmount": 99.99, "items": [], "paymentMethod": "credit_card"}'
- Access OpenSearch Dashboards:
http://localhost:5601
Best Practices
1. Index Management
- Use index templates for consistent mappings
- Implement Index State Management (ISM), OpenSearch's counterpart to ILM
- Configure proper sharding and replication
- Set up retention policies
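Rollover and retention can be combined in a single ISM policy. A sketch of one such policy, sent via `PUT _plugins/_ism/policies/application-logs-policy` (the policy name, rollover age, and retention window here are illustrative, not prescriptive):

```
{
  "policy": {
    "description": "Roll over application-logs daily, delete after 30 days",
    "default_state": "hot",
    "states": [
      {
        "name": "hot",
        "actions": [{ "rollover": { "min_index_age": "1d" } }],
        "transitions": [
          { "state_name": "delete", "conditions": { "min_index_age": "30d" } }
        ]
      },
      {
        "name": "delete",
        "actions": [{ "delete": {} }]
      }
    ]
  }
}
```

Attach the policy to new indices via an index template so every rolled-over index inherits it automatically.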
2. Performance Optimization
- Use bulk operations for high-volume data
- Implement async logging to avoid blocking
- Configure proper buffer sizes and timeouts
- Monitor OpenSearch cluster health
3. Security
- Implement proper authentication and authorization
- Use SSL/TLS for communication
- Apply field-level security to sensitive data
- Regular security audits
Conclusion
Integrating OpenSearch Dashboards with Java applications provides a powerful foundation for comprehensive monitoring, logging, and business intelligence. By implementing structured logging, metrics collection, and user activity tracking, you can:
- Gain real-time insights into application performance
- Debug issues faster with correlated logs and metrics
- Understand user behavior and business trends
- Build custom dashboards for different stakeholders
- Proactively identify performance bottlenecks and errors
The combination of Java's robust ecosystem and OpenSearch's powerful visualization capabilities creates an enterprise-grade observability platform that scales with your application's needs.