Process Anomaly Detection in Java: Comprehensive Behavioral Monitoring

Process anomaly detection monitors system processes, applications, and runtime behavior to identify suspicious activity, performance issues, and security threats. This guide walks through a comprehensive implementation in Java.


Dependencies and Setup

1. Maven Dependencies
<properties>
<opencsv.version>5.7.1</opencsv.version>
<smile.version>3.0.1</smile.version>
<jackson.version>2.15.2</jackson.version>
<ehcache.version>3.10.8</ehcache.version>
<quartz.version>2.3.2</quartz.version>
</properties>
<dependencies>
<!-- Machine Learning -->
<dependency>
<groupId>com.github.haifengl</groupId>
<artifactId>smile-core</artifactId>
<version>${smile.version}</version>
</dependency>
<dependency>
<groupId>com.github.haifengl</groupId>
<artifactId>smile-plot</artifactId>
<version>${smile.version}</version>
</dependency>
<!-- CSV Processing -->
<dependency>
<groupId>com.opencsv</groupId>
<artifactId>opencsv</artifactId>
<version>${opencsv.version}</version>
</dependency>
<!-- JSON Processing -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<!-- Caching -->
<dependency>
<groupId>org.ehcache</groupId>
<artifactId>ehcache</artifactId>
<version>${ehcache.version}</version>
</dependency>
<!-- Scheduling -->
<dependency>
<groupId>org.quartz-scheduler</groupId>
<artifactId>quartz</artifactId>
<version>${quartz.version}</version>
</dependency>
<!-- Monitoring: the com.sun.management MXBeans (OperatingSystemMXBean,
     UnixOperatingSystemMXBean) ship with the JDK itself; no separate
     Maven artifact is required. -->
<!-- Spring Boot (Optional) -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<version>3.1.0</version>
</dependency>
</dependencies>

Core Process Monitoring

1. Process Metrics Collector
package com.example.anomaly.detection;
import com.sun.management.OperatingSystemMXBean;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.lang.management.*;
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;
public class ProcessMetricsCollector {
private static final Logger logger = LoggerFactory.getLogger(ProcessMetricsCollector.class);
private final OperatingSystemMXBean osBean;
private final ThreadMXBean threadBean;
private final MemoryMXBean memoryBean;
private final Runtime runtime;
private final Map<String, ProcessMetrics> processMetrics;
private final Map<Long, ThreadMetrics> threadMetrics;
public ProcessMetricsCollector() {
this.osBean = ManagementFactory.getPlatformMXBean(OperatingSystemMXBean.class);
this.threadBean = ManagementFactory.getThreadMXBean();
this.memoryBean = ManagementFactory.getMemoryMXBean();
this.runtime = Runtime.getRuntime();
this.processMetrics = new ConcurrentHashMap<>();
this.threadMetrics = new ConcurrentHashMap<>();
}
public ProcessMetrics collectSystemMetrics() {
ProcessMetrics metrics = new ProcessMetrics();
metrics.setTimestamp(System.currentTimeMillis());
// CPU Metrics
metrics.setProcessCpuLoad(osBean.getProcessCpuLoad());
metrics.setSystemCpuLoad(osBean.getSystemCpuLoad()); // deprecated since JDK 14; prefer getCpuLoad()
metrics.setAvailableProcessors(runtime.availableProcessors());
// Memory Metrics
MemoryUsage heapMemory = memoryBean.getHeapMemoryUsage();
MemoryUsage nonHeapMemory = memoryBean.getNonHeapMemoryUsage();
metrics.setHeapUsed(heapMemory.getUsed());
metrics.setHeapCommitted(heapMemory.getCommitted());
metrics.setHeapMax(heapMemory.getMax());
metrics.setNonHeapUsed(nonHeapMemory.getUsed());
metrics.setNonHeapCommitted(nonHeapMemory.getCommitted());
// Thread Metrics
metrics.setThreadCount(threadBean.getThreadCount());
metrics.setDaemonThreadCount(threadBean.getDaemonThreadCount());
metrics.setPeakThreadCount(threadBean.getPeakThreadCount());
// GC Metrics
collectGarbageCollectionMetrics(metrics);
// Class Loading Metrics
ClassLoadingMXBean classBean = ManagementFactory.getClassLoadingMXBean();
metrics.setLoadedClassCount(classBean.getLoadedClassCount());
metrics.setUnloadedClassCount(classBean.getUnloadedClassCount());
metrics.setTotalLoadedClassCount(classBean.getTotalLoadedClassCount());
// Operating System Metrics (getFreePhysicalMemorySize/getTotalPhysicalMemorySize are
// deprecated since JDK 14 in favor of getFreeMemorySize()/getTotalMemorySize())
metrics.setCommittedVirtualMemory(osBean.getCommittedVirtualMemorySize());
metrics.setFreePhysicalMemory(osBean.getFreePhysicalMemorySize());
metrics.setTotalPhysicalMemory(osBean.getTotalPhysicalMemorySize());
metrics.setFreeSwapSpace(osBean.getFreeSwapSpaceSize());
metrics.setTotalSwapSpace(osBean.getTotalSwapSpaceSize());
// Process-specific metrics
metrics.setOpenFileDescriptorCount(getOpenFileDescriptorCount());
processMetrics.put(String.valueOf(metrics.getTimestamp()), metrics);
return metrics;
}
public List<ThreadMetrics> collectThreadMetrics() {
List<ThreadMetrics> threadMetricsList = new ArrayList<>();
long[] threadIds = threadBean.getAllThreadIds();
for (long threadId : threadIds) {
ThreadInfo threadInfo = threadBean.getThreadInfo(threadId);
if (threadInfo != null) {
ThreadMetrics metrics = new ThreadMetrics();
metrics.setThreadId(threadId);
metrics.setThreadName(threadInfo.getThreadName());
metrics.setThreadState(threadInfo.getThreadState().name());
metrics.setBlockedTime(threadInfo.getBlockedTime());
metrics.setBlockedCount(threadInfo.getBlockedCount());
metrics.setWaitedTime(threadInfo.getWaitedTime());
metrics.setWaitedCount(threadInfo.getWaitedCount());
metrics.setLockName(threadInfo.getLockName());
metrics.setLockOwnerId(threadInfo.getLockOwnerId());
metrics.setLockOwnerName(threadInfo.getLockOwnerName());
metrics.setInNative(threadInfo.isInNative());
metrics.setSuspended(threadInfo.isSuspended());
metrics.setTimestamp(System.currentTimeMillis());
// CPU time for the thread
if (threadBean.isThreadCpuTimeSupported()) {
metrics.setCpuTime(threadBean.getThreadCpuTime(threadId));
metrics.setUserTime(threadBean.getThreadUserTime(threadId));
}
threadMetricsList.add(metrics);
threadMetrics.put(threadId, metrics);
}
}
return threadMetricsList;
}
public Map<String, ProcessMetrics> getHistoricalMetrics() {
return new HashMap<>(processMetrics);
}
public void cleanupOldMetrics(long retentionPeriodMs) {
long cutoffTime = System.currentTimeMillis() - retentionPeriodMs;
processMetrics.entrySet().removeIf(entry -> {
ProcessMetrics metrics = entry.getValue();
return metrics.getTimestamp() < cutoffTime;
});
}
private void collectGarbageCollectionMetrics(ProcessMetrics metrics) {
List<GarbageCollectorMXBean> gcBeans = ManagementFactory.getGarbageCollectorMXBeans();
long totalGcCount = 0;
long totalGcTime = 0;
for (GarbageCollectorMXBean gcBean : gcBeans) {
totalGcCount += gcBean.getCollectionCount();
totalGcTime += gcBean.getCollectionTime();
}
metrics.setGcCount(totalGcCount);
metrics.setGcTime(totalGcTime);
}
private long getOpenFileDescriptorCount() {
try {
if (osBean instanceof com.sun.management.UnixOperatingSystemMXBean) {
return ((com.sun.management.UnixOperatingSystemMXBean) osBean).getOpenFileDescriptorCount();
}
} catch (Exception e) {
logger.warn("Unable to get file descriptor count", e);
}
return -1;
}
}
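Before wiring the collector into a service, it can help to see the raw MXBean reads it wraps. The following is a minimal, self-contained sketch using only the JDK (the class name `MxBeanProbe` is illustrative, not part of the guide's package):

```java
import java.lang.management.ManagementFactory;

// Minimal probe over the same JDK MXBeans that ProcessMetricsCollector wraps.
public class MxBeanProbe {
    // Heap bytes currently in use by this JVM.
    public static long heapUsedBytes() {
        return ManagementFactory.getMemoryMXBean().getHeapMemoryUsage().getUsed();
    }

    // Live thread count (daemon + non-daemon).
    public static int liveThreads() {
        return ManagementFactory.getThreadMXBean().getThreadCount();
    }

    public static void main(String[] args) {
        System.out.println("heapUsedMB=" + heapUsedBytes() / (1024 * 1024)
                + " threads=" + liveThreads()
                + " processors=" + Runtime.getRuntime().availableProcessors());
    }
}
```

Running this on any JVM prints non-zero values; the collector above simply samples the same beans on a schedule and stores the results keyed by timestamp.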
2. Data Models
package com.example.anomaly.detection;
import java.util.HashMap;
import java.util.Map;
public class ProcessMetrics {
private long timestamp;
private double processCpuLoad;
private double systemCpuLoad;
private int availableProcessors;
// Memory metrics
private long heapUsed;
private long heapCommitted;
private long heapMax;
private long nonHeapUsed;
private long nonHeapCommitted;
// Thread metrics
private int threadCount;
private int daemonThreadCount;
private int peakThreadCount;
// GC metrics
private long gcCount;
private long gcTime;
// Class loading
private int loadedClassCount;
private long unloadedClassCount;
private long totalLoadedClassCount;
// OS metrics
private long committedVirtualMemory;
private long freePhysicalMemory;
private long totalPhysicalMemory;
private long freeSwapSpace;
private long totalSwapSpace;
// Process-specific
private long openFileDescriptorCount;
// Additional custom metrics
private Map<String, Object> customMetrics = new HashMap<>();
// Getters and setters
public long getTimestamp() { return timestamp; }
public void setTimestamp(long timestamp) { this.timestamp = timestamp; }
public double getProcessCpuLoad() { return processCpuLoad; }
public void setProcessCpuLoad(double processCpuLoad) { this.processCpuLoad = processCpuLoad; }
public double getSystemCpuLoad() { return systemCpuLoad; }
public void setSystemCpuLoad(double systemCpuLoad) { this.systemCpuLoad = systemCpuLoad; }
public int getAvailableProcessors() { return availableProcessors; }
public void setAvailableProcessors(int availableProcessors) { this.availableProcessors = availableProcessors; }
public long getHeapUsed() { return heapUsed; }
public void setHeapUsed(long heapUsed) { this.heapUsed = heapUsed; }
public long getHeapCommitted() { return heapCommitted; }
public void setHeapCommitted(long heapCommitted) { this.heapCommitted = heapCommitted; }
public long getHeapMax() { return heapMax; }
public void setHeapMax(long heapMax) { this.heapMax = heapMax; }
public long getNonHeapUsed() { return nonHeapUsed; }
public void setNonHeapUsed(long nonHeapUsed) { this.nonHeapUsed = nonHeapUsed; }
public long getNonHeapCommitted() { return nonHeapCommitted; }
public void setNonHeapCommitted(long nonHeapCommitted) { this.nonHeapCommitted = nonHeapCommitted; }
public int getThreadCount() { return threadCount; }
public void setThreadCount(int threadCount) { this.threadCount = threadCount; }
public int getDaemonThreadCount() { return daemonThreadCount; }
public void setDaemonThreadCount(int daemonThreadCount) { this.daemonThreadCount = daemonThreadCount; }
public int getPeakThreadCount() { return peakThreadCount; }
public void setPeakThreadCount(int peakThreadCount) { this.peakThreadCount = peakThreadCount; }
public long getGcCount() { return gcCount; }
public void setGcCount(long gcCount) { this.gcCount = gcCount; }
public long getGcTime() { return gcTime; }
public void setGcTime(long gcTime) { this.gcTime = gcTime; }
public int getLoadedClassCount() { return loadedClassCount; }
public void setLoadedClassCount(int loadedClassCount) { this.loadedClassCount = loadedClassCount; }
public long getUnloadedClassCount() { return unloadedClassCount; }
public void setUnloadedClassCount(long unloadedClassCount) { this.unloadedClassCount = unloadedClassCount; }
public long getTotalLoadedClassCount() { return totalLoadedClassCount; }
public void setTotalLoadedClassCount(long totalLoadedClassCount) { this.totalLoadedClassCount = totalLoadedClassCount; }
public long getCommittedVirtualMemory() { return committedVirtualMemory; }
public void setCommittedVirtualMemory(long committedVirtualMemory) { this.committedVirtualMemory = committedVirtualMemory; }
public long getFreePhysicalMemory() { return freePhysicalMemory; }
public void setFreePhysicalMemory(long freePhysicalMemory) { this.freePhysicalMemory = freePhysicalMemory; }
public long getTotalPhysicalMemory() { return totalPhysicalMemory; }
public void setTotalPhysicalMemory(long totalPhysicalMemory) { this.totalPhysicalMemory = totalPhysicalMemory; }
public long getFreeSwapSpace() { return freeSwapSpace; }
public void setFreeSwapSpace(long freeSwapSpace) { this.freeSwapSpace = freeSwapSpace; }
public long getTotalSwapSpace() { return totalSwapSpace; }
public void setTotalSwapSpace(long totalSwapSpace) { this.totalSwapSpace = totalSwapSpace; }
public long getOpenFileDescriptorCount() { return openFileDescriptorCount; }
public void setOpenFileDescriptorCount(long openFileDescriptorCount) { this.openFileDescriptorCount = openFileDescriptorCount; }
public Map<String, Object> getCustomMetrics() { return customMetrics; }
public void setCustomMetrics(Map<String, Object> customMetrics) { this.customMetrics = customMetrics; }
public void addCustomMetric(String key, Object value) {
this.customMetrics.put(key, value);
}
public double[] toFeatureVector() {
return new double[] {
processCpuLoad,
systemCpuLoad,
normalizeMemory(heapUsed),
normalizeMemory(nonHeapUsed),
threadCount,
normalizeGcCount(gcCount),
normalizeGcTime(gcTime)
};
}
private double normalizeMemory(long memoryBytes) {
return memoryBytes / (1024.0 * 1024.0); // Convert to MB
}
private double normalizeGcCount(long gcCount) {
return gcCount / 1000.0;
}
private double normalizeGcTime(long gcTime) {
return gcTime / 1000.0; // Convert to seconds
}
}
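The normalization helpers in `toFeatureVector()` bring raw counters onto comparable scales before they reach a detector. A quick standalone check of the bytes-to-megabytes conversion (class name `FeatureScale` is illustrative):

```java
public class FeatureScale {
    // Mirrors ProcessMetrics.normalizeMemory: raw bytes -> megabytes.
    public static double bytesToMb(long bytes) {
        return bytes / (1024.0 * 1024.0);
    }

    public static void main(String[] args) {
        System.out.println(bytesToMb(512L * 1024 * 1024)); // 512.0
    }
}
```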
package com.example.anomaly.detection;
public class ThreadMetrics {
private long threadId;
private String threadName;
private String threadState;
private long blockedTime;
private long blockedCount;
private long waitedTime;
private long waitedCount;
private String lockName;
private long lockOwnerId;
private String lockOwnerName;
private boolean inNative;
private boolean suspended;
private long cpuTime;
private long userTime;
private long timestamp;
// Getters and setters
public long getThreadId() { return threadId; }
public void setThreadId(long threadId) { this.threadId = threadId; }
public String getThreadName() { return threadName; }
public void setThreadName(String threadName) { this.threadName = threadName; }
public String getThreadState() { return threadState; }
public void setThreadState(String threadState) { this.threadState = threadState; }
public long getBlockedTime() { return blockedTime; }
public void setBlockedTime(long blockedTime) { this.blockedTime = blockedTime; }
public long getBlockedCount() { return blockedCount; }
public void setBlockedCount(long blockedCount) { this.blockedCount = blockedCount; }
public long getWaitedTime() { return waitedTime; }
public void setWaitedTime(long waitedTime) { this.waitedTime = waitedTime; }
public long getWaitedCount() { return waitedCount; }
public void setWaitedCount(long waitedCount) { this.waitedCount = waitedCount; }
public String getLockName() { return lockName; }
public void setLockName(String lockName) { this.lockName = lockName; }
public long getLockOwnerId() { return lockOwnerId; }
public void setLockOwnerId(long lockOwnerId) { this.lockOwnerId = lockOwnerId; }
public String getLockOwnerName() { return lockOwnerName; }
public void setLockOwnerName(String lockOwnerName) { this.lockOwnerName = lockOwnerName; }
public boolean isInNative() { return inNative; }
public void setInNative(boolean inNative) { this.inNative = inNative; }
public boolean isSuspended() { return suspended; }
public void setSuspended(boolean suspended) { this.suspended = suspended; }
public long getCpuTime() { return cpuTime; }
public void setCpuTime(long cpuTime) { this.cpuTime = cpuTime; }
public long getUserTime() { return userTime; }
public void setUserTime(long userTime) { this.userTime = userTime; }
public long getTimestamp() { return timestamp; }
public void setTimestamp(long timestamp) { this.timestamp = timestamp; }
public double[] toFeatureVector() {
return new double[] {
blockedTime,
blockedCount,
waitedTime,
waitedCount,
cpuTime / 1_000_000_000.0, // Convert to seconds
userTime / 1_000_000_000.0  // Convert to seconds
};
}
}
package com.example.anomaly.detection;
import java.util.HashMap;
import java.util.Map;
public class AnomalyDetectionResult {
private boolean isAnomaly;
private double anomalyScore;
private String anomalyType;
private String description;
private ProcessMetrics metrics;
private long timestamp;
private double confidence;
private Map<String, Object> details;
public AnomalyDetectionResult() {
this.details = new HashMap<>();
}
// Getters and setters
public boolean isAnomaly() { return isAnomaly; }
public void setAnomaly(boolean anomaly) { isAnomaly = anomaly; }
public double getAnomalyScore() { return anomalyScore; }
public void setAnomalyScore(double anomalyScore) { this.anomalyScore = anomalyScore; }
public String getAnomalyType() { return anomalyType; }
public void setAnomalyType(String anomalyType) { this.anomalyType = anomalyType; }
public String getDescription() { return description; }
public void setDescription(String description) { this.description = description; }
public ProcessMetrics getMetrics() { return metrics; }
public void setMetrics(ProcessMetrics metrics) { this.metrics = metrics; }
public long getTimestamp() { return timestamp; }
public void setTimestamp(long timestamp) { this.timestamp = timestamp; }
public double getConfidence() { return confidence; }
public void setConfidence(double confidence) { this.confidence = confidence; }
public Map<String, Object> getDetails() { return details; }
public void setDetails(Map<String, Object> details) { this.details = details; }
public void addDetail(String key, Object value) {
this.details.put(key, value);
}
public enum AnomalyType {
CPU_SPIKE,
MEMORY_LEAK,
THREAD_DEADLOCK,
GC_OVERHEAD,
FILE_DESCRIPTOR_LEAK,
NETWORK_ANOMALY,
CUSTOM_ANOMALY
}
}

Anomaly Detection Algorithms

1. Statistical Anomaly Detection
package com.example.anomaly.detection.algorithm;
import com.example.anomaly.detection.AnomalyDetectionResult;
import com.example.anomaly.detection.ProcessMetrics;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.ArrayList;
import java.util.List;
import java.util.DoubleSummaryStatistics;
import java.util.stream.Collectors;
public class StatisticalAnomalyDetector {
private static final Logger logger = LoggerFactory.getLogger(StatisticalAnomalyDetector.class);
private final List<ProcessMetrics> trainingData;
private final double zScoreThreshold;
private final double iqrMultiplier;
private double[] means;
private double[] standardDeviations;
private double[] q1;
private double[] q3;
public StatisticalAnomalyDetector(double zScoreThreshold, double iqrMultiplier) {
this.trainingData = new ArrayList<>();
this.zScoreThreshold = zScoreThreshold;
this.iqrMultiplier = iqrMultiplier;
}
public void train(List<ProcessMetrics> trainingData) {
this.trainingData.clear();
this.trainingData.addAll(trainingData);
if (trainingData.isEmpty()) {
throw new IllegalArgumentException("Training data cannot be empty");
}
int featureCount = trainingData.get(0).toFeatureVector().length;
this.means = new double[featureCount];
this.standardDeviations = new double[featureCount];
this.q1 = new double[featureCount];
this.q3 = new double[featureCount];
calculateStatistics();
logger.info("Statistical anomaly detector trained with {} samples", trainingData.size());
}
public AnomalyDetectionResult detect(ProcessMetrics metrics) {
double[] features = metrics.toFeatureVector();
AnomalyDetectionResult result = new AnomalyDetectionResult();
result.setMetrics(metrics);
result.setTimestamp(System.currentTimeMillis());
boolean isAnomaly = false;
double maxZScore = 0;
double maxIqrScore = 0;
List<String> anomalyReasons = new ArrayList<>();
for (int i = 0; i < features.length; i++) {
double zScore = calculateZScore(features[i], i);
double iqrScore = calculateIqrScore(features[i], i);
if (Math.abs(zScore) > zScoreThreshold) {
isAnomaly = true;
maxZScore = Math.max(maxZScore, Math.abs(zScore));
anomalyReasons.add(String.format("Feature %d Z-score: %.2f", i, zScore));
}
if (Math.abs(iqrScore) > iqrMultiplier) {
isAnomaly = true;
maxIqrScore = Math.max(maxIqrScore, Math.abs(iqrScore));
anomalyReasons.add(String.format("Feature %d IQR-score: %.2f", i, iqrScore));
}
}
result.setAnomaly(isAnomaly);
result.setAnomalyScore(Math.max(maxZScore, maxIqrScore));
result.setConfidence(calculateConfidence(maxZScore, maxIqrScore));
if (isAnomaly) {
result.setAnomalyType(AnomalyDetectionResult.AnomalyType.CUSTOM_ANOMALY.name());
result.setDescription("Statistical anomaly detected: " + String.join(", ", anomalyReasons));
result.addDetail("zScore", maxZScore);
result.addDetail("iqrScore", maxIqrScore);
result.addDetail("anomalyReasons", anomalyReasons);
}
return result;
}
private void calculateStatistics() {
int featureCount = trainingData.get(0).toFeatureVector().length;
int sampleCount = trainingData.size();
// Calculate means
double[][] featureMatrix = new double[sampleCount][featureCount];
for (int i = 0; i < sampleCount; i++) {
featureMatrix[i] = trainingData.get(i).toFeatureVector();
}
for (int j = 0; j < featureCount; j++) {
final int featureIndex = j;
DoubleSummaryStatistics stats = trainingData.stream()
.mapToDouble(metrics -> metrics.toFeatureVector()[featureIndex])
.summaryStatistics();
means[j] = stats.getAverage();
// Calculate standard deviation
double variance = trainingData.stream()
.mapToDouble(metrics -> {
double value = metrics.toFeatureVector()[featureIndex];
return Math.pow(value - means[j], 2);
})
.average()
.orElse(0.0);
standardDeviations[j] = Math.sqrt(variance);
// Calculate quartiles
List<Double> values = trainingData.stream()
.map(metrics -> metrics.toFeatureVector()[featureIndex])
.sorted()
.collect(Collectors.toList());
q1[j] = calculatePercentile(values, 25);
q3[j] = calculatePercentile(values, 75);
}
}
private double calculateZScore(double value, int featureIndex) {
if (standardDeviations[featureIndex] == 0) {
return 0;
}
return (value - means[featureIndex]) / standardDeviations[featureIndex];
}
private double calculateIqrScore(double value, int featureIndex) {
double iqr = q3[featureIndex] - q1[featureIndex];
if (iqr == 0) {
return 0;
}
if (value < q1[featureIndex]) {
return (value - q1[featureIndex]) / iqr;
} else if (value > q3[featureIndex]) {
return (value - q3[featureIndex]) / iqr;
} else {
return 0;
}
}
private double calculatePercentile(List<Double> values, double percentile) {
if (values.isEmpty()) {
return 0;
}
int index = (int) Math.ceil(percentile / 100.0 * values.size()) - 1;
index = Math.max(0, Math.min(index, values.size() - 1));
return values.get(index);
}
private double calculateConfidence(double zScore, double iqrScore) {
double score = Math.max(zScore, iqrScore);
return Math.min(score / 10.0, 1.0); // Normalize to 0-1
}
public double[] getMeans() {
return means.clone();
}
public double[] getStandardDeviations() {
return standardDeviations.clone();
}
}
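The per-feature z-score logic above can be exercised in isolation. Here is a self-contained sketch on a toy CPU-load sample (class name `OutlierCheck` and the sample values are illustrative):

```java
import java.util.Arrays;

// Standalone z-score check mirroring StatisticalAnomalyDetector's per-feature test:
// a value more than `threshold` standard deviations from the mean is flagged.
public class OutlierCheck {
    public static double zScore(double value, double[] sample) {
        double mean = Arrays.stream(sample).average().orElse(0.0);
        double variance = Arrays.stream(sample)
                .map(v -> (v - mean) * (v - mean))
                .average().orElse(0.0);
        double sd = Math.sqrt(variance);
        return sd == 0 ? 0 : (value - mean) / sd;
    }

    public static void main(String[] args) {
        double[] cpuLoad = {0.10, 0.12, 0.11, 0.09, 0.10, 0.13, 0.11, 0.10};
        System.out.println(zScore(0.11, cpuLoad)); // small: within the normal band
        System.out.println(zScore(0.95, cpuLoad)); // large: a clear CPU spike
    }
}
```

With the default threshold of 3.0 used by the service below, the 0.95 reading would trip the detector while 0.11 would not.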
2. Machine Learning-Based Detection
package com.example.anomaly.detection.algorithm;
import com.example.anomaly.detection.AnomalyDetectionResult;
import com.example.anomaly.detection.ProcessMetrics;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import smile.anomaly.*;
import java.util.ArrayList;
import java.util.List;
public class MachineLearningAnomalyDetector {
private static final Logger logger = LoggerFactory.getLogger(MachineLearningAnomalyDetector.class);
private final List<ProcessMetrics> trainingData;
private LOF<double[]> lofDetector;
private OCSVM<double[]> ocsvmDetector;
private boolean isTrained;
public MachineLearningAnomalyDetector() {
this.trainingData = new ArrayList<>();
this.isTrained = false;
}
public void trainLOF(List<ProcessMetrics> trainingData, int k) {
this.trainingData.clear();
this.trainingData.addAll(trainingData);
if (trainingData.isEmpty()) {
throw new IllegalArgumentException("Training data cannot be empty");
}
double[][] features = convertToFeatureMatrix(trainingData);
this.lofDetector = LOF.fit(features, k);
this.isTrained = true;
logger.info("LOF anomaly detector trained with {} samples, k={}", trainingData.size(), k);
}
public void trainOCSVM(List<ProcessMetrics> trainingData, double nu, double gamma) {
this.trainingData.clear();
this.trainingData.addAll(trainingData);
if (trainingData.isEmpty()) {
throw new IllegalArgumentException("Training data cannot be empty");
}
double[][] features = convertToFeatureMatrix(trainingData);
this.ocsvmDetector = OCSVM.fit(features, new smile.math.kernel.GaussianKernel(gamma), nu);
this.isTrained = true;
logger.info("OCSVM anomaly detector trained with {} samples, nu={}, gamma={}", 
trainingData.size(), nu, gamma);
}
public AnomalyDetectionResult detectWithLOF(ProcessMetrics metrics) {
if (!isTrained || lofDetector == null) {
throw new IllegalStateException("LOF detector not trained");
}
double[] features = metrics.toFeatureVector();
double score = lofDetector.score(features);
AnomalyDetectionResult result = createResult(metrics, score, "LOF");
result.addDetail("lofScore", score);
return result;
}
public AnomalyDetectionResult detectWithOCSVM(ProcessMetrics metrics) {
if (!isTrained || ocsvmDetector == null) {
throw new IllegalStateException("OCSVM detector not trained");
}
double[] features = metrics.toFeatureVector();
double score = ocsvmDetector.score(features);
AnomalyDetectionResult result = createResult(metrics, score, "OCSVM");
result.addDetail("ocsvmScore", score);
return result;
}
public EnsembleDetectionResult detectEnsemble(ProcessMetrics metrics) {
if (!isTrained) {
throw new IllegalStateException("No detectors trained");
}
EnsembleDetectionResult ensembleResult = new EnsembleDetectionResult();
ensembleResult.setMetrics(metrics);
ensembleResult.setTimestamp(System.currentTimeMillis());
List<AnomalyDetectionResult> individualResults = new ArrayList<>();
if (lofDetector != null) {
AnomalyDetectionResult lofResult = detectWithLOF(metrics);
individualResults.add(lofResult);
}
if (ocsvmDetector != null) {
AnomalyDetectionResult ocsvmResult = detectWithOCSVM(metrics);
individualResults.add(ocsvmResult);
}
// Combine results using weighted average
double combinedScore = individualResults.stream()
.mapToDouble(AnomalyDetectionResult::getAnomalyScore)
.average()
.orElse(0.0);
long anomalyCount = individualResults.stream()
.filter(AnomalyDetectionResult::isAnomaly)
.count();
boolean isAnomaly = anomalyCount > individualResults.size() / 2;
ensembleResult.setAnomaly(isAnomaly);
ensembleResult.setAnomalyScore(combinedScore);
ensembleResult.setIndividualResults(individualResults);
ensembleResult.setVoteCount((int) anomalyCount);
ensembleResult.setTotalVotes(individualResults.size());
if (isAnomaly) {
ensembleResult.setAnomalyType("ENSEMBLE_ANOMALY");
ensembleResult.setDescription(String.format(
"Ensemble anomaly detected (%d/%d detectors)", anomalyCount, individualResults.size()));
}
return ensembleResult;
}
private double[][] convertToFeatureMatrix(List<ProcessMetrics> metricsList) {
if (metricsList.isEmpty()) {
return new double[0][];
}
int featureCount = metricsList.get(0).toFeatureVector().length;
double[][] features = new double[metricsList.size()][featureCount];
for (int i = 0; i < metricsList.size(); i++) {
features[i] = metricsList.get(i).toFeatureVector();
}
return features;
}
private AnomalyDetectionResult createResult(ProcessMetrics metrics, double score, String detectorType) {
AnomalyDetectionResult result = new AnomalyDetectionResult();
result.setMetrics(metrics);
result.setTimestamp(System.currentTimeMillis());
// Normalize score and determine anomaly threshold
boolean isAnomaly = score > 1.0; // Adjust threshold based on detector
result.setAnomaly(isAnomaly);
result.setAnomalyScore(score);
result.setConfidence(Math.min(score / 2.0, 1.0));
if (isAnomaly) {
result.setAnomalyType(detectorType + "_ANOMALY");
result.setDescription(String.format("%s anomaly detected with score: %.3f", detectorType, score));
}
return result;
}
public boolean isTrained() {
return isTrained;
}
}
package com.example.anomaly.detection.algorithm;
import com.example.anomaly.detection.AnomalyDetectionResult;
import java.util.List;
public class EnsembleDetectionResult extends AnomalyDetectionResult {
private List<AnomalyDetectionResult> individualResults;
private int voteCount;
private int totalVotes;
// Getters and setters
public List<AnomalyDetectionResult> getIndividualResults() { return individualResults; }
public void setIndividualResults(List<AnomalyDetectionResult> individualResults) { 
this.individualResults = individualResults; 
}
public int getVoteCount() { return voteCount; }
public void setVoteCount(int voteCount) { this.voteCount = voteCount; }
public int getTotalVotes() { return totalVotes; }
public void setTotalVotes(int totalVotes) { this.totalVotes = totalVotes; }
public double getAgreementRatio() {
return totalVotes > 0 ? (double) voteCount / totalVotes : 0.0;
}
}
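The voting rule in `detectEnsemble` is a strict majority: the ensemble flags an anomaly only when more than half of the available detectors agree. A minimal self-contained sketch of just that rule (class name `MajorityVote` is illustrative):

```java
import java.util.List;

// Majority vote over individual detector verdicts, as in detectEnsemble().
public class MajorityVote {
    public static boolean isAnomaly(List<Boolean> votes) {
        long yes = votes.stream().filter(v -> v).count();
        return yes > votes.size() / 2; // integer division: 2-of-3 passes, 1-of-2 does not
    }

    public static void main(String[] args) {
        System.out.println(isAnomaly(List.of(true, true, false))); // true
        System.out.println(isAnomaly(List.of(true, false)));       // false: a 1-1 tie is not a majority
    }
}
```

Note the tie-breaking consequence: with exactly two detectors trained (LOF and OCSVM), both must agree before the ensemble reports an anomaly.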
3. Rule-Based Detection
package com.example.anomaly.detection.algorithm;
import com.example.anomaly.detection.AnomalyDetectionResult;
import com.example.anomaly.detection.ProcessMetrics;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;
public class RuleBasedAnomalyDetector {
private static final Logger logger = LoggerFactory.getLogger(RuleBasedAnomalyDetector.class);
private final List<DetectionRule> rules;
public RuleBasedAnomalyDetector() {
this.rules = new ArrayList<>();
initializeDefaultRules();
}
public void addRule(DetectionRule rule) {
rules.add(rule);
}
public AnomalyDetectionResult detect(ProcessMetrics metrics) {
AnomalyDetectionResult result = new AnomalyDetectionResult();
result.setMetrics(metrics);
result.setTimestamp(System.currentTimeMillis());
List<String> triggeredRules = new ArrayList<>();
double totalScore = 0;
for (DetectionRule rule : rules) {
if (rule.getCondition().test(metrics)) {
triggeredRules.add(rule.getName());
totalScore += rule.getSeverity();
logger.debug("Rule triggered: {} - {}", rule.getName(), rule.getDescription());
}
}
boolean isAnomaly = !triggeredRules.isEmpty();
result.setAnomaly(isAnomaly);
result.setAnomalyScore(totalScore);
result.setConfidence(calculateConfidence(totalScore, triggeredRules.size()));
if (isAnomaly) {
result.setAnomalyType("RULE_BASED_ANOMALY");
result.setDescription("Rules triggered: " + String.join(", ", triggeredRules));
result.addDetail("triggeredRules", triggeredRules);
result.addDetail("ruleCount", triggeredRules.size());
}
return result;
}
private void initializeDefaultRules() {
// CPU-related rules
addRule(new DetectionRule(
"HIGH_CPU_USAGE",
"Process CPU usage exceeds 80%",
metrics -> metrics.getProcessCpuLoad() > 0.8,
0.8
));
addRule(new DetectionRule(
"CPU_SPIKE",
"Sudden CPU usage spike",
metrics -> metrics.getProcessCpuLoad() > 0.9,
0.9
));
// Memory-related rules
addRule(new DetectionRule(
"HIGH_MEMORY_USAGE",
"Heap memory usage exceeds 90%",
metrics -> {
long heapUsed = metrics.getHeapUsed();
long heapMax = metrics.getHeapMax();
return heapMax > 0 && ((double) heapUsed / heapMax) > 0.9;
},
0.7
));
addRule(new DetectionRule(
"MEMORY_LEAK_SUSPECTED",
"Continuous memory growth pattern",
metrics -> {
// This would require historical data for proper detection
long heapUsed = metrics.getHeapUsed();
return heapUsed > 1024 * 1024 * 1024; // 1GB threshold
},
0.6
));
// Thread-related rules
addRule(new DetectionRule(
"HIGH_THREAD_COUNT",
"Thread count exceeds reasonable limit",
metrics -> metrics.getThreadCount() > 500,
0.5
));
addRule(new DetectionRule(
"THREAD_DEADLOCK_SUSPECTED",
"High blocked thread count",
metrics -> {
// This would require thread-level metrics
return false; // Placeholder
},
0.8
));
// GC-related rules
addRule(new DetectionRule(
"GC_OVERHEAD",
"Excessive garbage collection time",
metrics -> {
long gcTime = metrics.getGcTime();
return gcTime > 10000; // 10 seconds
},
0.6
));
addRule(new DetectionRule(
"FREQUENT_GC",
"Too frequent garbage collection",
metrics -> {
long gcCount = metrics.getGcCount();
return gcCount > 1000; // Arbitrary threshold
},
0.5
));
// File descriptor rules
addRule(new DetectionRule(
"FILE_DESCRIPTOR_LEAK",
"High number of open file descriptors",
metrics -> {
long fdCount = metrics.getOpenFileDescriptorCount();
return fdCount > 0 && fdCount > 1000; // System-dependent threshold
},
0.7
));
}
private double calculateConfidence(double totalScore, int ruleCount) {
double baseConfidence = totalScore / rules.size();
double countBonus = Math.min(ruleCount * 0.1, 0.3);
return Math.min(baseConfidence + countBonus, 1.0);
}
public List<DetectionRule> getRules() {
return new ArrayList<>(rules);
}
}
package com.example.anomaly.detection.algorithm;
import com.example.anomaly.detection.ProcessMetrics;
import java.util.function.Predicate;
public class DetectionRule {
private final String name;
private final String description;
private final Predicate<ProcessMetrics> condition;
private final double severity; // 0.0 to 1.0
public DetectionRule(String name, String description, 
Predicate<ProcessMetrics> condition, double severity) {
this.name = name;
this.description = description;
this.condition = condition;
this.severity = severity;
}
// Getters
public String getName() { return name; }
public String getDescription() { return description; }
public Predicate<ProcessMetrics> getCondition() { return condition; }
public double getSeverity() { return severity; }
}
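The rule classes above can be exercised on their own. The sketch below uses a hypothetical MetricsSnapshot record (a simplified stand-in for ProcessMetrics) to show how predicate-based rules are evaluated and how the calculateConfidence formula combines fired-rule severities:

```java
import java.util.List;
import java.util.function.Predicate;

public class RuleEngineSketch {
    // Minimal stand-in for ProcessMetrics (hypothetical, for illustration only)
    public record MetricsSnapshot(double cpuLoad, long heapUsed, long heapMax, int threadCount) {}

    public record Rule(String name, Predicate<MetricsSnapshot> condition, double severity) {}

    static final List<Rule> RULES = List.of(
        new Rule("HIGH_CPU", m -> m.cpuLoad() > 0.9, 0.8),
        new Rule("HIGH_MEMORY", m -> m.heapMax() > 0 && (double) m.heapUsed() / m.heapMax() > 0.9, 0.7),
        new Rule("HIGH_THREADS", m -> m.threadCount() > 500, 0.5)
    );

    // Mirrors calculateConfidence: average severity over ALL rules, plus a capped bonus per fired rule
    public static double evaluate(MetricsSnapshot m) {
        double totalScore = 0;
        int fired = 0;
        for (Rule r : RULES) {
            if (r.condition().test(m)) {
                totalScore += r.severity();
                fired++;
            }
        }
        double base = totalScore / RULES.size();
        return Math.min(base + Math.min(fired * 0.1, 0.3), 1.0);
    }

    public static void main(String[] args) {
        // No rules fire -> confidence 0.0; all three fire -> close to 1.0
        System.out.println(evaluate(new MetricsSnapshot(0.2, 100, 1000, 50)));
        System.out.println(evaluate(new MetricsSnapshot(0.95, 950, 1000, 600)));
    }
}
```

Note that the count bonus rewards agreement between rules: three low-severity rules firing together score higher than the same severities spread across separate samples.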

Real-time Monitoring System

1. Anomaly Detection Service
package com.example.anomaly.detection.service;
import com.example.anomaly.detection.*;
import com.example.anomaly.detection.algorithm.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;
import jakarta.annotation.PostConstruct; // javax.annotation.PostConstruct on Spring Boot 2.x
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
@Service
public class AnomalyDetectionService {
private static final Logger logger = LoggerFactory.getLogger(AnomalyDetectionService.class);
private final ProcessMetricsCollector metricsCollector;
private final StatisticalAnomalyDetector statisticalDetector;
private final MachineLearningAnomalyDetector mlDetector;
private final RuleBasedAnomalyDetector ruleBasedDetector;
private final List<ProcessMetrics> historicalMetrics;
private final List<AnomalyDetectionResult> detectedAnomalies;
private final Map<String, ProcessMetrics> latestMetrics;
private boolean isInitialized = false;
private long trainingStartTime;
public AnomalyDetectionService() {
this.metricsCollector = new ProcessMetricsCollector();
this.statisticalDetector = new StatisticalAnomalyDetector(3.0, 1.5); // Z-score 3, IQR 1.5
this.mlDetector = new MachineLearningAnomalyDetector();
this.ruleBasedDetector = new RuleBasedAnomalyDetector();
this.historicalMetrics = new CopyOnWriteArrayList<>();
this.detectedAnomalies = new CopyOnWriteArrayList<>();
this.latestMetrics = new ConcurrentHashMap<>();
}
@PostConstruct
public void initialize() {
logger.info("Initializing anomaly detection service");
trainingStartTime = System.currentTimeMillis();
isInitialized = true;
}
@Scheduled(fixedRate = 5000) // Collect every 5 seconds
public void collectMetrics() {
if (!isInitialized) {
return;
}
try {
ProcessMetrics metrics = metricsCollector.collectSystemMetrics();
String key = String.valueOf(metrics.getTimestamp());
latestMetrics.put(key, metrics);
historicalMetrics.add(metrics);
// Clean up old metrics (keep last hour)
cleanupOldMetrics(60 * 60 * 1000); // 1 hour
// Train detectors if we have enough data
if (historicalMetrics.size() >= 100 && !mlDetector.isTrained()) {
trainDetectors();
}
// Detect anomalies if trained
if (mlDetector.isTrained()) {
detectAnomalies(metrics);
}
} catch (Exception e) {
logger.error("Error collecting metrics", e);
}
}
public AnomalyDetectionResult analyzeMetrics(ProcessMetrics metrics) {
List<AnomalyDetectionResult> results = new ArrayList<>();
// Statistical detection
if (historicalMetrics.size() >= 50) {
results.add(statisticalDetector.detect(metrics));
}
// Machine learning detection
if (mlDetector.isTrained()) {
results.add(mlDetector.detectEnsemble(metrics));
}
// Rule-based detection
results.add(ruleBasedDetector.detect(metrics));
// Combine results
return combineResults(results, metrics);
}
private void trainDetectors() {
try {
logger.info("Training anomaly detectors with {} samples", historicalMetrics.size());
// Use 80% of data for training
// Snapshot first: a subList view over a CopyOnWriteArrayList fails if the list changes concurrently
List<ProcessMetrics> snapshot = new ArrayList<>(historicalMetrics);
int trainingSize = (int) (snapshot.size() * 0.8);
List<ProcessMetrics> trainingData = snapshot.subList(0, trainingSize);
// Train statistical detector
statisticalDetector.train(trainingData);
// Train machine learning detectors
mlDetector.trainLOF(trainingData, 20); // k=20 for LOF
mlDetector.trainOCSVM(trainingData, 0.1, 0.1); // nu=0.1, gamma=0.1 for OCSVM
logger.info("Anomaly detectors trained successfully");
} catch (Exception e) {
logger.error("Error training detectors", e);
}
}
private void detectAnomalies(ProcessMetrics metrics) {
try {
AnomalyDetectionResult result = analyzeMetrics(metrics);
if (result.isAnomaly()) {
detectedAnomalies.add(result);
logger.warn("Anomaly detected: {} - Score: {}", 
result.getDescription(), String.format("%.3f", result.getAnomalyScore()));
// Trigger alert
triggerAlert(result);
}
} catch (Exception e) {
logger.error("Error detecting anomalies", e);
}
}
private AnomalyDetectionResult combineResults(List<AnomalyDetectionResult> results, 
ProcessMetrics metrics) {
CombinedDetectionResult combinedResult = new CombinedDetectionResult();
combinedResult.setMetrics(metrics);
combinedResult.setTimestamp(System.currentTimeMillis());
long anomalyCount = results.stream()
.filter(AnomalyDetectionResult::isAnomaly)
.count();
double averageScore = results.stream()
.mapToDouble(AnomalyDetectionResult::getAnomalyScore)
.average()
.orElse(0.0);
double maxScore = results.stream()
.mapToDouble(AnomalyDetectionResult::getAnomalyScore)
.max()
.orElse(0.0);
boolean isAnomaly = anomalyCount >= Math.ceil(results.size() / 2.0);
combinedResult.setAnomaly(isAnomaly);
combinedResult.setAnomalyScore(maxScore);
combinedResult.setIndividualResults(results);
combinedResult.setVoteCount((int) anomalyCount);
combinedResult.setTotalVotes(results.size());
combinedResult.setAverageScore(averageScore);
if (isAnomaly) {
combinedResult.setAnomalyType("COMBINED_ANOMALY");
combinedResult.setDescription(String.format(
"Combined anomaly detected (%d/%d methods)", anomalyCount, results.size()));
}
return combinedResult;
}
private void triggerAlert(AnomalyDetectionResult anomaly) {
// Implement alerting mechanism (email, Slack, PagerDuty, etc.)
logger.warn("🚨 ANOMALY ALERT: {}", anomaly.getDescription());
logger.warn("Anomaly Score: {}, Confidence: {}", 
String.format("%.3f", anomaly.getAnomalyScore()), String.format("%.3f", anomaly.getConfidence()));
// Additional alerting logic would go here
AlertService.triggerAlert(anomaly);
}
private void cleanupOldMetrics(long retentionPeriod) {
long cutoffTime = System.currentTimeMillis() - retentionPeriod;
// Clean historical metrics
historicalMetrics.removeIf(metrics -> metrics.getTimestamp() < cutoffTime);
// Clean latest metrics
latestMetrics.entrySet().removeIf(entry -> {
ProcessMetrics metrics = entry.getValue();
return metrics.getTimestamp() < cutoffTime;
});
// Clean detected anomalies (keep longer for analysis)
long anomalyCutoff = System.currentTimeMillis() - (retentionPeriod * 24); // Keep 24 hours
detectedAnomalies.removeIf(anomaly -> anomaly.getTimestamp() < anomalyCutoff);
}
// Public methods for external access
public ProcessMetrics getLatestMetrics() {
// Iteration order of a ConcurrentHashMap is arbitrary, so pick the entry with the newest timestamp
return latestMetrics.values().stream()
.max(Comparator.comparingLong(ProcessMetrics::getTimestamp))
.orElse(null);
}
public List<ProcessMetrics> getHistoricalMetrics() {
return new ArrayList<>(historicalMetrics);
}
public List<AnomalyDetectionResult> getDetectedAnomalies() {
return new ArrayList<>(detectedAnomalies);
}
public SystemHealth getSystemHealth() {
ProcessMetrics latest = getLatestMetrics();
if (latest == null) {
return SystemHealth.UNKNOWN;
}
// Simple health calculation based on key metrics
double healthScore = 100.0;
// CPU health (deduct points for high usage)
if (latest.getProcessCpuLoad() > 0.8) {
healthScore -= 20;
} else if (latest.getProcessCpuLoad() > 0.6) {
healthScore -= 10;
}
// Memory health
if (latest.getHeapMax() > 0) {
double heapUsage = (double) latest.getHeapUsed() / latest.getHeapMax();
if (heapUsage > 0.9) {
healthScore -= 20;
} else if (heapUsage > 0.7) {
healthScore -= 10;
}
}
// Thread health
if (latest.getThreadCount() > 1000) {
healthScore -= 15;
}
return SystemHealth.fromScore(healthScore);
}
}
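combineResults above is a simple majority vote over the individual detectors. Stripped of the surrounding classes, the voting arithmetic looks like this (a standalone sketch using plain flag/score pairs in place of AnomalyDetectionResult):

```java
import java.util.List;

public class MajorityVoteSketch {
    public record Vote(boolean anomaly, double score) {}
    public record Combined(boolean anomaly, double maxScore, double avgScore, long votes) {}

    public static Combined combine(List<Vote> results) {
        long anomalyCount = results.stream().filter(Vote::anomaly).count();
        double avg = results.stream().mapToDouble(Vote::score).average().orElse(0.0);
        double max = results.stream().mapToDouble(Vote::score).max().orElse(0.0);
        // Anomaly if at least half of the detectors agree (2 of 3, 2 of 4, ...)
        boolean isAnomaly = anomalyCount >= Math.ceil(results.size() / 2.0);
        return new Combined(isAnomaly, max, avg, anomalyCount);
    }

    public static void main(String[] args) {
        // Two of three detectors fire -> combined anomaly, reported with the max score
        Combined c = combine(List.of(new Vote(true, 0.9), new Vote(true, 0.7), new Vote(false, 0.2)));
        System.out.println(c);
    }
}
```

Reporting the maximum rather than the average score is a deliberate choice in the service: one strongly confident detector should not be diluted by two lukewarm ones.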
package com.example.anomaly.detection.service;
import com.example.anomaly.detection.AnomalyDetectionResult;
import java.util.List;
public class CombinedDetectionResult extends AnomalyDetectionResult {
private List<AnomalyDetectionResult> individualResults;
private int voteCount;
private int totalVotes;
private double averageScore;
// Getters and setters
public List<AnomalyDetectionResult> getIndividualResults() { return individualResults; }
public void setIndividualResults(List<AnomalyDetectionResult> individualResults) { 
this.individualResults = individualResults; 
}
public int getVoteCount() { return voteCount; }
public void setVoteCount(int voteCount) { this.voteCount = voteCount; }
public int getTotalVotes() { return totalVotes; }
public void setTotalVotes(int totalVotes) { this.totalVotes = totalVotes; }
public double getAverageScore() { return averageScore; }
public void setAverageScore(double averageScore) { this.averageScore = averageScore; }
}
package com.example.anomaly.detection.service;
public enum SystemHealth {
HEALTHY(80, 100, "✅ System is healthy"),
DEGRADED(50, 80, "⚠️ System performance is degraded"),
CRITICAL(0, 50, "🚨 System health is critical"),
UNKNOWN(-1, -1, "❓ System health unknown");
private final double minScore;
private final double maxScore;
private final String description;
SystemHealth(double minScore, double maxScore, String description) {
this.minScore = minScore;
this.maxScore = maxScore;
this.description = description;
}
public static SystemHealth fromScore(double score) {
for (SystemHealth health : values()) {
if (health != UNKNOWN && score >= health.minScore && score <= health.maxScore) {
return health;
}
}
return UNKNOWN;
}
public String getDescription() {
return description;
}
}
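getSystemHealth deducts fixed penalties from a starting score of 100 and maps the result through SystemHealth.fromScore. The scoring can be sketched in isolation (hypothetical helper class, same thresholds and bands as above):

```java
public class HealthScoreSketch {
    public enum Health { HEALTHY, DEGRADED, CRITICAL }

    // Same deduction scheme as AnomalyDetectionService.getSystemHealth
    public static double score(double cpuLoad, long heapUsed, long heapMax, int threadCount) {
        double s = 100.0;
        if (cpuLoad > 0.8) s -= 20; else if (cpuLoad > 0.6) s -= 10;
        if (heapMax > 0) {
            double heapUsage = (double) heapUsed / heapMax;
            if (heapUsage > 0.9) s -= 20; else if (heapUsage > 0.7) s -= 10;
        }
        if (threadCount > 1000) s -= 15;
        return s;
    }

    // Same bands as SystemHealth.fromScore: 80-100 healthy, 50-80 degraded, below 50 critical
    public static Health classify(double s) {
        if (s >= 80) return Health.HEALTHY;
        if (s >= 50) return Health.DEGRADED;
        return Health.CRITICAL;
    }

    public static void main(String[] args) {
        System.out.println(classify(score(0.3, 100, 1000, 50)));    // light load: full score
        System.out.println(classify(score(0.95, 950, 1000, 1500))); // every penalty applied: 45
    }
}
```

The worst possible score under this scheme is 45 (20 + 20 + 15 deducted), which lands in the CRITICAL band.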
package com.example.anomaly.detection.service;
import com.example.anomaly.detection.AnomalyDetectionResult;
public class AlertService {
public static void triggerAlert(AnomalyDetectionResult anomaly) {
// Implementation for various alerting channels
// Email, Slack, PagerDuty, etc.
System.err.println("=== ANOMALY ALERT ===");
System.err.println("Type: " + anomaly.getAnomalyType());
System.err.println("Description: " + anomaly.getDescription());
System.err.println("Score: " + anomaly.getAnomalyScore());
System.err.println("Confidence: " + anomaly.getConfidence());
System.err.println("Timestamp: " + new java.util.Date(anomaly.getTimestamp()));
System.err.println("=====================");
}
}
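AlertService above only writes to stderr. To address the alert fatigue problem discussed under Best Practices, a per-anomaly-type cooldown can suppress repeats. A minimal sketch (hypothetical CooldownAlerter class, not part of the code above):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CooldownAlerter {
    private final long cooldownMs;
    private final Map<String, Long> lastAlertAt = new ConcurrentHashMap<>();

    public CooldownAlerter(long cooldownMs) { this.cooldownMs = cooldownMs; }

    // Returns true if the alert should be delivered, false if suppressed by the cooldown
    public boolean shouldAlert(String anomalyType, long nowMs) {
        Long last = lastAlertAt.get(anomalyType);
        if (last != null && nowMs - last < cooldownMs) {
            return false; // same anomaly type fired recently: suppress
        }
        lastAlertAt.put(anomalyType, nowMs);
        return true;
    }

    public static void main(String[] args) {
        CooldownAlerter alerter = new CooldownAlerter(60_000); // 1-minute cooldown per type
        System.out.println(alerter.shouldAlert("HIGH_CPU", 0));      // delivered
        System.out.println(alerter.shouldAlert("HIGH_CPU", 5_000));  // suppressed
        System.out.println(alerter.shouldAlert("HIGH_CPU", 70_000)); // delivered again
    }
}
```

A production version would likely also aggregate suppressed alerts into a periodic digest instead of dropping them silently.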

REST API for Monitoring

1. Monitoring Controller
package com.example.anomaly.detection.controller;
import com.example.anomaly.detection.AnomalyDetectionResult;
import com.example.anomaly.detection.ProcessMetrics;
import com.example.anomaly.detection.service.AnomalyDetectionService;
import com.example.anomaly.detection.service.SystemHealth;
import org.springframework.web.bind.annotation.*;
import java.util.List;
@RestController
@RequestMapping("/api/monitoring")
public class MonitoringController {
private final AnomalyDetectionService anomalyDetectionService;
public MonitoringController(AnomalyDetectionService anomalyDetectionService) {
this.anomalyDetectionService = anomalyDetectionService;
}
@GetMapping("/metrics/current")
public ProcessMetrics getCurrentMetrics() {
return anomalyDetectionService.getLatestMetrics();
}
@GetMapping("/metrics/history")
public List<ProcessMetrics> getHistoricalMetrics() {
return anomalyDetectionService.getHistoricalMetrics();
}
@GetMapping("/anomalies")
public List<AnomalyDetectionResult> getDetectedAnomalies() {
return anomalyDetectionService.getDetectedAnomalies();
}
@GetMapping("/health")
public HealthStatus getSystemHealth() {
SystemHealth health = anomalyDetectionService.getSystemHealth();
ProcessMetrics currentMetrics = anomalyDetectionService.getLatestMetrics();
return new HealthStatus(health, currentMetrics);
}
@PostMapping("/analyze")
public AnomalyDetectionResult analyzeCurrentState() {
ProcessMetrics metrics = anomalyDetectionService.getLatestMetrics();
if (metrics == null) {
throw new IllegalStateException("No metrics available");
}
return anomalyDetectionService.analyzeMetrics(metrics);
}
@GetMapping("/stats")
public MonitoringStats getMonitoringStats() {
List<ProcessMetrics> historical = anomalyDetectionService.getHistoricalMetrics();
List<AnomalyDetectionResult> anomalies = anomalyDetectionService.getDetectedAnomalies();
return new MonitoringStats(historical.size(), anomalies.size(), 
anomalyDetectionService.getSystemHealth());
}
}
// Response DTOs
class HealthStatus {
private final SystemHealth health;
private final String description;
private final ProcessMetrics currentMetrics;
private final String timestamp;
public HealthStatus(SystemHealth health, ProcessMetrics currentMetrics) {
this.health = health;
this.description = health.getDescription();
this.currentMetrics = currentMetrics;
this.timestamp = java.time.LocalDateTime.now().toString();
}
// Getters
public SystemHealth getHealth() { return health; }
public String getDescription() { return description; }
public ProcessMetrics getCurrentMetrics() { return currentMetrics; }
public String getTimestamp() { return timestamp; }
}
class MonitoringStats {
private final int totalMetricsCollected;
private final int totalAnomaliesDetected;
private final SystemHealth systemHealth;
private final String timestamp;
public MonitoringStats(int totalMetricsCollected, int totalAnomaliesDetected, 
SystemHealth systemHealth) {
this.totalMetricsCollected = totalMetricsCollected;
this.totalAnomaliesDetected = totalAnomaliesDetected;
this.systemHealth = systemHealth;
this.timestamp = java.time.LocalDateTime.now().toString();
}
// Getters
public int getTotalMetricsCollected() { return totalMetricsCollected; }
public int getTotalAnomaliesDetected() { return totalAnomaliesDetected; }
public SystemHealth getSystemHealth() { return systemHealth; }
public String getTimestamp() { return timestamp; }
}

Configuration

1. Application Properties
# application.yml
anomaly:
detection:
enabled: true
collection:
interval-ms: 5000
training:
min-samples: 100
training-ratio: 0.8
statistical:
z-score-threshold: 3.0
iqr-multiplier: 1.5
machine-learning:
lof-k: 20
ocsvm-nu: 0.1
ocsvm-gamma: 0.1
retention:
metrics-hours: 24
anomalies-hours: 168  # 1 week
logging:
level:
com.example.anomaly.detection: INFO
management:
endpoints:
web:
exposure:
include: health,metrics,info
endpoint:
health:
show-details: always
2. Spring Configuration
package com.example.anomaly.detection.config;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;
@Configuration
@EnableScheduling
public class SchedulingConfig {
// Enables scheduled tasks for metric collection
}

Testing

1. Unit Tests
package com.example.anomaly.detection;
import com.example.anomaly.detection.algorithm.StatisticalAnomalyDetector;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import java.util.ArrayList;
import java.util.List;
import static org.junit.jupiter.api.Assertions.*;
class StatisticalAnomalyDetectorTest {
private StatisticalAnomalyDetector detector;
private List<ProcessMetrics> trainingData;
@BeforeEach
void setUp() {
detector = new StatisticalAnomalyDetector(3.0, 1.5);
trainingData = generateTrainingData();
}
@Test
void testTraining() {
detector.train(trainingData);
double[] means = detector.getMeans();
assertNotNull(means);
assertEquals(7, means.length); // 7 features in ProcessMetrics
}
@Test
void testAnomalyDetection() {
detector.train(trainingData);
ProcessMetrics normalMetrics = createNormalMetrics();
ProcessMetrics anomalousMetrics = createAnomalousMetrics();
AnomalyDetectionResult normalResult = detector.detect(normalMetrics);
AnomalyDetectionResult anomalousResult = detector.detect(anomalousMetrics);
assertFalse(normalResult.isAnomaly());
assertTrue(anomalousResult.isAnomaly());
assertTrue(anomalousResult.getAnomalyScore() > normalResult.getAnomalyScore());
}
private List<ProcessMetrics> generateTrainingData() {
List<ProcessMetrics> data = new ArrayList<>();
for (int i = 0; i < 100; i++) {
data.add(createNormalMetrics());
}
return data;
}
private ProcessMetrics createNormalMetrics() {
ProcessMetrics metrics = new ProcessMetrics();
metrics.setProcessCpuLoad(0.3);
metrics.setSystemCpuLoad(0.4);
metrics.setHeapUsed(100 * 1024 * 1024); // 100MB
metrics.setNonHeapUsed(50 * 1024 * 1024); // 50MB
metrics.setThreadCount(50);
metrics.setGcCount(100);
metrics.setGcTime(5000);
return metrics;
}
private ProcessMetrics createAnomalousMetrics() {
ProcessMetrics metrics = new ProcessMetrics();
metrics.setProcessCpuLoad(0.95); // Very high CPU
metrics.setSystemCpuLoad(0.9);
metrics.setHeapUsed(2L * 1024 * 1024 * 1024); // 2GB - very high memory
metrics.setNonHeapUsed(500 * 1024 * 1024); // 500MB
metrics.setThreadCount(1000); // Very high thread count
metrics.setGcCount(5000); // Very high GC count
metrics.setGcTime(30000); // Very high GC time
return metrics;
}
}
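The z-score check at the heart of StatisticalAnomalyDetector (threshold 3.0 in the tests above) can be illustrated in isolation. This is a simplified single-feature sketch, not the detector's actual multi-feature implementation:

```java
public class ZScoreSketch {
    // Flags a value whose z-score against the training sample exceeds the threshold
    public static boolean isAnomaly(double[] training, double value, double zThreshold) {
        double mean = 0;
        for (double v : training) mean += v;
        mean /= training.length;
        double variance = 0;
        for (double v : training) variance += (v - mean) * (v - mean);
        double std = Math.sqrt(variance / training.length);
        if (std == 0) return value != mean; // degenerate sample: any deviation is anomalous
        double z = Math.abs(value - mean) / std;
        return z > zThreshold;
    }

    public static void main(String[] args) {
        double[] cpuSamples = {0.30, 0.32, 0.28, 0.31, 0.29, 0.30, 0.33, 0.27};
        System.out.println(isAnomaly(cpuSamples, 0.31, 3.0)); // within the normal band
        System.out.println(isAnomaly(cpuSamples, 0.95, 3.0)); // tens of standard deviations out
    }
}
```

In the unit test above the anomalous sample deviates this strongly on several features at once, which is why the detector trained on 100 near-identical normal samples flags it reliably.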

Best Practices

  1. Gradual Training: Train anomaly detectors gradually as more data becomes available
  2. Feature Engineering: Carefully select and normalize features for better detection
  3. Threshold Tuning: Adjust detection thresholds based on system characteristics
  4. Alert Fatigue: Implement smart alerting to avoid notification overload
  5. Historical Analysis: Maintain historical data for trend analysis and false positive reduction
  6. Resource Monitoring: Monitor the monitoring system itself to prevent resource exhaustion
// Example of adaptive threshold adjustment
public class AdaptiveThreshold {
public double adjustThreshold(double currentThreshold, 
double falsePositiveRate, 
double detectionRate) {
if (falsePositiveRate > 0.1) {
return currentThreshold * 1.1; // Increase threshold to reduce false positives
} else if (detectionRate < 0.8) {
return currentThreshold * 0.9; // Decrease threshold to improve detection
}
return currentThreshold;
}
}
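Applied across successive evaluation windows, the adjustment above nudges the threshold toward a level where both rates are acceptable. A small demo of the same logic (hypothetical class mirroring AdaptiveThreshold):

```java
public class AdaptiveThresholdDemo {
    // Same logic as the AdaptiveThreshold example above
    public static double adjust(double threshold, double falsePositiveRate, double detectionRate) {
        if (falsePositiveRate > 0.1) {
            return threshold * 1.1; // too noisy: raise the bar
        } else if (detectionRate < 0.8) {
            return threshold * 0.9; // missing anomalies: lower the bar
        }
        return threshold; // both rates acceptable: hold steady
    }

    public static void main(String[] args) {
        double t = 3.0;
        // Two windows of false-positive pressure: threshold rises by 10% each time (roughly 3.63)
        t = adjust(t, 0.25, 0.95);
        t = adjust(t, 0.15, 0.95);
        System.out.println("after FP pressure: " + t);
        // Detection rate drops: threshold comes back down
        t = adjust(t, 0.05, 0.60);
        System.out.println("after missed detections: " + t);
    }
}
```

Note the multiplicative updates are asymmetric (×1.1 up, ×0.9 down), so in practice you would clamp the threshold to a sane range to keep it from drifting.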

Conclusion

Process Anomaly Detection in Java provides:

  • Real-time monitoring of system processes and metrics
  • Multiple detection algorithms (statistical, machine learning, rule-based)
  • Comprehensive anomaly analysis with confidence scoring
  • REST API for integration and monitoring
  • Configurable thresholds and adaptive learning
  • Alerting system for immediate notification

By implementing this comprehensive anomaly detection system, you can proactively identify performance issues, security threats, and operational problems before they impact your system's stability and performance.

