Tekton Supply Chain in Java: Comprehensive CI/CD Pipeline Implementation

Tekton is a powerful Kubernetes-native framework for building CI/CD systems. This guide covers creating Tekton pipelines from Java with the fabric8 client, writing custom tasks, and automating the software supply chain end to end.
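The Java builders in this guide generate the same resources you would otherwise write as Tekton YAML. For orientation, a minimal hand-written Pipeline looks roughly like this (names and images are illustrative):

```yaml
# Illustrative only -- names and images are placeholders
apiVersion: tekton.dev/v1
kind: Pipeline
metadata:
  name: hello-pipeline
spec:
  tasks:
    - name: hello
      taskSpec:
        steps:
          - name: echo
            image: alpine:3.16
            script: |
              echo "Hello from Tekton"
```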


Dependencies and Setup

1. Maven Dependencies
<properties>
<kubernetes-client.version>6.7.2</kubernetes-client.version>
<tekton-client.version>0.6.0</tekton-client.version>
<jackson.version>2.15.2</jackson.version>
<yaml.version>2.15.2</yaml.version>
</properties>
<dependencies>
<!-- Tekton Client -->
<dependency>
<groupId>io.fabric8</groupId>
<artifactId>tekton-client</artifactId>
<version>${tekton-client.version}</version>
</dependency>
<!-- Kubernetes Client -->
<dependency>
<groupId>io.fabric8</groupId>
<artifactId>kubernetes-client</artifactId>
<version>${kubernetes-client.version}</version>
</dependency>
<!-- Tekton Pipeline Model -->
<dependency>
<groupId>io.fabric8</groupId>
<artifactId>tekton-pipelines-model-v1</artifactId>
<version>${tekton-client.version}</version>
</dependency>
<!-- YAML Processing -->
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-yaml</artifactId>
<version>${yaml.version}</version>
</dependency>
<!-- Spring Boot (Optional) -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<version>3.1.0</version>
</dependency>
<!-- Validation -->
<dependency>
<groupId>jakarta.validation</groupId>
<artifactId>jakarta.validation-api</artifactId>
<version>3.0.2</version>
</dependency>
<!-- Lombok (used by the model classes below) -->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.30</version>
<scope>provided</scope>
</dependency>
</dependencies>
2. Kubernetes Configuration
# tekton-config.yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: tekton-java-config
data:
  JAVA_HOME: /usr/lib/jvm/java-17
  MAVEN_HOME: /usr/share/maven
  MAVEN_OPTS: "-Xmx2048m -XX:+UseG1GC"
  MAVEN_ARGS: "-DskipTests=false -DskipITs=false"
---
apiVersion: v1
kind: Secret
metadata:
  name: tekton-secrets
type: Opaque
data:
  github-token: BASE64_ENCODED_TOKEN
  docker-config: BASE64_ENCODED_DOCKER_CONFIG
  sonar-token: BASE64_ENCODED_SONAR_TOKEN
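The Secret values must be base64-encoded before being placed in the manifest. One way to produce them (token values here are placeholders):

```shell
# Encode a placeholder token for the Secret's data section
printf '%s' 'ghp_example' | base64   # -> Z2hwX2V4YW1wbGU=

# Or let kubectl handle the encoding directly (requires cluster access):
# kubectl create secret generic tekton-secrets \
#   --from-literal=github-token='ghp_example' \
#   --from-file=docker-config="$HOME/.docker/config.json"
```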

Core Tekton Pipeline Components

1. Java Tekton Client
package com.example.tekton;
import io.fabric8.kubernetes.client.Config;
import io.fabric8.kubernetes.client.ConfigBuilder;
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;
import io.fabric8.tekton.client.DefaultTektonClient;
import io.fabric8.tekton.client.TektonClient;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
public class TektonClientConfig {
private static final Logger logger = LoggerFactory.getLogger(TektonClientConfig.class);
@Value("${kubernetes.master.url:}")
private String kubernetesMasterUrl;
@Value("${kubernetes.namespace:default}")
private String namespace;
@Bean
public KubernetesClient kubernetesClient() {
Config config = new ConfigBuilder()
.withMasterUrl(kubernetesMasterUrl)
.withNamespace(namespace)
.build();
// DefaultKubernetesClient is deprecated in fabric8 6.x; prefer the builder
return new KubernetesClientBuilder().withConfig(config).build();
}
@Bean
public TektonClient tektonClient(KubernetesClient kubernetesClient) {
return new DefaultTektonClient(kubernetesClient);
}
}
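The @Value placeholders above are resolved from Spring configuration; a matching application.properties might look like this (values are examples):

```properties
# application.properties (example values)
kubernetes.master.url=https://kubernetes.default.svc
kubernetes.namespace=tekton-pipelines
```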
2. Pipeline Model Definitions
package com.example.tekton.model;
import com.fasterxml.jackson.annotation.JsonInclude;
import lombok.Data;
import java.util.*;
@Data
@JsonInclude(JsonInclude.Include.NON_NULL)
public class JavaPipelineSpec {
private String name;
private String namespace;
private String description;
private String gitUrl;
private String gitBranch;
private String mavenVersion;
private String javaVersion;
private String dockerRegistry;
private String dockerRepository;
private List<String> stages;
private Map<String, String> environment;
private SecuritySettings security;
private QualityGates qualityGates;
private List<CustomTask> customTasks;
@Data
public static class SecuritySettings {
private boolean vulnerabilityScan;
private boolean dependencyCheck;
private boolean secretDetection;
private String trivyImage;
private String grypeImage;
}
@Data
public static class QualityGates {
private double testCoverageThreshold;
private int maxCriticalIssues;
private int maxMajorIssues;
private int maxTotalIssues;
private boolean failOnQualityGate;
}
@Data
public static class CustomTask {
private String name;
private String image;
private List<String> command;
private List<String> args;
private Map<String, String> env;
private Map<String, String> volumes;
private String script;
}
}
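Since the model is a plain Jackson-annotated POJO, a pipeline definition can be supplied as YAML and deserialized with jackson-dataformat-yaml. A spec document might look like this (all values illustrative):

```yaml
name: orders-service
namespace: ci
description: Build and ship the orders service
gitUrl: https://github.com/example/orders-service.git
gitBranch: main
mavenVersion: "3.8.6"
javaVersion: "17"
dockerRegistry: registry.example.com
dockerRepository: team/orders-service
security:
  vulnerabilityScan: true
  dependencyCheck: true
  trivyImage: aquasec/trivy:latest
qualityGates:
  testCoverageThreshold: 80.0
  maxCriticalIssues: 0
  failOnQualityGate: true
```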
3. Pipeline Builder
package com.example.tekton.builder;
import com.example.tekton.model.JavaPipelineSpec;
import com.example.tekton.model.JavaPipelineSpec.CustomTask;
import io.fabric8.tekton.pipeline.v1.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;
import java.util.*;
@Component
public class PipelineBuilder {
private static final Logger logger = LoggerFactory.getLogger(PipelineBuilder.class);
public Pipeline buildJavaPipeline(JavaPipelineSpec spec) {
logger.info("Building Tekton pipeline for: {}", spec.getName());
Pipeline pipeline = new Pipeline();
pipeline.setMetadata(createMetadata(spec));
pipeline.setSpec(createPipelineSpec(spec));
return pipeline;
}
public PipelineRun buildPipelineRun(String pipelineName, 
Map<String, String> params,
Map<String, String> workspaces) {
PipelineRun pipelineRun = new PipelineRun();
pipelineRun.setMetadata(createRunMetadata(pipelineName));
pipelineRun.setSpec(createPipelineRunSpec(pipelineName, params, workspaces));
return pipelineRun;
}
private io.fabric8.kubernetes.api.model.ObjectMeta createMetadata(JavaPipelineSpec spec) {
io.fabric8.kubernetes.api.model.ObjectMeta metadata = 
new io.fabric8.kubernetes.api.model.ObjectMeta();
metadata.setName(spec.getName() + "-pipeline");
metadata.setNamespace(spec.getNamespace());
metadata.setLabels(createLabels(spec));
metadata.setAnnotations(createAnnotations(spec));
return metadata;
}
private PipelineSpec createPipelineSpec(JavaPipelineSpec spec) {
PipelineSpec pipelineSpec = new PipelineSpec();
pipelineSpec.setParams(createParams(spec));
pipelineSpec.setWorkspaces(createWorkspaces());
pipelineSpec.setTasks(createTasks(spec));
pipelineSpec.setFinally(createFinallyTasks(spec));
return pipelineSpec;
}
private List<ParamSpec> createParams(JavaPipelineSpec spec) {
List<ParamSpec> params = new ArrayList<>();
params.add(new ParamSpecBuilder()
.withName("GIT_URL")
.withType("string")
.withDescription("Git repository URL")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue(spec.getGitUrl()))
.build());
params.add(new ParamSpecBuilder()
.withName("GIT_BRANCH")
.withType("string")
.withDescription("Git branch")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue(spec.getGitBranch()))
.build());
params.add(new ParamSpecBuilder()
.withName("MAVEN_VERSION")
.withType("string")
.withDescription("Maven version")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue(spec.getMavenVersion()))
.build());
params.add(new ParamSpecBuilder()
.withName("JAVA_VERSION")
.withType("string")
.withDescription("Java version")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue(spec.getJavaVersion()))
.build());
params.add(new ParamSpecBuilder()
.withName("DOCKER_REGISTRY")
.withType("string")
.withDescription("Docker registry")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue(spec.getDockerRegistry()))
.build());
params.add(new ParamSpecBuilder()
.withName("DOCKER_REPOSITORY")
.withType("string")
.withDescription("Docker repository")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue(spec.getDockerRepository()))
.build());
// Referenced by the code-quality and notification tasks below; must be
// declared here or the pipeline fails Tekton's variable validation
params.add(new ParamSpecBuilder()
.withName("SONAR_URL")
.withType("string")
.withDescription("SonarQube server URL")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue(""))
.build());
params.add(new ParamSpecBuilder()
.withName("SONAR_TOKEN")
.withType("string")
.withDescription("SonarQube authentication token")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue(""))
.build());
params.add(new ParamSpecBuilder()
.withName("SLACK_WEBHOOK_URL")
.withType("string")
.withDescription("Slack webhook for notifications")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue(""))
.build());
return params;
}
private List<PipelineWorkspaceDeclaration> createWorkspaces() {
List<PipelineWorkspaceDeclaration> workspaces = new ArrayList<>();
workspaces.add(new PipelineWorkspaceDeclarationBuilder()
.withName("source")
.withDescription("Source code workspace")
.build());
workspaces.add(new PipelineWorkspaceDeclarationBuilder()
.withName("maven-repo")
.withDescription("Maven repository cache")
.withOptional(true)
.build());
workspaces.add(new PipelineWorkspaceDeclarationBuilder()
.withName("docker-config")
.withDescription("Docker configuration")
.withOptional(true)
.build());
workspaces.add(new PipelineWorkspaceDeclarationBuilder()
.withName("test-reports")
.withDescription("Test reports output")
.build());
return workspaces;
}
private List<PipelineTask> createTasks(JavaPipelineSpec spec) {
List<PipelineTask> tasks = new ArrayList<>();
// Clone repository
tasks.add(createGitCloneTask());
// Code quality checks
tasks.add(createCodeQualityTask());
// Dependency check
if (spec.getSecurity() != null && spec.getSecurity().isDependencyCheck()) {
tasks.add(createDependencyCheckTask());
}
// Build and test
tasks.add(createMavenBuildTask());
// Unit tests
tasks.add(createUnitTestTask());
// Integration tests
tasks.add(createIntegrationTestTask());
// Security scan
if (spec.getSecurity() != null && spec.getSecurity().isVulnerabilityScan()) {
tasks.add(createSecurityScanTask(spec));
}
// Build container image
tasks.add(createDockerBuildTask());
// Scan container image
tasks.add(createContainerScanTask(spec));
// Push container image
tasks.add(createDockerPushTask());
// Add custom tasks
if (spec.getCustomTasks() != null) {
spec.getCustomTasks().forEach(customTask -> 
tasks.add(createCustomTask(customTask)));
}
return tasks;
}
private List<PipelineTask> createFinallyTasks(JavaPipelineSpec spec) {
List<PipelineTask> finallyTasks = new ArrayList<>();
// Cleanup tasks
finallyTasks.add(createCleanupTask());
// Notification tasks
finallyTasks.add(createNotificationTask());
// Report generation
finallyTasks.add(createReportTask());
return finallyTasks;
}
private PipelineTask createGitCloneTask() {
return new PipelineTaskBuilder()
.withName("git-clone")
.withTaskRef(new TaskRefBuilder()
.withName("git-clone")
// NOTE: ClusterTask is deprecated in recent Tekton releases; remote
// resolvers (hub/cluster) are the v1 replacement
.withKind("ClusterTask")
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("output")
.withWorkspace("source")
.build()
))
.withParams(Arrays.asList(
new ParamBuilder()
.withName("url")
.withNewValue("$(params.GIT_URL)")
.build(),
new ParamBuilder()
.withName("revision")
.withNewValue("$(params.GIT_BRANCH)")
.build()
))
.build();
}
private PipelineTask createMavenBuildTask() {
return new PipelineTaskBuilder()
.withName("maven-build")
.withRunAfter(Arrays.asList("git-clone", "code-quality"))
.withTaskSpec(createMavenTaskSpec())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("source")
.withWorkspace("source")
.build(),
new WorkspacePipelineTaskBindingBuilder()
.withName("maven-repo")
.withWorkspace("maven-repo")
.build()
))
.withParams(Arrays.asList(
new ParamBuilder()
.withName("MAVEN_VERSION")
.withNewValue("$(params.MAVEN_VERSION)")
.build(),
new ParamBuilder()
.withName("JAVA_VERSION")
.withNewValue("$(params.JAVA_VERSION)")
.build(),
new ParamBuilder()
.withName("GOALS")
.withNewValue("clean compile")
.build()
))
.build();
}
private EmbeddedTask createMavenTaskSpec() {
return new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("maven-build")
.withImage("maven:$(params.MAVEN_VERSION)-jdk-$(params.JAVA_VERSION)-slim")
.withWorkingDir("$(workspaces.source.path)")
.withCommand(Arrays.asList("mvn"))
.withArgs(Arrays.asList("$(params.GOALS)"))
.withEnv(Arrays.asList(
new io.fabric8.kubernetes.api.model.EnvVarBuilder()
.withName("MAVEN_OPTS")
.withValue("-Dmaven.repo.local=$(workspaces.maven-repo.path)/.m2/repository")
.build()
))
.build()
))
.endSpec()
.build();
}
private PipelineTask createUnitTestTask() {
return new PipelineTaskBuilder()
.withName("unit-tests")
.withRunAfter(Arrays.asList("maven-build"))
.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("run-tests")
.withImage("maven:$(params.MAVEN_VERSION)-jdk-$(params.JAVA_VERSION)-slim")
.withWorkingDir("$(workspaces.source.path)")
.withCommand(Arrays.asList("mvn"))
.withArgs(Arrays.asList(
"test",
"-DskipTests=false",
"-DskipITs=true"
))
.build(),
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("test-report")
.withImage("alpine:3.16")
.withCommand(Arrays.asList("sh"))
.withArgs(Arrays.asList("-c", 
"mkdir -p $(workspaces.test-reports.path)/unit && " +
"cp -r target/surefire-reports/* $(workspaces.test-reports.path)/unit/"))
.build()
))
.endSpec()
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("source")
.withWorkspace("source")
.build(),
new WorkspacePipelineTaskBindingBuilder()
.withName("maven-repo")
.withWorkspace("maven-repo")
.build(),
new WorkspacePipelineTaskBindingBuilder()
.withName("reports")
.withWorkspace("test-reports")
.build()
))
.build();
}
private PipelineTask createIntegrationTestTask() {
return new PipelineTaskBuilder()
.withName("integration-tests")
.withRunAfter(Arrays.asList("unit-tests"))
.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("run-integration-tests")
.withImage("maven:$(params.MAVEN_VERSION)-jdk-$(params.JAVA_VERSION)-slim")
.withWorkingDir("$(workspaces.source.path)")
.withCommand(Arrays.asList("mvn"))
.withArgs(Arrays.asList(
"verify",
"-DskipTests=true",
"-DskipITs=false"
))
.build(),
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("integration-report")
.withImage("alpine:3.16")
.withCommand(Arrays.asList("sh"))
.withArgs(Arrays.asList("-c", 
"mkdir -p $(workspaces.test-reports.path)/integration && " +
"cp -r target/failsafe-reports/* $(workspaces.test-reports.path)/integration/"))
.build()
))
.endSpec()
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("source")
.withWorkspace("source")
.build(),
new WorkspacePipelineTaskBindingBuilder()
.withName("maven-repo")
.withWorkspace("maven-repo")
.build(),
new WorkspacePipelineTaskBindingBuilder()
.withName("reports")
.withWorkspace("test-reports")
.build()
))
.build();
}
private PipelineTask createCodeQualityTask() {
return new PipelineTaskBuilder()
.withName("code-quality")
.withRunAfter(Arrays.asList("git-clone"))
.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("sonar-scanner")
.withImage("sonarsource/sonar-scanner-cli:latest")
.withWorkingDir("$(workspaces.source.path)")
.withCommand(Arrays.asList("sonar-scanner"))
.withArgs(Arrays.asList(
"-Dsonar.projectKey=$(context.pipelineRun.name)",
"-Dsonar.sources=.",
"-Dsonar.host.url=$(params.SONAR_URL)",
"-Dsonar.login=$(params.SONAR_TOKEN)"
))
.build(),
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("checkstyle")
.withImage("maven:$(params.MAVEN_VERSION)-jdk-$(params.JAVA_VERSION)-slim")
.withWorkingDir("$(workspaces.source.path)")
.withCommand(Arrays.asList("mvn"))
.withArgs(Arrays.asList(
"checkstyle:check",
"-Dcheckstyle.config.location=google_checks.xml"
))
.build()
))
.endSpec()
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("source")
.withWorkspace("source")
.build()
))
.build();
}
private PipelineTask createDependencyCheckTask() {
return new PipelineTaskBuilder()
.withName("dependency-check")
.withRunAfter(Arrays.asList("git-clone"))
.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("owasp-dependency-check")
.withImage("owasp/dependency-check:latest")
.withWorkingDir("$(workspaces.source.path)")
.withCommand(Arrays.asList("dependency-check.sh"))
.withArgs(Arrays.asList(
"--scan", ".",
"--format", "ALL",
"--out", "$(workspaces.source.path)/dependency-check-report",
"--project", "$(context.pipelineRun.name)"
))
.build()
))
.endSpec()
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("source")
.withWorkspace("source")
.build()
))
.build();
}
private PipelineTask createSecurityScanTask(JavaPipelineSpec spec) {
String scannerImage = spec.getSecurity().getTrivyImage() != null ? 
spec.getSecurity().getTrivyImage() : "aquasec/trivy:latest";
return new PipelineTaskBuilder()
.withName("security-scan")
.withRunAfter(Arrays.asList("maven-build"))
.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("trivy-fs-scan")
.withImage(scannerImage)
.withWorkingDir("$(workspaces.source.path)")
.withCommand(Arrays.asList("trivy"))
.withArgs(Arrays.asList(
"filesystem",
"--severity", "HIGH,CRITICAL",
"--format", "json",
"--output", "$(workspaces.source.path)/trivy-report.json",
"."
))
.build()
))
.endSpec()
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("source")
.withWorkspace("source")
.build()
))
.build();
}
private PipelineTask createDockerBuildTask() {
return new PipelineTaskBuilder()
.withName("docker-build")
.withRunAfter(Arrays.asList("integration-tests", "security-scan"))
.withTaskRef(new TaskRefBuilder()
.withName("buildah")
.withKind("ClusterTask")
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("source")
.withWorkspace("source")
.build()
))
.withParams(Arrays.asList(
new ParamBuilder()
.withName("IMAGE")
.withNewValue("$(params.DOCKER_REGISTRY)/$(params.DOCKER_REPOSITORY):$(context.pipelineRun.name)")
.build(),
new ParamBuilder()
.withName("DOCKERFILE")
.withNewValue("Dockerfile")
.build(),
new ParamBuilder()
.withName("CONTEXT")
.withNewValue("$(workspaces.source.path)")
.build()
))
.build();
}
private PipelineTask createContainerScanTask(JavaPipelineSpec spec) {
String scannerImage = spec.getSecurity().getGrypeImage() != null ? 
spec.getSecurity().getGrypeImage() : "anchore/grype:latest";
return new PipelineTaskBuilder()
.withName("container-scan")
.withRunAfter(Arrays.asList("docker-build"))
.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("grype-scan")
.withImage(scannerImage)
.withCommand(Arrays.asList("grype"))
.withArgs(Arrays.asList(
"$(params.DOCKER_REGISTRY)/$(params.DOCKER_REPOSITORY):$(context.pipelineRun.name)",
"--output", "json",
"--file", "$(workspaces.reports.path)/container-scan.json"
))
.build()
))
.endSpec()
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("reports")
.withWorkspace("test-reports")
.build()
))
.build();
}
private PipelineTask createDockerPushTask() {
return new PipelineTaskBuilder()
.withName("docker-push")
.withRunAfter(Arrays.asList("container-scan"))
.withTaskRef(new TaskRefBuilder()
.withName("buildah")
.withKind("ClusterTask")
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("source")
.withWorkspace("source")
.build(),
new WorkspacePipelineTaskBindingBuilder()
.withName("dockerconfig")
.withWorkspace("docker-config")
.build()
))
.withParams(Arrays.asList(
new ParamBuilder()
.withName("IMAGE")
.withNewValue("$(params.DOCKER_REGISTRY)/$(params.DOCKER_REPOSITORY):$(context.pipelineRun.name)")
.build(),
new ParamBuilder()
.withName("DOCKERFILE")
.withNewValue("Dockerfile")
.build(),
new ParamBuilder()
.withName("CONTEXT")
.withNewValue("$(workspaces.source.path)")
.build(),
new ParamBuilder()
.withName("TLSVERIFY")
// Disabled here for internal registries; set to "true" for production
.withNewValue("false")
.build()
))
.build();
}
private PipelineTask createCleanupTask() {
return new PipelineTaskBuilder()
.withName("cleanup")
.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("cleanup-resources")
.withImage("alpine:3.16")
.withCommand(Arrays.asList("sh"))
.withArgs(Arrays.asList("-c", 
"echo 'Cleaning up temporary resources' && " +
"sleep 5"))
.build()
))
.endSpec()
.build())
.build();
}
private PipelineTask createNotificationTask() {
return new PipelineTaskBuilder()
.withName("notification")
.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("send-notification")
.withImage("curlimages/curl:latest")
.withCommand(Arrays.asList("sh"))
.withArgs(Arrays.asList("-c", 
"curl -X POST -H 'Content-Type: application/json' " +
"$(params.SLACK_WEBHOOK_URL) " +
"-d '{\"text\":\"Pipeline $(context.pipelineRun.name) completed with status: $(tasks.status)\"}'"))
.build()
))
.endSpec()
.build())
.build();
}
private PipelineTask createReportTask() {
return new PipelineTaskBuilder()
.withName("generate-report")
.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("collect-reports")
.withImage("alpine:3.16")
.withCommand(Arrays.asList("sh"))
.withArgs(Arrays.asList("-c", 
"echo 'Generating pipeline report...' && " +
"ls -la $(workspaces.test-reports.path)"))
.build()
))
.endSpec()
.build())
.withWorkspaces(Arrays.asList(
new WorkspacePipelineTaskBindingBuilder()
.withName("reports")
.withWorkspace("test-reports")
.build()
))
.build();
}
private PipelineTask createCustomTask(CustomTask customTask) {
PipelineTaskBuilder builder = new PipelineTaskBuilder()
.withName(customTask.getName());
if (customTask.getImage() != null) {
builder.withTaskSpec(new EmbeddedTaskBuilder()
.withNewSpec()
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName(customTask.getName())
.withImage(customTask.getImage())
.withCommand(customTask.getCommand())
.withArgs(customTask.getArgs())
.withEnv(convertEnvVars(customTask.getEnv()))
.build()
))
.endSpec()
.build());
}
return builder.build();
}
private List<io.fabric8.kubernetes.api.model.EnvVar> convertEnvVars(Map<String, String> env) {
if (env == null) return Collections.emptyList();
List<io.fabric8.kubernetes.api.model.EnvVar> envVars = new ArrayList<>();
env.forEach((key, value) -> 
envVars.add(new io.fabric8.kubernetes.api.model.EnvVarBuilder()
.withName(key)
.withValue(value)
.build()));
return envVars;
}
private Map<String, String> createLabels(JavaPipelineSpec spec) {
Map<String, String> labels = new HashMap<>();
labels.put("app", spec.getName());
labels.put("component", "pipeline");
labels.put("language", "java");
labels.put("version", "v1");
labels.put("managed-by", "tekton-java-client");
return labels;
}
private Map<String, String> createAnnotations(JavaPipelineSpec spec) {
Map<String, String> annotations = new HashMap<>();
annotations.put("description", spec.getDescription());
annotations.put("git-url", spec.getGitUrl());
annotations.put("created-by", "java-tekton-builder");
annotations.put("timestamp", String.valueOf(System.currentTimeMillis()));
return annotations;
}
private io.fabric8.kubernetes.api.model.ObjectMeta createRunMetadata(String pipelineName) {
io.fabric8.kubernetes.api.model.ObjectMeta metadata = 
new io.fabric8.kubernetes.api.model.ObjectMeta();
metadata.setGenerateName(pipelineName + "-run-");
metadata.setNamespace("default");
return metadata;
}
private PipelineRunSpec createPipelineRunSpec(String pipelineName, 
Map<String, String> params,
Map<String, String> workspaces) {
PipelineRunSpec spec = new PipelineRunSpec();
spec.setPipelineRef(new PipelineRefBuilder()
.withName(pipelineName)
.build());
// Set parameters
List<io.fabric8.tekton.pipeline.v1.Param> paramList = new ArrayList<>();
if (params != null) {
params.forEach((key, value) -> 
paramList.add(new io.fabric8.tekton.pipeline.v1.ParamBuilder()
.withName(key)
.withValue(new io.fabric8.tekton.pipeline.v1.ParamValue(value))
.build()));
}
spec.setParams(paramList);
// Set workspaces
List<io.fabric8.tekton.pipeline.v1.WorkspaceBinding> workspaceList = new ArrayList<>();
if (workspaces != null) {
workspaces.forEach((key, value) -> 
workspaceList.add(new io.fabric8.tekton.pipeline.v1.WorkspaceBindingBuilder()
.withName(key)
.withNewPersistentVolumeClaim()
.withClaimName(value)
.endPersistentVolumeClaim()
.build()));
}
spec.setWorkspaces(workspaceList);
return spec;
}
}
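For comparison, the PipelineRun produced by buildPipelineRun corresponds roughly to this YAML (PVC, parameter, and pipeline names are illustrative):

```yaml
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  generateName: orders-service-pipeline-run-
  namespace: default
spec:
  pipelineRef:
    name: orders-service-pipeline
  params:
    - name: GIT_BRANCH
      value: main
  workspaces:
    - name: source
      persistentVolumeClaim:
        claimName: source-pvc
```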

Supply Chain Orchestration

1. Supply Chain Service
package com.example.tekton.service;
import com.example.tekton.builder.PipelineBuilder;
import com.example.tekton.model.JavaPipelineSpec;
import io.fabric8.tekton.client.TektonClient;
import io.fabric8.tekton.pipeline.v1.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import java.util.*;
@Service
public class SupplyChainService {
private static final Logger logger = LoggerFactory.getLogger(SupplyChainService.class);
private final TektonClient tektonClient;
private final PipelineBuilder pipelineBuilder;
public SupplyChainService(TektonClient tektonClient, PipelineBuilder pipelineBuilder) {
this.tektonClient = tektonClient;
this.pipelineBuilder = pipelineBuilder;
}
public Pipeline createPipeline(JavaPipelineSpec spec) {
try {
Pipeline pipeline = pipelineBuilder.buildJavaPipeline(spec);
Pipeline created = tektonClient.v1().pipelines().resource(pipeline).create();
logger.info("Created pipeline: {}", created.getMetadata().getName());
return created;
} catch (Exception e) {
logger.error("Failed to create pipeline: {}", spec.getName(), e);
throw new SupplyChainException("Failed to create pipeline", e);
}
}
public PipelineRun triggerPipeline(String pipelineName, 
Map<String, String> params,
Map<String, String> workspaces) {
try {
PipelineRun pipelineRun = pipelineBuilder.buildPipelineRun(pipelineName, params, workspaces);
PipelineRun created = tektonClient.v1().pipelineRuns().resource(pipelineRun).create();
logger.info("Triggered pipeline run: {}", created.getMetadata().getName());
return created;
} catch (Exception e) {
logger.error("Failed to trigger pipeline: {}", pipelineName, e);
throw new SupplyChainException("Failed to trigger pipeline", e);
}
}
public PipelineRunStatus getPipelineRunStatus(String runName) {
try {
PipelineRun pipelineRun = tektonClient.v1().pipelineRuns().withName(runName).get();
return pipelineRun != null ? pipelineRun.getStatus() : null;
} catch (Exception e) {
logger.error("Failed to get pipeline run status: {}", runName, e);
throw new SupplyChainException("Failed to get pipeline run status", e);
}
}
public List<PipelineRun> listPipelineRuns(String pipelineName) {
try {
return tektonClient.v1().pipelineRuns()
.withLabel("tekton.dev/pipeline", pipelineName)
.list()
.getItems();
} catch (Exception e) {
logger.error("Failed to list pipeline runs for: {}", pipelineName, e);
throw new SupplyChainException("Failed to list pipeline runs", e);
}
}
public void deletePipeline(String pipelineName) {
try {
// delete() returns status details in fabric8 6.x rather than a boolean
List<io.fabric8.kubernetes.api.model.StatusDetails> deleted =
tektonClient.v1().pipelines().withName(pipelineName).delete();
logger.info("Deleted pipeline: {} ({} resource(s))", pipelineName, deleted.size());
} catch (Exception e) {
logger.error("Failed to delete pipeline: {}", pipelineName, e);
throw new SupplyChainException("Failed to delete pipeline", e);
}
}
public void cleanupOldRuns(String pipelineName, int keepLast) {
try {
List<PipelineRun> runs = listPipelineRuns(pipelineName);
runs.sort(Comparator.comparing(run -> 
run.getMetadata().getCreationTimestamp()));
// Keep only the last N runs
if (runs.size() > keepLast) {
for (int i = 0; i < runs.size() - keepLast; i++) {
String runName = runs.get(i).getMetadata().getName();
tektonClient.v1().pipelineRuns().withName(runName).delete();
logger.debug("Deleted old pipeline run: {}", runName);
}
}
} catch (Exception e) {
logger.error("Failed to cleanup old runs for: {}", pipelineName, e);
throw new SupplyChainException("Failed to cleanup old runs", e);
}
}
public PipelineMetrics getPipelineMetrics(String pipelineName) {
try {
List<PipelineRun> runs = listPipelineRuns(pipelineName);
long totalRuns = runs.size();
// A PipelineRun's outcome lives in the "Succeeded" condition: status "True"
// means success, "False" means failure
long successfulRuns = runs.stream()
.filter(run -> run.getStatus() != null &&
run.getStatus().getConditions() != null &&
run.getStatus().getConditions().stream().anyMatch(c ->
"Succeeded".equals(c.getType()) && "True".equals(c.getStatus())))
.count();
long failedRuns = runs.stream()
.filter(run -> run.getStatus() != null &&
run.getStatus().getConditions() != null &&
run.getStatus().getConditions().stream().anyMatch(c ->
"Succeeded".equals(c.getType()) && "False".equals(c.getStatus())))
.count();
long runningRuns = totalRuns - successfulRuns - failedRuns;
double successRate = totalRuns > 0 ? (double) successfulRuns / totalRuns * 100 : 0;
// Calculate average duration over completed runs; the fabric8 model exposes
// start/completion times as RFC 3339 strings, so parse them with java.time
List<PipelineRun> completedRuns = runs.stream()
.filter(run -> run.getStatus() != null &&
run.getStatus().getCompletionTime() != null &&
run.getStatus().getStartTime() != null)
.toList();
long totalDuration = completedRuns.stream()
.mapToLong(run -> java.time.Duration.between(
java.time.Instant.parse(run.getStatus().getStartTime()),
java.time.Instant.parse(run.getStatus().getCompletionTime())).toMillis())
.sum();
long avgDuration = !completedRuns.isEmpty() ? totalDuration / completedRuns.size() : 0;
return new PipelineMetrics(
pipelineName, totalRuns, successfulRuns, failedRuns, 
runningRuns, successRate, avgDuration
);
} catch (Exception e) {
logger.error("Failed to get metrics for pipeline: {}", pipelineName, e);
throw new SupplyChainException("Failed to get pipeline metrics", e);
}
}
}
package com.example.tekton.service;
public class PipelineMetrics {
private final String pipelineName;
private final long totalRuns;
private final long successfulRuns;
private final long failedRuns;
private final long runningRuns;
private final double successRate;
private final long averageDurationMs;
public PipelineMetrics(String pipelineName, long totalRuns, long successfulRuns,
long failedRuns, long runningRuns, double successRate,
long averageDurationMs) {
this.pipelineName = pipelineName;
this.totalRuns = totalRuns;
this.successfulRuns = successfulRuns;
this.failedRuns = failedRuns;
this.runningRuns = runningRuns;
this.successRate = successRate;
this.averageDurationMs = averageDurationMs;
}
// Getters
public String getPipelineName() { return pipelineName; }
public long getTotalRuns() { return totalRuns; }
public long getSuccessfulRuns() { return successfulRuns; }
public long getFailedRuns() { return failedRuns; }
public long getRunningRuns() { return runningRuns; }
public double getSuccessRate() { return successRate; }
public long getAverageDurationMs() { return averageDurationMs; }
}
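Assuming the status timestamps are RFC 3339 strings (as in recent fabric8 models), the duration math in getPipelineMetrics reduces to java.time parsing. A self-contained sketch of that calculation (sample timestamps are invented):

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;

public class MetricsDemo {

    // Average wall-clock duration (ms) over completed runs; each entry is
    // {startTime, completionTime} as RFC 3339 strings, null = still running.
    static long averageDurationMs(List<String[]> runs) {
        long total = 0;
        int completed = 0;
        for (String[] run : runs) {
            if (run[0] == null || run[1] == null) continue; // skip in-progress runs
            total += Duration.between(Instant.parse(run[0]), Instant.parse(run[1])).toMillis();
            completed++;
        }
        return completed > 0 ? total / completed : 0;
    }

    public static void main(String[] args) {
        List<String[]> runs = List.of(
                new String[]{"2024-01-01T10:00:00Z", "2024-01-01T10:05:00Z"}, // 5 min
                new String[]{"2024-01-01T11:00:00Z", "2024-01-01T11:15:00Z"}, // 15 min
                new String[]{"2024-01-01T12:00:00Z", null}                    // in progress
        );
        System.out.println(averageDurationMs(runs)); // prints 600000 (10 minutes)
    }
}
```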
package com.example.tekton.service;
public class SupplyChainException extends RuntimeException {
public SupplyChainException(String message) {
super(message);
}
public SupplyChainException(String message, Throwable cause) {
super(message, cause);
}
}
2. Custom Task Controller
package com.example.tekton.controller;
import com.example.tekton.service.SupplyChainService;
import io.fabric8.tekton.pipeline.v1.Task;
import io.fabric8.tekton.pipeline.v1.TaskBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.*;
import java.util.Arrays;
@RestController
@RequestMapping("/api/tekton/tasks")
public class TaskController {
private static final Logger logger = LoggerFactory.getLogger(TaskController.class);
private final SupplyChainService supplyChainService;
public TaskController(SupplyChainService supplyChainService) {
this.supplyChainService = supplyChainService;
}
@PostMapping("/java-quality-check")
public Task createJavaQualityCheckTask() {
Task task = new TaskBuilder()
.withNewMetadata()
.withName("java-quality-check")
.withNamespace("default")
.endMetadata()
.withNewSpec()
.withParams(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.ParamSpecBuilder()
.withName("SOURCE_DIR")
.withType("string")
.withDescription("Source directory")
.build(),
new io.fabric8.tekton.pipeline.v1.ParamSpecBuilder()
.withName("QUALITY_THRESHOLD")
.withType("string")
.withDescription("Quality gate threshold")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue("80"))
.build()
))
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("checkstyle")
.withImage("checkstyle/checkstyle:latest")
.withCommand(Arrays.asList("java"))
.withArgs(Arrays.asList(
"-jar", "/checkstyle.jar",
"-c", "/google_checks.xml",
"$(params.SOURCE_DIR)"
))
.build(),
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("pmd")
.withImage("pmd/pmd:latest")
.withCommand(Arrays.asList("java"))
.withArgs(Arrays.asList(
"-jar", "/pmd.jar",
"check",
"-d", "$(params.SOURCE_DIR)",
"-R", "rulesets/java/quickstart.xml"
))
.build(),
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("quality-gate")
.withImage("alpine:3.16")
.withCommand(Arrays.asList("sh"))
.withArgs(Arrays.asList("-c",
"echo 'Quality check completed' && " +
"if [ $(cat /reports/quality-score.txt) -lt $(params.QUALITY_THRESHOLD) ]; then " +
"  echo 'Quality gate failed'; exit 1; " +
"else " +
"  echo 'Quality gate passed'; fi"
))
.build()
))
.withWorkspaces(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.WorkspaceDeclarationBuilder()
.withName("source")
.withDescription("Source code")
.build(),
new io.fabric8.tekton.pipeline.v1.WorkspaceDeclarationBuilder()
.withName("reports")
.withDescription("Quality reports")
.build()
))
.endSpec()
.build();
logger.info("Created Java quality check task");
return task;
}
@PostMapping("/security-scan")
public Task createSecurityScanTask() {
Task task = new TaskBuilder()
.withNewMetadata()
.withName("java-security-scan")
.withNamespace("default")
.endMetadata()
.withNewSpec()
.withParams(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.ParamSpecBuilder()
.withName("SCAN_TYPE")
.withType("string")
.withDescription("Type of security scan")
.withDefault(new io.fabric8.tekton.pipeline.v1.ParamValue("all"))
.build()
))
.withSteps(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("dependency-check")
.withImage("owasp/dependency-check:latest")
.withCommand(Arrays.asList("dependency-check.sh"))
.withArgs(Arrays.asList(
"--scan", ".",
"--format", "HTML",
"--out", "/reports",
"--project", "$(context.taskRun.name)"
))
.build(),
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("secret-scan")
.withImage("zricethezav/gitleaks:latest")
.withCommand(Arrays.asList("gitleaks"))
.withArgs(Arrays.asList(
"detect",
"--source", ".",
"--report-path", "/reports/gitleaks.json"
))
.build(),
new io.fabric8.tekton.pipeline.v1.StepBuilder()
.withName("generate-report")
.withImage("alpine:3.16")
.withCommand(Arrays.asList("sh"))
.withArgs(Arrays.asList("-c",
"echo 'Security scan completed' && " +
"echo 'Vulnerabilities found: $(cat /reports/vulnerabilities.txt)'"
))
.build()
))
.withWorkspaces(Arrays.asList(
new io.fabric8.tekton.pipeline.v1.WorkspaceDeclarationBuilder()
.withName("source")
.withDescription("Source code")
.build(),
new io.fabric8.tekton.pipeline.v1.WorkspaceDeclarationBuilder()
.withName("reports")
.withDescription("Security reports")
.build()
))
.endSpec()
.build();
logger.info("Created Java security scan task");
return task;
}
}
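For reference, the builder in createJavaQualityCheckTask produces a Task roughly equivalent to the following YAML. Paths are written with $(workspaces.*.path) substitutions so steps read and write inside the declared workspaces; the quality-score.txt file is assumed to be produced by an earlier step in your pipeline:

```yaml
apiVersion: tekton.dev/v1
kind: Task
metadata:
  name: java-quality-check
  namespace: default
spec:
  params:
    - name: SOURCE_DIR
      type: string
      description: Source directory
    - name: QUALITY_THRESHOLD
      type: string
      description: Quality gate threshold
      default: "80"
  workspaces:
    - name: source
      description: Source code
    - name: reports
      description: Quality reports
  steps:
    - name: checkstyle
      image: checkstyle/checkstyle:latest
      command: ["java"]
      args: ["-jar", "/checkstyle.jar", "-c", "/google_checks.xml", "$(params.SOURCE_DIR)"]
    - name: pmd
      image: pmd/pmd:latest
      command: ["java"]
      args: ["-jar", "/pmd.jar", "check", "-d", "$(params.SOURCE_DIR)", "-R", "rulesets/java/quickstart.xml"]
    - name: quality-gate
      image: alpine:3.16
      script: |
        echo 'Quality check completed'
        if [ "$(cat $(workspaces.reports.path)/quality-score.txt)" -lt "$(params.QUALITY_THRESHOLD)" ]; then
          echo 'Quality gate failed'; exit 1
        fi
        echo 'Quality gate passed'
```

Keeping a YAML rendering like this under version control alongside the Java builder makes it easy to diff what the controller actually deploys.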

Supply Chain Automation

1. Automated Pipeline Manager
package com.example.tekton.automation;
import com.example.tekton.model.JavaPipelineSpec;
import com.example.tekton.service.SupplyChainService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import java.util.*;
@Component
public class PipelineAutomationManager {
private static final Logger logger = LoggerFactory.getLogger(PipelineAutomationManager.class);
private final SupplyChainService supplyChainService;
private final Map<String, PipelineConfiguration> pipelineConfigurations;
public PipelineAutomationManager(SupplyChainService supplyChainService) {
this.supplyChainService = supplyChainService;
this.pipelineConfigurations = new HashMap<>();
}
public void registerPipeline(PipelineConfiguration config) {
pipelineConfigurations.put(config.getPipelineName(), config);
logger.info("Registered pipeline configuration: {}", config.getPipelineName());
}
@Scheduled(cron = "0 */5 * * * *") // Every 5 minutes
public void monitorAndTriggerPipelines() {
for (PipelineConfiguration config : pipelineConfigurations.values()) {
if (shouldTriggerPipeline(config)) {
triggerPipeline(config);
}
}
}
@Scheduled(cron = "0 0 */6 * * *") // Every 6 hours
public void cleanupOldPipelineRuns() {
for (PipelineConfiguration config : pipelineConfigurations.values()) {
try {
supplyChainService.cleanupOldRuns(
config.getPipelineName(), 
config.getKeepLastRuns()
);
logger.debug("Cleaned up old runs for pipeline: {}", config.getPipelineName());
} catch (Exception e) {
logger.error("Failed to cleanup old runs for: {}", config.getPipelineName(), e);
}
}
}
@Scheduled(cron = "0 */30 * * * *") // Every 30 minutes
public void monitorPipelineHealth() {
for (PipelineConfiguration config : pipelineConfigurations.values()) {
try {
var metrics = supplyChainService.getPipelineMetrics(config.getPipelineName());
if (metrics.getSuccessRate() < config.getMinimumSuccessRate()) {
logger.warn("Pipeline health check failed for {}: success rate {}% < {}%",
config.getPipelineName(), metrics.getSuccessRate(), config.getMinimumSuccessRate());
// Trigger alert
}
} catch (Exception e) {
logger.error("Failed to monitor pipeline health for: {}", config.getPipelineName(), e);
}
}
}
private boolean shouldTriggerPipeline(PipelineConfiguration config) {
if (!config.isAutoTriggerEnabled()) {
return false;
}
// Check schedule
if (config.getSchedule() != null && !config.getSchedule().isEmpty()) {
// Implement cron schedule checking
// For simplicity, we'll assume scheduled triggering is handled elsewhere
return false;
}
// For webhook-based triggering, this would check for new commits
// This is a simplified version
return checkForNewCommits(config);
}
private boolean checkForNewCommits(PipelineConfiguration config) {
// This would integrate with Git provider APIs
// For now, return false
return false;
}
private void triggerPipeline(PipelineConfiguration config) {
try {
Map<String, String> params = new HashMap<>();
params.put("GIT_URL", config.getGitUrl());
params.put("GIT_BRANCH", config.getGitBranch());
params.put("MAVEN_VERSION", config.getMavenVersion());
params.put("JAVA_VERSION", config.getJavaVersion());
params.put("DOCKER_REGISTRY", config.getDockerRegistry());
params.put("DOCKER_REPOSITORY", config.getDockerRepository());
Map<String, String> workspaces = new HashMap<>();
workspaces.put("source", config.getWorkspacePvc());
workspaces.put("maven-repo", "maven-repo-pvc");
workspaces.put("test-reports", "test-reports-pvc");
supplyChainService.triggerPipeline(config.getPipelineName(), params, workspaces);
logger.info("Auto-triggered pipeline: {}", config.getPipelineName());
} catch (Exception e) {
logger.error("Failed to auto-trigger pipeline: {}", config.getPipelineName(), e);
}
}
}
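checkForNewCommits above is left as a stub. One dependency-free way to implement it is to shell out to git ls-remote and compare the SHA advertised for the configured branch against the last one seen. This is a sketch, not the service's actual implementation: it assumes git is available on the PATH, and the parseRemoteSha helper is the testable core.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.HashMap;
import java.util.Map;

public class RemoteCommitChecker {
    // Last SHA observed per "url#branch" key; the first observation
    // only records a baseline and does not count as a new commit.
    private final Map<String, String> lastSeen = new HashMap<>();

    /** Returns true when the remote branch head differs from the last observed SHA. */
    public boolean hasNewCommit(String gitUrl, String branch) throws Exception {
        Process p = new ProcessBuilder("git", "ls-remote", gitUrl, "refs/heads/" + branch)
                .redirectErrorStream(true).start();
        String sha;
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            sha = parseRemoteSha(r.readLine(), branch);
        }
        p.waitFor();
        if (sha == null) return false; // branch not found or command failed
        String previous = lastSeen.put(gitUrl + "#" + branch, sha);
        return previous != null && !previous.equals(sha);
    }

    /** Parses one "sha<TAB>refs/heads/branch" line of git ls-remote output. */
    static String parseRemoteSha(String line, String branch) {
        if (line == null) return null;
        String[] parts = line.split("\\s+");
        if (parts.length == 2 && parts[1].equals("refs/heads/" + branch)) {
            return parts[0];
        }
        return null;
    }
}
```

Polling like this pairs naturally with the five-minute monitorAndTriggerPipelines schedule; webhooks (next section) remain the lower-latency option.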
package com.example.tekton.automation;
import java.util.Map;
public class PipelineConfiguration {
private String pipelineName;
private String gitUrl;
private String gitBranch;
private String mavenVersion;
private String javaVersion;
private String dockerRegistry;
private String dockerRepository;
private String workspacePvc;
private boolean autoTriggerEnabled;
private String schedule; // Cron expression
private int keepLastRuns;
private double minimumSuccessRate;
private Map<String, String> environment;
// Getters and setters
public String getPipelineName() { return pipelineName; }
public void setPipelineName(String pipelineName) { this.pipelineName = pipelineName; }
public String getGitUrl() { return gitUrl; }
public void setGitUrl(String gitUrl) { this.gitUrl = gitUrl; }
public String getGitBranch() { return gitBranch; }
public void setGitBranch(String gitBranch) { this.gitBranch = gitBranch; }
public String getMavenVersion() { return mavenVersion; }
public void setMavenVersion(String mavenVersion) { this.mavenVersion = mavenVersion; }
public String getJavaVersion() { return javaVersion; }
public void setJavaVersion(String javaVersion) { this.javaVersion = javaVersion; }
public String getDockerRegistry() { return dockerRegistry; }
public void setDockerRegistry(String dockerRegistry) { this.dockerRegistry = dockerRegistry; }
public String getDockerRepository() { return dockerRepository; }
public void setDockerRepository(String dockerRepository) { this.dockerRepository = dockerRepository; }
public String getWorkspacePvc() { return workspacePvc; }
public void setWorkspacePvc(String workspacePvc) { this.workspacePvc = workspacePvc; }
public boolean isAutoTriggerEnabled() { return autoTriggerEnabled; }
public void setAutoTriggerEnabled(boolean autoTriggerEnabled) { this.autoTriggerEnabled = autoTriggerEnabled; }
public String getSchedule() { return schedule; }
public void setSchedule(String schedule) { this.schedule = schedule; }
public int getKeepLastRuns() { return keepLastRuns; }
public void setKeepLastRuns(int keepLastRuns) { this.keepLastRuns = keepLastRuns; }
public double getMinimumSuccessRate() { return minimumSuccessRate; }
public void setMinimumSuccessRate(double minimumSuccessRate) { this.minimumSuccessRate = minimumSuccessRate; }
public Map<String, String> getEnvironment() { return environment; }
public void setEnvironment(Map<String, String> environment) { this.environment = environment; }
}
2. Git Webhook Handler
package com.example.tekton.automation;
import com.example.tekton.service.SupplyChainService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.*;
import java.util.HashMap;
import java.util.Map;
@RestController
@RequestMapping("/api/webhook")
public class GitWebhookHandler {
private static final Logger logger = LoggerFactory.getLogger(GitWebhookHandler.class);
private final SupplyChainService supplyChainService;
private final Map<String, PipelineConfiguration> pipelineRegistry;
public GitWebhookHandler(SupplyChainService supplyChainService) {
this.supplyChainService = supplyChainService;
this.pipelineRegistry = new HashMap<>();
}
@PostMapping("/github")
public void handleGitHubWebhook(@RequestBody GitHubWebhookPayload payload,
@RequestHeader("X-GitHub-Event") String eventType) {
logger.info("Received GitHub webhook event: {}", eventType);
if ("push".equals(eventType)) {
handlePushEvent(payload);
} else if ("pull_request".equals(eventType)) {
handlePullRequestEvent(payload);
}
}
@PostMapping("/gitlab")
public void handleGitLabWebhook(@RequestBody GitLabWebhookPayload payload,
@RequestHeader("X-Gitlab-Event") String eventType) {
logger.info("Received GitLab webhook event: {}", eventType);
if ("Push Hook".equals(eventType)) {
handlePushEvent(payload);
} else if ("Merge Request Hook".equals(eventType)) {
handleMergeRequestEvent(payload);
}
}
private void handleMergeRequestEvent(GitLabWebhookPayload payload) {
// GitLab counterpart of handlePullRequestEvent; kept as a logging stub
// because the simplified GitLabWebhookPayload below does not model
// merge request attributes.
logger.info("Merge request event received for project: {}",
payload.getProject() != null ? payload.getProject().getWebUrl() : "unknown");
}
private void handlePushEvent(Object payload) {
String repositoryUrl = extractRepositoryUrl(payload);
String branch = extractBranch(payload);
// Find pipelines configured for this repository and branch
pipelineRegistry.values().stream()
.filter(config -> matchesRepository(config, repositoryUrl, branch))
.forEach(config -> triggerPipelineForPush(config, payload));
}
private void handlePullRequestEvent(GitHubWebhookPayload payload) {
String repositoryUrl = payload.getRepository().getHtmlUrl();
String branch = payload.getPullRequest().getHead().getRef();
String targetBranch = payload.getPullRequest().getBase().getRef();
// Trigger PR validation pipeline
Map<String, String> params = new HashMap<>();
params.put("GIT_URL", repositoryUrl);
params.put("GIT_BRANCH", branch);
params.put("TARGET_BRANCH", targetBranch);
params.put("PULL_REQUEST_ID", String.valueOf(payload.getPullRequest().getNumber()));
// Find and trigger PR validation pipeline
pipelineRegistry.values().stream()
.filter(config -> config.getPipelineName().contains("pr-validation"))
.findFirst()
.ifPresent(config -> 
supplyChainService.triggerPipeline(config.getPipelineName(), params, new HashMap<>()));
}
private boolean matchesRepository(PipelineConfiguration config, 
String repositoryUrl, String branch) {
return config.getGitUrl().equals(repositoryUrl) && 
config.getGitBranch().equals(branch);
}
private void triggerPipelineForPush(PipelineConfiguration config, Object payload) {
Map<String, String> params = new HashMap<>();
params.put("GIT_URL", config.getGitUrl());
params.put("GIT_BRANCH", config.getGitBranch());
params.put("COMMIT_SHA", extractCommitSha(payload));
params.put("COMMIT_AUTHOR", extractCommitAuthor(payload));
params.put("COMMIT_MESSAGE", extractCommitMessage(payload));
Map<String, String> workspaces = new HashMap<>();
workspaces.put("source", config.getWorkspacePvc());
supplyChainService.triggerPipeline(config.getPipelineName(), params, workspaces);
logger.info("Triggered pipeline {} for push to {}", 
config.getPipelineName(), config.getGitBranch());
}
private String extractRepositoryUrl(Object payload) {
// Extract repository URL from webhook payload
// Implementation depends on Git provider
if (payload instanceof GitHubWebhookPayload) {
return ((GitHubWebhookPayload) payload).getRepository().getHtmlUrl();
}
return "";
}
private String extractBranch(Object payload) {
// Extract branch from webhook payload
if (payload instanceof GitHubWebhookPayload) {
String ref = ((GitHubWebhookPayload) payload).getRef();
return ref.replace("refs/heads/", "");
}
return "";
}
private String extractCommitSha(Object payload) {
if (payload instanceof GitHubWebhookPayload) {
return ((GitHubWebhookPayload) payload).getAfter();
}
return "";
}
private String extractCommitAuthor(Object payload) {
if (payload instanceof GitHubWebhookPayload) {
return ((GitHubWebhookPayload) payload).getPusher().getName();
}
return "";
}
private String extractCommitMessage(Object payload) {
if (payload instanceof GitHubWebhookPayload) {
return ((GitHubWebhookPayload) payload).getHeadCommit().getMessage();
}
return "";
}
}
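Note that matchesRepository compares URLs with equals, which fails as soon as the configured URL and the payload URL differ only by a .git suffix, a trailing slash, or an ssh-style form. A hedged normalization helper you could call from matchesRepository (the git@ handling is an assumption about how your Git hosts format clone URLs):

```java
public final class GitUrlNormalizer {
    private GitUrlNormalizer() {}

    /** Normalizes common clone-URL variants so equivalent repositories compare equal. */
    public static String normalize(String url) {
        if (url == null) return "";
        String s = url.trim();
        // git@host:org/repo.git -> host/org/repo.git
        if (s.startsWith("git@")) {
            s = s.substring(4).replaceFirst(":", "/");
        }
        // Drop the http(s) scheme, trailing slash, and .git suffix
        s = s.replaceFirst("^https?://", "");
        if (s.endsWith("/")) s = s.substring(0, s.length() - 1);
        if (s.endsWith(".git")) s = s.substring(0, s.length() - 4);
        return s.toLowerCase();
    }

    public static boolean sameRepository(String a, String b) {
        return normalize(a).equals(normalize(b));
    }
}
```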
// Webhook payload models
class GitHubWebhookPayload {
private String ref;
private String after;
private Repository repository;
private Pusher pusher;
private HeadCommit headCommit;
private PullRequest pullRequest;
// Getters and setters
public String getRef() { return ref; }
public void setRef(String ref) { this.ref = ref; }
public String getAfter() { return after; }
public void setAfter(String after) { this.after = after; }
public Repository getRepository() { return repository; }
public void setRepository(Repository repository) { this.repository = repository; }
public Pusher getPusher() { return pusher; }
public void setPusher(Pusher pusher) { this.pusher = pusher; }
public HeadCommit getHeadCommit() { return headCommit; }
public void setHeadCommit(HeadCommit headCommit) { this.headCommit = headCommit; }
public PullRequest getPullRequest() { return pullRequest; }
public void setPullRequest(PullRequest pullRequest) { this.pullRequest = pullRequest; }
static class Repository {
private String htmlUrl;
public String getHtmlUrl() { return htmlUrl; }
public void setHtmlUrl(String htmlUrl) { this.htmlUrl = htmlUrl; }
}
static class Pusher {
private String name;
public String getName() { return name; }
public void setName(String name) { this.name = name; }
}
static class HeadCommit {
private String message;
public String getMessage() { return message; }
public void setMessage(String message) { this.message = message; }
}
static class PullRequest {
private int number;
private Head head;
private Base base;
public int getNumber() { return number; }
public void setNumber(int number) { this.number = number; }
public Head getHead() { return head; }
public void setHead(Head head) { this.head = head; }
public Base getBase() { return base; }
public void setBase(Base base) { this.base = base; }
}
static class Head {
private String ref;
public String getRef() { return ref; }
public void setRef(String ref) { this.ref = ref; }
}
static class Base {
private String ref;
public String getRef() { return ref; }
public void setRef(String ref) { this.ref = ref; }
}
}
class GitLabWebhookPayload {
// GitLab webhook payload structure
// Simplified for example
private String ref;
private String after;
private Project project;
// Getters and setters
public String getRef() { return ref; }
public void setRef(String ref) { this.ref = ref; }
public String getAfter() { return after; }
public void setAfter(String after) { this.after = after; }
public Project getProject() { return project; }
public void setProject(Project project) { this.project = project; }
static class Project {
private String webUrl;
public String getWebUrl() { return webUrl; }
public void setWebUrl(String webUrl) { this.webUrl = webUrl; }
}
}
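The webhook endpoints above accept any caller. GitHub signs each delivery with the shared secret (the GITHUB_WEBHOOK_SECRET configured later) and sends the result in the X-Hub-Signature-256 header as sha256=&lt;hex&gt;. A minimal verification sketch using only the JDK:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public final class WebhookSignatureVerifier {
    private WebhookSignatureVerifier() {}

    /** Computes the expected X-Hub-Signature-256 value for a raw request body. */
    public static String sign(String secret, byte[] body) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
        byte[] digest = mac.doFinal(body);
        StringBuilder hex = new StringBuilder("sha256=");
        for (byte b : digest) hex.append(String.format("%02x", b));
        return hex.toString();
    }

    /** Constant-time comparison against the header the sender supplied. */
    public static boolean verify(String secret, byte[] body, String signatureHeader) throws Exception {
        if (signatureHeader == null) return false;
        return MessageDigest.isEqual(
                sign(secret, body).getBytes(StandardCharsets.UTF_8),
                signatureHeader.getBytes(StandardCharsets.UTF_8));
    }
}
```

In the handler this check would run against the raw request body before deserialization, rejecting the delivery when verify returns false; GitLab uses a simpler X-Gitlab-Token equality check instead.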

REST API Controller

package com.example.tekton.controller;
import com.example.tekton.automation.PipelineConfiguration;
import com.example.tekton.model.JavaPipelineSpec;
import com.example.tekton.service.PipelineMetrics;
import com.example.tekton.service.PipelineRunStatus;
import com.example.tekton.service.SupplyChainService;
import io.fabric8.tekton.pipeline.v1.Pipeline;
import io.fabric8.tekton.pipeline.v1.PipelineRun;
import org.springframework.web.bind.annotation.*;
import java.util.List;
import java.util.Map;
@RestController
@RequestMapping("/api/pipelines")
public class PipelineController {
private final SupplyChainService supplyChainService;
public PipelineController(SupplyChainService supplyChainService) {
this.supplyChainService = supplyChainService;
}
@PostMapping
public Pipeline createPipeline(@RequestBody JavaPipelineSpec spec) {
return supplyChainService.createPipeline(spec);
}
@PostMapping("/{pipelineName}/trigger")
public PipelineRun triggerPipeline(@PathVariable String pipelineName,
@RequestBody Map<String, String> triggerRequest) {
Map<String, String> params = triggerRequest.get("params") != null ? 
(Map<String, String>) triggerRequest.get("params") : Map.of();
Map<String, String> workspaces = triggerRequest.get("workspaces") != null ? 
(Map<String, String>) triggerRequest.get("workspaces") : Map.of();
return supplyChainService.triggerPipeline(pipelineName, params, workspaces);
}
@GetMapping("/{pipelineName}/runs")
public List<PipelineRun> listPipelineRuns(@PathVariable String pipelineName) {
return supplyChainService.listPipelineRuns(pipelineName);
}
@GetMapping("/{pipelineName}/runs/{runName}/status")
public PipelineRunStatus getPipelineRunStatus(@PathVariable String pipelineName,
@PathVariable String runName) {
return supplyChainService.getPipelineRunStatus(runName);
}
@GetMapping("/{pipelineName}/metrics")
public PipelineMetrics getPipelineMetrics(@PathVariable String pipelineName) {
return supplyChainService.getPipelineMetrics(pipelineName);
}
@DeleteMapping("/{pipelineName}")
public void deletePipeline(@PathVariable String pipelineName) {
supplyChainService.deletePipeline(pipelineName);
}
@PostMapping("/{pipelineName}/cleanup")
public void cleanupPipelineRuns(@PathVariable String pipelineName,
@RequestParam(defaultValue = "10") int keepLast) {
supplyChainService.cleanupOldRuns(pipelineName, keepLast);
}
}
// Response wrapper
class ApiResponse<T> {
private boolean success;
private String message;
private T data;
public ApiResponse(boolean success, String message, T data) {
this.success = success;
this.message = message;
this.data = data;
}
public static <T> ApiResponse<T> success(T data) {
return new ApiResponse<>(true, "Success", data);
}
public static <T> ApiResponse<T> error(String message) {
return new ApiResponse<>(false, message, null);
}
// Getters
public boolean isSuccess() { return success; }
public String getMessage() { return message; }
public T getData() { return data; }
}

Configuration

1. Application Properties
# application.yml
kubernetes:
master:
url: ${KUBERNETES_MASTER:https://kubernetes.default.svc}
namespace: ${KUBERNETES_NAMESPACE:default}
tekton:
pipeline:
default-maven-version: 3.8.8
default-java-version: 17
timeout: 3600
retention:
runs: 10
days: 7
github:
webhook:
secret: ${GITHUB_WEBHOOK_SECRET:}
enabled: true
gitlab:
webhook:
secret: ${GITLAB_WEBHOOK_SECRET:}
enabled: false
security:
scanning:
enabled: true
trivy-image: aquasec/trivy:latest
grype-image: anchore/grype:latest
quality:
gates:
test-coverage: 80
max-critical-issues: 0
max-major-issues: 10
logging:
level:
com.example.tekton: INFO
2. Kubernetes Resources
# tekton-resources.yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
name: tekton-workspace-pvc
spec:
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
name: maven-repo-pvc
spec:
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 5Gi
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
name: test-reports-pvc
spec:
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 2Gi
---
apiVersion: v1
kind: ServiceAccount
metadata:
name: tekton-java-serviceaccount
secrets:
- name: tekton-secrets
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
name: tekton-java-clusterrole
rules:
- apiGroups: ["tekton.dev"]
resources: ["pipelines", "pipelineruns", "tasks", "taskruns"]
verbs: ["get", "list", "watch", "create", "update", "patch", "delete"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
name: tekton-java-clusterrolebinding
subjects:
- kind: ServiceAccount
name: tekton-java-serviceaccount
namespace: default
roleRef:
kind: ClusterRole
name: tekton-java-clusterrole
apiGroup: rbac.authorization.k8s.io

Testing

1. Unit Tests
package com.example.tekton;
import com.example.tekton.builder.PipelineBuilder;
import com.example.tekton.model.JavaPipelineSpec;
import io.fabric8.tekton.pipeline.v1.Pipeline;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;
class PipelineBuilderTest {
@Test
void testBuildJavaPipeline() {
PipelineBuilder builder = new PipelineBuilder();
JavaPipelineSpec spec = new JavaPipelineSpec();
spec.setName("test-app");
spec.setNamespace("default");
spec.setGitUrl("https://github.com/test/test-app.git");
spec.setGitBranch("main");
spec.setMavenVersion("3.8.8");
spec.setJavaVersion("17");
spec.setDockerRegistry("docker.io");
spec.setDockerRepository("test/test-app");
Pipeline pipeline = builder.buildJavaPipeline(spec);
assertNotNull(pipeline);
assertEquals("test-app-pipeline", pipeline.getMetadata().getName());
assertNotNull(pipeline.getSpec());
assertFalse(pipeline.getSpec().getTasks().isEmpty());
}
}

Best Practices

  1. Idempotent Pipelines: Ensure pipelines can be rerun safely
  2. Resource Management: Set appropriate resource limits for tasks
  3. Security: Use Kubernetes secrets for sensitive data
  4. Monitoring: Implement comprehensive logging and monitoring
  5. Testing: Include pipeline validation and testing
  6. Versioning: Version pipeline definitions alongside application code
// Example of pipeline validation
public class PipelineValidator {
public boolean validate(Pipeline pipeline) {
// Required metadata: a non-blank name
if (pipeline == null || pipeline.getMetadata() == null ||
pipeline.getMetadata().getName() == null ||
pipeline.getMetadata().getName().isBlank()) {
return false;
}
// A pipeline must declare at least one task
if (pipeline.getSpec() == null || pipeline.getSpec().getTasks() == null ||
pipeline.getSpec().getTasks().isEmpty()) {
return false;
}
return true;
}
}

Conclusion

Tekton Supply Chain in Java provides:

  • Kubernetes-native CI/CD pipelines with Java integration
  • Comprehensive pipeline definitions for Java applications
  • Security scanning and quality gates integration
  • Automated triggering via webhooks
  • Custom task development for specific requirements
  • Monitoring and metrics collection

By implementing this Tekton-based supply chain, you can achieve a fully automated, secure, and scalable CI/CD pipeline for Java applications with comprehensive security scanning, quality gates, and deployment automation.

Advanced Java Supply Chain Security, Kubernetes Hardening & Runtime Threat Detection

Sigstore Rekor in Java – https://macronepal.com/blog/sigstore-rekor-in-java/
Explains integrating Sigstore Rekor into Java systems to create a transparent, tamper-proof log of software signatures and metadata for verifying supply chain integrity.

Securing Java Applications with Chainguard Wolfi – https://macronepal.com/blog/securing-java-applications-with-chainguard-wolfi-a-comprehensive-guide/
Explains using Chainguard Wolfi minimal container images to reduce vulnerabilities and secure Java applications with hardened, lightweight runtime environments.

Cosign Image Signing in Java Complete Guide – https://macronepal.com/blog/cosign-image-signing-in-java-complete-guide/
Explains how to digitally sign container images using Cosign in Java-based workflows to ensure authenticity and prevent unauthorized modifications.

Secure Supply Chain Enforcement Kyverno Image Verification for Java Containers – https://macronepal.com/blog/secure-supply-chain-enforcement-kyverno-image-verification-for-java-containers/
Explains enforcing Kubernetes policies with Kyverno to verify container image signatures and ensure only trusted Java container images are deployed.

Pod Security Admission in Java Securing Kubernetes Deployments for JVM Applications – https://macronepal.com/blog/pod-security-admission-in-java-securing-kubernetes-deployments-for-jvm-applications/
Explains Kubernetes Pod Security Admission policies that enforce security rules like restricted privileges and safe configurations for Java workloads.

Securing Java Applications at Runtime Kubernetes Security Context – https://macronepal.com/blog/securing-java-applications-at-runtime-a-guide-to-kubernetes-security-context/
Explains how Kubernetes security contexts control runtime permissions, user IDs, and access rights for Java containers to improve isolation.

Process Anomaly Detection in Java Behavioral Monitoring – https://macronepal.com/blog/process-anomaly-detection-in-java-comprehensive-behavioral-monitoring-2/
Explains detecting abnormal runtime behavior in Java applications to identify potential security threats using process monitoring techniques.

Achieving Security Excellence CIS Benchmark Compliance for Java Applications – https://macronepal.com/blog/achieving-security-excellence-implementing-cis-benchmark-compliance-for-java-applications/
Explains applying CIS security benchmarks to Java environments to standardize hardening and improve overall system security posture.

Process Anomaly Detection in Java Behavioral Monitoring – https://macronepal.com/blog/process-anomaly-detection-in-java-comprehensive-behavioral-monitoring/
Explains behavioral monitoring of Java processes to detect anomalies and improve runtime security through continuous observation and analysis.
