In modern applications, long-running tasks like sending emails, processing files, or generating reports shouldn't block user interactions. JobRunr is a modern, distributed background job processing library for Java that makes it incredibly easy to offload these tasks to background threads—and scale them across multiple machines.
What is JobRunr?
JobRunr is an open-source Java library that provides:
- Distributed background job processing
- Scheduled and recurring jobs
- Automatic retries with exponential backoff
- Dashboard for monitoring and management
- Cluster support with automatic load balancing
- Persistence to SQL or NoSQL databases
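To build intuition for the retry behaviour above, here is a minimal, stdlib-only sketch of exponential backoff. The `seed^attempt` rule is an illustrative assumption — JobRunr's actual policy is internal and tunable (see the `retry-back-off-time-seed` property later in this article) — but it shows why successive retries spread out quickly:

```java
import java.time.Duration;

public class BackoffSketch {
    // Illustrative rule: the n-th retry waits seed^n seconds.
    // (Assumption for demonstration; JobRunr's real formula may differ.)
    static Duration delayFor(int attempt, int seed) {
        return Duration.ofSeconds((long) Math.pow(seed, attempt));
    }

    public static void main(String[] args) {
        for (int attempt = 1; attempt <= 4; attempt++) {
            // With seed 3: 3s, 9s, 27s, 81s
            System.out.println("retry " + attempt + " waits "
                    + delayFor(attempt, 3).getSeconds() + "s");
        }
    }
}
```

The key property is that transient failures (a mail server hiccup, a brief DB outage) get rapid early retries, while persistent failures back off instead of hammering the dependency.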
Key Features
- Easy Integration: Just a few lines of code to get started
- Distributed: Scale horizontally across multiple JVMs
- Persistent: Jobs survive application restarts
- Monitoring: Built-in dashboard with real-time insights
- Elastic: Automatically scales based on workload
Architecture Overview
[Your Java App] → [JobRunr Client] → [Storage] ← [JobRunr Server] ← [Dashboard]
       |                |                |               |              |
  Enqueue jobs     Create job       Persist job     Process jobs     Monitor
                    metadata           to DB        in background    progress
Components:
- Client: Enqueues and schedules jobs
- Server: Processes jobs from the storage
- Storage: Persists job metadata (SQL, MongoDB, Redis)
- Dashboard: Web UI for monitoring
Hands-On Tutorial: Implementing Distributed Jobs with JobRunr
Let's build a complete example showing email notifications, file processing, and report generation using JobRunr.
Step 1: Project Setup
Maven Dependencies (pom.xml):
<properties>
<jobrunr.version>6.3.3</jobrunr.version>
</properties>
<dependencies>
<!-- JobRunr Core -->
<dependency>
<groupId>org.jobrunr</groupId>
<artifactId>jobrunr</artifactId>
<version>${jobrunr.version}</version>
</dependency>
<!-- Spring Boot Starter (optional) -->
<dependency>
<groupId>org.jobrunr</groupId>
<artifactId>jobrunr-spring-boot-starter</artifactId>
<version>${jobrunr.version}</version>
</dependency>
<!-- Storage: SQL Database -->
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<scope>runtime</scope>
</dependency>
<!-- For JSON processing in our example jobs -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</dependency>
</dependencies>
Step 2: Configuration
application.properties:
# JobRunr Configuration
org.jobrunr.background-job-server.enabled=true
org.jobrunr.dashboard.enabled=true
org.jobrunr.job-scheduler.enabled=true

# Database Storage (using H2 for example; the Spring Boot starter
# reuses the application's DataSource configured via spring.datasource.*)
org.jobrunr.database.type=sql
spring.datasource.url=jdbc:h2:file:./jobrunr-db;DB_CLOSE_DELAY=-1
spring.datasource.username=sa
spring.datasource.password=

# JobRunr Dashboard
org.jobrunr.dashboard.port=8000

# Server configuration
org.jobrunr.background-job-server.name=MyJobServer
org.jobrunr.background-job-server.worker-count=10
Java Configuration (Alternative):
@Configuration
public class JobRunrConfig {
    @Bean
    public StorageProvider storageProvider(DataSource dataSource) {
        // SqlStorageProviderFactory selects the right SQL dialect for the DataSource
        return SqlStorageProviderFactory.using(dataSource);
    }

    @Bean
    public JobScheduler jobScheduler(StorageProvider storageProvider) {
        // The fluent API wires the scheduler, background job server and dashboard together
        return JobRunr.configure()
                .useStorageProvider(storageProvider)
                .useBackgroundJobServer()
                .useDashboard()
                .initialize()
                .getJobScheduler();
    }
}
Step 3: Service Classes for Job Logic
Email Service:
@Service
public class EmailService {
private static final Logger logger = LoggerFactory.getLogger(EmailService.class);
@Job(name = "Send welcome email to %0", retries = 3)
public void sendWelcomeEmail(String recipientEmail, String userName) {
logger.info("Sending welcome email to: {} for user: {}", recipientEmail, userName);
// Simulate email sending logic
try {
// This could be integration with SendGrid, Amazon SES, etc.
Thread.sleep(2000); // Simulate API call delay
logger.info("✅ Welcome email successfully sent to: {}", recipientEmail);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
throw new RuntimeException("Email sending interrupted", e);
} catch (Exception e) {
logger.error("Failed to send email to: {}", recipientEmail, e);
throw new RuntimeException("Email sending failed", e);
}
}
@Job(name = "Send bulk newsletter", retries = 2)
public void sendBulkNewsletter(List<String> recipientEmails, String newsletterContent) {
logger.info("Sending newsletter to {} recipients", recipientEmails.size());
for (String email : recipientEmails) {
try {
sendNewsletterEmail(email, newsletterContent);
} catch (Exception e) {
logger.warn("Failed to send newsletter to: {}, continuing with others", email);
}
}
logger.info("✅ Bulk newsletter sending completed");
}
private void sendNewsletterEmail(String email, String content) {
// Simulate individual email sending
try {
Thread.sleep(100);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
}
File Processing Service:
@Service
public class FileProcessingService {
private static final Logger logger = LoggerFactory.getLogger(FileProcessingService.class);
@Job(name = "Process uploaded file: %0", retries = 2)
public void processUploadedFile(String fileName, String fileContent) {
logger.info("Starting processing of file: {}", fileName);
try {
// Simulate file parsing and processing
Thread.sleep(3000);
// Example: Parse CSV, JSON, or process images
int processedRecords = processFileContent(fileContent);
logger.info("✅ Successfully processed file: {} ({} records)", fileName, processedRecords);
} catch (Exception e) {
logger.error("Failed to process file: {}", fileName, e);
throw new RuntimeException("File processing failed", e);
}
}
@Job(name = "Batch file processing")
public void processBatchFiles(List<String> fileNames) {
logger.info("Starting batch processing of {} files", fileNames.size());
int successCount = 0;
for (String fileName : fileNames) {
try {
processSingleFile(fileName);
successCount++;
} catch (Exception e) {
logger.warn("Failed to process file: {}, skipping", fileName);
}
}
logger.info("✅ Batch processing completed: {}/{} files successful", successCount, fileNames.size());
}
private int processFileContent(String content) {
// Simulate processing logic
return content.split("\n").length;
}
private void processSingleFile(String fileName) {
// Simulate individual file processing
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
}
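The simulated `processFileContent` above just counts lines; for a CSV upload like the demo's `name,email` payload, a slightly more realistic (still stdlib-only, illustrative) record counter skips the header row and blank lines:

```java
public class CsvRecordCounter {
    // Counts data rows in a CSV payload: the header line and blank lines are excluded.
    static int countRecords(String csv) {
        String[] lines = csv.split("\n");
        int count = 0;
        for (int i = 1; i < lines.length; i++) {   // i = 1 skips the header
            if (!lines[i].trim().isEmpty()) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // Two data rows, one blank line in between → prints 2
        System.out.println(countRecords("name,email\nJohn,john@example.com\n\nJane,jane@example.com"));
    }
}
```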
Report Generation Service:
@Service
public class ReportGenerationService {
private static final Logger logger = LoggerFactory.getLogger(ReportGenerationService.class);
@Job(name = "Generate sales report for %0", retries = 2)
public void generateSalesReport(String reportId, Date startDate, Date endDate) {
logger.info("Generating sales report {} for period: {} to {}", reportId, startDate, endDate);
try {
// Simulate complex report generation
Thread.sleep(5000);
String reportUrl = saveReportToStorage(reportId, startDate, endDate);
logger.info("✅ Sales report generated successfully: {}", reportUrl);
} catch (Exception e) {
logger.error("Failed to generate sales report: {}", reportId, e);
throw new RuntimeException("Report generation failed", e);
}
}
@Recurring(id = "daily-metrics-job", cron = "0 2 * * *") // 2 AM daily
@Job(name = "Generate daily metrics report")
public void generateDailyMetrics() {
logger.info("Starting daily metrics report generation");
try {
Date yesterday = Date.from(Instant.now().minus(1, ChronoUnit.DAYS));
generateDailyReport(yesterday);
logger.info("✅ Daily metrics report generated successfully");
} catch (Exception e) {
logger.error("Failed to generate daily metrics report", e);
}
}
private String saveReportToStorage(String reportId, Date startDate, Date endDate) {
// Simulate saving to cloud storage
return "https://storage.example.com/reports/" + reportId + ".pdf";
}
private void generateDailyReport(Date date) {
// Simulate daily report generation
try {
Thread.sleep(2000);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
}
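As a sanity check on the recurring schedule above, the next 02:00 run that a daily cron encodes can be computed with plain `java.time`. This is an illustrative helper, not JobRunr's scheduler:

```java
import java.time.LocalDateTime;
import java.time.LocalTime;

public class NextRunSketch {
    // Next occurrence of a fixed time-of-day: today if it is still ahead, else tomorrow.
    static LocalDateTime nextDailyAt(LocalDateTime now, LocalTime at) {
        LocalDateTime todayRun = now.toLocalDate().atTime(at);
        return now.isBefore(todayRun) ? todayRun : todayRun.plusDays(1);
    }

    public static void main(String[] args) {
        LocalDateTime now = LocalDateTime.of(2024, 5, 1, 14, 30);
        // 14:30 is already past 02:00, so the next run is tomorrow at 02:00
        System.out.println(nextDailyAt(now, LocalTime.of(2, 0))); // 2024-05-02T02:00
    }
}
```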
Step 4: Job Management Service
@Service
public class JobManagementService {
    private final JobScheduler jobScheduler;
    private final StorageProvider storageProvider;
    private final EmailService emailService;
    private final FileProcessingService fileProcessingService;
    private final ReportGenerationService reportGenerationService;

    public JobManagementService(JobScheduler jobScheduler,
                                StorageProvider storageProvider,
                                EmailService emailService,
                                FileProcessingService fileProcessingService,
                                ReportGenerationService reportGenerationService) {
        this.jobScheduler = jobScheduler;
        this.storageProvider = storageProvider;
        this.emailService = emailService;
        this.fileProcessingService = fileProcessingService;
        this.reportGenerationService = reportGenerationService;
    }

    /**
     * Enqueue immediate background jobs
     */
    public String enqueueWelcomeEmail(String email, String userName) {
        return jobScheduler.enqueue(() ->
            emailService.sendWelcomeEmail(email, userName)
        ).toString();
    }

    public String enqueueFileProcessing(String fileName, String content) {
        return jobScheduler.enqueue(() ->
            fileProcessingService.processUploadedFile(fileName, content)
        ).toString();
    }

    public String enqueueSalesReport(String reportId, Date startDate, Date endDate) {
        return jobScheduler.enqueue(() ->
            reportGenerationService.generateSalesReport(reportId, startDate, endDate)
        ).toString();
    }

    /**
     * Schedule jobs for future execution
     */
    public String scheduleReportGeneration(String reportId, Date startDate, Date endDate, Instant scheduleAt) {
        return jobScheduler.schedule(scheduleAt, () ->
            reportGenerationService.generateSalesReport(reportId, startDate, endDate)
        ).toString();
    }

    /**
     * Enqueue bulk processing jobs
     */
    public void enqueueBulkNewsletter(List<String> emails, String content) {
        jobScheduler.<EmailService>enqueue(x ->
            x.sendBulkNewsletter(emails, content)
        );
    }

    public void enqueueBatchFileProcessing(List<String> fileNames) {
        jobScheduler.<FileProcessingService>enqueue(x ->
            x.processBatchFiles(fileNames)
        );
    }

    /**
     * Recurring jobs management
     */
    public void createRecurringReport(String jobId, String cronExpression) {
        jobScheduler.<ReportGenerationService>scheduleRecurrently(jobId, cronExpression,
            service -> service.generateDailyMetrics()
        );
    }

    /**
     * Job monitoring and management
     */
    public Job getJobStatus(String jobId) {
        return storageProvider.getJobById(UUID.fromString(jobId));
    }

    public void deleteJob(String jobId) {
        jobScheduler.delete(UUID.fromString(jobId));
    }

    // Note: requeueing a failed job is done from the JobRunr dashboard;
    // there is no dedicated requeue method on the public JobScheduler API.
}
Step 5: REST Controller
@RestController
@RequestMapping("/api/jobs")
public class JobController {
private final JobManagementService jobManagementService;
public JobController(JobManagementService jobManagementService) {
this.jobManagementService = jobManagementService;
}
@PostMapping("/email/welcome")
public ResponseEntity<JobResponse> sendWelcomeEmail(@RequestBody EmailRequest request) {
String jobId = jobManagementService.enqueueWelcomeEmail(request.getEmail(), request.getUserName());
return ResponseEntity.accepted().body(new JobResponse(jobId, "Welcome email queued successfully"));
}
@PostMapping("/files/process")
public ResponseEntity<JobResponse> processFile(@RequestBody FileProcessRequest request) {
String jobId = jobManagementService.enqueueFileProcessing(request.getFileName(), request.getContent());
return ResponseEntity.accepted().body(new JobResponse(jobId, "File processing queued successfully"));
}
@PostMapping("/reports/sales")
public ResponseEntity<JobResponse> generateSalesReport(@RequestBody ReportRequest request) {
String jobId = jobManagementService.enqueueSalesReport(
request.getReportId(), request.getStartDate(), request.getEndDate()
);
return ResponseEntity.accepted().body(new JobResponse(jobId, "Sales report generation queued"));
}
@GetMapping("/{jobId}/status")
public ResponseEntity<JobStatusResponse> getJobStatus(@PathVariable String jobId) {
Job job = jobManagementService.getJobStatus(jobId);
return ResponseEntity.ok(new JobStatusResponse(job));
}
// DTO Classes
public static class EmailRequest {
private String email;
private String userName;
// getters and setters
}
public static class FileProcessRequest {
private String fileName;
private String content;
// getters and setters
}
public static class ReportRequest {
private String reportId;
private Date startDate;
private Date endDate;
// getters and setters
}
public static class JobResponse {
private final String jobId;
private final String message;
// constructor, getters
}
public static class JobStatusResponse {
private final String jobId;
private final String state;
private final Instant createdAt;
private final Instant updatedAt;
// constructor, getters
}
}
Step 6: Application Runner for Demo
@Component
public class JobRunrDemoRunner implements CommandLineRunner {
private final JobManagementService jobManagementService;
public JobRunrDemoRunner(JobManagementService jobManagementService) {
this.jobManagementService = jobManagementService;
}
@Override
public void run(String... args) throws Exception {
// Demo: Enqueue some sample jobs on startup
jobManagementService.enqueueWelcomeEmail("john.doe@example.com", "John Doe");
jobManagementService.enqueueFileProcessing("data.csv", "name,email\nJohn,john@example.com");
// Schedule a report for 5 minutes from now
Instant inFiveMinutes = Instant.now().plus(5, ChronoUnit.MINUTES);
jobManagementService.scheduleReportGeneration(
"report-001",
new Date(),
new Date(),
inFiveMinutes
);
// Bulk processing demo
List<String> emails = Arrays.asList("user1@example.com", "user2@example.com", "user3@example.com");
jobManagementService.enqueueBulkNewsletter(emails, "Monthly Newsletter Content");
}
}
Running the Application
- Start the application
- Access the JobRunr Dashboard at http://localhost:8000
- Use the REST API to enqueue jobs:
curl -X POST http://localhost:8080/api/jobs/email/welcome \
-H "Content-Type: application/json" \
-d '{"email":"test@example.com", "userName":"Test User"}'
- Monitor jobs in the dashboard with real-time updates
Production Configuration
Database Storage
# PostgreSQL example (datasource configured via Spring Boot's spring.datasource.*)
spring.datasource.url=jdbc:postgresql://localhost:5432/jobrunr
spring.datasource.username=jobrunr
spring.datasource.password=secure-password

# Or use MongoDB (connection settings shown are illustrative)
org.jobrunr.database.type=mongodb
spring.data.mongodb.uri=mongodb://localhost:27017/jobrunr
Cluster Configuration
# Multiple server instances
org.jobrunr.background-job-server.enabled=true
org.jobrunr.background-job-server.name=worker-node-1

# Worker configuration
org.jobrunr.background-job-server.worker-count=20
org.jobrunr.jobs.default-number-of-retries=5
org.jobrunr.jobs.retry-back-off-time-seed=3
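Scaling out is then just a matter of starting another JVM against the same storage; a second node only needs a distinct server name (values below are illustrative):

```properties
# worker-node-2: identical storage settings as worker-node-1, different server name
org.jobrunr.background-job-server.enabled=true
org.jobrunr.background-job-server.name=worker-node-2
org.jobrunr.background-job-server.worker-count=20
```

Both nodes poll the shared storage, so jobs are distributed across them automatically with no broker in between.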
Monitoring and Metrics
With the jobrunr-spring-boot-starter, JobRunr can publish job and server metrics through Micrometer; enabling them is a matter of configuration rather than custom beans:
# Expose JobRunr metrics via Micrometer
org.jobrunr.jobs.metrics.enabled=true
org.jobrunr.background-job-server.metrics.enabled=true
Best Practices
- Idempotent Jobs: Design jobs to be safe for retries
- Proper Error Handling: Use @Job(retries = 3) for transient failures
- Resource Management: Close resources properly in jobs
- Monitoring: Use the dashboard to track job performance
- Load Testing: Test with production-like workloads
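The idempotency advice above can be made concrete. A common pattern is to key each job on a business identifier and skip work that has already been done, so a retry after a crash is harmless. The in-memory store below is a stand-in for a database table with a unique constraint; all names are hypothetical:

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class IdempotentJobSketch {
    // Stand-in for a persistent "processed jobs" table keyed by business id.
    private static final Set<String> processed = ConcurrentHashMap.newKeySet();

    // Returns true if the work ran, false if this key was already handled.
    static boolean runOnce(String businessKey, Runnable work) {
        if (!processed.add(businessKey)) {
            return false; // retry of an already-completed job: safe no-op
        }
        work.run();
        return true;
    }

    public static void main(String[] args) {
        runOnce("order-42", () -> System.out.println("charging order-42"));
        // A JobRunr retry with the same key would land here and be skipped:
        runOnce("order-42", () -> System.out.println("charging order-42"));
    }
}
```

In production the dedup check and the work should share one transaction (or the work itself should be naturally idempotent), otherwise a crash between the two steps reintroduces the problem.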
Use Cases
- Email Campaigns: Send thousands of emails asynchronously
- Data Processing: Process large files or datasets
- Report Generation: Generate complex reports in background
- ETL Pipelines: Data synchronization and transformation
- Scheduled Maintenance: Regular cleanup and maintenance tasks
Conclusion
JobRunr provides a robust, distributed background job processing solution for Java applications that's both powerful and easy to use. With features like:
- Simple integration with just a few annotations
- Horizontal scaling across multiple servers
- Comprehensive monitoring through built-in dashboard
- Flexible scheduling for immediate and recurring jobs
- Enterprise-ready with persistence and retry mechanisms
It's an excellent choice for modern applications that need reliable, scalable background processing without the complexity of maintaining separate message brokers or job queues.