Salesforce Apex batch processing is a powerful mechanism for handling large-scale data operations asynchronously. Standard DML operations in Apex are limited to 10,000 records per transaction, making batch processing essential for handling bulk data efficiently. In this guide, we’ll dive deep into Apex Batch Classes, their structure, and real-world use cases.


What is an Apex Batch Class?

An Apex Batch Class allows you to process large volumes of data asynchronously by dividing it into manageable chunks (batches). It implements the Database.Batchable<sObject> interface and consists of three key methods:

  • start(): Defines the scope of records to process.
  • execute(): Performs the actual processing logic in batches.
  • finish(): Executes post-processing logic (e.g., sending notifications, logging results).

Basic Structure of an Apex Batch Class

Here’s a simple template:

public class MyBatchClass implements Database.Batchable<SObject> {
    
    public Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }
    
    public void execute(Database.BatchableContext BC, List<Account> scope) {
        for (Account acc : scope) {
            acc.Name = acc.Name + ' - Updated';
        }
        update scope;
    }
    
    public void finish(Database.BatchableContext BC) {
        System.debug('Batch Job Completed');
    }
}

To execute the batch, use:

MyBatchClass batchJob = new MyBatchClass();
Database.executeBatch(batchJob, 200); // optional scope: up to 2,000 records per execute() call
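Batch classes also need test coverage before deployment. Here’s a minimal sketch (the test class name is an assumption); Test.stopTest() forces the queued batch to run to completion so the result can be asserted:

```apex
@isTest
private class MyBatchClassTest {
    @isTest
    static void testBatch() {
        // Set up a record for the batch to process
        Account acc = new Account(Name = 'Test');
        insert acc;

        Test.startTest();
        Database.executeBatch(new MyBatchClass(), 200);
        Test.stopTest(); // the async job finishes here

        Account updated = [SELECT Name FROM Account WHERE Id = :acc.Id];
        System.assertEquals('Test - Updated', updated.Name);
    }
}
```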


Real-World Use Cases

Use Case 1: Mass Updating Contacts Based on Account Type

Imagine you need to update all Contacts under Accounts of type ‘Customer’.

public class UpdateContactsBatch implements Database.Batchable<SObject> {
    
    public Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator('SELECT Id FROM Contact WHERE Account.Type = \'Customer\'');
    }
    
    public void execute(Database.BatchableContext BC, List<Contact> scope) {
        for (Contact con : scope) {
            con.Status__c = 'Active Customer';
        }
        update scope;
    }
    
    public void finish(Database.BatchableContext BC) {
        System.debug('Contact update batch completed.');
    }
}

Execution:

Database.executeBatch(new UpdateContactsBatch(), 200);

Use Case 2: Archiving Old Cases

If you want to close cases older than 2 years:

public class ArchiveOldCasesBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext BC) {
        // Skip cases that are already closed
        return Database.getQueryLocator('SELECT Id, Status FROM Case WHERE CreatedDate < LAST_N_YEARS:2 AND Status != \'Closed\'');
    }
    public void execute(Database.BatchableContext BC, List<Case> scope) {
        for (Case c : scope) {
            c.Status = 'Closed';
        }
        update scope;
    }
    public void finish(Database.BatchableContext BC) {
        System.debug('Old cases archived successfully.');
    }
}

Execution:

Database.executeBatch(new ArchiveOldCasesBatch(), 100);
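A cleanup job like this is usually run on a schedule rather than by hand. A minimal sketch using the Schedulable interface (the scheduler class name is an assumption):

```apex
public class ArchiveOldCasesScheduler implements Schedulable {
    public void execute(SchedulableContext sc) {
        // Kick off the batch each time the scheduled job fires
        Database.executeBatch(new ArchiveOldCasesBatch(), 100);
    }
}
```

To run it nightly at 1 AM, schedule it once from Anonymous Apex: System.schedule('Archive Old Cases', '0 0 1 * * ?', new ArchiveOldCasesScheduler());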


Best Practices for Apex Batch Jobs

  • Use QueryLocator for large datasets: A Database.QueryLocator can return up to 50 million records, versus 50,000 for a custom iterable.
  • Optimize batch size: Values between 100 and 200 (the default) usually balance throughput against governor limits.
  • Handle errors gracefully: Use try-catch blocks inside execute(), or Database.update(records, false) to allow partial success.
  • Chain batch jobs if needed: For dependent processes, call another batch in finish().
  • Monitor batch jobs: Query the AsyncApexJob object to track job progress.
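The error-handling bullet above can be sketched against the earlier Account example. Passing false as the allOrNone argument to Database.update saves the valid records and reports failures per record, instead of rolling back the whole chunk on the first error:

```apex
public void execute(Database.BatchableContext BC, List<Account> scope) {
    for (Account acc : scope) {
        acc.Name = acc.Name + ' - Updated';
    }
    // allOrNone = false: valid records save even if some fail
    Database.SaveResult[] results = Database.update(scope, false);
    for (Database.SaveResult sr : results) {
        if (!sr.isSuccess()) {
            // Log each failure; a real job might write these to a custom object
            System.debug('Update failed: ' + sr.getErrors()[0].getMessage());
        }
    }
}
```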

Example for chaining batch jobs:

public void finish(Database.BatchableContext BC) {
    Database.executeBatch(new AnotherBatchJob(), 200);
}
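For monitoring, Database.executeBatch returns the ID of the corresponding AsyncApexJob record, which you can query directly:

```apex
Id jobId = Database.executeBatch(new MyBatchClass(), 200);
AsyncApexJob job = [
    SELECT Status, JobItemsProcessed, TotalJobItems, NumberOfErrors
    FROM AsyncApexJob
    WHERE Id = :jobId
];
System.debug(job.Status + ': ' + job.JobItemsProcessed + ' of '
    + job.TotalJobItems + ' batches, ' + job.NumberOfErrors + ' errors');
```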


Conclusion

Apex Batch classes are indispensable for bulk processing operations in Salesforce. By properly structuring and optimizing batch jobs, you can efficiently manage large datasets while staying within governor limits. Implement these best practices, and you’ll have robust and scalable data processing solutions in no time.

We would love to hear your comments!
