
Ultimate Apex Interview Questions Guide (2025): Oracle APEX & Salesforce Apex


Aug 20, 2025


Key Takeaways


Distinguish candidates by testing practical skills like debugging, performance tuning, and complex business logic implementation beyond memorized theory.

Evaluate both Oracle APEX low-code competencies and Salesforce Apex programming expertise across beginner to expert levels.

Focus on critical platform knowledge including session management, RESTful services, asynchronous processing, and security best practices.

Use scenario-based questions on system design, error handling, integration, and deployment to find candidates ready for enterprise challenges.

Spot red flags such as superficial answers, lack of understanding of governor limits, and poor architectural insight.

Implement multi-stage interview processes combining quizzes, live coding, and architecture discussions for data-driven hiring decisions.

Apex interview questions often cover triggers, SOQL limits, and governor restrictions. Candidates can discuss bulkification strategies and explain DML operations flawlessly. But when your Salesforce org hits performance walls or suffers integration failures, that academic knowledge falls short.


You need developers who can architect solutions that scale with your business growth and handle complex data relationships without breaking governor limits in real scenarios.


Engineering teams face a critical challenge when hiring developers with Apex expertise. Whether you're building rapid web applications with Oracle APEX or implementing complex business logic with Salesforce Apex, finding candidates who can deliver from day one requires precise technical evaluation.


This comprehensive guide provides 80+ carefully curated interview questions designed specifically for engineering leaders who need to identify genuine Apex expertise. We've structured questions across both Oracle APEX and Salesforce Apex platforms, categorized by skill level to help you assess candidates accurately.

Why This Guide Matters for Engineering Teams

As organizations increasingly rely on low-code platforms and Salesforce ecosystems, the demand for skilled Apex developers has skyrocketed. However, resumes often don't reflect real-world problem-solving abilities. This guide helps you:


  • Evaluate actual coding skills beyond theoretical knowledge

  • Assess architectural thinking for complex enterprise solutions

  • Identify candidates who understand performance implications

  • Test real-world scenario handling rather than textbook answers


The questions in this guide have been validated by engineering teams at companies ranging from startups to Fortune 500 organizations, ensuring they reflect actual job requirements rather than academic concepts.


Did you know?

Salesforce Apex enforces “governor limits” to maintain performance and multi-tenancy fairness.

Hire Apex Developers with confidence


Hire confidently by prioritizing real-world problem solving and architectural thinking. Our guide, vetted by Fortune 500 teams, focuses on skills that predict on-the-job success in Apex development.

Oracle APEX Interview Questions

Oracle APEX (Application Express) enables rapid development of data-driven web applications. These questions test candidates' ability to build scalable, secure applications using Oracle's low-code platform.

Beginner Level Questions (1-15)

1. What is Oracle APEX and how does it differ from traditional web development frameworks?

Oracle APEX is a low-code development platform that runs entirely within the Oracle Database. Unlike traditional frameworks that require separate application servers, databases, and extensive coding, APEX provides a declarative development environment where applications are built through configuration rather than custom code.


Key differences include:

  • Tight database integration: Applications run within the Oracle Database, eliminating the need for separate application tiers

  • Declarative development: Components are configured through wizards and forms rather than coded from scratch

  • Built-in security: Automatic protection against SQL injection, XSS, and other common vulnerabilities

  • Rapid deployment: Applications can be built and deployed in hours rather than weeks


2. Explain the architecture of Oracle APEX.

APEX architecture consists of four main components:

  • Oracle Database: Contains the APEX metadata repository, application logic, and data

  • APEX Listener (ORDS): Oracle REST Data Services handles HTTP requests and communicates with the database

  • Web Server: Hosts static files and routes requests to ORDS

  • Browser: Renders the HTML, CSS, and JavaScript generated by APEX


The request flow:

Browser → Web Server → ORDS → Oracle Database → APEX Engine → Response back through the chain.

3. What is a workspace in Oracle APEX?

A workspace is a virtual private database that groups APEX applications, users, and database schemas. It provides:

  • Isolation: Each workspace operates independently with its own users and applications

  • Security boundary: Users in one workspace cannot access another workspace's applications

  • Schema mapping: Associates the workspace with one or more database schemas

  • Administration: Manages developers, end users, and application settings


4. Describe the difference between a page and a region in APEX.

  • Page: A complete screen or view in an APEX application, accessible via a unique URL. Contains regions, items, buttons, and processes

  • Region: A container within a page that displays specific content like reports, forms, charts, or static content. Multiple regions can exist on a single page

Think of a page as a webpage and regions as sections or widgets within that page.



5. What are the main types of reports available in Oracle APEX?

  • Classic Report: Simple tabular data display with basic sorting and pagination

  • Interactive Report: Advanced user-customizable reports with filtering, searching, grouping, and personal customizations

  • Interactive Grid: Spreadsheet-like interface allowing inline editing, adding, and deleting records

  • Cards: Visual representation of data in card format

  • Chart: Graphical data representation (bar, pie, line charts, etc.)

6. How do you implement master-detail relationships in APEX?

Master-detail relationships are implemented by:

  1. Creating the master form/report based on the parent table

  2. Adding a detail region on the same page or linked page

  3. Setting the master-detail relationship in the detail region properties

  4. Configuring the link column that connects master to detail records

  5. Setting up automatic refresh so detail records update when master selection changes


7. What is a Dynamic Action in Oracle APEX?

Dynamic Actions provide client-side interactivity without page refreshes. They consist of:

  • When: Event trigger (button click, item change, page load)

  • Event: Specific action that triggers the dynamic action

  • Condition: Optional criteria that must be met

  • Action: What happens (show/hide items, execute JavaScript, refresh regions)


Example: Hide a region when a checkbox is unchecked, or refresh a report when a select list value changes.

8. Explain the concept of session state in APEX.

Session state maintains data values across pages and user interactions within an APEX session. It includes:

  • Page items: Values entered in forms or selected from lists

  • Application items: Global variables accessible across all pages

  • Session ID: Unique identifier for the user session

  • Automatic management: APEX handles session creation, maintenance, and cleanup
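
In PL/SQL page processes, session state is typically read and written with bind-variable syntax or the `v()` function. A minimal sketch (the item names `P1_DEPT_ID` and `P1_TOTAL` are hypothetical):

```sql
-- Page process: read one item's session state, write another's
BEGIN
  SELECT SUM(salary)
    INTO :P1_TOTAL                 -- writes session state for P1_TOTAL
    FROM employees
   WHERE department_id = :P1_DEPT_ID;  -- reads session state via bind syntax
END;

-- Inside packaged code, the same read uses the v() function:
--   l_dept := v('P1_DEPT_ID');
```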


9. What are the different authentication schemes available in APEX?

  • APEX Accounts: Internal APEX user management

  • Database Accounts: Uses Oracle Database user authentication

  • LDAP Directory: Integrates with LDAP servers like Active Directory

  • Social Sign-On: OAuth integration with Google, Facebook, etc.

  • Custom: User-defined authentication logic using PL/SQL

  • No Authentication: For public applications

10. How do you handle file uploads in Oracle APEX?

File uploads are handled using:

  1. File Browse item: Allows users to select files from their device

  2. BLOB storage: Files stored as Binary Large Objects in database tables

  3. File processing: PL/SQL logic to handle uploaded files

  4. Validation: File type, size, and content validation

  5. Download mechanism: Process to retrieve and serve uploaded files

11. What is the difference between before and after page processes?

  • Before Header: Executes before the page is rendered, useful for authentication and data initialization

  • After Header: Executes after the page header is rendered but before the remaining page regions are processed

  • On Load: Executes when the page loads

  • On Submit - Before Computations: Runs before any computations when form is submitted

  • On Submit - After Computations: Runs after computations but before validations

12. How do you implement conditional rendering in APEX?

Conditional rendering controls when components display based on:

  • Item values: Show region only if specific item has certain value

  • User attributes: Display based on user role or authorization

  • PL/SQL expressions: Custom logic determining visibility

  • Page items: Conditions based on other page item values

  • Application items: Global conditions affecting multiple pages

13. What are shared components in Oracle APEX?

Shared components are reusable elements available across an application:

  • Lists of Values (LOVs): Dropdown options used in multiple places

  • Templates: HTML structures for consistent appearance

  • Authentication schemes: Login mechanisms

  • Authorization schemes: Access control rules

  • Themes: Overall application appearance and styling

  • Web service references: External API connections

14. Explain the difference between application items and page items.

Page Items: Scope limited to a specific page, automatically managed by APEX, used for user input and page-specific data

Application Items: Global scope across entire application, manually managed, used for session-wide data like user preferences or application state

15. How do you create cascading LOVs (Lists of Values)?

Cascading LOVs create dependent dropdowns where the second list's options depend on the first list's selection:

  1. Create parent LOV: First dropdown with independent values

  2. Create dependent LOV: Second dropdown with SQL query referencing parent item

  3. Set cascading parent: Configure the dependency relationship

  4. Add refresh action: Ensure dependent LOV updates when parent changes
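
The dependent LOV is just a SQL query that binds the parent item's session state. A sketch using the classic EMP/DEPT sample schema (item names are illustrative):

```sql
-- LOV query for P1_EMPLOYEE, with "Cascading LOV Parent Item(s)" set to P1_DEPARTMENT
SELECT ename AS display_value,
       empno AS return_value
  FROM emp
 WHERE deptno = :P1_DEPARTMENT   -- re-evaluated whenever the parent changes
 ORDER BY ename;
```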

Did you know?

Salesforce uses declarative sharing and granular Apex sharing for robust security models.

Intermediate & Expert Level Questions (16-35)

16. How do you optimize performance for large datasets in APEX applications?

Performance optimization strategies include:

  • Pagination: Limit records displayed per page using row limiting

  • Lazy loading: Load data only when needed

  • Efficient SQL: Use proper indexing, avoid SELECT *, optimize joins

  • Caching: Enable region and application-level caching

  • Asynchronous processing: Use background jobs for heavy operations

  • Partial page refresh: Update only necessary regions instead of full page reload

17. Describe how to implement custom authentication in Oracle APEX.

Custom authentication involves:

  1. Create authentication scheme: Go to Shared Components > Authentication Schemes

  2. Define PL/SQL function: Write authentication logic that returns TRUE/FALSE

  3. Session management: Handle user session creation and validation

  4. Login page customization: Create custom login interface

  5. Post-authentication processing: Set session attributes and redirect logic


FUNCTION custom_authenticate(p_username VARCHAR2, p_password VARCHAR2) 
RETURN BOOLEAN IS
BEGIN
  -- Custom authentication logic
  IF validate_user_credentials(p_username, p_password) THEN
    -- Set session attributes
    RETURN TRUE;
  ELSE
    RETURN FALSE;
  END IF;
END;



18. How do you handle RESTful web services in APEX?

RESTful services in APEX involve:

  • Creating REST endpoints: Define URI templates and HTTP methods

  • Data source modules: Configure external REST API connections

  • Authentication: Set up OAuth, API keys, or basic authentication

  • Request/response handling: Map JSON/XML to APEX items and collections

  • Error handling: Implement robust error handling for service failures

19. What are APEX collections and when would you use them?

APEX collections are temporary, session-specific data structures that:

  • Store temporary data: Hold data during user session without database commits

  • Manipulate datasets: Sort, filter, and modify data before database operations

  • Cross-page data: Share data between pages within a session

  • Report building: Create complex reports from multiple data sources

  • Wizard implementations: Store multi-step form data

20. How do you implement row-level security in APEX applications?

Row-level security implementation:

  • VPD policies: Virtual Private Database policies at database level

  • Authorization schemes: APEX-level access control rules

  • Shared components: Reusable security logic across applications

  • Session attributes: User-specific security context

  • SQL filtering: Dynamic WHERE clauses based on user permissions
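
A minimal VPD sketch: a policy function returns a predicate that Oracle appends to every query against the table. The schema, table, and column names are illustrative; `SYS_CONTEXT('APEX$SESSION', 'APP_USER')` is the standard way to read the current APEX user from PL/SQL.

```sql
-- Policy function: restrict rows to the logged-in APEX user
CREATE OR REPLACE FUNCTION orders_rls_policy (
  p_schema IN VARCHAR2,
  p_object IN VARCHAR2
) RETURN VARCHAR2 IS
BEGIN
  RETURN 'owner_id = SYS_CONTEXT(''APEX$SESSION'', ''APP_USER'')';
END;
/

-- Register the policy so the predicate is enforced automatically
BEGIN
  DBMS_RLS.ADD_POLICY(
    object_schema   => 'APP_SCHEMA',
    object_name     => 'ORDERS',
    policy_name     => 'ORDERS_RLS',
    function_schema => 'APP_SCHEMA',
    policy_function => 'ORDERS_RLS_POLICY');
END;
/
```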

21. Explain the APEX plugin architecture and how to develop custom plugins.

APEX plugins extend functionality through:

  1. Plugin types: Region, item, dynamic action, process, or authorization

  2. PL/SQL code: Server-side logic for data processing

  3. JavaScript/CSS: Client-side behavior and styling

  4. Configuration options: Parameters for plugin customization

  5. Installation: Packaged for deployment across applications

22. How do you handle large file uploads and downloads in APEX?

Large file handling strategies:

  • Chunked uploads: Break large files into smaller pieces

  • Background processing: Use APEX collections or temporary tables

  • Streaming: Process files without loading entirely into memory

  • Compression: Reduce file sizes before storage

  • Progress indicators: Provide user feedback during operations

  • Error recovery: Handle interrupted uploads gracefully

23. What is the role of APEX Listener (ORDS) and how do you configure it?

ORDS serves as the web server component that:

  • Handles HTTP requests: Processes incoming web requests

  • Database connectivity: Manages connection pooling to Oracle Database

  • REST services: Exposes database operations as REST APIs

  • Static file serving: Handles images, CSS, JavaScript files

  • Security: Implements SSL/TLS and authentication protocols


Configuration involves setting connection pools, security, and deployment parameters.

24. How do you implement complex business rules in APEX?

Complex business rules implementation:

  • PL/SQL packages: Centralized business logic separate from presentation

  • Database triggers: Automatic enforcement of data integrity rules

  • APEX validations: Page-level business rule validation

  • Dynamic actions: Client-side rule enforcement

  • Workflow engines: For complex approval processes

  • Custom computations: Calculated fields based on business logic

25. Describe APEX application deployment strategies.

Deployment strategies include:

  • Export/Import: Manual application export and import between environments

  • SQL*Plus scripts: Automated deployment using command-line tools

  • Version control: Integration with Git or other VCS systems

  • Environment management: Separate development, test, and production environments

  • Data migration: Handling data differences between environments

  • Rollback procedures: Ability to revert problematic deployments

26. How do you integrate APEX with external systems?

Integration approaches:

  • Web services: REST and SOAP service consumption

  • Database links: Direct database-to-database connections

  • Message queues: Asynchronous integration using Oracle AQ

  • File-based: CSV, XML, JSON file processing

  • API gateways: Centralized API management

  • ETL processes: Extract, Transform, Load operations

27. What are the security best practices for APEX applications?

Security best practices:

  • Input validation: Validate all user inputs at multiple levels

  • SQL injection prevention: Use bind variables and parameterized queries

  • XSS protection: Escape output and use Content Security Policy

  • Authentication: Implement strong authentication mechanisms

  • Authorization: Fine-grained access control

  • Session management: Secure session handling and timeout

  • HTTPS enforcement: Encrypt all communications

  • Regular updates: Keep APEX and database patches current
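
The bind-variable point deserves a concrete contrast. In any APEX SQL source (the item name is illustrative):

```sql
-- Safe: the value is bound, never interpreted as SQL text
SELECT id, name
  FROM customers
 WHERE region = :P1_REGION;

-- Unsafe: concatenating session state into dynamic SQL invites injection
--   'SELECT id, name FROM customers WHERE region = ''' || :P1_REGION || ''''
```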

28. How do you design multi-tenant applications in Oracle APEX?

Multi-tenant design approaches:

  • Schema separation: Each tenant has separate database schema

  • Row-level separation: Shared schema with tenant ID filtering

  • VPD implementation: Virtual Private Database for automatic filtering

  • Workspace isolation: Separate APEX workspaces per tenant

  • Configuration management: Tenant-specific settings and customizations

  • Performance considerations: Resource allocation and monitoring per tenant

29. Describe advanced performance tuning techniques for APEX applications.

Advanced performance tuning:

  • Database optimization: Query tuning, indexing strategies, execution plan analysis

  • APEX-specific tuning: Region caching, lazy loading, efficient page design

  • Network optimization: Compression, CDN usage, static file optimization

  • Memory management: Session state optimization, collection management

  • Monitoring: Performance metrics collection and analysis

  • Scalability planning: Load balancing, connection pooling configuration

30. How do you implement custom PDF generation in APEX?

PDF generation approaches:

  • APEX native: Built-in PDF printing capabilities

  • BI Publisher: Oracle's enterprise reporting solution

  • PL/PDF: PL/SQL library for PDF creation

  • Custom solutions: Third-party tools or cloud services

  • Template design: Creating professional report layouts

  • Data integration: Merging database data with PDF templates

31. Explain the integration between APEX and Oracle Database Advanced Features.

Advanced database feature integration:

  • Partitioning: Working with partitioned tables and parallel processing

  • Analytics: Using Oracle Analytics for complex calculations

  • Spatial data: Geographic information system capabilities

  • Text search: Oracle Text integration for full-text search

  • Data warehousing: APEX as BI front-end for data warehouses

  • Advanced security: Label security, data masking, encryption

32. How do you handle real-time data updates in APEX applications?

Real-time updates implementation:

  • WebSockets: Persistent connections for live data streaming

  • APEX push notifications: Server-initiated client updates

  • Polling mechanisms: Automatic refresh of data regions

  • Database change notification: Responding to database triggers

  • Message queues: Asynchronous messaging for real-time updates

  • Event-driven architecture: Publish-subscribe patterns

33. Describe advanced authorization and access control patterns.

Advanced access control:

  • Attribute-based access: Dynamic permissions based on user attributes

  • Context-aware security: Access control based on location, time, device

  • Hierarchical permissions: Role inheritance and delegation

  • Data classification: Different access levels based on data sensitivity

  • Audit trails: Comprehensive logging of access and modifications

  • Fine-grained authorization: Column and row-level access control

34. How do you implement complex data migration strategies in APEX?

Data migration strategies:

  • ETL processes: Extract, Transform, Load operations for large datasets

  • Incremental migration: Moving data in phases to minimize downtime

  • Data validation: Ensuring data integrity during migration

  • Rollback procedures: Ability to revert failed migrations

  • Performance optimization: Parallel processing and bulk operations

  • Legacy system integration: Handling data from multiple source systems

35. What are the considerations for APEX cloud deployment and scalability?

Cloud deployment considerations:

  • Oracle Cloud Infrastructure: APEX on Autonomous Database

  • Container deployment: Docker and Kubernetes strategies

  • Auto-scaling: Dynamic resource allocation based on load

  • High availability: Multi-region deployment and failover

  • Disaster recovery: Backup and recovery strategies

  • Cost optimization: Resource utilization and pricing models

Did you know?

Apex supports asynchronous processing with future methods, batch Apex, and queueable Apex.

Salesforce Apex Interview Questions

Salesforce Apex is an object-oriented programming language that allows developers to execute flow and transaction control statements on the Salesforce platform. These questions assess candidates' ability to build robust, scalable solutions within the Salesforce ecosystem.

Beginner Level Questions (1-20)

1. What is Salesforce Apex and how does it differ from other programming languages?

Salesforce Apex is a strongly-typed, object-oriented programming language that executes on the Salesforce platform. Key differences include:

  • Cloud-native execution: Runs entirely on Salesforce servers, not locally

  • Governor limits: Built-in limits prevent resource abuse in multi-tenant environment

  • Database integration: Native integration with Salesforce objects and data

  • Automatic platform features: Built-in security, sharing, and workflow integration

  • Java-like syntax: Familiar syntax for Java developers but with platform-specific features

2. Explain the different types of Apex triggers and their execution contexts.

Apex triggers execute in response to data changes:

  • Before triggers: Execute before records are saved to database, used for validation and data modification

  • After triggers: Execute after records are saved, used for operations requiring record IDs

  • Trigger events: Insert, Update, Delete, Undelete operations

  • Trigger context variables: isInsert, isUpdate, isDelete, isBefore, isAfter, Trigger.new, Trigger.old


trigger AccountTrigger on Account (before insert, before update, after insert, after update) {
    if (Trigger.isBefore) {
        // Validation and data modification logic
    }
    if (Trigger.isAfter) {
        // Operations requiring record IDs
    }
}



3. What are SOQL and SOSL, and when would you use each?

SOQL (Salesforce Object Query Language): Queries single object or related objects, returns specific records

SOSL (Salesforce Object Search Language): Searches across multiple objects, returns records containing search terms


Use SOQL for specific data retrieval, SOSL for broad searches across multiple objects.

// SOQL example
List<Account> accounts = [SELECT Id, Name FROM Account WHERE Industry = 'Technology'];
// SOSL example
List<List<SObject>> searchResults = [FIND 'John' IN ALL FIELDS RETURNING Account, Contact];



4. How do you handle exceptions in Apex?

Exception handling uses try-catch blocks:


try {
    // Code that might throw exception
    insert accountList;
} catch (DMLException e) {
    // Handle DML-specific exceptions
    System.debug('DML Error: ' + e.getMessage());
} catch (Exception e) {
    // Handle general exceptions
    System.debug('General Error: ' + e.getMessage());
} finally {
    // Cleanup code that always executes
}



5. What is the difference between with sharing and without sharing keywords?

  • with sharing: Enforces user's sharing rules and permissions

  • without sharing: Runs with full access, ignoring user permissions

  • inherited sharing: Inherits sharing context from calling class

public with sharing class AccountService {
    // Respects user sharing rules
}
public without sharing class SystemService {
    // Runs with system permissions
}



6. Explain the concept of governor limits in Salesforce.

Governor limits prevent resource abuse in the multi-tenant environment:

  • SOQL queries: 100 synchronous, 200 asynchronous per transaction

  • DML statements: 150 per transaction

  • Heap size: 6MB synchronous, 12MB asynchronous

  • CPU time: 10 seconds synchronous, 60 seconds asynchronous

  • Callouts: 100 per transaction
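
Consumption against these limits can be inspected at runtime with the standard `Limits` class, for example:

```apex
// Check how close the current transaction is to its governor limits
System.debug('SOQL used: ' + Limits.getQueries()
             + ' of ' + Limits.getLimitQueries());
System.debug('DML used: ' + Limits.getDmlStatements()
             + ' of ' + Limits.getLimitDmlStatements());

if (Limits.getQueries() > Limits.getLimitQueries() - 10) {
    // Approaching the SOQL limit -- consider deferring work to async Apex
}
```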

7. What are the different types of collections in Apex?

  • List: Ordered collection allowing duplicates

  • Set: Unordered collection of unique elements

  • Map: Key-value pairs for efficient lookups

List<String> stringList = new List<String>();
Set<Id> idSet = new Set<Id>();
Map<Id, Account> accountMap = new Map<Id, Account>();

8. How do you write test classes in Apex?

Test classes ensure code quality and are required for deployment:


@isTest
public class AccountTriggerTest {
    @testSetup
    static void setupTestData() {
        // Create test data
    }
    
    @isTest
    static void testAccountInsert() {
        Test.startTest();
        // Test logic
        Test.stopTest();
        
        // Assertions
        System.assertEquals(expected, actual);
    }
}

9. What is the difference between static and instance methods?

  • Static methods: Belong to the class, called without creating instance, cannot access instance variables

  • Instance methods: Belong to object instance, can access instance variables


public class Calculator {
    public static Integer add(Integer a, Integer b) {
        return a + b; // Static method
    }
    
    public Integer instanceVariable = 0;
    public void setVariable(Integer value) {
        this.instanceVariable = value; // Instance method
    }
}


10. Explain the order of execution in Salesforce.

The order of execution for record processing:

  1. System validation rules

  2. Before triggers

  3. Custom validation rules (system validation re-runs)

  4. Duplicate rules

  5. Record saved to the database (but not yet committed)

  6. After triggers

  7. Assignment rules

  8. Auto-response rules

  9. Workflow rules

  10. Escalation rules

  11. Processes and record-triggered flows

  12. Roll-up summary field updates

  13. Criteria-based sharing rules

11. What are future methods and when would you use them?

Future methods execute asynchronously:


public class ExternalService {
    @future(callout=true)
    public static void makeCallout(String endpoint) {
        // Asynchronous callout logic
    }
    
    @future
    public static void heavyProcessing(Set<Id> recordIds) {
        // Time-consuming operations
    }
}

Use cases: External callouts, heavy processing, mixed DML operations.



12. How do you implement pagination in Visualforce or Lightning components?

Pagination handles large datasets efficiently:


public class AccountController {
    public ApexPages.StandardSetController setCon {get; set;}
    
    public AccountController() {
        setCon = new ApexPages.StandardSetController([SELECT Id, Name FROM Account]);
        setCon.setPageSize(10);
    }
    
    public List<Account> getAccounts() {
        return (List<Account>) setCon.getRecords();
    }
    
    public Boolean hasNext() {
        return setCon.getHasNext();
    }
    
    public PageReference next() {
        setCon.next();
        return null;
    }
}



13. What is the difference between insert and Database.insert?

  • insert: DML statement that throws exception on failure

  • Database.insert: Database method allowing partial success


// Traditional DML
try {
    insert accountList;
} catch (DMLException e) {
    // Handle exception
}
// Database method
Database.SaveResult[] results = Database.insert(accountList, false);
for (Database.SaveResult result : results) {
    if (!result.isSuccess()) {
        // Handle individual failures
    }
}



14. How do you handle bulk operations in Apex?

Bulk operations process multiple records efficiently:


public class BulkAccountProcessor {
    public static void updateAccounts(List<Account> accounts) {
        List<Account> accountsToUpdate = new List<Account>();
        
        for (Account acc : accounts) {
            if (acc.AnnualRevenue > 1000000) {
                acc.Type = 'Enterprise';
                accountsToUpdate.add(acc);
            }
        }
        
        if (!accountsToUpdate.isEmpty()) {
            update accountsToUpdate;
        }
    }
}



15. What are custom settings and custom metadata types?

  • Custom settings: Application data cached at organization, profile, or user level

  • Custom metadata types: Metadata that can be deployed and is accessible via SOQL

Use custom settings for configuration data, custom metadata for deployable application metadata.
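
A sketch of how each is read in Apex (the object names `App_Config__c` and `Integration_Setting__mdt` are hypothetical):

```apex
// Hierarchy custom setting: served from cache, consumes no SOQL query
App_Config__c config = App_Config__c.getInstance(UserInfo.getUserId());
String endpoint = config.Endpoint_URL__c;

// Custom metadata type: queryable like an object, deployable like metadata
Integration_Setting__mdt setting = [
    SELECT Timeout__c
    FROM Integration_Setting__mdt
    WHERE DeveloperName = 'Default'
    LIMIT 1
];
```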



16. How do you implement trigger design patterns to avoid recursion?

Recursion prevention using static variables:


public class TriggerHelper {
    private static Boolean isExecuting = false;
    private static Set<Id> processedIds = new Set<Id>();
    
    public static Boolean isFirstRun() {
        if (!isExecuting) {
            isExecuting = true;
            return true;
        }
        return false;
    }
    
    public static Boolean isProcessed(Id recordId) {
        return processedIds.contains(recordId);
    }
    
    public static void addProcessed(Id recordId) {
        processedIds.add(recordId);
    }
}
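
A trigger would then use the helper to guard against re-entry; a minimal sketch (object and event choices are illustrative):

```apex
trigger OpportunityTrigger on Opportunity (after update) {
    if (TriggerHelper.isFirstRun()) {
        for (Opportunity opp : Trigger.new) {
            if (!TriggerHelper.isProcessed(opp.Id)) {
                TriggerHelper.addProcessed(opp.Id);
                // Business logic that may itself cause re-entrant updates
            }
        }
    }
}
```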

17. Explain batch Apex and provide an implementation example.

Batch Apex processes large datasets asynchronously:


public class AccountBatch implements Database.Batchable<sObject>, Database.Stateful {
    private Integer recordsProcessed = 0;
    
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name FROM Account WHERE Type = null');
    }
    
    public void execute(Database.BatchableContext bc, List<Account> scope) {
        for (Account acc : scope) {
            acc.Type = 'Prospect';
        }
        update scope;
        recordsProcessed += scope.size();
    }
    
    public void finish(Database.BatchableContext bc) {
        System.debug('Processed ' + recordsProcessed + ' records');
    }
}

18. How do you implement asynchronous processing with Queueable Apex?

Queueable Apex for chainable asynchronous operations:


public class AccountProcessor implements Queueable {
    private List<Account> accounts;
    private Integer batchSize;
    
    public AccountProcessor(List<Account> accounts, Integer batchSize) {
        this.accounts = accounts;
        this.batchSize = batchSize;
    }
    
    public void execute(QueueableContext context) {
        List<Account> batch = new List<Account>();
        
        for (Integer i = 0; i < Math.min(batchSize, accounts.size()); i++) {
            batch.add(accounts[i]);
        }
        
        // Process batch
        processAccounts(batch);
        
        // Chain next batch if more records
        if (accounts.size() > batchSize) {
            List<Account> remaining = accounts.subList(batchSize, accounts.size());
            System.enqueueJob(new AccountProcessor(remaining, batchSize));
        }
    }

    // Stub for the processing step (implementation omitted in the original)
    private void processAccounts(List<Account> accountBatch) {
        update accountBatch;
    }
}

19. How do you handle mixed DML operations?

Mixed DML occurs when setup and non-setup objects are modified in the same transaction:


public class MixedDMLHandler {
    @future
    public static void createUserAsync(String firstName, String lastName, String email) {
        // Create user in async context to avoid mixed DML
        User newUser = new User(
            FirstName = firstName,
            LastName = lastName,
            Email = email,
            Username = email,
            Alias = firstName.left(1) + lastName.left(4), // left() avoids out-of-bounds on short names
            ProfileId = [SELECT Id FROM Profile WHERE Name = 'Standard User'].Id
        );
        insert newUser;
    }
    
    public static void handleAccountAndUser(Account acc, String userEmail) {
        insert acc; // Non-setup object
        
        // Use future method for setup object to avoid mixed DML
        createUserAsync('John', 'Doe', userEmail);
    }
}



20. Explain the implementation of sharing and security in Apex.

Sharing and security implementation:


// Manual sharing
public class AccountSharing {
    public static void shareAccountWithUser(Id accountId, Id userId, String accessLevel) {
        AccountShare sharing = new AccountShare();
        sharing.AccountId = accountId;
        sharing.UserOrGroupId = userId;
        sharing.AccountAccessLevel = accessLevel;
        sharing.OpportunityAccessLevel = 'Read';
        
        Database.SaveResult result = Database.insert(sharing, false);
        if (!result.isSuccess()) {
            System.debug('Error sharing account: ' + result.getErrors());
        }
    }
}
// Programmatic sharing rules
public inherited sharing class SecureAccountService {
    public static List<Account> getAccessibleAccounts() {
        return [SELECT Id, Name FROM Account WITH SECURITY_ENFORCED];
    }
}
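A strong candidate should also know the complementary approach: Security.stripInaccessible, which silently removes fields the running user cannot access instead of throwing like WITH SECURITY_ENFORCED does. A sketch:

```
public inherited sharing class SecureAccountReader {
    public static List<Account> getReadableAccounts() {
        // Removes any fields the user lacks field-level read access to,
        // rather than throwing a QueryException
        SObjectAccessDecision decision = Security.stripInaccessible(
            AccessType.READABLE,
            [SELECT Id, Name, AnnualRevenue FROM Account LIMIT 100]
        );
        return (List<Account>) decision.getRecords();
    }
}
```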



Did you know?

Oracle APEX provides out-of-the-box support for OAuth and social logins such as Google and Facebook.

Intermediate & Expert Level Questions (21-35)

21. How do you implement dynamic SOQL and handle injection prevention?

Dynamic SOQL with security considerations:


public class DynamicSOQLService {
    public static List<SObject> queryRecords(String objectName, List<String> fields, String whereClause) {
        // Validate object access
        if (!Schema.getGlobalDescribe().containsKey(objectName)) {
            throw new IllegalArgumentException('Invalid object name');
        }
        
        // Validate field access
        Map<String, Schema.SObjectField> fieldMap = 
            Schema.getGlobalDescribe().get(objectName).getDescribe().fields.getMap();
        
        for (String field : fields) {
            if (!fieldMap.containsKey(field)) {
                throw new IllegalArgumentException('Invalid field name: ' + field);
            }
        }
        
        // Identifiers are validated against the schema above; any user-supplied
        // values inside whereClause must be escaped with String.escapeSingleQuotes
        String query = 'SELECT ' + String.join(fields, ', ') + ' FROM ' + objectName;
        if (String.isNotBlank(whereClause)) {
            query += ' WHERE ' + whereClause;
        }
        
        return Database.query(query);
    }
}

22. How do you implement REST API integration in Apex?

REST API integration using HTTP callouts:


public class ExternalAPIService {
    @future(callout=true)
    public static void makeRestCallout(String endpoint, String method, String body) {
        Http http = new Http();
        HttpRequest request = new HttpRequest();
        
        request.setEndpoint(endpoint);
        request.setMethod(method);
        request.setHeader('Content-Type', 'application/json');
        request.setHeader('Authorization', 'Bearer ' + getAuthToken());
        
        if (String.isNotBlank(body)) {
            request.setBody(body);
        }
        
        try {
            HttpResponse response = http.send(request);
            
            if (response.getStatusCode() == 200) {
                processResponse(response.getBody());
            } else {
                System.debug('Error: ' + response.getStatusCode() + ' ' + response.getStatus());
            }
        } catch (Exception e) {
            System.debug('Callout failed: ' + e.getMessage());
        }
    }
    
    private static String getAuthToken() {
        // Implement OAuth or API key logic
        return 'your_auth_token';
    }
    
    private static void processResponse(String responseBody) {
        // Parse and process response
        Map<String, Object> responseMap = (Map<String, Object>) JSON.deserializeUntyped(responseBody);
        // Process the response data
    }
}
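Callouts are not permitted in test context, so a natural follow-up question is how the service above would be tested. A sketch using HttpCalloutMock (endpoint and body values are illustrative):

```
@isTest
private class ExternalAPIServiceTest {
    private class SuccessMock implements HttpCalloutMock {
        public HttpResponse respond(HttpRequest req) {
            HttpResponse res = new HttpResponse();
            res.setStatusCode(200);
            res.setBody('{"status": "ok"}');
            return res;
        }
    }
    
    @isTest
    static void testMakeRestCallout() {
        Test.setMock(HttpCalloutMock.class, new SuccessMock());
        
        Test.startTest();
        ExternalAPIService.makeRestCallout('https://example.com/api', 'POST', '{"id": 1}');
        Test.stopTest(); // the @future method executes here, against the mock
    }
}
```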



23. How do you handle governor limits in complex applications?

Governor limit management strategies:



public class LimitManager {
    public static void checkLimits() {
        System.debug('SOQL Queries used: ' + Limits.getQueries() + '/' + Limits.getLimitQueries());
        System.debug('DML Statements used: ' + Limits.getDMLStatements() + '/' + Limits.getLimitDMLStatements());
        System.debug('Heap Size used: ' + Limits.getHeapSize() + '/' + Limits.getLimitHeapSize());
        
        // Warn when 80% of the SOQL query limit is consumed
        if (Limits.getQueries() > Limits.getLimitQueries() * 0.8) {
            System.debug('WARNING: Approaching SOQL query limit');
        }
    }
    
    public static void processInBatches(List<SObject> records, Integer batchSize) {
        List<SObject> batch = new List<SObject>();
        
        for (SObject record : records) {
            batch.add(record);
            
            if (batch.size() == batchSize) {
                processBatch(batch);
                batch.clear();
            }
        }
        
        // Process remaining records
        if (!batch.isEmpty()) {
            processBatch(batch);
        }
    }
    
    private static void processBatch(List<SObject> batch) {
        // Process batch while monitoring limits
        checkLimits();
        update batch;
    }
}



24. How do you implement custom metadata types in your solutions?

Custom metadata types for configuration:



public class ConfigurationService {
    private static Map<String, Integration_Setting__mdt> settingsCache;
    
    public static Integration_Setting__mdt getSetting(String settingName) {
        if (settingsCache == null) {
            loadSettings();
        }
        
        return settingsCache.get(settingName);
    }
    
    private static void loadSettings() {
        settingsCache = new Map<String, Integration_Setting__mdt>();
        
        for (Integration_Setting__mdt setting : [
            SELECT DeveloperName, Endpoint__c, Timeout__c, Retry_Count__c 
            FROM Integration_Setting__mdt
        ]) {
            settingsCache.put(setting.DeveloperName, setting);
        }
    }
    
    public static void makeConfigurableCallout(String settingName, String payload) {
        Integration_Setting__mdt setting = getSetting(settingName);
        
        if (setting != null) {
            Http http = new Http();
            HttpRequest req = new HttpRequest();
            req.setEndpoint(setting.Endpoint__c);
            req.setTimeout(Integer.valueOf(setting.Timeout__c));
            req.setBody(payload);
            
            // Implement retry logic based on setting.Retry_Count__c
        }
    }
}



25. How do you implement platform events for event-driven architecture?

Platform events for decoupled communication:



// Publisher
public class OrderEventPublisher {
    public static void publishOrderEvent(Id orderId, String status) {
        Order_Status_Event__e event = new Order_Status_Event__e();
        event.Order_Id__c = orderId;
        event.Status__c = status;
        event.Timestamp__c = System.now();
        
        Database.SaveResult result = EventBus.publish(event);
        
        if (!result.isSuccess()) {
            System.debug('Error publishing event: ' + result.getErrors());
        }
    }
}
// Subscriber (Trigger on Platform Event)
trigger OrderStatusEventTrigger on Order_Status_Event__e (after insert) {
    List<Task> tasksToCreate = new List<Task>();
    
    for (Order_Status_Event__e event : Trigger.new) {
        if (event.Status__c == 'Shipped') {
            Task followUpTask = new Task();
            followUpTask.Subject = 'Follow up on shipped order';
            followUpTask.WhatId = event.Order_Id__c;
            followUpTask.ActivityDate = Date.today().addDays(3);
            tasksToCreate.add(followUpTask);
        }
    }
    
    if (!tasksToCreate.isEmpty()) {
        insert tasksToCreate;
    }
}
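Candidates should also know how subscriber triggers are exercised in tests: publish the event, then deliver it explicitly. A sketch using the event fields defined above (the Order Id value is illustrative):

```
@isTest
private class OrderStatusEventTriggerTest {
    @isTest
    static void testShippedEventCreatesTask() {
        Order_Status_Event__e event = new Order_Status_Event__e(
            Order_Id__c = '801000000000001', // illustrative value
            Status__c = 'Shipped',
            Timestamp__c = System.now()
        );
        
        Test.startTest();
        EventBus.publish(event);
        Test.getEventBus().deliver(); // fires the subscriber trigger synchronously
        Test.stopTest();
        
        System.assertEquals(1,
            [SELECT COUNT() FROM Task WHERE Subject = 'Follow up on shipped order']);
    }
}
```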



26. How do you implement Lightning Web Component (LWC) integration with Apex?

LWC-Apex integration patterns:



// Apex Controller for LWC
public with sharing class AccountController {
    @AuraEnabled(cacheable=true)
    public static List<Account> getAccounts(String searchTerm) {
        String searchKey = '%' + searchTerm + '%';
        return [
            SELECT Id, Name, Industry, AnnualRevenue 
            FROM Account 
            WHERE Name LIKE :searchKey 
            WITH SECURITY_ENFORCED
            LIMIT 50
        ];
    }
    
    @AuraEnabled
    public static void updateAccount(Account account) {
        try {
            update account;
        } catch (DMLException e) {
            throw new AuraHandledException(e.getMessage());
        }
    }
    
    @AuraEnabled
    public static String createAccountWithContacts(String accountData, String contactsData) {
        try {
            Account acc = (Account) JSON.deserialize(accountData, Account.class);
            insert acc;
            
            List<Contact> contacts = (List<Contact>) JSON.deserialize(contactsData, List<Contact>.class);
            for (Contact con : contacts) {
                con.AccountId = acc.Id;
            }
            insert contacts;
            
            return acc.Id;
        } catch (Exception e) {
            throw new AuraHandledException('Error creating account: ' + e.getMessage());
        }
    }
}



27. How do you implement data factory patterns for test data creation?

Test data factory for maintainable tests:



@isTest
public class TestDataFactory {
    public static Account createAccount(String name, String industry) {
        return new Account(
            Name = name,
            Industry = industry,
            BillingCity = 'San Francisco',
            BillingState = 'CA'
        );
    }
    
    public static List<Account> createAccounts(Integer count) {
        List<Account> accounts = new List<Account>();
        
        for (Integer i = 0; i < count; i++) {
            accounts.add(createAccount('Test Account ' + i, 'Technology'));
        }
        
        return accounts;
    }
    
    public static Contact createContact(Id accountId, String firstName, String lastName) {
        return new Contact(
            AccountId = accountId,
            FirstName = firstName,
            LastName = lastName,
            Email = firstName.toLowerCase() + '.' + lastName.toLowerCase() + '@test.com'
        );
    }
    
    public static User createTestUser(String profileName, String username) {
        Profile profile = [SELECT Id FROM Profile WHERE Name = :profileName LIMIT 1];
        
        return new User(
            FirstName = 'Test',
            LastName = 'User',
            Email = username + '@test.com',
            Username = username + '@test.com.dev',
            Alias = 'tuser',
            ProfileId = profile.Id,
            TimeZoneSidKey = 'America/Los_Angeles',
            LocaleSidKey = 'en_US',
            EmailEncodingKey = 'UTF-8',
            LanguageLocaleKey = 'en_US'
        );
    }
}
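A typical consuming test shows why the factory keeps tests terse and enforces bulk-scale coverage:

```
@isTest
private class AccountTriggerTest {
    @isTest
    static void testBulkInsert() {
        List<Account> accounts = TestDataFactory.createAccounts(200);
        
        Test.startTest();
        insert accounts; // exercises trigger logic at bulk scale
        Test.stopTest();
        
        System.assertEquals(200, [SELECT COUNT() FROM Account]);
    }
}
```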



28. How do you implement complex domain-driven design patterns in Salesforce?

Domain-driven design implementation:



// Domain Layer - Business Logic
public virtual class OpportunityDomain {
    protected List<Opportunity> opportunities;
    
    public OpportunityDomain(List<Opportunity> opportunities) {
        this.opportunities = opportunities;
    }
    
    public virtual void validateBusinessRules() {
        for (Opportunity opp : opportunities) {
            validateCloseDate(opp);
            validateAmount(opp);
            validateStageProgression(opp);
        }
    }
    
    protected virtual void validateCloseDate(Opportunity opp) {
        if (opp.CloseDate < Date.today()) {
            opp.addError('Close date cannot be in the past');
        }
    }
    
    protected virtual void validateAmount(Opportunity opp) {
        if (opp.Amount <= 0) {
            opp.addError('Amount must be greater than zero');
        }
    }
    
    protected virtual void validateStageProgression(Opportunity opp) {
        // Complex stage progression logic
        if (Trigger.isUpdate) {
            Opportunity oldOpp = Trigger.oldMap.get(opp.Id);
            if (!isValidStageProgression(oldOpp.StageName, opp.StageName)) {
                opp.addError('Invalid stage progression');
            }
        }
    }
    
    private Boolean isValidStageProgression(String oldStage, String newStage) {
        // Stage progression business rules
        Map<String, Set<String>> validProgressions = new Map<String, Set<String>>{
            'Prospecting' => new Set<String>{'Qualification', 'Closed Lost'},
            'Qualification' => new Set<String>{'Needs Analysis', 'Closed Lost'},
            'Needs Analysis' => new Set<String>{'Value Proposition', 'Closed Lost'},
            'Value Proposition' => new Set<String>{'Id. Decision Makers', 'Closed Lost'},
            'Id. Decision Makers' => new Set<String>{'Perception Analysis', 'Closed Lost'},
            'Perception Analysis' => new Set<String>{'Proposal/Price Quote', 'Closed Lost'},
            'Proposal/Price Quote' => new Set<String>{'Negotiation/Review', 'Closed Lost'},
            'Negotiation/Review' => new Set<String>{'Closed Won', 'Closed Lost'}
        };
        
        return validProgressions.get(oldStage)?.contains(newStage) ?? false;
    }
}
// Service Layer - Application Logic
public class OpportunityService {
    public static void processOpportunities(List<Opportunity> opportunities) {
        OpportunityDomain domain = new OpportunityDomain(opportunities);
        domain.validateBusinessRules();
        
        // Additional service layer operations
        updateRelatedRecords(opportunities);
        sendNotifications(opportunities);
    }
    
    private static void updateRelatedRecords(List<Opportunity> opportunities) {
        // Update related accounts, contacts, etc.
    }
    
    private static void sendNotifications(List<Opportunity> opportunities) {
        // Send email notifications, platform events, etc.
    }
}

29. How do you implement enterprise-grade error handling and logging frameworks?

Comprehensive error handling framework:



public class Logger {
    private static List<Log_Entry__c> logEntries = new List<Log_Entry__c>();
    
    public enum LogLevel { DEBUG, INFO, WARN, ERROR, FATAL }
    
    public static void log(LogLevel level, String className, String methodName, String message, Exception ex) {
        Log_Entry__c entry = new Log_Entry__c();
        entry.Level__c = level.name();
        entry.Class_Name__c = className;
        entry.Method_Name__c = methodName;
        entry.Message__c = message;
        entry.Stack_Trace__c = ex?.getStackTraceString();
        entry.User__c = UserInfo.getUserId();
        entry.Timestamp__c = System.now();
        
        logEntries.add(entry);
        
        // Immediate insertion for errors and fatal logs
        if (level == LogLevel.ERROR || level == LogLevel.FATAL) {
            flushLogs();
        }
    }
    
    public static void flushLogs() {
        if (!logEntries.isEmpty()) {
            try {
                insert logEntries;
                logEntries.clear();
            } catch (DMLException e) {
                // Fallback to System.debug if database insert fails
                System.debug('Failed to insert log entries: ' + e.getMessage());
            }
        }
    }
    
    // Automatic log flushing on transaction completion
    public static void handleTransactionEnd() {
        flushLogs();
    }
}
// Error Handler Utility
public class ErrorHandler {
    public static void handleException(Exception ex, String context) {
        Logger.log(Logger.LogLevel.ERROR, 
                  ErrorHandler.class.getName(), 
                  'handleException', 
                  'Error in ' + context + ': ' + ex.getMessage(), 
                  ex);
        
        // Send critical error notifications
        if (ex instanceof System.LimitException) {
            sendCriticalErrorNotification(ex, context);
        }
    }
    
    private static void sendCriticalErrorNotification(Exception ex, String context) {
        // Send email to system administrators
        // Create platform event for monitoring systems
        // Log to external monitoring tools
    }
    
    public static void processWithErrorHandling(String context, ProcessingDelegate processor) {
        try {
            processor.process();
        } catch (Exception ex) {
            handleException(ex, context);
            throw ex; // Re-throw if needed
        }
    }
}
// Delegate interface for error handling
public interface ProcessingDelegate {
    void process();
}
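Because Apex has no lambdas, the delegate must be implemented as a small class. A usage sketch (the AccountTypeUpdater class and 'Customer' value are illustrative):

```
public class AccountTypeUpdater implements ProcessingDelegate {
    private List<Account> accounts;
    
    public AccountTypeUpdater(List<Account> accounts) {
        this.accounts = accounts;
    }
    
    public void process() {
        for (Account acc : accounts) {
            acc.Type = 'Customer';
        }
        update accounts;
    }
}

// Any exception is logged via Logger before being re-thrown
ErrorHandler.processWithErrorHandling('Account type update',
    new AccountTypeUpdater(accounts));
```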

30. How do you implement sophisticated caching strategies in Apex?

Multi-level caching implementation:



public class CacheManager {
    // Org cache partition
    private static final String ORG_PARTITION = 'OrgData';
    // Session cache partition  
    private static final String SESSION_PARTITION = 'SessionData';
    
    // Static cache for transaction-level caching
    private static Map<String, Object> transactionCache = new Map<String, Object>();
    
    public static Object get(String key, CacheLevel level) {
        switch on level {
            when TRANSACTION {
                return transactionCache.get(key);
            }
            when SESSION {
                return Cache.Session.get(SESSION_PARTITION + '.' + key);
            }
            when ORG {
                return Cache.Org.get(ORG_PARTITION + '.' + key);
            }
        }
        return null;
    }
    
    public static void put(String key, Object value, CacheLevel level, Integer ttlSeconds) {
        switch on level {
            when TRANSACTION {
                transactionCache.put(key, value);
            }
            when SESSION {
                Cache.Session.put(SESSION_PARTITION + '.' + key, value, ttlSeconds);
            }
            when ORG {
                Cache.Org.put(ORG_PARTITION + '.' + key, value, ttlSeconds);
            }
        }
    }
    
    public static Boolean contains(String key, CacheLevel level) {
        switch on level {
            when TRANSACTION {
                return transactionCache.containsKey(key);
            }
            when SESSION {
                return Cache.Session.contains(SESSION_PARTITION + '.' + key);
            }
            when ORG {
                return Cache.Org.contains(ORG_PARTITION + '.' + key);
            }
        }
        return false;
    }
    
    public enum CacheLevel { TRANSACTION, SESSION, ORG }
}
// Cached data service example
public class AccountCacheService {
    private static final String ACCOUNT_CACHE_KEY = 'AccountData_';
    private static final Integer CACHE_TTL = 3600; // 1 hour
    
    public static Account getCachedAccount(Id accountId) {
        String cacheKey = ACCOUNT_CACHE_KEY + accountId;
        
        // Try transaction cache first
        Account account = (Account) CacheManager.get(cacheKey, CacheManager.CacheLevel.TRANSACTION);
        if (account != null) {
            return account;
        }
        
        // Try session cache
        account = (Account) CacheManager.get(cacheKey, CacheManager.CacheLevel.SESSION);
        if (account != null) {
            // Store in transaction cache for faster access
            CacheManager.put(cacheKey, account, CacheManager.CacheLevel.TRANSACTION, CACHE_TTL);
            return account;
        }
        
        // Query database and cache result
        account = [SELECT Id, Name, Industry, AnnualRevenue FROM Account WHERE Id = :accountId LIMIT 1];
        
        CacheManager.put(cacheKey, account, CacheManager.CacheLevel.TRANSACTION, CACHE_TTL);
        CacheManager.put(cacheKey, account, CacheManager.CacheLevel.SESSION, CACHE_TTL);
        
        return account;
    }
}

31. How do you implement complex data migration and synchronization patterns?

Enterprise data migration framework:



public class DataMigrationFramework {
    public interface MigrationStep {
        void execute(MigrationContext context);
        void rollback(MigrationContext context);
        String getStepName();
    }
    
    public class MigrationContext {
        public Map<String, Object> parameters;
        public List<String> errors;
        public Integer batchSize;
        public Boolean dryRun;
        
        public MigrationContext() {
            this.parameters = new Map<String, Object>();
            this.errors = new List<String>();
            this.batchSize = 200;
            this.dryRun = false;
        }
    }
    
    public class MigrationPipeline {
        private List<MigrationStep> steps;
        private MigrationContext context;
        
        public MigrationPipeline(MigrationContext context) {
            this.steps = new List<MigrationStep>();
            this.context = context;
        }
        
        public MigrationPipeline addStep(MigrationStep step) {
            this.steps.add(step);
            return this;
        }
        
        public MigrationResult execute() {
            MigrationResult result = new MigrationResult();
            List<MigrationStep> executedSteps = new List<MigrationStep>();
            
            try {
                for (MigrationStep step : steps) {
                    Logger.log(Logger.LogLevel.INFO, 'MigrationPipeline', 'execute', 
                              'Executing step: ' + step.getStepName(), null);
                    
                    step.execute(context);
                    executedSteps.add(step);
                    
                    result.completedSteps.add(step.getStepName());
                }
                
                result.success = true;
            } catch (Exception ex) {
                Logger.log(Logger.LogLevel.ERROR, 'MigrationPipeline', 'execute', 
                          'Migration failed', ex);
                
                result.success = false;
                result.errorMessage = ex.getMessage();
                
                // Rollback executed steps in reverse order
                rollbackSteps(executedSteps);
            }
            
            return result;
        }
        
        private void rollbackSteps(List<MigrationStep> executedSteps) {
            for (Integer i = executedSteps.size() - 1; i >= 0; i--) {
                try {
                    executedSteps[i].rollback(context);
                } catch (Exception rollbackEx) {
                    Logger.log(Logger.LogLevel.ERROR, 'MigrationPipeline', 'rollbackSteps', 
                              'Rollback failed for step: ' + executedSteps[i].getStepName(), rollbackEx);
                }
            }
        }
    }
    
    public class MigrationResult {
        public Boolean success;
        public String errorMessage;
        public List<String> completedSteps;
        
        public MigrationResult() {
            this.completedSteps = new List<String>();
        }
    }
}
// Example migration step implementation
public class AccountDataMigrationStep implements DataMigrationFramework.MigrationStep {
    public void execute(DataMigrationFramework.MigrationContext context) {
        List<Legacy_Account__c> legacyAccounts = [SELECT Name, Industry__c, Revenue__c FROM Legacy_Account__c];
        List<Account> accountsToInsert = new List<Account>();
        
        for (Legacy_Account__c legacy : legacyAccounts) {
            Account newAccount = new Account();
            newAccount.Name = legacy.Name;
            newAccount.Industry = legacy.Industry__c;
            newAccount.AnnualRevenue = legacy.Revenue__c;
            accountsToInsert.add(newAccount);
        }
        
        if (!context.dryRun) {
            Database.SaveResult[] results = Database.insert(accountsToInsert, false);
            
            for (Database.SaveResult result : results) {
                if (!result.isSuccess()) {
                    context.errors.add('Failed to migrate account: ' + result.getErrors());
                }
            }
        }
    }
    
    public void rollback(DataMigrationFramework.MigrationContext context) {
        // Implement rollback logic
        delete [SELECT Id FROM Account WHERE CreatedDate = TODAY];
    }
    
    public String getStepName() {
        return 'Account Data Migration';
    }
}

32. How do you implement advanced security patterns including encryption and tokenization?

Advanced security implementation:



public class SecurityManager {
    
    // Field-level encryption
    public static String encryptSensitiveData(String plainText) {
        if (String.isBlank(plainText)) {
            return plainText;
        }
        
        try {
            Blob key = Crypto.generateAesKey(256);
            Blob data = Blob.valueOf(plainText);
            // encryptWithManagedIV generates a random IV and prepends it to the result
            Blob encryptedData = Crypto.encryptWithManagedIV('AES256', key, data);
            
            // Store the key securely (this is a simplified example)
            storeEncryptionKey(key);
            
            return EncodingUtil.base64Encode(encryptedData);
        } catch (Exception ex) {
            Logger.log(Logger.LogLevel.ERROR, 'SecurityManager', 'encryptSensitiveData', 
                      'Encryption failed', ex);
            throw ex;
        }
    }
    
    public static String decryptSensitiveData(String encryptedText) {
        if (String.isBlank(encryptedText)) {
            return encryptedText;
        }
        
        try {
            Blob key = retrieveEncryptionKey();
            Blob encryptedData = EncodingUtil.base64Decode(encryptedText);
            Blob decryptedData = Crypto.decryptWithManagedIV('AES256', key, encryptedData);
            
            return decryptedData.toString();
        } catch (Exception ex) {
            Logger.log(Logger.LogLevel.ERROR, 'SecurityManager', 'decryptSensitiveData', 
                      'Decryption failed', ex);
            throw ex;
        }
    }
    
    // Data tokenization for sensitive data
    public static String tokenizeSensitiveData(String sensitiveData, String tokenType) {
        String token = generateSecureToken();
        
        // Store mapping in protected custom setting
        Token_Mapping__c mapping = new Token_Mapping__c();
        mapping.Token__c = token;
        mapping.Token_Type__c = tokenType;
        mapping.Original_Value__c = encryptSensitiveData(sensitiveData);
        
        insert mapping;
        
        return token;
    }
    
    public static String detokenizeData(String token) {
        Token_Mapping__c mapping = [
            SELECT Original_Value__c 
            FROM Token_Mapping__c 
            WHERE Token__c = :token 
            LIMIT 1
        ];
        
        return decryptSensitiveData(mapping.Original_Value__c);
    }
    
    private static String generateSecureToken() {
        // Generate cryptographically secure token
        Blob randomBytes = Crypto.generateAesKey(128);
        return EncodingUtil.base64Encode(randomBytes);
    }
    
    private static void storeEncryptionKey(Blob key) {
        // Store encryption key securely in protected custom setting or external system
        // This is a simplified implementation
    }
    
    private static Blob retrieveEncryptionKey() {
        // Retrieve encryption key from secure storage
        // This is a simplified implementation
        return Crypto.generateAesKey(256);
    }
    
    // Data masking for non-production environments
    public static String maskSensitiveData(String originalValue, String maskingPattern) {
        if (String.isBlank(originalValue)) {
            return originalValue;
        }
        
        switch on maskingPattern {
            when 'EMAIL' {
                return maskEmail(originalValue);
            }
            when 'PHONE' {
                return maskPhoneNumber(originalValue);
            }
            when 'SSN' {
                return maskSSN(originalValue);
            }
            when else {
                return '***MASKED***';
            }
        }
    }
    
    private static String maskEmail(String email) {
        if (!email.contains('@')) {
            return email;
        }
        
        String[] parts = email.split('@');
        String localPart = parts[0];
        String domain = parts[1];
        
        String maskedLocal = localPart.length() > 2 ? 
            localPart.substring(0, 2) + '***' : 
            '***';
            
        return maskedLocal + '@' + domain;
    }
    
    private static String maskPhoneNumber(String phone) {
        if (phone.length() < 4) {
            return '***';
        }
        
        return '***-***-' + phone.substring(phone.length() - 4);
    }
    
    private static String maskSSN(String ssn) {
        if (ssn.length() < 4) {
            return '***';
        }
        
        return '***-**-' + ssn.substring(ssn.length() - 4);
    }
}

33. How do you implement enterprise integration patterns with external systems?

Enterprise integration patterns:



public class IntegrationOrchestrator {
    public interface MessageProcessor {
        void processMessage(IntegrationMessage message);
    }
    
    public class IntegrationMessage {
        public String messageId;
        public String messageType;
        public String source;
        public String destination;
        public Map<String, Object> payload;
        public DateTime timestamp;
        public Integer retryCount;
        
        public IntegrationMessage(String messageType, String source, String destination, Map<String, Object> payload) {
            this.messageId = generateMessageId();
            this.messageType = messageType;
            this.source = source;
            this.destination = destination;
            this.payload = payload;
            this.timestamp = System.now();
            this.retryCount = 0;
        }
        
        private String generateMessageId() {
            return 'MSG_' + System.currentTimeMillis() + '_' + Math.round(Math.random() * 1000);
        }
    }
    
    // Message Router Pattern
    public class MessageRouter {
        private Map<String, MessageProcessor> processors;
        
        public MessageRouter() {
            this.processors = new Map<String, MessageProcessor>();
        }
        
        public void registerProcessor(String messageType, MessageProcessor processor) {
            processors.put(messageType, processor);
        }
        
        public void routeMessage(IntegrationMessage message) {
            MessageProcessor processor = processors.get(message.messageType);
            
            if (processor != null) {
                try {
                    processor.processMessage(message);
                } catch (Exception ex) {
                    handleProcessingError(message, ex);
                }
            } else {
                Logger.log(Logger.LogLevel.WARN, 'MessageRouter', 'routeMessage', 
                          'No processor found for message type: ' + message.messageType, null);
            }
        }
        
        private void handleProcessingError(IntegrationMessage message, Exception ex) {
            message.retryCount++;
            
            if (message.retryCount < 3) {
                // Retry logic
                System.enqueueJob(new RetryProcessor(message));
            } else {
                // Send to dead letter queue
                sendToDeadLetterQueue(message, ex);
            }
        }
    }
    
    // Retry Processor for failed messages
    public class RetryProcessor implements Queueable {
        private IntegrationMessage message;
        
        public RetryProcessor(IntegrationMessage message) {
            this.message = message;
        }
        
        public void execute(QueueableContext context) {
            // Exponential backoff delay
            Integer delay = (Integer) Math.pow(2, message.retryCount) * 1000;
            
            // Simulate delay (in real implementation, use scheduled job)
            MessageRouter router = new MessageRouter();
            router.routeMessage(message);
        }
    }
    
    private static void sendToDeadLetterQueue(IntegrationMessage message, Exception ex) {
        Dead_Letter_Queue__c dlq = new Dead_Letter_Queue__c();
        dlq.Message_Id__c = message.messageId;
        dlq.Message_Type__c = message.messageType;
        dlq.Payload__c = JSON.serialize(message.payload);
        dlq.Error_Message__c = ex.getMessage();
        dlq.Retry_Count__c = message.retryCount;
        
        insert dlq;
    }
}
// Specific message processor implementation (sketch): the router invokes
// processMessage, so concrete processors implement the MessageProcessor interface
public class AccountSyncProcessor implements MessageProcessor {
    public void processMessage(IntegrationMessage message) {
        // Deserialize the payload and upsert the corresponding Account records
    }
}

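An interviewer can also probe how the router is bootstrapped. A minimal sketch (the message fields, no-arg constructor, and registration key are assumptions based on the code above):

```apex
// Hypothetical bootstrap: register a processor, then route an inbound message
MessageRouter router = new MessageRouter();
router.registerProcessor('ACCOUNT_SYNC', new AccountSyncProcessor());

IntegrationMessage msg = new IntegrationMessage(); // assumes a no-arg constructor
msg.messageId = 'MSG-001';                         // illustrative values
msg.messageType = 'ACCOUNT_SYNC';
msg.retryCount = 0;
router.routeMessage(msg);
```

A message with an unregistered type falls through to the warning branch, while a processor that throws three times ends up in the dead letter queue.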
34. How do you implement advanced Lightning Web Component architecture with Apex integration?

Advanced LWC-Apex integration patterns:

// Advanced controller with caching and error handling
public with sharing class AdvancedAccountController {
    private static final String CACHE_PARTITION = 'AccountData';
    private static final Integer CACHE_TTL = 3600; // 1 hour
    
    // Cacheable method for wire service
    @AuraEnabled(cacheable=true)
    public static Map<String, Object> getAccountsWithMetadata(
        String searchTerm, 
        String industry, 
        Integer pageSize, 
        Integer pageNumber
    ) {
        try {
            String cacheKey = generateCacheKey('getAccounts', new List<String>{
                searchTerm, industry, String.valueOf(pageSize), String.valueOf(pageNumber)
            });
            
            // Try cache first
            Map<String, Object> cachedResult = (Map<String, Object>) 
                Cache.Org.get(CACHE_PARTITION + '.' + cacheKey);
            
            if (cachedResult != null) {
                return cachedResult;
            }
            
            // Build dynamic query
            String query = buildAccountQuery(searchTerm, industry);
            Integer offset = (pageNumber - 1) * pageSize;
            
            List<Account> accounts = Database.query(query + ' LIMIT :pageSize OFFSET :offset');
            Integer totalCount = Database.countQuery(buildCountQuery(searchTerm, industry));
            
            Map<String, Object> result = new Map<String, Object>{
                'accounts' => accounts,
                'totalCount' => totalCount,
                'pageSize' => pageSize,
                'pageNumber' => pageNumber,
                'totalPages' => Math.ceil((Decimal)totalCount / pageSize)
            };
            
            // Cache result
            Cache.Org.put(CACHE_PARTITION + '.' + cacheKey, result, CACHE_TTL);
            
            return result;
            
        } catch (Exception e) {
            throw new AuraHandledException('Error retrieving accounts: ' + e.getMessage());
        }
    }
    
    // Imperative method for data manipulation
    @AuraEnabled
    public static Map<String, Object> updateAccountWithValidation(
        Account account, 
        List<Contact> contacts
    ) {
        Savepoint sp = Database.setSavepoint();
        
        try {
            // Validate account data
            validateAccountData(account);
            
            // Update account
            update account;
            
            // Process related contacts
            Map<String, Object> contactResult = processContacts(account.Id, contacts);
            
            // Invalidate cache
            invalidateAccountCache();
            
            return new Map<String, Object>{
                'success' => true,
                'account' => account,
                'contactResult' => contactResult,
                'message' => 'Account updated successfully'
            };
            
        } catch (Exception e) {
            Database.rollback(sp);
            
            return new Map<String, Object>{
                'success' => false,
                'error' => e.getMessage(),
                'errorType' => e.getTypeName()
            };
        }
    }
    
    // Streaming data for real-time updates
    @AuraEnabled
    public static String subscribeToAccountUpdates(Id accountId) {
        try {
            // Create platform event subscription
            Account_Update_Event__e updateEvent = new Account_Update_Event__e();
            updateEvent.Account_Id__c = accountId;
            updateEvent.Subscriber_Id__c = UserInfo.getUserId();
            updateEvent.Subscription_Type__c = 'REAL_TIME_UPDATES';
            
            Database.SaveResult result = EventBus.publish(updateEvent);
            
            if (result.isSuccess()) {
                return 'Subscription created successfully';
            } else {
                throw new AuraHandledException('Failed to create subscription');
            }
            
        } catch (Exception e) {
            throw new AuraHandledException('Error creating subscription: ' + e.getMessage());
        }
    }
    
    // File upload handling
    @AuraEnabled
    public static Map<String, Object> uploadAccountDocuments(
        Id accountId,
        String fileName,
        String base64Data,
        String contentType
    ) {
        try {
            // Validate file
            validateFileUpload(fileName, base64Data, contentType);
            
            // Create content version
            ContentVersion cv = new ContentVersion();
            cv.Title = fileName;
            cv.PathOnClient = fileName;
            cv.VersionData = EncodingUtil.base64Decode(base64Data);
            cv.ContentType = contentType;
            
            insert cv;
            
            // Link to account
            ContentDocumentLink cdl = new ContentDocumentLink();
            cdl.LinkedEntityId = accountId;
            cdl.ContentDocumentId = [SELECT ContentDocumentId FROM ContentVersion WHERE Id = :cv.Id].ContentDocumentId;
            cdl.ShareType = 'V';
            cdl.Visibility = 'AllUsers';
            
            insert cdl;
            
            return new Map<String, Object>{
                'success' => true,
                'fileId' => cv.Id,
                'message' => 'File uploaded successfully'
            };
            
        } catch (Exception e) {
            throw new AuraHandledException('Error uploading file: ' + e.getMessage());
        }
    }
    
    // Batch operations for bulk updates
    @AuraEnabled
    public static Map<String, Object> bulkUpdateAccounts(List<Map<String, Object>> accountData) {
        try {
            List<Account> accountsToUpdate = new List<Account>();
            List<String> errors = new List<String>();
            
            for (Map<String, Object> data : accountData) {
                try {
                    Account acc = new Account();
                    acc.Id = (Id) data.get('Id');
                    acc.Name = (String) data.get('Name');
                    acc.Industry = (String) data.get('Industry');
                    acc.Rating = (String) data.get('Rating');
                    
                    validateAccountData(acc);
                    accountsToUpdate.add(acc);
                    
                } catch (Exception e) {
                    errors.add('Error processing account ' + data.get('Id') + ': ' + e.getMessage());
                }
            }
            
            // Perform bulk update
            Database.SaveResult[] results = Database.update(accountsToUpdate, false);
            
            Integer successCount = 0;
            for (Integer i = 0; i < results.size(); i++) {
                if (results[i].isSuccess()) {
                    successCount++;
                } else {
                    errors.add('Failed to update account ' + accountsToUpdate[i].Id + ': ' + 
                             results[i].getErrors()[0].getMessage());
                }
            }
            
            // Invalidate cache
            invalidateAccountCache();
            
            return new Map<String, Object>{
                'success' => errors.isEmpty(),
                'successCount' => successCount,
                'totalCount' => accountData.size(),
                'errors' => errors
            };
            
        } catch (Exception e) {
            throw new AuraHandledException('Error in bulk update: ' + e.getMessage());
        }
    }
    
    // Helper methods
    private static String buildAccountQuery(String searchTerm, String industry) {
        String baseQuery = 'SELECT Id, Name, Industry, Rating, AnnualRevenue, Phone, Website ' +
                          'FROM Account WHERE IsPersonAccount = false';
        
        List<String> conditions = new List<String>();
        
        if (String.isNotBlank(searchTerm)) {
            conditions.add('(Name LIKE \'%' + String.escapeSingleQuotes(searchTerm) + '%\' OR ' +
                          'Phone LIKE \'%' + String.escapeSingleQuotes(searchTerm) + '%\')');
        }
        
        if (String.isNotBlank(industry)) {
            conditions.add('Industry = \'' + String.escapeSingleQuotes(industry) + '\'');
        }
        
        if (!conditions.isEmpty()) {
            baseQuery += ' AND ' + String.join(conditions, ' AND ');
        }
        
        return baseQuery + ' ORDER BY Name ASC';
    }
    
    private static String buildCountQuery(String searchTerm, String industry) {
        String baseQuery = 'SELECT COUNT() FROM Account WHERE IsPersonAccount = false';
        
        List<String> conditions = new List<String>();
        
        if (String.isNotBlank(searchTerm)) {
            conditions.add('(Name LIKE \'%' + String.escapeSingleQuotes(searchTerm) + '%\' OR ' +
                          'Phone LIKE \'%' + String.escapeSingleQuotes(searchTerm) + '%\')');
        }
        
        if (String.isNotBlank(industry)) {
            conditions.add('Industry = \'' + String.escapeSingleQuotes(industry) + '\'');
        }
        
        if (!conditions.isEmpty()) {
            baseQuery += ' AND ' + String.join(conditions, ' AND ');
        }
        
        return baseQuery;
    }
    
    private static void validateAccountData(Account account) {
        if (String.isBlank(account.Name)) {
            throw new ValidationException('Account name is required');
        }
        
        if (account.Name.length() > 255) {
            throw new ValidationException('Account name is too long');
        }
        
        // Additional validation logic
    }
    
    private static Map<String, Object> processContacts(Id accountId, List<Contact> contacts) {
        List<Contact> contactsToUpsert = new List<Contact>();
        
        for (Contact con : contacts) {
            con.AccountId = accountId;
            contactsToUpsert.add(con);
        }
        
        Database.UpsertResult[] results = Database.upsert(contactsToUpsert, Contact.Id);
        
        Integer successCount = 0;
        for (Database.UpsertResult result : results) {
            if (result.isSuccess()) {
                successCount++;
            }
        }
        
        return new Map<String, Object>{
            'successCount' => successCount,
            'totalCount' => contacts.size()
        };
    }
    
    private static void validateFileUpload(String fileName, String base64Data, String contentType) {
        // File size validation (5MB limit). Base64 inflates the payload by
        // roughly a third, so compare against the approximate decoded size.
        Integer approxDecodedSize = (base64Data.length() * 3) / 4;
        if (approxDecodedSize > 5 * 1024 * 1024) {
            throw new ValidationException('File size exceeds 5MB limit');
        }
        
        // File type validation
        List<String> allowedTypes = new List<String>{'image/jpeg', 'image/png', 'application/pdf'};
        if (!allowedTypes.contains(contentType)) {
            throw new ValidationException('File type not allowed');
        }
    }
    
    private static String generateCacheKey(String method, List<String> parameters) {
        return method + '_' + String.join(parameters, '_').replace(' ', '');
    }
    
    private static void invalidateAccountCache() {
        // Remove cached account data
        // In a real implementation, you might use cache partitions or patterns
        Cache.Org.remove(CACHE_PARTITION + '.getAccounts');
    }
    
    public class ValidationException extends Exception {}
}

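A quick way to test a candidate's grasp of the controller is to have them exercise it from Anonymous Apex. A minimal sketch (the search values and record Id below are illustrative, not real data):

```apex
// Hypothetical Anonymous Apex exercise of the controller above
Map<String, Object> page = AdvancedAccountController.getAccountsWithMetadata(
    'Acme',        // searchTerm
    'Technology',  // industry
    10,            // pageSize
    1              // pageNumber
);
System.debug('Total pages: ' + page.get('totalPages'));

Map<String, Object> result = AdvancedAccountController.bulkUpdateAccounts(
    new List<Map<String, Object>>{
        new Map<String, Object>{
            'Id' => '001000000000001AAA', // hypothetical record Id
            'Name' => 'Acme Corp',
            'Industry' => 'Technology',
            'Rating' => 'Hot'
        }
    }
);
System.debug('Updated: ' + result.get('successCount'));
```

Strong candidates will note that the `cacheable=true` method cannot perform DML, which is why reads and writes are split into separate entry points.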
35. How do you implement enterprise-grade monitoring and observability in Apex applications?

Comprehensive monitoring and observability framework:


// Note: shown as a single class for readability; Apex permits only one level of
// inner-class nesting, so in practice these components would ship as separate
// top-level classes.
public class ObservabilityFramework {
    // Metrics collection
    public class MetricsCollector {
        private static Map<String, MetricData> metrics = new Map<String, MetricData>();
        
        public class MetricData {
            public String name;
            public String type; // COUNTER, GAUGE, HISTOGRAM, TIMER
            public Decimal value;
            public Map<String, String> tags;
            public DateTime timestamp;
            
            public MetricData(String name, String type) {
                this.name = name;
                this.type = type;
                this.value = 0;
                this.tags = new Map<String, String>();
                this.timestamp = System.now();
            }
        }
        
        public static void incrementCounter(String metricName, Map<String, String> tags) {
            String key = generateMetricKey(metricName, tags);
            MetricData metric = metrics.get(key);
            
            if (metric == null) {
                metric = new MetricData(metricName, 'COUNTER');
                metric.tags = tags;
                metrics.put(key, metric);
            }
            
            metric.value++;
            metric.timestamp = System.now();
        }
        
        public static void recordGauge(String metricName, Decimal value, Map<String, String> tags) {
            String key = generateMetricKey(metricName, tags);
            MetricData metric = new MetricData(metricName, 'GAUGE');
            metric.value = value;
            metric.tags = tags;
            metrics.put(key, metric);
        }
        
        public static void recordTimer(String metricName, Long durationMs, Map<String, String> tags) {
            String key = generateMetricKey(metricName, tags);
            MetricData metric = new MetricData(metricName, 'TIMER');
            metric.value = durationMs;
            metric.tags = tags;
            metrics.put(key, metric);
        }
        
        public static List<MetricData> getAllMetrics() {
            return metrics.values();
        }
        
        public static void clearMetrics() {
            metrics.clear();
        }
        
        private static String generateMetricKey(String metricName, Map<String, String> tags) {
            String key = metricName;
            if (tags != null && !tags.isEmpty()) {
                List<String> tagPairs = new List<String>();
                for (String tagKey : tags.keySet()) {
                    tagPairs.add(tagKey + '=' + tags.get(tagKey));
                }
                key += '_' + String.join(tagPairs, '_');
            }
            return key;
        }
    }
    
    // Distributed tracing
    public class TracingService {
        private static Map<String, TraceContext> activeTraces = new Map<String, TraceContext>();
        
        public class TraceContext {
            public String traceId;
            public String spanId;
            public String parentSpanId;
            public String operationName;
            public DateTime startTime;
            public DateTime endTime;
            public Map<String, Object> tags;
            public List<LogEvent> logs;
            
            public TraceContext(String operationName) {
                this.traceId = generateTraceId();
                this.spanId = generateSpanId();
                this.operationName = operationName;
                this.startTime = System.now();
                this.tags = new Map<String, Object>();
                this.logs = new List<LogEvent>();
            }
        }
        
        public class LogEvent {
            public DateTime timestamp;
            public String level;
            public String message;
            public Map<String, Object> fields;
            
            public LogEvent(String level, String message) {
                this.timestamp = System.now();
                this.level = level;
                this.message = message;
                this.fields = new Map<String, Object>();
            }
        }
        
        public static TraceContext startTrace(String operationName) {
            TraceContext trace = new TraceContext(operationName);
            activeTraces.put(trace.traceId, trace);
            
            // Add standard tags
            trace.tags.put('user.id', UserInfo.getUserId());
            trace.tags.put('org.id', UserInfo.getOrganizationId());
            trace.tags.put('operation', operationName);
            
            return trace;
        }
        
        public static void finishTrace(String traceId) {
            TraceContext trace = activeTraces.get(traceId);
            if (trace != null) {
                trace.endTime = System.now();
                
                // Send trace to external monitoring system
                sendTraceToMonitoring(trace);
                
                activeTraces.remove(traceId);
            }
        }
        
        public static void addTraceLog(String traceId, String level, String message) {
            TraceContext trace = activeTraces.get(traceId);
            if (trace != null) {
                trace.logs.add(new LogEvent(level, message));
            }
        }
        
        public static void addTraceTag(String traceId, String key, Object value) {
            TraceContext trace = activeTraces.get(traceId);
            if (trace != null) {
                trace.tags.put(key, value);
            }
        }
        
        // @future methods accept only primitive arguments, so serialize the
        // trace before handing it to the async method
        private static void sendTraceToMonitoring(TraceContext trace) {
            sendTraceToMonitoringAsync(JSON.serialize(trace));
        }
        
        @future(callout=true)
        private static void sendTraceToMonitoringAsync(String traceJson) {
            TraceContext trace = (TraceContext) JSON.deserialize(traceJson, TraceContext.class);
            try {
                // Send to external tracing system (e.g., Jaeger, Zipkin)
                Map<String, Object> traceData = new Map<String, Object>{
                    'traceId' => trace.traceId,
                    'spanId' => trace.spanId,
                    'operationName' => trace.operationName,
                    'startTime' => trace.startTime.getTime(),
                    'endTime' => trace.endTime.getTime(),
                    'duration' => trace.endTime.getTime() - trace.startTime.getTime(),
                    'tags' => trace.tags,
                    'logs' => trace.logs
                };
                
                Http http = new Http();
                HttpRequest request = new HttpRequest();
                request.setEndpoint('https://monitoring-system.example.com/api/traces');
                request.setMethod('POST');
                request.setHeader('Content-Type', 'application/json');
                request.setBody(JSON.serialize(traceData));
                
                HttpResponse response = http.send(request);
                
                if (response.getStatusCode() != 200) {
                    System.debug('Failed to send trace: ' + response.getBody());
                }
                
            } catch (Exception e) {
                System.debug('Error sending trace: ' + e.getMessage());
            }
        }
        
        private static String generateTraceId() {
            return EncodingUtil.convertToHex(Crypto.generateAesKey(128)).substring(0, 16);
        }
        
        private static String generateSpanId() {
            // Valid AES key sizes are 128/192/256 bits, so reuse a 128-bit key
            // and truncate to 8 hex characters
            return EncodingUtil.convertToHex(Crypto.generateAesKey(128)).substring(0, 8);
        }
    }
    
    // Health check system
    public class HealthCheckService {
        public interface HealthCheck {
            HealthStatus check();
            String getName();
        }
        
        public class HealthStatus {
            public Boolean healthy;
            public String status; // UP, DOWN, DEGRADED
            public String message;
            public Map<String, Object> details;
            
            public HealthStatus(Boolean healthy, String status, String message) {
                this.healthy = healthy;
                this.status = status;
                this.message = message;
                this.details = new Map<String, Object>();
            }
        }
        
        private static List<HealthCheck> healthChecks = new List<HealthCheck>();
        
        public static void registerHealthCheck(HealthCheck check) {
            healthChecks.add(check);
        }
        
        public static Map<String, Object> performHealthCheck() {
            Map<String, Object> overallHealth = new Map<String, Object>();
            Map<String, HealthStatus> checkResults = new Map<String, HealthStatus>();
            
            Boolean overallHealthy = true;
            
            for (HealthCheck check : healthChecks) {
                try {
                    HealthStatus status = check.check();
                    checkResults.put(check.getName(), status);
                    
                    if (!status.healthy) {
                        overallHealthy = false;
                    }
                } catch (Exception e) {
                    // A failing check must not break the overall health report
                    checkResults.put(check.getName(),
                        new HealthStatus(false, 'DOWN', 'Check failed: ' + e.getMessage()));
                    overallHealthy = false;
                }
            }
            
            overallHealth.put('status', overallHealthy ? 'UP' : 'DOWN');
            overallHealth.put('checks', checkResults);
            overallHealth.put('timestamp', System.now());
            
            return overallHealth;
        }
    }
}

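Assuming the nested services above are split into top-level classes (Apex allows only one level of inner-class nesting), a caller might instrument a unit of work like this. The operation and metric names are illustrative:

```apex
// Instrument a unit of work with tracing, metrics, and timing
TracingService.TraceContext trace = TracingService.startTrace('syncAccounts');
Long started = System.currentTimeMillis();

try {
    // ... business logic under observation ...
    TracingService.addTraceLog(trace.traceId, 'INFO', 'Sync completed');
    MetricsCollector.incrementCounter(
        'account.sync.success', new Map<String, String>{'source' => 'ERP'});
} catch (Exception e) {
    TracingService.addTraceLog(trace.traceId, 'ERROR', e.getMessage());
    MetricsCollector.incrementCounter(
        'account.sync.failure', new Map<String, String>{'source' => 'ERP'});
} finally {
    MetricsCollector.recordTimer(
        'account.sync.duration', System.currentTimeMillis() - started, null);
    TracingService.finishTrace(trace.traceId);
}
```

Candidates who have run observability code in production should point out that static maps like `metrics` live only for the duration of a single transaction, so metrics must be flushed (via platform events or callouts) before the transaction ends.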
Did you know?

Oracle APEX uses SQL and PL/SQL extensively, making database skills critical for developers.

Frequently Asked Questions

When should our engineering team choose Oracle APEX over Salesforce Apex for a new project?

How do we assess a candidate's ability to handle our specific technical challenges?

What's the biggest difference in skillsets between Oracle APEX and Salesforce developers?

How do we evaluate a candidate's problem-solving approach during technical interviews?

What are the most critical technical skills we should test for in Apex interviews?

Advance your career

Master both declarative Oracle APEX and powerful Salesforce Apex programming. Stand out with hands-on knowledge of integration, performance optimization, and scalable app design.

Want to hire the best talent with proof of skill?

Shortlist candidates with strong proof of skill in just 48 hours.

Founder, Utkrusht AI

Ex. Euler Motors, Oracle