Ultimate Apex Interview Questions Guide (2025): Oracle APEX & Salesforce Apex

The Ultimate Apex Interview Questions Guide



Aug 20, 2025


Key Takeaways

Distinguish candidates by testing practical skills like debugging, performance tuning, and complex business logic implementation beyond memorized theory.

Evaluate both Oracle APEX low-code competencies and Salesforce Apex programming expertise across beginner to expert levels.

Focus on critical platform knowledge including session management, RESTful services, asynchronous processing, and security best practices.

Use scenario-based questions on system design, error handling, integration, and deployment to find candidates ready for enterprise challenges.

Spot red flags such as superficial answers, lack of understanding of governor limits, and poor architectural insight.

Implement multi-stage interview processes combining quizzes, live coding, and architecture discussions for data-driven hiring decisions.

Engineering teams face a critical challenge when hiring developers with Apex expertise. Whether you're building rapid web applications with Oracle APEX or implementing complex business logic with Salesforce Apex, finding candidates who can deliver from day one requires precise technical evaluation.

This comprehensive guide provides 80+ carefully curated interview questions designed specifically for engineering leaders who need to identify genuine Apex expertise. We've structured questions across both Oracle APEX and Salesforce Apex platforms, categorized by skill level to help you assess candidates accurately.

Why This Guide Matters for Engineering Teams

As organizations increasingly rely on low-code platforms and Salesforce ecosystems, the demand for skilled Apex developers has skyrocketed. However, resumes often don't reflect real-world problem-solving abilities. This guide helps you:

  • Evaluate actual coding skills beyond theoretical knowledge

  • Assess architectural thinking for complex enterprise solutions

  • Identify candidates who understand performance implications

  • Test real-world scenario handling rather than textbook answers

The questions in this guide have been validated by engineering teams at companies ranging from startups to Fortune 500 organizations, ensuring they reflect actual job requirements rather than academic concepts.

Oracle APEX Interview Questions

Oracle APEX (Application Express) enables rapid development of data-driven web applications. These questions test candidates' ability to build scalable, secure applications using Oracle's low-code platform.

Beginner Level Questions (1-15)

1. What is Oracle APEX and how does it differ from traditional web development frameworks?

Oracle APEX is a low-code development platform that runs entirely within the Oracle Database. Unlike traditional frameworks that require separate application servers, databases, and extensive coding, APEX provides a declarative development environment where applications are built through configuration rather than custom code.

Key differences include:

  • Tight database integration: Applications run within the Oracle Database, eliminating the need for separate application tiers

  • Declarative development: Components are configured through wizards and forms rather than coded from scratch

  • Built-in security: Automatic protection against SQL injection, XSS, and other common vulnerabilities

  • Rapid deployment: Applications can be built and deployed in hours rather than weeks

2. Explain the architecture of Oracle APEX.

APEX architecture consists of four main components:

  • Oracle Database: Contains the APEX metadata repository, application logic, and data

  • APEX Listener (ORDS): Oracle REST Data Services handles HTTP requests and communicates with the database

  • Web Server: Hosts static files and routes requests to ORDS

  • Browser: Renders the HTML, CSS, and JavaScript generated by APEX

The request flow: Browser → Web Server → ORDS → Oracle Database → APEX Engine → Response back through the chain.

3. What is a workspace in Oracle APEX?

A workspace is a virtual private database that groups APEX applications, users, and database schemas. It provides:

  • Isolation: Each workspace operates independently with its own users and applications

  • Security boundary: Users in one workspace cannot access another workspace's applications

  • Schema mapping: Associates the workspace with one or more database schemas

  • Administration: Manages developers, end users, and application settings

4. Describe the difference between a page and a region in APEX.

  • Page: A complete screen or view in an APEX application, accessible via a unique URL. Contains regions, items, buttons, and processes

  • Region: A container within a page that displays specific content like reports, forms, charts, or static content. Multiple regions can exist on a single page

Think of a page as a webpage and regions as sections or widgets within that page.

5. What are the main types of reports available in Oracle APEX?

  • Classic Report: Simple tabular data display with basic sorting and pagination

  • Interactive Report: Advanced user-customizable reports with filtering, searching, grouping, and personal customizations

  • Interactive Grid: Spreadsheet-like interface allowing inline editing, adding, and deleting records

  • Cards: Visual representation of data in card format

  • Chart: Graphical data representation (bar, pie, line charts, etc.)

6. How do you implement master-detail relationships in APEX?

Master-detail relationships are implemented by:

  1. Creating the master form/report based on the parent table

  2. Adding a detail region on the same page or linked page

  3. Setting the master-detail relationship in the detail region properties

  4. Configuring the link column that connects master to detail records

  5. Setting up automatic refresh so detail records update when master selection changes

7. What is a Dynamic Action in Oracle APEX?

Dynamic Actions provide client-side interactivity without page refreshes. They consist of:

  • When: the triggering event (button click, item change, page load) and the element it fires on

  • Client-side condition: optional criteria that must be met before the actions run

  • True/False actions: what happens (show/hide items, execute JavaScript, refresh regions)

  • Affected elements: the items or regions the actions operate on

Example: Hide a region when a checkbox is unchecked, or refresh a report when a select list value changes.

8. Explain the concept of session state in APEX.

Session state maintains data values across pages and user interactions within an APEX session. It includes:

  • Page items: Values entered in forms or selected from lists

  • Application items: Global variables accessible across all pages

  • Session ID: Unique identifier for the user session

  • Automatic management: APEX handles session creation, maintenance, and cleanup
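As a quick illustration, session state is read in SQL through bind variable syntax and in PL/SQL through the `v()` function (the item and table names below are illustrative):

-- In a report query: reference the page item as a bind variable
SELECT empno, ename
  FROM emp
 WHERE deptno = :P1_DEPTNO;

-- In PL/SQL: read session state with the v() function
-- l_user := v('APP_USER');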

9. What are the different authentication schemes available in APEX?

  • APEX Accounts: Internal APEX user management

  • Database Accounts: Uses Oracle Database user authentication

  • LDAP Directory: Integrates with LDAP servers like Active Directory

  • Social Sign-On: OAuth integration with Google, Facebook, etc.

  • Custom: User-defined authentication logic using PL/SQL

  • No Authentication: For public applications

10. How do you handle file uploads in Oracle APEX?

File uploads are handled using:

  1. File Browse item: Allows users to select files from their device

  2. BLOB storage: Files stored as Binary Large Objects in database tables

  3. File processing: PL/SQL logic to handle uploaded files

  4. Validation: File type, size, and content validation

  5. Download mechanism: Process to retrieve and serve uploaded files

11. What is the difference between before and after page processes?

  • Before Header: Executes before the page is rendered, useful for authentication and data initialization

  • After Header: Runs after the page header renders but before page regions are rendered

  • On Load: Executes when the page loads

  • On Submit - Before Computations: Runs before any computations when form is submitted

  • On Submit - After Computations: Runs after computations but before validations

12. How do you implement conditional rendering in APEX?

Conditional rendering controls when components display based on:

  • Item values: Show region only if specific item has certain value

  • User attributes: Display based on user role or authorization

  • PL/SQL expressions: Custom logic determining visibility

  • Page items: Conditions based on other page item values

  • Application items: Global conditions affecting multiple pages
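For example, a server-side condition of type "PL/SQL Expression" can combine item values; the item names here are hypothetical:

-- Render the region only when both conditions hold
:P1_ORDER_STATUS = 'OPEN' AND NVL(:P1_AMOUNT, 0) > 1000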

13. What are shared components in Oracle APEX?

Shared components are reusable elements available across an application:

  • Lists of Values (LOVs): Dropdown options used in multiple places

  • Templates: HTML structures for consistent appearance

  • Authentication schemes: Login mechanisms

  • Authorization schemes: Access control rules

  • Themes: Overall application appearance and styling

  • Web service references: External API connections

14. Explain the difference between application items and page items.

  • Page Items: Scope limited to a specific page, automatically managed by APEX, used for user input and page-specific data

  • Application Items: Global scope across entire application, manually managed, used for session-wide data like user preferences or application state

15. How do you create cascading LOVs (Lists of Values)?

Cascading LOVs create dependent dropdowns where the second list's options depend on the first list's selection:

  1. Create parent LOV: First dropdown with independent values

  2. Create dependent LOV: Second dropdown with SQL query referencing parent item

  3. Set cascading parent: Configure the dependency relationship

  4. Add refresh action: Ensure dependent LOV updates when parent changes
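The dependent LOV is typically a SQL query that references the parent item as a bind variable — a minimal sketch, assuming a classic EMP/DEPT schema and a parent item named P1_DEPARTMENT_ID:

-- Dependent LOV query; P1_DEPARTMENT_ID is the cascading parent item
SELECT ename AS display_value,
       empno AS return_value
  FROM emp
 WHERE deptno = :P1_DEPARTMENT_ID
 ORDER BY 1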

Intermediate Level Questions (16-27)

16. How do you optimize performance for large datasets in APEX applications?

Performance optimization strategies include:

  • Pagination: Limit records displayed per page using row limiting

  • Lazy loading: Load data only when needed

  • Efficient SQL: Use proper indexing, avoid SELECT *, optimize joins

  • Caching: Enable region and application-level caching

  • Asynchronous processing: Use background jobs for heavy operations

  • Partial page refresh: Update only necessary regions instead of full page reload

17. Describe how to implement custom authentication in Oracle APEX.

Custom authentication involves:

  1. Create authentication scheme: Go to Shared Components > Authentication Schemes

  2. Define PL/SQL function: Write authentication logic that returns TRUE/FALSE

  3. Session management: Handle user session creation and validation

  4. Login page customization: Create custom login interface

  5. Post-authentication processing: Set session attributes and redirect logic

FUNCTION custom_authenticate(p_username VARCHAR2, p_password VARCHAR2)
  RETURN BOOLEAN
IS
BEGIN
  -- Custom authentication logic
  IF validate_user_credentials(p_username, p_password) THEN
    -- Set session attributes
    RETURN TRUE;
  ELSE
    RETURN FALSE;
  END IF;
END;

18. How do you handle RESTful web services in APEX?

RESTful services in APEX involve:

  • Creating REST endpoints: Define URI templates and HTTP methods

  • Data source modules: Configure external REST API connections

  • Authentication: Set up OAuth, API keys, or basic authentication

  • Request/response handling: Map JSON/XML to APEX items and collections

  • Error handling: Implement robust error handling for service failures

19. What are APEX collections and when would you use them?

APEX collections are temporary, session-specific data structures that:

  • Store temporary data: Hold data during user session without database commits

  • Manipulate datasets: Sort, filter, and modify data before database operations

  • Cross-page data: Share data between pages within a session

  • Report building: Create complex reports from multiple data sources

  • Wizard implementations: Store multi-step form data
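Collections are managed through the documented APEX_COLLECTION package and queried through the APEX_COLLECTIONS view; the collection name and values below are illustrative:

BEGIN
  -- Create (or reset) a session-scoped collection
  apex_collection.create_or_truncate_collection(p_collection_name => 'CART');

  -- Add a member: c001..c050 are generic VARCHAR2 attributes, n001..n005 numeric
  apex_collection.add_member(
      p_collection_name => 'CART',
      p_c001            => 'WIDGET',
      p_n001            => 3);
END;

-- Read it back in a report region:
-- SELECT c001 AS product, n001 AS quantity
--   FROM apex_collections
--  WHERE collection_name = 'CART'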

20. How do you implement row-level security in APEX applications?

Row-level security implementation:

  • VPD policies: Virtual Private Database policies at database level

  • Authorization schemes: APEX-level access control rules

  • Shared components: Reusable security logic across applications

  • Session attributes: User-specific security context

  • SQL filtering: Dynamic WHERE clauses based on user permissions
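A simple sketch of the SQL-filtering approach, using the built-in :APP_USER substitution (the table and column are illustrative):

-- Report query that only returns rows owned by the authenticated user
SELECT o.order_id, o.amount
  FROM orders o
 WHERE o.owner_username = :APP_USER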

21. Explain the APEX plugin architecture and how to develop custom plugins.

APEX plugins extend functionality through:

  1. Plugin types: Region, item, dynamic action, process, or authorization

  2. PL/SQL code: Server-side logic for data processing

  3. JavaScript/CSS: Client-side behavior and styling

  4. Configuration options: Parameters for plugin customization

  5. Installation: Packaged for deployment across applications

22. How do you handle large file uploads and downloads in APEX?

Large file handling strategies:

  • Chunked uploads: Break large files into smaller pieces

  • Background processing: Use APEX collections or temporary tables

  • Streaming: Process files without loading entirely into memory

  • Compression: Reduce file sizes before storage

  • Progress indicators: Provide user feedback during operations

  • Error recovery: Handle interrupted uploads gracefully

23. What is the role of APEX Listener (ORDS) and how do you configure it?

ORDS serves as the web server component that:

  • Handles HTTP requests: Processes incoming web requests

  • Database connectivity: Manages connection pooling to Oracle Database

  • REST services: Exposes database operations as REST APIs

  • Static file serving: Handles images, CSS, JavaScript files

  • Security: Implements SSL/TLS and authentication protocols

Configuration involves setting connection pools, security, and deployment parameters.

24. How do you implement complex business rules in APEX?

Complex business rules implementation:

  • PL/SQL packages: Centralized business logic separate from presentation

  • Database triggers: Automatic enforcement of data integrity rules

  • APEX validations: Page-level business rule validation

  • Dynamic actions: Client-side rule enforcement

  • Workflow engines: For complex approval processes

  • Custom computations: Calculated fields based on business logic

25. Describe APEX application deployment strategies.

Deployment strategies include:

  • Export/Import: Manual application export and import between environments

  • SQL*Plus scripts: Automated deployment using command-line tools

  • Version control: Integration with Git or other VCS systems

  • Environment management: Separate development, test, and production environments

  • Data migration: Handling data differences between environments

  • Rollback procedures: Ability to revert problematic deployments

26. How do you integrate APEX with external systems?

Integration approaches:

  • Web services: REST and SOAP service consumption

  • Database links: Direct database-to-database connections

  • Message queues: Asynchronous integration using Oracle AQ

  • File-based: CSV, XML, JSON file processing

  • API gateways: Centralized API management

  • ETL processes: Extract, Transform, Load operations

27. What are the security best practices for APEX applications?

Security best practices:

  • Input validation: Validate all user inputs at multiple levels

  • SQL injection prevention: Use bind variables and parameterized queries

  • XSS protection: Escape output and use Content Security Policy

  • Authentication: Implement strong authentication mechanisms

  • Authorization: Fine-grained access control

  • Session management: Secure session handling and timeout

  • HTTPS enforcement: Encrypt all communications

  • Regular updates: Keep APEX and database patches current
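To make the bind-variable point concrete (item and table names are illustrative):

-- Vulnerable: concatenating user input into dynamic SQL
-- 'SELECT * FROM emp WHERE ename = ''' || :P1_NAME || ''''

-- Safe: reference the item as a bind variable and let APEX bind the value
SELECT empno, ename
  FROM emp
 WHERE ename = :P1_NAME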

Expert Level Questions (28-40)

28. How do you design multi-tenant applications in Oracle APEX?

Multi-tenant design approaches:

  • Schema separation: Each tenant has separate database schema

  • Row-level separation: Shared schema with tenant ID filtering

  • VPD implementation: Virtual Private Database for automatic filtering

  • Workspace isolation: Separate APEX workspaces per tenant

  • Configuration management: Tenant-specific settings and customizations

  • Performance considerations: Resource allocation and monitoring per tenant

29. Describe advanced performance tuning techniques for APEX applications.

Advanced performance tuning:

  • Database optimization: Query tuning, indexing strategies, execution plan analysis

  • APEX-specific tuning: Region caching, lazy loading, efficient page design

  • Network optimization: Compression, CDN usage, static file optimization

  • Memory management: Session state optimization, collection management

  • Monitoring: Performance metrics collection and analysis

  • Scalability planning: Load balancing, connection pooling configuration

30. How do you implement custom PDF generation in APEX?

PDF generation approaches:

  • APEX native: Built-in PDF printing capabilities

  • BI Publisher: Oracle's enterprise reporting solution

  • PL/PDF: PL/SQL library for PDF creation

  • Custom solutions: Third-party tools or cloud services

  • Template design: Creating professional report layouts

  • Data integration: Merging database data with PDF templates

31. Explain the integration between APEX and Oracle Database Advanced Features.

Advanced database feature integration:

  • Partitioning: Working with partitioned tables and parallel processing

  • Analytics: Using Oracle Analytics for complex calculations

  • Spatial data: Geographic information system capabilities

  • Text search: Oracle Text integration for full-text search

  • Data warehousing: APEX as BI front-end for data warehouses

  • Advanced security: Label security, data masking, encryption

32. How do you handle real-time data updates in APEX applications?

Real-time updates implementation:

  • WebSockets: Persistent connections for live data streaming

  • APEX push notifications: Server-initiated client updates

  • Polling mechanisms: Automatic refresh of data regions

  • Database change notification: Responding to database triggers

  • Message queues: Asynchronous messaging for real-time updates

  • Event-driven architecture: Publish-subscribe patterns

33. Describe advanced authorization and access control patterns.

Advanced access control:

  • Attribute-based access: Dynamic permissions based on user attributes

  • Context-aware security: Access control based on location, time, device

  • Hierarchical permissions: Role inheritance and delegation

  • Data classification: Different access levels based on data sensitivity

  • Audit trails: Comprehensive logging of access and modifications

  • Fine-grained authorization: Column and row-level access control

34. How do you implement complex data migration strategies in APEX?

Data migration strategies:

  • ETL processes: Extract, Transform, Load operations for large datasets

  • Incremental migration: Moving data in phases to minimize downtime

  • Data validation: Ensuring data integrity during migration

  • Rollback procedures: Ability to revert failed migrations

  • Performance optimization: Parallel processing and bulk operations

  • Legacy system integration: Handling data from multiple source systems

35. What are the considerations for APEX cloud deployment and scalability?

Cloud deployment considerations:

  • Oracle Cloud Infrastructure: APEX on Autonomous Database

  • Container deployment: Docker and Kubernetes strategies

  • Auto-scaling: Dynamic resource allocation based on load

  • High availability: Multi-region deployment and failover

  • Disaster recovery: Backup and recovery strategies

  • Cost optimization: Resource utilization and pricing models

36. How do you implement advanced workflow and approval processes?

Workflow implementation:

  • State machines: Complex approval state management

  • Dynamic routing: Approval paths based on business rules

  • Parallel processing: Multiple approvers simultaneously

  • Escalation procedures: Automatic escalation for delayed approvals

  • Integration: Workflow engines and external approval systems

  • Audit capabilities: Complete approval history and tracking

37. Describe advanced integration patterns with Oracle E-Business Suite.

EBS integration patterns:

  • API utilization: Oracle EBS APIs for data operations

  • Single sign-on: Seamless authentication between systems

  • Data synchronization: Real-time or batch data updates

  • Workflow integration: APEX as approval interface for EBS processes

  • Reporting enhancement: APEX reports for EBS data

  • Mobile enablement: APEX mobile interfaces for EBS functionality

38. How do you handle advanced error handling and debugging in production APEX applications?

Production error handling:

  • Comprehensive logging: Detailed error logging with context

  • Error notification: Automatic alerts for critical errors

  • User-friendly messages: Meaningful error messages for end users

  • Debug modes: Production-safe debugging capabilities

  • Performance monitoring: Application performance metrics

  • Health checks: Automated application health monitoring

39. What are the advanced customization techniques for APEX themes and templates?

Advanced customization:

  • Custom CSS frameworks: Integration with modern CSS frameworks

  • JavaScript integration: Advanced client-side functionality

  • Responsive design: Mobile-first design principles

  • Component libraries: Reusable UI component development

  • Accessibility compliance: WCAG guidelines implementation

  • Brand integration: Corporate branding and style guidelines

40. How do you implement enterprise-grade monitoring and analytics for APEX applications?

Enterprise monitoring:

  • Application metrics: User activity, performance metrics, error rates

  • Database monitoring: Resource utilization, query performance

  • Business intelligence: Usage analytics and business insights

  • Capacity planning: Growth projections and resource planning

  • SLA monitoring: Service level agreement compliance

  • Integration monitoring: External system connectivity and performance

Salesforce Apex Interview Questions

Salesforce Apex is an object-oriented programming language that allows developers to execute flow and transaction control statements on the Salesforce platform. These questions assess candidates' ability to build robust, scalable solutions within the Salesforce ecosystem.

Beginner Level Questions (1-15)

1. What is Salesforce Apex and how does it differ from other programming languages?

Salesforce Apex is a strongly-typed, object-oriented programming language that executes on the Salesforce platform. Key differences include:

  • Cloud-native execution: Runs entirely on Salesforce servers, not locally

  • Governor limits: Built-in limits prevent resource abuse in multi-tenant environment

  • Database integration: Native integration with Salesforce objects and data

  • Automatic platform features: Built-in security, sharing, and workflow integration

  • Java-like syntax: Familiar syntax for Java developers but with platform-specific features

2. Explain the different types of Apex triggers and their execution contexts.

Apex triggers execute in response to data changes:

  • Before triggers: Execute before records are saved to database, used for validation and data modification

  • After triggers: Execute after records are saved, used for operations requiring record IDs

  • Trigger events: Insert, Update, Delete, Undelete operations

  • Trigger context variables: isInsert, isUpdate, isDelete, isBefore, isAfter, Trigger.new, Trigger.old

trigger AccountTrigger on Account (before insert, before update, after insert, after update) {
    if (Trigger.isBefore) {
        // Validation and data modification logic
    }
    if (Trigger.isAfter) {
        // Operations requiring record IDs
    }
}

3. What are SOQL and SOSL, and when would you use each?

  • SOQL (Salesforce Object Query Language): Queries single object or related objects, returns specific records

  • SOSL (Salesforce Object Search Language): Searches across multiple objects, returns records containing search terms

Use SOQL for specific data retrieval, SOSL for broad searches across multiple objects.

// SOQL example
List<Account> accounts = [SELECT Id, Name FROM Account WHERE Industry = 'Technology'];

// SOSL example
List<List<SObject>> searchResults = [FIND 'John' IN ALL FIELDS RETURNING Account, Contact];

4. How do you handle exceptions in Apex?

Exception handling uses try-catch blocks:

try {
    // Code that might throw an exception
    insert accountList;
} catch (DmlException e) {
    // Handle DML-specific exceptions
    System.debug('DML Error: ' + e.getMessage());
} catch (Exception e) {
    // Handle general exceptions
    System.debug('General Error: ' + e.getMessage());
} finally {
    // Cleanup code that always executes
}

5. What is the difference between with sharing and without sharing keywords?

  • with sharing: Enforces user's sharing rules and permissions

  • without sharing: Runs with full access, ignoring user permissions

  • inherited sharing: Inherits sharing context from calling class

public with sharing class AccountService {
    // Respects user sharing rules
}

public without sharing class SystemService {
    // Runs with system permissions
}

6. Explain the concept of governor limits in Salesforce.

Governor limits prevent resource abuse in the multi-tenant environment:

  • SOQL queries: 100 synchronous, 200 asynchronous per transaction

  • DML statements: 150 per transaction

  • Heap size: 6MB synchronous, 12MB asynchronous

  • CPU time: 10 seconds synchronous, 60 seconds asynchronous

  • Callouts: 100 per transaction
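At runtime, the standard Limits class exposes both current consumption and the applicable ceiling, which is useful for defensive bulk code:

// Inspect consumption against governor limits at runtime
System.debug('SOQL used: ' + Limits.getQueries() + ' of ' + Limits.getLimitQueries());

if (Limits.getDmlStatements() >= Limits.getLimitDmlStatements() - 1) {
    // Near the DML ceiling: defer remaining work, e.g. to a Queueable job
}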

7. What are the different types of collections in Apex?

  • List: Ordered collection allowing duplicates

  • Set: Unordered collection of unique elements

  • Map: Key-value pairs for efficient lookups

List<String> stringList = new List<String>();
Set<Id> idSet = new Set<Id>();
Map<Id, Account> accountMap = new Map<Id, Account>();

8. How do you write test classes in Apex?

Test classes ensure code quality and are required for deployment:

@isTest
public class AccountTriggerTest {
    @testSetup
    static void setupTestData() {
        // Create test data
    }

    @isTest
    static void testAccountInsert() {
        Test.startTest();
        // Test logic
        Test.stopTest();

        // Assertions, e.g.:
        // System.assertEquals(expectedValue, actualValue);
    }
}

9. What is the difference between static and instance methods?

  • Static methods: Belong to the class, called without creating instance, cannot access instance variables

  • Instance methods: Belong to object instance, can access instance variables

public class Calculator {
    // Static method: belongs to the class, no instance required
    public static Integer add(Integer a, Integer b) {
        return a + b;
    }

    public Integer instanceVariable = 0;

    // Instance method: can access instance variables
    public void setVariable(Integer value) {
        this.instanceVariable = value;
    }
}

10. Explain the order of execution in Salesforce.

The order of execution for record processing:

  1. System validation rules

  2. Before triggers

  3. Custom validation rules

  4. After triggers

  5. Assignment rules

  6. Auto-response rules

  7. Workflow rules

  8. Escalation rules

  9. Processes and record-triggered flows

  10. Roll-up summary field updates

  11. Criteria-based sharing rules

11. What are future methods and when would you use them?

Future methods execute asynchronously:

public class ExternalService {
    @future(callout=true)
    public static void makeCallout(String endpoint) {
        // Asynchronous callout logic
    }

    @future
    public static void heavyProcessing(Set<Id> recordIds) {
        // Time-consuming operations
    }
}

Use cases: External callouts, heavy processing, mixed DML operations.

12. How do you implement pagination in Visualforce or Lightning components?

Pagination handles large datasets efficiently:

public class AccountController {
    public ApexPages.StandardSetController setCon {get; set;}

    public AccountController() {
        setCon = new ApexPages.StandardSetController([SELECT Id, Name FROM Account]);
        setCon.setPageSize(10);
    }

    public List<Account> getAccounts() {
        return (List<Account>) setCon.getRecords();
    }

    public Boolean hasNext() {
        return setCon.getHasNext();
    }

    public PageReference next() {
        setCon.next();
        return null;
    }
}

13. What is the difference between insert and Database.insert?

  • insert: DML statement that throws exception on failure

  • Database.insert: Database method allowing partial success

// Traditional DML
try {
    insert accountList;
} catch (DmlException e) {
    // Handle exception
}

// Database method with allOrNone = false: partial success allowed
Database.SaveResult[] results = Database.insert(accountList, false);
for (Database.SaveResult result : results) {
    if (!result.isSuccess()) {
        // Handle individual failures
    }
}

14. How do you handle bulk operations in Apex?

Bulk operations process multiple records efficiently:

public class BulkAccountProcessor {
    public static void updateAccounts(List<Account> accounts) {
        List<Account> accountsToUpdate = new List<Account>();

        for (Account acc : accounts) {
            if (acc.AnnualRevenue > 1000000) {
                acc.Type = 'Enterprise';
                accountsToUpdate.add(acc);
            }
        }

        if (!accountsToUpdate.isEmpty()) {
            update accountsToUpdate;
        }
    }
}

15. What are custom settings and custom metadata types?

  • Custom settings: Application data cached at organization, profile, or user level

  • Custom metadata types: Metadata that can be deployed and is accessible via SOQL

Use custom settings for configuration data, custom metadata for deployable application metadata.
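In Apex, both are read without instantiating anything; the API names below (Integration_Config__c, Fee_Rule__mdt) are hypothetical:

// Hierarchy custom setting: cached, no SOQL consumed
Integration_Config__c cfg = Integration_Config__c.getOrgDefaults();
String endpoint = cfg.Endpoint_URL__c;

// Custom metadata type: getAll() also reads from cache without a SOQL query
Map<String, Fee_Rule__mdt> rules = Fee_Rule__mdt.getAll();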

Intermediate Level Questions (16-27)

16. How do you implement trigger design patterns to avoid recursion?

Recursion prevention using static variables:

public class TriggerHelper {
    private static Boolean isExecuting = false;
    private static Set<Id> processedIds = new Set<Id>();

    public static Boolean isFirstRun() {
        if (!isExecuting) {
            isExecuting = true;
            return true;
        }
        return false;
    }

    public static Boolean isProcessed(Id recordId) {
        return processedIds.contains(recordId);
    }

    public static void addProcessed(Id recordId) {
        processedIds.add(recordId);
    }
}

17. Explain batch Apex and provide implementation example.

Batch Apex processes large datasets asynchronously:

public class AccountBatch implements Database.Batchable<sObject>, Database.Stateful {

    private Integer recordsProcessed = 0;

    

    public Database.QueryLocator start(Database.BatchableContext bc) {

        return Database.getQueryLocator('SELECT Id, Name FROM Account WHERE Type = null');

    }

    

    public void execute(Database.BatchableContext bc, List<Account> scope) {

        for (Account acc : scope) {

            acc.Type = 'Prospect';

        }

        update scope;

        recordsProcessed += scope.size();

    }

    

    public void finish(Database.BatchableContext bc) {

        System.debug('Processed ' + recordsProcessed + ' records');

    }

}
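The batch above would be launched with `Database.executeBatch`; the optional scope argument caps how many records each `execute` call receives:

```apex
// Run with up to 200 records per execute() invocation
Id jobId = Database.executeBatch(new AccountBatch(), 200);

// Optionally monitor progress via AsyncApexJob
AsyncApexJob job = [
    SELECT Status, JobItemsProcessed, TotalJobItems
    FROM AsyncApexJob
    WHERE Id = :jobId
];
```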

18. How do you implement asynchronous processing with Queueable Apex?

Queueable Apex for chainable asynchronous operations:

public class AccountProcessor implements Queueable {

    private List<Account> accounts;

    private Integer batchSize;

    

    public AccountProcessor(List<Account> accounts, Integer batchSize) {

        this.accounts = accounts;

        this.batchSize = batchSize;

    }

    

    public void execute(QueueableContext context) {

        List<Account> batch = new List<Account>();

        

        for (Integer i = 0; i < Math.min(batchSize, accounts.size()); i++) {

            batch.add(accounts[i]);

        }

        

        // Process batch

        processAccounts(batch);

        

        // Chain next batch if more records remain. Apex List has no subList
        // method, so copy the remaining elements manually
        if (accounts.size() > batchSize) {
            List<Account> remaining = new List<Account>();
            for (Integer i = batchSize; i < accounts.size(); i++) {
                remaining.add(accounts[i]);
            }
            System.enqueueJob(new AccountProcessor(remaining, batchSize));
        }

    }

    

    private void processAccounts(List<Account> accounts) {

        // Processing logic

    }

}

19. How do you handle mixed DML operations?

Mixed DML occurs when setup and non-setup objects are modified in the same transaction:

public class MixedDMLHandler {

    @future

    public static void createUserAsync(String firstName, String lastName, String email) {

        // Create user in async context to avoid mixed DML

        User newUser = new User(

            FirstName = firstName,

            LastName = lastName,

            Email = email,

            Username = email,

            Alias = firstName.substring(0,1) + lastName.substring(0,4),

            ProfileId = [SELECT Id FROM Profile WHERE Name = 'Standard User'].Id

        );

        insert newUser;

    }

    

    public static void handleAccountAndUser(Account acc, String userEmail) {

        insert acc; // Non-setup object

        

        // Use future method for setup object to avoid mixed DML

        createUserAsync('John', 'Doe', userEmail);

    }

}

20. Explain the implementation of sharing and security in Apex.

Sharing and security implementation:

// Manual sharing

public class AccountSharing {

    public static void shareAccountWithUser(Id accountId, Id userId, String accessLevel) {

        AccountShare sharing = new AccountShare();

        sharing.AccountId = accountId;

        sharing.UserOrGroupId = userId;

        sharing.AccountAccessLevel = accessLevel;

        sharing.OpportunityAccessLevel = 'Read';

        

        Database.SaveResult result = Database.insert(sharing, false);

        if (!result.isSuccess()) {

            System.debug('Error sharing account: ' + result.getErrors());

        }

    }

}

// Programmatic sharing rules

public inherited sharing class SecureAccountService {

    public static List<Account> getAccessibleAccounts() {

        return [SELECT Id, Name FROM Account WITH SECURITY_ENFORCED];

    }

}

21. How do you implement dynamic SOQL and handle injection prevention?

Dynamic SOQL with security considerations:

public class DynamicSOQLService {

    public static List<SObject> queryRecords(String objectName, List<String> fields, String whereClause) {

        // Validate object access

        if (!Schema.getGlobalDescribe().containsKey(objectName)) {

            throw new IllegalArgumentException('Invalid object name');

        }

        

        // Validate field access

        Map<String, Schema.SObjectField> fieldMap = 

            Schema.getGlobalDescribe().get(objectName).getDescribe().fields.getMap();

        

        for (String field : fields) {

            if (!fieldMap.containsKey(field)) {

                throw new IllegalArgumentException('Invalid field: ' + field);

            }

        }

        

        // Build query with String.escapeSingleQuotes for user input

        String query = 'SELECT ' + String.join(fields, ',') +
                       ' FROM ' + objectName;

        if (String.isNotBlank(whereClause)) {
            // Caller-supplied values inside whereClause must be sanitized with
            // String.escapeSingleQuotes or, preferably, passed as bind variables
            query += ' WHERE ' + whereClause;
        }

        return Database.query(query);
    }
}

22. How do you implement REST API integration in Apex?

REST API integration using HTTP callouts:


public class ExternalAPIService {

    @future(callout=true)

    public static void makeRestCallout(String endpoint, String method, String body) {

        Http http = new Http();

        HttpRequest request = new HttpRequest();

        

        request.setEndpoint(endpoint);

        request.setMethod(method);

        request.setHeader('Content-Type', 'application/json');

        request.setHeader('Authorization', 'Bearer ' + getAuthToken());

        

        if (String.isNotBlank(body)) {

            request.setBody(body);

        }

        

        try {

            HttpResponse response = http.send(request);

            

            if (response.getStatusCode() == 200) {

                processResponse(response.getBody());

            } else {

                System.debug('Error: ' + response.getStatusCode() + ' ' + response.getStatus());

            }

        } catch (Exception e) {

            System.debug('Callout failed: ' + e.getMessage());

        }

    }

    

    private static String getAuthToken() {

        // Implement OAuth or API key logic

        return 'your_auth_token';

    }

    

    private static void processResponse(String responseBody) {

        // Parse and process response

        Map<String, Object> responseMap = (Map<String, Object>) JSON.deserializeUntyped(responseBody);

        // Process the response data

    }

}

23. How do you handle governor limits in complex applications?

Governor limit management strategies:


public class LimitManager {

    public static void checkLimits() {

        System.debug('SOQL Queries used: ' + Limits.getQueries() + '/' + Limits.getLimitQueries());

        System.debug('DML Statements used: ' + Limits.getDMLStatements() + '/' + Limits.getLimitDMLStatements());

        System.debug('Heap Size used: ' + Limits.getHeapSize() + '/' + Limits.getLimitHeapSize());

        

        // Warn if approaching limits

        if (Limits.getQueries() > 80) {

            System.debug('WARNING: Approaching SOQL query limit');

        }

    }

    

    public static void processInBatches(List<SObject> records, Integer batchSize) {

        List<SObject> batch = new List<SObject>();

        

        for (SObject record : records) {

            batch.add(record);

            

            if (batch.size() == batchSize) {

                processBatch(batch);

                batch.clear();

            }

        }

        

        // Process remaining records

        if (!batch.isEmpty()) {

            processBatch(batch);

        }

    }

    

    private static void processBatch(List<SObject> batch) {

        // Process batch while monitoring limits

        checkLimits();

        // Batch processing logic

    }

}

24. How do you implement custom metadata types in your solutions?

Custom metadata types for configuration:


public class ConfigurationService {

    private static Map<String, Integration_Setting__mdt> settingsCache;

    

    public static Integration_Setting__mdt getSetting(String settingName) {

        if (settingsCache == null) {

            loadSettings();

        }

        

        return settingsCache.get(settingName);

    }

    

    private static void loadSettings() {

        settingsCache = new Map<String, Integration_Setting__mdt>();

        

        for (Integration_Setting__mdt setting : [

            SELECT DeveloperName, Endpoint__c, Timeout__c, Retry_Count__c 

            FROM Integration_Setting__mdt

        ]) {

            settingsCache.put(setting.DeveloperName, setting);

        }

    }

    

    public static void makeConfigurableCallout(String settingName, String payload) {

        Integration_Setting__mdt setting = getSetting(settingName);

        

        if (setting != null) {

            Http http = new Http();

            HttpRequest req = new HttpRequest();

            req.setEndpoint(setting.Endpoint__c);

            req.setTimeout(Integer.valueOf(setting.Timeout__c));

            req.setBody(payload);

            

            // Implement retry logic based on setting.Retry_Count__c

        }

    }

}

25. How do you implement platform events for event-driven architecture?

Platform events for decoupled communication:


// Publisher

public class OrderEventPublisher {

    public static void publishOrderEvent(Id orderId, String status) {

        Order_Status_Event__e event = new Order_Status_Event__e();

        event.Order_Id__c = orderId;

        event.Status__c = status;

        event.Timestamp__c = System.now();

        

        Database.SaveResult result = EventBus.publish(event);

        

        if (!result.isSuccess()) {

            System.debug('Error publishing event: ' + result.getErrors());

        }

    }

}

// Subscriber (Trigger on Platform Event)

trigger OrderStatusEventTrigger on Order_Status_Event__e (after insert) {

    List<Task> tasksToCreate = new List<Task>();

    

    for (Order_Status_Event__e event : Trigger.new) {

        if (event.Status__c == 'Shipped') {

            Task followUpTask = new Task();

            followUpTask.Subject = 'Follow up on shipped order';

            followUpTask.WhatId = event.Order_Id__c;

            followUpTask.ActivityDate = Date.today().addDays(3);

            tasksToCreate.add(followUpTask);

        }

    }

    

    if (!tasksToCreate.isEmpty()) {

        insert tasksToCreate;

    }

}

26. How do you implement Lightning Web Component (LWC) integration with Apex?

LWC-Apex integration patterns:


// Apex Controller for LWC

public with sharing class AccountController {

    @AuraEnabled(cacheable=true)

    public static List<Account> getAccounts(String searchTerm) {

        String searchKey = '%' + searchTerm + '%';

        return [

            SELECT Id, Name, Industry, AnnualRevenue 

            FROM Account 

            WHERE Name LIKE :searchKey 

            WITH SECURITY_ENFORCED

            LIMIT 50

        ];

    }

    

    @AuraEnabled

    public static void updateAccount(Account account) {

        try {

            update account;

        } catch (DMLException e) {

            throw new AuraHandledException(e.getMessage());

        }

    }

    

    @AuraEnabled

    public static String createAccountWithContacts(String accountData, String contactsData) {

        try {

            Account acc = (Account) JSON.deserialize(accountData, Account.class);

            insert acc;

            

            List<Contact> contacts = (List<Contact>) JSON.deserialize(contactsData, List<Contact>.class);

            for (Contact con : contacts) {

                con.AccountId = acc.Id;

            }

            insert contacts;

            

            return acc.Id;

        } catch (Exception e) {

            throw new AuraHandledException('Error creating account: ' + e.getMessage());

        }

    }

}

27. How do you implement data factory patterns for test data creation?

Test data factory for maintainable tests:


@isTest

public class TestDataFactory {

    public static Account createAccount(String name, String industry) {

        return new Account(

            Name = name,

            Industry = industry,

            BillingCity = 'San Francisco',

            BillingState = 'CA'

        );

    }

    

    public static List<Account> createAccounts(Integer count) {

        List<Account> accounts = new List<Account>();

        

        for (Integer i = 0; i < count; i++) {

            accounts.add(createAccount('Test Account ' + i, 'Technology'));

        }

        

        return accounts;

    }

    

    public static Contact createContact(Id accountId, String firstName, String lastName) {

        return new Contact(

            AccountId = accountId,

            FirstName = firstName,

            LastName = lastName,

            Email = firstName.toLowerCase() + '.' + lastName.toLowerCase() + '@test.com'

        );

    }

    

    public static User createTestUser(String profileName, String username) {

        Profile profile = [SELECT Id FROM Profile WHERE Name = :profileName LIMIT 1];

        

        return new User(

            FirstName = 'Test',

            LastName = 'User',

            Email = username + '@test.com',

            Username = username + '@test.com.dev',

            Alias = 'tuser',

            ProfileId = profile.Id,

            TimeZoneSidKey = 'America/Los_Angeles',

            LocaleSidKey = 'en_US',

            EmailEncodingKey = 'UTF-8',

            LanguageLocaleKey = 'en_US'

        );

    }

}
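A short test method shows how the factory keeps tests focused on behavior rather than setup (the service under test is hypothetical):

```apex
@isTest
private class AccountServiceTest {
    @isTest
    static void classifiesAccountsInBulk() {
        // 200 records exercises bulk-safe code paths
        insert TestDataFactory.createAccounts(200);

        Test.startTest();
        AccountService.classify([SELECT Id, Industry FROM Account]);
        Test.stopTest();

        System.assertEquals(200, [SELECT COUNT() FROM Account]);
    }
}
```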

Expert Level Questions (28-40)

28. How do you implement complex domain-driven design patterns in Salesforce?

Domain-driven design implementation:


// Domain Layer - Business Logic

public virtual class OpportunityDomain {

    protected List<Opportunity> opportunities;

    

    public OpportunityDomain(List<Opportunity> opportunities) {

        this.opportunities = opportunities;

    }

    

    public virtual void validateBusinessRules() {

        for (Opportunity opp : opportunities) {

            validateCloseDate(opp);

            validateAmount(opp);

            validateStageProgression(opp);

        }

    }

    

    protected virtual void validateCloseDate(Opportunity opp) {

        if (opp.CloseDate < Date.today()) {

            opp.addError('Close date cannot be in the past');

        }

    }

    

    protected virtual void validateAmount(Opportunity opp) {

        if (opp.Amount <= 0) {

            opp.addError('Amount must be greater than zero');

        }

    }

    

    protected virtual void validateStageProgression(Opportunity opp) {

        // Complex stage progression logic

        if (Trigger.isUpdate) {

            Opportunity oldOpp = Trigger.oldMap.get(opp.Id);

            if (!isValidStageProgression(oldOpp.StageName, opp.StageName)) {

                opp.addError('Invalid stage progression');

            }

        }

    }

    

    private Boolean isValidStageProgression(String oldStage, String newStage) {

        // Stage progression business rules

        Map<String, Set<String>> validProgressions = new Map<String, Set<String>>{

            'Prospecting' => new Set<String>{'Qualification', 'Closed Lost'},

            'Qualification' => new Set<String>{'Needs Analysis', 'Closed Lost'},

            'Needs Analysis' => new Set<String>{'Value Proposition', 'Closed Lost'},

            'Value Proposition' => new Set<String>{'Id. Decision Makers', 'Closed Lost'},

            'Id. Decision Makers' => new Set<String>{'Perception Analysis', 'Closed Lost'},

            'Perception Analysis' => new Set<String>{'Proposal/Price Quote', 'Closed Lost'},

            'Proposal/Price Quote' => new Set<String>{'Negotiation/Review', 'Closed Lost'},

            'Negotiation/Review' => new Set<String>{'Closed Won', 'Closed Lost'}

        };

        

        return validProgressions.get(oldStage)?.contains(newStage) ?? false;

    }

}

// Service Layer - Application Logic

public class OpportunityService {

    public static void processOpportunities(List<Opportunity> opportunities) {

        OpportunityDomain domain = new OpportunityDomain(opportunities);

        domain.validateBusinessRules();

        

        // Additional service layer operations

        updateRelatedRecords(opportunities);

        sendNotifications(opportunities);

    }

    

    private static void updateRelatedRecords(List<Opportunity> opportunities) {

        // Update related accounts, contacts, etc.

    }

    

    private static void sendNotifications(List<Opportunity> opportunities) {

        // Send email notifications, platform events, etc.

    }

}

29. How do you implement enterprise-grade error handling and logging frameworks?

Comprehensive error handling framework:


public class Logger {

    private static List<Log_Entry__c> logEntries = new List<Log_Entry__c>();

    

    public enum LogLevel { DEBUG, INFO, WARN, ERROR, FATAL }

    

    public static void log(LogLevel level, String className, String methodName, String message, Exception ex) {

        Log_Entry__c entry = new Log_Entry__c();

        entry.Level__c = level.name();

        entry.Class_Name__c = className;

        entry.Method_Name__c = methodName;

        entry.Message__c = message;

        entry.Stack_Trace__c = ex?.getStackTraceString();

        entry.User__c = UserInfo.getUserId();

        entry.Timestamp__c = System.now();

        

        logEntries.add(entry);

        

        // Immediate insertion for errors and fatal logs

        if (level == LogLevel.ERROR || level == LogLevel.FATAL) {

            flushLogs();

        }

    }

    

    public static void flushLogs() {

        if (!logEntries.isEmpty()) {

            try {

                insert logEntries;

                logEntries.clear();

            } catch (DMLException e) {

                // Fallback to System.debug if database insert fails

                System.debug('Failed to insert log entries: ' + e.getMessage());

            }

        }

    }

    

    // Automatic log flushing on transaction completion

    public static void handleTransactionEnd() {

        flushLogs();

    }

}

// Error Handler Utility

public class ErrorHandler {

    public static void handleException(Exception ex, String context) {

        Logger.log(Logger.LogLevel.ERROR, 

                  ErrorHandler.class.getName(), 

                  'handleException', 

                  'Error in ' + context + ': ' + ex.getMessage(), 

                  ex);

        

        // Send critical error notifications

        if (ex instanceof System.LimitException) {

            sendCriticalErrorNotification(ex, context);

        }

    }

    

    private static void sendCriticalErrorNotification(Exception ex, String context) {

        // Send email to system administrators

        // Create platform event for monitoring systems

        // Log to external monitoring tools

    }

    

    public static void processWithErrorHandling(String context, ProcessingDelegate processor) {

        try {

            processor.process();

        } catch (Exception ex) {

            handleException(ex, context);

            throw ex; // Re-throw if needed

        }

    }

}

// Delegate interface for error handling

public interface ProcessingDelegate {

    void process();

}
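Callers wrap risky logic in a small `ProcessingDelegate` implementation so every failure is logged with context before propagating (a sketch; `OpportunityCloser` is illustrative):

```apex
public class OpportunityCloser implements ProcessingDelegate {
    private List<Opportunity> opps;

    public OpportunityCloser(List<Opportunity> opps) {
        this.opps = opps;
    }

    public void process() {
        for (Opportunity opp : opps) {
            opp.StageName = 'Closed Won';
        }
        update opps;
    }
}

// Any exception is logged by ErrorHandler before being re-thrown
ErrorHandler.processWithErrorHandling('Close opportunities', new OpportunityCloser(opps));
```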

30. How do you implement sophisticated caching strategies in Apex?

Multi-level caching implementation:


public class CacheManager {

    // Org cache partition

    private static final String ORG_PARTITION = 'OrgData';

    // Session cache partition  

    private static final String SESSION_PARTITION = 'SessionData';

    

    // Static cache for transaction-level caching

    private static Map<String, Object> transactionCache = new Map<String, Object>();

    

    public static Object get(String key, CacheLevel level) {

        switch on level {

            when TRANSACTION {

                return transactionCache.get(key);

            }

            when SESSION {

                return Cache.Session.get(SESSION_PARTITION + '.' + key);

            }

            when ORG {

                return Cache.Org.get(ORG_PARTITION + '.' + key);

            }

        }

        return null;

    }

    

    public static void put(String key, Object value, CacheLevel level, Integer ttlSeconds) {

        switch on level {

            when TRANSACTION {

                transactionCache.put(key, value);

            }

            when SESSION {

                Cache.Session.put(SESSION_PARTITION + '.' + key, value, ttlSeconds);

            }

            when ORG {

                Cache.Org.put(ORG_PARTITION + '.' + key, value, ttlSeconds);

            }

        }

    }

    

    public static Boolean contains(String key, CacheLevel level) {

        switch on level {

            when TRANSACTION {

                return transactionCache.containsKey(key);

            }

            when SESSION {

                return Cache.Session.contains(SESSION_PARTITION + '.' + key);

            }

            when ORG {

                return Cache.Org.contains(ORG_PARTITION + '.' + key);

            }

        }

        return false;

    }

    

    public enum CacheLevel { TRANSACTION, SESSION, ORG }

}

// Cached data service example

public class AccountCacheService {

    private static final String ACCOUNT_CACHE_KEY = 'AccountData_';

    private static final Integer CACHE_TTL = 3600; // 1 hour

    

    public static Account getCachedAccount(Id accountId) {

        String cacheKey = ACCOUNT_CACHE_KEY + accountId;

        

        // Try transaction cache first

        Account account = (Account) CacheManager.get(cacheKey, CacheManager.CacheLevel.TRANSACTION);

        if (account != null) {

            return account;

        }

        

        // Try session cache

        account = (Account) CacheManager.get(cacheKey, CacheManager.CacheLevel.SESSION);

        if (account != null) {

            // Store in transaction cache for faster access

            CacheManager.put(cacheKey, account, CacheManager.CacheLevel.TRANSACTION, CACHE_TTL);

            return account;

        }

        

        // Query database and cache result

        account = [SELECT Id, Name, Industry, AnnualRevenue FROM Account WHERE Id = :accountId LIMIT 1];

        

        CacheManager.put(cacheKey, account, CacheManager.CacheLevel.TRANSACTION, CACHE_TTL);

        CacheManager.put(cacheKey, account, CacheManager.CacheLevel.SESSION, CACHE_TTL);

        

        return account;

    }

}

31. How do you implement complex data migration and synchronization patterns?

Enterprise data migration framework:


public class DataMigrationFramework {

    public interface MigrationStep {

        void execute(MigrationContext context);

        void rollback(MigrationContext context);

        String getStepName();

    }

    

    public class MigrationContext {

        public Map<String, Object> parameters;

        public List<String> errors;

        public Integer batchSize;

        public Boolean dryRun;

        

        public MigrationContext() {

            this.parameters = new Map<String, Object>();

            this.errors = new List<String>();

            this.batchSize = 200;

            this.dryRun = false;

        }

    }

    

    public class MigrationPipeline {

        private List<MigrationStep> steps;

        private MigrationContext context;

        

        public MigrationPipeline(MigrationContext context) {

            this.steps = new List<MigrationStep>();

            this.context = context;

        }

        

        public MigrationPipeline addStep(MigrationStep step) {

            this.steps.add(step);

            return this;

        }

        

        public MigrationResult execute() {

            MigrationResult result = new MigrationResult();

            List<MigrationStep> executedSteps = new List<MigrationStep>();

            

            try {

                for (MigrationStep step : steps) {

                    Logger.log(Logger.LogLevel.INFO, 'MigrationPipeline', 'execute', 

                              'Executing step: ' + step.getStepName(), null);

                    

                    step.execute(context);

                    executedSteps.add(step);

                    

                    result.completedSteps.add(step.getStepName());

                }

                

                result.success = true;

            } catch (Exception ex) {

                Logger.log(Logger.LogLevel.ERROR, 'MigrationPipeline', 'execute', 

                          'Migration failed', ex);

                

                result.success = false;

                result.errorMessage = ex.getMessage();

                

                // Rollback executed steps in reverse order

                rollbackSteps(executedSteps);

            }

            

            return result;

        }

        

        private void rollbackSteps(List<MigrationStep> executedSteps) {

            for (Integer i = executedSteps.size() - 1; i >= 0; i--) {

                try {

                    executedSteps[i].rollback(context);

                } catch (Exception rollbackEx) {

                    Logger.log(Logger.LogLevel.ERROR, 'MigrationPipeline', 'rollbackSteps', 

                              'Rollback failed for step: ' + executedSteps[i].getStepName(), rollbackEx);

                }

            }

        }

    }

    

    public class MigrationResult {

        public Boolean success;

        public String errorMessage;

        public List<String> completedSteps;

        

        public MigrationResult() {

            this.completedSteps = new List<String>();

        }

    }

}

// Example migration step implementation

public class AccountDataMigrationStep implements DataMigrationFramework.MigrationStep {

    public void execute(DataMigrationFramework.MigrationContext context) {

        List<Legacy_Account__c> legacyAccounts = [SELECT Name, Industry__c, Revenue__c FROM Legacy_Account__c];

        List<Account> accountsToInsert = new List<Account>();

        

        for (Legacy_Account__c legacy : legacyAccounts) {

            Account newAccount = new Account();

            newAccount.Name = legacy.Name;

            newAccount.Industry = legacy.Industry__c;

            newAccount.AnnualRevenue = legacy.Revenue__c;

            accountsToInsert.add(newAccount);

        }

        

        if (!context.dryRun) {

            Database.SaveResult[] results = Database.insert(accountsToInsert, false);

            

            for (Database.SaveResult result : results) {

                if (!result.isSuccess()) {

                    context.errors.add('Failed to migrate account: ' + result.getErrors());

                }

            }

        }

    }

    

    public void rollback(DataMigrationFramework.MigrationContext context) {

        // Implement rollback logic; scope the delete to records created by this
        // migration run (e.g. via a marker field), not every Account created today
        delete [SELECT Id FROM Account WHERE CreatedDate = TODAY];

    }

    

    public String getStepName() {

        return 'Account Data Migration';

    }

}
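Assembling and running the pipeline with the step above might look like this (a sketch):

```apex
DataMigrationFramework.MigrationContext ctx = new DataMigrationFramework.MigrationContext();
ctx.dryRun = true; // Validate field mappings without committing records

DataMigrationFramework.MigrationResult result =
    new DataMigrationFramework.MigrationPipeline(ctx)
        .addStep(new AccountDataMigrationStep())
        .execute();

if (!result.success) {
    System.debug('Migration failed: ' + result.errorMessage);
}
```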

32. How do you implement advanced security patterns including encryption and tokenization?

Advanced security implementation:


public class SecurityManager {

    // Never hardcode encryption keys; load them from a protected custom setting
    // or an external key management service

    

    // Field-level encryption

    public static String encryptSensitiveData(String plainText) {

        if (String.isBlank(plainText)) {

            return plainText;

        }

        

        try {

            Blob key = Crypto.generateAesKey(256);

            Blob data = Blob.valueOf(plainText);

            Blob encryptedData = Crypto.encryptWithManagedIV('AES256', key, data);

            

            // Store the key securely (this is a simplified example)

            storeEncryptionKey(key);

            

            return EncodingUtil.base64Encode(encryptedData);

        } catch (Exception ex) {

            Logger.log(Logger.LogLevel.ERROR, 'SecurityManager', 'encryptSensitiveData', 

                      'Encryption failed', ex);

            throw ex;

        }

    }

    

    public static String decryptSensitiveData(String encryptedText) {

        if (String.isBlank(encryptedText)) {

            return encryptedText;

        }

        

        try {

            Blob key = retrieveEncryptionKey();

            Blob encryptedData = EncodingUtil.base64Decode(encryptedText);

            Blob decryptedData = Crypto.decryptWithManagedIV('AES256', key, encryptedData);

            

            return decryptedData.toString();

        } catch (Exception ex) {

            Logger.log(Logger.LogLevel.ERROR, 'SecurityManager', 'decryptSensitiveData', 

                      'Decryption failed', ex);

            throw ex;

        }

    }

    

    // Data tokenization for sensitive data

    public static String tokenizeSensitiveData(String sensitiveData, String tokenType) {

        String token = generateSecureToken();

        

        // Store mapping in protected custom setting

        Token_Mapping__c mapping = new Token_Mapping__c();

        mapping.Token__c = token;

        mapping.Token_Type__c = tokenType;

        mapping.Original_Value__c = encryptSensitiveData(sensitiveData);

        

        insert mapping;

        

        return token;

    }

    

    public static String detokenizeData(String token) {

        Token_Mapping__c mapping = [

            SELECT Original_Value__c 

            FROM Token_Mapping__c 

            WHERE Token__c = :token 

            LIMIT 1

        ];

        

        return decryptSensitiveData(mapping.Original_Value__c);

    }

    

    private static String generateSecureToken() {

        // Generate cryptographically secure token

        Blob randomBytes = Crypto.generateAesKey(128);

        return EncodingUtil.base64Encode(randomBytes);

    }

    

    private static void storeEncryptionKey(Blob key) {

        // Store encryption key securely in protected custom setting or external system

        // This is a simplified implementation

    }

    

    private static Blob retrieveEncryptionKey() {

        // Retrieve the same key used for encryption from secure storage.
        // This stub generates a fresh key, so real decryption would fail; it is illustrative only

        return Crypto.generateAesKey(256);

    }

    

    // Data masking for non-production environments

    public static String maskSensitiveData(String originalValue, String maskingPattern) {

        if (String.isBlank(originalValue)) {

            return originalValue;

        }

        

        switch on maskingPattern {

            when 'EMAIL' {

                return maskEmail(originalValue);

            }

            when 'PHONE' {

                return maskPhoneNumber(originalValue);

            }

            when 'SSN' {

                return maskSSN(originalValue);

            }

            when else {

                return '***MASKED***';

            }

        }

    }

    

    private static String maskEmail(String email) {

        if (!email.contains('@')) {

            return email;

        }

        

        String[] parts = email.split('@');

        String localPart = parts[0];

        String domain = parts[1];

        

        String maskedLocal = localPart.length() > 2 ? 

            localPart.substring(0, 2) + '***' : 

            '***';

            

        return maskedLocal + '@' + domain;

    }

    

    private static String maskPhoneNumber(String phone) {

        if (phone.length() < 4) {

            return '***';

        }

        

        return '***-***-' + phone.substring(phone.length() - 4);

    }

    

    private static String maskSSN(String ssn) {

        if (ssn.length() < 4) {

            return '***';

        }

        

        return '***-**-' + ssn.substring(ssn.length() - 4);

    }

}

33. How do you implement enterprise integration patterns with external systems?

Enterprise integration patterns:


public class IntegrationOrchestrator {

    public interface MessageProcessor {

        void processMessage(IntegrationMessage message);

    }

    

    public class IntegrationMessage {

        public String messageId;

        public String messageType;

        public String source;

        public String destination;

        public Map<String, Object> payload;

        public DateTime timestamp;

        public Integer retryCount;

        

        public IntegrationMessage(String messageType, String source, String destination, Map<String, Object> payload) {

            this.messageId = generateMessageId();

            this.messageType = messageType;

            this.source = source;

            this.destination = destination;

            this.payload = payload;

            this.timestamp = System.now();

            this.retryCount = 0;

        }

        

        private String generateMessageId() {

            return 'MSG_' + System.currentTimeMillis() + '_' + Math.round(Math.random() * 1000);

        }

    }

    

    // Message Router Pattern

    public class MessageRouter {

        private Map<String, MessageProcessor> processors;

        

        public MessageRouter() {

            this.processors = new Map<String, MessageProcessor>();

        }

        

        public void registerProcessor(String messageType, MessageProcessor processor) {

            processors.put(messageType, processor);

        }

        

        public void routeMessage(IntegrationMessage message) {

            MessageProcessor processor = processors.get(message.messageType);

            

            if (processor != null) {

                try {

                    processor.processMessage(message);

                } catch (Exception ex) {

                    handleProcessingError(message, ex);

                }

            } else {

                Logger.log(Logger.LogLevel.WARN, 'MessageRouter', 'routeMessage', 

                          'No processor found for message type: ' + message.messageType, null);

            }

        }

        

        private void handleProcessingError(IntegrationMessage message, Exception ex) {

            message.retryCount++;

            

            if (message.retryCount < 3) {

                // Retry logic

                System.enqueueJob(new RetryProcessor(message));

            } else {

                // Send to dead letter queue

                sendToDeadLetterQueue(message, ex);

            }

        }

    }

    

    // Retry Processor for failed messages

    public class RetryProcessor implements Queueable {

        private IntegrationMessage message;

        

        public RetryProcessor(IntegrationMessage message) {

            this.message = message;

        }

        

        public void execute(QueueableContext context) {

            // Apex cannot sleep in-transaction; real exponential backoff would
            // re-enqueue with System.enqueueJob(job, delayInMinutes) or a
            // scheduled job keyed off message.retryCount

            MessageRouter router = new MessageRouter();

            // Note: processors must be re-registered on this new router
            // instance before routing (registration omitted for brevity)
            router.routeMessage(message);

        }

    }

    

    private static void sendToDeadLetterQueue(IntegrationMessage message, Exception ex) {

        Dead_Letter_Queue__c dlq = new Dead_Letter_Queue__c();

        dlq.Message_Id__c = message.messageId;

        dlq.Message_Type__c = message.messageType;

        dlq.Payload__c = JSON.serialize(message.payload);

        dlq.Error_Message__c = ex.getMessage();

        dlq.Retry_Count__c = message.retryCount;

        

        insert dlq;

    }

}

// Specific message processor implementation

public class AccountSyncProcessor implements IntegrationOrchestrator.MessageProcessor {

    public void processMessage(IntegrationOrchestrator.IntegrationMessage message) {

        // Map the inbound payload onto Account fields and upsert
        // (field mapping omitted; this part of the original listing was truncated)
        System.debug('Syncing account payload: ' + JSON.serialize(message.payload));

    }

}
```
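A minimal wiring sketch showing how the pieces above fit together — the message type, source, and payload values here are hypothetical:

```apex
// Hypothetical wiring: register processors once, then route inbound messages
IntegrationOrchestrator.MessageRouter router = new IntegrationOrchestrator.MessageRouter();
router.registerProcessor('ACCOUNT_SYNC', new AccountSyncProcessor());

IntegrationOrchestrator.IntegrationMessage msg = new IntegrationOrchestrator.IntegrationMessage(
    'ACCOUNT_SYNC', 'ERP', 'Salesforce',
    new Map<String, Object>{ 'externalId' => 'A-1001', 'name' => 'Acme Corp' }
);
router.routeMessage(msg);
```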

34. How do you implement advanced Lightning Web Component architecture with Apex integration?

Advanced LWC-Apex integration patterns:

```apex

// Advanced controller with caching and error handling

public with sharing class AdvancedAccountController {

    // Fully qualified partition: <namespace>.<partition> ('local' is the default namespace)
    private static final String CACHE_PARTITION = 'local.AccountData';

    private static final Integer CACHE_TTL = 3600; // TTL in seconds (1 hour)

    

    // Cacheable method for wire service

    @AuraEnabled(cacheable=true)

    public static Map<String, Object> getAccountsWithMetadata(

        String searchTerm, 

        String industry, 

        Integer pageSize, 

        Integer pageNumber

    ) {

        try {

            String cacheKey = generateCacheKey('getAccounts', new List<String>{

                searchTerm, industry, String.valueOf(pageSize), String.valueOf(pageNumber)

            });

            

            // Try cache first

            Map<String, Object> cachedResult = (Map<String, Object>) 

                Cache.Org.get(CACHE_PARTITION + '.' + cacheKey);

            

            if (cachedResult != null) {

                return cachedResult;

            }

            

            // Build dynamic query

            String query = buildAccountQuery(searchTerm, industry);

            Integer offset = (pageNumber - 1) * pageSize;

            

            List<Account> accounts = Database.query(query + ' LIMIT :pageSize OFFSET :offset');

            Integer totalCount = Database.countQuery(buildCountQuery(searchTerm, industry));

            

            Map<String, Object> result = new Map<String, Object>{

                'accounts' => accounts,

                'totalCount' => totalCount,

                'pageSize' => pageSize,

                'pageNumber' => pageNumber,

                'totalPages' => Math.ceil((Decimal)totalCount / pageSize)

            };

            

            // Cache result

            Cache.Org.put(CACHE_PARTITION + '.' + cacheKey, result, CACHE_TTL);

            

            return result;

            

        } catch (Exception e) {

            throw new AuraHandledException('Error retrieving accounts: ' + e.getMessage());

        }

    }

    

    // Imperative method for data manipulation

    @AuraEnabled

    public static Map<String, Object> updateAccountWithValidation(

        Account account, 

        List<Contact> contacts

    ) {

        Savepoint sp = Database.setSavepoint();

        

        try {

            // Validate account data

            validateAccountData(account);

            

            // Update account

            update account;

            

            // Process related contacts

            Map<String, Object> contactResult = processContacts(account.Id, contacts);

            

            // Invalidate cache

            invalidateAccountCache();

            

            return new Map<String, Object>{

                'success' => true,

                'account' => account,

                'contactResult' => contactResult,

                'message' => 'Account updated successfully'

            };

            

        } catch (Exception e) {

            Database.rollback(sp);

            

            return new Map<String, Object>{

                'success' => false,

                'error' => e.getMessage(),

                'errorType' => e.getTypeName()

            };

        }

    }

    

    // Streaming data for real-time updates

    @AuraEnabled

    public static String subscribeToAccountUpdates(Id accountId) {

        try {

            // Create platform event subscription

            Account_Update_Event__e updateEvent = new Account_Update_Event__e();

            updateEvent.Account_Id__c = accountId;

            updateEvent.Subscriber_Id__c = UserInfo.getUserId();

            updateEvent.Subscription_Type__c = 'REAL_TIME_UPDATES';

            

            Database.SaveResult result = EventBus.publish(updateEvent);

            

            if (result.isSuccess()) {

                return 'Subscription created successfully';

            } else {

                throw new AuraHandledException('Failed to create subscription');

            }

            

        } catch (Exception e) {

            throw new AuraHandledException('Error creating subscription: ' + e.getMessage());

        }

    }

    

    // File upload handling

    @AuraEnabled

    public static Map<String, Object> uploadAccountDocuments(

        Id accountId,

        String fileName,

        String base64Data,

        String contentType

    ) {

        try {

            // Validate file

            validateFileUpload(fileName, base64Data, contentType);

            

            // Create content version

            ContentVersion cv = new ContentVersion();

            cv.Title = fileName;

            cv.PathOnClient = fileName;

            cv.VersionData = EncodingUtil.base64Decode(base64Data);

            cv.ContentType = contentType;

            

            insert cv;

            

            // Link to account

            ContentDocumentLink cdl = new ContentDocumentLink();

            cdl.LinkedEntityId = accountId;

            cdl.ContentDocumentId = [SELECT ContentDocumentId FROM ContentVersion WHERE Id = :cv.Id].ContentDocumentId;

            cdl.ShareType = 'V';

            cdl.Visibility = 'AllUsers';

            

            insert cdl;

            

            return new Map<String, Object>{

                'success' => true,

                'fileId' => cv.Id,

                'message' => 'File uploaded successfully'

            };

            

        } catch (Exception e) {

            throw new AuraHandledException('Error uploading file: ' + e.getMessage());

        }

    }

    

    // Batch operations for bulk updates

    @AuraEnabled

    public static Map<String, Object> bulkUpdateAccounts(List<Map<String, Object>> accountData) {

        try {

            List<Account> accountsToUpdate = new List<Account>();

            List<String> errors = new List<String>();

            

            for (Map<String, Object> data : accountData) {

                try {

                    Account acc = new Account();

                    acc.Id = (Id) data.get('Id');

                    acc.Name = (String) data.get('Name');

                    acc.Industry = (String) data.get('Industry');

                    acc.Rating = (String) data.get('Rating');

                    

                    validateAccountData(acc);

                    accountsToUpdate.add(acc);

                    

                } catch (Exception e) {

                    errors.add('Error processing account ' + data.get('Id') + ': ' + e.getMessage());

                }

            }

            

            // Perform bulk update

            Database.SaveResult[] results = Database.update(accountsToUpdate, false);

            

            Integer successCount = 0;

            for (Integer i = 0; i < results.size(); i++) {

                if (results[i].isSuccess()) {

                    successCount++;

                } else {

                    errors.add('Failed to update account ' + accountsToUpdate[i].Id + ': ' + 

                             results[i].getErrors()[0].getMessage());

                }

            }

            

            // Invalidate cache

            invalidateAccountCache();

            

            return new Map<String, Object>{

                'success' => errors.isEmpty(),

                'successCount' => successCount,

                'totalCount' => accountData.size(),

                'errors' => errors

            };

            

        } catch (Exception e) {

            throw new AuraHandledException('Error in bulk update: ' + e.getMessage());

        }

    }

    

    // Helper methods

    private static String buildAccountQuery(String searchTerm, String industry) {

        // Note: IsPersonAccount is only queryable when Person Accounts are enabled
        String baseQuery = 'SELECT Id, Name, Industry, Rating, AnnualRevenue, Phone, Website ' +

                          'FROM Account WHERE IsPersonAccount = false';

        

        List<String> conditions = new List<String>();

        

        if (String.isNotBlank(searchTerm)) {

            conditions.add('(Name LIKE \'%' + String.escapeSingleQuotes(searchTerm) + '%\' OR ' +

                          'Phone LIKE \'%' + String.escapeSingleQuotes(searchTerm) + '%\')');

        }

        

        if (String.isNotBlank(industry)) {

            conditions.add('Industry = \'' + String.escapeSingleQuotes(industry) + '\'');

        }

        

        if (!conditions.isEmpty()) {

            baseQuery += ' AND ' + String.join(conditions, ' AND ');

        }

        

        return baseQuery + ' ORDER BY Name ASC';

    }

    

    private static String buildCountQuery(String searchTerm, String industry) {

        String baseQuery = 'SELECT COUNT() FROM Account WHERE IsPersonAccount = false';

        

        List<String> conditions = new List<String>();

        

        if (String.isNotBlank(searchTerm)) {

            conditions.add('(Name LIKE \'%' + String.escapeSingleQuotes(searchTerm) + '%\' OR ' +

                          'Phone LIKE \'%' + String.escapeSingleQuotes(searchTerm) + '%\')');

        }

        

        if (String.isNotBlank(industry)) {

            conditions.add('Industry = \'' + String.escapeSingleQuotes(industry) + '\'');

        }

        

        if (!conditions.isEmpty()) {

            baseQuery += ' AND ' + String.join(conditions, ' AND ');

        }

        

        return baseQuery;

    }

    

    private static void validateAccountData(Account account) {

        if (String.isBlank(account.Name)) {

            throw new ValidationException('Account name is required');

        }

        

        if (account.Name.length() > 255) {

            throw new ValidationException('Account name is too long');

        }

        

        // Additional validation logic

    }

    

    private static Map<String, Object> processContacts(Id accountId, List<Contact> contacts) {

        List<Contact> contactsToUpsert = new List<Contact>();

        

        for (Contact con : contacts) {

            con.AccountId = accountId;

            contactsToUpsert.add(con);

        }

        

        Database.UpsertResult[] results = Database.upsert(contactsToUpsert, Contact.Id);

        

        Integer successCount = 0;

        for (Database.UpsertResult result : results) {

            if (result.isSuccess()) {

                successCount++;

            }

        }

        

        return new Map<String, Object>{

            'successCount' => successCount,

            'totalCount' => contacts.size()

        };

    }

    

    private static void validateFileUpload(String fileName, String base64Data, String contentType) {

        // File size validation: limits the base64 payload to ~5MB
        // (base64 inflates the raw file size by roughly a third)
        if (base64Data.length() > 5 * 1024 * 1024) {

            throw new ValidationException('File size exceeds 5MB limit');

        }

        

        // File type validation

        List<String> allowedTypes = new List<String>{'image/jpeg', 'image/png', 'application/pdf'};

        if (!allowedTypes.contains(contentType)) {

            throw new ValidationException('File type not allowed');

        }

    }

    

    private static String generateCacheKey(String method, List<String> parameters) {

        // Platform Cache keys must be alphanumeric, so strip other characters
        return (method + String.join(parameters, '')).replaceAll('[^a-zA-Z0-9]', '');

    }

    

    private static void invalidateAccountCache() {

        // Remove cached account data

        // In a real implementation, you might use cache partitions or patterns

        Cache.Org.remove(CACHE_PARTITION + '.getAccounts');

    }

    

    public class ValidationException extends Exception {}

}
```
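A quick Anonymous Apex check of the cached read path above — the search term, industry, and paging values are hypothetical:

```apex
// Anonymous Apex: fetch the first page of matching accounts
Map<String, Object> page1 = AdvancedAccountController.getAccountsWithMetadata('Acme', 'Energy', 25, 1);
System.debug('Total rows: ' + page1.get('totalCount') + ', pages: ' + page1.get('totalPages'));
```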

35. How do you implement enterprise-grade monitoring and observability in Apex applications?

Comprehensive monitoring and observability framework:

```apex
// Note: Apex inner classes cannot declare static members, so the services in
// this listing (MetricsCollector, TracingService, HealthCheckService) are
// written as top-level classes; in a real org, each lives in its own .cls file.

// Metrics collection

public class MetricsCollector {

        private static Map<String, MetricData> metrics = new Map<String, MetricData>();

        

        public class MetricData {

            public String name;

            public String type; // COUNTER, GAUGE, HISTOGRAM, TIMER

            public Decimal value;

            public Map<String, String> tags;

            public DateTime timestamp;

            

            public MetricData(String name, String type) {

                this.name = name;

                this.type = type;

                this.value = 0;

                this.tags = new Map<String, String>();

                this.timestamp = System.now();

            }

        }

        

        public static void incrementCounter(String metricName, Map<String, String> tags) {

            String key = generateMetricKey(metricName, tags);

            MetricData metric = metrics.get(key);

            

            if (metric == null) {

                metric = new MetricData(metricName, 'COUNTER');

                metric.tags = tags;

                metrics.put(key, metric);

            }

            

            metric.value++;

            metric.timestamp = System.now();

        }

        

        public static void recordGauge(String metricName, Decimal value, Map<String, String> tags) {

            String key = generateMetricKey(metricName, tags);

            MetricData metric = new MetricData(metricName, 'GAUGE');

            metric.value = value;

            metric.tags = tags;

            metrics.put(key, metric);

        }

        

        public static void recordTimer(String metricName, Long durationMs, Map<String, String> tags) {

            String key = generateMetricKey(metricName, tags);

            MetricData metric = new MetricData(metricName, 'TIMER');

            metric.value = durationMs;

            metric.tags = tags;

            metrics.put(key, metric);

        }

        

        public static List<MetricData> getAllMetrics() {

            return metrics.values();

        }

        

        public static void clearMetrics() {

            metrics.clear();

        }

        

        private static String generateMetricKey(String metricName, Map<String, String> tags) {

            String key = metricName;

            if (tags != null && !tags.isEmpty()) {

                List<String> tagPairs = new List<String>();

                for (String tagKey : tags.keySet()) {

                    tagPairs.add(tagKey + '=' + tags.get(tagKey));

                }

                key += '_' + String.join(tagPairs, '_');

            }

            return key;

        }

    }

    

    // Distributed tracing

    public class TracingService {

        private static Map<String, TraceContext> activeTraces = new Map<String, TraceContext>();

        

        public class TraceContext {

            public String traceId;

            public String spanId;

            public String parentSpanId;

            public String operationName;

            public DateTime startTime;

            public DateTime endTime;

            public Map<String, Object> tags;

            public List<LogEvent> logs;

            

            public TraceContext(String operationName) {

                this.traceId = generateTraceId();

                this.spanId = generateSpanId();

                this.operationName = operationName;

                this.startTime = System.now();

                this.tags = new Map<String, Object>();

                this.logs = new List<LogEvent>();

            }

        }

        

        public class LogEvent {

            public DateTime timestamp;

            public String level;

            public String message;

            public Map<String, Object> fields;

            

            public LogEvent(String level, String message) {

                this.timestamp = System.now();

                this.level = level;

                this.message = message;

                this.fields = new Map<String, Object>();

            }

        }

        

        public static TraceContext startTrace(String operationName) {

            TraceContext trace = new TraceContext(operationName);

            activeTraces.put(trace.traceId, trace);

            

            // Add standard tags

            trace.tags.put('user.id', UserInfo.getUserId());

            trace.tags.put('org.id', UserInfo.getOrganizationId());

            trace.tags.put('operation', operationName);

            

            return trace;

        }

        

        public static void finishTrace(String traceId) {

            TraceContext trace = activeTraces.get(traceId);

            if (trace != null) {

                trace.endTime = System.now();

                // @future methods only accept primitive parameters, so the
                // trace is serialized before being handed to the async callout
                sendTraceToMonitoring(JSON.serialize(buildTracePayload(trace)));

                activeTraces.remove(traceId);

            }

        }

        public static void addTraceLog(String traceId, String level, String message) {

            TraceContext trace = activeTraces.get(traceId);

            if (trace != null) {

                trace.logs.add(new LogEvent(level, message));

            }

        }

        public static void addTraceTag(String traceId, String key, Object value) {

            TraceContext trace = activeTraces.get(traceId);

            if (trace != null) {

                trace.tags.put(key, value);

            }

        }

        private static Map<String, Object> buildTracePayload(TraceContext trace) {

            return new Map<String, Object>{
                'traceId' => trace.traceId,
                'spanId' => trace.spanId,
                'operationName' => trace.operationName,
                'startTime' => trace.startTime.getTime(),
                'endTime' => trace.endTime.getTime(),
                'duration' => trace.endTime.getTime() - trace.startTime.getTime(),
                'tags' => trace.tags,
                'logs' => trace.logs
            };

        }

        @future(callout=true)
        private static void sendTraceToMonitoring(String traceJson) {

            try {

                // Send to external tracing system (e.g., Jaeger, Zipkin)
                Http http = new Http();

                HttpRequest request = new HttpRequest();

                request.setEndpoint('https://monitoring-system.example.com/api/traces');

                request.setMethod('POST');

                request.setHeader('Content-Type', 'application/json');

                request.setBody(traceJson);

                HttpResponse response = http.send(request);

                if (response.getStatusCode() != 200) {

                    System.debug('Failed to send trace: ' + response.getBody());

                }

            } catch (Exception e) {

                System.debug('Error sending trace: ' + e.getMessage());

            }

        }

        private static String generateTraceId() {

            return EncodingUtil.convertToHex(Crypto.generateAesKey(128)).substring(0, 16);

        }

        private static String generateSpanId() {

            // Crypto.generateAesKey only accepts 128/192/256-bit sizes;
            // take a shorter slice of the hex rather than requesting a 64-bit key
            return EncodingUtil.convertToHex(Crypto.generateAesKey(128)).substring(0, 8);

        }

    }

    

    // Health check system

    public class HealthCheckService {

        public interface HealthCheck {

            HealthStatus check();

            String getName();

        }

        

        public class HealthStatus {

            public Boolean healthy;

            public String status; // UP, DOWN, DEGRADED

            public String message;

            public Map<String, Object> details;

            

            public HealthStatus(Boolean healthy, String status, String message) {

                this.healthy = healthy;

                this.status = status;

                this.message = message;

                this.details = new Map<String, Object>();

            }

        }

        

        private static List<HealthCheck> healthChecks = new List<HealthCheck>();

        

        public static void registerHealthCheck(HealthCheck check) {

            healthChecks.add(check);

        }

        

        public static Map<String, Object> performHealthCheck() {

            Map<String, Object> overallHealth = new Map<String, Object>();

            Map<String, HealthStatus> checkResults = new Map<String, HealthStatus>();

            

            Boolean overallHealthy = true;

            

            for (HealthCheck check : healthChecks) {

                try {

                    HealthStatus status = check.check();

                    checkResults.put(check.getName(), status);

                    

                    if (!status.healthy) {

                        overallHealthy = false;

                    }

                } catch (Exception ex) {

                    HealthStatus failed = new HealthStatus(false, 'DOWN',
                        'Health check threw: ' + ex.getMessage());

                    checkResults.put(check.getName(), failed);

                    overallHealthy = false;

                }

            }

            overallHealth.put('status', overallHealthy ? 'UP' : 'DOWN');

            overallHealth.put('checks', checkResults);

            overallHealth.put('timestamp', System.now());

            return overallHealth;

        }

    }
```
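A usage sketch tying the metrics and tracing pieces together — the operation and tag names are hypothetical, and `MetricsCollector` and `TracingService` are assumed to be reachable as top-level classes:

```apex
// Hypothetical instrumentation of a business operation
TracingService.TraceContext trace = TracingService.startTrace('syncAccounts');
try {
    MetricsCollector.incrementCounter('sync.runs', new Map<String, String>{ 'source' => 'ERP' });
    // ... business logic ...
    TracingService.addTraceLog(trace.traceId, 'INFO', 'Sync completed');
} finally {
    TracingService.finishTrace(trace.traceId);
}
```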

Engineering teams face a critical challenge when hiring developers with Apex expertise. Whether you're building rapid web applications with Oracle APEX or implementing complex business logic with Salesforce Apex, finding candidates who can deliver from day one requires precise technical evaluation.

This comprehensive guide provides 80+ carefully curated interview questions designed specifically for engineering leaders who need to identify genuine Apex expertise. We've structured questions across both Oracle APEX and Salesforce Apex platforms, categorized by skill level to help you assess candidates accurately.

## Why This Guide Matters for Engineering Teams

As organizations increasingly rely on low-code platforms and Salesforce ecosystems, the demand for skilled Apex developers has skyrocketed. However, resumes often don't reflect real-world problem-solving abilities. This guide helps you:

- **Evaluate actual coding skills** beyond theoretical knowledge

- **Assess architectural thinking** for complex enterprise solutions

- **Identify candidates who understand performance implications**

- **Test real-world scenario handling** rather than textbook answers

The questions in this guide have been validated by engineering teams at companies ranging from startups to Fortune 500 organizations, ensuring they reflect actual job requirements rather than academic concepts.

---

## Oracle APEX Interview Questions

Oracle APEX (Application Express) enables rapid development of data-driven web applications. These questions test candidates' ability to build scalable, secure applications using Oracle's low-code platform.

### Beginner Level Questions (1-15)

**1. What is Oracle APEX and how does it differ from traditional web development frameworks?**

Oracle APEX is a low-code development platform that runs entirely within the Oracle Database. Unlike traditional frameworks that require separate application servers, databases, and extensive coding, APEX provides a declarative development environment where applications are built through configuration rather than custom code.

Key differences include:

- **Tight database integration**: Applications run within the Oracle Database, eliminating the need for separate application tiers

- **Declarative development**: Components are configured through wizards and forms rather than coded from scratch

- **Built-in security**: Automatic protection against SQL injection, XSS, and other common vulnerabilities

- **Rapid deployment**: Applications can be built and deployed in hours rather than weeks

**2. Explain the architecture of Oracle APEX.**

APEX architecture consists of four main components:

- **Oracle Database**: Contains the APEX metadata repository, application logic, and data

- **APEX Listener (ORDS)**: Oracle REST Data Services handles HTTP requests and communicates with the database

- **Web Server**: Hosts static files and routes requests to ORDS

- **Browser**: Renders the HTML, CSS, and JavaScript generated by APEX

The request flow: Browser → Web Server → ORDS → Oracle Database → APEX Engine → Response back through the chain.

**3. What is a workspace in Oracle APEX?**

A workspace is a virtual private database that groups APEX applications, users, and database schemas. It provides:

- **Isolation**: Each workspace operates independently with its own users and applications

- **Security boundary**: Users in one workspace cannot access another workspace's applications

- **Schema mapping**: Associates the workspace with one or more database schemas

- **Administration**: Manages developers, end users, and application settings

**4. Describe the difference between a page and a region in APEX.**

- **Page**: A complete screen or view in an APEX application, accessible via a unique URL. Contains regions, items, buttons, and processes

- **Region**: A container within a page that displays specific content like reports, forms, charts, or static content. Multiple regions can exist on a single page

Think of a page as a webpage and regions as sections or widgets within that page.

**5. What are the main types of reports available in Oracle APEX?**

- **Classic Report**: Simple tabular data display with basic sorting and pagination

- **Interactive Report**: Advanced user-customizable reports with filtering, searching, grouping, and personal customizations

- **Interactive Grid**: Spreadsheet-like interface allowing inline editing, adding, and deleting records

- **Cards**: Visual representation of data in card format

- **Chart**: Graphical data representation (bar, pie, line charts, etc.)

**6. How do you implement master-detail relationships in APEX?**

Master-detail relationships are implemented by:

1. **Creating the master form/report** based on the parent table

2. **Adding a detail region** on the same page or linked page

3. **Setting the master-detail relationship** in the detail region properties

4. **Configuring the link column** that connects master to detail records

5. **Setting up automatic refresh** so detail records update when master selection changes

**7. What is a Dynamic Action in Oracle APEX?**

Dynamic Actions provide client-side interactivity without page refreshes. They consist of:

- **When**: Event trigger (button click, item change, page load)

- **Event**: Specific action that triggers the dynamic action

- **Condition**: Optional criteria that must be met

- **Action**: What happens (show/hide items, execute JavaScript, refresh regions)

Example: Hide a region when a checkbox is unchecked, or refresh a report when a select list value changes.

**8. Explain the concept of session state in APEX.**

Session state maintains data values across pages and user interactions within an APEX session. It includes:

- **Page items**: Values entered in forms or selected from lists

- **Application items**: Global variables accessible across all pages

- **Session ID**: Unique identifier for the user session

- **Automatic management**: APEX handles session creation, maintenance, and cleanup

**9. What are the different authentication schemes available in APEX?**

- **APEX Accounts**: Internal APEX user management

- **Database Accounts**: Uses Oracle Database user authentication

- **LDAP Directory**: Integrates with LDAP servers like Active Directory

- **Social Sign-On**: OAuth integration with Google, Facebook, etc.

- **Custom**: User-defined authentication logic using PL/SQL

- **No Authentication**: For public applications

**10. How do you handle file uploads in Oracle APEX?**

File uploads are handled using:

1. **File Browse item**: Allows users to select files from their device

2. **BLOB storage**: Files stored as Binary Large Objects in database tables

3. **File processing**: PL/SQL logic to handle uploaded files

4. **Validation**: File type, size, and content validation

5. **Download mechanism**: Process to retrieve and serve uploaded files

**11. What is the difference between before and after page processes?**

- **Before Header**: Executes before the page is rendered, useful for authentication and data initialization

- **After Header**: Runs after page rendering but before the user sees it

- **On Load**: Executes when the page loads

- **On Submit - Before Computations**: Runs before any computations when form is submitted

- **On Submit - After Computations**: Runs after computations but before validations

**12. How do you implement conditional rendering in APEX?**

Conditional rendering controls when components display based on:

- **Item values**: Show region only if specific item has certain value

- **User attributes**: Display based on user role or authorization

- **PL/SQL expressions**: Custom logic determining visibility

- **Page items**: Conditions based on other page item values

- **Application items**: Global conditions affecting multiple pages

**13. What are shared components in Oracle APEX?**

Shared components are reusable elements available across an application:

- **Lists of Values (LOVs)**: Dropdown options used in multiple places

- **Templates**: HTML structures for consistent appearance

- **Authentication schemes**: Login mechanisms

- **Authorization schemes**: Access control rules

- **Themes**: Overall application appearance and styling

- **Web service references**: External API connections

**14. Explain the difference between application items and page items.**

- **Page Items**: Scope limited to a specific page, automatically managed by APEX, used for user input and page-specific data

- **Application Items**: Global scope across entire application, manually managed, used for session-wide data like user preferences or application state

**15. How do you create cascading LOVs (Lists of Values)?**

Cascading LOVs create dependent dropdowns where the second list's options depend on the first list's selection:

1. **Create parent LOV**: First dropdown with independent values

2. **Create dependent LOV**: Second dropdown with SQL query referencing parent item

3. **Set cascading parent**: Configure the dependency relationship

4. **Add refresh action**: Ensure dependent LOV updates when parent changes
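A dependent LOV is typically just a query that binds the parent item; a sketch (item, table, and column names are illustrative):

```sql
-- Child LOV: employees limited to the department chosen in the parent item.
-- :P1_DEPARTMENT_ID is a hypothetical page item configured as the
-- Cascading LOV Parent Item, so APEX refreshes this list automatically.
SELECT ename AS display_value,
       empno AS return_value
  FROM emp
 WHERE deptno = :P1_DEPARTMENT_ID
 ORDER BY ename
```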

### Intermediate Level Questions (16-27)

**16. How do you optimize performance for large datasets in APEX applications?**

Performance optimization strategies include:

- **Pagination**: Limit records displayed per page using row limiting

- **Lazy loading**: Load data only when needed

- **Efficient SQL**: Use proper indexing, avoid SELECT *, optimize joins

- **Caching**: Enable region and application-level caching

- **Asynchronous processing**: Use background jobs for heavy operations

- **Partial page refresh**: Update only necessary regions instead of full page reload
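As one concrete example, row limiting with Oracle 12c+ syntax keeps each page request small (table and item names are illustrative):

```sql
-- Fetch a single page of rows rather than the entire table.
SELECT order_id, customer_name, order_total
  FROM orders
 ORDER BY order_date DESC
OFFSET :P1_OFFSET ROWS FETCH NEXT 25 ROWS ONLY
```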

**17. Describe how to implement custom authentication in Oracle APEX.**

Custom authentication involves:

1. **Create authentication scheme**: Go to Shared Components > Authentication Schemes

2. **Define PL/SQL function**: Write authentication logic that returns TRUE/FALSE

3. **Session management**: Handle user session creation and validation

4. **Login page customization**: Create custom login interface

5. **Post-authentication processing**: Set session attributes and redirect logic

```sql
FUNCTION custom_authenticate(p_username VARCHAR2, p_password VARCHAR2)
RETURN BOOLEAN IS
BEGIN
  -- Custom authentication logic
  IF validate_user_credentials(p_username, p_password) THEN
    -- Set session attributes
    RETURN TRUE;
  ELSE
    RETURN FALSE;
  END IF;
END;
```

**18. How do you handle RESTful web services in APEX?**

RESTful services in APEX involve:

- **Creating REST endpoints**: Define URI templates and HTTP methods

- **Data source modules**: Configure external REST API connections

- **Authentication**: Set up OAuth, API keys, or basic authentication

- **Request/response handling**: Map JSON/XML to APEX items and collections

- **Error handling**: Implement robust error handling for service failures
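A minimal consumption sketch using the `APEX_WEB_SERVICE` package (the endpoint URL is illustrative):

```sql
DECLARE
  l_response CLOB;
BEGIN
  -- Invoke an external REST endpoint; auth headers would be added via
  -- apex_web_service.g_request_headers before the call.
  l_response := apex_web_service.make_rest_request(
                  p_url         => 'https://api.example.com/orders',
                  p_http_method => 'GET');
  -- Parse l_response (e.g. with APEX_JSON or SQL/JSON functions)
END;
```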

**19. What are APEX collections and when would you use them?**

APEX collections are temporary, session-specific data structures that:

- **Store temporary data**: Hold data during a user session without database commits

- **Manipulate datasets**: Sort, filter, and modify data before database operations

- **Cross-page data**: Share data between pages within a session

- **Report building**: Create complex reports from multiple data sources

- **Wizard implementations**: Store multi-step form data
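The `APEX_COLLECTION` API drives most of these use cases; a short sketch (the collection name and column usage are illustrative):

```sql
BEGIN
  -- Create (or reset) a session-scoped collection and add one member.
  apex_collection.create_or_truncate_collection(
    p_collection_name => 'CART_ITEMS');

  apex_collection.add_member(
    p_collection_name => 'CART_ITEMS',
    p_c001            => 'Widget',  -- item name
    p_n001            => 3);        -- quantity
END;
```

Members are then queryable through the `APEX_COLLECTIONS` view within the same session.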

**20. How do you implement row-level security in APEX applications?**

Row-level security implementation:

- **VPD policies**: Virtual Private Database policies at the database level

- **Authorization schemes**: APEX-level access control rules

- **Shared components**: Reusable security logic across applications

- **Session attributes**: User-specific security context

- **SQL filtering**: Dynamic WHERE clauses based on user permissions
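As a sketch of the VPD approach, a policy can be attached with `DBMS_RLS` (schema, table, and function names are hypothetical):

```sql
-- Every statement against ORDERS is transparently filtered by the
-- predicate returned from orders_policy_fn, e.g.
--   'owner_id = SYS_CONTEXT(''APP_CTX'', ''USER_ID'')'
BEGIN
  dbms_rls.add_policy(
    object_schema   => 'APP_OWNER',
    object_name     => 'ORDERS',
    policy_name     => 'orders_rls',
    function_schema => 'APP_OWNER',
    policy_function => 'orders_policy_fn',
    statement_types => 'SELECT, INSERT, UPDATE, DELETE');
END;
```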

**21. Explain the APEX plugin architecture and how to develop custom plugins.**

APEX plugins extend functionality through:

1. **Plugin types**: Region, item, dynamic action, process, or authorization

2. **PL/SQL code**: Server-side logic for data processing

3. **JavaScript/CSS**: Client-side behavior and styling

4. **Configuration options**: Parameters for plugin customization

5. **Installation**: Packaged for deployment across applications

**22. How do you handle large file uploads and downloads in APEX?**

Large file handling strategies:

- **Chunked uploads**: Break large files into smaller pieces

- **Background processing**: Use APEX collections or temporary tables

- **Streaming**: Process files without loading them entirely into memory

- **Compression**: Reduce file sizes before storage

- **Progress indicators**: Provide user feedback during operations

- **Error recovery**: Handle interrupted uploads gracefully

**23. What is the role of APEX Listener (ORDS) and how do you configure it?**

ORDS serves as the web server component that:

- **Handles HTTP requests**: Processes incoming web requests

- **Database connectivity**: Manages connection pooling to Oracle Database

- **REST services**: Exposes database operations as REST APIs

- **Static file serving**: Handles images, CSS, JavaScript files

- **Security**: Implements SSL/TLS and authentication protocols

Configuration involves setting connection pools, security, and deployment parameters.

**24. How do you implement complex business rules in APEX?**

Complex business rules implementation:

- **PL/SQL packages**: Centralized business logic separate from presentation

- **Database triggers**: Automatic enforcement of data integrity rules

- **APEX validations**: Page-level business rule validation

- **Dynamic actions**: Client-side rule enforcement

- **Workflow engines**: For complex approval processes

- **Custom computations**: Calculated fields based on business logic

**25. Describe APEX application deployment strategies.**

Deployment strategies include:

- **Export/Import**: Manual application export and import between environments

- **SQL*Plus scripts**: Automated deployment using command-line tools

- **Version control**: Integration with Git or other VCS systems

- **Environment management**: Separate development, test, and production environments

- **Data migration**: Handling data differences between environments

- **Rollback procedures**: Ability to revert problematic deployments

**26. How do you integrate APEX with external systems?**

Integration approaches:

- **Web services**: REST and SOAP service consumption

- **Database links**: Direct database-to-database connections

- **Message queues**: Asynchronous integration using Oracle AQ

- **File-based**: CSV, XML, JSON file processing

- **API gateways**: Centralized API management

- **ETL processes**: Extract, Transform, Load operations

**27. What are the security best practices for APEX applications?**

Security best practices:

- **Input validation**: Validate all user inputs at multiple levels

- **SQL injection prevention**: Use bind variables and parameterized queries

- **XSS protection**: Escape output and use Content Security Policy

- **Authentication**: Implement strong authentication mechanisms

- **Authorization**: Fine-grained access control

- **Session management**: Secure session handling and timeout

- **HTTPS enforcement**: Encrypt all communications

- **Regular updates**: Keep APEX and database patches current
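For instance, SQL injection prevention usually comes down to referencing page items as bind variables instead of concatenating them into the statement (item name illustrative):

```sql
-- Safe: :P1_SEARCH is bound by APEX, never spliced into the SQL text.
SELECT ename, job, sal
  FROM emp
 WHERE UPPER(ename) LIKE '%' || UPPER(:P1_SEARCH) || '%'
```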

### Expert Level Questions (28-40)

**28. How do you design multi-tenant applications in Oracle APEX?**

Multi-tenant design approaches:

- **Schema separation**: Each tenant has a separate database schema

- **Row-level separation**: Shared schema with tenant ID filtering

- **VPD implementation**: Virtual Private Database for automatic filtering

- **Workspace isolation**: Separate APEX workspaces per tenant

- **Configuration management**: Tenant-specific settings and customizations

- **Performance considerations**: Resource allocation and monitoring per tenant

**29. Describe advanced performance tuning techniques for APEX applications.**

Advanced performance tuning:

- **Database optimization**: Query tuning, indexing strategies, execution plan analysis

- **APEX-specific tuning**: Region caching, lazy loading, efficient page design

- **Network optimization**: Compression, CDN usage, static file optimization

- **Memory management**: Session state optimization, collection management

- **Monitoring**: Performance metrics collection and analysis

- **Scalability planning**: Load balancing, connection pooling configuration

**30. How do you implement custom PDF generation in APEX?**

PDF generation approaches:

- **APEX native**: Built-in PDF printing capabilities

- **BI Publisher**: Oracle's enterprise reporting solution

- **PL/PDF**: PL/SQL library for PDF creation

- **Custom solutions**: Third-party tools or cloud services

- **Template design**: Creating professional report layouts

- **Data integration**: Merging database data with PDF templates

**31. Explain the integration between APEX and Oracle Database advanced features.**

Advanced database feature integration:

- **Partitioning**: Working with partitioned tables and parallel processing

- **Analytics**: Using Oracle analytic functions for complex calculations

- **Spatial data**: Geographic information system capabilities

- **Text search**: Oracle Text integration for full-text search

- **Data warehousing**: APEX as a BI front-end for data warehouses

- **Advanced security**: Label security, data masking, encryption

**32. How do you handle real-time data updates in APEX applications?**

Real-time updates implementation:

- **WebSockets**: Persistent connections for live data streaming

- **APEX push notifications**: Server-initiated client updates

- **Polling mechanisms**: Automatic refresh of data regions

- **Database change notification**: Responding to database triggers

- **Message queues**: Asynchronous messaging for real-time updates

- **Event-driven architecture**: Publish-subscribe patterns

**33. Describe advanced authorization and access control patterns.**

Advanced access control:

- **Attribute-based access**: Dynamic permissions based on user attributes

- **Context-aware security**: Access control based on location, time, device

- **Hierarchical permissions**: Role inheritance and delegation

- **Data classification**: Different access levels based on data sensitivity

- **Audit trails**: Comprehensive logging of access and modifications

- **Fine-grained authorization**: Column- and row-level access control

**34. How do you implement complex data migration strategies in APEX?**

Data migration strategies:

- **ETL processes**: Extract, Transform, Load operations for large datasets

- **Incremental migration**: Moving data in phases to minimize downtime

- **Data validation**: Ensuring data integrity during migration

- **Rollback procedures**: Ability to revert failed migrations

- **Performance optimization**: Parallel processing and bulk operations

- **Legacy system integration**: Handling data from multiple source systems

**35. What are the considerations for APEX cloud deployment and scalability?**

Cloud deployment considerations:

- **Oracle Cloud Infrastructure**: APEX on Autonomous Database

- **Container deployment**: Docker and Kubernetes strategies

- **Auto-scaling**: Dynamic resource allocation based on load

- **High availability**: Multi-region deployment and failover

- **Disaster recovery**: Backup and recovery strategies

- **Cost optimization**: Resource utilization and pricing models

**36. How do you implement advanced workflow and approval processes?**

Workflow implementation:

- **State machines**: Complex approval state management

- **Dynamic routing**: Approval paths based on business rules

- **Parallel processing**: Multiple approvers simultaneously

- **Escalation procedures**: Automatic escalation for delayed approvals

- **Integration**: Workflow engines and external approval systems

- **Audit capabilities**: Complete approval history and tracking

**37. Describe advanced integration patterns with Oracle E-Business Suite.**

EBS integration patterns:

- **API utilization**: Oracle EBS APIs for data operations

- **Single sign-on**: Seamless authentication between systems

- **Data synchronization**: Real-time or batch data updates

- **Workflow integration**: APEX as an approval interface for EBS processes

- **Reporting enhancement**: APEX reports for EBS data

- **Mobile enablement**: APEX mobile interfaces for EBS functionality

**38. How do you handle advanced error handling and debugging in production APEX applications?**

Production error handling:

- **Comprehensive logging**: Detailed error logging with context

- **Error notification**: Automatic alerts for critical errors

- **User-friendly messages**: Meaningful error messages for end users

- **Debug modes**: Production-safe debugging capabilities

- **Performance monitoring**: Application performance metrics

- **Health checks**: Automated application health monitoring

**39. What are the advanced customization techniques for APEX themes and templates?**

Advanced customization:

- **Custom CSS frameworks**: Integration with modern CSS frameworks

- **JavaScript integration**: Advanced client-side functionality

- **Responsive design**: Mobile-first design principles

- **Component libraries**: Reusable UI component development

- **Accessibility compliance**: WCAG guidelines implementation

- **Brand integration**: Corporate branding and style guidelines

**40. How do you implement enterprise-grade monitoring and analytics for APEX applications?**

Enterprise monitoring:

- **Application metrics**: User activity, performance metrics, error rates

- **Database monitoring**: Resource utilization, query performance

- **Business intelligence**: Usage analytics and business insights

- **Capacity planning**: Growth projections and resource planning

- **SLA monitoring**: Service level agreement compliance

- **Integration monitoring**: External system connectivity and performance

## Salesforce Apex Interview Questions

Salesforce Apex is an object-oriented programming language that allows developers to execute flow and transaction control statements on the Salesforce platform. These questions assess candidates' ability to build robust, scalable solutions within the Salesforce ecosystem.

### Beginner Level Questions (1-15)

**1. What is Salesforce Apex and how does it differ from other programming languages?**

Salesforce Apex is a strongly-typed, object-oriented programming language that executes on the Salesforce platform. Key differences include:

- **Cloud-native execution**: Runs entirely on Salesforce servers, not locally

- **Governor limits**: Built-in limits prevent resource abuse in the multi-tenant environment

- **Database integration**: Native integration with Salesforce objects and data

- **Automatic platform features**: Built-in security, sharing, and workflow integration

- **Java-like syntax**: Familiar syntax for Java developers, but with platform-specific features

**2. Explain the different types of Apex triggers and their execution contexts.**

Apex triggers execute in response to data changes:

- **Before triggers**: Execute before records are saved to the database, used for validation and data modification

- **After triggers**: Execute after records are saved, used for operations requiring record IDs

- **Trigger events**: Insert, Update, Delete, Undelete operations

- **Trigger context variables**: isInsert, isUpdate, isDelete, isBefore, isAfter, Trigger.new, Trigger.old

```apex
trigger AccountTrigger on Account (before insert, before update, after insert, after update) {
    if (Trigger.isBefore) {
        // Validation and data modification logic
    }
    if (Trigger.isAfter) {
        // Operations requiring record IDs
    }
}
```

**3. What are SOQL and SOSL, and when would you use each?**

- **SOQL (Salesforce Object Query Language)**: Queries a single object or related objects, returns specific records

- **SOSL (Salesforce Object Search Language)**: Searches across multiple objects, returns records containing search terms

Use SOQL for specific data retrieval, SOSL for broad searches across multiple objects.

```apex
// SOQL example
List<Account> accounts = [SELECT Id, Name FROM Account WHERE Industry = 'Technology'];

// SOSL example
List<List<SObject>> searchResults = [FIND 'John' IN ALL FIELDS RETURNING Account, Contact];
```

**4. How do you handle exceptions in Apex?**

Exception handling uses try-catch blocks:

```apex
try {
    // Code that might throw an exception
    insert accountList;
} catch (DMLException e) {
    // Handle DML-specific exceptions
    System.debug('DML Error: ' + e.getMessage());
} catch (Exception e) {
    // Handle general exceptions
    System.debug('General Error: ' + e.getMessage());
} finally {
    // Cleanup code that always executes
}
```

**5. What is the difference between with sharing and without sharing keywords?**

- **with sharing**: Enforces the user's sharing rules and permissions

- **without sharing**: Runs with full access, ignoring user permissions

- **inherited sharing**: Inherits the sharing context from the calling class

```apex
public with sharing class AccountService {
    // Respects user sharing rules
}

public without sharing class SystemService {
    // Runs with system permissions
}
```

**6. Explain the concept of governor limits in Salesforce.**

Governor limits prevent resource abuse in the multi-tenant environment:

- **SOQL queries**: 100 synchronous, 200 asynchronous per transaction

- **DML statements**: 150 per transaction

- **Heap size**: 6MB synchronous, 12MB asynchronous

- **CPU time**: 10 seconds synchronous, 60 seconds asynchronous

- **Callouts**: 100 per transaction
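Code can inspect its own consumption at runtime through the `Limits` class; a brief sketch:

```apex
// Compare usage against the ceiling for two common limits.
System.debug('SOQL: ' + Limits.getQueries() + '/' + Limits.getLimitQueries());
System.debug('DML: ' + Limits.getDMLStatements() + '/' + Limits.getLimitDMLStatements());

// Skip optional work when the transaction is nearly exhausted.
if (Limits.getQueries() > Limits.getLimitQueries() - 5) {
    System.debug(LoggingLevel.WARN, 'Approaching the SOQL query limit');
}
```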

**7. What are the different types of collections in Apex?**

- **List**: Ordered collection allowing duplicates

- **Set**: Unordered collection of unique elements

- **Map**: Key-value pairs for efficient lookups

```apex
List<String> stringList = new List<String>();
Set<Id> idSet = new Set<Id>();
Map<Id, Account> accountMap = new Map<Id, Account>();
```

**8. How do you write test classes in Apex?**

Test classes ensure code quality and are required for deployment:

```apex
@isTest
public class AccountTriggerTest {
    @testSetup
    static void setupTestData() {
        // Create test data
    }

    @isTest
    static void testAccountInsert() {
        Test.startTest();
        // Test logic
        Test.stopTest();

        // Assertions
        System.assertEquals(expected, actual);
    }
}
```

**9. What is the difference between static and instance methods?**

- **Static methods**: Belong to the class, called without creating an instance, cannot access instance variables

- **Instance methods**: Belong to an object instance, can access instance variables

```apex
public class Calculator {
    public static Integer add(Integer a, Integer b) {
        return a + b; // Static method
    }

    public Integer instanceVariable = 0;
    public void setVariable(Integer value) {
        this.instanceVariable = value; // Instance method
    }
}
```

**10. Explain the order of execution in Salesforce.**

The order of execution for record processing:

1. System validation rules

2. Before triggers

3. Custom validation rules

4. After triggers

5. Assignment rules

6. Auto-response rules

7. Workflow rules

8. Processes and flows

9. Escalation rules

10. Roll-up summary field updates

11. Criteria-based sharing rules

**11. What are future methods and when would you use them?**

Future methods execute asynchronously:

```apex
public class ExternalService {
    @future(callout=true)
    public static void makeCallout(String endpoint) {
        // Asynchronous callout logic
    }

    @future
    public static void heavyProcessing(Set<Id> recordIds) {
        // Time-consuming operations
    }
}
```

Use cases: external callouts, heavy processing, mixed DML operations.

**12. How do you implement pagination in Visualforce or Lightning components?**

Pagination handles large datasets efficiently:

```apex
public class AccountController {
    public ApexPages.StandardSetController setCon {get; set;}

    public AccountController() {
        setCon = new ApexPages.StandardSetController([SELECT Id, Name FROM Account]);
        setCon.setPageSize(10);
    }

    public List<Account> getAccounts() {
        return (List<Account>) setCon.getRecords();
    }

    public Boolean hasNext() {
        return setCon.getHasNext();
    }

    public PageReference next() {
        setCon.next();
        return null;
    }
}
```

**13. What is the difference between insert and Database.insert?**

- **insert**: DML statement that throws an exception on failure

- **Database.insert**: Database method allowing partial success

```apex
// Traditional DML
try {
    insert accountList;
} catch (DMLException e) {
    // Handle exception
}

// Database method
Database.SaveResult[] results = Database.insert(accountList, false);
for (Database.SaveResult result : results) {
    if (!result.isSuccess()) {
        // Handle individual failures
    }
}
```

**14. How do you handle bulk operations in Apex?**

Bulk operations process multiple records efficiently:

```apex
public class BulkAccountProcessor {
    public static void updateAccounts(List<Account> accounts) {
        List<Account> accountsToUpdate = new List<Account>();

        for (Account acc : accounts) {
            if (acc.AnnualRevenue > 1000000) {
                acc.Type = 'Enterprise';
                accountsToUpdate.add(acc);
            }
        }

        if (!accountsToUpdate.isEmpty()) {
            update accountsToUpdate;
        }
    }
}
```

**15. What are custom settings and custom metadata types?**

- **Custom settings**: Application data cached at the organization, profile, or user level

- **Custom metadata types**: Metadata that can be deployed and is accessible via SOQL

Use custom settings for configuration data, custom metadata for deployable application metadata.
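A brief sketch of reading each (the `__c` and `__mdt` API names are hypothetical):

```apex
// Hierarchy custom setting: cached reads, no SOQL consumed.
App_Config__c cfg = App_Config__c.getOrgDefaults();
System.debug('Batch size: ' + cfg.Batch_Size__c);

// Custom metadata type: deployable records, also cached via getAll().
Map<String, Feature_Flag__mdt> flags = Feature_Flag__mdt.getAll();
Boolean enabled = flags.containsKey('New_UI') && flags.get('New_UI').Is_Active__c;
```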

### Intermediate Level Questions (16-27)

**16. How do you implement trigger design patterns to avoid recursion?**

Recursion prevention using static variables:

```apex
public class TriggerHelper {
    private static Boolean isExecuting = false;
    private static Set<Id> processedIds = new Set<Id>();

    public static Boolean isFirstRun() {
        if (!isExecuting) {
            isExecuting = true;
            return true;
        }
        return false;
    }

    public static Boolean isProcessed(Id recordId) {
        return processedIds.contains(recordId);
    }

    public static void addProcessed(Id recordId) {
        processedIds.add(recordId);
    }
}
```

**17. Explain batch Apex and provide an implementation example.**

Batch Apex processes large datasets asynchronously:

```apex
public class AccountBatch implements Database.Batchable<sObject>, Database.Stateful {
    private Integer recordsProcessed = 0;

    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name FROM Account WHERE Type = null');
    }

    public void execute(Database.BatchableContext bc, List<Account> scope) {
        for (Account acc : scope) {
            acc.Type = 'Prospect';
        }
        update scope;
        recordsProcessed += scope.size();
    }

    public void finish(Database.BatchableContext bc) {
        System.debug('Processed ' + recordsProcessed + ' records');
    }
}
```

**18. How do you implement asynchronous processing with Queueable Apex?**

Queueable Apex for chainable asynchronous operations:

```apex
public class AccountProcessor implements Queueable {
    private List<Account> accounts;
    private Integer batchSize;

    public AccountProcessor(List<Account> accounts, Integer batchSize) {
        this.accounts = accounts;
        this.batchSize = batchSize;
    }

    public void execute(QueueableContext context) {
        List<Account> batch = new List<Account>();

        for (Integer i = 0; i < Math.min(batchSize, accounts.size()); i++) {
            batch.add(accounts[i]);
        }

        // Process batch
        processAccounts(batch);

        // Chain the next batch if more records remain
        // (Apex List has no subList method, so copy the tail manually)
        if (accounts.size() > batchSize) {
            List<Account> remaining = new List<Account>();
            for (Integer i = batchSize; i < accounts.size(); i++) {
                remaining.add(accounts[i]);
            }
            System.enqueueJob(new AccountProcessor(remaining, batchSize));
        }
    }

    private void processAccounts(List<Account> accounts) {
        // Processing logic
    }
}
```

**19. How do you handle mixed DML operations?**

Mixed DML occurs when setup and non-setup objects are modified in the same transaction:

```apex
public class MixedDMLHandler {
    @future
    public static void createUserAsync(String firstName, String lastName, String email) {
        // Create user in async context to avoid mixed DML
        User newUser = new User(
            FirstName = firstName,
            LastName = lastName,
            Email = email,
            Username = email,
            Alias = firstName.substring(0,1) + lastName.substring(0,4),
            ProfileId = [SELECT Id FROM Profile WHERE Name = 'Standard User'].Id
        );
        insert newUser;
    }

    public static void handleAccountAndUser(Account acc, String userEmail) {
        insert acc; // Non-setup object

        // Use a future method for the setup object to avoid mixed DML
        createUserAsync('John', 'Doe', userEmail);
    }
}
```

**20. Explain the implementation of sharing and security in Apex.**

Sharing and security implementation:

```apex
// Manual sharing
public class AccountSharing {
    public static void shareAccountWithUser(Id accountId, Id userId, String accessLevel) {
        AccountShare sharing = new AccountShare();
        sharing.AccountId = accountId;
        sharing.UserOrGroupId = userId;
        sharing.AccountAccessLevel = accessLevel;
        sharing.OpportunityAccessLevel = 'Read';

        Database.SaveResult result = Database.insert(sharing, false);
        if (!result.isSuccess()) {
            System.debug('Error sharing account: ' + result.getErrors());
        }
    }
}

// Programmatic sharing rules
public inherited sharing class SecureAccountService {
    public static List<Account> getAccessibleAccounts() {
        return [SELECT Id, Name FROM Account WITH SECURITY_ENFORCED];
    }
}
```

**21. How do you implement dynamic SOQL and handle injection prevention?**

Dynamic SOQL with security considerations:

```apex
public class DynamicSOQLService {
    public static List<SObject> queryRecords(String objectName, List<String> fields, String whereClause) {
        // Validate object access
        if (!Schema.getGlobalDescribe().containsKey(objectName)) {
            throw new IllegalArgumentException('Invalid object name');
        }

        // Validate field access
        Map<String, Schema.SObjectField> fieldMap =
            Schema.getGlobalDescribe().get(objectName).getDescribe().fields.getMap();

        for (String field : fields) {
            if (!fieldMap.containsKey(field)) {
                throw new IllegalArgumentException('Invalid field: ' + field);
            }
        }

        // Build query; escape user input with String.escapeSingleQuotes
        // (bind variables remain the safer option where the structure allows)
        String query = 'SELECT ' + String.join(fields, ',') +
                       ' FROM ' + objectName;

        if (String.isNotBlank(whereClause)) {
            query += ' WHERE ' + String.escapeSingleQuotes(whereClause);
        }

        return Database.query(query);
    }
}
```

**22. How do you implement REST API integration in Apex?**

REST API integration using HTTP callouts:

```apex
public class ExternalAPIService {
    @future(callout=true)
    public static void makeRestCallout(String endpoint, String method, String body) {
        Http http = new Http();
        HttpRequest request = new HttpRequest();

        request.setEndpoint(endpoint);
        request.setMethod(method);
        request.setHeader('Content-Type', 'application/json');
        request.setHeader('Authorization', 'Bearer ' + getAuthToken());

        if (String.isNotBlank(body)) {
            request.setBody(body);
        }

        try {
            HttpResponse response = http.send(request);

            if (response.getStatusCode() == 200) {
                processResponse(response.getBody());
            } else {
                System.debug('Error: ' + response.getStatusCode() + ' ' + response.getStatus());
            }
        } catch (Exception e) {
            System.debug('Callout failed: ' + e.getMessage());
        }
    }

    private static String getAuthToken() {
        // Implement OAuth or API key logic
        return 'your_auth_token';
    }

    private static void processResponse(String responseBody) {
        // Parse and process the response
        Map<String, Object> responseMap = (Map<String, Object>) JSON.deserializeUntyped(responseBody);
        // Process the response data
    }
}
```

**23. How do you handle governor limits in complex applications?**

Governor limit management strategies:

```apex
public class LimitManager {
    public static void checkLimits() {
        System.debug('SOQL Queries used: ' + Limits.getQueries() + '/' + Limits.getLimitQueries());
        System.debug('DML Statements used: ' + Limits.getDMLStatements() + '/' + Limits.getLimitDMLStatements());
        System.debug('Heap Size used: ' + Limits.getHeapSize() + '/' + Limits.getLimitHeapSize());

        // Warn if approaching limits
        if (Limits.getQueries() > 80) {
            System.debug('WARNING: Approaching SOQL query limit');
        }
    }

    public static void processInBatches(List<SObject> records, Integer batchSize) {
        List<SObject> batch = new List<SObject>();

        for (SObject record : records) {
            batch.add(record);

            if (batch.size() == batchSize) {
                processBatch(batch);
                batch.clear();
            }
        }

        // Process remaining records
        if (!batch.isEmpty()) {
            processBatch(batch);
        }
    }

    private static void processBatch(List<SObject> batch) {
        // Process batch while monitoring limits
        checkLimits();
        // Batch processing logic
    }
}
```

**24. How do you implement custom metadata types in your solutions?**

Custom metadata types for configuration:

```apex
public class ConfigurationService {
    private static Map<String, Integration_Setting__mdt> settingsCache;

    public static Integration_Setting__mdt getSetting(String settingName) {
        if (settingsCache == null) {
            loadSettings();
        }

        return settingsCache.get(settingName);
    }

    private static void loadSettings() {
        settingsCache = new Map<String, Integration_Setting__mdt>();

        for (Integration_Setting__mdt setting : [
            SELECT DeveloperName, Endpoint__c, Timeout__c, Retry_Count__c
            FROM Integration_Setting__mdt
        ]) {
            settingsCache.put(setting.DeveloperName, setting);
        }
    }

    public static void makeConfigurableCallout(String settingName, String payload) {
        Integration_Setting__mdt setting = getSetting(settingName);

        if (setting != null) {
            Http http = new Http();
            HttpRequest req = new HttpRequest();
            req.setEndpoint(setting.Endpoint__c);
            req.setMethod('POST');
            req.setTimeout(Integer.valueOf(setting.Timeout__c));
            req.setBody(payload);

            // Implement retry logic based on setting.Retry_Count__c
        }
    }
}
```

25. How do you implement platform events for event-driven architecture?

Platform events for decoupled communication:

// Publisher

public class OrderEventPublisher {

    public static void publishOrderEvent(Id orderId, String status) {

        Order_Status_Event__e event = new Order_Status_Event__e();

        event.Order_Id__c = orderId;

        event.Status__c = status;

        event.Timestamp__c = System.now();

        

        Database.SaveResult result = EventBus.publish(event);

        

        if (!result.isSuccess()) {

            System.debug('Error publishing event: ' + result.getErrors());

        }

    }

}

// Subscriber (Trigger on Platform Event)

trigger OrderStatusEventTrigger on Order_Status_Event__e (after insert) {

    List<Task> tasksToCreate = new List<Task>();

    

    for (Order_Status_Event__e event : Trigger.new) {

        if (event.Status__c == 'Shipped') {

            Task followUpTask = new Task();

            followUpTask.Subject = 'Follow up on shipped order';

            followUpTask.WhatId = event.Order_Id__c;

            followUpTask.ActivityDate = Date.today().addDays(3);

            tasksToCreate.add(followUpTask);

        }

    }

    

    if (!tasksToCreate.isEmpty()) {

        insert tasksToCreate;

    }

}
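Platform event triggers fire asynchronously, so a unit test must force delivery before asserting on the subscriber's work. A minimal test sketch, assuming the `Order_Status_Event__e` event and trigger above are deployed (passing a null order Id only to keep the example self-contained):

```apex
@isTest
private class OrderStatusEventTriggerTest {
    @isTest
    static void testShippedEventCreatesTask() {
        Test.startTest();
        OrderEventPublisher.publishOrderEvent(null, 'Shipped');
        // Force delivery of published platform events to subscribers
        Test.getEventBus().deliver();
        Test.stopTest();

        // The subscriber trigger should have created one follow-up task
        System.assertEquals(1, [SELECT COUNT() FROM Task WHERE Subject = 'Follow up on shipped order']);
    }
}
```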

26. How do you implement Lightning Web Component (LWC) integration with Apex?

LWC-Apex integration patterns:

// Apex Controller for LWC

public with sharing class AccountController {

    @AuraEnabled(cacheable=true)

    public static List<Account> getAccounts(String searchTerm) {

        String searchKey = '%' + searchTerm + '%';

        return [

            SELECT Id, Name, Industry, AnnualRevenue 

            FROM Account 

            WHERE Name LIKE :searchKey 

            WITH SECURITY_ENFORCED

            LIMIT 50

        ];

    }

    

    @AuraEnabled

    public static void updateAccount(Account account) {

        try {

            update account;

        } catch (DMLException e) {

            throw new AuraHandledException(e.getMessage());

        }

    }

    

    @AuraEnabled

    public static String createAccountWithContacts(String accountData, String contactsData) {

        try {

            Account acc = (Account) JSON.deserialize(accountData, Account.class);

            insert acc;

            

            List<Contact> contacts = (List<Contact>) JSON.deserialize(contactsData, List<Contact>.class);

            for (Contact con : contacts) {

                con.AccountId = acc.Id;

            }

            insert contacts;

            

            return acc.Id;

        } catch (Exception e) {

            throw new AuraHandledException('Error creating account: ' + e.getMessage());

        }

    }

}

27. How do you implement data factory patterns for test data creation?

Test data factory for maintainable tests:

@isTest

public class TestDataFactory {

    public static Account createAccount(String name, String industry) {

        return new Account(

            Name = name,

            Industry = industry,

            BillingCity = 'San Francisco',

            BillingState = 'CA'

        );

    }

    

    public static List<Account> createAccounts(Integer count) {

        List<Account> accounts = new List<Account>();

        

        for (Integer i = 0; i < count; i++) {

            accounts.add(createAccount('Test Account ' + i, 'Technology'));

        }

        

        return accounts;

    }

    

    public static Contact createContact(Id accountId, String firstName, String lastName) {

        return new Contact(

            AccountId = accountId,

            FirstName = firstName,

            LastName = lastName,

            Email = firstName.toLowerCase() + '.' + lastName.toLowerCase() + '@test.com'

        );

    }

    

    public static User createTestUser(String profileName, String username) {

        Profile profile = [SELECT Id FROM Profile WHERE Name = :profileName LIMIT 1];

        

        return new User(

            FirstName = 'Test',

            LastName = 'User',

            Email = username + '@test.com',

            Username = username + '@test.com.dev',

            Alias = 'tuser',

            ProfileId = profile.Id,

            TimeZoneSidKey = 'America/Los_Angeles',

            LocaleSidKey = 'en_US',

            EmailEncodingKey = 'UTF-8',

            LanguageLocaleKey = 'en_US'

        );

    }

}
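A strong candidate should also show how the factory is consumed: build records in memory, then insert once in bulk. A hedged usage sketch (the test class name is illustrative):

```apex
@isTest
private class AccountBulkSetupTest {
    @TestSetup
    static void setupData() {
        // Create 200 accounts via the factory and insert them in a single DML statement
        List<Account> accounts = TestDataFactory.createAccounts(200);
        insert accounts;

        List<Contact> contacts = new List<Contact>();
        for (Account acc : accounts) {
            contacts.add(TestDataFactory.createContact(acc.Id, 'Jane', 'Doe'));
        }
        insert contacts;
    }

    @isTest
    static void testBulkSetup() {
        System.assertEquals(200, [SELECT COUNT() FROM Account]);
    }
}
```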

Expert Level Questions (28-40)

28. How do you implement complex domain-driven design patterns in Salesforce?

Domain-driven design implementation:

// Domain Layer - Business Logic

public virtual class OpportunityDomain {

    protected List<Opportunity> opportunities;

    

    public OpportunityDomain(List<Opportunity> opportunities) {

        this.opportunities = opportunities;

    }

    

    public virtual void validateBusinessRules() {

        for (Opportunity opp : opportunities) {

            validateCloseDate(opp);

            validateAmount(opp);

            validateStageProgression(opp);

        }

    }

    

    protected virtual void validateCloseDate(Opportunity opp) {

        if (opp.CloseDate < Date.today()) {

            opp.addError('Close date cannot be in the past');

        }

    }

    

    protected virtual void validateAmount(Opportunity opp) {

        if (opp.Amount <= 0) {

            opp.addError('Amount must be greater than zero');

        }

    }

    

    protected virtual void validateStageProgression(Opportunity opp) {

        // Complex stage progression logic. Trigger.oldMap values are SObjects and
        // must be cast; passing the old map in explicitly would also decouple this
        // domain class from trigger context.

        if (Trigger.isUpdate) {

            Opportunity oldOpp = (Opportunity) Trigger.oldMap.get(opp.Id);

            if (!isValidStageProgression(oldOpp.StageName, opp.StageName)) {

                opp.addError('Invalid stage progression');

            }

        }

    }

    

    private Boolean isValidStageProgression(String oldStage, String newStage) {

        // Stage progression business rules

        Map<String, Set<String>> validProgressions = new Map<String, Set<String>>{

            'Prospecting' => new Set<String>{'Qualification', 'Closed Lost'},

            'Qualification' => new Set<String>{'Needs Analysis', 'Closed Lost'},

            'Needs Analysis' => new Set<String>{'Value Proposition', 'Closed Lost'},

            'Value Proposition' => new Set<String>{'Id. Decision Makers', 'Closed Lost'},

            'Id. Decision Makers' => new Set<String>{'Perception Analysis', 'Closed Lost'},

            'Perception Analysis' => new Set<String>{'Proposal/Price Quote', 'Closed Lost'},

            'Proposal/Price Quote' => new Set<String>{'Negotiation/Review', 'Closed Lost'},

            'Negotiation/Review' => new Set<String>{'Closed Won', 'Closed Lost'}

        };

        

        return validProgressions.get(oldStage)?.contains(newStage) ?? false;

    }

}

// Service Layer - Application Logic

public class OpportunityService {

    public static void processOpportunities(List<Opportunity> opportunities) {

        OpportunityDomain domain = new OpportunityDomain(opportunities);

        domain.validateBusinessRules();

        

        // Additional service layer operations

        updateRelatedRecords(opportunities);

        sendNotifications(opportunities);

    }

    

    private static void updateRelatedRecords(List<Opportunity> opportunities) {

        // Update related accounts, contacts, etc.

    }

    

    private static void sendNotifications(List<Opportunity> opportunities) {

        // Send email notifications, platform events, etc.

    }

}

29. How do you implement enterprise-grade error handling and logging frameworks?

Comprehensive error handling framework:

public class Logger {

    private static List<Log_Entry__c> logEntries = new List<Log_Entry__c>();

    

    public enum LogLevel { DEBUG, INFO, WARN, ERROR, FATAL }

    

    public static void log(LogLevel level, String className, String methodName, String message, Exception ex) {

        Log_Entry__c entry = new Log_Entry__c();

        entry.Level__c = level.name();

        entry.Class_Name__c = className;

        entry.Method_Name__c = methodName;

        entry.Message__c = message;

        entry.Stack_Trace__c = ex?.getStackTraceString();

        entry.User__c = UserInfo.getUserId();

        entry.Timestamp__c = System.now();

        

        logEntries.add(entry);

        

        // Immediate insertion for errors and fatal logs

        if (level == LogLevel.ERROR || level == LogLevel.FATAL) {

            flushLogs();

        }

    }

    

    public static void flushLogs() {

        if (!logEntries.isEmpty()) {

            try {

                insert logEntries;

                logEntries.clear();

            } catch (DMLException e) {

                // Fallback to System.debug if database insert fails

                System.debug('Failed to insert log entries: ' + e.getMessage());

            }

        }

    }

    

    // Automatic log flushing on transaction completion

    public static void handleTransactionEnd() {

        flushLogs();

    }

}

// Error Handler Utility

public class ErrorHandler {

    public static void handleException(Exception ex, String context) {

        Logger.log(Logger.LogLevel.ERROR, 

                  ErrorHandler.class.getName(), 

                  'handleException', 

                  'Error in ' + context + ': ' + ex.getMessage(), 

                  ex);

        

        // Send critical error notifications. Note: governor-limit LimitExceptions
        // cannot actually be caught in Apex, so in practice gate this on severity
        // or on specific catchable exception types instead.

        if (ex instanceof System.LimitException) {

            sendCriticalErrorNotification(ex, context);

        }

    }

    

    private static void sendCriticalErrorNotification(Exception ex, String context) {

        // Send email to system administrators

        // Create platform event for monitoring systems

        // Log to external monitoring tools

    }

    

    public static void processWithErrorHandling(String context, ProcessingDelegate processor) {

        try {

            processor.process();

        } catch (Exception ex) {

            handleException(ex, context);

            throw ex; // Re-throw if needed

        }

    }

}

// Delegate interface for error handling

public interface ProcessingDelegate {

    void process();

}
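Since Apex has no lambdas, callers hand work to `processWithErrorHandling` via a small class implementing the delegate. A usage sketch (class names here are illustrative, not part of the framework above):

```apex
public class InvoicePostingJob {
    // Inner class implementing the delegate so the shared error handler wraps the work
    private class PostInvoicesWork implements ProcessingDelegate {
        public void process() {
            // Business logic that may throw, e.g. DML on invoice records
        }
    }

    public void run() {
        ErrorHandler.processWithErrorHandling('InvoicePostingJob.run', new PostInvoicesWork());
    }
}
```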

30. How do you implement sophisticated caching strategies in Apex?

Multi-level caching implementation:

public class CacheManager {

    // Fully qualified partition names ('local' is the default namespace).
    // Note that Platform Cache keys themselves must be alphanumeric.

    private static final String ORG_PARTITION = 'local.OrgData';

    private static final String SESSION_PARTITION = 'local.SessionData';

    

    // Static map for transaction-level caching

    private static Map<String, Object> transactionCache = new Map<String, Object>();

    

    // 'transaction' is a reserved word in Apex, so the enum value is TXN

    public enum CacheLevel { TXN, SESSION, ORG }

    

    public static Object get(String key, CacheLevel level) {

        switch on level {

            when TXN {

                return transactionCache.get(key);

            }

            when SESSION {

                return Cache.Session.get(SESSION_PARTITION + '.' + key);

            }

            when ORG {

                return Cache.Org.get(ORG_PARTITION + '.' + key);

            }

        }

        return null;

    }

    

    public static void put(String key, Object value, CacheLevel level, Integer ttlSeconds) {

        switch on level {

            when TXN {

                transactionCache.put(key, value);

            }

            when SESSION {

                Cache.Session.put(SESSION_PARTITION + '.' + key, value, ttlSeconds);

            }

            when ORG {

                Cache.Org.put(ORG_PARTITION + '.' + key, value, ttlSeconds);

            }

        }

    }

    

    public static Boolean contains(String key, CacheLevel level) {

        switch on level {

            when TXN {

                return transactionCache.containsKey(key);

            }

            when SESSION {

                return Cache.Session.contains(SESSION_PARTITION + '.' + key);

            }

            when ORG {

                return Cache.Org.contains(ORG_PARTITION + '.' + key);

            }

        }

        return false;

    }

}

// Cached data service example

public class AccountCacheService {

    // Platform Cache keys must be alphanumeric, so no underscore separator

    private static final String ACCOUNT_CACHE_KEY = 'AccountData';

    private static final Integer CACHE_TTL = 3600; // 1 hour

    

    public static Account getCachedAccount(Id accountId) {

        String cacheKey = ACCOUNT_CACHE_KEY + accountId;

        

        // Try transaction cache first

        Account account = (Account) CacheManager.get(cacheKey, CacheManager.CacheLevel.TXN);

        if (account != null) {

            return account;

        }

        

        // Try session cache

        account = (Account) CacheManager.get(cacheKey, CacheManager.CacheLevel.SESSION);

        if (account != null) {

            // Store in transaction cache for faster access

            CacheManager.put(cacheKey, account, CacheManager.CacheLevel.TXN, CACHE_TTL);

            return account;

        }

        

        // Query database and cache result

        account = [SELECT Id, Name, Industry, AnnualRevenue FROM Account WHERE Id = :accountId LIMIT 1];

        

        CacheManager.put(cacheKey, account, CacheManager.CacheLevel.TXN, CACHE_TTL);

        CacheManager.put(cacheKey, account, CacheManager.CacheLevel.SESSION, CACHE_TTL);

        

        return account;

    }

}

31. How do you implement complex data migration and synchronization patterns?

Enterprise data migration framework:

public class DataMigrationFramework {

    public interface MigrationStep {

        void execute(MigrationContext context);

        void rollback(MigrationContext context);

        String getStepName();

    }

    

    public class MigrationContext {

        public Map<String, Object> parameters;

        public List<String> errors;

        public Integer batchSize;

        public Boolean dryRun;

        

        public MigrationContext() {

            this.parameters = new Map<String, Object>();

            this.errors = new List<String>();

            this.batchSize = 200;

            this.dryRun = false;

        }

    }

    

    public class MigrationPipeline {

        private List<MigrationStep> steps;

        private MigrationContext context;

        

        public MigrationPipeline(MigrationContext context) {

            this.steps = new List<MigrationStep>();

            this.context = context;

        }

        

        public MigrationPipeline addStep(MigrationStep step) {

            this.steps.add(step);

            return this;

        }

        

        public MigrationResult execute() {

            MigrationResult result = new MigrationResult();

            List<MigrationStep> executedSteps = new List<MigrationStep>();

            

            try {

                for (MigrationStep step : steps) {

                    Logger.log(Logger.LogLevel.INFO, 'MigrationPipeline', 'execute', 

                              'Executing step: ' + step.getStepName(), null);

                    

                    step.execute(context);

                    executedSteps.add(step);

                    

                    result.completedSteps.add(step.getStepName());

                }

                

                result.success = true;

            } catch (Exception ex) {

                Logger.log(Logger.LogLevel.ERROR, 'MigrationPipeline', 'execute', 

                          'Migration failed', ex);

                

                result.success = false;

                result.errorMessage = ex.getMessage();

                

                // Rollback executed steps in reverse order

                rollbackSteps(executedSteps);

            }

            

            return result;

        }

        

        private void rollbackSteps(List<MigrationStep> executedSteps) {

            for (Integer i = executedSteps.size() - 1; i >= 0; i--) {

                try {

                    executedSteps[i].rollback(context);

                } catch (Exception rollbackEx) {

                    Logger.log(Logger.LogLevel.ERROR, 'MigrationPipeline', 'rollbackSteps', 

                              'Rollback failed for step: ' + executedSteps[i].getStepName(), rollbackEx);

                }

            }

        }

    }

    

    public class MigrationResult {

        public Boolean success;

        public String errorMessage;

        public List<String> completedSteps;

        

        public MigrationResult() {

            this.completedSteps = new List<String>();

        }

    }

}

// Example migration step implementation

public class AccountDataMigrationStep implements DataMigrationFramework.MigrationStep {

    public void execute(DataMigrationFramework.MigrationContext context) {

        List<Legacy_Account__c> legacyAccounts = [SELECT Name, Industry__c, Revenue__c FROM Legacy_Account__c];

        List<Account> accountsToInsert = new List<Account>();

        

        for (Legacy_Account__c legacy : legacyAccounts) {

            Account newAccount = new Account();

            newAccount.Name = legacy.Name;

            newAccount.Industry = legacy.Industry__c;

            newAccount.AnnualRevenue = legacy.Revenue__c;

            accountsToInsert.add(newAccount);

        }

        

        if (!context.dryRun) {

            Database.SaveResult[] results = Database.insert(accountsToInsert, false);

            

            for (Database.SaveResult result : results) {

                if (!result.isSuccess()) {

                    context.errors.add('Failed to migrate account: ' + result.getErrors());

                }

            }

        }

    }

    

    public void rollback(DataMigrationFramework.MigrationContext context) {

        // Simplified rollback. In production, tag migrated records (for example with

        // an external ID) and delete only those; deleting every Account created

        // today would also remove unrelated records.

        delete [SELECT Id FROM Account WHERE CreatedDate = TODAY];

    }

    

    public String getStepName() {

        return 'Account Data Migration';

    }

}
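The pipeline is built for fluent composition: construct a context, chain steps, then execute. A hedged driver sketch using the step above (the dry-run flag lets you validate the mapping without committing DML):

```apex
DataMigrationFramework.MigrationContext context = new DataMigrationFramework.MigrationContext();
context.dryRun = true; // validate the mapping without committing any DML

DataMigrationFramework.MigrationResult result =
    new DataMigrationFramework.MigrationPipeline(context)
        .addStep(new AccountDataMigrationStep())
        .execute();

System.debug('Success: ' + result.success + ', completed steps: ' + result.completedSteps);
```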

32. How do you implement advanced security patterns including encryption and tokenization?

Advanced security implementation:

public class SecurityManager {

    // Never hardcode encryption keys; load them from a protected custom setting,

    // custom metadata, or an external key management service

    

    // Field-level encryption

    public static String encryptSensitiveData(String plainText) {

        if (String.isBlank(plainText)) {

            return plainText;

        }

        

        try {

            Blob key = Crypto.generateAesKey(256);

            Blob data = Blob.valueOf(plainText);

            Blob encryptedData = Crypto.encryptWithManagedIV('AES256', key, data);

            

            // Store the key securely (this is a simplified example)

            storeEncryptionKey(key);

            

            return EncodingUtil.base64Encode(encryptedData);

        } catch (Exception ex) {

            Logger.log(Logger.LogLevel.ERROR, 'SecurityManager', 'encryptSensitiveData', 

                      'Encryption failed', ex);

            throw ex;

        }

    }

    

    public static String decryptSensitiveData(String encryptedText) {

        if (String.isBlank(encryptedText)) {

            return encryptedText;

        }

        

        try {

            Blob key = retrieveEncryptionKey();

            Blob encryptedData = EncodingUtil.base64Decode(encryptedText);

            Blob decryptedData = Crypto.decryptWithManagedIV('AES256', key, encryptedData);

            

            return decryptedData.toString();

        } catch (Exception ex) {

            Logger.log(Logger.LogLevel.ERROR, 'SecurityManager', 'decryptSensitiveData', 

                      'Decryption failed', ex);

            throw ex;

        }

    }

    

    // Data tokenization for sensitive data

    public static String tokenizeSensitiveData(String sensitiveData, String tokenType) {

        String token = generateSecureToken();

        

        // Store mapping in protected custom setting

        Token_Mapping__c mapping = new Token_Mapping__c();

        mapping.Token__c = token;

        mapping.Token_Type__c = tokenType;

        mapping.Original_Value__c = encryptSensitiveData(sensitiveData);

        

        insert mapping;

        

        return token;

    }

    

    public static String detokenizeData(String token) {

        Token_Mapping__c mapping = [

            SELECT Original_Value__c 

            FROM Token_Mapping__c 

            WHERE Token__c = :token 

            LIMIT 1

        ];

        

        return decryptSensitiveData(mapping.Original_Value__c);

    }

    

    private static String generateSecureToken() {

        // Generate cryptographically secure token

        Blob randomBytes = Crypto.generateAesKey(128);

        return EncodingUtil.base64Encode(randomBytes);

    }

    

    private static void storeEncryptionKey(Blob key) {

        // Store encryption key securely in protected custom setting or external system

        // This is a simplified implementation

    }

    

    private static Blob retrieveEncryptionKey() {

        // Retrieve the same key that was used for encryption from secure storage.

        // Generating a fresh key here is placeholder code only; decrypting with it

        // would always fail in practice.

        return Crypto.generateAesKey(256);

    }

    

    // Data masking for non-production environments

    public static String maskSensitiveData(String originalValue, String maskingPattern) {

        if (String.isBlank(originalValue)) {

            return originalValue;

        }

        

        switch on maskingPattern {

            when 'EMAIL' {

                return maskEmail(originalValue);

            }

            when 'PHONE' {

                return maskPhoneNumber(originalValue);

            }

            when 'SSN' {

                return maskSSN(originalValue);

            }

            when else {

                return '***MASKED***';

            }

        }

    }

    

    private static String maskEmail(String email) {

        if (!email.contains('@')) {

            return email;

        }

        

        String[] parts = email.split('@');

        String localPart = parts[0];

        String domain = parts[1];

        

        String maskedLocal = localPart.length() > 2 ? 

            localPart.substring(0, 2) + '***' : 

            '***';

            

        return maskedLocal + '@' + domain;

    }

    

    private static String maskPhoneNumber(String phone) {

        if (phone.length() < 4) {

            return '***';

        }

        

        return '***-***-' + phone.substring(phone.length() - 4);

    }

    

    private static String maskSSN(String ssn) {

        if (ssn.length() < 4) {

            return '***';

        }

        

        return '***-**-' + ssn.substring(ssn.length() - 4);

    }

}

33. How do you implement enterprise integration patterns with external systems?

Enterprise integration patterns:

public class IntegrationOrchestrator {

    public interface MessageProcessor {

        void processMessage(IntegrationMessage message);

    }

    

    public class IntegrationMessage {

        public String messageId;

        public String messageType;

        public String source;

        public String destination;

        public Map<String, Object> payload;

        public DateTime timestamp;

        public Integer retryCount;

        

        public IntegrationMessage(String messageType, String source, String destination, Map<String, Object> payload) {

            this.messageId = generateMessageId();

            this.messageType = messageType;

            this.source = source;

            this.destination = destination;

            this.payload = payload;

            this.timestamp = System.now();

            this.retryCount = 0;

        }

        

        private String generateMessageId() {

            return 'MSG_' + System.currentTimeMillis() + '_' + Math.round(Math.random() * 1000);

        }

    }

    

    // Message Router Pattern

    public class MessageRouter {

        private Map<String, MessageProcessor> processors;

        

        public MessageRouter() {

            this.processors = new Map<String, MessageProcessor>();

        }

        

        public void registerProcessor(String messageType, MessageProcessor processor) {

            processors.put(messageType, processor);

        }

        

        public void routeMessage(IntegrationMessage message) {

            MessageProcessor processor = processors.get(message.messageType);

            

            if (processor != null) {

                try {

                    processor.processMessage(message);

                } catch (Exception ex) {

                    handleProcessingError(message, ex);

                }

            } else {

                Logger.log(Logger.LogLevel.WARN, 'MessageRouter', 'routeMessage', 

                          'No processor found for message type: ' + message.messageType, null);

            }

        }

        

        private void handleProcessingError(IntegrationMessage message, Exception ex) {

            message.retryCount++;

            

            if (message.retryCount < 3) {

                // Retry logic

                System.enqueueJob(new RetryProcessor(message));

            } else {

                // Send to dead letter queue; inner classes must qualify the outer class's statics

                IntegrationOrchestrator.sendToDeadLetterQueue(message, ex);

            }

        }

    }

    

    // Retry Processor for failed messages

    public class RetryProcessor implements Queueable {

        private IntegrationMessage message;

        

        public RetryProcessor(IntegrationMessage message) {

            this.message = message;

        }

        

        public void execute(QueueableContext context) {

            // Exponential backoff: Math.pow returns a Double, so convert via intValue().

            // In a real implementation, re-enqueue with a delay instead of sleeping,

            // e.g. System.enqueueJob(job, delayMinutes) with a 0-10 minute delay.

            Integer delayMinutes = Math.min(Math.pow(2, message.retryCount), 10).intValue();

            

            // Note: a freshly constructed router has no processors registered;

            // in practice, obtain a fully configured router from a shared factory.

            MessageRouter router = new MessageRouter();

            router.routeMessage(message);

        }

    }

    

    private static void sendToDeadLetterQueue(IntegrationMessage message, Exception ex) {

        Dead_Letter_Queue__c dlq = new Dead_Letter_Queue__c();

        dlq.Message_Id__c = message.messageId;

        dlq.Message_Type__c = message.messageType;

        dlq.Payload__c = JSON.serialize(message.payload);

        dlq.Error_Message__c = ex.getMessage();

        dlq.Retry_Count__c = message.retryCount;

        

        insert dlq;

    }

}

// Specific message processor implementation

public class AccountSyncProcessor implements IntegrationOrchestrator.MessageProcessor {

    public void processMessage(IntegrationOrchestrator.IntegrationMessage message) {

        Map<String, Object> payload = message.payload;

        

        switch on message.messageType {

            when 'ACCOUNT_CREATE' {

                createAccount(payload);

            }

            when 'ACCOUNT_UPDATE' {

                updateAccount(payload);

            }

            when 'ACCOUNT_DELETE' {

                deleteAccount(payload);

            }

        }

    }

    

    private void createAccount(Map<String, Object> payload) {

        Account newAccount = new Account();

        newAccount.Name = (String) payload.get('name');

        newAccount.Industry = (String) payload.get('industry');

        newAccount.External_Id__c = (String) payload.get('externalId');

        

        insert newAccount;

    }

    

    private void updateAccount(Map<String, Object> payload) {

        String externalId = (String) payload.get('externalId');

        Account existingAccount = [SELECT Id FROM Account WHERE External_Id__c = :externalId LIMIT 1];

        

        existingAccount.Name = (String) payload.get('name');

        existingAccount.Industry = (String) payload.get('industry');

        

        update existingAccount;

    }

    

    private void deleteAccount(Map<String, Object> payload) {

        String externalId = (String) payload.get('externalId');

        Account accountToDelete = [SELECT Id FROM Account WHERE External_Id__c = :externalId LIMIT 1];

        

        delete accountToDelete;

    }

}
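Wiring it together, the router is configured once with processors and then handed inbound messages. A usage sketch; the payload field values here are illustrative:

```apex
IntegrationOrchestrator.MessageRouter router = new IntegrationOrchestrator.MessageRouter();
router.registerProcessor('ACCOUNT_CREATE', new AccountSyncProcessor());

IntegrationOrchestrator.IntegrationMessage message =
    new IntegrationOrchestrator.IntegrationMessage(
        'ACCOUNT_CREATE', 'ERP', 'Salesforce',
        new Map<String, Object>{
            'name' => 'Acme Corp',
            'industry' => 'Manufacturing',
            'externalId' => 'ERP-1001'
        });

router.routeMessage(message);
```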

34. How do you implement sophisticated testing strategies including mocking and dependency injection?

Advanced testing framework with mocking:

// Dependency injection framework

public interface IAccountService {

    List<Account> getAccountsByIndustry(String industry);

    void updateAccountRating(Id accountId, String rating);

}

public class AccountService implements IAccountService {

    public List<Account> getAccountsByIndustry(String industry) {

        return [SELECT Id, Name, Industry FROM Account WHERE Industry = :industry];

    }

    

    public void updateAccountRating(Id accountId, String rating) {

        Account acc = new Account(Id = accountId, Rating = rating);

        update acc;

    }

}

// Service locator pattern for dependency injection

public class ServiceLocator {

    private static Map<Type, Object> services = new Map<Type, Object>();

    

    public static void registerService(Type serviceType, Object implementation) {

        services.put(serviceType, implementation);

    }

    

    public static Object getService(Type serviceType) {

        Object service = services.get(serviceType);

        

        if (service == null) {

            // Default implementations

            if (serviceType == IAccountService.class) {

                service = new AccountService();

            }

            

            services.put(serviceType, service);

        }

        

        return service;

    }

    

    public static void clearServices() {

        services.clear();

    }

}

// Business logic using dependency injection

public class OpportunityProcessor {

    private IAccountService accountService;

    

    public OpportunityProcessor() {

        this.accountService = (IAccountService) ServiceLocator.getService(IAccountService.class);

    }

    

    public void processHighValueOpportunities(List<Opportunity> opportunities) {

        for (Opportunity opp : opportunities) {

            if (opp.Amount > 100000) {

                accountService.updateAccountRating(opp.AccountId, 'Hot');

            }

        }

    }

}

// Mock implementation for testing

@isTest

public class MockAccountService implements IAccountService {

    public List<Account> mockAccounts = new List<Account>();

    public Map<Id, String> updatedRatings = new Map<Id, String>();

    

    public List<Account> getAccountsByIndustry(String industry) {

        return mockAccounts;

    }

    

    public void updateAccountRating(Id accountId, String rating) {

        updatedRatings.put(accountId, rating);

    }

}

// Test class using mocks

@isTest

public class OpportunityProcessorTest {

    @isTest

    static void testProcessHighValueOpportunities() {

        // Setup mock

        MockAccountService mockService = new MockAccountService();

        ServiceLocator.registerService(IAccountService.class, mockService);

        

        // Test data

        Account testAccount = new Account(Id = fflib_IDGenerator.generate(Account.SObjectType));

        Opportunity testOpp = new Opportunity(

            AccountId = testAccount.Id,

            Amount = 150000,

            Name = 'Test Opp',

            CloseDate = Date.today().addDays(30),

            StageName = 'Prospecting'

        );

        

        Test.startTest();

        

        OpportunityProcessor processor = new OpportunityProcessor();

        processor.processHighValueOpportunities(new List<Opportunity>{testOpp});

        

        Test.stopTest();

        

        // Verify mock interactions

        System.assertEquals('Hot', mockService.updatedRatings.get(testAccount.Id));

        

        // Cleanup

        ServiceLocator.clearServices();

    }

}

// ID Generator utility for test data

public class fflib_IDGenerator {

    private static Integer s_num = 1;

    

    public static String generate(Schema.SObjectType sot) {

        String keyPrefix = sot.getDescribe().getKeyPrefix();

        String fakeId = keyPrefix + '0'.repeat(12 - String.valueOf(s_num).length()) + s_num++; // 3-char prefix + 12 chars = valid 15-character ID

        return fakeId;

    }

}

35. How do you implement performance monitoring and optimization in production Apex code?

Performance monitoring framework:

public class PerformanceMonitor {

    private static Map<String, PerformanceMetric> metrics = new Map<String, PerformanceMetric>();

    

    public class PerformanceMetric {

        public String operation;

        public Long startTime;

        public Long endTime;

        public Long duration;

        public Integer soqlCount;

        public Integer dmlCount;

        public Integer heapSize;

        

        public PerformanceMetric(String operation) {

            this.operation = operation;

            this.startTime = System.currentTimeMillis();

            this.soqlCount = Limits.getQueries();

            this.dmlCount = Limits.getDMLStatements();

            this.heapSize = Limits.getHeapSize();

        }

        

        public void stop() {

            this.endTime = System.currentTimeMillis();

            this.duration = this.endTime - this.startTime;

            this.soqlCount = Limits.getQueries() - this.soqlCount;

            this.dmlCount = Limits.getDMLStatements() - this.dmlCount;

            this.heapSize = Limits.getHeapSize() - this.heapSize;

        }

    }

    

    public static void startMonitoring(String operationName) {

        metrics.put(operationName, new PerformanceMetric(operationName));

    }

    

    public static void stopMonitoring(String operationName) {

        PerformanceMetric metric = metrics.get(operationName);

        if (metric != null) {

            metric.stop();

            logPerformanceMetric(metric);

        }

    }

    

    private static void logPerformanceMetric(PerformanceMetric metric) {

        Performance_Log__c log = new Performance_Log__c();

        log.Operation__c = metric.operation;

        log.Duration_Ms__c = metric.duration;

        log.SOQL_Count__c = metric.soqlCount;

        log.DML_Count__c = metric.dmlCount;

        log.Heap_Size__c = metric.heapSize;

        log.User__c = UserInfo.getUserId();

        log.Timestamp__c = System.now();

        

        // Async insert to avoid impacting performance

        System.enqueueJob(new PerformanceLogJob(log));

    }

    

    // Decorator pattern for method performance monitoring

    public static Object executeWithMonitoring(String operationName, MonitoredOperation operation) {

        startMonitoring(operationName);

        

        try {

            return operation.execute();

        } finally {

            stopMonitoring(operationName);

        }

    }

    

    public interface MonitoredOperation {

        Object execute();

    }

    

    // Async job for performance log insertion

    private class PerformanceLogJob implements Queueable {

        private Performance_Log__c logEntry;

        

        public PerformanceLogJob(Performance_Log__c logEntry) {

            this.logEntry = logEntry;

        }

        

        public void execute(QueueableContext context) {

            try {

                insert logEntry;

            } catch (Exception e) {

                // Log to system debug if insertion fails

                System.debug('Failed to insert performance log: ' + e.getMessage());

            }

        }

    }

}

// Usage example with performance monitoring

public class OptimizedAccountService {

    public List<Account> getAccountsWithContacts(String industry) {

        return (List<Account>) PerformanceMonitor.executeWithMonitoring(

            'GetAccountsWithContacts',

            new GetAccountsOperation(industry)

        );

    }

    

    private class GetAccountsOperation implements PerformanceMonitor.MonitoredOperation {

        private String industry;

        

        public GetAccountsOperation(String industry) {

            this.industry = industry;

        }

        

        public Object execute() {

            // Optimized query with selective fields and proper indexing

            return [

                SELECT Id, Name, Industry,

                       (SELECT Id, FirstName, LastName FROM Contacts LIMIT 5)

                FROM Account 

                WHERE Industry = :industry 

                AND IsActive__c = true

                ORDER BY Name

                LIMIT 1000

            ];

        }

    }

    

    // Bulk processing with governor limit management

    public void updateAccountRatingsInBulk(Map<Id, String> accountRatings) {

        PerformanceMonitor.startMonitoring('BulkAccountRatingUpdate');

        

        try {

            List<Account> accountsToUpdate = new List<Account>();

            

            // Process in chunks to avoid governor limits

            Integer batchSize = 200;

            List<Id> accountIds = new List<Id>(accountRatings.keySet());

            

            for (Integer i = 0; i < accountIds.size(); i += batchSize) {

                Integer endIndex = Math.min(i + batchSize, accountIds.size());

                List<Id> batchIds = accountIds.subList(i, endIndex);

                

                for (Id accountId : batchIds) {

                    Account acc = new Account(Id = accountId, Rating = accountRatings.get(accountId));

                    accountsToUpdate.add(acc);

                }

                

                // Check governor limits before proceeding

                if (Limits.getDMLStatements() > 140) {

                    // Process asynchronously if approaching limits

                    System.enqueueJob(new AsyncAccountUpdateJob(accountsToUpdate));

                    accountsToUpdate.clear();

                } else {

                    update accountsToUpdate;

                    accountsToUpdate.clear();

                }

            }

            

        } finally {

            PerformanceMonitor.stopMonitoring('BulkAccountRatingUpdate');

        }

    }

    

    private class AsyncAccountUpdateJob implements Queueable {

        private List<Account> accounts;

        

        public AsyncAccountUpdateJob(List<Account> accounts) {

            this.accounts = accounts.clone();

        }

        

        public void execute(QueueableContext context) {

            update accounts;

        }

    }

}

36. How do you implement advanced deployment strategies and continuous integration?

CI/CD implementation for Salesforce:

// Deployment configuration metadata

public class DeploymentConfig {

    public String sourceOrg;

    public String targetOrg;

    public List<String> includedMetadata;

    public List<String> excludedMetadata;

    public Boolean runTests;

    public Boolean rollbackOnFailure;

    public Map<String, String> environmentVariables;

    

    public static DeploymentConfig fromJSON(String jsonString) {

        return (DeploymentConfig) JSON.deserialize(jsonString, DeploymentConfig.class);

    }

}

// Deployment validation framework

public class DeploymentValidator {

    public class ValidationResult {

        public Boolean isValid;

        public List<String> errors;

        public List<String> warnings;

        

        public ValidationResult() {

            this.isValid = true;

            this.errors = new List<String>();

            this.warnings = new List<String>();

        }

    }

    

    public static ValidationResult validateDeployment(DeploymentConfig config) {

        ValidationResult result = new ValidationResult();

        

        // Validate test coverage

        validateTestCoverage(result);

        

        // Validate code quality

        validateCodeQuality(result);

        

        // Validate security compliance

        validateSecurityCompliance(result);

        

        // Validate data integrity

        validateDataIntegrity(result);

        

        return result;

    }

    

    private static void validateTestCoverage(ValidationResult result) {

        // ApexCodeCoverageAggregate is exposed through the Tooling API; in a real org this
        // query must be issued via the Tooling REST API rather than inline SOQL (shown inline for brevity)

        List<ApexCodeCoverageAggregate> coverage = [

            SELECT ApexClassOrTrigger.Name, NumLinesCovered, NumLinesUncovered

            FROM ApexCodeCoverageAggregate

            WHERE ApexClassOrTrigger.Name != null

        ];

        

        for (ApexCodeCoverageAggregate cov : coverage) {

            Decimal coveragePercentage = 0;

            if (cov.NumLinesCovered + cov.NumLinesUncovered > 0) {

                coveragePercentage = (cov.NumLinesCovered * 100.0) / 

                                   (cov.NumLinesCovered + cov.NumLinesUncovered);

            }

            

            if (coveragePercentage < 75) {

                result.errors.add('Insufficient test coverage for ' + 

                                cov.ApexClassOrTrigger.Name + ': ' + coveragePercentage + '%');

                result.isValid = false;

            } else if (coveragePercentage < 85) {

                result.warnings.add('Low test coverage for ' + 

                                  cov.ApexClassOrTrigger.Name + ': ' + coveragePercentage + '%');

            }

        }

    }

    

    private static void validateCodeQuality(ValidationResult result) {

        // Check for code quality issues

        validateNamingConventions(result);

        validateComplexity(result);

        validateDocumentation(result);

    }

    

    private static void validateNamingConventions(ValidationResult result) {

        // Validate class naming conventions

        List<ApexClass> classes = [SELECT Name FROM ApexClass WHERE NamespacePrefix = null];

        

        for (ApexClass cls : classes) {

            if (!isValidClassName(cls.Name)) {

                result.warnings.add('Class naming convention violation: ' + cls.Name);

            }

        }

    }

    

    private static Boolean isValidClassName(String className) {

        // Check for PascalCase naming convention

        return Pattern.matches('^[A-Z][a-zA-Z0-9]*$', className);

    }

    

    private static void validateComplexity(ValidationResult result) {

        // This would involve static code analysis

        // In a real implementation, you might integrate with external tools

    }

    

    private static void validateDocumentation(ValidationResult result) {

        // Check for proper class and method documentation

        // This would require parsing the source code

    }

    

    private static void validateSecurityCompliance(ValidationResult result) {

        // Check for security best practices

        validateSharingModel(result);

        validateInputValidation(result);

    }

    

    private static void validateSharingModel(ValidationResult result) {

        // Ensure classes have proper sharing declarations

        // This would require source code analysis

    }

    

    private static void validateInputValidation(ValidationResult result) {

        // Check for proper input validation in public methods

        // This would require source code analysis

    }

    

    private static void validateDataIntegrity(ValidationResult result) {

        // Validate data consistency across environments

        validateRequiredCustomSettings(result);

        validateCustomMetadata(result);

    }

    

    private static void validateRequiredCustomSettings(ValidationResult result) {

        // Check that required custom settings exist

        List<String> requiredSettings = new List<String>{

            'App_Configuration__c',

            'Integration_Settings__c'

        };

        

        for (String settingName : requiredSettings) {

            List<SObject> settings = Database.query(

                'SELECT Id FROM ' + settingName + ' WHERE SetupOwnerId = \'' + UserInfo.getOrganizationId() + '\''

            );

            

            if (settings.isEmpty()) {

                result.errors.add('Required custom setting not found: ' + settingName);

                result.isValid = false;

            }

        }

    }

    

    private static void validateCustomMetadata(ValidationResult result) {

        // Validate that required custom metadata types and records exist

        // Implementation would depend on specific metadata requirements

    }

}

// Automated deployment orchestrator

public class DeploymentOrchestrator {

    public static void executeDeployment(DeploymentConfig config) {

        try {

            // Pre-deployment validation

            DeploymentValidator.ValidationResult validation = 

                DeploymentValidator.validateDeployment(config);

            

            if (!validation.isValid) {

                throw new DeploymentException('Deployment validation failed: ' + 

                                            String.join(validation.errors, ', '));

            }

            

            // Execute deployment steps

            executePreDeploymentSteps(config);

            executeActualDeployment(config);

            executePostDeploymentSteps(config);

            

            // Verify deployment

            verifyDeployment(config);

            

        } catch (Exception e) {

            handleDeploymentFailure(config, e);

        }

    }

    

    private static void executePreDeploymentSteps(DeploymentConfig config) {

        // Backup current state

        // Notify stakeholders

        // Execute pre-deployment scripts

    }

    

    private static void executeActualDeployment(DeploymentConfig config) {

        // This would integrate with Salesforce Metadata API or Salesforce CLI

        // for actual deployment execution

    }

    

    private static void executePostDeploymentSteps(DeploymentConfig config) {

        // Execute post-deployment scripts

        // Update configuration data

        // Refresh caches

    }

    

    private static void verifyDeployment(DeploymentConfig config) {

        // Run smoke tests

        // Verify critical functionality

        // Check system health

    }

    

    private static void handleDeploymentFailure(DeploymentConfig config, Exception e) {

        if (config.rollbackOnFailure) {

            executeRollback(config);

        }

        

        // Notify stakeholders of failure

        sendDeploymentFailureNotification(config, e);

    }

    

    private static void executeRollback(DeploymentConfig config) {

        // Implement rollback logic

        // Restore previous state

    }

    

    private static void sendDeploymentFailureNotification(DeploymentConfig config, Exception e) {

        // Send email notifications

        // Create platform events for monitoring systems

    }

    

    public class DeploymentException extends Exception {}

}
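A minimal invocation of the orchestrator, assuming the JSON configuration is supplied by a CI pipeline (the org aliases and config values here are illustrative, not part of the framework):

```apex
// Hypothetical CI entry point: parse the pipeline-supplied JSON config and deploy.
// Keys map directly to the DeploymentConfig fields defined above.
String configJson = '{"sourceOrg":"dev","targetOrg":"uat","runTests":true,"rollbackOnFailure":true}';
DeploymentConfig config = DeploymentConfig.fromJSON(configJson);
DeploymentOrchestrator.executeDeployment(config);
```

If validation fails, `executeDeployment` surfaces the aggregated errors through `DeploymentException`; with `rollbackOnFailure` set, any later failure triggers the rollback path.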

37. How do you implement sophisticated data archiving and retention policies?

Data archiving framework:

public class DataArchivalFramework {

    public class ArchivalPolicy {

        public String objectName;

        public String retentionPeriod; // e.g., '7 YEAR', '90 DAY'

        public String archivalCriteria;

        public Boolean softDelete;

        public String archivalDestination; // 'BIG_OBJECTS', 'EXTERNAL_STORAGE'

        public List<String> relatedObjects;

    }

    

    public class ArchivalJob implements Database.Batchable<sObject>, Database.Stateful {

        private ArchivalPolicy policy;

        private Integer recordsArchived = 0;

        private List<String> errors = new List<String>();

        

        public ArchivalJob(ArchivalPolicy policy) {

            this.policy = policy;

        }

        

        public Database.QueryLocator start(Database.BatchableContext bc) {

            String query = buildArchivalQuery(policy);

            return Database.getQueryLocator(query);

        }

        

        public void execute(Database.BatchableContext bc, List<SObject> scope) {

            try {

                switch on policy.archivalDestination {

                    when 'BIG_OBJECTS' {

                        archiveToBigObjects(scope);

                    }

                    when 'EXTERNAL_STORAGE' {

                        archiveToExternalStorage(scope);

                    }

                }

                

                if (policy.softDelete) {

                    softDeleteRecords(scope);

                } else {

                    hardDeleteRecords(scope);

                }

                

                recordsArchived += scope.size();

                

            } catch (Exception e) {

                errors.add('Error processing batch: ' + e.getMessage());

                Logger.log(Logger.LogLevel.ERROR, 'ArchivalJob', 'execute', 

                          'Archival batch failed', e);

            }

        }

        

        public void finish(Database.BatchableContext bc) {

            // Create archival summary

            Archival_Summary__c summary = new Archival_Summary__c();

            summary.Object_Name__c = policy.objectName;

            summary.Records_Archived__c = recordsArchived;

            summary.Archival_Date__c = Date.today();

            summary.Policy_Used__c = JSON.serialize(policy);

            summary.Errors__c = String.join(errors, '\n');

            

            insert summary;

            

            // Send completion notification

            sendArchivalNotification(summary);

        }

        

        private String buildArchivalQuery(ArchivalPolicy policy) {

            String baseQuery = 'SELECT Id FROM ' + policy.objectName;

            String whereClause = buildRetentionWhereClause(policy);

            

            if (String.isNotBlank(policy.archivalCriteria)) {

                whereClause += ' AND (' + policy.archivalCriteria + ')';

            }

            

            return baseQuery + ' WHERE ' + whereClause;

        }

        

        private String buildRetentionWhereClause(ArchivalPolicy policy) {

            String[] periodParts = policy.retentionPeriod.split(' ');

            Integer amount = Integer.valueOf(periodParts[0]);

            String unit = periodParts[1].toUpperCase();

            

            String dateField = getDateFieldForObject(policy.objectName);

            

            // Compute the cutoff date, then emit a SOQL date literal (yyyy-MM-dd);
            // Date.format() is locale-dependent and produces invalid SOQL
            Date cutoff;
            switch on unit {
                when 'DAY' {
                    cutoff = Date.today().addDays(-amount);
                }
                when 'MONTH' {
                    cutoff = Date.today().addMonths(-amount);
                }
                when 'YEAR' {
                    cutoff = Date.today().addYears(-amount);
                }
                when else {
                    throw new ArchivalException('Unsupported retention period unit: ' + unit);
                }
            }
            return dateField + ' < ' + Datetime.newInstance(cutoff, Time.newInstance(0, 0, 0, 0)).format('yyyy-MM-dd');
        }

        

        private String getDateFieldForObject(String objectName) {

            // Default to CreatedDate, but could be configured per object

            return 'CreatedDate';

        }

        

        private void archiveToBigObjects(List<SObject> records) {

            // Create big object records for archival

            String bigObjectName = policy.objectName.replace('__c', '__b');

            

            List<SObject> bigObjectRecords = new List<SObject>();

            

            for (SObject record : records) {

                SObject bigObjectRecord = Schema.getGlobalDescribe()

                    .get(bigObjectName)

                    .newSObject();

                

                // Copy fields from original to big object

                copyFieldsToBigObject(record, bigObjectRecord);

                bigObjectRecords.add(bigObjectRecord);

            }

            

            // Insert into big object

            Database.insertImmediate(bigObjectRecords);

        }

        

        private void archiveToExternalStorage(List<SObject> records) {

            // Serialize and send to external storage system

            String serializedData = JSON.serialize(records);

            

            // Make callout to external system

            ExternalStorageService.archiveData(policy.objectName, serializedData);

        }

        

        private void copyFieldsToBigObject(SObject source, SObject target) {

            Map<String, Object> sourceFields = source.getPopulatedFieldsAsMap();

            

            for (String fieldName : sourceFields.keySet()) {

                try {

                    target.put(fieldName, sourceFields.get(fieldName));

                } catch (Exception e) {

                    // Field might not exist in big object, skip

                }

            }

            

            // Add archival metadata

            target.put('Archived_Date__c', System.now());

            target.put('Original_Id__c', source.Id);

        }

        

        private void softDeleteRecords(List<SObject> records) {

            for (SObject record : records) {

                record.put('IsDeleted__c', true);

                record.put('Archived_Date__c', System.now());

            }

            

            update records;

        }

        

        private void hardDeleteRecords(List<SObject> records) {

            delete records;

        }

        

        private void sendArchivalNotification(Archival_Summary__c summary) {

            // Send email notification about archival completion

            // Create platform event for monitoring systems

        }

    }

    

    // Archival scheduler

    public class ArchivalScheduler implements Schedulable {

        public void execute(SchedulableContext sc) {

            List<Archival_Policy__mdt> policies = [

                SELECT Object_Name__c, Retention_Period__c, Archival_Criteria__c,

                       Soft_Delete__c, Archival_Destination__c

                FROM Archival_Policy__mdt

                WHERE Is_Active__c = true

            ];

            

            for (Archival_Policy__mdt policyMdt : policies) {

                ArchivalPolicy policy = new ArchivalPolicy();

                policy.objectName = policyMdt.Object_Name__c;

                policy.retentionPeriod = policyMdt.Retention_Period__c;

                policy.archivalCriteria = policyMdt.Archival_Criteria__c;

                policy.softDelete = policyMdt.Soft_Delete__c;

                policy.archivalDestination = policyMdt.Archival_Destination__c;

                

                Database.executeBatch(new ArchivalJob(policy), 200);

            }

        }

    }

    

    public class ArchivalException extends Exception {}

}
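Wiring the framework into a recurring job is a one-liner. The job name and cron expression below are illustrative; this schedules the `ArchivalScheduler` to evaluate all active archival policies nightly:

```apex
// Run the archival scheduler every night at 2 AM (cron expression is illustrative)
System.schedule('Nightly Data Archival', '0 0 2 * * ?',
                new DataArchivalFramework.ArchivalScheduler());
```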

// External storage service

public class ExternalStorageService {

    @future(callout=true)

    public static void archiveData(String objectName, String data) {

        Http http = new Http();

        HttpRequest request = new HttpRequest();

        

        // Configure request to external storage system

        request.setEndpoint('https://archive-system.example.com/api/archive');

        request.setMethod('POST');

        request.setHeader('Content-Type', 'application/json');

        request.setHeader('Authorization', 'Bearer ' + getArchiveToken());

        

        Map<String, Object> payload = new Map<String, Object>{

            'objectType' => objectName,

            'data' => data,

            'timestamp' => System.now().getTime(),

            'orgId' => UserInfo.getOrganizationId()

        };

        

        request.setBody(JSON.serialize(payload));

        

        try {

            HttpResponse response = http.send(request);

            

            if (response.getStatusCode() != 200) {

                throw new CalloutException('Archive failed: ' + response.getBody());

            }

            

        } catch (Exception e) {

            Logger.log(Logger.LogLevel.ERROR, 'ExternalStorageService', 'archiveData', 

                      'External archive failed', e);

        }

    }

    

    private static String getArchiveToken() {

        // Retrieve the authentication token for the external system; in production,
        // prefer a Named Credential over a hardcoded or stored token

        return 'archive_token_here';

    }

}

38. How do you implement complex business process automation with approval workflows?

Advanced workflow automation:

public class WorkflowEngine {

    public interface WorkflowStep {

        void execute(WorkflowContext context);

        Boolean canExecute(WorkflowContext context);

        String getStepName();

    }

    

    public class WorkflowContext {

        public Id recordId;

        public String objectType;

        public Map<String, Object> variables;

        public List<String> executionLog;

        public Id currentUserId;

        

        public WorkflowContext(Id recordId, String objectType) {

            this.recordId = recordId;

            this.objectType = objectType;

            this.variables = new Map<String, Object>();

            this.executionLog = new List<String>();

            this.currentUserId = UserInfo.getUserId();

        }

        

        public void setVariable(String key, Object value) {

            variables.put(key, value);

        }

        

        public Object getVariable(String key) {

            return variables.get(key);

        }

        

        public void log(String message) {

            executionLog.add(System.now().format() + ': ' + message);

        }

    }

    

    public class WorkflowDefinition {

        public String name;

        public String triggerEvent;

        public List<WorkflowStep> steps;

        public Map<String, Object> configuration;

        

        public WorkflowDefinition(String name, String triggerEvent) {

            this.name = name;

            this.triggerEvent = triggerEvent;

            this.steps = new List<WorkflowStep>();

            this.configuration = new Map<String, Object>();

        }

        

        public WorkflowDefinition addStep(WorkflowStep step) {

            this.steps.add(step);

            return this;

        }

    }

    

    public static void executeWorkflow(String workflowName, WorkflowContext context) {

        WorkflowDefinition workflow = getWorkflowDefinition(workflowName);

        

        if (workflow == null) {

            throw new WorkflowException('Workflow not found: ' + workflowName);

        }

        

        context.log('Starting workflow: ' + workflowName);

        

        for (WorkflowStep step : workflow.steps) {

            try {

                if (step.canExecute(context)) {

                    context.log('Executing step: ' + step.getStepName());

                    step.execute(context);

                    context.log('Completed step: ' + step.getStepName());

                } else {

                    context.log('Skipping step: ' + step.getStepName() + ' (condition not met)');

                }

            } catch (Exception e) {

                context.log('Error in step ' + step.getStepName() + ': ' + e.getMessage());

                Logger.log(Logger.LogLevel.ERROR, 'WorkflowEngine', 'executeWorkflow', 

                          'Workflow step failed', e);

                throw e;
            }
        }
        
        context.log('Workflow completed: ' + workflowName);
    }
    
    private static WorkflowDefinition getWorkflowDefinition(String workflowName) {
        // Resolve the definition from a registry or custom metadata;
        // the lookup implementation depends on how workflows are configured
        return null;
    }
    
    public class WorkflowException extends Exception {}
}

Frequently Asked Questions

When should our engineering team choose Oracle APEX over Salesforce Apex for a new project?

How do we assess a candidate's ability to handle our specific technical challenges?

What's the biggest difference in skillsets between Oracle APEX and Salesforce developers?

How do we evaluate a candidate's problem-solving approach during technical interviews?

What are the most critical technical skills we should test for in Apex interviews?

Building enterprise apps demands more than surface-level APEX knowledge.

Utkrusht helps you assess real Oracle APEX skills—from data modeling to secure app development—through hands-on, job-relevant evaluations. Get started now and hire the right talent.

Founder, Utkrusht AI

Previously at Euler Motors, Oracle, and Microsoft. 12+ years as an engineering leader; 500+ interviews conducted across the US, Europe, and India.

Want to hire the best talent with proof of skill?

Shortlist candidates with strong proof of skill in just 48 hours.