APEX, Development

[ART-011] APEX Core Framework Part 2


It’s time to share my new framework with you :). I’ve been asked by all my colleagues (even those who are no longer part of the company) where they can get my old framework and when they could have a new version or ongoing updates. I have simplified it and reworked it from scratch to come up with a releasable solution that can be shared with everyone.


SSDL Release Version 1.3

SSDL stands for Salesforce Secure Development Library. It will be available as a managed package that you can install on your org. This release contains the core framework and a trigger pattern ready to be used.

Installation

Get the link:

https://login.salesforce.com/packaging/installPackage.apexp?p0=04t2p000001FzDq

Remove the base URL and replace it with your org’s base URL. Don’t worry about the version number; I have made some modifications since the screen capture:

Choose Install for Admins Only and click on Install button:

Wait a few minutes:

Click on Done button to complete the installation and verify it:

Run test execution for this package only to confirm everything is good:

Core Setup

First, go to the Custom Metadata Types section, where you will see 3 new setups:

Click on Manage records of Core Setting and then select Default Core Settings:

Field | Value | Upgradable?
Label | Default Core Settings | No
Name | DefaultCoreSettings | No
Core Trigger Manager Class | APT001_TriggerEventManager | Yes
With Security Enforced | Checked | Yes
Bypass All Trigger Permission | ssdl.BP_ALL_TRIGGERS | Yes
Logging Enabled | Checked | Yes
User Info Session Cache | ssdl.UserInfoCache | Yes
SObject Info Session Cache | ssdl.UserInfoCache | Yes
Recursion Level | (blank) | Yes
is Default ? | Checked | No
Org Id | Default | Yes
is Active ? | Checked | Yes
Default and Active: In order to make the framework work, we need at least one setting with “is Default ?” and “is Active ?” checked. If no setting corresponding to the org is found, we fall back to the default setting. You can have as many settings as you want depending on your strategy, but remember that there can be only one default/active.
Org Id: The setting will only apply to the corresponding org.
Core Trigger Manager Class: The default class instantiated by the trigger framework. You can override it with your custom implementation or specify another class here to act as the core class. You must add a prefix to your local class, e.g. “local.MyCoreImpl”.
With Security Enforced: Appends the security enforcing keyword to each SOQL query.
Bypass All Trigger Permission: Defines the custom permission used for bypassing all triggers.
Logging Enabled: Activates or deactivates the logging capabilities of the framework.
User Info Session Cache: Defines the platform session cache used to store user information.
SObject Info Session Cache: Defines the platform session cache used to store user permission information.
Recursion Level: Sets a limit on trigger recursion (same event on same object type) from 0 to 9. Leave the field blank if you do not want any restriction. Be aware that bulk loading is not compatible with this option, so leave it blank except when debugging some trigger behavior.

Platform Session Cache

The framework uses cache builder capabilities to store user and schema information at session level. A default cache is provided, but its session cache size is set to 0. You can tune this parameter or define your own cache to be used with the framework.
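The framework’s own cache classes are not shown here, so as a reference only, here is a minimal sketch of the platform Cache.CacheBuilder pattern it relies on. The class name and queried fields are assumptions; the pattern itself is standard Salesforce.

```apex
// Hypothetical session cache loader following the Cache.CacheBuilder pattern.
public class UserInfoCacheBuilder implements Cache.CacheBuilder {
    // doLoad runs only on a cache miss; its result is stored automatically
    // in the session cache under the requested key.
    public Object doLoad(String userId) {
        return [SELECT Id, Name, UserName FROM User WHERE Id = :userId LIMIT 1];
    }
}

// From anonymous Apex: the first call queries the database, subsequent calls
// in the same session read from the platform session cache (if its size > 0).
// User u = (User) Cache.Session.get(UserInfoCacheBuilder.class, UserInfo.getUserId());
```

With the default cache size of 0, doLoad simply runs on every call, which is why tuning the partition size matters.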

Custom Permission

4 Custom Permissions come with the framework and will be used to implement Feature Management capabilities:

Custom Permission | Behavior
BP_ALL_TRIGGERS | will bypass all triggers
BP_ALL_VR | will bypass all validation rules
BP_ALL_WF | will bypass all workflow rules
BP_ALL_PB | will bypass all process builders

You can use your own custom permissions instead; just make sure to be consistent in the way you implement them. When installing the package, the Custom Permissions are automatically assigned to the installing profile, which is not convenient, so don’t forget to remove them.

First Validation Rule

Let’s implement our first validation rule compliant with the core framework principles. I will use the Account SObject as a playground and implement a validation rule that fires for a French billing country when the billing street is not filled. There are 2 prerequisites before writing the validation rule: one is to declare it in the core framework and the other is to create a custom permission to bypass it.

Custom Permission | Behavior
BP_ACC_VR001 | will bypass Account_VR001

Once the custom permission has been created, go to the Custom Metadata Types section, click on Manage records of Validation Rule and select New:

Label | Account validation rule 001
Name | Account_VR001
Countries | FR

You can put “*” to define the rule for all countries or you can specify each country code separated by “;”.

Now go to Object Manager, select Account and then select Validation Rules

Name | Account_VR001_BillingStreet
Description | Billing street must be filled for allowed countries
Active | Checked
Error Location | Field checked and select BillingStreet
Error Message | Please fill Billing Street.
Formula | Below
IF(
    NOT($Permission.ssdl__BP_ALL_VR) &&
    NOT($Permission.BP_ACC_VR001) &&
    ( CONTAINS($CustomMetadata.ssdl__ValidationRule__mdt.Account_VR001.ssdl__Countries__c,'*') ||
      CONTAINS($CustomMetadata.ssdl__ValidationRule__mdt.Account_VR001.ssdl__Countries__c,BillingCountry)
    ),
    ISBLANK(BillingStreet),
    false
)

Now you are ready to play with some Account records and you can also play with Custom Permission and Validation Rule Custom Metadata Type to see the different behaviors.

First Workflow Rule

Let’s continue with a workflow rule, again around the Account SObject. There is one prerequisite before implementing the rule, which is to create a Custom Permission to bypass it:

Custom Permission | Behavior
BP_ACC_WF001 | will bypass Account_WF001

Now go to Workflow Rules, select New and then select Account:

Name | Account_WF001_SendBillingStreetAlert
Description | Send an email every time someone changes the billing street
Evaluation Criteria | created, and every time it’s edited
Rule Criteria | Formula evaluates to true
Formula | Below
IF(
    NOT($Permission.ssdl__BP_ALL_WF) &&
    NOT($Permission.BP_ACC_WF001),
    ISCHANGED(BillingStreet),
    false
)

Just create an Email Action and activate the rule. You can now play with Custom Permission to watch this workflow rule fire or not depending on the setting you have applied.

First Process Builder

Let’s say we want a Chatter notification every time the BillingStreet changes, for French accounts only. There are 2 prerequisites before building the process builder: one is to declare it in the core framework and the other is to create a custom permission to bypass it.

Custom Permission | Behavior
BP_ACC_PB001_MAIN | will bypass Account_PB001_Main

Once the custom permission has been created, go to the Custom Metadata Types section, click on Manage records of Process Builder and select New:

Label | Account_PB001_Main
Name | Account_PB001_Main
Countries | FR

You can put “*” to define the rule for all countries or you can specify each country code separated by “;”.

Now go to Process Builder and click New:

NameAccount_PB001_Main
DescriptionAccount Handler
Process TypeRecord Change
ObjectAccount

Click on Add Object, select Account, choose to start the process when a record is created or edited, and then Save.

Click Add criteria, name it isBypassable, select formula evaluates to true, and paste the formula:

$Permission.ssdl__BP_ALL_PB
|| $Permission.BP_ACC_PB001_MAIN
|| NOT(
    CONTAINS($CustomMetadata.ssdl__ProcessBuilder__mdt.Account_PB001_Main.ssdl__Countries__c,'*')
    || CONTAINS($CustomMetadata.ssdl__ProcessBuilder__mdt.Account_PB001_Main.ssdl__Countries__c,[Account].BillingCountry)
)

Now add an action, name it Void, select Apex as the Action Type and select Void as the Apex Class.

Click again on Add Criteria, name it isBillingStreetChanged, and select conditions are met:

Field | Account.BillingStreet
Operator | isChanged
Type | Boolean
Value | True
Conditions | All of the conditions are met

Now add an action, name it BillingStreetChanged, select Post to Chatter and Post to this record, fill in a message, then save and activate it.

Now you are ready to play with some Account records and you can also play with Custom Permission and Process Builder Custom Metadata Type to see the different behaviors.

Data Manager

There are 2 ways to implement the pattern defined in the Apex Core Framework:

Pattern | Implementation
3 Layers pattern | SM + EM + physical DM
2 Layers pattern | SM + EM + virtual DM

We will start with the 3 layers one and move to 2 layers afterwards.

public inherited sharing class DM100_Account extends ssdl.DM000_SObject implements ssdl.ITF001_DataManager{
    public DM100_Account(){
        super(SObjectType.Account.Name, DM100_Account.class.getName());
    }
}

The Core Framework is principally based on the master class DM000_SObject, which is abstract and cannot be instantiated directly. The class has been fully tested at 100% coverage to avoid any regression. To access all its functionality, we need to instantiate a data manager that inherits from DM000_SObject and implements the ITF001_DataManager interface, which is helpful in mocking situations or to replace a complete class by another one. Two parameters are needed: the first points to the SObject to manage and the other tells which class initiated the call, essentially for logging purposes. Here is the list of available methods:

global interface ITF001_DataManager {
    List<ssdl.WRP000_DMLResult.DmlResultMatcher> buildDmlResultsMatcher(List<SObject> param0, List<Database.SaveResult> param1, List<Database.UpsertResult> param2, List<Database.DeleteResult> param3);
    String buildSelectClause(Map<String,Schema.SObjectField> param0);
    List<Database.DeleteResult> deleteList(List<SObject> param0, Boolean param1);
    String getAllFields();
    Map<String,Schema.SObjectField> getFieldsMap();
    List<Database.SaveResult> insertList(List<SObject> param0, Boolean param1);
    ssdl.WRP002_QueryBuilder query(ssdl.WRP002_QueryBuilder param0);
    List<SObject> queryBy(String param0, String param1, List<Object> param2);
    List<Database.SaveResult> updateList(List<SObject> param0, Boolean param1);
    List<Database.UpsertResult> upsertList(List<SObject> param0, Schema.SObjectField param1);
}
Method | Description
query | The main method of the framework. It takes a request from the wrapper class WRP002_QueryBuilder and returns the result in the same wrapper. It can handle up to 10 simple bindings and up to 10 List bindings.
queryBy | The simplified version of the query method. It takes as parameters the list of fields to retrieve, the field to filter on and the list of values to bind to this filter. It returns a List<SObject> as a result.
getFieldsMap | Fetches a Map containing field information for the current SObject.
buildSelectClause | Returns a String containing field names separated by commas. The field map can be manipulated beforehand to keep only some fields.
getAllFields | Gets all fields of the current SObject and returns them as a comma-separated String.
insertList | Fires a DML insert statement that can be partially successful depending on the chosen option.
upsertList | Fires a DML upsert statement that can be partially successful depending on the chosen option. It does not work in allOrNone mode; you have to rewrite it with the right casting of SObject.
updateList | Fires a DML update statement that can be partially successful depending on the chosen option.
deleteList | Fires a DML delete statement that can be partially successful depending on the chosen option.
buildDmlResultsMatcher | Matches the input collection with the results of a DML operation.
Variable | Description
describeResult | Contains schema information on the current SObject for the current user.

Security has been enforced in each method to avoid having to handle it in every class. It is also up to the developer to fine-tune the query and retrieve only the fields really needed. There is still an option to take the easy route by selecting all fields, but it’s not recommended; use it wisely.
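As an illustration of that fine-tuning, a call through queryBy could look like the sketch below. The field list and filter values are illustrative, not part of the package:

```apex
// Sketch: retrieve a restricted field set through the data manager.
// Only the fields actually needed are listed instead of using getAllFields().
ssdl.ITF001_DataManager dm = new DM100_Account();
List<Account> accounts = (List<Account>) dm.queryBy(
    'Id, Name, BillingCountry',           // fields to retrieve
    String.valueOf(Account.Name),         // field to filter on
    new List<Object>{'Acme', 'Globex'}    // values bound to the filter
);
```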

As you can see, the upsert method behaves a little differently from the others. As explained earlier, it’s not possible to use allOrNone mode without explicitly casting the collection. To overcome this, we need to re-implement the method in each data manager class.

public inherited sharing class DM100_Account extends ssdl.DM000_SObject implements ssdl.ITF001_DataManager{
    public DM100_Account(){
        super(SObjectType.Account.Name, DM100_Account.class.getName());
    }

    public List<Database.UpsertResult> upsertList(List<Account> accounts, Boolean allOrNoneMode, Schema.SObjectField externalIdField){
        if (accounts == null ||  accounts.isEmpty() || !describeResult.isAccessible() || !describeResult.isCreateable() || !describeResult.isUpdateable()){
            return new List<Database.UpsertResult>();
        }

        return Database.upsert(accounts, externalIdField, allOrNoneMode);
    }
}

I have added a new method to make upsert work. It takes as parameters the SObject collection, the allOrNone mode and the field to act as External Id. Each SObject should have an external id field or a technical external id to help with data integration and testing. Now try this piece of code:

Account acc = new Account(ExternalId__c='1234', Description = 'Updated');
List<Account> accList = new List<Account>{acc};
    
ssdl.ITF001_DataManager datamanager = new DM100_Account();
((DM100_Account)datamanager).upsertList(accList, true, Account.ExternalId__c);

As you can see, I’m not dealing with any Salesforce Ids nor making any extra SOQL request to get the record to manipulate. I assume an Account record exists in the database; if not, it will be created. As the newly created method is inside DM100_Account, I need to cast my data manager to it to get visibility on the method, which is not available at interface level.

Let’s go further and make things even smoother; I will create a dedicated interface for my account data manager:

public interface ITF_DM100_Account extends ssdl.ITF001_DataManager{
    List<Database.UpsertResult> upsertList(List<Account> accounts, Boolean allOrNoneMode, Schema.SObjectField externalIdField);
}

This implementation prevents strong coupling with the framework, so you can easily switch the implementation if needed, and you can start adding signatures for newly created methods.

public inherited sharing class DM100_Account extends ssdl.DM000_SObject implements ITF_DM100_Account{
    public DM100_Account(){
        super(SObjectType.Account.Name, DM100_Account.class.getName());
    }
    
    public List<Database.UpsertResult> upsertList(List<Account> accounts, Boolean allOrNoneMode, Schema.SObjectField externalIdField){
        if (accounts == null ||  accounts.isEmpty() || !describeResult.isAccessible() || !describeResult.isCreateable() || !describeResult.isUpdateable()){
            return new List<Database.UpsertResult>();
        }

        return Database.upsert(accounts, externalIdField, allOrNoneMode);
    }

}

Try now this piece of code which will have the same behavior as before:

Account acc = new Account(ExternalId__c='1234', Description = 'Updated ITF');
List<Account> accList = new List<Account>{acc};
    
ITF_DM100_Account datamanager = new DM100_Account();
datamanager.upsertList(accList, true, Account.ExternalId__c);

Entity Manager

The Entity Manager is the only class allowed to access the Data Manager layer. This avoids cross or cyclic references between classes and defines a unique pattern for accessing data. Remember that the Entity and Data Managers are part of the technical base and will be used massively by many features, so they have to be robust and independent.

public inherited sharing class EM100_Account{
    public static ITF_DM100_Account datamanager = new DM100_Account();
}
Rule | Description
1 | The class is declared “inherited sharing” because visibility should be handled at an upper level.
2 | The class instantiates one instance of the corresponding data manager as a static variable, meaning it will be available across the entire transaction but re-executed for each new transaction.
3 | The Entity Manager is stateless and doesn’t need to be instantiated; all methods will be static.

Mock

A mock is a class or a method that replaces on the fly the standard behavior of a feature. It’s mainly used for testing purposes when integrations are not available. In this section we will cover mocking for unit testing, but there are a lot of other use cases where this kind of implementation can be interesting (functional mocking, webservice mocking, routing integration…).

Before implementing a mock, there are some prerequisites to consider:

  • The class you want to mock must implement an interface
  • The mock class must implement the same interface and all methods
  • The caller class must orchestrate which implementation to instantiate
@isTest
public inherited sharing class MCK_DM100_Account implements ITF_DM100_Account{

    // Write here an implementation for every method of the interface.

    public List<Database.UpsertResult> upsertList(List<Account> accounts, Boolean allOrNoneMode, Schema.SObjectField externalIdField){
        return new List<Database.UpsertResult>();
    }
}

With this implementation, calling upsertList will do nothing on the database, but it’s up to you to come up with your own implementation. You can also notice that I have set the class with the @isTest annotation: Salesforce will consider it a test class and will not count its size against the character limit. Let’s see now how we can orchestrate the switch.

public inherited sharing class EM100_Account{
   
    public static ITF_DM100_Account datamanager;
    
    static {
        datamanager = getDataManager();
    }

    private static ITF_DM100_Account getDataManager(){

        ITF_DM100_Account accountAPI = null;

        if (Test.isRunningTest()){
            // The mock is annotated @isTest, so it cannot be referenced
            // directly from non-test code; instantiate it dynamically.
            accountAPI = (ITF_DM100_Account) Type.forName('MCK_DM100_Account').newInstance();
        } else {
            accountAPI = new DM100_Account();
        }

        return accountAPI;
    }
}
Rule | Description
1 | A static block is used to initialize the class behavior.
2 | The implementation of this behavior is produced by a dedicated static method.
3 | This is an example of orchestration based on the test context.

To demo the behavior, you can add a static boolean variable instead of Test.isRunningTest() and run the code, switching between true and false:

Account acc = new Account(ExternalId__c='1234', Description = 'Updated ITF');
List<Account> accList = new List<Account>{acc};

//Set a variable to false (Mock)    
EM100_Account.datamanager.upsertList(accList, true, Account.ExternalId__c);

//Set a variable to true (Update)
EM100_Account.datamanager.upsertList(accList, true, Account.ExternalId__c);

As you can see, we didn’t touch any logic to switch between contexts; it’s handled by the Entity Manager, which is responsible for routing to the right implementation. A more advanced mocking system can be put in place with Test.setMock, but we will not cover that subject here.
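The boolean-flag demo suggested above could be sketched like this. The flag name is an assumption, and the mock is instantiated dynamically because an @isTest class cannot be referenced directly from non-test code:

```apex
public inherited sharing class EM100_Account {

    // Demo flag (assumption): edit this value and re-run the anonymous
    // block to switch between the mock (false) and the real manager (true).
    private static final Boolean USE_REAL_IMPLEMENTATION = true;

    public static ITF_DM100_Account datamanager;

    static {
        datamanager = getDataManager();
    }

    private static ITF_DM100_Account getDataManager(){
        if (!USE_REAL_IMPLEMENTATION){
            // Dynamic instantiation avoids a compile-time reference
            // to the @isTest mock class.
            return (ITF_DM100_Account) Type.forName('MCK_DM100_Account').newInstance();
        }
        return new DM100_Account();
    }
}
```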

2 Layers Pattern

Now let’s say you don’t want to bother with the Data Manager layer, as it implies writing two classes per SObject. There are benefits and some drawbacks, and we will see them through a new set of examples.

public inherited sharing class EM100_Account {
	public static ssdl.ITF001_DataManager datamanager = new ssdl.DM001_SObjectInstance(SObjectType.Account.name, EM100_Account.class.getName());
}
Rule | Description
1 | The class is declared “inherited sharing” because visibility should be handled at an upper level.
2 | The class instantiates one instance of the corresponding data manager as a static variable, meaning it will be available across the entire transaction but re-executed for each new transaction.
3 | The Entity Manager is stateless and doesn’t need to be instantiated; all methods will be static.
4 | DM001_SObjectInstance is generic and provides access to all methods.

Now we need to add a dedicated method to handle Account SObject:

public inherited sharing class EM100_Account {
	public static ssdl.ITF001_DataManager datamanager = new ssdl.DM001_SObjectInstance(SObjectType.Account.name, EM100_Account.class.getName());
    
    public static List<Database.UpsertResult> upsertList(List<Account> accounts, Boolean allOrNoneMode, Schema.SObjectField externalIdField){
        
        ssdl.DM001_SObjectInstance localDm = (ssdl.DM001_SObjectInstance)datamanager;
        
        if (accounts == null ||  accounts.isEmpty() || !localDm.describeResult.isAccessible() || !localDm.describeResult.isCreateable() || !localDm.describeResult.isUpdateable()){
            return new List<Database.UpsertResult>();
        }

        return Database.upsert(accounts, externalIdField, allOrNoneMode);
    }
}

With less effort, I have rewritten the previous upsertList, but you can notice some adjustments:

Rule | Description
1 | The upsertList method cannot be an instance method, so we moved to a static method.
2 | The describeResult variable is no longer accessible; we need to cast our data manager to get visibility on it.

Up to this point we only see benefits: less code to write, fewer classes, so fewer tests to handle… But what about mocking?

public inherited sharing class EM100_Account {
    public static ssdl.ITF001_DataManager datamanager;
    
    static {
        datamanager = getDataManager();
    }

    private static ssdl.ITF001_DataManager getDataManager(){

        ssdl.ITF001_DataManager accountAPI = null;

        if (Test.isRunningTest()){
            accountAPI = new MCK_DM001_SObjectInstance(SObjectType.Account.name, EM100_Account.class.getName());
        } else {
            accountAPI = new ssdl.DM001_SObjectInstance(SObjectType.Account.name, EM100_Account.class.getName());
        }

        return accountAPI;
    }
    
    public static List<Database.UpsertResult> upsertList(List<Account> accounts, Boolean allOrNoneMode, Schema.SObjectField externalIdField){
        
        ssdl.DM001_SObjectInstance localDm = (ssdl.DM001_SObjectInstance)datamanager;
        
        if (accounts == null ||  accounts.isEmpty() || !localDm.describeResult.isAccessible() || !localDm.describeResult.isCreateable() || !localDm.describeResult.isUpdateable()){
            return new List<Database.UpsertResult>();
        }

        return Database.upsert(accounts, externalIdField, allOrNoneMode);
    }
}

What happens here is that we are only able to mock the core framework methods; it is not possible to mock the upsertList method, for example. In most cases we honestly don’t need a mocking system, especially for standard and custom objects, as the Salesforce database will correctly handle rollback. But if you are dealing with External Objects or API integrations over REST or SOAP, you need a dedicated Data Manager class.
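For such an integration, the 3-layer setup could look like the sketch below. The interface, class, endpoint and named credential are all hypothetical (each type would live in its own file); the point is that a dedicated interface makes the callout mockable, which the generic 2-layer pattern cannot offer:

```apex
// Hypothetical dedicated interface for a REST-backed data manager.
public interface ITF_DM200_ExternalOrder {
    List<Object> fetchOrders(String accountExtId);
}

public inherited sharing class DM200_ExternalOrder implements ITF_DM200_ExternalOrder {
    public List<Object> fetchOrders(String accountExtId){
        HttpRequest req = new HttpRequest();
        // Assumed named credential and resource path.
        req.setEndpoint('callout:OrderService/orders?account=' + accountExtId);
        req.setMethod('GET');
        HttpResponse res = new Http().send(req);
        return (List<Object>) JSON.deserializeUntyped(res.getBody());
    }
}
```

A mock implementing ITF_DM200_ExternalOrder can then return canned data in tests, exactly as MCK_DM100_Account did for the Account data manager.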

So my advice is to stick with the 2 Layers Pattern for the majority of your development inside Salesforce, and switch to the 3 Layers Pattern when you have to deal with integrations that have to be mocked.

Service Manager

The aim of the Service Manager is to handle business and technical requirements by implementing dedicated logic, which may need access to many SObjects or APIs. Another important point is that the Service Manager is responsible for handling errors. Let’s start with a simple implementation:

public with sharing class SM100_AccountServices{

    public static List<Account> processAccounts(List<String> accountExtIds){
        List<Account> queryResults = EM100_Account.datamanager.queryBy('Id, Name, Description, ExternalId__c', String.valueOf(Account.ExternalId__c), accountExtIds);
        
        return queryResults;
    }
}
Rule | Description
1 | This class is set with sharing enforced, but you can choose between with sharing, without sharing and inherited sharing depending on the use case and the execution context.
2 | The main method is static and the class is stateless.
3 | The method is bulkified so that it can be called from any execution context.
4 | The method calls the Entity Manager to get the results.
5 | You decide at the service manager level which fields are needed in your process.

With this kind of approach, we avoid code duplication and errors in writing SOQL queries, and we rely on a technical base that has already been tested and enforced. Let’s execute the code; you will get one record in the collection:

List<Account> result = SM100_AccountServices.processAccounts(new List<String>{'1234'});

Now we will put some advanced logic in the service to show the framework’s capabilities:

public with sharing class SM100_AccountServices{
    
    private static final String BP_SM100 = 'BP_SM100';
    private static final String className = SM100_AccountServices.class.getName();

    public static List<Account> processAccounts(List<String> accountExtIds){
        List<Account> queryResults = null;
        List<Database.SaveResult> saveResults = null;
        String methodName = 'processAccounts';
       	
        if (FeatureManagement.checkPermission(BP_SM100)){
            return null;
        }
            
        try{
                
            ssdl.WRP001_UserInfo userInfo = ssdl.APC001_UserInfoCache.getFromCache(UserInfo.getUserId());
            ssdl.APU000_Logger.log(LoggingLevel.INFO, className, methodName, 'Connected User: '+userInfo.user.userName);
                
            queryResults = EM100_Account.datamanager.queryBy('Id, Name, Description, ExternalId__c', String.valueOf(Account.ExternalId__c), accountExtIds);
                
            for (Account acc : queryResults){
                acc.Description = 'Processing...';
            }
                
            saveResults = EM100_Account.datamanager.updateList(queryResults, false);
                
            return queryResults;
        } catch(Exception exp){
            ssdl.APU000_Logger.log(LoggingLevel.ERROR, className, methodName, 'Update failed: '+exp.getMessage());
            handleErrors(queryResults, saveResults, exp);
        }
        
        return null;
    }
    
    private static void handleErrors(List<Account> queryResults, List<Database.SaveResult> saveResults, Exception exp){
        List<ssdl.WRP000_DMLResult.DmlResultMatcher> resultMatcherList = EM100_Account.datamanager.buildDmlResultsMatcher(queryResults, saveResults, null, null);
        String methodName = 'handleErrors';
        
        for(ssdl.WRP000_DMLResult.DmlResultMatcher resultMatcher : resultMatcherList){
            if (resultMatcher.isSuccess){
                ssdl.APU000_Logger.log(LoggingLevel.INFO, className, methodName, 'Success: '+resultMatcher.theObject.Id);
            } else {
                List<ssdl.WRP000_DMLResult.DmlError> errors = resultMatcher.convertErrors();
                ssdl.APU000_Logger.log(LoggingLevel.INFO, className, methodName, 'Error: '+resultMatcher.theObject.Id);
                
                for (ssdl.WRP000_DMLResult.DmlError error : errors){
                     ssdl.APU000_Logger.log(LoggingLevel.INFO, className, methodName, 'Message: '+error.message);
                     ssdl.APU000_Logger.log(LoggingLevel.INFO, className, methodName, 'Status Code: '+error.statusCode);
                     ssdl.APU000_Logger.log(LoggingLevel.INFO, className, methodName, 'Fields list: '+error.fields);
                }
            }
        }
    }
}
Rule | Description
1 | BP_SM100 is a Custom Permission that allows the feature to be enabled or disabled. There are nicer ways to handle it without hard-coding, but for the moment we keep it this way. I’m using a bypass because you don’t have to create the Custom Permission until it’s really needed; the method returns false if the permission doesn’t exist.
2 | We can get information on the connected user without making a query.
3 | We can use the logging capabilities of the framework, which build the debug output in a consistent way. Logging can be disabled in the Core Setting.
4 | We can use the CRUD operation capabilities.
5 | Catching and handling exceptions is crucial; try/catch blocks should be placed carefully and must always handle the exception or throw a new one. Only logging errors is useless.
6 | You can implement any error handling system; the framework helps you organize records into categories with some information on the errors. It’s up to you to handle the transaction properly.

The main problem with static blocks is that they don’t let us include common functionality without a lot of rework each time. To overcome this, the Service Manager should become an instance following a pattern, but I will not cover this topic now; you have to wait :).

Trigger

Let’s write our triggers with the framework trigger handler. We will use Account SObject to demonstrate the capabilities:

trigger AccountTrigger on Account (before insert, before update, before delete, after insert, after update, after delete, after undelete) {
    ssdl.ITF002_TriggerManager triggerManager = new ssdl.APT002_SObjectInstance(SObjectType.Account.Name, 'AccountTrigger');
    triggerManager.execute();
}
Rule | Description
1 | The trigger is named with the SObject name followed by Trigger; don’t forget to remove ‘__c’ for custom objects.
2 | The trigger declares all available events, as they will be managed at framework level.
3 | Create a new instance of the trigger manager for the current SObject and provide its name for logging purposes.
4 | Call the execute method to run the trigger.
5 | The trigger is logic-less: you only instantiate the framework for the current SObject and execute it.

Update an Account record and you will see the framework running. In this case nothing is really processed in the trigger, as the default implementation is logic-less. Let’s say now we want to override a trigger event to handle custom logic:

global inherited sharing class APT100_Account{
    
    global inherited sharing class BeforeUpdate extends ssdl.APT001_TriggerEventManager {
        
        public override void prepare(){
            return;
        }
        
        public override void process(){
            return;
        }
        
        public override void finish(){
            return;
        }
    }
}
Rule | Description
1 | The APT100_Account class holds the implementation of every event for Account triggers. Notice that you are free to organize it the way you want: one class to hold everything or a separate class per event. Make sure the class is global because it needs to be visible to the framework.
2 | The inner class BeforeUpdate implements the logic for the before update event. Make sure the class is global because it needs to be visible to the framework.
3 | The inner class must extend ssdl.APT001_TriggerEventManager; remember that you can replace this class with your own by implementing the interface ssdl.ITF003_TriggerEventManager and changing the core setting.
4 | You can override 3 methods: prepare, process and finish. The goal is to help you structure the code properly, but you can stick with only one method if you want. For example, you can put all SOQL in prepare(), process records in process() and commit all DML in finish().

Now you need to declare the implementation in your trigger:

trigger AccountTrigger on Account (before insert, before update, before delete, after insert, after update, after delete, after undelete) {
    ssdl.ITF002_TriggerManager triggerManager = new ssdl.APT002_SObjectInstance(SObjectType.Account.Name, 'AccountTrigger');
    
    triggerManager.overrideTriggerEvent(new Map<String, String>{'BEFORE_UPDATE' => 'APT100_Account.BeforeUpdate'});
    
    triggerManager.execute();
}
Rules:
1. overrideTriggerEvent declares the implementation to use for each event.
2. The supported event keywords are: BEFORE_INSERT, BEFORE_UPDATE, BEFORE_DELETE, AFTER_INSERT, AFTER_UPDATE, AFTER_DELETE, AFTER_UNDELETE.
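As a sketch of how several events can be routed at once: assuming the same APT100_Account class also contains an AfterUpdate inner class (hypothetical here, it is not defined in this article), the map simply takes one entry per event keyword:

```apex
// Hypothetical example: APT100_Account.AfterUpdate is assumed to exist
// alongside APT100_Account.BeforeUpdate.
trigger AccountTrigger on Account (before insert, before update, before delete, after insert, after update, after delete, after undelete) {
    ssdl.ITF002_TriggerManager triggerManager = new ssdl.APT002_SObjectInstance(SObjectType.Account.Name, 'AccountTrigger');

    // One entry per event keyword; events without an entry fall back to the
    // default, logic-less implementation.
    triggerManager.overrideTriggerEvent(new Map<String, String>{
        'BEFORE_UPDATE' => 'APT100_Account.BeforeUpdate',
        'AFTER_UPDATE'  => 'APT100_Account.AfterUpdate'
    });

    triggerManager.execute();
}
```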

Now let’s add some logic and update an Account record. Every time an Account record is updated, the Description field will be filled with a new datetime value.

global inherited sharing class APT100_Account{
    
    global inherited sharing class BeforeUpdate extends ssdl.APT001_TriggerEventManager {
        
        // Class members, so tests can substitute the trigger context
        List<Account> newAccountList = Trigger.new;
        Datetime myDate;

        public override void prepare(){
            myDate = Datetime.now();
            //SOQL to retrieve extra data from other SObjects  
        }
        
        public override void process(){
            for (Account acc : newAccountList){
                acc.Description = 'Updated:' + myDate;
            }
        }
        
        public override void finish(){
            //do some extra logic after the processing
            System.debug('closing event');
        }
    }
}

From my point of view, this class and its inner classes can be considered service manager classes, so you don’t need an extra class: all the logic can be implemented here, which gives you a global overview of what the trigger really handles. You only need a separate service manager class when the feature is shared with other features. In that case, I recommend externalizing as much SOQL and DML as possible, so that each method can be called precisely from the APT class.

Another point of attention: this class has to be covered by unit tests, so don’t forget to replace the variables provided by the trigger context with variables provided by the test context. These variables should be class members so they can be accessed and modified right after instantiation.
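Because newAccountList is a class member rather than a direct read of Trigger.new inside the methods, a test can substitute its own records after instantiation. A minimal sketch, assuming the member is made visible to tests (e.g. public or @TestVisible; the class above declares it with default visibility, so you would need to adjust that):

```apex
@IsTest
private class APT100_Account_Test {

    @IsTest
    static void beforeUpdateFillsDescription() {
        Account acc = new Account(Name = 'Test');

        APT100_Account.BeforeUpdate handler = new APT100_Account.BeforeUpdate();
        // Replace the trigger-context list with test data; outside a trigger,
        // Trigger.new is null, so the member must be set explicitly.
        handler.newAccountList = new List<Account>{ acc };

        handler.prepare();
        handler.process();
        handler.finish();

        System.assert(acc.Description.startsWith('Updated:'));
    }
}
```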

Bypassing triggers

There are two ways to bypass triggers: with a Custom Permission or with a programmatic bypass. The first is configurable, while the second is tied to the code you write.

Rules:
1. Every trigger can be bypassed by assigning the Custom Permission BP_ALL_TRIGGERS; this custom permission can be replaced by your own in the Core Framework Settings.
2. Each trigger can be bypassed independently by assigning the Custom Permission BP_TriggerName, where the trigger name is the one provided when instantiating the trigger framework in each trigger. You don’t need to create the Custom Permission in advance. Example: for AccountTrigger, it is BP_AccountTrigger.
3. Each trigger event can be bypassed independently by assigning the Custom Permission BP_TriggerName_EventName. You don’t need to create the Custom Permission in advance. Example: for the AccountTrigger before insert event, it is BP_AccountTrigger_BEFORE_INSERT.
4. Every trigger can be bypassed by setting the variable ssdl.APU002_Context.contextualBypassAllTriggers to true before any DML statement, and reactivated by setting it back to false after the DML operations.
5. Each trigger and trigger event can be bypassed independently at the programmatic level:
//All triggers will be deactivated
ssdl.APU002_Context.contextualBypassAllTriggers = true;
update accountList;
update opportunityList;
ssdl.APU002_Context.contextualBypassAllTriggers = false;

//Only Account triggers will be deactivated
ssdl.APU002_Context.addTriggerBypass('AccountTrigger');
update accountList;
update opportunityList;
ssdl.APU002_Context.removeTriggerBypass('AccountTrigger');

//Only Account Before update triggers will be deactivated
ssdl.APU002_Context.addTriggerEventBypass('AccountTrigger', 'BEFORE_UPDATE');
update accountList;
update opportunityList;
ssdl.APU002_Context.removeTriggerEventBypass('AccountTrigger','BEFORE_UPDATE');

//Deactivates all previously set bypasses except contextualBypassAllTriggers
ssdl.APU002_Context.removeAllTriggersBypass();

I recommend placing the bypass right next to the corresponding DML operation and deactivating it immediately after the DML call. The static variables are shared across the entire transaction, including every trigger invocation in a bulk request; not deactivating them after the DML call will lead to unpredictable behavior. It is also not recommended to use programmatic bypasses when dealing with Bulk API requests.
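One way to guarantee the bypass is always released, even when the DML throws, is to wrap it in a try/finally block. This is a sketch on top of the framework calls shown above, not a feature of the framework itself:

```apex
// Bypass only the Account trigger for this single DML statement.
ssdl.APU002_Context.addTriggerBypass('AccountTrigger');
try {
    update accountList;
} finally {
    // Always release the bypass, even if the update throws an exception,
    // so later DML in the same transaction behaves normally.
    ssdl.APU002_Context.removeTriggerBypass('AccountTrigger');
}
```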

SOQL

Let’s now look at the different ways you can write SOQL queries:

ssdl.ITF001_DataManager datamanager = new ssdl.DM001_SObjectInstance(SObjectType.Account.Name, 'EM100_Account');

//Get all fields if needed, remove some etc or prepare your own Map
Map<String, Schema.SObjectField> fieldsMap = datamanager.getFieldsMap();
//And then build a string to prepare the query
String selectQuery = datamanager.buildSelectClause(fieldsMap);

//Alternatively you can call one method to get the list of fields
String allfields = datamanager.getAllFields();

//criteria to filter on
Set<String> externalIdList = new Set<String>{'1234'};
    
//Retrieve information on the current SObject for current user
Schema.DescribeSObjectResult describeResult = ((ssdl.DM001_SObjectInstance)datamanager).getDescribeResult();

List<Account> liste01;
    
if (describeResult.isAccessible() && describeResult.isQueryable()){
    //You could write the query in standard way
    liste01 = [select Id, Name, ExternalId__c from Account where ExternalId__c in :externalIdList WITH SECURITY_ENFORCED limit 1];
    
    //You could write the query with Database.query
    liste01 = Database.query(String.escapeSingleQuotes('select Id, Name, ExternalId__c from Account where ExternalId__c in :externalIdList WITH SECURITY_ENFORCED limit 1'));
}

//You could use the QueryBuilder with simple binded variable 
ssdl.WRP002_QueryBuilder queryBuilder01 = new ssdl.WRP002_QueryBuilder('Id, Name, ExternalId__c', 'ExternalId__c = :extId', 'LIMIT 1', new Map<String, Object>{'extId' => '1234'}, null);
queryBuilder01 = datamanager.query(queryBuilder01);
liste01 = queryBuilder01.results;

//You could use the QueryBuilder with list binded variable 
queryBuilder01 = new ssdl.WRP002_QueryBuilder('Id, Name, ExternalId__c', 'ExternalId__c in :extId', 'LIMIT 1', null, new Map<String, List<Object>>{'extId' => new List<String>{'1234', 'DM000_TEST-02'}});
queryBuilder01 = datamanager.query(queryBuilder01);
liste01 = queryBuilder01.results;

//You could use the QueryBuilder with simple binded variable and with list binded variable
queryBuilder01 = new ssdl.WRP002_QueryBuilder('Id, Name, ExternalId__c', 'ExternalId__c in :extId2 and ExternalId__c = :extId1', 'LIMIT 1', new Map<String, Object>{'extId1' => '1234'}, new Map<String, List<Object>>{'extId2' => new List<String>{'1234', 'DM000_TEST-02'}});
queryBuilder01 = datamanager.query(queryBuilder01);
liste01 = queryBuilder01.results;

//You could use the query builder for aggregate query
queryBuilder01 = new ssdl.WRP002_QueryBuilder('count(Id) mycount', null, null);
queryBuilder01.isAggregateResults = true;
queryBuilder01 = datamanager.query(queryBuilder01);
List<AggregateResult> aggregateResults = queryBuilder01.aggregateResults;
Integer countResults = (Integer)aggregateResults.get(0).get('mycount');

//You could use queryBy for simple binding
liste01 = datamanager.queryBy('Id, Name, ExternalId__c', String.valueOf(Account.ExternalId__c), new List<String>{'1234', 'DM000_TEST-01', 'DM000_TEST-02'});
Rules:
1. The way you write the query depends on its complexity, the maintenance level, the security level, and how much duplication you want to avoid.
2. The framework removes the painful parts and comes with homogeneous security controls and easy querying options.
3. Bind variable names must be unique and must not be a substring of another bind variable name. For example, with the two bind variables ‘:ext’ and ‘:extendvar’, the query builder will fail because ‘:extendvar’ contains ‘:ext’ (the binding mechanism relies on String.replace). Prefer a homogeneous, unique naming scheme like ‘argExternalId’ and ‘argAccountId’; note that ‘argExternalId01’ would still fail in combination with ‘argExternalId’.
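To illustrate the bind-naming rule, here is a sketch of a safe naming scheme where no bind name contains another one (the field names and values below are only illustrative):

```apex
// Unsafe: ':ext' is a substring of ':extendvar', so replacing ':ext' first
// would corrupt ':extendvar' in the generated query string.
//   'ExternalId__c = :ext AND Type = :extendvar'

// Safe: no bind name contains another one.
ssdl.ITF001_DataManager datamanager = new ssdl.DM001_SObjectInstance(SObjectType.Account.Name, 'EM100_Account');
ssdl.WRP002_QueryBuilder qb = new ssdl.WRP002_QueryBuilder(
    'Id, Name, ExternalId__c',
    'ExternalId__c = :argExternalId AND Type = :argAccountType',
    'LIMIT 1',
    new Map<String, Object>{
        'argExternalId'  => '1234',
        'argAccountType' => 'Customer'
    },
    null);
qb = datamanager.query(qb);
List<Account> results = qb.results;
```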

Conclusion

This is my first releasable version. It has been challenging to ship it as a managed package due to design considerations, component visibility, etc. I’ve been struggling with issues that you only discover after installing the package in another org; the upgrade process is quite tricky and has a lot of constraints that sometimes limit your design. It’s your responsibility to install it, integrate it with your code, test it in your context, and decide whether or not to go live with it. I will be happy to get your feedback on the benefits and concerns you see while using it; it will help me improve it.

I hope you enjoyed reading this article. See you soon for the next one ...
