dev-user

Reputation: 1

Is it possible to implement a trigger to prevent duplicate records from being inserted when duplicates exist within the record set?

Can a Trigger be implemented on any object to prevent duplicate records from being inserted during a bulk insert operation? This solution should address two scenarios: first, when duplicates already exist in the database, and second, when there are duplicate records within the set being inserted. The Trigger should also display an error message during the insertion process while allowing the first record from the set containing duplicates to be successfully inserted.

I have implemented a solution, but my understanding is that even when using triggers, a single duplicate record will cause the entire insert to abort, so no records are inserted unless I use Database.insert with the allOrNone flag set to false. Even in that case, an error is still displayed, so the question remains. Here is my solution:

trigger FindDuplicate on Contact (before insert) {
    if (Trigger.isBefore && Trigger.isInsert) {
        // Collect the emails arriving in this batch, flagging in-set duplicates.
        Set<String> emailsToInsert = new Set<String>();
        for (Contact cc : Trigger.new) {
            if (cc.Email == null) {
                continue; // skip blank emails so they are not treated as duplicates of each other
            }
            if (!emailsToInsert.contains(cc.Email)) {
                emailsToInsert.add(cc.Email);
            } else {
                cc.IsDuplicate__c = true;
            }
        }

        // One bulk query for emails that already exist in the database.
        Set<String> existingEmails = new Set<String>();
        for (Contact ecc : [SELECT Id, Email FROM Contact WHERE Email IN :emailsToInsert]) {
            existingEmails.add(ecc.Email);
        }

        // Block records whose email already exists, or that repeat an email
        // seen earlier in this batch; the first occurrence is allowed through.
        Set<String> processed = new Set<String>();
        for (Contact ctc : Trigger.new) {
            if (ctc.Email == null) {
                continue;
            }
            if (existingEmails.contains(ctc.Email) || processed.contains(ctc.Email)) {
                ctc.Email.addError('Duplicate Email Address found');
            } else {
                processed.add(ctc.Email);
            }
        }
    }
}
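The partial-success behavior described above can be sketched as follows. This is an illustrative example (the contact values and variable names are assumptions, not from the original post): with allOrNone set to false, records rejected by addError in the trigger fail individually while the rest are committed.

```apex
// Assumes the FindDuplicate trigger above is active on Contact.
List<Contact> contactsToInsert = new List<Contact>{
    new Contact(LastName = 'One', Email = 'a@example.com'),
    new Contact(LastName = 'Two', Email = 'a@example.com')  // in-set duplicate
};

// allOrNone = false: records that fail (e.g. via addError in the trigger)
// are skipped, while the remaining valid records are still committed.
List<Database.SaveResult> results = Database.insert(contactsToInsert, false);

for (Integer i = 0; i < results.size(); i++) {
    if (!results[i].isSuccess()) {
        for (Database.Error err : results[i].getErrors()) {
            System.debug('Record ' + i + ' failed: ' + err.getMessage());
        }
    }
}
```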

If this approach is not feasible, what are the underlying reasons? Additionally, while some aspects may seem straightforward, what other considerations should I take into account?

Upvotes: 0

Views: 64

Answers (2)

user1

Reputation: 23

In line with @fredbe's answer, I agree with using Salesforce's Duplicate Management feature, as it automatically detects duplicate records, so you don't have to manually query the database to check for duplicates yourself.

You can follow these steps for your reference:

  1. Create Matching Rules: Go to Setup → Duplicate Management → Matching Rules, and define fields to match duplicates

  2. Create Duplicate Rules: Setup → Duplicate Management → Duplicate Rules, then set how duplicates are handled (Allow or Block) and enable alerts if needed.

  3. Insert a batch of records to test; Salesforce will process valid records and flag or block duplicates based on your settings.

Once you have this set up, you don’t have to worry about duplicate records messing things up during bulk data insertion. It takes the load off, eliminates the need for complex triggers, and just makes life a little easier by handling duplicates in the background. The best part is that the process doesn’t stop for valid records, so you’re always moving forward.
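If you still want to react to blocked records in code, a blocking duplicate rule surfaces during a partial-success insert as a Database.DuplicateError on the SaveResult. A minimal sketch, assuming a blocking duplicate rule on Contact is already configured and `newContacts` is the list being inserted:

```apex
// newContacts is an illustrative List<Contact> prepared elsewhere.
List<Database.SaveResult> results = Database.insert(newContacts, false);
for (Database.SaveResult sr : results) {
    if (!sr.isSuccess()) {
        for (Database.Error err : sr.getErrors()) {
            // Errors raised by a blocking duplicate rule are DuplicateErrors.
            if (err instanceof Database.DuplicateError) {
                Datacloud.DuplicateResult dr =
                    ((Database.DuplicateError) err).getDuplicateResult();
                System.debug('Blocked by duplicate rule: ' + dr.getDuplicateRule());
            }
        }
    }
}
```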

Upvotes: 2

fredbe

Reputation: 21

I assume that you use the Data Loader. Why don't you use the Duplicate Detection feature? It detects duplicates and continues processing the remaining records in the insert when one is found.

Upvotes: 1
