Handling All Rows in Bulk Operations with Triggers in SQL Server

To process multiple rows correctly during bulk operations in SQL Server, use the INSERTED and DELETED pseudo-tables inside triggers. INSERTED holds the new values of rows affected by an INSERT or UPDATE, while DELETED holds the old values of rows affected by an UPDATE or DELETE. Because a trigger fires once per statement rather than once per row, set-based queries against these tables handle all affected rows in a single operation and avoid row-by-row processing.


Example:

CREATE TABLE Attendance (
    ID INT IDENTITY PRIMARY KEY,            -- Unique identifier for each record
    EmployeeID INT,                         -- ID of the employee (foreign key reference)
    PunchTime DATETIME,                     -- The time the employee punched in or out
    Status NVARCHAR(50),                    -- Status (e.g., 'Present', 'Absent', 'Late', etc.)
    CreatedDate DATETIME DEFAULT GETDATE()  -- Timestamp for when the attendance was recorded
);

 

CREATE TABLE Transactions (
    TransactionID INT IDENTITY PRIMARY KEY,
    TransactionDate DATETIME,
    OperationType NVARCHAR(10), -- 'INSERT', 'UPDATE', or 'DELETE'
    LogID INT,                  -- ID of the affected row in the Attendance table
    EmployeeID INT,
    PunchTime DATETIME,
    Status NVARCHAR(50)
);

  

CREATE TRIGGER trg_AfterBulkInsertUpdateDelete
ON Attendance
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- INSERT: rows exist in INSERTED only
    IF EXISTS (SELECT * FROM INSERTED) AND NOT EXISTS (SELECT * FROM DELETED)
    BEGIN
        INSERT INTO Transactions (TransactionDate, OperationType, LogID, EmployeeID, PunchTime, Status)
        SELECT GETDATE(), 'INSERT', ID, EmployeeID, PunchTime, Status
        FROM INSERTED;
    END

    -- DELETE: rows exist in DELETED only
    ELSE IF EXISTS (SELECT * FROM DELETED) AND NOT EXISTS (SELECT * FROM INSERTED)
    BEGIN
        INSERT INTO Transactions (TransactionDate, OperationType, LogID, EmployeeID, PunchTime, Status)
        SELECT GETDATE(), 'DELETE', ID, EmployeeID, PunchTime, Status
        FROM DELETED;
    END

    -- UPDATE: rows exist in both INSERTED and DELETED
    ELSE IF EXISTS (SELECT * FROM INSERTED) AND EXISTS (SELECT * FROM DELETED)
    BEGIN
        INSERT INTO Transactions (TransactionDate, OperationType, LogID, EmployeeID, PunchTime, Status)
        SELECT GETDATE(), 'UPDATE', i.ID, i.EmployeeID, i.PunchTime, i.Status
        FROM INSERTED i
        JOIN DELETED d ON i.ID = d.ID
        -- NULL-safe "any column changed" check: INTERSECT treats NULLs as equal,
        -- unlike <>, which would miss changes to or from NULL
        WHERE NOT EXISTS (
            SELECT i.EmployeeID, i.PunchTime, i.Status
            INTERSECT
            SELECT d.EmployeeID, d.PunchTime, d.Status
        );
    END
END;
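To see the set-based behavior in action, a quick test (assuming the tables and trigger above have been created; the sample values are purely illustrative) might look like this:

```sql
-- Insert three rows in one statement; the trigger fires once
-- and logs all three as 'INSERT'
INSERT INTO Attendance (EmployeeID, PunchTime, Status)
VALUES (101, '2024-01-15 09:00', 'Present'),
       (102, '2024-01-15 09:25', 'Late'),
       (103, NULL, 'Absent');

-- Update two rows in one statement; the trigger fires once
-- and both changes are logged as 'UPDATE'
UPDATE Attendance
SET Status = 'Present'
WHERE EmployeeID IN (102, 103);

-- Inspect the audit trail
SELECT OperationType, LogID, EmployeeID, Status
FROM Transactions
ORDER BY TransactionID;
```

Note that the row for employee 103 has a NULL PunchTime; its status change is still caught because the trigger's INTERSECT-based comparison handles NULLs correctly.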

 

Two caveats apply when loading data with BULK INSERT. First, BULK INSERT does not fire triggers by default; the FIRE_TRIGGERS option must be specified or the trigger above will be skipped entirely. Second, the BATCHSIZE option controls how many rows are committed per batch; with it, the trigger fires once per batch rather than once for the whole file, which helps manage large datasets by reducing locking contention.


Example

BULK INSERT Attendance
FROM 'D:\AttendanceData.csv'
WITH (
    BATCHSIZE = 1000,
    FIRE_TRIGGERS  -- required: BULK INSERT skips triggers by default
);


Using the INSERTED and DELETED tables for set-based processing, together with FIRE_TRIGGERS and an appropriate BATCHSIZE for bulk loads, ensures that triggers correctly handle all affected rows during bulk INSERT, UPDATE, or DELETE operations in SQL Server.


