Oracle PL/SQL Tutorial: Day 15
Introduction
------------
Performance matters greatly in large-scale data processing and applications. When these operations are performed the traditional way, for example with cursors and row-by-row loops, they can be slow.
Oracle PL/SQL offers a faster technique called bulk processing.
Bulk processing is a powerful technique that allows developers to perform batch operations efficiently by minimizing the context switches between SQL and PL/SQL engines. In data-intensive environments, understanding and using bulk processing can dramatically enhance application responsiveness and throughput.
Before starting this tutorial, you should already be familiar with PL/SQL collections and cursors.
This blog will cover everything you need to know about bulk processing in PL/SQL, including:
- What is bulk processing?
- Why use bulk processing?
- The three main techniques: BULK COLLECT, FORALL, and SAVE EXCEPTIONS
- Performance comparisons
- Real-world examples using Oracle’s HR schema
- Best practices and caveats
Bulk processing holds data in memory while it works, so very large batches can put pressure on memory and CPU; keep batch sizes reasonable.
What is Bulk Processing?
------------------------
Bulk processing refers to the PL/SQL feature that allows you to perform operations on collections of data in a single context switch, rather than row-by-row.
There are three main components:
- BULK COLLECT – for retrieving multiple rows in one go
- FORALL – for performing DML operations (INSERT, UPDATE, DELETE) in bulk
- SAVE EXCEPTIONS – for capturing errors during bulk operations.
Why Use Bulk Processing?
------------------------
- Reduce context switching between the PL/SQL and SQL engines.
- Improve performance for large volumes of data.
- Lower memory consumption through efficient batch sizes.
- Cleaner code by working with collections.
Oracle Collections: The Foundation
----------------------------------
Bulk operations require collections. The most commonly used collection types are:
- PL/SQL Table (INDEX BY table)
- VARRAYs
- Nested Tables
Example:
TYPE emp_id_table IS TABLE OF employees.employee_id%TYPE INDEX BY PLS_INTEGER;
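As a quick reference, here is a minimal sketch declaring all three kinds against the HR EMPLOYEES table (the type and variable names are illustrative only):
DECLARE
  -- Associative array (INDEX BY table): sparse, needs no initialization
  TYPE emp_id_aa IS TABLE OF employees.employee_id%TYPE INDEX BY PLS_INTEGER;
  -- Nested table: must be constructed (and extended) before use
  TYPE emp_id_nt IS TABLE OF employees.employee_id%TYPE;
  -- VARRAY: fixed maximum size declared up front
  TYPE emp_id_va IS VARRAY(100) OF employees.employee_id%TYPE;

  v_aa emp_id_aa;
  v_nt emp_id_nt := emp_id_nt();
  v_va emp_id_va := emp_id_va();
BEGIN
  NULL; -- declarations only
END;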
BULK COLLECT
------------
Used to fetch multiple rows from a query into a collection with a single fetch.
Syntax:
SELECT column1, column2
BULK COLLECT INTO collection1, collection2
FROM table_name;
Example 1: Fetching employee data into collections
DECLARE
  TYPE emp_id_tab IS TABLE OF employees.employee_id%TYPE INDEX BY PLS_INTEGER;
  TYPE name_tab IS TABLE OF employees.last_name%TYPE INDEX BY PLS_INTEGER;
  v_emp_ids emp_id_tab;
  v_names   name_tab;
BEGIN
  SELECT employee_id, last_name
  BULK COLLECT INTO v_emp_ids, v_names
  FROM employees
  WHERE department_id = 60;

  FOR i IN 1 .. v_emp_ids.COUNT LOOP
    DBMS_OUTPUT.PUT_LINE('ID: ' || v_emp_ids(i) || ', Name: ' || v_names(i));
  END LOOP;
END;
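One detail worth remembering about Example 1: a SELECT ... BULK COLLECT INTO that matches no rows does not raise NO_DATA_FOUND; the collections simply come back empty. A small check you could add before the loop:
IF v_emp_ids.COUNT = 0 THEN
  DBMS_OUTPUT.PUT_LINE('No employees found in department 60.');
END IF;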
FORALL
------
FORALL sends a DML statement to the SQL engine once for each element of a collection. Unlike a loop, the whole batch of bind values is dispatched in a single context switch.
Syntax:
FORALL index IN collection.FIRST .. collection.LAST
DML_statement;
Example 2: Bulk insert into audit table
DECLARE
  TYPE emp_tab IS TABLE OF employees%ROWTYPE;
  v_emps emp_tab;
BEGIN
  SELECT * BULK COLLECT INTO v_emps
  FROM employees
  WHERE department_id = 60;

  FORALL i IN v_emps.FIRST .. v_emps.LAST
    INSERT INTO emp_audit(emp_id, name, action_date, action_type)
    VALUES (v_emps(i).employee_id,
            v_emps(i).first_name || ' ' || v_emps(i).last_name,
            SYSDATE,
            'COPY');
END;
SAVE EXCEPTIONS
---------------
When used with FORALL, SAVE EXCEPTIONS allows partial success in DML operations by handling errors for individual records.
Syntax:
FORALL index IN lower_bound .. upper_bound SAVE EXCEPTIONS
  DML_statement;
EXCEPTION
  WHEN OTHERS THEN
    FOR i IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
      DBMS_OUTPUT.PUT_LINE('Error at index ' || SQL%BULK_EXCEPTIONS(i).ERROR_INDEX ||
                           ': ' || SQLERRM(-SQL%BULK_EXCEPTIONS(i).ERROR_CODE));
    END LOOP;
Example 3: Bulk insert with error handling
DECLARE
  TYPE id_tab IS TABLE OF employees.employee_id%TYPE;
  v_ids id_tab := id_tab(100, 101, 999); -- 999 will cause error
BEGIN
  FORALL i IN v_ids.FIRST .. v_ids.LAST SAVE EXCEPTIONS
    INSERT INTO emp_bonus(emp_id, bonus)
    VALUES (v_ids(i), 1000);
EXCEPTION
  WHEN OTHERS THEN
    FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
      DBMS_OUTPUT.PUT_LINE('Failed record: ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                           ' - ' || SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
    END LOOP;
END;
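When SAVE EXCEPTIONS has collected one or more errors, Oracle raises ORA-24381 ("error(s) in array DML") once the FORALL finishes. Instead of WHEN OTHERS, you can trap that specific error with a named exception; here is a variant of Example 3 (the dml_errors name is illustrative):
DECLARE
  TYPE id_tab IS TABLE OF employees.employee_id%TYPE;
  v_ids id_tab := id_tab(100, 101, 999);
  dml_errors EXCEPTION;
  PRAGMA EXCEPTION_INIT(dml_errors, -24381); -- ORA-24381: error(s) in array DML
BEGIN
  FORALL i IN v_ids.FIRST .. v_ids.LAST SAVE EXCEPTIONS
    INSERT INTO emp_bonus(emp_id, bonus)
    VALUES (v_ids(i), 1000);
EXCEPTION
  WHEN dml_errors THEN
    FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
      DBMS_OUTPUT.PUT_LINE('Failed record: ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                           ' - ' || SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
    END LOOP;
END;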
Using LIMIT with BULK COLLECT
-----------------------------
The LIMIT clause fetches rows in manageable chunks, so a large result set is not loaded into memory all at once.
Example 4: BULK COLLECT with LIMIT
DECLARE
  CURSOR emp_cur IS SELECT employee_id FROM employees;
  TYPE emp_tab IS TABLE OF employees.employee_id%TYPE;
  v_emp_ids emp_tab;
BEGIN
  OPEN emp_cur;
  LOOP
    FETCH emp_cur BULK COLLECT INTO v_emp_ids LIMIT 10;
    EXIT WHEN v_emp_ids.COUNT = 0;

    FOR i IN 1 .. v_emp_ids.COUNT LOOP
      DBMS_OUTPUT.PUT_LINE('Processing ID: ' || v_emp_ids(i));
    END LOOP;
  END LOOP;
  CLOSE emp_cur;
END;
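The same chunked fetch can feed each batch straight into a FORALL, which is the usual pattern for very large loads. Below is a hedged sketch combining both techniques; it reuses the emp_audit table from Example 2 and assumes a batch size of 1000:
DECLARE
  CURSOR emp_cur IS SELECT employee_id, last_name FROM employees;
  TYPE emp_rec_tab IS TABLE OF emp_cur%ROWTYPE;
  v_emps emp_rec_tab;
BEGIN
  OPEN emp_cur;
  LOOP
    FETCH emp_cur BULK COLLECT INTO v_emps LIMIT 1000; -- 1000 rows per batch
    EXIT WHEN v_emps.COUNT = 0;

    -- one context switch per batch instead of one per row
    FORALL i IN 1 .. v_emps.COUNT
      INSERT INTO emp_audit(emp_id, name, action_date, action_type)
      VALUES (v_emps(i).employee_id, v_emps(i).last_name, SYSDATE, 'COPY');
  END LOOP;
  CLOSE emp_cur;
END;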
Real-World Example: Salary Hike for a Department
------------------------------------------------
Scenario: Give 10% hike to all employees in department 80.
DECLARE
  TYPE emp_id_tab IS TABLE OF employees.employee_id%TYPE;
  v_emp_ids emp_id_tab;
BEGIN
  SELECT employee_id BULK COLLECT INTO v_emp_ids
  FROM employees WHERE department_id = 80;

  FORALL i IN v_emp_ids.FIRST .. v_emp_ids.LAST
    UPDATE employees
    SET salary = salary * 1.10
    WHERE employee_id = v_emp_ids(i);
END;
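If you also need the new values back, FORALL supports a RETURNING ... BULK COLLECT INTO clause. A small extension of the example above (v_new_salaries is an illustrative name):
DECLARE
  TYPE emp_id_tab IS TABLE OF employees.employee_id%TYPE;
  TYPE salary_tab IS TABLE OF employees.salary%TYPE;
  v_emp_ids      emp_id_tab;
  v_new_salaries salary_tab;
BEGIN
  SELECT employee_id BULK COLLECT INTO v_emp_ids
  FROM employees WHERE department_id = 80;

  FORALL i IN v_emp_ids.FIRST .. v_emp_ids.LAST
    UPDATE employees
    SET salary = salary * 1.10
    WHERE employee_id = v_emp_ids(i)
    RETURNING salary BULK COLLECT INTO v_new_salaries;

  FOR i IN 1 .. v_new_salaries.COUNT LOOP
    DBMS_OUTPUT.PUT_LINE('New salary: ' || v_new_salaries(i));
  END LOOP;
END;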
Performance Comparison: Loop vs FORALL
--------------------------------------
Let’s say you want to insert 100,000 records.
Traditional Loop:
FOR i IN 1 .. 100000 LOOP
INSERT INTO my_table VALUES (...);
END LOOP;
Bulk FORALL:
FORALL i IN 1 .. 100000
INSERT INTO my_table VALUES (...);
Typical result:
- The FORALL version completes in a small fraction of the time taken by the row-by-row loop, often seconds versus minutes for 100,000 rows.
- CPU overhead is lower with bulk because there is one context switch instead of 100,000, and memory stays manageable when batches are kept to a LIMIT-sized chunk.
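To verify the difference on your own system, you can time both approaches with DBMS_UTILITY.GET_TIME, which returns elapsed time in hundredths of a second. A rough sketch, assuming a single-column test table my_table(n NUMBER):
DECLARE
  TYPE num_tab IS TABLE OF NUMBER;
  v_nums  num_tab := num_tab();
  v_start NUMBER;
BEGIN
  -- build 100,000 test values
  v_nums.EXTEND(100000);
  FOR i IN 1 .. 100000 LOOP
    v_nums(i) := i;
  END LOOP;

  -- row-by-row insert
  v_start := DBMS_UTILITY.GET_TIME;
  FOR i IN 1 .. v_nums.COUNT LOOP
    INSERT INTO my_table VALUES (v_nums(i));
  END LOOP;
  DBMS_OUTPUT.PUT_LINE('Loop:   ' || (DBMS_UTILITY.GET_TIME - v_start) || ' hsec');
  ROLLBACK;

  -- bulk insert
  v_start := DBMS_UTILITY.GET_TIME;
  FORALL i IN 1 .. v_nums.COUNT
    INSERT INTO my_table VALUES (v_nums(i));
  DBMS_OUTPUT.PUT_LINE('FORALL: ' || (DBMS_UTILITY.GET_TIME - v_start) || ' hsec');
  ROLLBACK;
END;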
Monitoring Bulk Performance
---------------------------
Use SQL%ROWCOUNT after a FORALL to check the total number of rows affected:
DBMS_OUTPUT.PUT_LINE('Rows processed: ' || SQL%ROWCOUNT);
Check the individual errors collected by SAVE EXCEPTIONS with:
SQL%BULK_EXCEPTIONS(i).ERROR_INDEX
SQL%BULK_EXCEPTIONS(i).ERROR_CODE
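For per-element counts after a FORALL, PL/SQL also populates SQL%BULK_ROWCOUNT. A brief sketch based on the salary-hike example:
DECLARE
  TYPE emp_id_tab IS TABLE OF employees.employee_id%TYPE;
  v_emp_ids emp_id_tab;
BEGIN
  SELECT employee_id BULK COLLECT INTO v_emp_ids
  FROM employees WHERE department_id = 80;

  FORALL i IN v_emp_ids.FIRST .. v_emp_ids.LAST
    UPDATE employees
    SET salary = salary * 1.10
    WHERE employee_id = v_emp_ids(i);

  -- SQL%ROWCOUNT      : total rows updated by the whole FORALL
  -- SQL%BULK_ROWCOUNT : rows updated by each individual element
  DBMS_OUTPUT.PUT_LINE('Total rows: ' || SQL%ROWCOUNT);
  FOR i IN 1 .. v_emp_ids.COUNT LOOP
    DBMS_OUTPUT.PUT_LINE('Element ' || i || ' updated ' || SQL%BULK_ROWCOUNT(i) || ' row(s)');
  END LOOP;
END;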
Best Practices for Bulk Processing
----------------------------------
- Use BULK COLLECT with a LIMIT clause for large data sets.
- Always initialize and extend collections when required (see the initialization sketch after this list).
- Avoid unnecessary memory use—don’t fetch more than you need.
- Use SAVE EXCEPTIONS to handle partial errors.
- Index your DML tables for faster operations.
- Use FORALL for DMLs and BULK COLLECT for SELECTs.
- Profile performance using DBMS_PROFILER or SQL Trace.
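Regarding initialization: nested tables and VARRAYs are atomically null until constructed and must be extended before you assign elements by subscript (associative arrays need neither step). A minimal sketch:
DECLARE
  TYPE name_tab IS TABLE OF employees.last_name%TYPE; -- nested table type
  v_names name_tab;            -- atomically null until initialized
BEGIN
  v_names := name_tab();       -- initialize with the constructor
  v_names.EXTEND(3);           -- make room for three elements
  v_names(1) := 'King';
  v_names(2) := 'Kochhar';
  v_names(3) := 'De Haan';
  DBMS_OUTPUT.PUT_LINE('Count: ' || v_names.COUNT);
END;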
When NOT to Use Bulk Processing
-------------------------------
- For small datasets (bulk adds complexity with minimal gain)
- When using triggers on the table being modified (may fire multiple times)
- When needing row-by-row precision/error control
Common Errors
-------------
- ORA-22160: element at index does not exist (referencing a collection subscript that was never assigned or extended)
- ORA-06502: PL/SQL numeric or value error (type or size mismatch)
- ORA-06550: compilation error (wrong syntax or a missing declaration)
Troubleshooting Tips
--------------------
- Use DBMS_OUTPUT to debug collections.
- Validate collection counts before indexing.
- Use SQL%BULK_EXCEPTIONS for post-mortem debugging.
- Break large loops into chunks using LIMIT.
Conclusion
----------
Bulk processing is essential for high-performance PL/SQL programming. It drastically reduces context switches between SQL and PL/SQL engines, making batch operations significantly faster and more efficient.
With BULK COLLECT, you can retrieve multiple rows in a single query. Using FORALL, you can process inserts, updates, or deletes in batches. SAVE EXCEPTIONS provides robust error-handling to identify and recover from individual failures during bulk operations.
By understanding how and when to use these techniques, you can build scalable, maintainable, and performant PL/SQL applications. When working with millions of records, the difference between row-by-row and bulk processing isn't just performance—it's the difference between success and failure of your application.
Invest time in mastering bulk operations—they are a cornerstone of professional PL/SQL development in Oracle.