How do BULK COLLECT and bulk binds improve performance?
A scalar variable can hold only one value at a time, so fetching rows one by one through such variables is slow. Bulk operations are useful in this situation: they use collection (array) variables that can hold many values at once, an entire column's worth, which is much faster.
BULK COLLECT:
Bulk binds can improve performance when loading collections from a query. The BULK COLLECT INTO construct binds the output of the query to the collection, so a SELECT statement can retrieve multiple rows with a single fetch, improving the speed of data retrieval.
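As a minimal sketch of the idea (the employees table and its last_name column here are hypothetical, not part of the examples below):

```sql
DECLARE
  TYPE name_tab IS TABLE OF employees.last_name%TYPE;
  l_names name_tab;
BEGIN
  -- A single fetch (one context switch) loads every row into the collection.
  SELECT last_name BULK COLLECT INTO l_names
    FROM employees;
  DBMS_OUTPUT.PUT_LINE(l_names.COUNT || ' names fetched');
END;
```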
FORALL:
The FORALL syntax allows us to bind the contents of a collection to a single DML statement, so the DML runs for each row in the collection without requiring a context switch each time. This lets INSERTs, UPDATEs, and DELETEs that use collections change multiple rows of data very quickly.
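A minimal FORALL sketch (the employees table and the ID values are hypothetical):

```sql
DECLARE
  TYPE id_tab IS TABLE OF NUMBER;
  l_ids id_tab := id_tab(10, 20, 30);  -- hypothetical department IDs
BEGIN
  -- One context switch sends all three DELETEs to the SQL engine at once.
  FORALL i IN 1 .. l_ids.COUNT
    DELETE FROM employees
     WHERE department_id = l_ids(i);
END;
```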
Difference between FORALL and FOR LOOP?
• The FOR LOOP statement executes a sequence of statements multiple times. Use a FOR LOOP when you want to execute the loop body a fixed number of times.
• The FORALL statement issues a series of static or dynamic DML statements, usually much faster than an equivalent FOR loop. It requires some setup code, because each iteration of the loop must use values from one or more collections in its VALUES or WHERE clauses.
• FORALL allows us to bind the contents of a collection to a single DML statement, allowing the DML to be run for each row in the collection without requiring a context switch each time.
Context Switching
When the PL/SQL runtime engine processes a block of code, it
executes the procedural statements within its own engine, but passes the SQL
statements on to the SQL engine. The SQL layer executes the SQL statements and
then returns information to the PL/SQL engine.
This transfer of control between the PL/SQL and SQL engines
is called a context switch. Each time a switch
occurs, there is additional overhead. There are a number of scenarios in which
many switches occur and performance degrades.
With FORALL and BULK COLLECT, however, you can fine-tune the
way these two engines communicate, effectively telling the PL/SQL engine to
compress multiple context switches into a single switch, thereby improving the
performance of your applications.
Using BULK COLLECT, we replace INTO with BULK COLLECT INTO (whether in SELECT INTO, FETCH INTO, or EXECUTE IMMEDIATE ... INTO), and after the INTO keyword we supply one or more collections rather than a record or a list of variables.
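For example, with dynamic SQL the same substitution applies (a sketch; the collection type here is ad hoc):

```sql
DECLARE
  TYPE name_tab IS TABLE OF VARCHAR2(128);
  l_names name_tab;
BEGIN
  -- EXECUTE IMMEDIATE ... INTO becomes BULK COLLECT INTO a collection.
  EXECUTE IMMEDIATE
    'SELECT object_name FROM user_objects'
    BULK COLLECT INTO l_names;
  DBMS_OUTPUT.PUT_LINE(l_names.COUNT || ' objects fetched');
END;
```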
Bulk Inserts with Oracle
CREATE OR REPLACE PROCEDURE TestProc
AS
  TimeStart Varchar2(20);
  TimeEnd   Varchar2(20);
BEGIN
  SELECT TO_CHAR(SYSDATE, 'HH:MI:SS') INTO TimeStart FROM Dual;
  DBMS_OUTPUT.PUT_LINE(TimeStart);
  FOR x IN (SELECT * FROM All_Objects)
  LOOP
    -- One INSERT (one context switch) per row.
    INSERT INTO Test_Object
      (OWNER, OBJECT_NAME, SUBOBJECT_NAME, OBJECT_ID, DATA_OBJECT_ID,
       OBJECT_TYPE, CREATED, LAST_DDL_TIME, TIMESTAMP, STATUS,
       TEMPORARY, GENERATED, SECONDARY, NAMESPACE, EDITION_NAME)
    VALUES
      (x.OWNER, x.OBJECT_NAME, x.SUBOBJECT_NAME, x.OBJECT_ID, x.DATA_OBJECT_ID,
       x.OBJECT_TYPE, x.CREATED, x.LAST_DDL_TIME, x.TIMESTAMP, x.STATUS,
       x.TEMPORARY, x.GENERATED, x.SECONDARY, x.NAMESPACE, x.EDITION_NAME);
  END LOOP;
  COMMIT;
  SELECT TO_CHAR(SYSDATE, 'HH:MI:SS') INTO TimeEnd FROM Dual;
  DBMS_OUTPUT.PUT_LINE(TimeEnd);
END TestProc;
Using BULK COLLECT
• Converting to collections and bulk processing can increase the volume and complexity of your code.
• Collections are an evolution of PL/SQL tables that let us manipulate many variables at once, as a unit.
• Collections, coupled with the BULK COLLECT and FORALL features, can dramatically increase the performance of data manipulation code within PL/SQL.
CREATE OR REPLACE PROCEDURE TestProcBulk
IS
  TimeStart Varchar2(20);
  TimeEnd   Varchar2(20);
  TYPE ARRAY IS TABLE OF All_Objects%ROWTYPE;
  l_Data ARRAY;
  CURSOR c IS SELECT * FROM All_Objects;
BEGIN
  SELECT TO_CHAR(SYSDATE, 'HH:MI:SS') INTO TimeStart FROM Dual;
  DBMS_OUTPUT.PUT_LINE(TimeStart);
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_Data LIMIT 100;
    -- One context switch inserts the whole batch of up to 100 rows.
    FORALL i IN 1 .. l_Data.COUNT
      INSERT INTO Test_Object VALUES l_Data(i);
    -- Test %NOTFOUND only after processing, so the final partial batch
    -- is not skipped.
    EXIT WHEN c%NOTFOUND;
  END LOOP;
  CLOSE c;
  SELECT TO_CHAR(SYSDATE, 'HH:MI:SS') INTO TimeEnd FROM Dual;
  DBMS_OUTPUT.PUT_LINE(TimeEnd);
END TestProcBulk;
Eliminate CURSOR LOOP
You can eliminate the cursor loop altogether; the resulting procedure is more compact, and the performance is much the same.
CREATE OR REPLACE PROCEDURE TestProcBulk
IS
  TimeStart Varchar2(20);
  TimeEnd   Varchar2(20);
  TYPE TObjectTable IS TABLE OF All_Objects%ROWTYPE;
  ObjectTable TObjectTable;
BEGIN
  SELECT TO_CHAR(SYSDATE, 'HH:MI:SS') INTO TimeStart FROM Dual;
  DBMS_OUTPUT.PUT_LINE(TimeStart);
  -- Load the whole result set in one pass (no LIMIT, so watch memory use).
  SELECT * BULK COLLECT INTO ObjectTable
    FROM All_Objects;
  -- 1 .. COUNT handles an empty collection gracefully, where
  -- First..Last would yield NULL bounds.
  FORALL x IN 1 .. ObjectTable.COUNT
    INSERT INTO Test_Object VALUES ObjectTable(x);
  SELECT TO_CHAR(SYSDATE, 'HH:MI:SS') INTO TimeEnd FROM Dual;
  DBMS_OUTPUT.PUT_LINE(TimeEnd);
END TestProcBulk;
The LIMIT clause gives you the benefits of bulk binds without consuming all of the server's memory. The following code shows how to chunk through the data in a large table.
SET SERVEROUTPUT ON
DECLARE
  TYPE VBulk_Collect IS TABLE OF Test_Object%ROWTYPE;
  l_tab VBulk_Collect;
  CURSOR c IS SELECT * FROM Test_Object;
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_tab LIMIT 10000;
    EXIT WHEN l_tab.COUNT = 0;
    -- Process contents of collection here.
    DBMS_OUTPUT.PUT_LINE(l_tab.COUNT || ' Rows Processed');
  END LOOP;
  CLOSE c;
END;
SQL> SET SERVEROUTPUT ON
SQL> DECLARE
2 TYPE VBulk_Collect IS TABLE OF Test_Object%ROWTYPE;
3 l_tab VBulk_Collect;
4 CURSOR c IS SELECT * FROM Test_Object;
5 BEGIN
6 OPEN c;
7 LOOP
8 FETCH c
9 BULK COLLECT INTO l_tab LIMIT 10000;
10 EXIT WHEN l_tab.count = 0;
11
12 -- Process contents of collection here.
13 DBMS_OUTPUT.put_line(l_tab.count || ' Rows Processed');
14 END LOOP;
15 CLOSE c;
16 END;
17 /
10000 Rows Processed
10000 Rows Processed
10000 Rows Processed
10000 Rows Processed
10000 Rows Processed
5724 Rows Processed
PL/SQL procedure successfully completed.
Keep the following factoids in mind when using BULK COLLECT:
- In earlier Oracle releases you could use BULK COLLECT only with static SQL; now you can use it with both dynamic and static SQL.
- You can use BULK COLLECT keywords in any of the following clauses: SELECT INTO, FETCH INTO, and RETURNING INTO.
- A collection populated by BULK COLLECT is always filled sequentially from index value 1 (or is empty).
- If you are BULK COLLECTing into a varray or a nested table, you do not need to initialize or extend prior to executing the BULK COLLECT query. Oracle will take care of all of that for you.
- SELECT...BULK COLLECT will not raise NO_DATA_FOUND if no rows are found. Instead, you must check the contents of the collection to see if there is any data inside it.
- If the query returns no rows, the collection’s COUNT method will return 0.
- As with any operation involving collections, use of BULK COLLECT will likely increase the amount of PGA (Program Global Area) memory used by an application session.
- Use the LIMIT clause with BULK COLLECT to put a cap on the amount of PGA memory used by your query.
- You can specify the limit as a literal, expression or parameter. I suggest you avoid hard-coding the value.
- 100 is a good default or starting value for LIMIT. Setting the value higher, to say 500 or 1000, will probably not improve performance significantly but will increase PGA consumption. For very large volumes of data, such as millions of rows, you should experiment with higher limit values to see what kind of impact they have.
- You can fetch into a single collection of records or a series of collections (one for each expression returned in the SELECT list).
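Two of these points can be shown together in one small sketch: fetching into a series of collections (one per SELECT-list expression), and checking COUNT rather than expecting NO_DATA_FOUND:

```sql
DECLARE
  TYPE owner_tab IS TABLE OF All_Objects.OWNER%TYPE;
  TYPE name_tab  IS TABLE OF All_Objects.OBJECT_NAME%TYPE;
  l_owners owner_tab;
  l_names  name_tab;
BEGIN
  -- One collection per expression in the SELECT list.
  SELECT OWNER, OBJECT_NAME
    BULK COLLECT INTO l_owners, l_names
    FROM All_Objects
   WHERE 1 = 2;  -- deliberately returns no rows
  -- No NO_DATA_FOUND is raised; the collections are simply empty.
  IF l_owners.COUNT = 0 THEN
    DBMS_OUTPUT.PUT_LINE('No rows fetched, and no exception raised.');
  END IF;
END;
```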