
Oracle SQL*Loader

What is SQL*Loader and how is it useful for loading external files into the database?

SQL*Loader loads data from external files into tables of an Oracle database. It has a powerful data parsing engine that places few limitations on the format of the data in the data file.
The following is the procedure to load data from a third-party database into Oracle using SQL*Loader.

  1. Convert the data into a flat file using the third-party database's own commands.
  2. Create the table structure in the Oracle database using appropriate datatypes.
  3. Write a control file describing how to interpret the flat file and the options to load the data.
  4. Execute the SQL*Loader utility, specifying the control file as a command-line argument.

Input data file for SQL*Loader
This is the input text file that contains the data that needs to be loaded into an Oracle table. Each record needs to be on a separate line, and the column values should be delimited by a common delimiter character. For some of the examples below, we'll use the following Employee.txt file to upload data to the Employee table.
$ cat Employee.txt
100,Thomas,Sales,5000
200,Jason,Technology,5500
300,Mayla,Technology,7000
400,Nisha,Marketing,9500
500,Randy,Technology,6000
501,Ritu,Accounting,5400

SQL*Loader Control File
This contains the instructions for the SQLLDR utility. It tells SQLLDR the location of the input file, the format of the input file, and other optional metadata required by SQLLDR to upload the data into Oracle tables.
$ cat Example1.ctl
load data
 infile '/Home/BaluniGroups/Employee.txt'
 Into Table Employee
 Fields Terminated By ","
 ( ID, Name, Dept, Salary )
The above control file indicates the following:
§  Infile – indicates the location of the input data file
§  Into Table – indicates the name of the table into which this data should be inserted
§  Fields Terminated By – indicates the delimiter used in the input file to separate the fields
§  ( ID, Name, Dept, Salary ) – lists the names of the columns in the table into which the data should be uploaded

1. Upload file Using SQL*Loader

First, create the Employee Table as shown below.
SQL> Create Table Employee
(ID Integer, Name Varchar2(10), Dept Varchar2(15), Salary Integer, Hiredon Date);
Next, create the control file that explains what needs to be uploaded and where.
$ cat Sqlldr-add-new.ctl
load data
 Infile '/Home/BaluniGroups/Employee.txt'
 Into Table Employee
 Fields Terminated By ","
 ( ID,Name,Dept,Salary )

Note: If the values inside the data file are enclosed in double quotes, use this in your control file: Fields Terminated By "," Optionally Enclosed By '"'
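For example, a data file with quoted values (the Employee-quoted.txt name and contents below are made up for illustration) could be loaded with a control file along these lines:
$ cat Employee-quoted.txt
100,"Thomas","Sales",5000
200,"Jason","Technology",5500
$ cat sqlldr-quoted.ctl
load data
 infile '/Home/BaluniGroups/Employee-quoted.txt'
 Into Table Employee
 Fields Terminated By "," Optionally Enclosed By '"'
 ( ID, Name, Dept, Salary )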
Note: If you don't have the table created yet, you'll get the following error message:
SQL*Loader-941: Error during describe of table EMPLOYEE
ORA-04043: object EMPLOYEE does not exist
You can pass the username and password to the SQLLDR command in either of the following formats. As you can see below, both of these will prompt you for the control file location, since it was not given on the command line.
$ sqlldr scott/tiger
(or)
$ sqlldr userid=scott/tiger
control =
SQL*Loader-287: No control file name specified.
Execute the SQLLDR command to upload these new records into the empty table by specifying both the userid/password and the control file location, as shown below.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-add-new.ctl
Commit point reached - logical record count 5

Verify that the records were created in the database.
SQL> Select * From Employee;
        ID NAME       DEPT                SALARY HIREDON
---------- ---------- --------------- ---------- -------
       100 Thomas     Sales                 5000
       200 Jason      Technology            5500
       300 Mayla      Technology            7000
       400 Nisha      Marketing             9500
       500 Randy      Technology            6000

This will create an output log file with the same name as the control file, but with a .log extension (instead of .ctl). Partial output is shown below.
$ cat sqlldr-add-new.log
Control File:   /Home/BaluniGroups/sqlldr-add-new.ctl
Data File:      /Home/BaluniGroups/Employee.txt
Table EMPLOYEE:
  5 Rows successfully loaded.
  0 Rows not loaded due to data errors.
  0 Rows not loaded because all WHEN clauses were failed.
  0 Rows not loaded because all fields were null.
Elapsed time was:     00:00:00.04
CPU time was:         00:00:00.00
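If you want the log written to a specific file instead of the default name, you can supply the LOG parameter on the command line (a small sketch; the path shown is just an example):
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-add-new.ctl log=/tmp/employee-load.log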

2. Inserting Additional Records (APPEND)

Let us say you want to add two new employees to the Employee table from the following NewEmployee.txt file.
$ vi NewEmployee.txt
600,Ritu,Accounting,5400
700,Jessica,Marketing,7800
If you create a control file similar to the previous example, you will get the following error message.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-add-more.ctl
SQL*Loader-601: For INSERT option, table must be empty.  Error on table EMPLOYEE
The above indicates that the table should be empty before you can upload data using SQL*Loader's default INSERT option.
If you would like to insert more data into the table without deleting the existing rows, use the APPEND keyword as shown in the following control file.
$ vi sqlldr-append-more.ctl
load data
 Infile '/Home/BaluniGroups/NewEmployee.txt'
 Append
 Into Table Employee
 Fields Terminated By ","
 ( ID, Name, Dept, Salary )
Now run SQLLDR with this control file to append the data.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-append-more.ctl
Commit point reached - logical record count 2
Verify that the records were appended successfully.
SQL> Select * from Employee;
        ID NAME       DEPT                SALARY HIREDON
---------- ---------- --------------- ---------- -------
       100 Thomas     Sales                 5000
       200 Jason      Technology            5500
       300 Mayla      Technology            7000
       400 Nisha      Marketing             9500
       500 Randy      Technology            6000
       600 Ritu       Accounting            5400
       700 Jessica    Marketing             7800
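Note: Besides APPEND, the load method keyword before "Into Table" can be REPLACE (delete the existing rows, then load) or TRUNCATE (truncate the table, then load). A minimal sketch using TRUNCATE with the same file:
load data
 Infile '/Home/BaluniGroups/NewEmployee.txt'
 Truncate
 Into Table Employee
 Fields Terminated By ","
 ( ID, Name, Dept, Salary )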

3. Data inside the Control File using BEGINDATA

You can also specify the data directly inside the control file itself using the BEGINDATA keyword, i.e., anything that comes after BEGINDATA will be treated as data to be uploaded to the table, as shown below.
$ cat sqlldr-add-new-with-data.ctl
load data
 Infile *
 Into Table Employee
 Fields Terminated By ","
 ( ID, Name, Dept, Salary )
Begindata
100,Thomas,Sales,5000
200,Jason,Technology,5500
300,Mayla,Technology,7000
400,Nisha,Marketing,9500
500,Randy,Technology,6000
Note: The Infile is '*' in this case, as there is no separate input data file for this example.
Execute SQLLDR to upload the data from the control file.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-add-new-with-data.ctl

4. Date format and Different Delimiter

This example shows how to specify a date format in the control file and how to handle different delimiters in a data file.
The following example uses different delimiters ($ after the name, ^ after the department).
$ cat Employee-date.txt
100,Thomas$Sales^5000,31-JAN-2008
200,Jason$Technology^5500,01-Feb-2005
300,Mayla$Technology^7000,10-Aug-2000
400,Nisha$Marketing^9500,12-Dec-2011
500,Randy$Technology^6000,01-JAN-2007
Create the following control file and indicate the delimiter for each field using "Terminated By", as shown below.
$ cat sqlldr-date.ctl
load data
 Infile '/Home/BaluniGroups/Employee-date.txt'
 Into Table Employee
 Fields Terminated By ","
 ( ID, Name Terminated By "$", Dept terminated By "^", Salary, hiredon DATE "dd-mon-yyyy" )
Load the data using SQLLDR as shown below.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-date.ctl
Verify that the data got loaded properly as shown below.
SQL> Select * From Employee;
        ID NAME       DEPT                SALARY HIREDON
---------- ---------- --------------- ---------- ---------
       100 Thomas     Sales                 5000 31-JAN-08
       200 Jason      Technology            5500 01-FEB-05
       300 Mayla      Technology            7000 10-AUG-00
       400 Nisha      Marketing             9500 12-DEC-11
       500 Randy      Technology            6000 01-JAN-07
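The mask given after the DATE keyword is a standard Oracle date format model, so other layouts can be handled the same way. As a sketch, if the hire date in a (hypothetical) data file looked like 2008-01-31 14:30:00, the field definition could be written as:
 ( ID, Name Terminated By "$", Dept Terminated By "^", Salary,
   Hiredon DATE "yyyy-mm-dd hh24:mi:ss" )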

5. Fixed Length Data Upload

If you have a data file with fixed-length data (i.e., without any delimiter), you can use this example to upload the data.
For this example, let us use the following file, which has fixed-length data: the first three characters are always the employee ID, the next five characters are always the employee name, and so on.
$ cat Employee-fixed.txt
200JasonTechnology5500
300MaylaTechnology7000
400NishaTechnology9500
500RandyTechnology6000
Create the following control file, where you specify the position of each field using the "Position(start:end)" syntax, as shown below.
$ cat sqlldr-fixed.ctl
load data
 infile '/Home/BaluniGroups/Employee-fixed.txt'
 Into Table Employee
 Fields Terminated By ","
 ( ID Position(1:3), Name Position(4:8), Dept Position(9:18), Salary Position(19:22) )
Load this fixed-length data using SQLLDR as shown below.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-fixed.ctl
Verify that the data got loaded.
SQL> Select * From Employee;
        ID NAME       DEPT                SALARY HIREDON
---------- ---------- --------------- ---------- ---------
       200 Jason      Technology            5500
       300 Mayla      Technology            7000
       400 Nisha      Technology            9500
       500 Randy      Technology            6000
6. Change the data during upload
You can also massage the data and change it during upload based on certain rules.
In the following control file:
§  ID is incremented by 999 before loading, i.e., if the employee ID is 100 in the data file, it will be loaded as 1099.
§  The name is converted to upper case before loading. This uses the UPPER function.
§  If the department contains the value "Technology", it is changed to "Techies". This uses the DECODE function.

$ cat sqlldr-change-data.ctl
Load Data
 Infile '/Home/BaluniGroups/Employee.txt'
 Into Table Employee
 Fields Terminated By ","
 (ID ":ID+999",
   Name "Upper(:Name)",
   Dept  "Decode(:Dept,'Technology','Techies', :Dept)",
   Salary
  )
Load the data using this control file which will massage the data before uploading it.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-change-data.ctl
Verify that the data got changed while loading as per our rules.
SQL> Select * From Employee;
        ID NAME       DEPT                SALARY HIREDON
---------- ---------- --------------- ---------- ---------
      1099 THOMAS     Sales                 5000
      1199 JASON      Techies               5500
      1299 MAYLA      Techies               7000
      1399 NISHA      Marketing             9500
      1499 RANDY      Techies               6000
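The same double-quoted expression syntax accepts other SQL functions and arithmetic as well. For instance, a sketch (hypothetical, not part of the original example) that trims whitespace around the name and adds a flat 500 to every salary:
 ( ID,
   Name "Trim(:Name)",
   Dept,
   Salary ":Salary + 500"
 )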
7. Load data from multiple files
To load data from multiple files, you just have to specify multiple Infile clauses in the control file.
The following control file loads data from two different data files (Employee.txt and NewEmployee.txt) to the Employee Table.
$ cat sqlldr-add-multiple.ctl
Load Data
 Infile '/Home/BaluniGroups/Employee.txt'
 Infile '/Home/BaluniGroups/NewEmployee.txt'
 Into Table Employee
 Fields Terminated By ","
 (ID, Name, Dept, Salary )
Load the data using this control file which will upload data from multiple data files as shown below.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-add-multiple.ctl
Commit point reached - logical record count 5
Commit point reached - logical record count 7
8. Load data to Multiple Tables
Create another table called Bonus, which will have employee ID and bonus columns.
Create Table Bonus (ID integer, Bonus Integer);
Create the Employee-bonus.txt data file, which contains the fields ID, name, department, salary, and bonus.
$ cat Employee-bonus.txt
100 Thomas Sales      5000 1000
200 Jason  Technology 5500 2000
300 Mayla  Technology 7000 2000
400 Nisha  Marketing  9500 1000
500 Randy  Technology 6000 3000
Create the control file as shown below, which will upload the data from the above file into two different tables. As shown below, you need two "Into Table" clauses, and you specify the positions of the data that should be used to load each column.
$ cat sqlldr-multiple-Tables.ctl
load data
 infile '/Home/BaluniGroups/Employee-bonus.txt'
 Into Table Employee
 ( ID position(1:3),
   Name position(5:10),
   Dept position(12:21),
   Salary position(23:26))
 Into Table Bonus
 ( ID position(1:3),
   Bonus Position(28:31))
Load the data to multiple Tables using this control file as shown below.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-multiple-Tables.ctl
Verify that the data got loaded to multiple Tables successfully.
SQL> Select * From Employee;
        ID NAME       DEPT                SALARY HIREDON
---------- ---------- --------------- ---------- ---------
       100 Thomas     Sales                 5000
       200 Jason      Technology            5500
       300 Mayla      Technology            7000
       400 Nisha      Marketing             9500
       500 Randy      Technology            6000

SQL> Select * From Bonus;
        ID      BONUS
---------- ----------
       100       1000
       200       2000
       300       2000
       400       1000
       500       3000
9. Handling Bad (Rejected) Records
In the following example, we have two bad records: employees 300 and 500 have a non-numeric value in the Salary column.
$ cat Employee-bad.txt
100,Thomas,Sales,5000
200,Jason,Technology,5500
300,Mayla,Technology,7K
400,Nisha,Marketing,9500
500,Randy,Technology,6K
Use the following control file for this example.
$ cat sqlldr-bad.ctl
load data
 infile '/Home/BaluniGroups/Employee-bad.txt'
 Into Table Employee
 Fields Terminated By ","
 ( ID, Name, Dept, Salary )
Load the data (including the invalid records) using this control file as shown below.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-bad.ctl
Commit point reached - logical record count 5
As you can see from the above output, it still says "logical record count 5", but you should check the log file to see whether any records were rejected.
The log file indicates that 2 records were rejected, as shown below:
Control File:   /Home/BaluniGroups/sqlldr-bad.ctl
Data File:      /Home/BaluniGroups/Employee-bad.txt
 Bad File:     /Home/BaluniGroups/Employee-bad.bad
 Discard File:  none specified
Table EMPLOYEE:
 3 Rows successfully loaded.
 2 Rows not loaded due to data errors.
By default, the rejected records are stored in a file that has the same name as the data file, but with a .bad extension.
$ cat Employee-bad.bad
300,Mayla,Technology,7K
500,Randy,Technology,6K
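By default, SQL*Loader also aborts the load after 50 rejected records. The ERRORS command-line parameter changes that limit, and BAD lets you name the bad file explicitly (a sketch with example values):
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-bad.ctl errors=100 bad=/tmp/employee-rejects.bad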
As you can see below, the Employee table has only 3 records (the other 2 were rejected).
SQL> Select * From Employee;
        ID NAME       DEPT                SALARY HIREDON
---------- ---------- --------------- ---------- ---------
       100 Thomas     Sales                 5000
       200 Jason      Technology            5500
       400 Nisha      Marketing             9500
10. Load Specific Rows from a Data File
If you want to load only specific records from a data file, use the WHEN clause in the control file.
Add the WHEN clause after the "Into Table" line. In the following control file, the WHEN clause indicates that only the records that have Dept = 'Technology' will be loaded.
$ cat sqlldr-when.ctl
load data
 infile '/Home/BaluniGroups/Employee.txt'
 Into Table Employee
 when Dept = 'Technology'
 Fields Terminated By ","
 ( ID, Name, Dept, Salary )
Load the selected data (only the "Technology" records) using this control file as shown below.
$ sqlldr scott/tiger control=/Home/BaluniGroups/sqlldr-when.ctl
Commit point reached - logical record count 5
As you can see from the above output, it still says "logical record count 5", but you should check the log file to see how many records were loaded and how many were discarded because they didn't match the WHEN condition.
The following lines from the log file show that 5 records were read and 2 of them were discarded because they didn't match the WHEN condition.
Discard File:  none specified
Total logical records read:             5
Total logical records discarded:        2
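The log also shows "Discard File: none specified", which means the discarded records are simply dropped. To capture them, you can name a discard file in the control file with DISCARDFILE (or with the DISCARD command-line parameter). A minimal sketch, with a made-up .dsc path:
load data
 infile '/Home/BaluniGroups/Employee.txt'
 discardfile '/Home/BaluniGroups/Employee.dsc'
 Into Table Employee
 when Dept = 'Technology'
 Fields Terminated By ","
 ( ID, Name, Dept, Salary )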
Verify that only the selected records were loaded into the table.
SQL> Select * From Employee;
        ID NAME       DEPT                SALARY HIREDON
---------- ---------- --------------- ---------- ---------
       200 Jason      Technology            5500
       300 Mayla      Technology            7000
       500 Randy      Technology            6000
Problem - Skip a Column when using SQL*Loader
Consider the scenario below:
Table T1 (f1, f2, f3);
Data file:
a|b|c|d
w|x|y|z
I want to load this data, skipping the second field, as follows:
f1    f2    f3
---   ---   ---
a     d     c
w     z     y

Solution-
Define the column you want to skip as FILLER. Keep in mind that the order of the columns in the control file is typically the order in which they appear in the data file. If a name matches a column in the table, that's where the data will go.
 (
  f1 CHAR,  -- 1st field in the file, goes to the column named f1 in the table
  X FILLER, -- 2nd field in the file, ignored
  f3 CHAR,  -- 3rd field in the file, goes to the column named f3 in the table
  f2 CHAR   -- 4th field in the file, goes to the column named f2 in the table
)
In other words, the order of the columns in the control file matches the order in which they appear in the data file, not their order in the table. The table column is matched by name, not by order.
EDIT - I added some comments for explanation, but I believe they can't be in that position in the actual file. See below for a full example:
Create Table:
CREATE TABLE T1
(
  F1  VARCHAR2(50 BYTE),
  F2  VARCHAR2(50 BYTE),
  F3  VARCHAR2(50 BYTE)
);
The control file, Example.ctl:
load data
infile *
truncate
Into Table t1
Fields Terminated By '|' trailing nullcols
(
f1 CHAR,
x FILLER,
f3 CHAR,
f2 CHAR
)
BEGINDATA
a|b|c|d
w|x|y|z
Run it:
C:\temp>sqlldr userid=login/password@database control=Example.ctl
SQL*Loader: Release 11.2.0.1.0 - Production on Wed Apr 22 11:25:49 2015
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
Commit point reached - logical record count 2

Select from the table to confirm the result: the first row loads as f1 = a, f2 = d, f3 = c and the second as f1 = w, f2 = z, f3 = y, i.e. the second field of each record was skipped.

Can one skip header records while loading?
One can skip unwanted header records, or continue an interrupted load (for example, if you ran out of space), by specifying the SKIP=n keyword, where n specifies the number of logical rows to skip. Look at these examples:
OPTIONS (SKIP=5)
LOAD DATA
INFILE *
INTO TABLE load_positional_data
(  data1 POSITION(1:5),
   data2 POSITION(6:15)
)
BEGINDATA
11111AAAAAAAAAA
22222BBBBBBBBBB

sqlldr userid=ora_id/ora_passwd control=control_file_name.ctl skip=4
If you are continuing a multiple-table direct path load, you may need to use the CONTINUE_LOAD clause instead of the SKIP parameter. CONTINUE_LOAD allows you to specify a different number of rows to skip for each of the tables you are loading.
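As a rough structural sketch (the file name and skip counts below are invented for illustration), the control file starts with CONTINUE_LOAD DATA instead of LOAD DATA, and each INTO TABLE clause carries its own SKIP:
CONTINUE_LOAD DATA
INFILE 'Employee-bonus.txt'
INTO TABLE Employee
SKIP 3
( ID position(1:3), Name position(5:10), Dept position(12:21), Salary position(23:26) )
INTO TABLE Bonus
SKIP 2
( ID position(1:3), Bonus position(28:31) )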

How does one use SQL*Loader to load images, sound clips and documents?
SQL*Loader can load data from a "primary data file", an SDF (secondary data file, used for loading nested tables and VARRAYs), or a LOBFILE. The LOBFILE method provides an easy way to load documents, photos, images, and audio clips into BLOB and CLOB columns. Look at this example:
Given the following Table:
CREATE TABLE image_table (
       image_id   NUMBER(5),
       file_name  VARCHAR2(30),
       image_data BLOB);
Control File:
LOAD DATA
INFILE *
INTO TABLE image_table
REPLACE
FIELDS TERMINATED BY ','
(
 image_id   INTEGER EXTERNAL(5),
 file_name  CHAR(30),
 image_data LOBFILE (file_name) TERMINATED BY EOF
)
BEGINDATA
001,image1.gif
002,image2.jpg
003,image3.jpg
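After running SQLLDR with this control file, you can confirm that the LOB columns were populated, for example by checking their lengths with the standard DBMS_LOB package:
SQL> SELECT image_id, file_name, DBMS_LOB.GETLENGTH(image_data) AS image_bytes
     FROM image_table;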






