The import utility inserts data from an input file into a table or a view. The utility performs inserts as if it were executing
INSERT statements. Just as with normal insert operations, DB2 validates the data against the table definitions, constraints (such as referential integrity and check constraints), and index definitions. Triggers whose conditions are satisfied are also activated.
The utility supports options and import modes that let you customize its behavior. The syntax diagram of the
import command is very long; Figure 12.10 shows only a portion of it. Please refer to the
DB2 Command Reference for the complete syntax diagram.
>>-IMPORT FROM--filename--OF--filetype-------------------------->

>--+-------------------------+---------------------------------->
   |            .-,--------. |
   |            V          | |
   '-LOBS FROM----lob-path-+-'

>--+-------------------------------+---------------------------->
   |              .--------------. |
   |              V              | |
   '-MODIFIED BY----filetype-mod-+-'

   .-ALLOW NO ACCESS----.
>--+--------------------+--+----------------------------+------->
   '-ALLOW WRITE ACCESS-'  '-COMMITCOUNT--+-n---------+-'
                                          '-AUTOMATIC-'

>--+---------------------+--+-------------+--------------------->
   '-+-RESTARTCOUNT-+--n-'  '-ROWCOUNT--n-'
     '-SKIPCOUNT----'

>--+-----------------+--+-----------+--------------------------->
   '-WARNINGCOUNT--n-'  '-NOTIMEOUT-'

>--+------------------------+----------------------------------->
   '-MESSAGES--message-file-'

>--+-+-INSERT---------+--INTO--+-table-name--+--------------------+-+-----+-->
   | +-INSERT_UPDATE--+        |             |  .-,-------------. | |     |
   | +-REPLACE--------+        |             |  V               | | |     |
   | '-REPLACE_CREATE-'        |             '-(--insert-column-+-)-'     |
   '-CREATE--INTO--table-name--| tblspace-specs |-------------------------'

tblspace-specs:

|--+-----------------------------------------------------------------------+--|
   '-IN--tablespace-name--+-----------------------+--+----------------------+-'
                          '-INDEX IN--tspace-name-'  '-LONG IN--tspace-name-'
Although the syntax diagram may seem complex, it is quite easy to understand and follow. Let's start with a simple
import command and discuss the mandatory options. To a certain degree, the
import command is structured much like the
export command: you have to specify the input file name, format of the file, and the target table name. For example:
import from employee.ixf of ixf
messages employee.out
insert into employee
This command takes the file
employee.ixf, which is in IXF format, as the input and inserts data into the
employee table. The import utility supports input files in ASC, DEL, IXF, and WSF formats. We also recommend that you specify the optional clause
messages to save the error and warning messages and the import status. In Section 12.4.4, Restarting a Failed Import, you will see that the message file can be used to identify where to restart an import operation.
The previous example uses
insert to indicate that new data is to be appended to the existing
employee table. Table 12.1 lists the modes supported by the import utility.
| Mode | Description |
| --- | --- |
| INSERT | Adds the imported data to the table without changing the existing table data. The target table must already exist. |
| INSERT_UPDATE | Adds the imported data to the target table, or updates existing rows that have matching primary keys. The target table must already exist and have a primary key. |
| CREATE | Creates the table and index definitions and inserts the row contents. The input file must be in IXF format, because this is the only format that stores table and index definitions. |
| REPLACE | Deletes all existing data from the table and inserts the imported data. The table and index definitions are not changed. |
| REPLACE_CREATE | If the table exists, this option behaves like REPLACE. If the table does not exist, it behaves like CREATE: it creates the table and index definitions and then inserts the row contents. This option also requires the input file to be in IXF format. |
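For example, insert_update could be used to refresh the employee table from a newer extract, updating rows whose primary key values already exist and inserting the rest. This is only a sketch; it assumes the employee table has a primary key defined:

import from employee.ixf of ixf
messages employee.out
insert_update into employee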
Figures 12.11, 12.12, and 12.13 demonstrate some of the import modes and other options.
In Figure 12.11, data from the DEL input file is imported into specific columns of the
empsalary table. The
warningcount option indicates that the utility will stop after 10 warnings have been received.
import from employee.del of del
messages empsalary.out
warningcount 10
replace into empsalary (salary, bonus, comm)
In Figure 12.12, the
import command deletes all the rows in the table (if table
newemployee exists) and inserts the row contents. If the
newemployee table does not exist, the command creates the table with definitions stored in the IXF input file and inserts the row contents. In addition to specifying the columns you want to import as demonstrated in Figure 12.11, you can also limit the number of rows to be imported using the
rowcount option. In Figure 12.12, the number of rows to import is limited to the first 1000 rows.
import from employee.ixf of ixf
messages employee.out
rowcount 1000
replace_create into newemployee
If the
create option is used, as in Figure 12.13, you can also specify the table space in which the new table is to be created. The
in clause tells DB2 to store the table data in a particular table space, and the
index in clause indicates where the indexes are to be stored.
import from employee.ixf of ixf
messages newemployee.out
create into newemployee in datats index in indexts
While the import utility is adding new rows to the table, the table by default is locked exclusively to block any read/write activities from other applications. This is the behavior of the
allow no access option. Alternatively, you can specify
allow write access in the command to allow concurrent read/write access to the target table. A less restrictive lock is acquired at the beginning of the import operation.
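For example, the following sketch allows other applications to keep reading from and writing to the employee table while the import runs (allow write access applies to the insert and insert_update modes, not to modes that replace or create the table):

import from employee.ixf of ixf
allow write access
messages employee.out
insert into employee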
Both the
allow write access and
allow no access options require some type of table lock. It is possible that the utility will be placed in lock-wait state and eventually will be terminated due to a lock timeout. You can specify the
notimeout option so that the utility will not time out while waiting for locks. This option supersedes the LOCKTIMEOUT database configuration parameter.
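For example, this sketch combines both options so the utility waits indefinitely for the locks it needs instead of failing with a lock timeout:

import from employee.del of del
allow write access
notimeout
messages employee.out
insert into employee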
The import utility inserts data into a table through normal insert operations. Therefore, changes made during the import are logged, and they are committed to the database upon successful completion of the import operation. By default, an import behaves like a non-atomic compound statement, in which more than one insert is grouped into a transaction. If any insert fails, the rest of the inserts are still committed to the database. Atomic and non-atomic compound statements are discussed in detail in Chapter 9, Leveraging the Power of SQL.
If you were to import a few million rows into a table, you would need to make sure there was enough log space to hold the insertions, because by default they are treated as one transaction. However, sometimes it is not feasible to allocate a large log space just for the import. You can specify the commitcount n option to force a commit after every n records are imported. With commitcount automatic, the utility commits automatically at appropriate times to avoid running out of active log space and to avoid lock escalation.
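For example, a sketch that lets DB2 decide when to commit rather than fixing an interval yourself:

import from employee.ixf of ixf
commitcount automatic
messages employee.out
insert into employee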
Figure 12.14 shows the messages captured during the following
import command. Note that a COMMIT is issued every 1,000 rows. The message file also serves as a very good progress indicator, because you can access this file while the utility is running.
import from employee.ixf of ixf
commitcount 1000
messages newemployee.out
create into newemployee in datats index in indexts
If you have import failures due to invalid input, for example, you can use the message file generated from an
import command that uses the
commitcount and
messages options to identify which record failed. Then you could issue the same
import command with restartcount n or skipcount n to start the import from record n+1. This is a very handy way to restart a failed import. Here is an example:
import from employee.ixf of ixf
commitcount 1000
skipcount 550
messages newemployee.out
create into newemployee in datats index in indexts
The import utility also has the
modified by clause to allow customization. Some modifiers supported in the export utility also apply to the import utility. Refer to the
DB2 Data Movement Utilities Guide and Reference for a complete listing specific to the import utility. The following sections describe some of the more useful modifiers.
Tables with generated columns or identity columns are defined so that column values are generated automatically when records are inserted into the tables. Since the import utility performs normal inserts under the covers, new values will be generated at the target server. Therefore, you need to decide whether the values stored in the source input file should be used or whether new values should be generated. The import utility supports a few file type modifiers for this purpose.
The file type modifier
generatedignore forces the import utility to ignore data for all generated columns present in the data file; the utility generates values for those columns itself. The file type modifier
identityignore behaves the same way for identity columns.
You can use the
generatedmissing modifier to inform the import utility that the input data file contains no data for the generated columns (not even NULLs), and the import utility will therefore generate a value for each row. The same behavior applies to the
identitymissing modifier for identity columns.
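For example, if the input file was produced without any columns for the generated or identity columns of the target table, a command such as the following tells the utility to generate those values itself. This is a sketch: newhires.del is a hypothetical input file, and combining both modifiers assumes the target table has both generated and identity columns:

import from newhires.del of del
modified by generatedmissing identitymissing
messages newhires.out
insert into employee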
If you are exporting LOB data in separate files (as described in Section 12.3.2, Exporting Large Objects), you need to tell the import utility the location and name of the files. Consider the following
import command.
import from mgrresume.ixf of ixf
lobs from c:\lobs1, c:\lobs2, c:\lobs3
modified by lobsinfile
commitcount 1000
messages mgrresume.out
create into newemployee in datats index in indexts long in lobts
This command takes
mgrresume.ixf as the input file. With the
lobsinfile modifier, the utility searches the paths specified in the
lobs from clause for the LOB files. Recall that each LOB value has a LOB location specifier (LLS) that identifies the location of the LOB within a file stored in one of the LOB paths.
Notice that an additional clause,
long in lobts, is added to the
create into option. It indicates that all LOB data is to be created and stored in the
lobts table space. If this clause is omitted, LOB data is stored in the same table space as the other data. Typically, we recommend that you use DMS table spaces and keep regular data, LOB data, and indexes in separate table spaces.
There are three methods for selecting the particular columns you want to import.
method l uses the starting and ending positions (in bytes) of each column to be imported. This method supports only ASC files. For example:
import from employee.asc of asc
messages employee.out
method l (1 5, 6 14, 24 30)
insert into employee
This command imports three columns of data into the
employee table: bytes 1 through 5 form the first column, bytes 6 through 14 the second column, and bytes 24 through 30 the third column.
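To illustrate, here is a hypothetical line from such a fixed-length ASC file; bytes 1 through 5, 6 through 14, and 24 through 30 carry the three imported columns, while bytes 15 through 23 are simply skipped:

10010CHRISTINE         HAAS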
The other two methods specify the names of the columns (
method n ) or the field numbers of the input data (
method p ).
method n is only valid for IXF files and
method p can be used with IXF or DEL files. The following shows an example of an
import command with
method n and
method p clauses.
import from employee.ixf of ixf
messages employee.out
method n (empno, firstnme, lastname)
insert into employee (empno, firstnme, lastname)
import from employee.ixf of ixf
messages employee.out
method p (1, 2, 4)
insert into employee (empno, firstnme, lastname)
Depending on the options you choose for the import, specific authorities and privileges are required. Since SYSADM and DBADM hold the highest authority for an instance and a database, respectively, both can issue
import commands with all of the options discussed above. For the privileges required by users who do not have SYSADM or DBADM authority, refer to Chapter 10, Implementing Security.
You can invoke the import utility from the Control Center by right-clicking on the target table and selecting Import. This displays the Import table dialog (shown in Figure 12.15).