
Datapump expdp/impdp utility

ORACLE provides two external utilities to transfer database objects from one database to another. The traditional export utilities (exp/imp) were available before 10g. From 10g onwards, ORACLE introduced Data Pump (expdp/impdp) as an enhancement to the traditional export utilities.

Traditional Export (exp/imp)

This is an ORACLE database external utility used to transfer database objects from one database server to another, even across different platforms and different hardware and software configurations. When an export command is executed against a database, the database objects are extracted together with their dependent objects. That means if a table is exported, its dependent objects such as indexes, comments, and grants are extracted as well and written into an export file (a binary-format dump file). Following is the command to export a full database:

cmd> exp userid=username/password@exportdb_tns file=export.dmp log=export.log full=y statistics=none

The above command exports the database to a binary dump file named export.dmp. The imp utility can then be used to import this data into another database. Following is the command to import:

cmd> imp userid=username/password@importdb_tns file=export.dmp log=import.log full=y statistics=none

Datapump Export (expdp/impdp)

This is also an ORACLE database external utility used to transfer objects between databases. It was introduced in ORACLE 10g and offers many enhancements over the traditional exp/imp utilities. It also produces binary-format dump files containing the database objects, their metadata, and their control information. The expdp and impdp commands can be executed in three ways (a brief illustration of each follows the list):

Command line interface (specify expdp/impdp parameters in command line)
Parameter file interface (specify expdp/impdp parameters in a separate file)
Interactive-command interface (entering various commands in export prompt)
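For illustration, the same schema export could be driven through any of the three interfaces; the directory, dump file, and parameter file names below are only placeholders:

Command line interface:        $ expdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp
Parameter file interface:      $ expdp scott/tiger parfile=scott_exp.par        (parameters kept in scott_exp.par)
Interactive-command interface: press Ctrl+C while a job is running, or re-attach with ATTACH=<job_name>, and enter commands at the Export> prompt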

There are five different modes of data unloading using expdp (each mode is selected by a different parameter, as sketched after the list). They are:

  • Full Export Mode (entire database is unloaded)
  • Schema Mode (this is the default mode, specific schemas are unloaded)
  • Table Mode (specified set of tables and their dependent objects are unloaded)
  • Tablespace Mode (the tables in the specified tablespace are unloaded)
  • Transportable Tablespace Mode (only the metadata for the tables and their dependent objects within a specified set of tablespaces is unloaded)
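The full export mode is shown next; the remaining modes are selected with different expdp parameters, roughly as in the following sketch (the directory object, dump file, and object names here are only illustrative):

$ expdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=scott.dmp
$ expdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=emp_dept.dmp
$ expdp system/password tablespaces=USERS directory=TEST_DIR dumpfile=users_ts.dmp
$ expdp system/password transport_tablespaces=USERS directory=TEST_DIR dumpfile=users_tts.dmp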
Following is the way to export a full database using expdp:
cmd> expdp userid=username/password dumpfile=expdp_export.dmp logfile=expdp_export.log full=y directory=export

The impdp utility can then be used to import this file into another database.
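A matching import might look like the following sketch, assuming the same dump file and directory object as the export above (the log file name is only illustrative):

cmd> impdp userid=username/password dumpfile=expdp_export.dmp logfile=expdp_import.log full=y directory=export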

Performing various types of Export/Import using Data Pump

In order to work with Oracle Data Pump we first have to unlock the SCOTT user and create a directory object. The directory object is only a pointer to a physical directory; creating it does not actually create the physical directory on the file system.

First start up the database and its associated services. Then, from the root prompt, do the following.

su - oracle
export ORACLE_SID=orcl
sqlplus / as sysdba

ALTER USER scott IDENTIFIED BY tiger ACCOUNT UNLOCK;

CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/orcl';
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;

Note: Directory must be created physically before executing the above command.
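Since the directory object is only a pointer, a minimal check of the setup might look like this, assuming the same path as in the CREATE DIRECTORY statement above. From the OS shell (as the oracle user), create the path if it does not already exist:

mkdir -p /u01/app/oracle/oradata/orcl

Then, from SQL*Plus, the directory object can be confirmed:

SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'TEST_DIR';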

TABLE BASED IMPORT AND EXPORT
$ export ORACLE_SID=orcl
$ expdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expEMP_DEPT.log

$ export ORACLE_SID=orcl
$ impdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impEMP_DEPT.log

The optional CONTENT parameter controls what is exported or imported:
CONTENT={ALL | DATA_ONLY | METADATA_ONLY}
  • ALL loads all the metadata as well as the data from the source dump file.
  • DATA_ONLY loads only the row data into the tables; no database objects are created.
  • METADATA_ONLY only creates the database objects; no data is inserted.
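For example, to re-create only the table definitions from the dump file without loading any rows, CONTENT can be appended to the impdp command above:

$ impdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impEMP_DEPT.log content=METADATA_ONLY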

SCHEMA BASED IMPORT AND EXPORT
$ export ORACLE_SID=orcl
$ expdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expSCOTT.log

$ export ORACLE_SID=orcl
$ impdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impSCOTT.log

FULL DATABASE IMPORTS AND EXPORTS

For this to work you must log in as the SYSTEM user, not as the SYS user.
$ export ORACLE_SID=orcl
$ expdp system/password full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=expDB10G.log

$ export ORACLE_SID=orcl
$ impdp system/password full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impDB10G.log

In the above commands you can also add an additional parameter, JOB_NAME.

Example syntax:
$ expdp system/password full=y directory=TEST_DIR dumpfile=DB10G.dmp job_name=qq

You can view the datapump jobs from the following view:
SQL> select * from dba_datapump_jobs;

OWNER_NAME  JOB_NAME            OPERATION  JOB_MODE  STATE      DEGREE  ATTACHED_SESSIONS
----------  ------------------  ---------  --------  ---------  ------  -----------------
SYSTEM      SYS_EXPORT_FULL_01  EXPORT     FULL      EXECUTING       1                  1

Even if you exit the prompt, press Ctrl+C at the command prompt, or exit from the client side, the Data Pump jobs will continue to run on the server.

To reattach to the running job, enter the following command.
$ expdp system/password attach=qq

If the import or export job needs to be stopped temporarily, press Ctrl+C and then type the following command.
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y

In order to resume the job, do the following.
$ export ORACLE_SID=orcl
$ expdp system/password ATTACH=qq
Export> CONTINUE_CLIENT
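Besides CONTINUE_CLIENT, a few other commonly documented interactive commands can be entered at the Export> prompt after attaching, for example:

Export> STATUS          (show the current status and progress of the job)
Export> EXIT_CLIENT     (leave the client session; the job keeps running on the server)
Export> KILL_JOB        (terminate the job; for an export this also removes its master table and dump files)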

DATAPUMP EXCLUDE/INCLUDE parameters

The EXCLUDE and INCLUDE parameters available with expdp/impdp can be used as metadata filters, so that specific object types such as tables, indexes, triggers, and procedures can be excluded from or included in an export or import operation.

syntax:

EXCLUDE=[object_type]:[name_clause],[object_type]:[name_clause]
INCLUDE=[object_type]:[name_clause],[object_type]:[name_clause]

examples:


expdp SCHEMAS=scott EXCLUDE=SEQUENCE,TABLE:"IN ('EMP','DEPT')"
impdp SCHEMAS=scott INCLUDE=PACKAGE,FUNCTION,PROCEDURE,TABLE:"='EMP'"

The name_clause is a SQL expression that is used as a filter on the object names. It consists of a SQL operator and the values against which the object names of the specified type are to be compared. If no name_clause is provided, all objects of the specified type are excluded/included. The name clause must be separated from the object type by a colon.

Examples of operator-usage:


EXCLUDE=SEQUENCE
or: EXCLUDE=TABLE:"IN ('EMP','DEPT')"
or: EXCLUDE=INDEX:"= 'MY_INDX'"
or: INCLUDE=PROCEDURE:"LIKE 'MY_PROC_%'"
or: INCLUDE=TABLE:"> 'E'"

The parameters can also be stored in a parameter (.par) file, as shown below.
Parameter file: exp.par
DIRECTORY = my_dir
DUMPFILE = exp_tab.dmp
LOGFILE = exp_tab.log
SCHEMAS = scott
INCLUDE = TABLE:"IN ('EMP', 'DEPT')"

expdp system/manager parfile=exp.par


If a parameter file is not used, then on Unix special care needs to be taken with the expdp and impdp syntax; in particular, all single and double quotes need to be preceded by the escape character '\'. The syntax for Windows and Unix is shown below.

Windows:
D:\> expdp system/manager DIRECTORY=my_dir DUMPFILE=exp_tab.dmp LOGFILE=exp_tab.log SCHEMAS=scott INCLUDE=TABLE:\"IN ('EMP', 'DEPT')\"

Unix:
% expdp system/manager DIRECTORY=my_dir DUMPFILE=exp_tab.dmp LOGFILE=exp_tab.log SCHEMAS=scott INCLUDE=TABLE:\"IN \(\'EMP\', \'DEPT\'\)\"


Improper use of EXCLUDE or INCLUDE can produce errors, so be careful with the quoting and escaping rules described above.

Data Pump performance can be improved by using the PARALLEL parameter. This should be used in conjunction with the "%U" wildcard in the DUMPFILE parameter to allow multiple dump files to be created or read.

expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR parallel=4 dumpfile=SCOTT_%U.dmp logfile=expdpSCOTT.log
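The corresponding import can read the same set of dump files in parallel; a sketch reusing the file naming from the export above:

impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR parallel=4 dumpfile=SCOTT_%U.dmp logfile=impdpSCOTT.log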

