Monday, October 28, 2013

Examples of expdp and impdp parameter files, and some useful directives


Export:

USERID='/ as sysdba'
DIRECTORY=DPUMP
DUMPFILE=file_name_%U.dmp
LOGFILE=expdp_log_file_name.log
JOB_NAME=EXP_DATA
PARALLEL=8
SCHEMAS=schema1,[schema2],[schema n]
EXCLUDE=STATISTICS

To get a consistent dump file, add the flashback_time or flashback_scn directive, for example:
FLASHBACK_TIME="to_timestamp(to_char(sysdate,'ddmmyyyy hh24:mi:ss'),'ddmmyyyy hh24:mi:ss')" 
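Alternatively, you can pin the export to a specific SCN with FLASHBACK_SCN. A minimal sketch (the SCN value below is just a placeholder; use the value returned by the query):

```
-- Look up the current SCN (requires access to V$DATABASE):
SELECT current_scn FROM v$database;

-- Then add the result to the export parameter file:
FLASHBACK_SCN=1234567
```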

Import:

USERID='/ as sysdba'
DIRECTORY=DPUMP
DUMPFILE=file_name_%U.dmp
LOGFILE=impdp_log_file_name.log
JOB_NAME=IMP_DATA
SCHEMAS=schema1,[schema2],[schema n]
REMAP_SCHEMA=SCOTT:JAMES
REMAP_TABLE=SCOTT.MYTABLE:MYTABLE_INTERIM
REMAP_TABLESPACE=USERS:TOOLS
PARALLEL=8
TRANSFORM=oid:n
TABLE_EXISTS_ACTION=REPLACE
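Since the USERID is stored in the parameter file, the job can be launched simply by pointing impdp at the file (the file name here is just an example):

```
impdp parfile=imp_data.par
```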

Note: the directive "TRANSFORM=oid:n" makes sure all imported objects will be assigned a new OID instead of inheriting the OID from the exported objects.

You can remap several tablespaces during the same run; just specify the directive multiple times in the same parameter file:

remap_tablespace=USER_DATA:HISTORICAL_DATA
remap_tablespace=APP_DATA:HISTORICAL_DATA
remap_tablespace=LOB_DATA:HISTORICAL_DATA


To limit export or import to specific tables only, use:
tables=SCOTT.EMP,
SCOTT.DEPT,
SCOTT.SALARIES

To limit export or import to specific partitions only, use:
TABLES=SCOTT.SALES:P_201701

To limit export or import to partitions matching a specific pattern, use:
TABLES=SCOTT.SALES:CONTRACT%2021%
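You can also list several table:partition pairs in one TABLES directive (the partition names below are examples):

```
TABLES=SCOTT.SALES:P_201701,SCOTT.SALES:P_201702
```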

Note that you cannot use both SCHEMAS and TABLES as directives in the same export/import session.


Other useful directives:

CONTENT=METADATA_ONLY (export only object definitions, not data)
EXCLUDE=TABLE:"LIKE 'ABC%_TMP'" (exclude tables based on name pattern)
EXCLUDE=TABLE:"= 'BONUS'" (exclude an entire table and its partitions, if any)
EXCLUDE=INDEX:"LIKE 'EMP%'" (exclude indexes that match a certain string)
EXCLUDE=VIEW,PACKAGE,FUNCTION (exclude certain types of objects)
EXCLUDE=TRIGGER:"IN ('TRIG1', 'TRIG2')", INDEX:"='INDX1'", REF_CONSTRAINT (exclude specific triggers, a specific index and referential constraints)
EXCLUDE=SEQUENCE, TABLE:"IN ('EMP', 'DEPT')" (exclude all sequences and specific tables)
EXCLUDE=INDEX:"='MY_INDX'" (exclude a specific index)
EXCLUDE=MATERIALIZED_VIEW (exclude materialized views)
EXCLUDE=MATERIALIZED_VIEW,MATERIALIZED_VIEW_LOG (exclude materialized views + mv logs)
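All of the EXCLUDE examples above are written for a parameter file. If you pass them directly on the operating-system command line instead, the double and single quotes must be escaped from the shell, which is one more reason to prefer a parfile. A sketch of the difference on Linux:

```
# In a parameter file - no escaping needed:
EXCLUDE=TABLE:"LIKE 'ABC%_TMP'"

# The same filter passed directly on a Linux command line:
EXCLUDE=TABLE:\"LIKE \'ABC%_TMP\'\"
```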


INCLUDE=FUNCTION, PACKAGE, TABLE:"='EMP'" (only include specific object types, and tables called "EMP")
INCLUDE=PROCEDURE:"LIKE 'MY_PROC_%'" (only include procedures named according to a specific string)

QUERY=SCOTT.TABLE1:"WHERE 1=0" (no rows are exported, but the table definition is, so the table is created empty on import)
QUERY=SCOTT.TABLE2:"WHERE SALES_DATE >= to_date('01.12.2011','dd.mm.yyyy')" (exports only rows with SALES_DATE on or after 01.12.2011)
EXCLUDE=SCHEMA:"='SCOTT'" (exclude a specific user and all objects of that user)
Note:
Specifying EXCLUDE=USER excludes only the definitions of users, not the objects contained within users' schemas.
If you try to exclude a user by using a statement such as EXCLUDE=USER:"= 'SCOTT'", only the CREATE USER scott DDL statement will be excluded, and you may not get the results you expect.

To import only the user definition over a database link

See this post.

To compress tables on import

See this post.

A note on integrity checking in data pump import TABLE_EXISTS_ACTION

TRUNCATE is subject to stricter integrity checking than REPLACE.

• TRUNCATE deletes existing rows and then loads rows from the source.
• REPLACE drops the existing table and then creates and loads it from the source.

• When you use TRUNCATE or REPLACE, make sure that rows in the affected tables are not targets of any referential constraints.
• When you use TRUNCATE, existing table-dependent objects in the source, such as indexes, grants, triggers, and constraints, are ignored.
• For REPLACE, the dependent objects are dropped and re-created from the source, if they were not explicitly or implicitly excluded.
• When you use TRUNCATE, checks are made to ensure that rows from the source are compatible with the existing table prior to performing any action.

If the existing table has active constraints and triggers, it is loaded using the external tables access method. If any row violates an active constraint, the load fails and no data is loaded. You can override this behavior by specifying DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS on the Import command line.
If you have data that must be loaded, but may cause constraint violations, consider disabling the constraints, loading the data, and then deleting the problem rows before re-enabling the constraints.
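A sketch of an import parameter file that truncates existing tables but skips rows violating active constraints instead of aborting the load (directory and dump file names are reused from the examples above):

```
USERID='/ as sysdba'
DIRECTORY=DPUMP
DUMPFILE=file_name_%U.dmp
LOGFILE=impdp_log_file_name.log
TABLE_EXISTS_ACTION=TRUNCATE
DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
```

Remember that the skipped rows are lost from the load; check the import log afterwards to see how many rows were rejected per table.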

A note about excluding Constraints
The following constraints cannot be excluded:
• NOT NULL constraints.
• Constraints needed for the table to be created and loaded successfully (for example, primary key constraints for index-organized tables or REF SCOPE and WITH ROWID constraints for tables with REF columns).

Examples:

Exclude all nonreferential constraints, except for NOT NULL constraints and any constraints needed for successful table creation and loading:

EXCLUDE=CONSTRAINT

Exclude referential integrity (foreign key) constraints:
EXCLUDE=REF_CONSTRAINT

Turn on compression during export:
COMPRESSION=[ALL | DATA_ONLY | METADATA_ONLY | NONE]
COMPRESSION_ALGORITHM = {BASIC | LOW | MEDIUM | HIGH}
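A sketch of an export parameter file using compression (note my assumptions: COMPRESSION_ALGORITHM is only available from 12.1 onwards, and COMPRESSION=ALL typically requires the Advanced Compression Option license):

```
USERID='/ as sysdba'
DIRECTORY=DPUMP
DUMPFILE=file_name_%U.dmp
LOGFILE=expdp_log_file_name.log
COMPRESSION=ALL
COMPRESSION_ALGORITHM=MEDIUM
```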

For most cases, I would stick to the MEDIUM compression algorithm. Oracle states in the 12.2 documentation:
Recommended for most environments. This option, like the BASIC option, provides a good combination of compression ratios and speed, but it uses a different algorithm than BASIC.

I find compression of exports very useful when space is an issue, or transfer between servers over the network is needed.


Sources:
Oracle 11g documentation
Oracle 12cR2 documentation
My Oracle Support
