# Data Pump Import Best Practices
Best Practices and recommended parameter settings for Oracle Data Pump import operations.
## General Recommendations
- Use a parameter file (`PARFILE=xxx.par`).
- Don't use `sys as sysdba` for the import.
## Data Pump Patching
Apply the latest Data Pump Bundle Patch:

- Data Pump Recommended Proactive Patches For 19.10 and Above (Doc ID 2819284.1)
- The patch can be applied online, but no active Data Pump jobs should be running when executing datapatch.
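Before running datapatch, you can confirm that no Data Pump jobs are active by querying the `DBA_DATAPUMP_JOBS` view (a sketch; run it as a privileged user):

```sql
-- Data Pump jobs that are still executing; datapatch should only
-- be run when this query returns no rows.
select owner_name, job_name, state
from   dba_datapump_jobs
where  state = 'EXECUTING';
```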
## Diagnostics
Always specify the `LOGTIME=ALL` and `METRICS=YES` parameters to log additional timing and worker information and to show the access method per table in the logfile.
Here are some example logfiles with different diagnostics settings:
- Logfile example (default settings):

  ```
  Processing object type SCHEMA_EXPORT/TABLE/TABLE
  Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
  . . imported "SCHEMA1"."TABLE1"    5.117 KB    3 rows
  ```
- Logfile example (`LOGTIME=ALL`):

  ```
  22-MAY-23 11:14:05.310: Processing object type SCHEMA_EXPORT/TABLE/TABLE
  22-MAY-23 11:14:06.128: Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
  22-MAY-23 11:14:06.347: . . imported "SCHEMA1"."TABLE1"    5.117 KB    3 rows
  ```
- Logfile example (`METRICS=YES`):

  ```
  W-1 Startup took 0 seconds
  W-1 Master table "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
  W-1 Processing object type SCHEMA_EXPORT/TABLE/TABLE
  W-1      Completed 1 TABLE objects in 1 seconds
  W-1      Completed by worker 1 1 TABLE objects in 1 seconds
  W-1 Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
  W-1 . . imported "SCHEMA1"."TABLE1"    5.117 KB    3 rows in 0 seconds using direct_path
  ```
- Logfile example (`LOGTIME=ALL` + `METRICS=YES`):

  ```
  22-MAY-23 11:17:59.424: W-1 Startup took 0 seconds
  22-MAY-23 11:17:59.687: W-1 Master table "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
  22-MAY-23 11:18:03.001: W-1 Processing object type SCHEMA_EXPORT/TABLE/TABLE
  22-MAY-23 11:18:03.949: W-1      Completed 1 TABLE objects in 0 seconds
  22-MAY-23 11:18:03.949: W-1      Completed by worker 1 1 TABLE objects in 0 seconds
  22-MAY-23 11:18:04.251: W-1 Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
  22-MAY-23 11:18:04.676: W-1 . . imported "SCHEMA1"."TABLE1"    5.117 KB    3 rows in 0 seconds using direct_path
  ```
## Target Database
### STREAMS_POOL_SIZE
Data Pump uses Advanced Queuing, so `STREAMS_POOL_SIZE` needs to be set to a minimum of 256 MB.

Check the current size:

```sql
select current_size/1024/1024 as current_size_mb
from v$sga_dynamic_components
where component='streams pool';
```

Set `STREAMS_POOL_SIZE` to a minimum of 256 MB:

```sql
alter system set streams_pool_size=256m scope=both;
```
### Statistics
After the import, gather statistics on the target database or transport the statistics from the source database using `DBMS_STATS`.

Additionally, make sure the dictionary statistics (`SYS`, `SYSTEM`) are current after the import:

```sql
exec dbms_stats.gather_schema_stats('SYS');
exec dbms_stats.gather_schema_stats('SYSTEM');
```
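Transporting statistics with `DBMS_STATS` typically follows the pattern below (a sketch; the staging table name `STATS_STAGE` and schema `SCHEMA1` are placeholders):

```sql
-- On the source: stage the schema statistics into a regular table
exec dbms_stats.create_stat_table('SYSTEM', 'STATS_STAGE');
exec dbms_stats.export_schema_stats('SCHEMA1', 'STATS_STAGE', statown => 'SYSTEM');

-- Move SYSTEM.STATS_STAGE to the target (e.g. via Data Pump), then import:
exec dbms_stats.import_schema_stats('SCHEMA1', 'STATS_STAGE', statown => 'SYSTEM');
```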
## Recommended Parameters
Overview of the most important and relevant Data Pump import parameters. Complete invocations are shown in the Examples section at the end.
### LOGTIME=ALL
Always specify the logtime diagnostics parameter to log more timing information.
### METRICS=YES
Always specify the metrics diagnostics parameter to show more worker information and the access_method per table in the logfile.
### PARALLEL=n
This parameter is only valid in Enterprise Edition. Use parallel processes to speed up the import operation; the number of processes depends on the actual number of CPUs on the target system.
Degree of parallelism:

| Platform | Degree |
|---|---|
| Cloud OCPU | 1 x OCPU count |
| Cloud ECPU | 0.25 x ECPU count |
| On-prem x86 | 2 x CPU core count |
| On-prem (other) | Depends |
The number of dump files does not affect the parallel import settings.
You can use more parallel processes than dump files available.
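The table above can be turned into a starting value for `PARALLEL` with a small helper like this (a rough sketch, not an Oracle tool; the platform keys are made up for this example):

```shell
# Compute a starting PARALLEL value from the platform and its
# OCPU / ECPU / CPU-core count, following the table above.
recommended_parallel() {
  platform=$1
  count=$2   # OCPUs, ECPUs, or CPU cores depending on platform
  case "$platform" in
    cloud-ocpu) echo "$count" ;;          # 1 x OCPU count
    cloud-ecpu) echo $(( count / 4 )) ;;  # 0.25 x ECPU count
    onprem-x86) echo $(( count * 2 )) ;;  # 2 x CPU core count
    *)          echo 1 ;;                 # no general rule; start conservatively
  esac
}

recommended_parallel onprem-x86 8   # prints 16
```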
### EXCLUDE=STATISTICS
Exclude optimizer statistics from the import operation. After the import, the statistics should be gathered on the target database or transported from the source database with `DBMS_STATS`.

- Regular import:

  ```
  exclude=statistics
  ```

- Transportable tablespaces:

  ```
  exclude=table_statistics,index_statistics
  ```
### TRANSFORM=LOB_STORAGE:SECUREFILE
SecureFiles is the default LOB storage option now, and all binary data types should be migrated from BasicFile LOBs to SecureFile LOBs; BasicFile LOBs do not allow parallel DML.
Example:

```
impdp ... transform=lob_storage:securefile
```
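To see whether a schema still contains BasicFile LOBs, you can query the `DBA_LOBS` view (a sketch; `SCHEMA1` is a placeholder):

```sql
-- LOB columns still stored as BasicFiles (SECUREFILE = 'NO')
select owner, table_name, column_name
from   dba_lobs
where  owner = 'SCHEMA1'
and    securefile = 'NO';
```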
## Optional Parameters
### Checksum
This feature, available starting with Oracle 21c, can be used to verify file integrity on import when a checksum was calculated during export. It is also possible to only verify the checksum, without importing data, using the `VERIFY_ONLY` parameter.

Examples:

```
impdp ... verify_only=yes
impdp ... verify_checksum=yes
```
### Multitenant
To avoid noisy neighbours when resources are insufficient, set the following parameters.

Restrict the number of concurrent Data Pump jobs for a PDB:

```sql
alter system set max_datapump_jobs_per_pdb=n;
```

Restrict the parallel degree setting for a Data Pump job:

```sql
alter system set max_datapump_parallel_per_job=n container=all;
```
## Examples
- Command line:

  ```
  impdp userid=system@pdb1 \
    schemas=SCHEMA1 \
    directory=DATA_PUMP_DIR \
    dumpfile=dump%L.dmp \
    logfile=impdp.log \
    parallel=2 \
    transform=lob_storage:securefile \
    exclude=statistics \
    logtime=all \
    metrics=yes
  ```

- Parameter file:

  ```
  userid=system@pdb1
  schemas=SCHEMA1
  directory=DATA_PUMP_DIR
  dumpfile=dump%L.dmp
  logfile=impdp.log
  parallel=2
  transform=lob_storage:securefile
  exclude=statistics
  logtime=all
  metrics=yes
  ```