Oracle: Different options in Export and Import


Schema export and import using REMAP_SCHEMA

For exporting the schema, first create a directory object and grant access on it:

 

1) CREATE DIRECTORY dpump AS '/u02/dpump';

2) GRANT READ, WRITE ON DIRECTORY dpump TO system;
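With the directory and grant in place, the schema can be exported along these lines. This is a minimal sketch; the scott schema, the password and the file names are placeholders to adjust for your environment:

expdp system/password DIRECTORY=dpump DUMPFILE=scott_schema.dmp LOGFILE=scott_schema_exp.log SCHEMAS=scott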

For importing the schema:

 

1) CREATE DIRECTORY datapump AS '/u02/DATAPUMP';

2) GRANT READ, WRITE ON DIRECTORY datapump TO system;
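The dump file can then be imported into a different schema with REMAP_SCHEMA. A minimal sketch, assuming the dump taken above has been copied to /u02/DATAPUMP and that the target schema is called hr (a placeholder name):

impdp system/password DIRECTORY=datapump DUMPFILE=scott_schema.dmp LOGFILE=scott_schema_imp.log REMAP_SCHEMA=scott:hr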

On import, TABLE_EXISTS_ACTION=APPEND loads rows from the source and leaves existing rows unchanged.

Export and Import using the QUERY option:

The QUERY parameter exports or imports only the rows that satisfy a given WHERE clause. It can be used in two ways:

 

1. EXPDP:

a. Using a parameter file:
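For illustration, a parameter file (here called exp_query.par, a hypothetical name) that exports only the department 10 rows of scott.emp could look like this; the table and file names are examples only:

Contents of exp_query.par:
DIRECTORY=dpump
DUMPFILE=emp_query.dmp
LOGFILE=emp_query.log
TABLES=scott.emp
QUERY=scott.emp:"WHERE deptno=10"

expdp system/password PARFILE=exp_query.par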

 

b. On the command line:
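On the command line the WHERE clause has to be protected from the shell, which is why the parameter file form is usually preferred. A rough equivalent of the example above on Linux (placeholder names again):

expdp system/password DIRECTORY=dpump DUMPFILE=emp_query.dmp LOGFILE=emp_query.log TABLES=scott.emp QUERY='scott.emp:"WHERE deptno=10"'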

 

 

2. IMPDP:

 

a. Using a parameter file:
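The same QUERY syntax works on import, filtering rows as they are loaded from the dump file. A sketch of a parameter file (imp_query.par is a hypothetical name):

Contents of imp_query.par:
DIRECTORY=datapump
DUMPFILE=emp_query.dmp
LOGFILE=emp_query_imp.log
TABLES=scott.emp
QUERY=scott.emp:"WHERE deptno=10"

impdp system/password PARFILE=imp_query.par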

 

b. On the command line:
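And the command-line equivalent, again with shell quoting around the WHERE clause:

impdp system/password DIRECTORY=datapump DUMPFILE=emp_query.dmp LOGFILE=emp_query_imp.log TABLES=scott.emp QUERY='scott.emp:"WHERE deptno=10"'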

 

Export using the INCLUDE parameter:
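INCLUDE restricts the job to the named object types (the parameter can be repeated). As a sketch, the following exports only the tables and procedures of the scott schema; schema and file names are illustrative:

expdp system/password DIRECTORY=dpump DUMPFILE=scott_tabs.dmp LOGFILE=scott_tabs.log SCHEMAS=scott INCLUDE=TABLE INCLUDE=PROCEDURE

A name clause can also be added, for example INCLUDE=TABLE:"IN ('EMP','DEPT')" in a parameter file to pick specific tables.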

 

Import using the INCLUDE parameter:
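INCLUDE works the same way on import, loading only the selected object types from the dump file. A minimal sketch against the schema dump taken earlier:

impdp system/password DIRECTORY=datapump DUMPFILE=scott_schema.dmp LOGFILE=scott_tab_imp.log INCLUDE=TABLE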

 

Using the EXCLUDE parameter:
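EXCLUDE is the opposite: everything is processed except the named object types. For example, to export the scott schema without indexes and statistics (names are placeholders):

expdp system/password DIRECTORY=dpump DUMPFILE=scott_no_idx.dmp LOGFILE=scott_no_idx.log SCHEMAS=scott EXCLUDE=INDEX EXCLUDE=STATISTICS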

 

 

Export using the FLASHBACK_TIME parameter:
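FLASHBACK_TIME makes the export consistent as of a single point in time (the required undo must still be available). A minimal sketch that uses SYSTIMESTAMP so the whole export is consistent as of the moment the job starts:

expdp system/password DIRECTORY=dpump DUMPFILE=scott_consistent.dmp LOGFILE=scott_consistent.log SCHEMAS=scott FLASHBACK_TIME=SYSTIMESTAMP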

Using the PARALLEL parameter:

DBAs work on databases holding gigabytes or terabytes of data, and exporting can be a very time-consuming job when a lot of data has to be moved in a short period of time. Data Pump EXPDP can also write the dump in compressed format, which achieves faster write times because less data goes to disk, but this uses more processor time.

The best way to speed up the export itself is the PARALLEL parameter. PARALLEL lets Data Pump start several worker processes, so data is exported much faster. You specify the number of processes with PARALLEL=n. You can either name the dump file of each process individually, or use a substitution variable (%U) in the file name, which indicates that multiple dump files are to be generated.

Here is an example that exports data using 10 parallel processes:

expdp scott/password PARALLEL=10 DUMPFILE=expdata%U.dmp

This command will create multiple dump files (expdata01.dmp, expdata02.dmp, and so on) in the default Data Pump export directory, and the same %U substitution variable is used again when the dump is imported.

 

To import the dump in parallel, we can use the following example:
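A minimal sketch, assuming the dump files produced above are available in the default directory:

impdp scott/password PARALLEL=10 DUMPFILE=expdata%U.dmp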

In a RAC environment, a Data Pump job can by default spread its worker processes across the instances in the cluster, or be limited to a subset of them.

But if you want the export utility to run on node 1 alone, specifying CLUSTER=N will force the Data Pump processes (the worker processes, in particular) to run only on the instance where the job is started.

 

Here is an example:
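A sketch of such an export, pinned to the instance where it is started (schema and file names are placeholders):

expdp system/password DIRECTORY=dpump DUMPFILE=scott_schema.dmp LOGFILE=scott_schema.log SCHEMAS=scott CLUSTER=N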

To import:
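The matching import, again restricted to the starting instance:

impdp system/password DIRECTORY=datapump DUMPFILE=scott_schema.dmp LOGFILE=scott_schema_imp.log CLUSTER=N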

 

In the example below, an export that would otherwise produce roughly an 800 GB dump is compressed down to about 100 GB. If the NFS mount point holding the dump files is visible from only one node of the RAC cluster, then CLUSTER=N has to be used as well.

 

Expdp:
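A sketch of the compressed export. COMPRESSION=ALL needs the Advanced Compression option; the full export, the file names and the parallel degree are shown purely for illustration:

expdp system/password DIRECTORY=dpump DUMPFILE=full_comp%U.dmp LOGFILE=full_comp.log FULL=Y COMPRESSION=ALL PARALLEL=4 CLUSTER=N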

 

 

Impdp:
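And a matching import of the compressed dump. Data Pump detects the compression automatically, so no extra parameter is needed on import:

impdp system/password DIRECTORY=datapump DUMPFILE=full_comp%U.dmp LOGFILE=full_comp_imp.log FULL=Y PARALLEL=4 CLUSTER=N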

 

 

 

Note: Please test these scripts in a non-production environment before trying them in production.