Thursday, July 23, 2020

DATA PUMP QUESTIONS & ANSWERS
------------------------------------------------------

1) What is the use of the CONSISTENT option in exp?
A) Cross-table consistency. It implements SET TRANSACTION READ ONLY so that all tables are exported as of the same point in time. Default value is N.
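For example (a minimal sketch; the credentials, schema and file names are placeholders):
$ exp scott/tiger file=scott_consistent.dmp log=scott_consistent.log owner=scott consistent=y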

2) What is the use of the DIRECT=Y option in exp?
A) Setting DIRECT=yes extracts data by reading it directly, bypassing the SQL command-processing layer (the evaluating buffer), so it is usually faster than a conventional path export. Default value is N.
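For example (a minimal sketch with placeholder credentials and file names):
$ exp scott/tiger file=scott_direct.dmp log=scott_direct.log owner=scott direct=y recordlength=65535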

3) What is the use of the COMPRESS option in exp?
A) It specifies how export manages the initial extent for table data, so that the data is imported into one extent. This parameter is helpful during database re-organization. Export the objects (especially tables and indexes) with COMPRESS=Y: if a table spans 20 extents of 1 MB each (which is undesirable from a performance point of view) and you export it with COMPRESS=Y, the generated DDL will have an INITIAL extent of 20 MB, so the extents are coalesced when the table is imported. Sometimes it is desirable to export with COMPRESS=N instead, for example when you do not have that much contiguous space in the target tablespace and do not want the import to fail.
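For example (a minimal sketch; credentials and names are placeholders):
$ exp scott/tiger file=scott_compress.dmp log=scott_compress.log tables=emp compress=y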

4) How to improve exp performance?
a) Set the BUFFER parameter to a high value. Default is 256KB.
b) Stop unnecessary applications to free the resources.
c) If you are running multiple sessions, make sure they write to different disks.
d) Do not export to NFS (Network File System); exporting to a local disk is faster.
e) Set the RECORDLENGTH parameter to a high value.
f) Use DIRECT=yes (direct path export), as shown in the example below.
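A minimal example combining several of these settings (credentials and paths are placeholders):
$ exp system/manager file=/u02/exports/full.dmp log=/u02/exports/full_exp.log full=y direct=y recordlength=65535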

5) How to improve imp performance?
a) Place the file to be imported in separate disk from datafiles.
b) Increase the DB_CACHE_SIZE.
c) Set LOG_BUFFER to a large size.
d) Stop redo log archiving, if possible.
e) Use COMMIT=n, if possible.
f) Set the BUFFER parameter to a high value. Default is 256KB.
g) It is advisable to drop indexes before importing to speed up the import process, or to set INDEXES=N and build the indexes after the import; indexes can easily be recreated once the data has been imported successfully.
h) Use STATISTICS=NONE
i) Disable the INSERT triggers, as they fire during import.
j) Set the parameter COMMIT_WRITE=NOWAIT (in Oracle 10g) or COMMIT_WAIT=NOWAIT (in Oracle 11g) during import. A combined imp example follows this list.
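A minimal imp example using several of these settings (credentials, paths and the schema are placeholders):
$ imp system/manager file=/u02/exports/scott.dmp log=/u02/imports/scott_imp.log fromuser=scott touser=scott buffer=10485760 commit=n indexes=n statistics=none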

6) What is the use of the INDEXFILE option in imp?
A) It writes the DDL of the objects in the dump file (index creation statements, with table creation statements included as comments) to the specified file, without importing the data.
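For example (a minimal sketch; credentials and file names are placeholders):
$ imp system/manager file=scott.dmp fromuser=scott touser=scott indexfile=scott_indexes.sql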

7) What is the use of the IGNORE option in imp?
A) It ignores object-creation errors (such as "table already exists") during import and continues importing the data.
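For example (a minimal sketch with placeholder file names):
$ imp scott/tiger file=scott.dmp log=scott_imp.log full=y ignore=y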

8) What are the differences between expdp and exp (Data Pump vs. normal exp/imp)?
A) Data Pump is server-centric (dump files are written on the database server, not the client).
Data Pump has a PL/SQL API (DBMS_DATAPUMP), so jobs can be run from procedures (see the sketch after this list).
In Data Pump, we can stop and restart the jobs.
Data Pump will do parallel execution.
Tapes & pipes are not supported in Data Pump.
Data Pump consumes more undo tablespace.
Data Pump import will create the user if the user doesn't exist.
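A minimal sketch of running an export job through the DBMS_DATAPUMP API (the job name, file names and the DATAPUMP directory object are placeholders that must be valid in your environment):
DECLARE
  h NUMBER;
BEGIN
  -- open a schema-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => 'SCOTT_EXP_JOB');
  -- dump file and log file are written to an existing directory object
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'scott_api.dmp', directory => 'DATAPUMP');
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'scott_api.log', directory => 'DATAPUMP',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- export only the SCOTT schema
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR', value => 'IN (''SCOTT'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/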

9) Why is expdp faster than exp (or why is Data Pump faster than conventional export/import)?
A)Data Pump is block mode, exp is byte mode. 
Data Pump will do parallel execution.
Data Pump uses direct path API.

10) How to improve expdp performance?
A) Use the PARALLEL option, which increases the number of worker processes. It should be set based on the number of CPUs.
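For example (a minimal sketch; the DATAPUMP directory object and file names are placeholders, and %U lets each worker write to its own dump file):
$ expdp system/manager directory=DATAPUMP dumpfile=full_%U.dmp logfile=full_exp.log full=y parallel=4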

11) How to improve impdp performance?
A) Use the PARALLEL option, which increases the number of worker processes. It should be set based on the number of CPUs.
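For example (a minimal sketch with the same placeholder directory object and file names):
$ impdp system/manager directory=DATAPUMP dumpfile=full_%U.dmp logfile=full_imp.log full=y parallel=4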

12) In Data Pump, where is the job info stored? (Or: if you restart a Data Pump job, how does it know where to resume from?)
A) Whenever a Data Pump export or import is running, Oracle creates a master table named after the JOB_NAME, and it is dropped once the job is done. From this table, Oracle finds out how much of the job has completed, where to continue from, and so on.
Default export job name will be SYS_EXPORT_XXXX_01, where XXXX can be FULL or SCHEMA or TABLE.
Default import job name will be SYS_IMPORT_XXXX_01, where XXXX can be FULL or SCHEMA or TABLE.
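A minimal sketch of finding and restarting a stopped job (the job name below is just the default name for a full export):
SQL> SELECT job_name, state FROM dba_datapump_jobs;
$ expdp system/manager attach=SYS_EXPORT_FULL_01
Export> start_job
Export> exit_client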

13) What is the order of importing objects in impdp?
A) Tablespaces
 Users
 Roles
 Database links
 Sequences
 Directories
 Synonyms
 Types
 Tables/Partitions
 Views
 Comments
 Packages/Procedures/Functions
 Materialized views

14) How to import only metadata?
A) CONTENT=METADATA_ONLY
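For example (a minimal sketch; directory object and file names are placeholders):
$ impdp system/manager directory=DATAPUMP dumpfile=scott.dmp logfile=scott_meta_imp.log content=metadata_only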

15) How to import into a different user/tablespace/datafile/table?
A)REMAP_SCHEMA
REMAP_TABLESPACE
REMAP_DATAFILE
REMAP_TABLE 
REMAP_DATA
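A combined example of two of these parameters (a minimal sketch; directory object, file names, schemas and tablespaces are placeholders):
$ impdp system/manager directory=DATAPUMP dumpfile=scott.dmp logfile=scott_remap_imp.log remap_schema=SCOTT:HR remap_tablespace=USERS:HR_DATA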

16) How to export/import without using an external directory?
a) Run the older CATEXP.SQL script on the database to be exported.
b) Use the older export (exp) utility to create the dump file.
c) Use the older import (imp) utility to import into the target database, as shown below.
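For example (a minimal sketch; credentials and paths are placeholders, and no directory object is needed because exp/imp read and write files on the client side):
$ exp scott/tiger file=/home/oracle/scott.dmp log=/home/oracle/scott_exp.log owner=scott
$ imp scott/tiger file=/home/oracle/scott.dmp log=/home/oracle/scott_imp.log full=y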

17) Using Data Pump, how to export from a higher version (11g) and import into a lower version (10g)? Can we import into 9i?
A) There is no guarantee that a dump file produced by a later release of expdp will import into an earlier release. To make it work, export with the VERSION parameter set to the target release (for example VERSION=10.2) so that the dump file is written in a format the earlier impdp can read. We cannot import a Data Pump dump file into 9i, because Data Pump was introduced in 10g.
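For example, to export from 11g in a format that a 10.2 impdp can read (a minimal sketch with placeholder directory object, file names and schema):
$ expdp system/manager directory=DATAPUMP dumpfile=scott_v102.dmp logfile=scott_v102_exp.log schemas=scott version=10.2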


18) How to transport tablespaces (including across platforms) using exp/imp or expdp/impdp?
A) Check that the tablespace set is self-contained and that the endian formats of the source and target platforms are compatible (V$TRANSPORTABLE_PLATFORM), make the tablespace(s) read only, export the tablespace metadata with TRANSPORT_TABLESPACES, copy the datafiles to the target (converting them with RMAN CONVERT if the endian formats differ), import the metadata with TRANSPORT_DATAFILES, and finally make the tablespaces read write again. A command sketch follows.




THANK YOU FOR VIEWING MY BLOG. FOR MORE UPDATES, VISIT MY BLOG.
