Chapter 13

Data Pump


Data Pump

  • Tool that replaces legacy exp/imp utilities

  • Data Pump is a scalable, feature-rich utility that allows you to extract objects and data from a database

  • You can use Data Pump and its export files to:

    • Take a point-in-time backup of your database

    • Efficiently copy large amounts of objects/data from one database environment to another

    • Extract the SQL from an export file

    • Partially extract specific objects or data from an export file

    • Easily stop/suspend/restart operations


Data Pump Architectural Components

  • expdp (Data Pump export utility)

  • impdp (Data Pump import utility)

  • DBMS_DATAPUMP PL/SQL package (Data Pump API)

  • DBMS_METADATA PL/SQL package (Data Pump Metadata API)
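  • The expdp and impdp utilities are front ends for these PL/SQL packages, so you can also drive Data Pump directly from DBMS_DATAPUMP. A minimal sketch of a schema-mode export (it assumes the dp_dir directory object created on the following slides; the job name, file names, and HEERA schema are only examples):

    declare
      h number;
    begin
      -- open a schema-mode export job
      h := dbms_datapump.open(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => 'DEMO_EXP');
      -- dump file and log file are written to the dp_dir directory object
      dbms_datapump.add_file(handle => h, filename => 'demo.dmp', directory => 'DP_DIR');
      dbms_datapump.add_file(handle => h, filename => 'demo.log', directory => 'DP_DIR',
                             filetype => dbms_datapump.ku$_file_type_log_file);
      -- export only the HEERA schema
      dbms_datapump.metadata_filter(handle => h, name => 'SCHEMA_EXPR', value => '= ''HEERA''');
      dbms_datapump.start_job(h);
      dbms_datapump.detach(h);
    end;
    /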


Data Pump Architecture


Steps Required to Export Data

  • Create a database-directory object that points to an operating-system directory that you want to write/read Data Pump files to/from.

  • Grant read, write on the directory to the database user running the export.

  • From the operating-system prompt, run the expdp utility.


Creating a Database Directory

  • Use CREATE DIRECTORY statement to create a directory:

    SQL> create directory dp_dir as '/oradump';

  • View directory details:

    SQL> select owner, directory_name, directory_path from dba_directories;


Granting Access to the Directory

  • As a privileged user, grant access to the directory to the user who will be invoking expdp:

    SQL> grant read, write on directory dp_dir to <user>;


Taking an Export

  • From the operating-system prompt, invoke expdp:

    $ expdp darl/foo directory=dp_dir tables=inv dumpfile=exp.dmp logfile=exp.log


Importing Using Data Pump

  • Similar to export, but using the import utility.

  • Need to set up the directory and grant access

  • Create a database-directory object that points to an operating-system directory that you want to write/read Data Pump files to/from.

  • Grant read, write on the directory to the database user running the export or import.

  • From the operating system prompt, run the impdp command.
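  • A minimal sketch, reusing the dp_dir directory object and the exp.dmp file produced by the earlier export example:

    $ impdp darl/foo directory=dp_dir dumpfile=exp.dmp logfile=imp.log tables=inv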


Using Data Pump Interactive Command Mode

  • Allows you to interact with a running Data Pump job from a command-line interface.

  • There are two ways to get to the interactive command-mode prompt:

    • Press Ctrl+C in a Data Pump job that you started via expdp or impdp.

    • Use the ATTACH parameter to attach to a currently running job

  • Most useful commands: status, start_job, kill_job, parallel, stop_job.
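  • For example, pressing Ctrl+C during an export started with expdp drops you to the interactive prompt (a sketch; the job parameters and commands shown are illustrative):

    $ expdp darl/foo directory=dp_dir tables=inv dumpfile=exp.dmp
    ^C
    Export> status
    Export> stop_job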


Attaching to a Running Job

  • You can attach to a currently running Data Pump job to view its status, stop/start it, and so on.

  • Determine the job name:

    SQL> select owner_name, operation, job_name, state from dba_datapump_jobs;

  • Use the ATTACH command line switch:

    $ impdp darl/engdev attach=sys_import_schema_02


Stopping and Restarting a Job

  • You can stop and restart a Data Pump job

  • First attach to the job, then issue interactive commands:

    $ impdp darl/foo attach=sys_import_table_01

    Import> stop_job

    Import> start_job


Terminating a Data Pump Job

  • You might find yourself in a scenario where you have a long-running Data Pump job that appears to be hung.

  • You want to kill the job as gracefully as possible.

  • First attach to the job, then kill it:

    Import> kill_job


Using a Parameter File

  • Parameter files are useful for repeatability and consistency.

  • Parameter files also make it easier to handle parameters that use single and double quotes.

  • Sometimes single and double quotes are misinterpreted by the OS when entered on the command line.

    userid=darl/foo

    directory=dp

    dumpfile=invp.dmp

    logfile=invp.log

    tables=f_sales

    $ impdp parfile=pfile.ctl


Estimating the Size of Export Jobs

  • Sometimes it’s useful to estimate the size of a job before starting it.

  • Helps you get an idea of how much disk space the export will require.

    $ expdp dbauser/foo estimate_only=y full=y nologfile=y


Listing the Contents of Dump Files

  • Allows you to translate the contents of a dump file created by expdp into SQL statements.

  • Very handy for generating a copy of the SQL required to re-create an environment.

  • Use the SQLFILE parameter:

    $ impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp \

    SQLFILE=dpump_dir2:expfull.sql

  • In this mode, impdp doesn’t import anything; it only creates a file that contains the SQL required to re-create the objects within the dump file.


Exporting and Importing Directly Across the Network

  • Powerful technique for copying environments.

  • Efficient way to copy users and data from one database to another.

  • Create users in target environment.

  • Create a database link

  • Create a directory object.

  • Run the impdp command using the NETWORK_LINK parameter, pointing it at the database link you created in the prior step (see the sketch below).
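  • Putting the steps together (a sketch; old_db, the 'OLD_DB' TNS alias, and the heera schema are example names):

    SQL> create database link old_db connect to darl identified by foo using 'OLD_DB';

    $ impdp darl/foo directory=dp_dir network_link=old_db schemas=heera logfile=net_imp.log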


Transportable Tablespaces

  • Powerful feature for copying large amounts of data from one environment to another.

  • Ensure the tablespace is self-contained.

  • Make the tablespaces being transported read-only.

  • Use Data Pump to export the metadata for the tablespaces being transported.

  • Copy the Data Pump export dump file to the destination server.

  • Copy the datafile(s) to the destination database.

  • Import the metadata into the destination database.
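  • A sketch of the sequence for a single tablespace (tbsp1 and the file paths are example names; copying the datafile is done with whatever OS tool you prefer):

    SQL> exec dbms_tts.transport_set_check('TBSP1', TRUE);
    SQL> select * from transport_set_violations;
    SQL> alter tablespace tbsp1 read only;

    $ expdp darl/foo directory=dp_dir dumpfile=tts.dmp transport_tablespaces=tbsp1
    $ scp /ora01/dbfile/tbsp101.dbf target_host:/ora01/dbfile/

    $ impdp darl/foo directory=dp_dir dumpfile=tts.dmp transport_datafiles='/ora01/dbfile/tbsp101.dbf'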


Exporting Tablespace Metadata

  • Sometimes it’s useful to capture tablespace information and transfer it to a new database environment.

  • Use a combination of FULL and INCLUDE=TABLESPACE.

    $ expdp darl/foo directory=dp dumpfile=phredstg.dmp content=metadata_only full=y \

    include=tablespace


Specifying Different Datafile Paths and Names

  • When you’re copying database environments, oftentimes the source and target servers have different mount point names.

  • You can rename the datafiles on the import using REMAP_DATAFILE.

    userid=darl/foo

    directory=dp

    dumpfile=phredstg.dmp

    full=y

    include=tablespace:"like 'TBSP%'"

    remap_datafile="'/ora01/dbfile/O11R2/tbsp101.dbf':'/ora02/O11R2/tb1.dbf'"

    remap_datafile="'/ora01/dbfile/O11R2/tbsp201.dbf':'/ora02/O11R2/tb2.dbf'"

    remap_datafile="'/ora01/dbfile/O11R2/tbsp301.dbf':'/ora02/O11R2/tb3.dbf'"


Changing Segment and Storage Attributes

  • You may have very different segment and storage requirements when copying database environments.

  • Use the TRANSFORM parameter to remove the segment and storage attributes from being specified when importing.

    $ impdp darl/foo directory=dp dumpfile=inv.dmp transform=segment_attributes:n

    $ impdp darl/foo directory=dp dumpfile=inv.dmp transform=storage:n


Importing into a Different Tablespace from the Original

  • You may want to export a user and, on import, map the old user to a new username and map the user’s old tablespaces to new tablespace names.

  • Use REMAP_SCHEMA and REMAP_TABLESPACE

    $ impdp darl/foo directory=dp dumpfile=rm.dmp remap_schema=HEERA:CHAYA \

    remap_tablespace=TBSP1:V_DATA tables=heera.ticket


Sophisticated Filtering Mechanisms

  • Use the QUERY parameter to export or import subsets of data.

  • Use the SAMPLE parameter to export a percentage of the rows in a table.

  • Use the CONTENT parameter to exclude or include data and metadata.

  • Use the EXCLUDE parameter to specifically name items to be excluded.

  • Use the INCLUDE parameter to name the items to be included (thereby excluding other non-dependent items not included in the list).

  • Use parameters like SCHEMAS to specify that you only want a subset of the database’s objects (those that belong to the specified user or users).


Filtering via a Query

  • Use this syntax to filter via a query:

    QUERY = [schema.][table_name:] query_clause

  • Example using a parameter file:

    userid=darl/foo

    directory=dp

    dumpfile=inv.dmp

    tables=inv,reg

    query=inv:"WHERE inv_desc='Book'"

    query=reg:"WHERE reg_id <=20"


Exporting a Percentage of the Data

  • SAMPLE doesn’t work well when exporting combinations of parent/child tables, because the sampled parent rows aren’t guaranteed to match the sampled child rows.

  • General syntax for exporting a percentage of data in a table:

    SAMPLE=[[schema_name.]table_name:]sample_percent

    $ expdp darl/foo directory=dp tables=inv sample=10 dumpfile=inv.dmp


Excluding Objects from the Export File

  • Use this general syntax to exclude objects from being exported:

    EXCLUDE=object_type[:name_clause] [, ...]

    userid=darl/foo

    directory=dp

    dumpfile=inv.dmp

    tables=inv

    exclude=index:"LIKE 'INV%'"


Including Only Specific Objects in an Export File

  • Use INCLUDE to specify objects to be included.

    directory=datapump

    dumpfile=pl.dmp

    include=function:"='ISDATE'",procedure:"='DEPTREE_FILL'"


Excluding Objects from Import

  • Use EXCLUDE to exclude objects on import.

    $ impdp darl/foo dumpfile=d.dmp directory=dp exclude=TRIGGER,PROCEDURE


Including Objects in Import

  • To only include objects on import, use INCLUDE:

    userid=darl/foo

    directory=dp

    dumpfile=h.dmp

    schemas=HEERA

    include=table:"like 'A%'"


Creating a Consistent Export

  • There is no consistent=y switch (as there was in the old exp utility).

  • Use flashback_scn or flashback_time

    SQL> select current_scn from v$database;

    $ expdp darl/foo directory=dp flashback_scn=8400741902387
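  • Alternatively, flashback_time takes a timestamp expression; because of the quoting, it is easiest to place in a parameter file (a sketch with an example date):

    flashback_time="to_timestamp('28-08-2013 14:00:00', 'dd-mm-yyyy hh24:mi:ss')"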


Importing When Objects Already Exist

  • Use TABLE_EXISTS_ACTION to specify behavior if the object already exists

  • The TABLE_EXISTS_ACTION parameter takes the following options:

    • SKIP (default if not combined with CONTENT=DATA_ONLY)

    • APPEND (default if combined with CONTENT=DATA_ONLY)

    • REPLACE

    • TRUNCATE
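  • For example, to add rows to tables that already exist in the target (a sketch using the example names from earlier slides):

    $ impdp darl/foo directory=dp dumpfile=inv.dmp table_exists_action=append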


Renaming a Table

  • Rename a table on import

  • General syntax:

    REMAP_TABLE=[schema.]old_tablename[.partition]:new_tablename

    or

    REMAP_TABLE=[schema.]old_tablename[:partition]:new_tablename
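  • A sketch using example names (imports the dump file’s inv table as inv_bkup in the HEERA schema):

    $ impdp darl/foo directory=dp dumpfile=inv.dmp tables=heera.inv remap_table=heera.inv:inv_bkup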


Remapping Data

  • Allows you to obfuscate data on import.

  • You may not want some production data imported with easily readable values.

  • Create a PL/SQL function to obfuscate the data

    $ impdp darl/foo directory=dp dumpfile=cust.dmp tables=customers remap_data=customers.last_name:obfus.obf
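  • The remap function must accept and return the column’s datatype. A minimal sketch of the obfus package referenced above (the scrambling logic is only an example):

    create or replace package obfus is
      function obf(clear_text varchar2) return varchar2;
    end obfus;
    /
    create or replace package body obfus is
      function obf(clear_text varchar2) return varchar2 is
      begin
        -- replace every character with 'X'; real code might use DBMS_CRYPTO or a lookup table
        return rpad('X', length(clear_text), 'X');
      end obf;
    end obfus;
    /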


Cloning a User

  • Use REMAP_SCHEMA to rename a user during import

    $ impdp darl/foo directory=dp remap_schema=inv:inv_dw dumpfile=inv.dmp


Miscellaneous Features

  • Suppressing a Log File

  • Using Parallelism

  • Specifying Additional Dump Files

  • Reusing Output File Names

  • Creating a Daily DDL File

  • Compressing Output

  • Encrypting Data
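  • Several of these features are just additional command-line parameters; a sketch combining a few of them (the %U substitution variable generates numbered dump files for the parallel workers):

    $ expdp darl/foo directory=dp dumpfile=inv%U.dmp parallel=4 compression=all reuse_dumpfiles=y nologfile=y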


Monitoring Data Pump Jobs

  • Screen output

  • Data Pump log file

  • Database alert log

  • Querying the status table

  • Querying data-dictionary views

  • Interactive command-mode status

  • Using operating-system utilities such as ps (process status)
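  • For example, the data-dictionary views and an OS process listing (a sketch; Data Pump worker processes show up as ora_dw* background processes):

    SQL> select owner_name, job_name, operation, state from dba_datapump_jobs;
    SQL> select opname, sofar, totalwork from v$session_longops where totalwork > sofar;

    $ ps -ef | grep ora_dw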


Data Pump Legacy Mode

  • Enter old exp or imp parameters at the command line when invoking Data Pump

  • Data Pump automatically translates the legacy parameters into parameters it understands

  • Available from 11g R2

  • Very handy feature if you’re familiar with the old exp/imp utilities
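  • For example, invoking expdp with old exp-style parameters (a sketch; Data Pump maps file and log to dumpfile and logfile and reports the translated parameters in its output):

    $ expdp darl/foo file=inv.dmp log=inv.log tables=inv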


Summary

  • Data Pump is a flexible and powerful utility for taking point-in-time backups and copying database environments.

  • As a DBA, you should be proficient with this utility.

  • It offers sophisticated filtering parameters for transferring objects and data.

  • Use Data Pump in legacy mode to view how the old exp/imp parameters are translated into Data Pump parameters.

