Accessing and Loading Data



  1. Accessing and Loading Data

  2. Objectives
  After completing this lesson, you should be able to do the following:
  • Use Data Pump to import and export data
  • Create external tables for data population
  • Transport tablespaces across different platforms

  3. Data Pump: Overview
  • Server-based facility for high-speed data and metadata movement
  • Infrastructure callable via DBMS_DATAPUMP
  • Provides new-generation export and import tools:
    • expdp
    • impdp
  • Web-based interface through Enterprise Manager
  • Data Pump Export and Import are supersets of the original Export and Import utilities.
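Because the engine is callable through DBMS_DATAPUMP, a job can be driven entirely from PL/SQL rather than the expdp client. A minimal sketch of a schema-mode export; the job name HR_EXP_JOB, the DATA_PUMP_DIR directory object, and the HR schema are assumptions for illustration:

    DECLARE
      h  NUMBER;         -- job handle
      js VARCHAR2(30);   -- final job state
    BEGIN
      -- Open a schema-mode export job (the job name is arbitrary)
      h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                              job_mode  => 'SCHEMA',
                              job_name  => 'HR_EXP_JOB');
      -- Write the dump file through an existing directory object
      DBMS_DATAPUMP.ADD_FILE(handle    => h,
                             filename  => 'hr_exp.dmp',
                             directory => 'DATA_PUMP_DIR');
      -- Restrict the job to the HR schema
      DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                    name   => 'SCHEMA_EXPR',
                                    value  => 'IN (''HR'')');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, js);
      DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || js);
    END;
    /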

  4. Data Pump: General Architecture
  [Architecture diagram: clients such as expdp, impdp, SQL*Loader, and other callers sit on top of the DBMS_DATAPUMP data/metadata movement engine, which in turn uses the Direct Path API, the Metadata API, and the External Table API with its ORACLE_LOADER and ORACLE_DATAPUMP access drivers.]

  5. Data Pump Export and Import: Overview
  [Diagram: on the source, an expdp client attaches to a Data Pump job; a server process maintains the master table and writes the dump file set. The dump file set is moved to the target (or read directly over a database link in network mode), where an impdp client attaches to a Data Pump job whose server process uses the master table to load the target database.]

  6. Data Pump Export and Import: Benefits
  • Data access methods:
    • Direct path
    • External tables
  • Detach from and reattach to long-running jobs
  • Restart Data Pump jobs
  • Fine-grained object selection
  • Explicit database version specification
  • Parallel execution (Enterprise Edition only)
  • Estimate export job space consumption (see the sketch below)
  • Network mode in a distributed environment
  • Remapping capabilities during import
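The space-estimation benefit can be exercised without writing any dump files. A sketch, assuming the HR schema and the DATA_PUMP_DIR directory object:

    $ expdp system/manager schemas=hr directory=DATA_PUMP_DIR
        estimate_only=y estimate=blocks

The job prints the estimated dump file size per table and exits without creating a file.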

  7. Data Pump Access Methods
  Data Pump supports two access methods:
  • Direct-path load using the direct-path API
  • External tables

  8. Data Pump Direct Path Considerations
  The external tables access method is used instead of direct path under the following conditions:
  • Fine-grained access control is enabled on the table in insert or select mode.
  • A domain index exists for a LOB column.
  • The table is clustered.
  • The table has an active trigger defined.
  • A global index exists on a partitioned table during a single-partition load.
  • The table has BFILE or opaque type columns.
  • A referential integrity constraint is present.
  • The table has VARRAY columns with an embedded opaque type.

  9. Data Pump File Locations
  • Three types of Data Pump files:
    • Dump files
    • Log files
    • SQL files
  • Absolute paths are not supported.
  • Oracle directory objects must be used.
  • Order of precedence of file locations:
    1. Per-file directory
    2. DIRECTORY parameter
    3. DATA_PUMP_DIR environment variable
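Because absolute paths are rejected, a directory object must exist before a job runs, and the invoking user needs privileges on it. A sketch; the path and the grantee are assumptions:

    SQL> CREATE DIRECTORY dpump_dir AS '/u01/app/oracle/dpump';
    SQL> GRANT READ, WRITE ON DIRECTORY dpump_dir TO hr;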

  10. Data Pump File Naming and Size
  • A dump file set can contain more than one file.
  • DUMPFILE determines the list of dump files:
    • Comma-separated list of files
    • %U template
  • The initial number of dump files depends on:
    • The PARALLEL parameter
    • The DUMPFILE parameter
  • The FILESIZE parameter determines the size of each dump file.
  • Preexisting files with matching names are not overwritten.

  11. Data Pump Utility: Interfaces and Modes
  • Data Pump Export and Import interfaces:
    • Command line
    • Parameter file
    • Interactive command line
    • Database Control
  • Data Pump Export and Import modes:
    • Full
    • Schema
    • Table
    • Tablespace
    • Transportable tablespace
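The command-line and parameter-file interfaces combine naturally for repeatable jobs. A sketch of a table-mode export; the parameter file name and its contents are assumptions:

    $ expdp hr/hr parfile=exp_tables.par

where exp_tables.par contains:

    directory=DATA_PUMP_DIR
    dumpfile=hr_tables.dmp
    logfile=hr_tables.log
    tables=employees,departments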

  12. Data Pump and Filtering
  • Fine-grained object selection:
    • INCLUDE=object_type[:"name_expr"]
    • EXCLUDE=object_type[:"name_expr"]
  • Data selection:
    • CONTENT={ALL | METADATA_ONLY | DATA_ONLY}
    • QUERY=[schema.][table_name:]"query_clause"
  Examples:
    EXCLUDE=VIEW
    EXCLUDE=PACKAGE
    EXCLUDE=INDEX:"LIKE 'EMP%'"
    QUERY=hr.employees:"WHERE department_id IN (10,20) AND salary < 1600 ORDER BY department_id"

  13. Data Pump Import Transformations
  • You can remap:
    • Data files using REMAP_DATAFILE
    • Tablespaces using REMAP_TABLESPACE
    • Schemas using REMAP_SCHEMA
  • Using TRANSFORM, you can exclude from tables and indexes:
    • Both the STORAGE and TABLESPACE clauses (SEGMENT_ATTRIBUTES)
    • The STORAGE clause only (STORAGE)
  Examples:
    REMAP_DATAFILE='C:\oradata\tbs6.f':'/u1/tbs6.f'
    TRANSFORM={SEGMENT_ATTRIBUTES | STORAGE}:{y|n}[:TABLE|INDEX]
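A sketch combining the remapping parameters in a single import; the dump file, schema, and tablespace names are assumptions:

    $ impdp system/manager directory=DATA_PUMP_DIR dumpfile=hr_exp.dmp
        remap_schema=hr:hr_test
        remap_tablespace=users:users_test
        transform=storage:n

With transform=storage:n, imported objects are created without their exported STORAGE clauses.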

  14. Data Pump Job Monitoring Views
  [Diagram: the master process appears in V$SESSION_LONGOPS (USERNAME, OPNAME, SID, SERIAL#); DBA_DATAPUMP_JOBS identifies jobs by OWNER_NAME and JOB_NAME; DBA_DATAPUMP_SESSIONS links each job to its attached client sessions through SADDR, which joins to V$SESSION.]
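Attached sessions can be tied back to their jobs by joining through SADDR. A sketch:

    SELECT j.owner_name, j.job_name, j.state,
           vs.sid, vs.serial#
    FROM   dba_datapump_jobs j,
           dba_datapump_sessions ds,
           v$session vs
    WHERE  j.owner_name = ds.owner_name
    AND    j.job_name   = ds.job_name
    AND    ds.saddr     = vs.saddr;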

  15. Database Control and Data Pump

  16. Parallel Full Export and Import: Example
    $ expdp system/manager full=y parallel=4
        dumpfile=DATADIR1:full1%U.dat,
                 DATADIR2:full2%U.dat,
                 DATADIR3:full3%U.dat,
                 DATADIR4:full4%U.dat
        filesize=2G

    $ impdp system/manager directory=NET_STORAGE_1 parallel=4
        dumpfile=full1%U.dat,full2%U.dat,full3%U.dat,full4%U.dat

  17. Limited Schema Export: Example
    $ expdp system/manager schemas=hr,oe
        directory=USR_DATA dumpfile=schema_hr_oe.dat
        parfile=exp_par.txt

  where exp_par.txt contains:
    include=function
    include=procedure
    include=package
    include=type
    include=view:"like 'PRODUCT%'"

    $ impdp system/manager directory=USR_DATA
        dumpfile=schema_hr_oe.dat sqlfile=schema_hr_oe.sql

  18. Network Mode Import: Example
    $ impdp system/manager schemas=hr,sh,payroll
        parfile=imp_par.txt

  where imp_par.txt contains:
    network_link=finance.hq.com
    flashback_time=2003-09-08 09:00
    remap_schema=payroll:finance

  19. Attaching to Existing Job: Example
    $ expdp system/manager attach=EXP_TS1
    ...
    Job: EXP_TS1
      Owner: SYSTEM ...
      Mode: TABLESPACE
      Max Degree: 2
      EXPORT Job Parameters:
      Parameter Name          Parameter Value:
        CLIENT_COMMAND        system/**** estimate=blocks
      ...
        DATA_ACCESS_METHOD    AUTOMATIC
      ...
      State: EXECUTING
      Job error count: 0
    ...
    Export> STOP_JOB
    Are you sure you wish to stop this job ([y]/n): y

  20. Restarting Stopped Job: Example
    $ expdp system/manager attach=exp_ts1
    ...
    Export> parallel=4
    Export> start_job
    ...
    Export> status=600
    Export> continue_client

  21. Data-Only Unload: Example
    $ expdp hr/hr parfile=exp_par.txt

  where exp_par.txt contains:
    directory=HRDATA
    dumpfile=expdat.dmp
    content=data_only
    include=table:"in ('DEPARTMENTS','DEPARTMENTS_HIST','EMPLOYEES','EMPLOYEES_HIST')"
    query="where DEPARTMENT_ID != 30 order by DEPARTMENT_ID"

  22. External Table Population: Overview
  • Unload data to external flat files
  • Handle complex ETL situations
  [Diagram: CREATE TABLE ... AS SELECT unloads database tables into flat files in a proprietary format; INSERT ... SELECT loads those flat files back into database tables.]

  23. External Table Population Operation
  • Uses the ORACLE_DATAPUMP access driver
  • Data cannot be modified.
  • Resulting files can be read only with the ORACLE_DATAPUMP access driver.
  • You can combine generated files from different sources for loading purposes.
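Files written this way can later be mounted as an external table on another database and loaded with an ordinary INSERT ... SELECT. A sketch of the reading side; the external table name, the emp_copy target table, and the file names are assumptions, and the column definitions must be spelled out because no subquery supplies them:

    CREATE TABLE emp_load_ext
      (first_name      VARCHAR2(20),
       last_name       VARCHAR2(25),
       department_name VARCHAR2(30))
    ORGANIZATION EXTERNAL
      ( TYPE ORACLE_DATAPUMP
        DEFAULT DIRECTORY ext_dir
        LOCATION ('emp1.exp','emp2.exp','emp3.exp')
      );

    INSERT INTO emp_copy
      SELECT first_name, last_name, department_name
      FROM   emp_load_ext;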

  24. External Table Parallel Populate Operation
  • Multiple files can be created.
  • Exactly one parallel execution server writes to each file.
  • The PARALLEL and LOCATION clauses influence the degree of parallelism.
  [Diagram: a coordinator drives parallel execution servers, each generating one file: emp1.exp, emp2.exp, emp3.exp.]

  25. External Table Population: Example
    CREATE TABLE emp_ext
      (first_name, last_name, department_name)
    ORGANIZATION EXTERNAL
      ( TYPE ORACLE_DATAPUMP
        DEFAULT DIRECTORY ext_dir
        LOCATION ('emp1.exp','emp2.exp','emp3.exp')
      )
    PARALLEL
    AS
    SELECT e.first_name, e.last_name, d.department_name
    FROM   employees e, departments d
    WHERE  e.department_id = d.department_id
    AND    d.department_name IN ('Marketing', 'Purchasing');

  26. External Table Projected Columns
  [Diagram: order_items1.dat contains the rows 2355,1,2289,46,200 and 2355,A,2264,50,100, where 'A' is an invalid value for the numeric LINE_ID column. With PROJECT COLUMN ALL, the access driver parses every column, so the bad row is rejected for any query: both SELECT COUNT(order_id) FROM order_items_ext and SELECT COUNT(line_id) FROM order_items_ext return 1. With PROJECT COLUMN REFERENCED, only the referenced columns are parsed, so COUNT(order_id) returns 2 while COUNT(line_id) still returns 1.]

  27. External Table Projected Column: Examples
  • The default value is ALL.
  • REFERENCED is useful for performance when the data is known to be safe.
    ALTER TABLE order_items_ext PROJECT COLUMN {ALL | REFERENCED};
    SELECT property FROM DBA_EXTERNAL_TABLES WHERE table_name = 'ORDER_ITEMS_EXT';

  28. Cross-Platform Transportable Tablespaces
  • Simplify data distribution between a data warehouse and data marts
  • Allow database migration from one platform to another
  • Supported platforms: [platform table not reproduced in the transcript; the full list is available from V$TRANSPORTABLE_PLATFORM, shown later in this lesson]

  29. Minimum Compatibility Level
  • Both source and target databases must have COMPATIBLE set to 10.0.0 or higher.
  • Data file headers are platform-aware.
  • Before transporting, make sure that all read-only and offline files are platform-aware.
  [Diagram: files made read-only while COMPATIBLE=9.2.0 keep their old headers; reopening them read/write under COMPATIBLE=10.0.0 rewrites the headers into the platform-aware format.]
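A quick check of the requirement on each database (a sketch):

    SQL> SELECT name, value FROM v$parameter WHERE name = 'compatible';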

  30. Transportable Tablespace Procedure
  [Flowchart, starting on the source and ending on the target:]
  1. Make the tablespaces read-only.
  2. Use Data Pump to extract the metadata.
  3. If the target does not use the same endian format, convert the data files using RMAN.
  4. Ship the data files and the dump file to the target.
  5. Use Data Pump to import the metadata.
  6. Make the tablespaces read/write.
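An end-to-end sketch of these steps for one tablespace; the tablespace, directory, and file names are assumptions, and the RMAN conversion step (shown on a later page) is needed only when endian formats differ:

  On the source:
    SQL> EXEC DBMS_TTS.TRANSPORT_SET_CHECK('TBS1', TRUE);  -- verify the set is self-contained
    SQL> ALTER TABLESPACE tbs1 READ ONLY;
    $ expdp system/manager directory=DATA_PUMP_DIR
        dumpfile=tbs1_meta.dmp transport_tablespaces=tbs1

  On the target, after shipping the data file and the dump file:
    $ impdp system/manager directory=DATA_PUMP_DIR
        dumpfile=tbs1_meta.dmp
        transport_datafiles='/u01/oradata/tbs1_01.dbf'
    SQL> ALTER TABLESPACE tbs1 READ WRITE;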

  31. Determining the Endian Format of a Platform
  Run on both the source and the target:
    SELECT tp.endian_format
    FROM   v$transportable_platform tp, v$database d
    WHERE  tp.platform_name = d.platform_name;

  32. Data File Conversion: Examples
  On the source:
    $ rman target=/
    RMAN> CONVERT TABLESPACE 'FINANCE,HR'
          TO PLATFORM = 'AIX-Based Systems (64-bit)'
          DB_FILE_NAME_CONVERT =
            '/orahome/dbs1', '/orahome/dbs/transport_aix',
            '/orahome/dbs2', '/orahome/dbs/transport_aix';

  or, on the target:
    $ rman target=/
    RMAN> CONVERT DATAFILE '/tmp/transport_stage/*'
          FROM PLATFORM = 'Solaris[tm] OE (32-bit)'
          DB_FILE_NAME_CONVERT =
            '/tmp/transport_stage/fin', '/orahome/dbs1/fin',
            '/tmp/transport_stage/hr',  '/orahome/dbs2/hr';

  33. CLOB Exception
  [Diagram: CLOBs are stored in an endian-dependent format on the source; during transport they are shipped unconverted, and the target performs the endian-independent conversion automatically when the CLOBs are read.]

  34. Dynamic Performance View Changes
  • V$DATABASE now includes the PLATFORM_ID and PLATFORM_NAME columns.
  • The new V$TRANSPORTABLE_PLATFORM view lists PLATFORM_ID, PLATFORM_NAME, and ENDIAN_FORMAT for each supported platform.
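For example, to list every supported platform with its endian format (a sketch):

    SELECT platform_id, platform_name, endian_format
    FROM   v$transportable_platform
    ORDER  BY platform_id;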

  35. Summary
  In this lesson, you should have learned how to:
  • Use Data Pump to import and export data
  • Create external tables for data population
  • Transport tablespaces across different platforms

  36. Practice 17 Overview: Using Data Pump
  This practice covers using Data Pump to manage data.
