Oracle Data Pump is a complex tool. It has many features controlled by over 80 parameters which sometimes interact in unobvious ways. AlderPump makes them much easier to use by disabling non-applicable pages and warning about incompatible combinations - but even so, it can be hard to figure out the right combination. These mini-tutorials were created by AlderProgs to guide you through some common tasks and perhaps demonstrate a trick or two.
The screenshots show how to perform tasks in AlderPump, but thanks to its unique Command Line generation feature, you can always see the corresponding expdp or impdp command. Each tutorial includes an AlderPump saved job which you can load into AlderPump's Job Creation Wizard and run yourself.
The tutorials use the Oracle sample schemas: HR, OE, PM, IX, SH, and the all-time favorite SCOTT (of scott/tiger fame). Please refer to the Oracle Sample Schemas guide for their descriptions, diagrams, and setup instructions.
For file locations, we use the Oracle directory DATA_PUMP_DIR, which is typically created at database installation. The section Default Locations for Dump, Log, and SQL Files in the Oracle Utilities guide provides more details on this directory and other available options.
Although we run all examples as user SYSTEM, its schema is never modified by our actions. We always operate on other schemas; SYSTEM is used merely for connecting to the database.
To demonstrate network imports, we've created a public database link LOOPBACK, available to all users. The link connects with the current user's credentials and points back to our own database:
CREATE PUBLIC DATABASE LINK loopback USING 'tns_alias';
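With the link in place, a network-mode import can pull objects straight from the source database, with no dump file involved. A sketch of such a command (orcl is a placeholder TNS alias, and SCOTT_COPY is a hypothetical target user; you will be prompted for the password):

impdp system@orcl NETWORK_LINK=loopback SCHEMAS=scott REMAP_SCHEMA=scott:scott_copy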
Finally, for the file browser tutorials, we configured server-side directory browsing using the PL/SQL method, as explained in the AlderPump manual.
In this section we concentrate on simple everyday tasks such as moving table data around. This includes exporting data from tables and re-importing it into tables located on the same or a different database. Later sections deal with moving object definitions, or metadata; here we stick to operating on table contents rather than table structure.
Table operations are the simplest. As their name implies, they operate on objects located in one single schema (multiple schemas since v11.2), but are limited to tables and their associated objects: triggers, foreign key constraints, indexes, comments, and the like. Note that objects like views, synonyms, types, packages, sequences, etc. are not associated with tables (although they depend on them); in table mode such objects are not processed.

How to create or fix a sample schema should it become corrupted
This is the base tutorial. It shows how to export selected tables residing in the same schema, together with the objects associated with them. It also creates the dump file scott_tables.dmp referenced throughout these tutorials.
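The equivalent command line might look like this (a sketch; orcl is a placeholder TNS alias, and the table list is an example selection from the SCOTT schema):

expdp system@orcl TABLES=scott.emp,scott.dept DIRECTORY=data_pump_dir DUMPFILE=scott_tables.dmp LOGFILE=scott_tables.log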
We've exported the tables into a dump file, but what about importing them back? This tutorial shows how to bring these tables into the same schema on another database. The tables should not already be present in the schema; they are created by the import.
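A sketch of the corresponding import, assuming scott_tables.dmp has been copied to the target database's DATA_PUMP_DIR (orcl2 is a placeholder alias for the target database):

impdp system@orcl2 DIRECTORY=data_pump_dir DUMPFILE=scott_tables.dmp LOGFILE=scott_tables_imp.log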
So far we've covered the case where the tables being imported did not exist. But what about cases when all or some of them do? By default the import skips them, reporting ORA-39151: Table "SCOTT"."DEPT" exists. All dependent metadata and data will be skipped due to table_exists_action of skip. This tutorial discusses how to handle such cases.
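The behaviour is controlled by the TABLE_EXISTS_ACTION parameter, which accepts SKIP (the default), APPEND, TRUNCATE, and REPLACE. For example, a sketch that empties the existing tables and reloads their data:

impdp system@orcl DIRECTORY=data_pump_dir DUMPFILE=scott_tables.dmp TABLE_EXISTS_ACTION=truncate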
Dump files may contain a number of tables; sometimes it is desirable to import only some of them. This is done by applying metadata filters which specify the objects that should or should not be processed by the import.
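For instance, a sketch restricting the import to a single table (EMP is an example choice):

impdp system@orcl DIRECTORY=data_pump_dir DUMPFILE=scott_tables.dmp TABLES=scott.emp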
By now we've learned how to reload tables into their original schema - whether they were present or not. But what if we want to restore the tables to another schema? This is achieved by adding a metadata remap rule which applies to schema names and translates the old name to the new one.
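On the command line this corresponds to the REMAP_SCHEMA parameter; a sketch that re-creates SCOTT's tables under a hypothetical SCOTT_COPY user:

impdp system@orcl DIRECTORY=data_pump_dir DUMPFILE=scott_tables.dmp REMAP_SCHEMA=scott:scott_copy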
Situation: an import fails to create tables because their tablespace does not exist on the target database. Are the tables pinned to their tablespace forever?
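They are not: the REMAP_TABLESPACE parameter substitutes one tablespace name for another during import. A sketch (both tablespace names are examples):

impdp system@orcl DIRECTORY=data_pump_dir DUMPFILE=scott_tables.dmp REMAP_TABLESPACE=old_ts:users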
In all previous chapters we knew exactly what our dump file contained. But sometimes you just want to quickly check whether a particular table is in the dump set without actually importing anything. While Data Pump offers no direct way to do that, there is a workaround.
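One such workaround is the SQLFILE parameter: the import writes the DDL it would have executed into a script instead of running it, so nothing is actually created. A sketch:

impdp system@orcl DIRECTORY=data_pump_dir DUMPFILE=scott_tables.dmp SQLFILE=scott_tables.sql

Searching the resulting scott_tables.sql for the table's CREATE TABLE statement answers the question.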
Unlike table mode, which only works on tables and their "satellites", in schema mode Data Pump processes objects of any type. Besides objects with data (tables and materialized views), it also recognizes objects that are essentially metadata - constraints, grants, stored procedures... any type an Oracle database may contain. Another important point is that the definition of the user itself is now exported - including default/temporary tablespaces and grants.
For the tutorials below, schema SCOTT with its 4 tables and 2 indexes is too simple; we are going to use OE, which has a wider variety of object types: a function, a sequence, synonyms, triggers, views, types and type bodies - and, of course, a number of tables.
Export schema OE, including the user definition and all objects it contains. The resulting file oe.dmp will serve as the base for the other schema tutorials.
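A sketch of the corresponding command line (orcl is a placeholder TNS alias):

expdp system@orcl SCHEMAS=oe DIRECTORY=data_pump_dir DUMPFILE=oe.dmp LOGFILE=oe.log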
© 2007-2012 AlderProgs Consulting Ltd.