
Version 2.5

Full support for 32- and 64-bit clients.

Covers all features of Oracle 10.1 through 11.2.

Usability improvements and bug fixes.

Examining the contents of a dump file

Download job definition

In an ideal world, right after taking an export we would write down which tables or schemas it contained and keep that note in a file that travels with the dump. In reality, we do that only on special occasions; we forget to supply a log file name, so it defaults to "export.log" and gets overwritten by the next export. One way or another, from time to time we find ourselves with several export dumps and little clue which one contains the very table we need so desperately.

Data Pump does not expose a method to list the contents of a dump file; however, the DBMS_DATAPUMP package has the GET_DUMPFILE_INFO procedure, which returns some basic information about the export operation that created the file. AlderPump retrieves and displays this data in the File Browser; take a look at the screenshot.
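For the curious, here is roughly what that call looks like in plain PL/SQL. This is a minimal sketch, assuming the dump sits in DATA_PUMP_DIR; it simply prints each header item as a raw code/value pair:

  SET SERVEROUTPUT ON
  DECLARE
    l_info     ku$_dumpfile_info;   -- collection of (item_code, value) pairs
    l_filetype NUMBER;              -- kind of file: dump, log, SQL...
  BEGIN
    -- Read the header of the dump file created by the table export tutorial
    DBMS_DATAPUMP.GET_DUMPFILE_INFO(
      filename  => 'scott_tables.dmp',
      directory => 'DATA_PUMP_DIR',
      info      => l_info,
      filetype  => l_filetype);
    FOR i IN 1 .. l_info.COUNT LOOP
      DBMS_OUTPUT.PUT_LINE(l_info(i).item_code || ' = ' || l_info(i).value);
    END LOOP;
  END;
  /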

The only documented way to peek inside a dump file is a SQL file import. It is a special form of import in which no real action is performed; instead, the statements that would have been executed are written to a text file. We can then browse through the file and note which objects are in there. While not perfect, this is better than nothing, and hopefully Oracle will supply a better method in future releases.

SQL file mode is also great for testing remaps and transforms without actually running the import. You can experiment as long as needed to figure out the proper combination of parameters without the risk of accidentally erasing something valuable.
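In DBMS_DATAPUMP terms, such a rehearsal might look like the sketch below: it opens a SQL_FILE job against our dump, asks for a schema remap and a storage-clause transform, and writes the resulting DDL to a script. The output file name remap_test.sql and the target schema SCOTT_COPY are made up for illustration; error handling is omitted.

  DECLARE
    h     NUMBER;
    state VARCHAR2(30);
  BEGIN
    -- A SQL_FILE job is an import that only writes DDL and touches nothing
    h := DBMS_DATAPUMP.OPEN(operation => 'SQL_FILE', job_mode => 'TABLE');
    DBMS_DATAPUMP.ADD_FILE(h, 'scott_tables.dmp', 'DATA_PUMP_DIR',
                           filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
    DBMS_DATAPUMP.ADD_FILE(h, 'remap_test.sql', 'DATA_PUMP_DIR',
                           filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_SQL_FILE);
    -- Rehearse a remap and a transform; nothing is actually created
    DBMS_DATAPUMP.METADATA_REMAP(h, 'REMAP_SCHEMA', 'SCOTT', 'SCOTT_COPY');
    DBMS_DATAPUMP.METADATA_TRANSFORM(h, 'SEGMENT_ATTRIBUTES', 0);
    DBMS_DATAPUMP.START_JOB(h);
    DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);   -- then inspect remap_test.sql
    DBMS_DATAPUMP.DETACH(h);
  END;
  /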

Prerequisites:

  1. scott_tables.dmp, created by the table export tutorial and located in DATA_PUMP_DIR

Demonstrated actions:

  1. Job Creation Wizard: setting the operation and mode, specifying the input dump file and output SQL file, creating the job.
  2. Main window: job creation, job launching, progress log, job completion.
  3. File viewer: viewing the SQL file.

Step 1/4: Selecting operation and mode

We select the SQL_FILE operation and TABLES mode.

Selecting operation SQL_FILE and mode TABLES on Operations And Mode tab
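Under the hood, the wizard's first tab maps onto the two arguments of DBMS_DATAPUMP.OPEN. A minimal sketch (note that the API spells the mode 'TABLE', singular):

  DECLARE
    h NUMBER;
  BEGIN
    -- Operation and mode exactly as chosen on the first wizard tab
    h := DBMS_DATAPUMP.OPEN(operation => 'SQL_FILE', job_mode => 'TABLE');
    -- Detaching a job that was never started simply discards it
    DBMS_DATAPUMP.DETACH(h);
  END;
  /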

Step 2/4: Specifying input and output files

In SQL_FILE mode Data Pump generates a SQL file, so its name is required; we name the file scott_tables.sql. Also, since we are essentially doing an import, a dump file name is needed. No surprise, we use our old buddy scott_tables.dmp.

Specifying scott_tables.dmp as input file and scott_tables.sql as output
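In API terms this step is two ADD_FILE calls; the file-type constant tells Data Pump which file is the input dump and which is the SQL script to generate. A sketch, detached here without running (the next step starts the job):

  DECLARE
    h NUMBER;
  BEGIN
    h := DBMS_DATAPUMP.OPEN('SQL_FILE', 'TABLE');   -- as in step 1
    -- Input: the dump file to read
    DBMS_DATAPUMP.ADD_FILE(h, 'scott_tables.dmp', 'DATA_PUMP_DIR',
                           filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
    -- Output: the SQL script to generate
    DBMS_DATAPUMP.ADD_FILE(h, 'scott_tables.sql', 'DATA_PUMP_DIR',
                           filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_SQL_FILE);
    DBMS_DATAPUMP.DETACH(h);   -- sketch only; see the next step
  END;
  /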

Step 3/4: Running the SQL file import

The job is created and executed as usual, but no objects are modified; only the SQL file specified in the Files section is created. The new file overwrites any existing file with the same name.

Creating and launching the job. The job completes successfully, generating the SQL file
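Put together, steps 1 through 3 roughly correspond to this self-contained sketch (file names as in the tutorial; error handling omitted):

  SET SERVEROUTPUT ON
  DECLARE
    h     NUMBER;
    state VARCHAR2(30);
  BEGIN
    h := DBMS_DATAPUMP.OPEN('SQL_FILE', 'TABLE');              -- step 1
    DBMS_DATAPUMP.ADD_FILE(h, 'scott_tables.dmp', 'DATA_PUMP_DIR',
                           filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
    DBMS_DATAPUMP.ADD_FILE(h, 'scott_tables.sql', 'DATA_PUMP_DIR',
                           filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_SQL_FILE);  -- step 2
    DBMS_DATAPUMP.START_JOB(h);                                -- step 3
    DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);
    DBMS_OUTPUT.PUT_LINE('Job finished: ' || state);           -- expect COMPLETED
    DBMS_DATAPUMP.DETACH(h);
  END;
  /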

Step 4/4: Viewing the resulting SQL file

The job has completed, creating the SQL file. AlderPump detects the presence of SQL and LOG files and displays links to view them. When a link is clicked, AlderPump reads the file to the local computer and opens it for viewing. A word of warning: opening large files may take a while, and the program may appear to hang while it is busy reading data. The recommended way to bring large files to client computers is the File Browser, which displays progress with time estimates and allows cancellation.
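If you would rather pull the generated script without a GUI, any session that can run PL/SQL can dump it line by line. A minimal sketch using UTL_FILE, assuming read access to DATA_PUMP_DIR:

  SET SERVEROUTPUT ON
  DECLARE
    f UTL_FILE.FILE_TYPE;
    l VARCHAR2(32767);
  BEGIN
    f := UTL_FILE.FOPEN('DATA_PUMP_DIR', 'scott_tables.sql', 'r');
    LOOP
      UTL_FILE.GET_LINE(f, l);     -- raises NO_DATA_FOUND at end of file
      DBMS_OUTPUT.PUT_LINE(l);
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN
      UTL_FILE.FCLOSE(f);
  END;
  /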

Log in text format

Viewing the resulting SQL file

Dump file information

AlderPump retrieves and displays the information provided by the Data Pump API. While the objects contained within the dump file are not listed, there is some rudimentary data about the export job: the database version, job name, platform, creation date (yes, some of us are moonlighting), and a few other parameters. Each Oracle release adds a line or two to the list, and who knows? Maybe a few dozen versions later we'll get a list of objects.
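For reference, here is a sketch that labels a few of those header items by their documented DBMS_DATAPUMP constants (names as in the 11g package documentation; earlier releases expose fewer items):

  SET SERVEROUTPUT ON
  DECLARE
    l_info     ku$_dumpfile_info;
    l_filetype NUMBER;
  BEGIN
    DBMS_DATAPUMP.GET_DUMPFILE_INFO('scott_tables.dmp', 'DATA_PUMP_DIR',
                                    l_info, l_filetype);
    FOR i IN 1 .. l_info.COUNT LOOP
      CASE l_info(i).item_code
        WHEN DBMS_DATAPUMP.KU$_DFHDR_DB_VERSION THEN
          DBMS_OUTPUT.PUT_LINE('Database version: ' || l_info(i).value);
        WHEN DBMS_DATAPUMP.KU$_DFHDR_JOB_NAME THEN
          DBMS_OUTPUT.PUT_LINE('Job name........: ' || l_info(i).value);
        WHEN DBMS_DATAPUMP.KU$_DFHDR_PLATFORM THEN
          DBMS_OUTPUT.PUT_LINE('Platform........: ' || l_info(i).value);
        WHEN DBMS_DATAPUMP.KU$_DFHDR_CREATION_DATE THEN
          DBMS_OUTPUT.PUT_LINE('Created.........: ' || l_info(i).value);
        ELSE
          NULL;   -- a dozen more items exist; see the package spec
      END CASE;
    END LOOP;
  END;
  /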

Dump file information as returned by the API call