In an ideal world, right after taking an export we would immediately note which tables or schemas it contained and keep that note alongside the dump file. In reality, we do that only on special occasions; we forget to supply a log file name, it defaults to "export.log", and it gets overwritten by the next export. Sooner or later we find ourselves with several export dumps and little clue which one contains the very table we need so desperately.
Data Pump does not expose a method to list the contents of a dump file; however, the DBMS_DATAPUMP package has a procedure, GET_DUMPFILE_INFO, which returns some basic information about the export operation that created the file. AlderPump retrieves and displays this data in File Browser; take a look at the screenshot.
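For readers who prefer the raw API, the same information can be pulled with an anonymous PL/SQL block. This is a minimal sketch; the file name and the DATA_PUMP_DIR directory object are examples, so substitute your own:

```sql
-- Sketch: ask Data Pump what it knows about a dump file.
-- 'scott_tables.dmp' and 'DATA_PUMP_DIR' are example names.
SET SERVEROUTPUT ON
DECLARE
  l_info     sys.ku$_dumpfile_info;  -- collection of (item_code, value) pairs
  l_filetype NUMBER;                 -- type of the file examined
BEGIN
  DBMS_DATAPUMP.GET_DUMPFILE_INFO(
    filename   => 'scott_tables.dmp',
    directory  => 'DATA_PUMP_DIR',
    info_table => l_info,
    filetype   => l_filetype);

  FOR i IN 1 .. l_info.COUNT LOOP
    DBMS_OUTPUT.PUT_LINE(l_info(i).item_code || ': ' || l_info(i).value);
  END LOOP;
END;
/
```

The output is a list of numeric item codes with values such as the job name, database version, and creation date — the same data AlderPump shows in File Browser.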
The only documented way to peek inside a dump file is a SQL file import. This is a special form of import in which no real action is performed; instead, the statements that would have been executed are written to a text file. We can then browse through the file and note which objects are in there. While not perfect, this is better than nothing, and hopefully Oracle will supply a better method in future releases.
SQL file mode is also great for testing remaps and transforms without actually running the import. You can experiment as long as needed to figure out the proper combination of parameters without the risk of accidentally erasing something valuable.
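The same dry run is available from the impdp command line via the SQLFILE parameter, which can be combined with remaps. In this sketch the schema names and directory object are examples only — nothing is written to the database, only to the SQL file:

```
# Preview what a schema remap would do, without touching any data.
# DATA_PUMP_DIR and the scott:hr remap are example values.
impdp scott DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott_tables.dmp \
      SQLFILE=remap_preview.sql REMAP_SCHEMA=scott:hr
```

Inspecting remap_preview.sql shows exactly which DDL the real import would issue, so a wrong remap costs nothing but a rerun.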
We select the SQL FILE operation and TABLE mode.
In SQL FILE mode Data Pump generates a SQL file, so its name is required. We name the file scott_tables.sql. Also, since we are doing an import, the dump file name is needed. No surprise, we use our old buddy scott_tables.dmp.
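Under the hood this corresponds to a DBMS_DATAPUMP job opened with the SQL_FILE operation. A rough equivalent of what the GUI sets up might look like the following; the directory object name is an assumption:

```sql
-- Sketch: a SQL_FILE job in TABLE mode reading scott_tables.dmp
-- and writing scott_tables.sql. 'DATA_PUMP_DIR' is an example name.
DECLARE
  l_handle NUMBER;
  l_state  VARCHAR2(30);
BEGIN
  l_handle := DBMS_DATAPUMP.OPEN(operation => 'SQL_FILE',
                                 job_mode  => 'TABLE');

  -- The dump file to read and the SQL file to produce.
  DBMS_DATAPUMP.ADD_FILE(l_handle, 'scott_tables.dmp', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(l_handle, 'scott_tables.sql', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_SQL_FILE);

  DBMS_DATAPUMP.START_JOB(l_handle);
  DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);  -- blocks until done
  DBMS_DATAPUMP.DETACH(l_handle);
END;
/
```

When the job finishes, scott_tables.sql contains the DDL that a real import would have executed; no database objects are created.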
The job is created and executed as usual, but no objects are modified; only the SQL file specified in the Files section is created. The file overwrites any existing file with the same name.
The job has completed, creating the SQL file. AlderPump detects the presence of SQL and LOG files and displays links to view them. When a link is clicked, AlderPump reads the file to the local computer and opens it for viewing. A word of warning: opening large files may take a while, and the program may appear to hang while it is busy reading data. The recommended way to bring large files to client computers is File Browser, which displays progress with time estimates and allows cancellation.
AlderPump retrieves and displays the information provided by the Data Pump API. While the objects contained within the dump file are not listed, there is some rudimentary data about the export job: the database version, job name, platform, creation date (yes, some of us are moonlighting), and a few other parameters. Each Oracle release adds a line or two to the list, and who knows? Maybe a few dozen versions from now we'll have a list of objects.
© 2007-2012 AlderProgs Consulting Ltd.