Sunday, March 7, 2021

Import and Export Data Using Data Pump on Autonomous Transaction Processing DB

This post shows the steps for importing and exporting data using Data Pump on an Autonomous Transaction Processing (ATP) DB. The ATP used in this case is on shared Exadata infrastructure and is 19c (19.5).
1. As the first step, create the user for the import (or export). The admin user (a user created automatically in ATP) could be used for import and export, but as the admin user has elevated privileges it may not be a good idea. The user creation below is the bare minimum needed to do an import. It has the privilege to connect to the ATP DB with quota unlimited on the default "data" tablespace. It also has read, write privileges on the data_pump_dir directory where the log files would be created. Finally, it has execute privilege on dbms_cloud to create the credential. This is needed only for the duration of the import or export and could be revoked from the user afterwards if deemed unnecessary. Putting it all together, the user creation looks as follows.
create user asanga identified by password quota unlimited on data;
grant connect to asanga;
grant create table to asanga;
grant read,write on directory DATA_PUMP_DIR to asanga; 
grant execute on dbms_cloud to asanga;
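As a quick sanity check, the grants can be verified after logging in as the new user (session_privs shows the system privileges; user_tab_privs shows the object grants on the directory and dbms_cloud):
select * from session_privs;
select * from user_tab_privs;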
2. Create a bucket in OCI object storage to hold the dump files. impdp/expdp will access this bucket to get the necessary dump files.
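This could be done from the OCI console, or with the OCI CLI if one is configured (the compartment OCID is a placeholder; the bucket name matches the one used later in the post):
oci os bucket create --compartment-id <compartment-ocid> --name dumps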

3. Upload the dump files to the object storage bucket created earlier.
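Again the console could be used, or the OCI CLI (the dump file name matches the one used in the import command later on):
oci os object put --bucket-name dumps --file asanga.dmp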

4. Get the URI of the dump file in the bucket by selecting View Object Details.
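The native object storage URI has the following format (region and tenancy namespace are placeholders, consistent with the values used in the commands later in this post):
https://objectstorage.#region#.oraclecloud.com/n/#tenancy namespace#/b/dumps/o/asanga.dmp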

5. Generate an auth token for an OCI user who has access to the bucket created in step 2. Navigate through Identity -> Users and select the user to get to the user details. Then generate an auth token (make note of the token generated, as it cannot be viewed again later).

Another way to get to auth tokens is via Profile -> User Settings.
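Alternatively, if the OCI CLI is configured, the token could be created from the command line (the user OCID is a placeholder):
oci iam auth-token create --user-id <user-ocid> --description "data pump import"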

6. Set up an environment to log in and execute expdp/impdp against the ATP. For this an Oracle client installation that has access to the ATP is needed. The easiest way to achieve this is to create a temporary VM in the same network as the ATP (this ATP was using private endpoints) and allow access from this VM to the ATP via an NSG. On the VM (if using Oracle Linux) install the instant client basic, sqlplus and tools packages from the yum repository (use yum list | grep instantclient to identify the exact versions), as in the example below. This installs the sqlplus, expdp and impdp binaries on the VM, which is the minimum needed to do the import and export.
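For example, on Oracle Linux 7 the install could look like the following (package versions are illustrative; pick the ones listed by yum on the VM, and make sure the instant client repository is enabled):
sudo yum install -y oracle-instantclient19.10-basic oracle-instantclient19.10-sqlplus oracle-instantclient19.10-tools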

7. Download the ATP wallet from the console onto the VM created in step 6 (or to wherever impdp/expdp would be executed). Copy the cwallet.sso, ewallet.p12, sqlnet.ora and tnsnames.ora files to a temporary location (e.g. /home/opc/atp). Edit the sqlnet.ora to reflect the wallet location, as shown below.
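After the edit the sqlnet.ora would look similar to the following (the directory points to the temporary location used here):
WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="/home/opc/atp")))
SSL_SERVER_DN_MATCH = yes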
At the same time, set the TNS_ADMIN variable to the location where the wallet files reside.
export TNS_ADMIN=/home/opc/atp
These files could be removed once the impdp/expdp is done.

8. If steps 6 and 7 are done correctly, it should now be possible to log in to the ATP. Log in to the ATP as the user created in step 1 and create a credential.
sqlplus asanga@testatp_tpurgent

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'ATP_CRED_NAME',
    username => 'user@email.com',
    password => 'auth token created before'
  );
END;
/
To check whether the user can access the bucket using the credential, use the following SQL. If the user has the permission, this will list the contents of the bucket.
select object_name, bytes from dbms_cloud.list_objects('ATP_CRED_NAME','https://objectstorage.#region#.oraclecloud.com/n/#tenancy namespace#/b/dumps');

If using the Swift object storage URI format then use
select object_name, bytes from dbms_cloud.list_objects('ATP_CRED_NAME','https://swiftobjectstorage.#region#.oraclecloud.com/v1/#tenancy namespace#/dumps');

As the client used in this case is 19c, no other steps are needed after creating the credential. But if a 12.2 client is used for the export/import, then the default credential property must also be set, as shown below.
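A minimal sketch of that step (run as the admin user; the credential name is schema-qualified with the owning user, here the user created in step 1):
alter database property set default_credential = 'ASANGA.ATP_CRED_NAME';
With this property set, the dump file URI in the impdp/expdp command is prefixed with default_credential: instead of passing the credential parameter.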

9. Run the import data pump command, specifying the object storage URI as the dump file. In the shared ATP there's a single default tablespace called data, and a remap_tablespace must be used if the system exporting the data had data residing in other tablespaces. Besides the username and the password for the DB user, the credential name (created in step 8) must also be specified.
impdp asanga@atpfree_tp 
directory=data_pump_dir 
credential=ATP_CRED_NAME 
dumpfile=https://objectstorage.#region#.oraclecloud.com/n/#tenancy namespace#/b/dumps/o/asanga.dmp  
remap_tablespace=users:data 
exclude=role_grant,SYSTEM_GRANT,DEFAULT_ROLE,TABLESPACE_QUOTA,user

Import: Release 19.0.0.0.0 - Production on Thu Aug 20 16:48:44 2020
Version 19.8.0.0.0

Copyright (c) 1982, 2019, Oracle and/or its affiliates.  All rights reserved.
Password:

Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Master table "ASANGA"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "ASANGA"."SYS_IMPORT_FULL_01":  asanga/********@atpfree_tp directory=data_pump_dir credential=ATP_CRED_NAME dumpfile=https://objectstorage.#region#.oraclecloud.com/n/#tenancy namespace#/b/dumps/o/asanga.dmp remap_tablespace=users:data exclude=role_grant,SYSTEM_GRANT,DEFAULT_ROLE,TABLESPACE_QUOTA,user
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "ASANGA"."TEST"                             2.406 MB   19862 rows
. . imported "ASANGA"."DATETEST"                           5.5 KB       1 rows
. . imported "ASANGA"."UCPTEST"                          5.484 KB       2 rows
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "ASANGA"."SYS_IMPORT_FULL_01" successfully completed at Thu Aug 20 11:19:18 2020 elapsed 0 00:00:17

The ATP (shared) documentation mentions using transform parameters such as segment_attributes:N, but using any of the following transform parameters
TRANSFORM=LOB_STORAGE:SECUREFILE,DISABLE_ARCHIVE_LOGGING:Y,constraint_use_default_index:Y,segment_attributes:N
resulted in the import job failing with
UDI-00942: operation generated ORACLE error 942
ORA-00942: table or view does not exist
This was the case even when the import job was run as the admin user.

10. Once the import job is done, the log file could be viewed with the following.
sqlplus asanga@testatp_tpurgent
col CHECKSUM format a10
set line 1000
col OBJECT_NAME format a30
SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');
The log file could be copied out of the ATP host to the bucket created to hold the dump files with the following (imp.log is the log file name).
exec DBMS_CLOUD.PUT_OBJECT(credential_name => 'ATP_CRED_NAME',
object_uri => 'https://objectstorage.#region#.oraclecloud.com/n/#tenancy namespace#/b/dumps/o/imp.log',
directory_name => 'DATA_PUMP_DIR',
file_name => 'imp.log');
The log file could be removed with
exec dbms_cloud.DELETE_FILE('DATA_PUMP_DIR','imp.log');
Finally, if the credential is no longer needed, it could be dropped with
exec dbms_cloud.DROP_CREDENTIAL('ATP_CRED_NAME');


Similar to importing directly from an object storage bucket, exporting to an object storage bucket could be done as well. The following example shows exporting two tables into an object storage bucket.
expdp asanga@testatp_tpurgent 
dumpfile=https://objectstorage.#region#.oraclecloud.com/n/#tenancy namespace#/b/dumpbucket/o/configs.dmp 
logfile=configimp.log 
parallel=6 
tables=CONFIG,CONFIG_PARAMETER 
directory=data_pump_dir 
credential=ATP_CRED_NAME 
logtime=all metrics=yes


Export: Release 21.0.0.0.0 - Production on Tue Feb 23 13:31:14 2021
Version 21.1.0.0.0

Copyright (c) 1982, 2020, Oracle and/or its affiliates.  All rights reserved.
Password:

Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
23-FEB-21 13:31:31.786: Starting "ASANGA"."SYS_EXPORT_TABLE_01":  asanga/********@testatp_tpurgent dumpfile=https://objectstorage.#region#.oraclecloud.com/n/#tenancy namespace#/b/dumpbucket/o/configs.dmp  logfile=admconfigimp.log parallel=6 tables=CONFIG,CONFIG_PARAMETER directory=data_pump_dir credential=ATP_CRED_NAME logtime=all metrics=yes
23-FEB-21 13:31:33.289: W-1 Startup on instance 3 took 2 seconds
23-FEB-21 13:31:35.670: W-2 Startup on instance 3 took 0 seconds
23-FEB-21 13:31:37.252: W-3 Startup on instance 3 took 1 seconds
23-FEB-21 13:31:37.262: W-4 Startup on instance 3 took 1 seconds
23-FEB-21 13:31:37.276: W-5 Startup on instance 3 took 1 seconds
23-FEB-21 13:31:38.534: W-6 Startup on instance 3 took 2 seconds
23-FEB-21 13:31:39.605: W-4 Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
23-FEB-21 13:31:40.643: W-6 Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
23-FEB-21 13:31:41.711: W-4      Completed 4 INDEX_STATISTICS objects in 3 seconds
23-FEB-21 13:31:42.072: W-1 Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
23-FEB-21 13:31:42.236: W-6      Completed 2 TABLE_STATISTICS objects in 3 seconds
23-FEB-21 13:31:42.852: W-5 Processing object type TABLE_EXPORT/TABLE/COMMENT
23-FEB-21 13:31:43.211: W-2 Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
23-FEB-21 13:31:46.866: W-3 Processing object type TABLE_EXPORT/TABLE/TABLE
23-FEB-21 13:31:48.604: W-6 Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
23-FEB-21 13:31:48.610: W-3      Completed 2 TABLE objects in 9 seconds
23-FEB-21 13:31:49.579: W-6      Completed 4 INDEX objects in 7 seconds
23-FEB-21 13:31:50.627: W-1 Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
23-FEB-21 13:31:50.636: W-3 Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
23-FEB-21 13:31:50.925: W-5      Completed 14 COMMENT objects in 9 seconds
23-FEB-21 13:31:52.100: W-2      Completed 2 OBJECT_GRANT objects in 12 seconds
23-FEB-21 13:31:53.086: W-1      Completed 2 CONSTRAINT objects in 10 seconds
23-FEB-21 13:31:53.398: W-3      Completed 3 REF_CONSTRAINT objects in 5 seconds
23-FEB-21 13:31:54.387: W-5 . . exported "ASANGA"."CONFIG"                          235.8 KB    3856 rows in 2 seconds using direct_path
23-FEB-21 13:31:54.606: W-5 . . exported "ASANGA"."CONFIG_PARAMETER"                137.8 KB    1351 rows in 0 seconds using direct_path
23-FEB-21 13:32:00.680: W-6 Processing object type TABLE_EXPORT/TABLE/TRIGGER
23-FEB-21 13:32:00.948: W-6      Completed 5 TRIGGER objects in 10 seconds
23-FEB-21 13:32:05.097: W-4 Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
23-FEB-21 13:32:05.443: W-4      Completed 1 MARKER objects in 23 seconds
23-FEB-21 13:32:05.565: ORA-39173: Encrypted data has been stored unencrypted in dump file set.
23-FEB-21 13:32:05.710: W-1      Completed 2 TABLE_EXPORT/TABLE/TABLE_DATA objects in 2 seconds
23-FEB-21 13:32:08.028: W-1 Master table "ASANGA"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
23-FEB-21 13:32:08.032: ******************************************************************************
23-FEB-21 13:32:08.033: Dump file set for ASANGA.SYS_EXPORT_TABLE_01 is:
23-FEB-21 13:32:08.037:   https://objectstorage.#region#.oraclecloud.com/n/#tenancy namespace#/b/dumpbucket/o/configs.dmp

