Set Up a HammerDB Testing AWS RDS Oracle Database

AWS Oracle RDS

Sat, 15 Aug 2020

Move the HammerDB testing database from the local VirtualBox Oracle development image to AWS. It could provide a better testing platform, as it can scale up without limit if you are willing to pay. (Yes, there is a limit, but it is much, much higher than what I need.)

Create RDS Database for Testing

NOTE: We will create AWS resources that incur costs. Please be aware that you will be charged.

Create an AWS RDS Oracle database after logging in to the AWS console:


Standard Edition 2 (License Included) should be good enough for this test. Select the Dev/Test database template:

Pick the DB instance class that you like; it can be changed afterwards:

20GB should be large enough.

P.S. 25GB is not enough for the built-in SELECT Audit test; I needed to increase the storage to 50GB in the end.

Neither backups nor encryption are needed for this testing database:

Enable Performance Insights and Enhanced Monitoring to collect performance information:

Export the TPCC schema from the existing HammerDB database

I have an existing HammerDB testing database in my VirtualBox Oracle development VM. It contains a TPCC schema with 60 warehouses. To save the time of rebuilding the TPCC schema and warehouses in AWS RDS, I decided to export the schema using Oracle Data Pump export (expdp) and upload the dump to S3. I will import the data into AWS RDS later using Data Pump import (impdp).

In the VirtualBox Oracle development VM, execute the following commands in SQL*Plus to create a directory and run the export:

CREATE DIRECTORY DATA_PUMP_DIR_1 AS '/u01/datapump';
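
You can confirm that the directory object points at the intended path by querying DBA_DIRECTORIES:

-- Verify the directory object and its path
SELECT directory_name, directory_path
  FROM DBA_DIRECTORIES
 WHERE directory_name = 'DATA_PUMP_DIR_1';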


DECLARE
  hdnl NUMBER;
BEGIN
  -- Open a schema-mode Data Pump export job
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => NULL);
  -- Write the dump file and the log file to the directory created above
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'tpcc.dmp', directory => 'DATA_PUMP_DIR_1', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'tpcc_exp.log', directory => 'DATA_PUMP_DIR_1', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- Export only the TPCC schema
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''TPCC'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/

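The export job runs asynchronously; you can watch its state from another session using the same view the import step queries later:

-- Data Pump jobs run in the background; poll until STATE shows the job has finished
SELECT owner_name, job_name, operation, state FROM DBA_DATAPUMP_JOBS;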

Create an S3 bucket (hammerdb-expdp in my case) and upload the expdp dump.

Set up S3 integration in RDS

Create an IAM policy that allows Amazon RDS to access the Amazon S3 bucket; it needs the s3:GetObject, s3:ListBucket, and s3:PutObject actions on the bucket:

Create an IAM role that Amazon RDS can assume (the trusted principal is rds.amazonaws.com) and attach the policy to it:

In the RDS console, associate your IAM role with your DB instance, selecting S3_INTEGRATION as the feature:

Configure an option group with the S3_INTEGRATION option and apply it to the instance:

The database needs to be restarted to apply the option group.

Download the expdp dump file from the Amazon S3 bucket to the Oracle DB instance: connect to RDS via SQL Developer and run the following SQL:

-- The call returns a task ID; the task log is named dbtask-<task_id>.log
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'hammerdb-expdp',
         p_directory_name => 'DATA_PUMP_DIR')
       AS task_id FROM DUAL;


To read the task log file (named after the task ID returned above), run the following SQL:

SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-1597064157117-203.log'));

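You can also confirm that tpcc.dmp has arrived by listing the Data Pump directory with the same rdsadmin.rds_file_util package:

-- List DATA_PUMP_DIR to confirm the dump file has been downloaded
SELECT filename, filesize, mtime
  FROM TABLE(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'))
 ORDER BY mtime;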

After the expdp dump has been downloaded to the database, run the following SQL to create the TPCCTAB tablespace (the TPCC objects were stored in this tablespace in the source database, so it must exist before the import):

CREATE TABLESPACE TPCCTAB;
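
On RDS for Oracle, Oracle Managed Files decide the data file placement, so no DATAFILE clause is needed and the tablespace autoextends by default. If you prefer to pre-size it, something like the following should also work (the sizes here are illustrative, not from my setup):

-- Optional alternative: pre-size the tablespace instead of relying on the defaults
CREATE TABLESPACE TPCCTAB DATAFILE SIZE 10G AUTOEXTEND ON NEXT 1G;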

Run the Data Pump import (impdp) to load the TPCC schema:

DECLARE
  hdnl NUMBER;
BEGIN
  -- Open a schema-mode Data Pump import job
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => NULL);
  -- Read the dump file downloaded from S3 and write the import log next to it
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'tpcc.dmp', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'tpcc_imp.log', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- Import only the TPCC schema
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''TPCC'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/


Check the import job status and read the import log file:

SELECT * FROM DBA_DATAPUMP_JOBS;
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR','tpcc_imp.log'));

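Once the import completes, a quick sanity check is to count the warehouses (HammerDB stores them in the WAREHOUSE table, and the source schema had 60):

-- The source TPCC schema was built with 60 warehouses
SELECT COUNT(*) FROM TPCC.WAREHOUSE;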

The TPCC schema is now loaded into the Oracle RDS database, and we can use it for the HammerDB TPCC test.

Remember to terminate the instance when the testing is complete; if the instance keeps running, AWS will continue to charge you. You could stop the instance instead, but there will still be some charges for the database storage.
