

# Migrating an Amazon RDS for Oracle Database to Amazon Redshift


This walkthrough gets you started with heterogeneous database migration from Amazon RDS for Oracle to Amazon Redshift using AWS Database Migration Service (AWS DMS) and the AWS Schema Conversion Tool (AWS SCT). This introductory exercise doesn’t cover all scenarios but provides you with a good understanding of the steps involved in such a migration.

It is important to understand that AWS DMS and AWS SCT are two different tools and serve different needs. They don’t interact with each other in the migration process. At a high level, the steps involved in this migration are the following:

1. Using AWS SCT to do the following:
   + Run the conversion report for Oracle to Amazon Redshift to identify the issues, limitations, and actions required for the schema conversion.
   + Generate the schema scripts and apply them on the target before performing the data load by using AWS DMS. AWS SCT performs the necessary code conversion for objects like procedures and views.

1. Identify and implement solutions to the issues reported by AWS SCT.

1. Disable foreign keys or any other constraints that might impact the AWS DMS data load.

1. AWS DMS loads the data from the source to the target using the full-load approach. Although AWS DMS can create objects in the target as part of the load, it takes a minimalist approach to migrate the data efficiently and doesn't copy the entire schema structure from source to target.

1. Perform postmigration activities such as creating additional indexes, enabling foreign keys, and making the necessary changes in the application to point to the new database.

This walkthrough uses a custom AWS CloudFormation template to create an Amazon RDS for Oracle DB instance and an Amazon Redshift cluster. It then uses a SQL script to install a sample schema and data on the RDS for Oracle DB instance, which you then migrate to Amazon Redshift.

This walkthrough takes approximately two hours to complete. Be sure to follow the instructions to delete resources at the end of this walkthrough to avoid additional charges.

To estimate what it will cost to run this walkthrough on AWS, you can use the AWS Pricing Calculator. For more information, see [https://calculator.aws/](https://calculator.aws/).

**Topics**
+ [Prerequisites for migrating from Amazon RDS for Oracle to Amazon Redshift](chap-rdsoracle2redshift.prerequisites.md)
+ [Migration architecture for migrating from Amazon RDS for Oracle to Amazon Redshift](chap-rdsoracle2redshift.architecture.md)
+ [Step-by-step Amazon RDS for Oracle to Amazon Redshift migration walkthrough](chap-rdsoracle2redshift.steps.md)
+ [Migration from Amazon RDS for Oracle to Amazon Redshift next steps](chap-rdsoracle2redshift.nextsteps.md)

# Prerequisites for migrating from Amazon RDS for Oracle to Amazon Redshift


The following prerequisites are required to complete this walkthrough:
+ Familiarity with Amazon RDS, Amazon Redshift, the applicable database technologies, and SQL.
+ The custom scripts that create the tables to be migrated and the SQL queries for confirming the migration, as follows:
  + `Oracle_Redshift_For_DMSDemo.template` — an AWS CloudFormation template.
  + `Oraclesalesstarschema.sql` — SQL statements to build the **SH** schema.

    These scripts are available at the following link: [dms-sbs-RDSOracle2Redshift.zip](http://docs.aws.amazon.com/dms/latest/sbs/samples/dms-sbs-RDSOracle2Redshift.zip).

    Each step in the walkthrough also contains a link to download the file involved or includes the exact query in the step.
+ A user with AWS Identity and Access Management (IAM) credentials that allow you to launch Amazon RDS DB instances, AWS Database Migration Service (AWS DMS) replication instances, and Amazon Redshift clusters in your AWS Region. For information about IAM credentials, see [Setting up for Amazon RDS](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_SettingUp.html#CHAP_SettingUp.IAM).
+ Basic knowledge of the Amazon Virtual Private Cloud (Amazon VPC) service and of security groups. For information about using Amazon VPC with Amazon RDS, see [Virtual Private Clouds (VPCs) and Amazon RDS](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_VPC.html). For information about Amazon RDS security groups, see [Amazon RDS Security Groups](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.RDSSecurityGroups.html). For information about using Amazon Redshift in a VPC, see [Managing Clusters in an Amazon Virtual Private Cloud (VPC)](https://docs.aws.amazon.com/redshift/latest/mgmt/managing-clusters-vpc.html).
+ An understanding of the supported features and limitations of AWS DMS. For more information, see the [AWS DMS User Guide](https://docs.aws.amazon.com/dms/latest/userguide/Welcome.html).
+ Knowledge of the supported data type conversion options for Oracle and Amazon Redshift. For information about data types for Oracle as a source, see [Using an Oracle database as a source](https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Source.Oracle.html). For information about data types for Amazon Redshift as a target, see [Using an Amazon Redshift Database as a Target](https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Target.Redshift.html).

For more information about AWS DMS, see [Getting started with Database Migration Service](https://docs.aws.amazon.com/dms/latest/userguide/CHAP_GettingStarted.html).

# Migration architecture for migrating from Amazon RDS for Oracle to Amazon Redshift


This walkthrough uses AWS CloudFormation to create a simple network topology for database migration that includes the source database, the replication instance, and the target database in the same VPC. For more information about AWS CloudFormation, see the [AWS CloudFormation documentation](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/Welcome.html).

We provision the AWS resources that are required for this AWS DMS walkthrough through AWS CloudFormation. These resources include a VPC, an Amazon RDS for Oracle instance, and an Amazon Redshift cluster. We use AWS CloudFormation because it simplifies the provisioning process, so we can concentrate on tasks related to data migration. When you create a stack from the AWS CloudFormation template, it provisions the following resources:
+ A VPC with the CIDR block 10.0.0.0/24 and two public subnets in your region: DBSubnet1 (10.0.0.0/26) in Availability Zone (AZ) 1 and DBSubnet2 (10.0.0.64/26) in AZ 2.
+ A DB subnet group that includes DBSubnet1 and DBSubnet2.
+ Oracle RDS Standard Edition Two with these deployment options:
  + License Included
  + Single-AZ setup
  + db.m3.medium or equivalent instance class
  + Port 1521
  + Default option and parameter groups
+ Amazon Redshift cluster with these deployment options:
  + dc1.large
  + Port 5439
  + Default parameter group
+ A security group with ingress access from your computer or 0.0.0.0/0 (access from anywhere) based on the input parameter
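
As a quick sanity check, the subnet layout above can be verified with Python's `ipaddress` module. This sketch is illustrative only and is not part of the walkthrough:

```python
import ipaddress

# CIDR blocks as provisioned by the CloudFormation template
vpc = ipaddress.ip_network("10.0.0.0/24")
subnet1 = ipaddress.ip_network("10.0.0.0/26")   # DBSubnet1, AZ 1
subnet2 = ipaddress.ip_network("10.0.0.64/26")  # DBSubnet2, AZ 2

# Both /26 subnets must fall inside the /24 VPC block without overlapping
assert subnet1.subnet_of(vpc) and subnet2.subnet_of(vpc)
assert not subnet1.overlaps(subnet2)

# Each /26 subnet provides 64 addresses (AWS reserves 5 of them per subnet)
print(subnet1.num_addresses)  # 64
```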

We have designed the AWS CloudFormation template to require few inputs from the user. It provisions the necessary AWS resources with minimum recommended configurations. However, if you want to change some of the configurations and parameters, such as the VPC CIDR block and Amazon RDS instance types, feel free to update the template.

We use the [AWS Management Console](https://console.aws.amazon.com/) to provision the AWS DMS resources, such as the replication instance, endpoints, and tasks. You install client tools such as SQL Workbench/J and the AWS Schema Conversion Tool (AWS SCT) on your local computer to connect to the source and target databases.

Following is an illustration of the migration architecture for this walkthrough.

![\[replication instance\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2RedshiftMigrationArchitecture.png)


# Step-by-step Amazon RDS for Oracle to Amazon Redshift migration walkthrough


In the following sections, you can find step-by-step instructions for migrating an Amazon RDS for Oracle database to Amazon Redshift. These steps assume that you have already prepared your source database as described in preceding sections.

**Topics**
+ [Step 1: Launch the RDS Instances in a VPC by Using the AWS CloudFormation Template](chap-rdsoracle2redshift.steps.launchrdswcloudformation.md)
+ [Step 2: Install the SQL Tools and AWS Schema Conversion Tool on Your Local Computer](chap-rdsoracle2redshift.steps.installsct.md)
+ [Step 3: Test Connectivity to the Oracle DB Instance and Create the Sample Schema](chap-rdsoracle2redshift.steps.connectoracle.md)
+ [Step 4: Test the Connectivity to the Amazon Redshift Database](chap-rdsoracle2redshift.steps.connectredshift.md)
+ [Step 5: Use AWS SCT to Convert the Oracle Schema to Amazon Redshift](chap-rdsoracle2redshift.steps.convertschema.md)
+ [Step 6: Validate the Schema Conversion](chap-rdsoracle2redshift.steps.validateschemaconversion.md)
+ [Step 7: Create an AWS DMS Replication Instance](chap-rdsoracle2redshift.steps.createreplicationinstance.md)
+ [Step 8: Create AWS DMS Source and Target Endpoints](chap-rdsoracle2redshift.steps.createsourcetargetendpoints.md)
+ [Step 9: Create and Run Your AWS DMS Migration Task](chap-rdsoracle2redshift.steps.createmigrationtask.md)
+ [Step 10: Verify That Your Data Migration Completed Successfully](chap-rdsoracle2redshift.steps.verifydatamigration.md)
+ [Step 11: Delete Walkthrough Resources](chap-rdsoracle2redshift.steps.deleteresources.md)

# Step 1: Launch the RDS Instances in a VPC by Using the AWS CloudFormation Template


Before you begin, you’ll need to download an AWS CloudFormation template. Follow these instructions:

1. Download the following archive to your computer: `http://docs.aws.amazon.com/dms/latest/sbs/samples/dms-sbs-RDSOracle2Redshift.zip` 

1. Extract the AWS CloudFormation template (`Oracle_Redshift_For_DMSDemo.template`) from the archive.

1. Save the `Oracle_Redshift_For_DMSDemo.template` file in your working directory.

Now you need to provision the necessary AWS resources for this walkthrough.

1. Sign in to the AWS Management Console and open the AWS CloudFormation console at [https://console.aws.amazon.com/cloudformation](https://console.aws.amazon.com/cloudformation/).

1. Choose **Create stack**.

1. On the **Select Template** page, choose **Upload a template to Amazon S3**.

1. Choose **Choose File**, and then select the `Oracle_Redshift_For_DMSDemo.template` file that you extracted from the `dms-sbs-RDSOracle2Redshift.zip` archive.

1. Choose **Next**. On the **Specify Details** page, provide parameter values as shown following.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.launchrdswcloudformation.html)  
![\[Specify Details page\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift3.png)

1. Choose **Next**. On the **Options** page, choose **Next**.

1. On the **Review** page, review the details, and if they are correct choose **Create**.  
![\[replication instance\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift5.png)

1. AWS CloudFormation can take 20 minutes or more to create the stack, which includes an Amazon RDS for Oracle instance and an Amazon Redshift cluster.  
![\[Create Stack page\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift6.png)

1. After the stack is created, select the **OracletoRedshiftDWusingDMS** stack, and then choose the **Outputs** view. Record the JDBC connection strings, **OracleJDBCConnectionString** and **RedshiftJDBCConnectionString**, for use later in this walkthrough to connect to the Oracle and Amazon Redshift databases.
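
If you want to work with the recorded connection strings programmatically, a small parser like the following can pull out the host and port. The URL shapes and host names here are typical JDBC forms and placeholders, not the exact strings emitted by the stack:

```python
import re

def parse_jdbc(url):
    """Extract engine, host, and port from an Oracle or Redshift JDBC URL.
    The URL shapes handled here are common JDBC forms; they are not
    guaranteed to match the exact stack output format."""
    # Oracle thin driver form: jdbc:oracle:thin:@host:port:sid
    m = re.match(r"jdbc:oracle:thin:@([^:]+):(\d+):(\w+)$", url)
    if m:
        return {"engine": "oracle", "host": m.group(1), "port": int(m.group(2))}
    # Redshift form: jdbc:redshift://host:port/database
    m = re.match(r"jdbc:redshift://([^:/]+):(\d+)/(\w+)$", url)
    if m:
        return {"engine": "redshift", "host": m.group(1), "port": int(m.group(2))}
    raise ValueError("unrecognized JDBC URL: " + url)

# Hypothetical example values, not real stack outputs
print(parse_jdbc("jdbc:oracle:thin:@example.rds.amazonaws.com:1521:ORCL"))
print(parse_jdbc("jdbc:redshift://example.redshift.amazonaws.com:5439/dev"))
```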

# Step 2: Install the SQL Tools and AWS Schema Conversion Tool on Your Local Computer


Next, you need to install a SQL client and AWS SCT on your local computer.

This walkthrough assumes that you use the SQL Workbench/J client to connect to the source and target databases for migration validation.

1. Download SQL Workbench/J from [the SQL Workbench/J website](http://www.sql-workbench.net/downloads.html), and then install it on your local computer. This SQL client is free, open-source, and DBMS-independent.

1. Download the JDBC driver for your Oracle database release. For more information, go to https://www.oracle.com/jdbc.

1. Download the Amazon Redshift driver file, `RedshiftJDBC41-1.1.17.1017.jar`, as described following.

   1. Find the Amazon S3 URL to the file in [Previous JDBC Driver Versions](https://docs.aws.amazon.com/redshift/latest/mgmt/jdbc-previous-versions.html) of the *Amazon Redshift Cluster Management Guide*.

   1. Download the driver as described in [Download the Amazon Redshift JDBC Driver](https://docs.aws.amazon.com/redshift/latest/mgmt/configure-jdbc-connection.html#download-jdbc-driver) of the same guide.

1. Using SQL Workbench/J, configure JDBC drivers for Oracle and Amazon Redshift to set up connectivity, as described following.

   1. In SQL Workbench/J, choose **File**, then choose **Manage Drivers**.

   1. From the list of drivers, choose **Oracle**.

   1. Choose the **Open** icon, then choose the `ojdbc.jar` file that you downloaded in the previous step. Choose **OK**.  
![\[driver management\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift7.png)

   1. From the list of drivers, choose **Redshift**.

   1. Choose the **Open** icon, then choose the Amazon Redshift JDBC driver that you downloaded in the previous step. Choose **OK**.  
![\[driver management\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift8.png)

Next, install AWS SCT and the required JDBC drivers.

1. Download AWS SCT from [Installing, verifying, and updating the Schema Conversion Tool](https://docs.aws.amazon.com/SchemaConversionTool/latest/userguide/CHAP_Installing.html).

1. Follow the instructions to install AWS SCT.

1. Launch AWS SCT.

1. In AWS SCT, choose **Settings**, then choose **Global settings**.

1. In the **Global settings** dialog box, choose **Drivers**, and then choose **Browse** for **Oracle driver path**. Locate the Oracle JDBC driver and choose **OK**.

1. Choose **Browse** for **Amazon Redshift driver path**. Locate the Amazon Redshift JDBC driver and choose **OK**. Choose **OK** to close the dialog box.  
![\[Connecting to the Oracle DB instance\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sct-drivers.png)

# Step 3: Test Connectivity to the Oracle DB Instance and Create the Sample Schema


After the AWS CloudFormation stack has been created, test the connection to the Oracle DB instance by using SQL Workbench/J, and then create the **SH** sample schema.

1. In SQL Workbench/J, choose **File**, then choose **Connect window**. Create a new connection profile using the following information.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.connectoracle.html)

1. Test the connection by choosing **Test**. Choose **OK** to close the dialog box, then choose **OK** to create the connection profile.  
![\[Connecting to the Oracle DB instance\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift9.png)
**Note**  
If your connection is unsuccessful, ensure that the IP address you assigned when creating the AWS CloudFormation template is the one you are attempting to connect from. This issue is the most common one when trying to connect to an instance.

1. Create the **SH** schema that you use for migration by running the custom `Oraclesalesstarschema.sql` SQL script. To obtain this script, do the following:
   + Download the following archive to your computer: `http://docs.aws.amazon.com/dms/latest/sbs/samples/dms-sbs-RDSOracle2Redshift.zip` 
   + Extract the `Oraclesalesstarschema.sql` SQL script from the archive.
   + Save the `Oraclesalesstarschema.sql` file in your working directory.

     1. Open the SQL script in a text editor and copy the entire script.

     1. In SQL Workbench/J, paste the SQL script into the Default.wksp window showing **Statement 1**.

     1. Choose **SQL**, then choose **Execute All**.  
![\[SQL script to install the demo schema\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift9.5.png)

1. Verify that the object types and counts in the **SH** schema were created successfully by running the following SQL query.

   ```
   Select OBJECT_TYPE, COUNT(*) from dba_OBJECTS where owner='SH'
   GROUP BY OBJECT_TYPE;
   ```

   The results of this query should be similar to the following.

   ```
   OBJECT_TYPE     | COUNT(*)
   ----------------+---------
   INDEX PARTITION |       40
   TABLE PARTITION |        8
   TABLE           |        5
   INDEX           |       15
   ```

1. Verify the total number of tables and number of rows for each table by running the following SQL query.

   ```
   Select table_name, num_rows from dba_tables where owner='SH'  order by 1;
   ```

   The results of this query should be similar to the following.

   ```
   TABLE_NAME | NUM_ROWS
   -----------+---------
   CHANNELS   |        5
   CUSTOMERS  |        8
   PRODUCTS   |       66
   PROMOTIONS |      503
   SALES      |      553
   ```

1. Verify data integrity across tables by checking the number of sales made through each channel, using the following SQL query.

   ```
   Select b.channel_desc,count(*) from SH.SALES a,SH.CHANNELS b where a.channel_id=b.channel_id
   group by b.channel_desc
   order by 1;
   ```

   The results of this query should be similar to the following.

   ```
   CHANNEL_DESC | COUNT(*)
   -------------+---------
   Direct Sales |      710
   Internet     |       52
   Partners     |      344
   ```

**Note**  
The preceding examples are representative of validation queries. When you perform actual migrations, you should develop similar queries to validate the schema and the data integrity.
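
Such validation can also be scripted. The following is a minimal sketch that compares per-table row counts captured from the source and the target; the counts shown are this walkthrough's sample values, and in a real migration you would populate the dictionaries from validation queries like those above:

```python
# Per-table row counts captured from the source and the target. The values
# here are the sample counts from this walkthrough; in practice, populate
# these dictionaries from queries run against each database.
source_counts = {"CHANNELS": 5, "CUSTOMERS": 8, "PRODUCTS": 66,
                 "PROMOTIONS": 503, "SALES": 553}
target_counts = {"CHANNELS": 5, "CUSTOMERS": 8, "PRODUCTS": 66,
                 "PROMOTIONS": 503, "SALES": 553}

# Report any table whose counts differ (or that is missing from the target)
mismatches = {table: (source_counts[table], target_counts.get(table))
              for table in source_counts
              if source_counts[table] != target_counts.get(table)}

assert not mismatches, f"row-count mismatches: {mismatches}"
print("all tables match")
```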

# Step 4: Test the Connectivity to the Amazon Redshift Database


Next, test your connection to your Amazon Redshift database.

1. In SQL Workbench/J, choose **File**, then choose **Connect window**. Choose the **Create a new connection profile** icon. Connect to the Amazon Redshift database in SQL Workbench/J by using the information shown following.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.connectredshift.html)

1. Test the connection by choosing **Test**. Choose **OK** to close the dialog box, then choose **OK** to create the connection profile.  
![\[Connecting to the Amazon Redshift DB instance\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift10.png)
**Note**  
If your connection is unsuccessful, ensure that the IP address you assigned when creating the AWS CloudFormation template is the one you are attempting to connect from. This issue is the most common one when trying to connect to an instance.

1. Verify your connectivity to the Amazon Redshift DB instance by running a sample SQL command, such as `select current_date;`.

# Step 5: Use AWS SCT to Convert the Oracle Schema to Amazon Redshift


Before you migrate data to Amazon Redshift, you convert the Oracle schema to an Amazon Redshift schema. [This video covers all the steps of this process](https://youtu.be/ZK7J74VJT04).

To convert an Oracle schema to an Amazon Redshift schema using AWS Schema Conversion Tool (AWS SCT), do the following:

1. Launch AWS SCT. In AWS SCT, choose **File**, then choose **New Project**. Create a new project named `DWSchemaMigrationDemoProject`, specify the **Location** of the project folder, and then choose **OK**.

1. Choose **Add source** to add a source Oracle database to your project, then choose **Oracle**, and choose **Next**.

1. Enter the following information, and then choose **Test Connection**.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.convertschema.html)  
![\[Connecting to an Amazon RDS for Oracle DB instance\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift11.png)

1. Choose **OK** to close the alert box, then choose **Connect** to close the dialog box and to connect to the Oracle DB instance.

1. Choose **Add target** to add a target Amazon Redshift database to your project, then choose **Amazon Redshift**, and choose **Next**.

1. Enter the following information and then choose **Test Connection**.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.convertschema.html)

1. Choose **OK** to close the alert box, then choose **Connect** to connect to the Amazon Redshift DB instance.

1. In the tree in the left panel, select only the **SH** schema. In the tree in the right panel, select your target Amazon Redshift database. Choose **Create mapping**.  
![\[Creating a mapping rule\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift12.png)

1. Choose **Main view**.

1. In the tree in the left panel, right-click the **SH** schema and choose **Collect Statistics**. AWS SCT analyzes the source data to recommend the best keys for the target Amazon Redshift database. For more information, see [Collecting or Uploading Statistics](https://docs.aws.amazon.com/SchemaConversionTool/latest/userguide/CHAP_Converting.DW.html#CHAP_Converting.DW.Statistics).
**Note**  
If the **SH** schema does not appear in the list, choose **Actions**, then choose **Refresh from Database**.

1. In the tree in the left panel, right-click the **SH** schema and choose **Create report**. AWS SCT analyzes the **SH** schema and creates a database migration assessment report for the conversion to Amazon Redshift.

1. Check the report and the action items it suggests. The report discusses the type of objects that can be converted by using AWS SCT, along with potential migration issues and actions to resolve these issues. For this walkthrough, you should see something like the following:  
![\[Database migration report\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift13.png)

1. Review the report summary. To save the report, choose either **Save to CSV** or **Save to PDF**.

1. Choose the **Action Items** tab. The report discusses the type of objects that can be converted by using AWS SCT, along with potential migration issues and actions to resolve these issues.

1. In the tree in the left panel, right-click the **SH** schema and choose **Convert schema**.

1. Choose **Yes** for the confirmation message. AWS SCT then converts your schema to the target database format.
**Note**  
The choice of the Amazon Redshift sort keys and distribution keys is critical for optimal performance. You can use key management in AWS SCT to customize the choice of keys. For this walkthrough, we use the defaults recommended by AWS SCT. For more information, see [Optimizing Amazon Redshift](https://docs.aws.amazon.com/SchemaConversionTool/latest/userguide/CHAP_Converting.DW.RedshiftOpt.html).

1. In the tree in the right panel, choose the converted **sh** schema, and then choose **Apply to database** to apply the schema scripts to the target Amazon Redshift instance.

1. In the tree in the right panel, choose the **sh** schema, and then choose **Refresh from Database** to refresh from the target database.

The database schema has now been converted and imported from source to target.

# Step 6: Validate the Schema Conversion


To validate the schema conversion, you compare the objects found in the Oracle and Amazon Redshift databases using SQL Workbench/J.

1. In SQL Workbench/J, choose **File**, then choose **Connect window**. Choose the **RedshiftConnection** you created in an earlier step. Choose **OK**.

1. Run the following script to list the tables in the **sh** schema in the target Amazon Redshift database. These tables should match the tables in the source Oracle database.

   ```
   SELECT 'TABLE' AS OBJECT_TYPE,
          TABLE_NAME AS OBJECT_NAME,
          TABLE_SCHEMA AS OBJECT_SCHEMA
   FROM information_schema.TABLES
   WHERE TABLE_TYPE = 'BASE TABLE'
   AND   TABLE_SCHEMA = 'sh';
   ```

   The output from this query should be similar to the following.

   ```
   object_type | object_name | object_schema
   ------------+-------------+--------------
   TABLE       | channels    | sh
   TABLE       | customers   | sh
   TABLE       | products    | sh
   TABLE       | promotions  | sh
   TABLE       | sales       | sh
   ```

1. Verify the sort and distribution keys that were created in the Amazon Redshift cluster by using the following query.

   ```
   set search_path to '$user', 'public', 'sh';
   
   SELECT tablename,
          "column",
          TYPE,
          encoding,
          distkey,
          sortkey,
          "notnull"
   FROM pg_table_def
   WHERE (distkey = TRUE OR sortkey <> 0);
   ```

   The results of the query reflect the distribution key (`distkey`) and sort key (`sortkey`) choices made by using AWS SCT key management.

   ```
   tablename  | column              | type                        | encoding | distkey | sortkey | notnull
   -----------+---------------------+-----------------------------+----------+---------+---------+--------
   channels   | channel_id          | numeric(38,18)              | none     | true    |       1 | true
   customers  | cust_id             | numeric(38,18)              | none     | false   |       4 | true
   customers  | cust_gender         | character(2)                | none     | false   |       1 | true
   customers  | cust_year_of_birth  | smallint                    | none     | false   |       3 | true
   customers  | cust_marital_status | character varying(40)       | none     | false   |       2 | false
   products   | prod_id             | integer                     | none     | true    |       4 | true
   products   | prod_subcategory    | character varying(100)      | none     | false   |       3 | true
   products   | prod_category       | character varying(100)      | none     | false   |       2 | true
   products   | prod_status         | character varying(40)       | none     | false   |       1 | true
   promotions | promo_id            | integer                     | none     | true    |       1 | true
   sales      | prod_id             | numeric(38,18)              | none     | false   |       4 | true
   sales      | cust_id             | numeric(38,18)              | none     | false   |       3 | true
   sales      | time_id             | timestamp without time zone | none     | true    |       1 | true
   sales      | channel_id          | numeric(38,18)              | none     | false   |       2 | true
   sales      | promo_id            | numeric(38,18)              | none     | false   |       5 | true
   ```

# Step 7: Create an AWS DMS Replication Instance


After we validate the schema structure between source and target databases, as described preceding, we proceed to the core part of this walkthrough, which is the data migration. The following illustration shows a high-level view of the migration process.

![\[migration process\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift-Step7-DMSOverview.png)


An AWS DMS replication instance performs the actual data migration between the source and target. The replication instance also caches the transaction logs during the migration. The CPU and memory capacity of the replication instance influence the overall time required for the migration.

To create an AWS DMS replication instance, do the following:

1. Sign in to the AWS Management Console and open the AWS DMS console at https://console.aws.amazon.com/dms/v2/. If you are signed in as an AWS Identity and Access Management (IAM) user, you must have the appropriate permissions to access AWS DMS. For more information about the permissions required, see [IAM Permissions](https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Security.html#CHAP_Security.IAMPermissions).

1. Choose **Create Migration** to start a database migration.

1. On the **Welcome** page, choose **Next**.

1. On the **Create replication instance** page, specify your replication instance information as shown following.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.createreplicationinstance.html)

1. For the **Advanced** section, leave the default settings as they are, and choose **Next**.
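
If you prefer to script this step, the replication instance can also be created with the AWS SDK for Python (boto3). The following sketch only builds the parameters and shows the call in a comment; the identifier and instance class are hypothetical example values, not settings mandated by this walkthrough:

```python
# Replication instance parameters mirroring the console settings.
# The identifier and instance class below are hypothetical examples.
params = {
    "ReplicationInstanceIdentifier": "oracle2redshift-repinstance",
    "ReplicationInstanceClass": "dms.t3.medium",
    "AllocatedStorage": 50,
    "PubliclyAccessible": True,
    "MultiAZ": False,
}

# With boto3 installed and AWS credentials configured, the call would be:
#   import boto3
#   dms = boto3.client("dms")
#   response = dms.create_replication_instance(**params)

print(params["ReplicationInstanceIdentifier"])
```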

# Step 8: Create AWS DMS Source and Target Endpoints


While your replication instance is being created, you can specify the source and target database endpoints using the [AWS Management Console](https://console.aws.amazon.com/). However, you can only test connectivity after the replication instance has been created, because the replication instance is used in the connection.

1. Specify your connection information for the source Oracle database and the target Amazon Redshift database. The following table describes the source settings.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.createsourcetargetendpoints.html)

   The following table describes the target settings.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.createsourcetargetendpoints.html)

   The completed page should look like the following.  
![\[Advanced section\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift19.5.png)

1. Wait for the status to read **Replication instance created successfully**.

1. To test the source and target connections, choose **Run Test** for the source and target connections.

1. Choose **Next**.
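
As with the replication instance, the endpoints can be scripted with boto3's `create_endpoint`. The server names, user names, and passwords below are placeholders; substitute the values from your CloudFormation stack outputs:

```python
# Endpoint settings for AWS DMS create_endpoint calls. Server names,
# user names, and passwords are placeholders for your stack's values.
source_endpoint = {
    "EndpointIdentifier": "orasource",
    "EndpointType": "source",
    "EngineName": "oracle",
    "ServerName": "oracle-instance.example.rds.amazonaws.com",  # placeholder
    "Port": 1521,
    "DatabaseName": "ORCL",
    "Username": "oraadmin",
    "Password": "<oracle-password>",
}

target_endpoint = {
    "EndpointIdentifier": "redshifttarget",
    "EndpointType": "target",
    "EngineName": "redshift",
    "ServerName": "redshift-cluster.example.redshift.amazonaws.com",  # placeholder
    "Port": 5439,
    "DatabaseName": "dev",
    "Username": "redshiftadmin",
    "Password": "<redshift-password>",
}

# With boto3:
#   dms = boto3.client("dms")
#   dms.create_endpoint(**source_endpoint)
#   dms.create_endpoint(**target_endpoint)

for endpoint in (source_endpoint, target_endpoint):
    assert endpoint["EndpointType"] in ("source", "target")
```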

# Step 9: Create and Run Your AWS DMS Migration Task


Using an AWS DMS task, you can specify what schema to migrate and the type of migration. You can migrate existing data, migrate existing data and replicate ongoing changes, or replicate data changes only. This walkthrough migrates existing data only.

1. On the **Create Task** page, specify the task options. The following table describes the settings.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.createmigrationtask.html)

   The page should look like the following.  
![\[Create task page\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift23.png)

1. In the **Task Settings** section, specify the settings as shown in the following table.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.createmigrationtask.html)

   The section should look like the following.  
![\[Create task page\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift23.5.png)

1. In the **Selection rules** section, specify the settings as shown in the following table.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/dms/latest/sbs/chap-rdsoracle2redshift.steps.createmigrationtask.html)

   The section should look like the following:  
![\[Add selection rule page\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift24.png)

1. Choose **Add selection rule**.

1. Choose **Create task**. The task begins immediately. The **Tasks** section shows you the status of the migration task.

![\[Tasks page\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift25.5.png)


# Step 10: Verify That Your Data Migration Completed Successfully


When the migration task completes, you can compare your task results with the expected results.

1. On the navigation pane, choose **Tasks**.

1. Choose your migration task (`migrateSHschema`).

1. Choose the **Table statistics** tab, shown following.  
![\[Table statistics tab\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sbs-rdsor2redshift26.png)

1. Connect to the Amazon Redshift instance by using SQL Workbench/J, and then check whether the database tables were successfully migrated from Oracle to Amazon Redshift by running the SQL script shown following.

   ```
   select "table", tbl_rows
   from svv_table_info
   where schema = 'sh'
   order by 1;
   ```

   Your results should look similar to the following.

   ```
   table      | tbl_rows
   -----------+---------
   channels   |        5
   customers  |        8
   products   |       66
   promotions |      503
   sales      |     1106
   ```

1. Verify that the tables and row counts returned by the preceding query match the tables and row counts in the source RDS for Oracle database from the previous steps.
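   To get the source-side counts for comparison, you can query the Oracle data dictionary. The following sketch assumes that you are connected to the RDS for Oracle instance as a user that can see the `SH` schema. Note that `num_rows` comes from optimizer statistics and can be stale, so run exact `count(*)` queries if the numbers don't match.

   ```
   select table_name, num_rows
   from all_tables
   where owner = 'SH'
   order by 1;
   ```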

1. Run the following query to verify the relationship between tables; it counts the number of sales rows for each channel.

   ```
   select b.channel_desc, count(*)
   from sh.sales a
   join sh.channels b on a.channel_id = b.channel_id
   group by b.channel_desc
   order by 1;
   ```

   The output from this query should be similar to the following.

   ```
   channel_desc | count
   -------------+------
   Direct Sales |   355
   Internet     |    26
   Partners     |   172
   ```

1. Verify column compression encoding.

   AWS DMS uses an Amazon Redshift COPY operation to load the data. By default, the COPY command applies automatic compression when it loads into an empty target table. The sample data for this walkthrough is not large enough for automatic compression to be applied; when you migrate larger data sets, COPY applies automatic compression.

   For more details about automatic compression on Amazon Redshift tables, see [Loading Tables with Automatic Compression](https://docs.aws.amazon.com/redshift/latest/dg/c_Loading_tables_auto_compress.html).

   To view compression encodings, run the following query.

   ```
   SELECT *
   FROM pg_table_def
   WHERE schemaname = 'sh';
   ```
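   If you want Amazon Redshift to recommend an encoding for each column of a loaded table, you can run `ANALYZE COMPRESSION`, as in the following sketch for the `sh.sales` table. This command acquires a table lock and samples rows, so avoid running it against busy production tables.

   ```
   analyze compression sh.sales;
   ```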

Now you have successfully completed a database migration from an Amazon RDS for Oracle DB instance to Amazon Redshift.

# Step 11: Delete Walkthrough Resources


After you have completed this walkthrough, perform the following steps to avoid being charged further for the AWS resources that you used. Do the steps in order, because some resources can't be deleted while other resources depend on them.

To delete AWS DMS resources, do the following:

1. On the navigation pane, choose **Tasks**, choose your migration task (`migrateSHschema`), and then choose **Delete**.

1. On the navigation pane, choose **Endpoints**, choose the Oracle source endpoint (`orasource`), and then choose **Delete**.

1. Choose the Amazon Redshift target endpoint (`redshifttarget`), and then choose **Delete**.

1. On the navigation pane, choose **Replication instances**, choose the replication instance (`DMSdemo-repserver`), and then choose **Delete**.

Next, you must delete your AWS CloudFormation stack, `OracletoRedshiftDWusingDMS`. Do the following:

1. Sign in to the AWS Management Console and open the AWS CloudFormation console at [https://console.aws.amazon.com/cloudformation](https://console.aws.amazon.com/cloudformation/).

   If you are signed in as an IAM user, you must have the appropriate permissions to access AWS CloudFormation.

1. Choose your AWS CloudFormation stack, `OracletoRedshiftDWusingDMS`.

1. For **Actions**, choose **Delete stack**.

The status of the stack changes to DELETE\_IN\_PROGRESS while AWS CloudFormation cleans up the resources associated with the `OracletoRedshiftDWusingDMS` stack. When AWS CloudFormation is finished cleaning up resources, it removes the stack from the list.

# Next Steps for Migrating from Amazon RDS for Oracle to Amazon Redshift


You can explore several other features of AWS DMS that were not included in this walkthrough, including the following:
+ The AWS DMS change data capture (CDC) feature, for ongoing replication of data.
+ Transformation actions that let you specify and apply transformations to the selected schema or table as part of the migration process.

  For more information, see [Getting started with Database Migration Service](https://docs.aws.amazon.com/dms/latest/userguide/CHAP_GettingStarted.html).