Migrating Data to AWS Using the AWS Schema Conversion Tool: A Preview

We will explore the best ways of migrating data to AWS using the AWS Schema Conversion Tool.

The AWS Schema Conversion Tool is a component of the AWS Database Migration Service (DMS), which is still in preview mode at the time of writing this post (December 2015). We have discussed the benefits of migrating to AWS in a previous post.

With DMS, Amazon is wooing corporate customers with a low-cost means of moving their database workloads to the cloud. Amazon already has a managed database service called RDS (Relational Database Service). However, RDS has its own limitations, and one of them relates to data migration. Until now, there was no simple way of migrating schema and data from an on-premise or EC2-hosted database to RDS. Because RDS is a managed service, it doesn't support a number of features specific to the underlying database engines. Previously, the only solution required DBAs and developers to write their own migration scripts and test them in a painful, iterative manner until all the wrinkles had been ironed out.

Amazon addresses this issue with the Database Migration Service. As Amazon's public website for DMS states, the service allows seamless and continuous data replication between a source and a target environment during a migration. The source and target environments may use the same database engine or different ones. It is worth noting that the source can be an on-premise database instance, an EC2-hosted instance, or even another RDS instance. The target can be an EC2 instance or another RDS instance. AWS now supports a number of database engines, so these combinations cover a wide variety of use cases. Migrating data to AWS is getting progressively easier with each passing week. Here's a good post about the AWS Database Migration Service that was written at re:Invent 2015.

Anyone who has ever worked on a data migration project knows the process is never simple — never. A number of disciplines are involved: solutions architects, infrastructure architects, engineers, DBAs, developers, and quality assurance testers. All these roles form project teams whose work can span from a few weeks to many months. Successful database migration is generally iterative in nature. A repeatable process may recreate the target database structure every time and load it with fresh source data repeatedly for quality assurance purposes. This process continues until everything works perfectly and approvals are given for the next step. In my experience, the sooner this iterative process is put into practice, the better the chance of a successful migration.

But a seamless data migration can only happen when the source and destination databases use similar or at least compatible structures. It may be relatively easy to migrate an on-premise SQL Server database to an RDS instance running the same engine, but not so easy if the target is, say, MySQL. That's because the two engines define database objects differently and their code syntax is also widely different. There is no easy way out here. When the source and target databases are different, DBAs and developers have to bite the bullet and script the schema objects and programming logic in the target environment's own language. More often than not, this becomes the most time-consuming part of migrating data to AWS.
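
To make the syntax gap concrete, here is a minimal sketch of how the same table might be defined on each engine. The table and columns are hypothetical, not taken from AdventureWorks or from the tool's output:

    -- SQL Server (T-SQL): IDENTITY column, UNIQUEIDENTIFIER, defaults via NEWID()/GETDATE()
    CREATE TABLE HumanResources.Employee (
        EmployeeID INT IDENTITY(1,1) PRIMARY KEY,
        RowGuid    UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID(),
        HireDate   DATETIME2 NOT NULL DEFAULT GETDATE(),
        FullName   NVARCHAR(100) NOT NULL
    );

    -- MySQL 5.6: AUTO_INCREMENT instead of IDENTITY, CHAR(36) instead of UNIQUEIDENTIFIER;
    -- a NEWID()-style default has no direct equivalent and needs a trigger or application code
    CREATE TABLE Employee (
        EmployeeID INT AUTO_INCREMENT PRIMARY KEY,
        RowGuid    CHAR(36) NOT NULL,
        HireDate   DATETIME NOT NULL,
        FullName   VARCHAR(100) NOT NULL
    );

Every difference of this kind has to be resolved somewhere, which is why hand-written conversion scripts take so long.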

This is where the AWS Schema Conversion Tool can come in really handy. You are probably wondering, "Can the AWS Schema Conversion Tool save time for the database engineer?" The answer is yes! The AWS Schema Conversion Tool can connect to a source database engine and reverse engineer the source database structure into a format suitable for the target database. This includes converting both database schema objects (tables, indexes, constraints, etc.) and database logic (functions, stored procedures, triggers, etc.). Where it can't convert a source item, it will flag it and recommend remediation steps. You can then apply the converted schema and logic to the target environment from the same tool and make any manual changes to the target as suggested.

Migrating data to AWS has been common for a while, and database migration toolkits are nothing new either. Most commercial database vendors offer them for free, but few offer multiple source and target options, and I can't think of any others available for almost all operating systems. The AWS Schema Conversion Tool is a desktop product that runs on Windows, Mac OS, Fedora, and Ubuntu Linux. It's free and can be downloaded once you sign up for a preview of the Database Migration Service.
Being the first of its kind from AWS, this tool has some limitations in its first release. It converts databases to RDS instances only — more precisely, to MySQL or Aurora in RDS. The source database engine can be either Microsoft SQL Server or Oracle. That means there's no support for PostgreSQL or the more recent addition, MariaDB.

Installing the AWS Schema Conversion Tool

We tested the AWS Schema Conversion Tool by installing it on a Windows 10 machine as well as on a Mac (running Yosemite). The installation process is fairly simple and fast in both cases.
Once the tool is installed, you need to download and install the JDBC drivers for the supported database engines. This is because the AWS Schema Conversion Tool uses JDBC to connect to databases. In our case, we downloaded and installed two: one for SQL Server and one for MySQL.

Once the installation is complete, you need to tell the AWS Schema Conversion Tool where to find those drivers. This is done by selecting the Settings > Global Settings menu option and choosing Drivers from the dialog box. In the image below, we have specified the paths for both drivers:
AWS Schema Migration Toolkit JDBC Global Settings
With the initial setup complete, you can start a new migration project (every migration has to be part of a project). In our case, we chose to convert the well-known AdventureWorks database running in an EC2 instance to a MySQL 5.6 RDS instance.
The first step is to define the project properties. On this screen, you can see that the source can be either SQL Server or Oracle and the destination can be either MySQL or Aurora.
AWS Schema Conversion Tool New Project

Connecting to the Source Database

Once the project is created, it’s time to connect to the source instance. In the image below we are connecting to our SQL Server 2012 instance in EC2.
SQL Server Connection Properties
Although SQL Server instances have names (a default instance typically takes the name of the Windows host), we did not specify one here. The user account had been created beforehand. One thing to note: the user account must be a SQL Server standard login, not a Windows domain or local account. If SQL Server is running in Windows-only authentication mode, this will not work because no standard logins (not even sa) will be available.
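
For reference, a standard login of the kind described above can be created with a few T-SQL statements. This is only a sketch with hypothetical names; the exact privileges the tool needs may differ, but read access to the data and to object definitions is the general idea:

    -- Run on the source SQL Server instance (login name and password are hypothetical)
    CREATE LOGIN sct_user WITH PASSWORD = 'Str0ngP@ssword!';
    GO
    USE AdventureWorks;
    GO
    CREATE USER sct_user FOR LOGIN sct_user;
    -- Read access to data plus permission to view object definitions
    EXEC sp_addrolemember 'db_datareader', 'sct_user';
    GRANT VIEW DEFINITION TO sct_user;
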
Once the source is connected, its databases and the schemas under each database are loaded in the left pane of the screen.
Source Database Tree View
Now if you are a SQL Server DBA or developer, you will notice a couple of things immediately:

  • A SQL Server instance has many components that are visible from SQL Server Management Studio — the de facto client tool for SQL Server. These objects include logins and server roles, jobs, alerts, linked servers, replication publishers/subscribers, and so on. Even within each database, there are objects like users, partitions, full-text indexes, etc. When you connect to the instance with the AWS Schema Conversion Tool, only databases and their objects, such as tables, indexes, constraints, or stored procedures, are accessible.

If you are wondering why the tool works this way, remember that the Schema Conversion Tool is built for converting database components only, nothing else. That's why it shows only database objects from the source instance.

The same can be said about Oracle databases. Oracle Enterprise Manager Database Control has many different links for schemas, tables, users, directory objects, jobs, RMAN backups, and so on. Again, only the database objects belonging to the different schemas are viewable from the conversion tool.

  • In SQL Server Management Studio, similar types of components in a database are grouped together. For example, all tables from all schemas in a database are listed under the Tables folder. The same types of folders exist for views, stored procedures, or functions. You may also note that even within each type of object, there can be further grouping for related objects. For example, the node for a particular table will have all its indexes listed under one folder, all its constraints listed under another folder, and so on. The AWS Schema Conversion Tool shows this classification differently. Here, each schema under a database has its own folder. Under each schema, there are sub-folders for tables, views, stored procedures, and so on. In essence, even though the objects belong to the same database, they are placed under different schemas.

The reason for this difference lies in terminology. The term "database schema" is loosely used to mean two different things. One meaning refers to the schema as the overall structural definition of the database. This definition includes the actual command to create the database as well as the commands to create objects within it. The other meaning refers to a namespace that belongs to a user account. In traditional database engineering terms, a schema is related to a database user — all the objects owned by a user belong to that user's schema. A database then becomes just the holder for all the schemas in it.

SQL Server did not implement the concept of schemas in its older versions. However, it's very much a part of the database engine now, and Microsoft has implemented the ANSI version of schema definition in its database product. At a high level, though, everything is still seen from an individual database perspective.

The AWS Schema Conversion Tool sees things differently. It treats schemas in the traditional sense: a grouping of database objects belonging to the same user. This means that when you convert the source database components, the highest level you can convert from is the schema level, not the SQL Server database level. As a result, each separate schema in a SQL Server database will end up as a separate MySQL database in the target.
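
A short sketch makes the mapping clearer. We are assuming here that the converted database simply takes the source schema's name; the exact name the tool generates in the target may differ:

    -- SQL Server source: objects are addressed as database.schema.object
    SELECT TOP 5 * FROM AdventureWorks.HumanResources.Department;

    -- MySQL target: each source schema becomes its own database,
    -- so the same table is addressed as database.object
    SELECT * FROM HumanResources.Department LIMIT 5;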

For a visual understanding, we are showing a side-by-side comparison:
SQL Server Management Studio and AWS Schema Conversion Toolkit User Interface

Connecting to the Destination Database

Next, we connected to our target environment, the MySQL RDS instance. Here we provided similar information:
MySQL Connection Properties
Once connected, the target instance's schema objects are loaded in the right pane of the tool.

Running the Database Migration Assessment Report

It's now time to check which components of our source database (AdventureWorks) can be converted to MySQL format. For this, we selected a schema, right-clicked on it, and selected "Create Report" from the pop-up menu.
Starting the Database Migration Assessment Report
This kicks off the schema conversion engine. The AWS Schema Conversion Tool will take its time to reverse engineer every object within the schema and check whether the generated code can run against the target instance without any change. If it can, the tool makes a note of this finding. If it can't, it notes why the conversion is not possible. At the end, the assessment generates a report called the Database Migration Assessment Report.

You can access the Database Migration Assessment Report from the View menu of the conversion tool. The assessment report has two parts: the Summary and the Action Items.
The Summary page gives an overall picture of the migration's feasibility. It shows how many objects of each type of component the tool can convert and how many it can't.

The Action Items page goes deeper into the analysis. It lists every object that can't be converted to the target engine and the reason for each failure. I find it hugely helpful that the AWS Schema Conversion Tool highlights the particular command or syntax in the generated code that caused the problem. This code is viewable from the lower half of the Action Items screen.
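
To give a flavour of what ends up on the Action Items page, here is a hypothetical T-SQL fragment of the sort that typically cannot be converted automatically, because MySQL 5.6 has no direct equivalent for the OUTPUT clause or TRY...CATCH error handling. It is an illustration only, not an item from our actual report:

    -- T-SQL stored procedure fragment (SQL Server 2012)
    CREATE PROCEDURE HumanResources.uspArchiveEmployees
    AS
    BEGIN
        BEGIN TRY
            -- OUTPUT returns the deleted rows in the same statement
            DELETE FROM HumanResources.Employee
            OUTPUT DELETED.EmployeeID, DELETED.FullName
            WHERE HireDate < '2000-01-01';
        END TRY
        BEGIN CATCH
            -- TRY...CATCH must be rewritten with MySQL condition handlers
            -- (DECLARE ... HANDLER) in the converted procedure
            THROW;
        END CATCH
    END;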

In the following images, we see the summary and action items reports for two schemas. You can see that the bottom half of the Action Items screen shows the generated code for a particular object that can't be converted. When you select an individual problem item from the top half of the screen, its related object is highlighted in the left pane and its generated code is shown at the bottom.
Database Migration Assessment Report Summary View
Database Migration Assessment Report Action Items
Database Migration Assessment Report Summary View
Database Migration Assessment Report Action Items

Converting the Source Schema

With the assessment report giving you an overall picture of the migration's feasibility, you have two choices:

  • Make changes to the source database structure so it matches the target database syntax. This may mean changing data types for table columns or rewriting stored procedures, triggers, and functions with different syntax — in most migration scenarios, this is not an option.
  • Apply the converted schema to the target instance as is, and then make manual changes to the target database by creating tables with the correct data types or writing code with the correct syntax (see the sketch after this list). Most data engineers prefer this second option.
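
As an example of what those manual changes might look like on the MySQL side, here is a sketch assuming a RowGuid column (a UNIQUEIDENTIFIER in the source) that the conversion left behind with no default. The names and types are hypothetical:

    -- Adjust the column to a MySQL-friendly type
    ALTER TABLE HumanResources.Employee
        MODIFY COLUMN RowGuid CHAR(36) NULL;

    -- Recreate the lost NEWID() default with a trigger, since MySQL 5.6
    -- cannot call UUID() in a column DEFAULT clause
    CREATE TRIGGER trg_employee_rowguid
    BEFORE INSERT ON HumanResources.Employee
    FOR EACH ROW
    SET NEW.RowGuid = IFNULL(NEW.RowGuid, UUID());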

Migrating the schema to the target environment is a two-step process:

  • Convert the schema. This is where the AWS Schema Conversion Tool generates a local script to apply to the target environment.
  • Apply the converted schema. The converted schema code is applied to the target database instance, and the target objects are created.

In the images below, we have chosen to convert two schemas within the AdventureWorks database (HumanResources and dbo):
Converting Source Database Schema
Converting Source Database Schema

Applying the Converted Schema

With the schemas converted to MySQL format, the target instance pane shows the new databases. However, they don't physically exist yet. To actually create the database objects, we had to apply the schemas from the target database pane:
Applying the Converted Schema to Target
Applying the Converted Schema to Target
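
Once the apply step finishes, a quick check from any MySQL client confirms that the objects really exist. We are assuming here that the converted databases carry the source schema names:

    SHOW DATABASES;                                -- HumanResources, dbo, ... should be listed
    SHOW TABLES IN HumanResources;                 -- tables converted from that schema
    SHOW CREATE TABLE HumanResources.Department;   -- inspect the DDL the tool generated
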
And that completes the initial schema migration. If you are interested in migrating to virtual machines, you may find our post on Migrating Virtual Machines to the AWS Cloud useful.

Conclusion

We found the AWS Schema Conversion Tool to be a simple and intuitive piece of software with an elegant installation.

As noted before, there is no option to migrate to or from a PostgreSQL database. Postgres powers many corporate database workloads, and we don't know when this functionality will be available in the tool. The same can be said about other database engines, such as MySQL or MariaDB, as sources. The target environment can only be one type, and that's RDS. This may help Amazon push its customers to adopt RDS, but it isn't helpful if a customer wants to run their database in EC2.

Even if the schema migration runs smoothly and most of the objects can be transferred easily, database engineers still have to support other "moving parts." This includes a multitude of components: SQL Server Agent jobs, Oracle jobs, custom cron jobs, Windows scheduled tasks, network shares, users and privileges, linked servers, linked databases, partitions, full-text indexes, and so on.

The philosophy behind this tool is focused. The tool is not meant to be a magic bullet for a turn-key migration – no tool can claim such functionality. Data mapping is the biggest time-sink in most migrations, and this tool addresses that issue very well.

As we saw from our example, it's not easy to migrate even a sample database in a lift-and-shift manner. That may disappoint many DBAs. After all, everyone wants a smooth migration where the tool does all the heavy lifting. We believe the AWS Schema Conversion Tool's strengths lie in its ability to pinpoint exactly where the migration problems are and to suggest ways to address those issues.


Written by

Sadequl Hussain

Sadequl Hussain is an IT pro based in Sydney, Australia. He comes from a strong database administration background and has more than 15 years of experience in development, database management, training, and technical writing. Sadequl also holds a number of vendor certifications, including one from AWS. He loves working with cloud technologies, NoSQL / Big Data databases, automation toolsets, open source technologies, and Windows / Linux system administration. When he is not doing any of these, Sadequl loves to spend time with his young family.

