Azure SQL: Designing and Implementing a Migration Plan - Summary

Once you've decided to take advantage of cloud technology, resources, and scale, and have set up an Azure infrastructure, there is the small matter of transferring your existing data to the cloud. A migration is successful when the target production system is up and running correctly with minimal impact on the business's operations. Achieving this requires knowing what data issues you will face in the new environment and minimizing system downtime while the data is transferred. This course looks at strategies and tools to achieve an optimal migration.

Learning Objectives

  • Understand the underlying concepts of migrating relational databases to Azure
  • Explore the factors in determining the optimal migration scenario as well as those affecting the physical movement of data from on-premises to Azure
  • Learn the different tools and services to aid in discovery, assessment, and migration
  • Learn how to migrate an on-premises SQL Server DB to Azure SQL Managed Instance using the Data Migration Service
  • Use Data Migration Assistant to migrate on-premises SQL Server databases to Azure SQL DB
  • Migrate MySQL to Azure MySQL using MySQL Workbench
  • Migrate a MySQL database to Azure SQL Server using SQL Server Migration Assistant

Intended Audience

This course is ideal for anyone looking to migrate their existing on-premises database environment to an Azure SQL database solution.


To get the most out of this course, you should be familiar with general database and cloud concepts.

Successfully migrating an on-premises relational database to Azure involves three basic steps.

  1. Assessing your current database environment's requirements in light of the desired target DBMS.
  2. Designing an efficient and optimal migration strategy that minimizes overall system downtime.
  3. Running the migration process, ensuring that the target system is functional and all data has been transferred on completion.

The overriding consideration in migrating a database is whether you are changing the database engine. If the target system runs the same engine as the current on-premises system, whether that is SQL Server or a non-Microsoft database, you are looking at a rehosting scenario. Rehosting, or "lift and shift," is essentially replicating the on-premises servers on Azure virtual machines.

When the source system is not SQL Server and the target is one of the Azure SQL variants, either SQL Server on a VM, Azure SQL Managed Instance, or Azure SQL DB, migration is a re-platforming exercise. Re-platforming also applies to non-Microsoft Azure offerings like Azure Database for MySQL and Azure Database for PostgreSQL that don't run on a VM. In any case where you are changing the underlying database engine, you need to assess the current database's compatibility in terms of data types, functionality, and external software dependencies. While SQL Server on a VM and SQL Managed Instance are 100% compatible with recent SQL Server versions, Azure SQL DB is less so.
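A quick first check during such a compatibility assessment is to look at the compatibility level of each user database on the source instance. The following T-SQL sketch (assuming you can query the source SQL Server directly) lists them:

```sql
-- List each user database and its compatibility level on the source instance.
-- Older levels (100 and below) often indicate deprecated features worth
-- reviewing before re-platforming to Azure SQL DB or Managed Instance.
SELECT name, compatibility_level
FROM sys.databases
WHERE database_id > 4;  -- database_id 1-4 are the system databases
```

The assessment tools covered in the course automate checks like this, along with many deeper ones.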

There are several tools under the Azure Migrate umbrella that are useful in assessing your on-premises environment. Azure Migrate: Discovery and Assessment will evaluate on-premises physical and virtual servers, recommending Azure VMs based on system and usage data along with estimated costs.

Microsoft Data Migration Assistant is a tool you can download to assess on-premises SQL Server instances' suitability for migration to SQL Server on an Azure VM, Azure SQL Managed Instance, or Azure SQL DB. Data Migration Assistant reports on server feature compatibility and which database attributes and elements will need to change for a successful migration, based on the target database platform. The assessment is particularly useful when targeting Azure SQL DB, identifying deprecated data types, cross-database joins, and reliance on SQL Server functionality not available in Azure SQL DB.
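One of the checks DMA performs is flagging deprecated data types. You can run a similar spot check yourself; this hedged T-SQL sketch lists user-table columns still using the deprecated text, ntext, and image types:

```sql
-- Find user-table columns using the deprecated text, ntext, and image types,
-- which should be migrated to varchar(max), nvarchar(max), and varbinary(max).
SELECT  s.name AS schema_name,
        o.name AS table_name,
        c.name AS column_name,
        t.name AS data_type
FROM    sys.columns AS c
JOIN    sys.types   AS t ON t.user_type_id = c.user_type_id
JOIN    sys.objects AS o ON o.object_id    = c.object_id
JOIN    sys.schemas AS s ON s.schema_id    = o.schema_id
WHERE   t.name IN (N'text', N'ntext', N'image')
  AND   o.type = 'U';  -- user tables only
```

A manual query like this covers only one class of issue; DMA's full assessment also catches feature-parity and behavior changes that are harder to find by hand.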

Microsoft Database Experimentation Assistant helps you assess the target system using recordings of source system workloads, while the Data Access Migration Toolkit, a Visual Studio Code extension, can be used to discover and assess SQL statements embedded in application source code.

To be forewarned is to be forearmed, and knowing what data has to be converted or functionality upgraded is integral to determining your migration strategy. The strategy's primary goal is to minimize system downtime, that is, the cut-over period when neither source nor target system is functional. A common approach is setting up the database structure and functionality on the target and migrating static and slow-changing data while the source system remains operational. How you implement a staggered data transfer depends heavily on data usage patterns, that is, whether data is mainly appended to the database or existing data changes frequently. Knowing the usage pattern will determine how to isolate fast- and slow-changing data.
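One way to gauge which tables are fast- or slow-changing, assuming the source is SQL Server and has been running long enough for the figures to be representative, is to query the index usage statistics DMV, which accumulates read and write counts since the last service restart:

```sql
-- Rank tables in the current database by write activity since the last
-- SQL Server restart; tables with few writes are candidates for early
-- migration while the source system stays online.
SELECT  OBJECT_SCHEMA_NAME(ius.object_id) AS schema_name,
        OBJECT_NAME(ius.object_id)        AS table_name,
        SUM(ius.user_updates)             AS writes,
        SUM(ius.user_seeks + ius.user_scans + ius.user_lookups) AS reads
FROM    sys.dm_db_index_usage_stats AS ius
WHERE   ius.database_id = DB_ID()
GROUP BY ius.object_id
ORDER BY writes ASC;
```

Because the counters reset on restart, treat the results as a rough guide and sample them over a period that covers your typical business cycle.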

If no system downtime can be tolerated, then you will need to implement some kind of data replication model to synchronize source and target systems so cut over from the old to the new system can happen seamlessly.

The mechanics of transferring the data to Azure depend on the source and target database engines. Moving to a database hosted on a VM gives you the greatest flexibility, enabling you to use a wide variety of migration tools. Microsoft has three tools to aid in migration.

  1. Data Migration Assistant is a downloadable tool for migrating from an on-premises SQL Server to one of the Azure SQL variants. 
  2. Data Migration Service is an Azure portal-based service for migrating from on-premises to an Azure SQL type database, Azure Database for MySQL, or Azure Database for PostgreSQL. 
  3. SQL Server Migration Assistant, a tool run at the source, comes in five versions for migrating from Access, DB2, MySQL, Oracle, and SAP ASE to SQL Server or one of the Azure SQL variants.


Whether using Microsoft or third-party tools, you must configure your on-premises and Azure networks to enable the appropriate access.

About the Author

Hallam is a software architect with over 20 years' experience across a wide range of industries. He began his software career as a Delphi/Interbase disciple but changed his allegiance to Microsoft with its deep and broad ecosystem. While Hallam has designed and crafted custom software utilizing web, mobile, and desktop technologies, good quality, reliable data is the key to a successful solution. The challenge of quickly turning data into useful information for digestion by humans and machines has led Hallam to specialize in database design and process automation. Showing customers how to leverage new technology to change and improve their business processes is one of the key drivers keeping Hallam coming back to the keyboard.