Homogeneous Migration Planning
Difficulty: Intermediate
Duration: 39m
Students: 335
Ratings: 5/5
Description

Migrating an SAP landscape to Azure is a complicated task with lots of moving parts. Apart from the sheer volume of data, a migration can involve software upgrades and data conversions. Not only will a landscape need to work properly once migrated, but in most cases, the migration needs to happen quickly as SAP systems are usually the information backbone of an organization.

Once the target environment structure has been finalized, the next important phase is planning the migration so that system downtime is kept to a minimum. There are different migration strategies to suit different scenarios, and this course looks at which strategy works best for each one. We then look at ways to optimize a migration, along with the pros, cons, and issues to look out for.

Learning Objectives

  • Understand the main scenarios and considerations when migrating from SAP to Azure
  • Learn the two methods for getting your data into Azure
  • Understand the difference between homogeneous and heterogeneous migration
  • Understand how the database migration option available in SAP can be used to migrate your data to Azure
  • Learn about the SAP HANA Large Instances service from Azure

Intended Audience

  • Database administrators
  • Anyone looking to migrate data from SAP to Azure

Prerequisites

To get the most out of this course, you should have a basic knowledge of both Microsoft Azure and SAP.

Transcript

In terms of migration methodologies, let's start with the most basic and straightforward: a homogeneous migration where you replicate the same OS and DBMS in Azure as you have in your current environment. At its most elementary level, this is as simple as taking a backup of the database, copying the file to Azure, and then restoring it to the new database instance. Now there are a few things you can do to speed the process up and reduce downtime.

You can enable compression on your backups if you haven't already. More importantly, use incremental backups instead of shutting your whole system down for a full backup, which would make the system unavailable until the migration has completed. That is, restore a full backup taken as part of normal operations, and then have the system unavailable only while a much smaller incremental backup is taken, transferred, and restored. This strategy massively reduces the cutover period compared with a full backup-and-restore scenario.
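As a rough illustration of why the incremental approach shrinks the cutover window, here is a back-of-envelope sketch. All sizes and throughput figures are hypothetical assumptions for illustration, not measurements from any real system.

```python
# Back-of-envelope cutover estimate: full-backup cutover vs. incremental cutover.
# Every figure below is an assumed, illustrative value.

def transfer_hours(size_gb, backup_rate_gbph, link_rate_gbph, restore_rate_gbph):
    """Hours to back up, copy to Azure, and restore a backup of size_gb."""
    return (size_gb / backup_rate_gbph
            + size_gb / link_rate_gbph
            + size_gb / restore_rate_gbph)

FULL_GB = 2000   # assumed full database backup size
INCR_GB = 40     # assumed incremental backup size (only the changed data)
BACKUP, LINK, RESTORE = 200.0, 100.0, 250.0  # assumed GB/hour for each step

full_cutover = transfer_hours(FULL_GB, BACKUP, LINK, RESTORE)
incr_cutover = transfer_hours(INCR_GB, BACKUP, LINK, RESTORE)

print(f"full-backup cutover:  {full_cutover:.1f} h")
print(f"incremental cutover:  {incr_cutover:.1f} h")
```

With these assumed figures the downtime only has to cover the small incremental backup, so the cutover window drops from tens of hours to under an hour; the full backup happens earlier, while the system is still live.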

If you are moving from one SQL Server instance to another, you can stop the SQL Server service, detach the database, compress it, copy the file to Azure, decompress it, and attach it to the server instance running on the VM. Whether this shortcut is quicker depends on the size of the database.
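The compress step can be done with whatever tool you prefer; as a minimal sketch, the following uses Python's standard gzip module on a dummy, highly compressible file to show the compress-and-round-trip idea. The file name and contents are made up for illustration.

```python
import gzip
import os
import tempfile

# A dummy "database file" with repetitive, highly compressible content.
raw = b"SAPDATA" * 100_000

with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "db.mdf")      # hypothetical detached data file
    dst = os.path.join(tmp, "db.mdf.gz")   # compressed copy to transfer
    with open(src, "wb") as f:
        f.write(raw)

    # Compress before copying to Azure; decompress on the target VM.
    with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        f_out.write(f_in.read())

    ratio = os.path.getsize(dst) / os.path.getsize(src)
    print(f"compressed to {ratio:.1%} of original size")

    # Round-trip check: decompressed content matches the original file.
    with gzip.open(dst, "rb") as f:
        assert f.read() == raw
```

Real database files compress less dramatically than this artificial example, but the saving on transfer time is usually still worthwhile.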

If your organization has no tolerance for downtime, and your migration is homogeneous, you can employ database replication for near-instantaneous cutover. Most database systems have some kind of proprietary replication. For example, SQL Server has Always On availability groups. All database replication requires a reliable and performant network connection between the two database servers.

Once you have configured the necessary firewall ports on both servers and restored a full backup to the target Azure VM server, you can switch on the replication to synchronize the target database with the source database.
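Before switching replication on, it's worth confirming that the replication port really is reachable through the firewalls on both sides. Here is a minimal TCP reachability sketch; the hostname and SQL Server's default port 1433 are assumptions for illustration.

```python
import socket

def port_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check SQL Server's default port on a hypothetical Azure VM.
# print(port_reachable("myazurevm.example.com", 1433))
```

A simple connect test like this won't validate authentication or replication configuration, but it rules out the most common blocker: a firewall or network security group silently dropping the traffic.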

In a way, the replication method is not too dissimilar to the incremental backup method, except that the final incremental backup is replaced with replication. If your source and target database servers are not exactly the same version, check to make sure that replication is possible between your current version and the new database software running on the Azure VM.

Machine replication via Azure Migrate or Azure Site Recovery is best suited to application server migration, but in theory it could be used for database migration. Azure Site Recovery is a disaster recovery tool that replicates entire machines, and Microsoft says it should only be used for DR, but as you might guess, the two products share functionality in the area of lift and shift.

Both products can replicate physical machines as well as Hyper-V and VMware virtual machines, although Azure Migrate provides an agentless option. As an indication of their shared roots, when an agentless migration is performed it's all native Azure Migrate: Server Migration, but when an agent-based migration is carried out, Azure Migrate: Server Migration uses the Site Recovery replication engine.

There are a couple of reasons why Azure Site Recovery is not an ideal solution. Firstly, as a disaster recovery tool, it's designed to take everything: files, configuration settings, hostnames, the lot. Apart from not starting with a fresh, clean slate, there will be some remedial work necessary to adapt what are essentially your on-premises servers to the new environment.

Secondly, ASR just copies changed files in an unintelligent way. It doesn't know about databases, log files, and caches. The workaround is to shut down the database servers, wait for all data to be flushed to disk, and then wait until the next app-consistent snapshot has been taken. By default, snapshots are taken every 4 hours, but the snapshot frequency can be changed. As I said, it is possible to use Azure Site Recovery, but it is probably not the optimal migration solution.
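The shutdown-and-wait step is easy to reason about with a small sketch. The arithmetic below assumes snapshots recur at a fixed interval from a known first snapshot time; the interval matches the 4-hour default mentioned above, and the timestamps are made up for illustration.

```python
from datetime import datetime, timedelta

SNAPSHOT_INTERVAL = timedelta(hours=4)  # assumed app-consistent snapshot frequency

def next_snapshot_after(shutdown_time, first_snapshot):
    """First app-consistent snapshot at or after shutdown_time, assuming
    snapshots recur every SNAPSHOT_INTERVAL starting at first_snapshot."""
    elapsed = shutdown_time - first_snapshot
    intervals = -(-elapsed // SNAPSHOT_INTERVAL)  # ceiling division on timedeltas
    return first_snapshot + intervals * SNAPSHOT_INTERVAL

# Example: snapshots started at midnight; the databases are shut down at 09:30.
first = datetime(2023, 1, 1, 0, 0)
shutdown = datetime(2023, 1, 1, 9, 30)
print(next_snapshot_after(shutdown, first))  # the 12:00 snapshot, a 2.5 h wait
```

With the default schedule the worst-case wait after shutdown is nearly the full interval, which is another reason to shorten the snapshot frequency, or avoid ASR for the database tier altogether.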

On the other hand, Azure Migrate: Server Migration does provide a raft of functionality around current or source infrastructure assessment, although this may be of marginal use in an SAP migration, as you are limited to SAP-certified SKUs for the target machines. There is no charge for using the Migrate service for 180 days from the start of replication to a VM. The product continues to evolve, with new functionality added regularly. Azure Data Box is classified as an Azure Migrate service.

 

About the Author
Students: 19499
Courses: 65
Learning Paths: 12

Hallam is a software architect with over 20 years' experience across a wide range of industries. He began his software career as a Delphi/Interbase disciple but changed his allegiance to Microsoft with its deep and broad ecosystem. While Hallam has designed and crafted custom software utilizing web, mobile, and desktop technologies, good quality, reliable data is the key to a successful solution. The challenge of quickly turning data into useful information for digestion by humans and machines has led Hallam to specialize in database design and process automation. Showing customers how to leverage new technology to change and improve their business processes is one of the key drivers keeping Hallam coming back to the keyboard.