Migrating an SAP landscape to Azure is a complex undertaking with many moving parts. Apart from the sheer volume of data, a migration can involve software upgrades and data conversions. Not only must the landscape work properly once migrated, but in most cases, the migration also needs to happen quickly, as SAP systems are usually the information backbone of an organization.
Once the target environment structure has been finalized, the next important phase is planning the migration so that system downtime is kept to a minimum. There are different migration strategies to suit different scenarios, and this course looks at which strategy works best for each. We then look at ways to optimize a migration, along with the pros, cons, and issues to watch out for.
Learning Objectives
- Understand the main scenarios and considerations when migrating from SAP to Azure
- Learn the two methods for getting your data into Azure
- Understand the difference between homogeneous and heterogeneous migration
- Understand how the database migration option available in SAP can be used to migrate your data to Azure
- Learn about the SAP HANA Large Instances service from Azure
Intended Audience
- Database administrators
- Anyone looking to migrate data from SAP to Azure
Prerequisites
To get the most out of this course, you should have a basic knowledge of both Microsoft Azure and SAP.
The main difference between a homogeneous migration and a heterogeneous one is that in a heterogeneous migration the data must be extracted, or exported, from the source database system into files. Those files are transferred to Azure and imported into the target database. This is a more complicated process, as the export and import steps are considerably more involved than a backup and restore. The upside is that the exported data files are smaller than the source database, so the time to copy them across the network is reduced. Heterogeneous migrations come in two flavors: manual (classical) migration and the Database Migration Option (DMO) with System Move, which is automated and can include software upgrades. Let's start by looking at a classical migration plan.
Classical migrations can target any supported database, such as SAP HANA, SQL Server, or Oracle. Because this is a manual migration process involving many elements and many possible optimization options, SAP requires that the person executing the migration holds the SAP OS/DB migration certification.
If you are upgrading to SAP HANA as part of the migration and your current system is not Unicode, you will need to perform a Unicode conversion. If SAP HANA does not support your existing SAP application software, you will need to update or upgrade it before commencing the migration.
The data export phase has three steps.
Step 1. Generate the data definition language (DDL) statements for non-standard database objects. This step should be done just before the migration, scripting objects to SQL via the SMIGR_CREATE_DDL report.
Step 2. Use SAP Software Provisioning Manager (SWPM) to determine the table and index sizes. SWPM calls R3SZCHK to perform the size calculations and writes the output to DBSIZE.XML. The size calculations should be run prior to the migration.
Step 3. Export the data to files. This step starts the actual migration and marks the beginning of system downtime. It also offers significant optimization opportunities.
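To make the prerequisites concrete, here is a minimal pre-export sanity check. It is a sketch only: the staging directory, and the assumption that the generated DDL scripts land there as *.SQL files alongside DBSIZE.XML, are illustrative conventions rather than fixed SAP paths.

```python
# Minimal pre-export sanity check (illustrative assumptions only).
# DBSIZE.XML is produced by the SWPM/R3SZCHK size calculation; the *.SQL files
# stand in for the DDL scripts generated by the SMIGR_CREATE_DDL report.
from pathlib import Path
import sys

EXPORT_PREP_DIR = Path("/sapmnt/EXPORT_PREP")   # hypothetical staging location

def pre_export_checks(prep_dir: Path) -> list[str]:
    """Return a list of problems that should block the start of downtime."""
    problems = []

    # Step 1 output: DDL scripts for non-standard database objects.
    if not list(prep_dir.glob("*.SQL")):
        problems.append("No DDL (*.SQL) files found - rerun SMIGR_CREATE_DDL.")

    # Step 2 output: table/index size calculations written by SWPM.
    if not (prep_dir / "DBSIZE.XML").exists():
        problems.append("DBSIZE.XML missing - rerun the SWPM size calculation.")

    return problems

if __name__ == "__main__":
    issues = pre_export_checks(EXPORT_PREP_DIR)
    if issues:
        print("Not ready to start the export (downtime) window:")
        for issue in issues:
            print(f"  - {issue}")
        sys.exit(1)
    print("Pre-export artifacts present - export can begin.")
```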
As with a homogeneous migration, the extracted data files are transferred to Azure. As we've already talked about, you can do this with AzCopy, or you could set up an FTP server and transfer them directly to the VMs.
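For example, an AzCopy transfer might be wrapped in a small script like the sketch below. The export directory, storage account, container, and SAS token are placeholders, and it assumes AzCopy v10 is installed on the export server.

```python
# A minimal sketch of pushing the export dump files to Azure Blob Storage with
# AzCopy. Substitute your own storage account, container, and SAS token.
import subprocess

EXPORT_DIR = "/export/ABAP"   # assumed local export directory
DESTINATION = (
    "https://<storage-account>.blob.core.windows.net/<container>?<sas-token>"
)

def upload_export(source_dir: str, destination_url: str) -> None:
    """Copy the export directory to Blob Storage, preserving the folder tree."""
    subprocess.run(
        ["azcopy", "copy", source_dir, destination_url, "--recursive"],
        check=True,  # raise if AzCopy reports a failure
    )

if __name__ == "__main__":
    upload_export(EXPORT_DIR, DESTINATION)
```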
Assuming that you have installed the DBMS software, whatever that might be, and the SAP system, you can begin importing the data files. Once the data import has been completed, run system integrity checks with SAP Software Provisioning Manager and start the instance.
The primary determinant of system downtime, or the length of the cutover period, is the amount of data exported, transferred, and then imported. As I've said, migration optimizations can become complex as databases grow very large. Like most performance gains in software, speeding up a migration is achieved by splitting the task into multiple parallel processes.
Exporting and importing are carried out by the SAP R3Load process. An R3Load process executes a package that contains database objects, and you can run multiple R3Load processes in parallel on one CPU or vCPU. SAP databases tend to have most of their data located in a small number of very large tables. The complexity arises when splitting these very large tables over multiple R3Load processes. Because a migration is not complete until all data has been transferred, the largest tables determine when you're finished. This is why splitting and scheduling the load processes is so important.
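To illustrate why the largest packages dictate the overall runtime, here is a toy scheduling sketch. The table names are familiar SAP examples, but the sizes, the number of parallel processes, and the greedy longest-first assignment are purely illustrative; this is not an SAP tool or its actual algorithm.

```python
# Illustrative only: a greedy longest-first assignment of export packages to a
# fixed number of parallel R3Load processes. Package sizes (in GB) are invented;
# the point is that the largest package dominates the finish time until it is
# split into smaller pieces.
import heapq

packages = {
    "CDCLS": 900, "EDI40": 450, "BALDAT": 300, "SWWLOGHIST": 150,
    "COEP": 120, "BSIS": 100, "REST_SMALL_TABLES": 80,
}

def schedule(packages: dict[str, int], workers: int) -> int:
    """Assign packages to workers, biggest first; return the resulting makespan."""
    loads = [(0, w) for w in range(workers)]   # min-heap of (accumulated load, worker)
    heapq.heapify(loads)
    for _, size in sorted(packages.items(), key=lambda kv: -kv[1]):
        load, w = heapq.heappop(loads)
        heapq.heappush(loads, (load + size, w))
    return max(load for load, _ in loads)

print("Unsplit, 8 processes:", schedule(packages, 8))   # bounded below by CDCLS (900)

# Splitting the largest table into 6 roughly equal pieces roughly halves the makespan.
split = {f"CDCLS-{i}": 150 for i in range(6)} | {
    k: v for k, v in packages.items() if k != "CDCLS"
}
print("CDCLS split into 6, 8 processes:", schedule(split, 8))
```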
When dealing with very large databases, it is not uncommon to use multiple dedicated export servers running R3Load processes. A typical scenario would be one server dedicated to the largest one to four tables (the exact number depends on the data distribution within the database), two servers dedicated to other split tables, and one server to export all remaining tables. It is best to use physical machines as R3Load servers, as they perform better than virtual machines, particularly VMware VMs.
Having four export servers intensively hitting the database server may result in the database itself becoming a performance bottleneck. Apart from ensuring that all indexes are refreshed or rebuilt, another option would be to split the database over several DB servers. On the Azure or import side of the migration, these kinds of hardware or infrastructure limitations are less critical. It is easy to scale up your environment for the duration of the migration.
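As an example of scaling up just for the cutover, the sketch below resizes the import-side VM using the Azure SDK for Python. It assumes the azure-identity and azure-mgmt-compute packages; the subscription, resource group, VM name, and target size are placeholders, and a resize will typically restart the VM.

```python
# A minimal sketch of temporarily scaling up the import-side VM for the
# migration window. Assumes azure-identity and azure-mgmt-compute are installed
# and that the chosen size is available in your region and quota.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
VM_NAME = "<sap-db-vm>"
MIGRATION_SIZE = "Standard_E64s_v5"   # example larger size for the cutover period

compute = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Read the current VM definition, change only the size, and apply the update.
vm = compute.virtual_machines.get(RESOURCE_GROUP, VM_NAME)
vm.hardware_profile.vm_size = MIGRATION_SIZE
compute.virtual_machines.begin_create_or_update(RESOURCE_GROUP, VM_NAME, vm).result()
```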
Hallam is a software architect with over 20 years' experience across a wide range of industries. He began his software career as a Delphi/Interbase disciple but changed his allegiance to Microsoft with its deep and broad ecosystem. While Hallam has designed and crafted custom software utilizing web, mobile, and desktop technologies, he believes good quality, reliable data is the key to a successful solution. The challenge of quickly turning data into useful information for digestion by humans and machines has led Hallam to specialize in database design and process automation. Showing customers how to leverage new technology to change and improve their business processes is one of the key drivers keeping Hallam coming back to the keyboard.