This course covers the core learning objectives required to meet the 'Designing Network & Data Transfer solutions in AWS - Level 2' skill:
- Understand the most appropriate AWS connectivity options to meet performance demands
- Understand the appropriate features and services to enhance and optimize connectivity to AWS public services such as Amazon S3 or Amazon DynamoDB
- Understand the appropriate AWS data transfer service for migration and/or ingestion
- Apply an edge caching strategy to provide performance benefits for AWS solutions
Let me run through a few scenarios in which you might choose to use AWS DataSync in a live environment.
Firstly, one that will probably be familiar to a lot of people: archiving data into cold storage. You may have been running your own on-premises storage services for a long time, and much of the data residing in that storage architecture would be better suited to cloud storage, in particular the Amazon S3 Glacier storage classes. S3 Glacier Deep Archive, for example, is the lowest-cost storage class offered by AWS and is designed for rarely accessed archive data. By utilising AWS DataSync you can schedule a task to migrate this data from your own data centre into one of the Glacier storage classes. This ensures the durability of the data at a very low cost, without the burden of maintaining it yourself on-premises, and would likely allow you to decommission any old or legacy storage solution you were using to store this data.
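As a minimal sketch of this first scenario, the AWS CLI calls below create an NFS source location (via a deployed DataSync agent), an S3 destination location that writes objects directly into the Deep Archive storage class, and a task joining the two. All ARNs, hostnames, bucket and role names are placeholders for illustration only:

```shell
# Source: a hypothetical on-premises NFS share, reached via a DataSync agent.
SRC_LOCATION=$(aws datasync create-location-nfs \
  --server-hostname nfs.example.internal \
  --subdirectory /exports/archive \
  --on-prem-config AgentArns=arn:aws:datasync:eu-west-1:111122223333:agent/agent-0123456789abcdef0 \
  --query LocationArn --output text)

# Destination: an S3 bucket, storing transferred objects as Deep Archive.
DST_LOCATION=$(aws datasync create-location-s3 \
  --s3-bucket-arn arn:aws:s3:::example-archive-bucket \
  --s3-storage-class DEEP_ARCHIVE \
  --s3-config BucketAccessRoleArn=arn:aws:iam::111122223333:role/DataSyncS3Role \
  --query LocationArn --output text)

# Create the archive task and kick off a run.
TASK_ARN=$(aws datasync create-task \
  --source-location-arn "$SRC_LOCATION" \
  --destination-location-arn "$DST_LOCATION" \
  --name archive-to-deep-archive \
  --query TaskArn --output text)
aws datasync start-task-execution --task-arn "$TASK_ARN"
```

Setting the storage class on the S3 location means the data lands in Deep Archive immediately, with no separate lifecycle transition needed.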
The next use case might be the need to run steady, ongoing data transfers to Amazon S3, Amazon FSx for Windows File Server, or Amazon EFS on a regular basis. Your workload will determine which service you migrate to. You might consider this option for backup or DR purposes: for example, you might migrate data to Amazon EFS to maintain a standby file system in the cloud, should your primary on-site file system suffer an outage.
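For the DR example above, DataSync tasks can carry a schedule so the standby copy stays current without manual runs. The sketch below, again with placeholder ARNs and an assumed pre-existing source location, creates an EFS destination and a task that runs nightly at 01:00 UTC:

```shell
# Destination: a hypothetical Amazon EFS file system acting as the standby copy.
DST_LOCATION=$(aws datasync create-location-efs \
  --efs-filesystem-arn arn:aws:elasticfilesystem:eu-west-1:111122223333:file-system/fs-0123456789abcdef0 \
  --ec2-config SubnetArn=arn:aws:ec2:eu-west-1:111122223333:subnet/subnet-0123456789abcdef0,SecurityGroupArns=arn:aws:ec2:eu-west-1:111122223333:security-group/sg-0123456789abcdef0 \
  --query LocationArn --output text)

# Scheduled task: DataSync runs it automatically each night at 01:00 UTC.
# $SRC_LOCATION_ARN is assumed to be an existing on-premises source location.
aws datasync create-task \
  --source-location-arn "$SRC_LOCATION_ARN" \
  --destination-location-arn "$DST_LOCATION" \
  --name nightly-efs-standby-sync \
  --schedule 'ScheduleExpression=cron(0 1 * * ? *)'
```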
Finally, let’s assume you are running a hybrid cloud solution, utilising services both on-premises and in AWS. Occasionally you might need to harness the compute power and scale that many AWS services offer; this is especially true if you are working with machine learning or trying to process large data sets in a short period of time. You could migrate the data into AWS for this additional processing and analysis, and then transfer the results back into your data centre once your operations have completed.
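Because a DataSync task is simply a source location paired with a destination, the return leg of this hybrid scenario is just a second task with the locations swapped. A sketch, assuming location ARNs already created as in the earlier examples (the variable names here are placeholders):

```shell
# Return task: results written to S3 flow back to the on-premises NFS share.
# $S3_LOCATION_ARN and $NFS_LOCATION_ARN are assumed to reference existing
# DataSync locations for the S3 bucket and the on-premises share respectively.
TASK_ARN=$(aws datasync create-task \
  --source-location-arn "$S3_LOCATION_ARN" \
  --destination-location-arn "$NFS_LOCATION_ARN" \
  --name return-results-on-premises \
  --query TaskArn --output text)
aws datasync start-task-execution --task-arn "$TASK_ARN"
```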
Stuart has been working within the IT industry for two decades, covering a huge range of topic areas and technologies, from data centre and network infrastructure design to cloud architecture and implementation.
To date, Stuart has created 150+ courses relating to cloud computing, reaching over 180,000 students, mostly within the AWS category and with a heavy focus on security and compliance.
Stuart is a member of the AWS Community Builders Program for his contributions towards AWS.
He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.
In January 2016 Stuart was awarded ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge share within cloud services to the community.
Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.