Difficulty: Beginner
Duration: 1h 24m
Students: 34355
Ratings: 4.8/5
Description

One of the core building blocks of Infrastructure as a Service (IaaS) is storage, and AWS provides a wide range of storage services that allow you to architect the correct solution for your needs. Understanding what each of these services is, and what it has been designed and developed for, gives you the knowledge to implement best practices ensuring your data is stored, transmitted, and backed up in the most efficient and scalable way. This course will focus on each of the storage services provided by AWS and will explain what the service is, its key features, and when and why you might use the service within your own environment.

Learning Objectives

The objectives of this course are to provide an understanding of each of the AWS storage services, their key features, and when and why you might use them within your own environment.

Intended Audience

This course is designed as an introduction to the AWS storage services and methods of storing data.  As a result, this course is suitable for:

  • Those who are starting out their AWS journey to understand the various services that exist and their use case
  • Storage engineers responsible for maintaining and storing data within the enterprise
  • Security engineers who secure and safeguard data within AWS
  • Those who are looking to begin their certification journey with either the AWS Cloud Practitioner or one of the three Associate-level certifications

Prerequisites

This is an entry-level course on AWS storage services, so no prior knowledge of these services is required; however, a basic understanding of cloud computing and an awareness of AWS would be beneficial, but is not essential.

Feedback

If you have thoughts or suggestions for this course, please contact Cloud Academy at support@cloudacademy.com.


Transcript

Hello and welcome to this lecture focusing on the AWS Snowball service. Essentially, this service is used to securely transfer large amounts of data, and I'm talking petabyte scale here, in and out of AWS, either from your on-premises data center to Amazon S3 or from Amazon S3 back to your data center, using a physical appliance known as a Snowball.

The Snowball appliance comes as either a 50-terabyte or 80-terabyte storage device, depending on your region. Currently, the 50-terabyte version is only available within the US regions. The appliance is dust, water, and tamper resistant and can even withstand an 8.5 G jolt from within its own external shipping container, so it has been built to cope with a lot of stress conditions to ensure the durability of your data. The Snowball appliance has been designed to allow for high-speed data transfer thanks to a range of interfaces, allowing you to select the most appropriate connection for your needs. Onboard the Snowball appliance, the following 10-gigabit I/O interfaces are available: RJ45 using CAT6, SFP Copper, and SFP Optical.

By default, all data transferred to a Snowball appliance is automatically encrypted using 256-bit encryption keys generated by KMS, the Key Management Service. Whilst on the topic of security, it also features end-to-end tracking using an E Ink shipping label. This ensures that when the device leaves your premises, it is sent to the right AWS facility. The appliance can also be tracked using the AWS Simple Notification Service with text messages or via the AWS Management Console. From a compliance perspective, AWS Snowball is also HIPAA compliant, allowing you to transfer protected health information in and out of S3. When the transfer of data is complete, whether into S3 or into a customer's data center, and the appliance is sent back to AWS, it is then the responsibility of AWS to ensure that data held on the Snowball appliance is deleted and removed. To control this process, AWS conforms to standards and guidelines set by NIST, the National Institute of Standards and Technology, to ensure this is performed and controlled and that all traces of data are removed from the media.
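Since the encryption key used for a transfer job comes from KMS, you would typically create or select a customer managed key up front and reference its ARN when creating the job. Below is a minimal boto3 sketch of that step; the region, key description, and alias name are illustrative assumptions, not values taken from the lecture.

```python
import boto3

# Assumed region purely for illustration
kms = boto3.client("kms", region_name="us-east-1")

# Create a symmetric customer managed key to encrypt data written to the Snowball
response = kms.create_key(
    Description="Key used to encrypt data transferred via AWS Snowball",
)
key_arn = response["KeyMetadata"]["Arn"]

# Optionally give it a friendly alias for easier reference later
kms.create_alias(AliasName="alias/snowball-transfer-key", TargetKeyId=key_arn)

print("KMS key ARN to reference in the Snowball job:", key_arn)
```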

When sending or retrieving data, Snowball appliances can be aggregated together. For example, if you needed to retrieve 400 terabytes of data from S3, then your data would be sent on five 80-terabyte Snowball appliances. So, from a disaster recovery perspective, when might you need to use AWS Snowball? Well, it all depends on how much data you need to get back from S3 to your own corporate data center, and how quickly you can do that. On the other hand, how much data do you need to get into S3? This will depend on the connection you have to AWS from your data center. You may have Direct Connect connections, a VPN, or just an internet connection. And if you need to restore multiple petabytes of data, this could take weeks or even months to complete. As a general rule, if your data retrieval will take longer than a week using your existing connection method, then you should consider using AWS Snowball. Your global location will affect specific shipping times, and so more information on this can be found using the link on the screen.
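As an illustration of that rule of thumb, here is a small Python sketch (not part of the course) that estimates how long a transfer would take over a given network link and flags when Snowball is worth considering. The data size, link speed, and utilisation figures are assumptions chosen for the example.

```python
# Rough estimate of network transfer time vs. the "longer than a week" rule of thumb.
# All figures below are illustrative assumptions, not values from the course.

TERABYTE_BITS = 8 * 10**12  # 1 TB expressed in bits (decimal units)

def transfer_days(data_tb: float, link_gbps: float, utilisation: float = 0.8) -> float:
    """Days needed to move `data_tb` terabytes over a `link_gbps` link
    at the given average utilisation."""
    seconds = (data_tb * TERABYTE_BITS) / (link_gbps * 10**9 * utilisation)
    return seconds / 86_400

data_tb = 400      # e.g. the 400 TB restore mentioned in the lecture
link_gbps = 1      # assumed 1 Gbps link to AWS

days = transfer_days(data_tb, link_gbps)
print(f"Estimated transfer time: {days:.1f} days")
if days > 7:
    print("Longer than a week - consider AWS Snowball instead.")
```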

If you did decide to use AWS Snowball to retrieve your data in the event of a disaster, the process is fairly simple. At a high level, this is how it looks. Firstly, you need to create an export job from within the AWS Management Console. Within this job you can dictate shipping details, the S3 bucket, the data to be exported, security mechanisms such as the KMS key for data encryption, and also notifications. You will then receive delivery of your Snowball appliance. You can now connect the appliance to your local corporate network. Firstly, use the ports to connect the appliance to your network whilst it's powered off. Next, power on the device, and the E Ink display will let you know that it's ready. You can then configure the network settings of the device, such as the IP address, to enable communications. From here you are now ready to start transferring the data. To do this, you must first gain specific access credentials via a manifest file through the Management Console, which has to be downloaded. You must then install the Snowball client software, and you can then begin transferring data using the client software once authenticated with the manifest file. When the data transfer is complete, you can disconnect the Snowball appliance. The appliance must then be returned to AWS using specified shipping carriers. It's important to note that all Snowball appliances are the property of AWS, and the E Ink label will display the return address.
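To make the first step more concrete, here is a minimal boto3 sketch of what creating an export job might look like if you scripted it rather than using the console. The bucket name, address ID, role ARN, KMS key ARN, and SNS topic below are placeholder assumptions and would need to come from your own account.

```python
import boto3

snowball = boto3.client("snowball", region_name="us-east-1")  # assumed region

# Placeholder identifiers - replace with values from your own account.
response = snowball.create_job(
    JobType="EXPORT",                      # export data from S3 onto the appliance
    Resources={
        "S3Resources": [
            {"BucketArn": "arn:aws:s3:::example-restore-bucket"}
        ]
    },
    Description="Disaster recovery export of example-restore-bucket",
    AddressId="ADID-EXAMPLE",              # shipping address created beforehand
    KmsKeyARN="arn:aws:kms:us-east-1:123456789012:key/EXAMPLE",
    RoleARN="arn:aws:iam::123456789012:role/snowball-export-role",
    SnowballCapacityPreference="T80",      # 80 TB appliance
    ShippingOption="SECOND_DAY",
    Notification={
        "SnsTopicARN": "arn:aws:sns:us-east-1:123456789012:snowball-updates",
        "NotifyAll": True,
    },
)
print("Created Snowball export job:", response["JobId"])
```

Once the appliance arrives, the data transfer itself is performed with the downloaded manifest file and the Snowball client software, as described above.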

Much like other AWS pricing for storage, any data transferred into AWS does not incur a data transfer charge. However, you are charged the normal Amazon S3 data charges, as discussed in a previous lecture. For each data transfer job, there is a charge in addition to the shipping costs associated with the job. As I mentioned previously, there are two sizes of Snowball. For the 50-terabyte Snowball there is a $200 charge, and for the 80-terabyte it's $250, unless it's in the Singapore region, in which case it's $320. You are allowed the Snowball for 10 days in total. Any delays requiring additional days incur further charges of between $15 and $20 per day, depending on the region. The data transfer charges out of Amazon S3 to different regions are priced as follows. And the shipping will vary depending on your chosen carrier. For further information on this, please visit the following link.
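Pulling those figures together, the following sketch shows how you might estimate the fixed per-job charge before shipping and S3 data transfer costs are added. The per-extra-day rate is assumed at the upper $20 figure from the range quoted above.

```python
# Estimate the fixed Snowball job charge using the figures quoted in the lecture.
# Shipping and S3 data transfer charges are excluded; the extra-day rate is
# assumed at the upper end of the $15-$20 range.

JOB_FEES = {"50TB": 200, "80TB": 250, "80TB_SINGAPORE": 320}
INCLUDED_DAYS = 10
EXTRA_DAY_RATE = 20  # assumed upper bound of the quoted range

def snowball_job_cost(size: str, days_on_site: int) -> int:
    extra_days = max(0, days_on_site - INCLUDED_DAYS)
    return JOB_FEES[size] + extra_days * EXTRA_DAY_RATE

# e.g. an 80 TB appliance kept for 14 days: $250 + 4 * $20 = $330
print(snowball_job_cost("80TB", 14))
```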


About the Author
Students: 236970
Labs: 1
Courses: 232
Learning Paths: 187

Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data center and network infrastructure design, to cloud architecture and implementation.

To date, Stuart has created 150+ courses relating to cloud computing, reaching over 180,000 students, mostly within the AWS category and with a heavy focus on security and compliance.

Stuart is a member of the AWS Community Builders Program for his contributions towards AWS.

He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.

In January 2016, Stuart was awarded the 'Expert of the Year Award 2015' from Experts Exchange for his knowledge sharing within cloud services to the community.

Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.