Requesting a Snow Device
1h 30m

This course covers the core learning objectives needed to meet the requirements of the 'Designing Network & Data Transfer solutions in AWS - Level 2' skill.

Learning Objectives:

  • Understand the most appropriate AWS connectivity options to meet performance demands
  • Understand the appropriate features and services to enhance and optimize connectivity to AWS public services such as Amazon S3 or Amazon DynamoDB
  • Understand the appropriate AWS data transfer service for migration and/or ingestion
  • Apply an edge caching strategy to provide performance benefits for AWS solutions

So you might be wondering, ‘How does the Snow Family work as a service?’ Let’s take a look, at a very high level, at how you request a device and the general life cycle of the whole process. I’ll split this up into two sections: the first will look at how to order a Snowcone or Snowball, and then I’ll finish up by looking at how you order the Snowmobile.

So firstly, you will likely have a requirement to implement edge computing, or need to transfer a large amount of data from the edge, perhaps with the Snowball; or you might require the ability to collect, process, and move data to AWS from remote and extreme locations using the lightweight and portable Snowcone.  Once you have decided on your use case, Snowcone or Snowball, you are ready to create a job, either programmatically using the AWS CLI or via the Snow Family dashboard in the Management Console.  For the rest of this lecture, I will be referencing the AWS Management Console.

Using the Snow Family dashboard in the console, you can select one of three types of job you would like to raise:

Import into Amazon S3, Export from Amazon S3, and Local compute and storage only.

The first two options allow you to move and migrate data into and out of AWS, while the last option simply allows you to perform local compute and storage operations without needing to move any data to or from AWS.  Using this option also allows you to cluster between 5 and 10 Snowballs together to increase data durability and storage capacity.

Once you have chosen your use case, let’s say for example ‘Import into Amazon S3’, you will be prompted to supply a number of details: your shipping address; the type of snow device you require, Snowcone or Snowball; the pricing option, either on-demand or a commitment to a 1-year or 3-year period; and the S3 bucket you’d like to transfer data to. The buckets that you select will appear as directories on your snow device when it arrives, and any data stored in those directories will then be migrated back to the S3 bucket when the device is sent to AWS, or when it’s transferred using AWS DataSync if you’re using the Snowcone. You will also select the Amazon Machine Image to be used for the EC2 instance that is pre-loaded onto the snow device, and the encryption key, backed by KMS, that you want to encrypt the data with. A service-linked role can then be defined, which will give the snow device permissions to write to your S3 bucket and publish Simple Notification Service messages on your behalf detailing status updates of your job; you can then specify which SNS topic those messages will be published to. Finally, you will be asked if you want to download AWS OpsHub, which provides a graphical user interface to help you manage your Snow Family of devices.
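As mentioned earlier, the same job can also be created programmatically with the AWS CLI. The sketch below shows roughly how the console choices above map onto a `create-job` call; the bucket name, ARNs, and address ID are hypothetical placeholders, and actually running it requires valid AWS credentials and an address created beforehand (for example with `aws snowball create-address`):

```shell
# Create an 'Import into Amazon S3' job for a Snowcone (all values are placeholders).
aws snowball create-job \
  --job-type IMPORT \
  --snowball-type SNC1_HDD \
  --resources '{"S3Resources":[{"BucketArn":"arn:aws:s3:::example-import-bucket"}]}' \
  --address-id ADID00000000-0000-0000-0000-000000000000 \
  --kms-key-arn arn:aws:kms:us-east-1:111122223333:key/example-key-id \
  --role-arn arn:aws:iam::111122223333:role/ExampleSnowballImportRole \
  --notification '{"SnsTopicARN":"arn:aws:sns:us-east-1:111122223333:example-snow-updates","NotifyAll":true}' \
  --description "Example import job"
```

The `--role-arn` here corresponds to the role discussed above that lets the device write to your bucket and publish SNS messages, and `--notification` points at the SNS topic for job status updates.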

AWS will then reserve and prepare the relevant snow device for you and ship it out to the address defined during the job creation process.

When the device arrives, you can power it on and use the AWS OpsHub software to unlock it, using an encrypted manifest created and stored in the AWS Snow Family console and associated with your job.
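If you're working from the command line rather than OpsHub, the manifest and unlock code for the job can be retrieved with the AWS CLI and used with the Snowball Edge client to unlock the device; the job ID and device IP address below are hypothetical placeholders:

```shell
# Retrieve a pre-signed URL for the job's encrypted manifest file,
# and its 29-character unlock code (job ID is a placeholder).
aws snowball get-job-manifest --job-id JID123e4567-e89b-12d3-a456-426655440000
aws snowball get-job-unlock-code --job-id JID123e4567-e89b-12d3-a456-426655440000

# Unlock the device on your local network with the Snowball Edge client
# (AWS OpsHub performs this same step through its GUI).
snowballEdge unlock-device \
  --endpoint https://192.168.1.100 \
  --manifest-file ./JID123e4567_manifest.bin \
  --unlock-code 12345-abcde-12345-abcde-12345
```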

You must then configure the snow device as needed, connecting it to your local network using either a static IP address or one dynamically assigned via DHCP.

You can then use the device as required, storing any data or using its edge computing potential to process your edge workloads.

Once you have finished with the snow device, if it’s a Snowcone you can either transfer the data online using the AWS DataSync agent or return it to AWS to transfer the data offline.  If using the Snowball, your only option is to return the device to AWS to transfer the data; there is no online transfer option available.

To send the device back to AWS for offline data transfer, you must prepare the device for its return trip.  The snow devices use an E Ink digital return label, which automatically displays the correct return address.

When the device reaches AWS, the data will be transferred to the location specified during the job creation process and then deleted from the snow device.

So as you can see, it’s a fairly simple process from start to finish. If you want to order a Snowmobile, the procedure is slightly different.

It’s suggested that if you are looking to migrate 10 petabytes or more from a single location, then the AWS Snowmobile is your go-to solution. Let’s assume you needed to transfer 100 petabytes of data to S3: firstly, you must contact AWS Sales support to discuss your requirements.

AWS will then carry out an assessment of the requirements and destination to ensure there is physical access and sufficient connectivity from both a power and network perspective.

Following a successful assessment survey, a snowmobile will then be driven to your location.

AWS personnel will then configure and power on the Snowmobile, allowing you to connect your network to it and transfer your data across its high-speed connections.

Once the data transfer is complete, AWS personnel will drive the Snowmobile back to AWS to import and transfer the data into S3.

About the Author

Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data center and network infrastructure design, to cloud architecture and implementation.

To date, Stuart has created 150+ cloud-related courses reaching over 180,000 students, mostly within the AWS category and with a heavy focus on security and compliance.

Stuart is a member of the AWS Community Builders Program for his contributions towards AWS.

He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.

In January 2016 Stuart was awarded ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge share within cloud services to the community.

Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.