Practical aspects of S3

Contents

Storage on AWS
1. Introduction (5m 26s)
Elastic Block Store
2. Overview of EBS (7m 26s)
Simple Storage Service
Advanced services: Glacier and Storage Gateway
Overview

Difficulty: Beginner
Duration: 52m
Students: 1118
Description

Storage is a central part of any computing infrastructure. Amazon provides many cloud services to replace traditional, on-premises storage systems, ranging from short-term storage for running instances that perform computation on small batches of data, up to long-term archives kept on redundant disks or even tapes.

In this course, computer engineer and cloud expert Mohammad Ali Tabibi will give you an overview of AWS storage services such as EBS, S3, Glacier, and Storage Gateway, so you can better understand what they are for, how they are built, and how they can best be used.

Who should take this course

As this is a beginner course, no prerequisites are needed to understand the concepts it covers. Nevertheless, some knowledge of what AWS is and some experience with the Linux command-line interface may help you follow along.

If you want to test your knowledge of the basic topics covered in this course, we strongly suggest taking our AWS questions. Also, if you want to learn more about the other AWS services, please consider checking out our other AWS courses.

Transcript

To start using Amazon S3 you need an AWS account. If you don't already have one, you'll be prompted to create one when you sign up. You will not be charged for Amazon S3 until you use it.

To sign up for Amazon S3, go to the Storage and Content Delivery section of your AWS Management Console, click on S3, and then follow the on-screen instructions. If you want to create a bucket, click on Create Bucket.

In the Create a Bucket dialog box, enter a bucket name you like in the Bucket Name box. This name must be unique across all existing bucket names in Amazon S3. Note that after you've created a bucket, you cannot change its name, and the name is visible in the URL that points to the objects stored in the bucket, so choose an appropriate name. Now select a region from the Region box. It's good to know that objects stored in a region never leave that region unless you explicitly transfer them to another region. Now you can click on Create. When Amazon S3 successfully creates your bucket, the console displays your empty bucket in the Buckets panel.

Okay, now it's time to use your bucket by adding an object to it. An object can be any kind of file: a text file, a photo, a video, etc. When you add a file to Amazon S3, you have the option of including metadata with the file and setting permissions to control access to the file. To upload an object in the Amazon S3 console, select the bucket you want to upload an object to and click Upload. In the newly opened window, click Add Files. As you can see, a file selection dialog box opens. In this dialog box, select the file that you want to upload and then click Open. You can see your selected file in the upload window.
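The same bucket-creation and upload steps can also be done from the AWS CLI instead of the console. This is a minimal sketch, assuming the AWS CLI is installed and configured with your credentials; the bucket name example-bucket-12345 and file name photo.jpg are placeholders:

```shell
# Create a bucket in a chosen region (bucket names must be globally unique)
aws s3 mb s3://example-bucket-12345 --region us-east-1

# Upload a local file as an object in the new bucket
aws s3 cp photo.jpg s3://example-bucket-12345/
```

These commands require an AWS account and will incur normal S3 usage charges once the bucket holds data.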

So you should click on Start Upload. You can check the progress from within the Transfer panel. Once the upload is complete, the selected file is added to the bucket. Now that you've added an object to a bucket, you can open and view it in your browser, or you can download it to your local computer. To open or download an object in the Amazon S3 console, select the object or objects that you want from the Objects and Folders list, then right-click and choose Open or Download as appropriate. If you're downloading the object, specify where you want to save it; the procedure for saving the object will depend on the browser and operating system that you're using.

Note that by default your Amazon S3 buckets and objects are private. To make an object viewable via a URL, for example https://s3.amazonaws.com/bucket/object, you must make the object publicly readable. Otherwise you'll need to create a signed URL that includes a signature with authentication information. Now that you've added an object to a bucket and viewed it, you can move the object to a different bucket or folder. First, you need to create a folder by clicking on Create Folder in your Amazon S3 console, choosing a name for the new folder, and pressing Enter. Now you can move your object into this folder by right-clicking on the selected object and choosing Cut.
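The download, signed-URL, and move operations just described have CLI equivalents as well. A sketch under the same assumptions (configured CLI, placeholder bucket, object, and folder names):

```shell
# Download the object to the current directory
aws s3 cp s3://example-bucket-12345/photo.jpg .

# Generate a signed URL valid for one hour (3600 seconds)
aws s3 presign s3://example-bucket-12345/photo.jpg --expires-in 3600

# Move the object into a folder (a key prefix) within the bucket
aws s3 mv s3://example-bucket-12345/photo.jpg \
    s3://example-bucket-12345/my-folder/photo.jpg
```

Note that S3 folders are really just key prefixes; "moving" an object is a copy to the new key followed by a delete of the old one.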

Navigate to the bucket or folder where you want to move the object, right-click the folder or bucket, and then click Paste Into. If you no longer need to store the objects that you've uploaded and moved, you should delete them to prevent further charges. To delete an object in the Amazon S3 console, right-click on the object in the Objects and Folders panel and then click Delete.

When the confirmation message appears, click OK. Also, note that to delete a bucket, you must first delete all the objects within it. Right-click the bucket you want to delete and then click Delete. When a confirmation appears, click OK.

Data stored on Amazon S3 is private by default: only bucket and object owners have access to the Amazon S3 resources they create. Amazon S3 supports several mechanisms that give you the flexibility to control who can access your data, as well as how, when, and where they can access it. Amazon S3 provides four different access control mechanisms.
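The empty-before-delete rule above also applies on the command line. A sketch with the same placeholder bucket name:

```shell
# A bucket must be empty before it can be deleted
aws s3 rm s3://example-bucket-12345 --recursive   # delete all objects in it
aws s3 rb s3://example-bucket-12345               # then remove the bucket
```

The CLI also offers `aws s3 rb --force`, which combines both steps.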

Identity and Access Management policies. IAM enables organizations with multiple employees to create and manage multiple users under a single AWS account. With IAM policies, you can grant IAM users fine-grained control over your Amazon S3 buckets or objects. Access control lists.

With this mechanism, you can use ACLs to selectively grant certain permissions on individual objects. Bucket policies. Amazon S3 bucket policies can be used to grant or deny permissions across some or all of the objects within a single bucket. Query string authentication. With query string authentication, you can share Amazon S3 objects through URLs that are valid for a predefined amount of time.

You can securely upload or download your data to Amazon S3 via SSL-encrypted endpoints using the HTTPS protocol. Amazon S3 also provides multiple options for encryption of data at rest. If you want Amazon S3 to manage the encryption and decryption of data, you have two options. You can use Amazon S3 server-side encryption, SSE, if you prefer to have Amazon S3 manage encryption keys for you.
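As an illustration of the bucket-policy mechanism described above, a policy document like the following (the bucket name is a placeholder) makes every object in a bucket publicly readable:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-bucket-12345/*"
    }
  ]
}
```

The `/*` in the resource ARN applies the statement to all objects in the bucket rather than to the bucket itself.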

If you prefer to manage your own encryption keys, you can use Amazon S3 server-side encryption with customer-provided keys. With both options, Amazon S3 will automatically encrypt your data on write and decrypt it on retrieval. Alternatively, you can use a client-side encryption library, such as the Amazon S3 Encryption Client, to encrypt your data before uploading it to Amazon S3.

Amazon S3 also supports logging of requests made against your Amazon S3 resources. You can configure your Amazon S3 bucket to create access log records for the requests made against it. These server access logs capture all requests made against a bucket or the objects in it, and they can be used for auditing purposes.

Amazon S3 provides a highly durable storage infrastructure designed for mission-critical and primary data storage. To increase durability, Amazon S3 synchronously stores your data across multiple facilities before returning success. In addition, Amazon S3 calculates checksums on all network traffic to detect corruption of data packets when storing or retrieving data.
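You can see the checksum idea in action locally: for a simple (single-part, non-KMS-encrypted) upload, the ETag that Amazon S3 reports for an object is the MD5 digest of its contents, so you can verify integrity by computing the same digest yourself. A small sketch using a throwaway file:

```shell
# Write a small sample object to disk
printf 'hello world' > sample.txt

# Compute its MD5 digest; after a simple upload, this should match
# the ETag shown for the object in the S3 console
md5sum sample.txt
# → 5eb63bbbe01eeed093cb22bb8f5acdc3  sample.txt
```

For multipart uploads the ETag is computed differently (a digest of the per-part digests), so this comparison only holds for single-part uploads.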

Amazon S3 provides further protection via versioning. You can use versioning to preserve, retrieve, and restore every version of every object stored in your Amazon S3 bucket. This allows you to easily recover from both unintended user actions and application failures. Another storage option within Amazon S3, which enables customers to reduce their costs by storing non-critical, reproducible data at lower levels of redundancy, is Reduced Redundancy Storage, or RRS.
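Versioning is enabled per bucket. A sketch of turning it on and inspecting versions from the CLI, again with a placeholder bucket name and assuming a configured CLI:

```shell
# Enable versioning on the bucket
aws s3api put-bucket-versioning \
    --bucket example-bucket-12345 \
    --versioning-configuration Status=Enabled

# List all versions of the objects in the bucket
aws s3api list-object-versions --bucket example-bucket-12345
```

Once enabled, versioning cannot be fully turned off, only suspended, so new uploads of an existing key add versions rather than overwriting data.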

About the Author
Mohammad Ali Tabibi
Software Engineer

Computer Engineer and Cloud Expert