Kinesis Streams

The course is part of these learning paths:

  • Getting Started With Amazon Kinesis
  • DevOps Engineer – Professional Certification Preparation for AWS
  • Solutions Architect – Professional Certification Preparation for AWS
  • Certified Developer – Associate Certification Preparation for AWS
  • AWS Big Data – Specialty Certification Preparation for AWS

Contents

  • Course Introduction
  • Kinesis Analytics
  • Course Conclusion

Overview

Difficulty: Intermediate
Duration: 25m
Students: 1142

Description


This course provides a solid foundation for understanding Amazon Kinesis and helps you get started building streaming solutions. It places a heavy emphasis on hands-on demos, while also breaking down the concepts and introducing you to the components that make up Amazon Kinesis.

Intended audience:

  • People working with Big Data
  • Business intelligence professionals
  • DevOps engineers
  • Developers

Learning Objectives:

  • Demonstrate knowledge of Amazon Kinesis: the core aspects of the service (Streams, Firehose, Analytics), their common use cases, big data use cases, and how to access and integrate with the Kinesis service
  • Demonstrate how to work with and create Kinesis streams
  • Demonstrate how to work with Kinesis Firehose and how to use Firehose with Redshift
  • Set up monitoring and analysis of the stream services
  • Understand how to extract analytics from Kinesis

This Course Includes:

  • 45 minutes of high-definition video
  • Live demonstrations on key course concepts

What You'll Learn:

  • What is Streaming Data: An overview of streaming data and its common uses.
  • Setting Up Kinesis Agent: In this demo, we'll install the Amazon Kinesis Streams agent.
  • Kinesis Streams: An overview of Kinesis Streams, what they do, and common use cases.
  • Performing Basic Stream Operations: In this demo, we'll perform basic operations on an Amazon Kinesis stream from the command line.
  • Firehose: In this lesson we'll be discussing the fully managed solution, Amazon Kinesis Firehose.
  • Firehose Delivery Stream: In this demo we're going to set up an Amazon Kinesis Firehose stream.
  • Testing Delivery Stream: In this lesson we're going to do a quick follow up to the Firehose stream, and test the data delivery.
  • Kinesis Analytics: In this lesson we'll go over the analytics components of Kinesis.
  • Kinesis Analytics Demo: In this demo we're going to begin working with Amazon Kinesis Analytics.
  • Kinesis Features Comparison: In this lesson, we'll compare some products within the Amazon Kinesis suite, as well as some other Amazon services.
  • Course Conclusion: A wrap-up and review of the course.

Updates:

28/05/2019 - Lectures re-recorded to improve audio quality

Transcript

Welcome to Working with Amazon Kinesis. I'm Richard Augenti. I'll be your instructor for this lesson. In this lesson, we'll be working with Amazon Kinesis Streams. Amazon Kinesis Streams is a streaming solution that collects, processes, and delivers streaming data to applications and solutions, data warehouses, or database tables. It can collect data from thousands of sources and deliver data to multiple destinations through parallel processing. Kinesis Streams is able to capture terabytes of data per hour from hundreds of sources.

It provides the flexibility to build customized applications that ingest streaming data to suit your use case. It enables you to build applications that consume data for dashboards, real-time monitoring, and alerts. It also provides the ability to work with other Amazon services, including Amazon S3, Amazon Redshift, Amazon Elastic MapReduce, and DynamoDB.

Here's a diagram of the high-level architecture of the Amazon Kinesis Streams product. Producers push streaming data into the Kinesis Streams pipeline; that data is further processed and pushed out to other AWS services such as S3, DynamoDB, or Redshift, or, in some cases, the streamed data can be pushed along to another Kinesis stream, Kinesis Firehose, or the Kinesis Analytics service. A Kinesis stream is an ordered sequence of data records, with a data record being a unit of stored data in the Kinesis stream.

A data record is made up of a sequence number, a partition key, and data. Producers put records into an Amazon Kinesis stream, while consumers get records from it. Amazon Kinesis applications are consumers of Kinesis streams, and they normally run on EC2 instances. A shard is a uniquely identified group of data records in a Kinesis stream, and a stream is made up of one or more shards.
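The put/get flow described here can be sketched with boto3, the AWS SDK for Python. The stream name `demo-stream` and the sample payload are made-up illustrations; the API operations used (`put_record`, `describe_stream`, `get_shard_iterator`, `get_records`) are the actual Kinesis Streams calls, and running this requires AWS credentials and an existing stream.

```python
import json


def build_record(payload: dict, partition_key: str) -> dict:
    """Assemble the arguments for put_record: the data blob plus the
    partition key that determines which shard receives the record."""
    return {
        "Data": json.dumps(payload).encode("utf-8"),
        "PartitionKey": partition_key,
    }


def produce_and_consume(stream_name: str) -> list:
    """Put one record into the stream, then read records back from
    the first shard (requires AWS credentials)."""
    import boto3  # AWS SDK for Python

    kinesis = boto3.client("kinesis")

    # Producer side: Kinesis assigns the stored record a ShardId
    # and a SequenceNumber, returned in the response.
    kinesis.put_record(
        StreamName=stream_name,
        **build_record({"event": "click", "user": 42}, "user-42"),
    )

    # Consumer side: look up a shard, obtain a shard iterator,
    # then fetch records starting from the oldest available one.
    shard_id = kinesis.describe_stream(
        StreamName=stream_name
    )["StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]
    return kinesis.get_records(ShardIterator=iterator)["Records"]


if __name__ == "__main__":
    for record in produce_and_consume("demo-stream"):
        print(record["SequenceNumber"], record["Data"])
```

Note that the partition key travels with every record: producers choose it, and Kinesis uses it to pick the shard, which is why records sharing a key stay ordered relative to each other.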

A partition key is used to group data by shard within a Kinesis stream, while each data record has a sequence number that is unique within its shard. The Kinesis Client Library is compiled into your application to enable fault-tolerant consumption of data from the stream. Data records are accessible for a default of 24 hours from the time they are added to the stream.
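The partition-key-to-shard mapping can be made concrete: Kinesis takes the MD5 hash of the partition key as a 128-bit integer and routes the record to the shard whose hash-key range contains it. A minimal sketch of that routing rule, with a made-up two-shard layout in the same shape that `describe_stream` returns:

```python
import hashlib


def shard_for_key(partition_key: str, shards: list) -> str:
    """Return the shard whose HashKeyRange contains MD5(partition_key),
    mirroring how Kinesis maps partition keys onto shards."""
    hashed = int.from_bytes(
        hashlib.md5(partition_key.encode("utf-8")).digest(), "big"
    )
    for shard in shards:
        lo = int(shard["HashKeyRange"]["StartingHashKey"])
        hi = int(shard["HashKeyRange"]["EndingHashKey"])
        if lo <= hashed <= hi:
            return shard["ShardId"]
    raise ValueError("hash fell outside every shard range")


# Two shards splitting the 128-bit hash space in half -- an
# illustrative layout, not data from a real stream.
MID = 2 ** 127
shards = [
    {"ShardId": "shardId-000000000000",
     "HashKeyRange": {"StartingHashKey": "0",
                      "EndingHashKey": str(MID - 1)}},
    {"ShardId": "shardId-000000000001",
     "HashKeyRange": {"StartingHashKey": str(MID),
                      "EndingHashKey": str(2 ** 128 - 1)}},
]
```

Because the hash is deterministic, every record with the same partition key lands in the same shard, which is what preserves per-key ordering.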

The retention period is configurable in hourly increments from 24 to 168 hours. Kinesis Streams utilizes Amazon CloudWatch and AWS CloudTrail for monitoring and logging purposes. There's additional monitoring for other Kinesis components, including tools like the Kinesis Agent, which has its own logging mechanism. Well, that wraps up this lesson. I'm Richard Augenti. I'll see you in the upcoming lesson.
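Extending retention beyond the 24-hour default is a single API call. A hedged sketch with boto3, validating the 24-to-168-hour range described in this lesson before calling the real `increase_stream_retention_period` operation (the stream name is illustrative, and the call requires AWS credentials):

```python
def valid_retention_hours(hours: int) -> bool:
    """Per this lesson, stream retention is configurable in hourly
    increments between the 24-hour default and a 168-hour maximum."""
    return 24 <= hours <= 168


def extend_retention(stream_name: str, hours: int) -> None:
    """Raise a stream's retention period (requires AWS credentials)."""
    import boto3  # AWS SDK for Python

    if not valid_retention_hours(hours):
        raise ValueError("retention must be between 24 and 168 hours")
    kinesis = boto3.client("kinesis")
    kinesis.increase_stream_retention_period(
        StreamName=stream_name,
        RetentionPeriodHours=hours,
    )


if __name__ == "__main__":
    extend_retention("demo-stream", 72)  # keep records for three days
```

The companion operation, `decrease_stream_retention_period`, takes the same arguments and shrinks the window back down.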

About the Author

Students: 1285
Courses: 2
Learning paths: 1

Richard Augenti is a DevOps Engineer with 23 years of IT professional experience and 7 years of cloud experience with AWS and Azure. He has been engaged with projects of varying sizes for clients all across the globe, spanning most sectors. He enjoys finding the best and most efficient way to make things work, so working with automation, cloud technologies, and DevOps has been the perfect fit. When Richard is not engaged with work, he can also be found presenting workshops and talks at user conferences on cloud technologies and other techie topics.