This course will provide you with a solid foundation for understanding Amazon Kinesis and help you get started building streaming solutions. We'll place a heavy emphasis on hands-on demos while breaking down the concepts and introducing you to the components that make up Amazon Kinesis.
- People working with Big Data
- Business intelligence professionals
- Demonstrate knowledge of Amazon Kinesis: the core aspects of the service (Streams, Firehose, Analytics), their common use cases, big data use cases, and how to access and integrate with the Kinesis service
- Demonstrate how to work with and create Kinesis streams
- Demonstrate how to work with Kinesis Firehose and how to use Firehose with Redshift
- Set up monitoring and analysis of the stream services
- Understand how to extract analysis from Kinesis
This Course Includes
- 45 minutes of high-definition video
- Live demonstrations on key course concepts
What You'll Learn
- What is Streaming Data: An overview of streaming data and its common uses.
- Setting Up Kinesis Agent: In this demo, we'll install the Amazon Kinesis Agent.
- Kinesis Streams: An overview of Kinesis Streams, what they do, and common use cases.
- Performing Basic Stream Operations: In this demo, we'll perform basic operations on an Amazon Kinesis stream from the command line.
- Firehose: In this lesson, we'll be discussing the fully managed solution, Amazon Kinesis Firehose.
- Firehose Delivery Stream: In this demo, we're going to set up an Amazon Kinesis Firehose stream.
- Testing Delivery Stream: In this lesson, we're going to do a quick follow up to the Firehose stream, and test the data delivery.
- Kinesis Analytics: In this lesson, we'll go over the analytics components of Kinesis.
- Kinesis Analytics Demo: In this demo, we're going to begin working with Amazon Kinesis Analytics.
- Kinesis Features Comparison: In this lesson, we'll compare some products within the Amazon Kinesis suite, as well as some other Amazon services.
- Course Conclusion: A wrap-up and review of the course.
28/05/2019 - Re-record of lectures to improve audio
Welcome to Working with Amazon Kinesis. I'm Richard Augenti and I'll be your instructor for this lesson. In this lesson we'll be discussing Amazon Kinesis Firehose. Amazon Kinesis Firehose is similar to Kinesis Streams: it handles the collection, processing, and delivery of streaming data. The main difference is that Kinesis Firehose is a fully managed offering from Amazon.
The major benefit of using Kinesis Firehose is that the administration of the streaming solution is completely automated. You get to focus on what you're good at while Amazon focuses on what they're good at: managing your streaming environment. This diagram portrays the high-level architecture of Amazon Kinesis Firehose.
There are two main methods of loading data into Firehose: the Firehose agent and the AWS SDK. The Firehose agent is a Java-based software application which collects and sends data to Firehose. The agent manages file rotation, checkpointing, and retry upon failure. It also logs CloudWatch metrics to better track your streaming process. Alternatively, you can use the AWS SDK, available for Java, .NET, Node.js, Python, and Ruby, to build an application which handles loading your data.
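As a sketch of the SDK loading path, here is what putting a record into Firehose looks like with boto3 (the Python SDK). The stream name and event payload are hypothetical placeholders, and the actual call requires AWS credentials and an existing delivery stream:

```python
import json


def build_firehose_record(event: dict) -> dict:
    """Serialize an event as a newline-delimited JSON Firehose record.

    Firehose concatenates records at the destination, so appending a
    newline keeps the resulting S3 objects line-delimited.
    """
    return {"Data": (json.dumps(event) + "\n").encode("utf-8")}


# Sending the record (requires AWS credentials and an existing stream):
#   import boto3
#   boto3.client("firehose").put_record(
#       DeliveryStreamName="example-delivery-stream",  # hypothetical name
#       Record=build_firehose_record({"event": "click", "user": "abc123"}),
#   )
```

For higher throughput, the SDK also offers `put_record_batch`, which accepts up to 500 records per call.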
Data transformation can be enabled, in which case Firehose buffers incoming data (up to three megabytes) and sends it to Lambda, where further processing takes place to convert the data. Each record returned to Firehose must contain a record ID, a result, and the transformed data; records with a failure result can be shifted to S3 for storage. Finally, the streaming data is delivered to one of the available destinations, which include S3, Redshift, the Elasticsearch service, or another Amazon Kinesis service. Data is stored for up to 24 hours in case the delivery destination is unavailable. Kinesis Firehose can be monitored through CloudWatch metrics, CloudWatch Logs, the Kinesis Agent, or through the API logging and history. Well, that wraps up our lesson. I'm Richard Augenti and I'll see you in the upcoming lesson.
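The lesson doesn't show the transformation code itself, but the record contract it describes can be sketched as a minimal Lambda handler in Python. Firehose delivers base64-encoded records and expects each returned record to echo the original record ID with a result of `Ok`, `Dropped`, or `ProcessingFailed`; the JSON field added here is a hypothetical transformation for illustration:

```python
import base64
import json


def lambda_handler(event, context):
    """Transform each Firehose record, honoring the recordId/result/data contract."""
    output = []
    for record in event["records"]:
        try:
            # Firehose base64-encodes each record's payload before invoking Lambda.
            payload = json.loads(base64.b64decode(record["data"]))
            payload["processed"] = True  # hypothetical transformation step
            transformed = json.dumps(payload) + "\n"
            output.append({
                "recordId": record["recordId"],  # must echo the original ID
                "result": "Ok",
                "data": base64.b64encode(transformed.encode("utf-8")).decode("utf-8"),
            })
        except Exception:
            # Records marked ProcessingFailed can be redirected to S3 for storage.
            output.append({
                "recordId": record["recordId"],
                "result": "ProcessingFailed",
                "data": record["data"],
            })
    return {"records": output}
```

Note that the handler returns the original, untransformed data for failed records so nothing is silently lost.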
Richard Augenti is a DevOps Engineer with 23 years of professional IT experience and 7 years of cloud experience with AWS and Azure. He has been engaged with projects of varying sizes for clients all across the globe, spanning most sectors. He enjoys finding the best and most efficient way to make things work, so working with automation, cloud technologies, and DevOps has been the perfect fit. When Richard is not engaged with work, he can be found presenting workshops and talks on cloud technologies and other tech topics at user conferences.