This course provides a solid foundation for understanding Amazon Kinesis and helps you get started with building streaming solutions. The course places a heavy emphasis on hands-on demos while breaking down the concepts and introducing the components that make up Amazon Kinesis.
- People working with big data
- Business intelligence professionals
- Demonstrate knowledge of Amazon Kinesis: the core aspects of the service (Streams, Firehose, and Analytics), their common use cases, big data use cases, and how to access and integrate with the Kinesis service
- Demonstrate how to work with and create Kinesis streams
- Demonstrate how to work with Kinesis Firehose and how to use Firehose with Redshift
- Set up monitoring and analysis of the stream services
- Understand how to extract analytics from Kinesis
This Course Includes:
- 45 minutes of high-definition video
- Live demonstrations on key course concepts
What You'll Learn:
- What is Streaming Data: An overview of streaming data and its common uses.
- Setting Up Kinesis Agent: In this demo, we're working with installing the Amazon Kinesis Stream agent.
- Kinesis Streams: An overview of Kinesis Streams, what they do, and common use cases.
- Performing Basic Stream Operations: In this demo, we'll perform basic operations on an Amazon Kinesis stream from the command line.
- Firehose: In this lesson we'll be discussing the fully managed solution, Amazon Kinesis Firehose.
- Firehose Delivery Stream: In this demo we're going to set up an Amazon Kinesis Firehose stream.
- Testing Delivery Stream: In this lesson we're going to do a quick follow up to the Firehose stream, and test the data delivery.
- Kinesis Analytics: In this lesson we'll go over the analytics components of Kinesis.
- Kinesis Analytics Demo: In this demo we're going to begin working with Amazon Kinesis Analytics.
- Kinesis Features Comparison: In this lesson, we'll compare some products within the Amazon Kinesis suite, as well as some other Amazon services.
- Course Conclusion: A wrap-up and review of the course.
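As background for the stream-operations demos listed above, the following sketch (not part of the course material) illustrates how Kinesis routes each record to a shard: the partition key is MD5-hashed into a 128-bit hash key, and each shard owns a contiguous slice of that range. The function name and the even-split assumption are illustrative.

```python
import hashlib


def shard_for_partition_key(partition_key: str, num_shards: int) -> int:
    """Map a partition key to a shard index, mimicking Kinesis's
    documented MD5-based routing over the 128-bit hash key space.
    Assumes shards evenly split the range, as they do at creation."""
    # Kinesis hashes the partition key with MD5 to get a 128-bit integer.
    hash_key = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    # Each shard owns a contiguous slice of [0, 2**128).
    range_size = 2**128 // num_shards
    return min(hash_key // range_size, num_shards - 1)


# Records with the same partition key always land on the same shard,
# which is how Kinesis preserves per-key ordering.
shard = shard_for_partition_key("sensor-42", 4)
print(shard)
```

Because routing is deterministic, all records for `"sensor-42"` arrive at one shard in order, while different keys spread load across shards.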
28/05/2019 - Re-record of lectures to improve audio
About the Author
Richard Augenti is a DevOps Engineer with 23 years of professional IT experience, including 7 years of cloud experience with AWS and Azure. He has worked on projects of varying sizes with clients across the globe, spanning most sectors. He enjoys finding the best and most efficient way to make things work, so automation, cloud technologies, and DevOps have been the perfect fit. When not working, Richard can be found presenting workshops and talks on cloud technologies and other technical topics at user conferences.
- [Instructor] Let's start by understanding what we mean when we say streaming data. Streaming data is the continuous transfer of data at high velocity. This data is usually generated from hundreds to thousands of different data sources. The data is collected, processed and then consumed for use in applications, data analysis and data warehouses. The benefit of working with streaming data is the ability to analyze data in real time. The value in this may be to identify important information that is more valuable in real time than at a later point through traditional means, data batch processing for example. So this information may lead to generating an alert or providing some other important insight that is time sensitive. There are many great use cases for working with streaming data, for example fraud prevention and detection. Banking or credit card companies can pick up on suspicious activity by detecting inconsistent behaviors in how a bank account or credit card is being used. This may include purchases outside of our normal buying patterns or ATM withdrawals outside the state we live in. The internet of things is built upon streaming data, so it's heavily reliant on platforms such as Amazon Kinesis to consume and process device data for further analysis and other actions. Batch processing is the traditional method of collecting large quantities of data over a period of time to then be processed at a scheduled time, along with performing any deeper analysis. The difference with streaming data is that it is transferred, collected and processed in near real time. Streaming data processing requires two layers: a storage layer and a processing layer. There are many different platforms which will work with streaming data. These platforms include Apache Kafka, Apache Flume, Apache Spark Streaming, Apache Storm, Amazon Elastic MapReduce and Amazon Kinesis.
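The batch-versus-streaming distinction described above can be sketched in plain Python (this is an illustrative toy, not course material; the record shape and alert threshold are hypothetical): a streaming consumer inspects each record as it arrives and can raise a time-sensitive alert immediately, while a batch job collects everything first and analyzes later.

```python
from typing import Iterable, Iterator


def event_source() -> Iterator[dict]:
    """Stand-in for many producers emitting records continuously."""
    for i in range(5):
        yield {"id": i, "amount": 100 * i}


def process_stream(records: Iterable[dict], threshold: int = 250) -> list:
    """Streaming: react to each record the moment it arrives,
    e.g. flag a suspicious transaction for fraud detection."""
    alerts = []
    for record in records:
        if record["amount"] > threshold:
            alerts.append(f"alert: record {record['id']} amount {record['amount']}")
    return alerts


def process_batch(records: Iterable[dict]) -> int:
    """Batch: collect everything, then analyze at a scheduled time."""
    collected = list(records)
    return sum(r["amount"] for r in collected)


print(process_stream(event_source()))  # alerts for records 3 and 4
print(process_batch(event_source()))   # total of all amounts: 1000
```

The streaming path surfaces the two suspicious records while data is still flowing; the batch path only produces its aggregate after the whole collection window closes.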
The Amazon Kinesis suite includes Amazon Kinesis Streams, Amazon Kinesis Firehose and Amazon Kinesis Analytics. Amazon Kinesis Streams and Kinesis Firehose consume, process and deliver streaming data to applications, data warehouses, tables, and analytics platforms. Amazon Kinesis Analytics is a fully managed tool used for further analysis with standard SQL queries. That concludes this lecture. I'll see you in the next one.
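In Kinesis Analytics itself, continuous aggregations are written as SQL windowed queries over the stream; as a rough illustration only (the function name, event shape, and window size are hypothetical), here is the tumbling-window idea in plain Python: events are grouped into fixed, non-overlapping time windows and aggregated per window.

```python
from collections import defaultdict


def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, payload) events into fixed, non-overlapping
    (tumbling) windows and count events per window -- the kind of
    continuous aggregation Kinesis Analytics expresses in SQL."""
    counts = defaultdict(int)
    for ts, _payload in events:
        # Every window covers [window_start, window_start + window_seconds).
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)


events = [(5, "a"), (30, "b"), (65, "c"), (119, "d"), (120, "e")]
print(tumbling_window_counts(events))  # {0: 2, 60: 2, 120: 1}
```

A timestamp of 120 starts a fresh window rather than joining the previous one, which is what distinguishes tumbling windows from sliding ones.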