Everything You Ever Wanted to Know about Amazon Kinesis Firehose

Amazon Kinesis Firehose makes it easy to load streaming data into AWS. Here’s what you need to know.

In an earlier blog post, I introduced you to Amazon Kinesis, the real-time streaming data service from Amazon. Now we will discuss the equally important Amazon Kinesis Firehose service and how you can leverage it to easily load streaming data into AWS. We’ll cover an overview of the service, the key concepts you need to know, and how to get started today. To make your life easier, I’ve included screenshots to help describe the individual steps. Read on to learn more about Amazon Kinesis Firehose!

Introducing Amazon Kinesis Firehose

Amazon Kinesis Firehose is simple to use, and the process of analyzing massive volumes of streaming data requires only three easy steps:

  1. Capture and submit streaming data to Amazon Kinesis Firehose.
  2. Amazon Kinesis Firehose loads the streaming data into Amazon S3.
  3. Analyze the streaming data using the BI tools of your choice.

Amazon Kinesis Firehose is a managed service, so you won’t have to wrestle with tedious administrative tasks; it automatically scales according to your data throughput and needs. To minimize storage use and for security purposes, Amazon Kinesis Firehose also batches, compresses, and encrypts data before loading it. The service manages all the underlying infrastructure, storage, networking, and configuration required for capturing and loading data into Amazon S3 or Amazon Redshift. Apart from the elastic nature of its underlying infrastructure, Amazon Kinesis Firehose synchronously replicates data across three facilities in an AWS Region, providing high availability and durability for the data as it is transported to its destination.

The Key Concepts of Amazon Kinesis Firehose You Need to Know

You need to be familiar with a few terms before we look at how to use Firehose. Let’s take a closer look at each of them.

Delivery Stream: Users submit data to Amazon Kinesis Firehose by creating a delivery stream.
Records: A record is the data that a user submits to the delivery stream. Each record can be up to 1,000 KB.
Data Producers: In Amazon Kinesis parlance, producers are the entities that generate streaming data: a web server that emits log data, a Twitter stream that submits data about a particular trend, and so on. A producer can be anything that fits your requirements, such as a clickstream or a log shipper like Logstash.
Buffer Size and Buffer Interval: Amazon Kinesis Firehose buffers incoming streaming data to a certain size or for a certain period of time before delivering it to destinations. You can configure the buffer size (1 to 128 MB) or the buffer interval (60 to 900 seconds); whichever condition is satisfied first triggers data delivery to your S3 bucket. In circumstances where data delivery to the destination falls behind data writing to the delivery stream, Firehose raises the buffer size dynamically to catch up and make sure that all data is delivered to the destination. (A CLI sketch of these buffering settings follows this list.)
Amazon Kinesis Agents: The Amazon Kinesis Agent is a pre-built Java application for Linux-based servers (only Amazon Linux and RHEL at the time of writing) that monitors files such as log files and continuously collects and sends data to your delivery stream.
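
To make the buffering settings concrete, here is a minimal sketch of creating a delivery stream from the AWS CLI with explicit buffering hints. The stream name, bucket ARN, and IAM role ARN are placeholders you would replace with your own values:

aws firehose create-delivery-stream \
    --delivery-stream-name test-stream \
    --s3-destination-configuration 'RoleARN=arn:aws:iam::123456789012:role/firehose_delivery_role,BucketARN=arn:aws:s3:::my-firehose-bucket,Prefix=firehose/,BufferingHints={SizeInMBs=5,IntervalInSeconds=300},CompressionFormat=GZIP'

With these hints, Firehose delivers a batch to S3 as soon as it has buffered 5 MB of data or 300 seconds have elapsed, whichever comes first.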

Getting Started with Kinesis Firehose (Screenshots included)

Note: To use Kinesis Firehose, you should already have an AWS account.

Log in to the AWS Management Console and go to the Kinesis service.

[Screenshot: the Kinesis service in the AWS Management Console]

Go to Kinesis Firehose to create a Delivery Stream.

[Screenshot: the Kinesis Firehose console]

The next step is to create a delivery stream as follows. Choose either S3 or Redshift as your destination.

[Screenshot: choosing the delivery stream destination]

Provide a stream name, select a pre-existing S3 bucket (or create a new one in a region of your choice), and add a prefix that will act as a folder within that bucket.

[Screenshot: stream name, S3 bucket, and prefix settings]

The next step is to configure various parameters such as buffer size, buffer interval, compression, encryption, and IAM roles. We have configured these as shown below:

[Screenshot: buffer, compression, encryption, and IAM role configuration]

To create a new IAM role or update an existing one, click Firehose Delivery IAM Role. This takes you to a new page; here we accept the default policy and role name and click Allow.

[Screenshot: IAM role policy page]

You will be taken back to the previous configuration page, where you click Next.

The next page is the review page, where you get a chance to review your configuration.

[Screenshot: delivery stream review page]

If everything looks OK, click “Create Delivery Stream”.

The stream will be created, and you will have options to edit, delete, monitor, and send data to it.

[Screenshot: delivery stream list with available actions]
  • Your stream is ready, and you can send data to it using the aforementioned Amazon Kinesis Agent.
  • You can also use a Java program or the AWS CLI to submit data to Firehose (we do so below to keep things simple). In the real world, data comes from various sources at high velocity and volume.
  • AWS CLI for generating logs (a concrete usage example follows the command):
aws firehose put-record --delivery-stream-name <stream-name> --record Data="<log string>\n"
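
For example, assuming the hypothetical delivery stream test-stream created above, a single test record could be submitted like this:

aws firehose put-record --delivery-stream-name test-stream --record Data="my log entry\n"

To submit several records in one call, the CLI also offers put-record-batch:

aws firehose put-record-batch --delivery-stream-name test-stream --records Data="line one\n" Data="line two\n"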
  • Sample Java API to generate logs for the stream:

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient;
import com.amazonaws.services.kinesisfirehose.model.PutRecordRequest;
import com.amazonaws.services.kinesisfirehose.model.PutRecordResult;
import com.amazonaws.services.kinesisfirehose.model.Record;

public class KinesisFireHoseSampleApp {

    private static AmazonKinesisFirehoseClient firehoseClient;

    // Initialize the Firehose client; replace the placeholder strings
    // with your AWS access key and secret key.
    private static void init() throws Exception {
        AWSCredentials credentials = new BasicAWSCredentials("xxxxxxxxxxxxxx", "yyyyyyyyyyyyyyyyyyyy");
        firehoseClient = new AmazonKinesisFirehoseClient(credentials);
    }

    public static void main(String[] args) throws Exception {
        init();
        String data = "my log data" + "\n"; // add \n as a record separator

        // Wrap the log line in a Record and submit it to the delivery stream.
        Record record = new Record();
        record.setData(ByteBuffer.wrap(data.getBytes(StandardCharsets.UTF_8)));
        PutRecordRequest putRecordRequest = new PutRecordRequest()
                .withDeliveryStreamName("test-stream")
                .withRecord(record);
        PutRecordResult result = firehoseClient.putRecord(putRecordRequest);
        System.out.println("Record inserted with ID: " + result.getRecordId());
    }
}

You can see the metrics in the CloudWatch dashboard associated with the stream. Here are a few samples:

[Screenshot: CloudWatch metrics for the delivery stream]

The above example is a very basic one: the Java client sends a single log record each time the program is run. Please note that we need aws-java-sdk-1.10.43 and amazon-kinesis-client-1.6.1 in the project library to run the application. After submitting the requests, you can see the graphs plotted against the requested records. However, this is not the ideal scenario for Kinesis Firehose: the example only shows how a record is submitted to a Firehose stream, and you need a high-volume or high-velocity data stream to realize the benefits of Kinesis Firehose. One of the best ways to send your logs to Firehose is to install the Amazon Kinesis Agent.
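
If you do want to push higher volumes from your own code, the same SDK also offers a batch call. Below is a minimal sketch, assuming the firehoseClient initialized in the sample above and the same hypothetical test-stream, that submits up to 500 records in a single PutRecordBatch request:

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

import com.amazonaws.services.kinesisfirehose.model.PutRecordBatchRequest;
import com.amazonaws.services.kinesisfirehose.model.PutRecordBatchResult;
import com.amazonaws.services.kinesisfirehose.model.Record;

// A method you could add to KinesisFireHoseSampleApp; it reuses firehoseClient.
private static void sendBatch() {
    List<Record> records = new ArrayList<Record>();
    for (int i = 0; i < 500; i++) { // PutRecordBatch accepts up to 500 records per call
        String data = "log line " + i + "\n"; // \n as a record separator
        records.add(new Record().withData(
                ByteBuffer.wrap(data.getBytes(StandardCharsets.UTF_8))));
    }
    PutRecordBatchRequest batchRequest = new PutRecordBatchRequest()
            .withDeliveryStreamName("test-stream") // placeholder stream name
            .withRecords(records);
    PutRecordBatchResult batchResult = firehoseClient.putRecordBatch(batchRequest);
    // FailedPutCount tells you how many records were rejected and should be retried.
    System.out.println("Failed records: " + batchResult.getFailedPutCount());
}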

Amazon Kinesis Agent Setup Process 

Amazon Kinesis Agent is a stand-alone Java software application that offers an easy way to collect and ingest data into Firehose: it continuously monitors a set of files and sends new data records to your delivery stream. You can install the agent on Linux-based server environments such as web servers, front ends, log servers, and database servers, and configure it by specifying the log files to monitor and the delivery stream names. Afterwards, the agent continuously collects data from the log files and submits it to the delivery stream.

  • The supported Linux versions are Amazon Linux AMI version 2015.09 or later and Red Hat Enterprise Linux version 7 or later.
  • Run the following command to install the agent:
    sudo yum install -y aws-kinesis-agent
  • Alternatively, you can download and install the RPM directly on RHEL:
    sudo yum install -y https://s3.amazonaws.com/streaming-data-agent/aws-kinesis-agent-1.0-1.amzn1.noarch.rpm
  • Start the agent manually:
    sudo service aws-kinesis-agent start
  • Optionally configure the agent to start on system startup:
    sudo chkconfig aws-kinesis-agent on
  • The agent is now running as a system service in the background. It continuously monitors the location specified in the configuration file and emits the data into Firehose, logging agent activity in /var/log/aws-kinesis-agent/aws-kinesis-agent.log.
  • You can configure the agent with optional settings, which it loads from the file /etc/aws-kinesis/agent.json by default (a sample configuration is sketched after this list).
  • Any changes to the configuration file require stopping and restarting the agent, as shown in the following commands:
    sudo service aws-kinesis-agent stop
    sudo service aws-kinesis-agent start

or, equivalently: sudo service aws-kinesis-agent restart
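
For reference, here is a minimal sketch of what /etc/aws-kinesis/agent.json might look like. The endpoint region, file pattern, and stream name are assumptions you would adjust to your environment:

{
  "cloudwatch.emitMetrics": true,
  "firehose.endpoint": "firehose.us-east-1.amazonaws.com",
  "flows": [
    {
      "filePattern": "/var/log/httpd/access_log*",
      "deliveryStream": "test-stream"
    }
  ]
}

Each entry in flows maps a set of monitored files to a delivery stream, so a single agent can feed several streams at once.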

You can learn more about the Kinesis Agent in the AWS documentation.

Next Steps

Kinesis Firehose is an efficient, must-try service from AWS that handles both high data volumes and high velocities. With Firehose, gigabytes of streaming data can be processed with ease, making the data very simple to work with and analyze.

To get a solid foundation in Amazon analytics, check out our Analytics Fundamentals for AWS course. You’ll learn everything you need to know about the entire family of AWS analytics services, as well as develop the critical skills you need to bolster your career. Get started today with a free 7-day trial and start learning right away!

 


Written by

Chandan Patra

Cloud Computing and Big Data professional with 10 years of experience in pre-sales, architecture, design, build, and troubleshooting with best engineering practices. Specialties: Cloud Computing (AWS), DevOps (Chef), Hadoop Ecosystem, Storm & Kafka, ELK Stack, NoSQL, Java, Spring, Hibernate, Web Services

