Everything You Ever Wanted to Know about Amazon Kinesis Firehose

Amazon Kinesis Firehose makes it easy to load streaming data into AWS. Here’s what you need to know.

In an earlier blog post, I introduced you to Amazon Kinesis, the real-time streaming data service from Amazon. Now we will discuss the equally important Amazon Kinesis Firehose service and how you can leverage it to easily load streaming data into AWS. We'll give an overview of the service, cover the key concepts you need to know, and show how to get started today. To make your life easier, I've included screenshots to illustrate the individual steps. Read on to learn more about Amazon Kinesis Firehose!

Introducing Amazon Kinesis Firehose

Amazon Kinesis Firehose is simple to use, and the process of analyzing massive volumes of streaming data requires only five easy steps:

  1. Capture and submit streaming data to Amazon Kinesis Firehose.
  2. Amazon Kinesis Firehose loads streaming data to Amazon S3.
  3. Analyze streaming data using any BI tools of your choice.
  4. ???
  5. Profit!

Amazon Kinesis Firehose is a managed service, so you won't have to wrestle with tedious administrative tasks. It automatically scales according to your data throughput and needs. To minimize storage use and for security purposes, Amazon Kinesis Firehose also batches, compresses, and encrypts data before loading it. The service manages all the underlying infrastructure, storage, networking, and configuration required for capturing and loading data into Amazon S3 or Amazon Redshift. Apart from the elastic nature of its underlying infrastructure, Amazon Kinesis Firehose synchronously replicates data across three facilities in an AWS Region, providing high availability and durability for the data as it is transported to various destinations.

The Key Concepts of Amazon Kinesis Firehose You Need to Know

You need to be familiar with a few terms before learning how to use Firehose. Let's take a closer look at each of them.

Delivery Stream: Users submit data to Amazon Kinesis Firehose by creating a Delivery Stream.
Records: A Record is the data that a user submits to the delivery stream. Each record can be up to 1,000 KB.
Data Producers: In Amazon Kinesis parlance, producers are the entities that generate streaming data. A producer can be a web server that emits log data, a Twitter stream that submits data about a particular trend, and so on. A producer can be anything that fits your requirements, such as a clickstream or a log shipper like Logstash.
Buffer Size and Buffer Interval: Amazon Kinesis Firehose buffers incoming streaming data to a certain size or for a certain period of time before delivering it to destinations. Buffer Size is measured in MBs and Buffer Interval in seconds. You can configure a buffer size (1 to 128 MB) and a buffer interval (60 to 900 seconds), and whichever condition is satisfied first triggers data delivery to your S3 bucket (see the sketch after this list). In circumstances where data delivery to the destination falls behind data writing to the delivery stream, Firehose raises the buffer size dynamically to catch up and make sure that all data is delivered to the destination.
Amazon Kinesis Agents: The Amazon Kinesis Agent is a pre-built Java application for Linux-based servers (only Amazon Linux and RHEL at the time of writing this blog) that monitors files, such as log files, and continuously collects and sends data to your delivery stream.
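To make the buffering settings concrete, here is a minimal sketch that creates a delivery stream with explicit buffering hints using the AWS SDK for Java (v1, the same SDK used later in this post). The stream name, bucket ARN, and role ARN below are placeholder assumptions, not values from this walkthrough:

import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient;
import com.amazonaws.services.kinesisfirehose.model.BufferingHints;
import com.amazonaws.services.kinesisfirehose.model.CompressionFormat;
import com.amazonaws.services.kinesisfirehose.model.CreateDeliveryStreamRequest;
import com.amazonaws.services.kinesisfirehose.model.S3DestinationConfiguration;

public class CreateDeliveryStreamSample {
    public static void main(String[] args) {
        AmazonKinesisFirehoseClient firehoseClient = new AmazonKinesisFirehoseClient(
                new BasicAWSCredentials("xxxxxxxxxxxxxx", "yyyyyyyyyyyyyyyyyyyy"));

        // Deliver when 5 MB accumulate or 300 seconds elapse, whichever comes first
        BufferingHints hints = new BufferingHints()
                .withSizeInMBs(5)
                .withIntervalInSeconds(300);

        S3DestinationConfiguration s3Config = new S3DestinationConfiguration()
                .withBucketARN("arn:aws:s3:::my-firehose-bucket")                     // placeholder
                .withRoleARN("arn:aws:iam::123456789012:role/firehose_delivery_role") // placeholder
                .withPrefix("firehose/")
                .withCompressionFormat(CompressionFormat.GZIP)
                .withBufferingHints(hints);

        firehoseClient.createDeliveryStream(new CreateDeliveryStreamRequest()
                .withDeliveryStreamName("test-stream")
                .withS3DestinationConfiguration(s3Config));
    }
}

The console walkthrough below achieves the same result without writing any code.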

Getting Started with Kinesis Firehose (Screenshots included)

Note: To use Kinesis Firehose, you should already have an AWS account.

Log in to the AWS Management Console and go to the Kinesis service.

[Screenshot: the Amazon Kinesis service in the AWS console]

Go to Kinesis Firehose to create a Delivery Stream.

[Screenshot: the Kinesis Firehose option in the Kinesis console]

The next step is to create a delivery stream. Choose either Amazon S3 or Amazon Redshift as your destination.

[Screenshot: creating a delivery stream and choosing a destination]

Provide a stream name, select a pre-existing S3 bucket (or create a new one in a region of your choice), and add a prefix, which becomes a folder within that bucket.

[Screenshot: stream name, S3 bucket, and prefix configuration]

The next step is to configure parameters such as buffer size, buffer interval, compression, encryption, and the IAM role. We have configured these as below:

[Screenshot: buffer, compression, encryption, and IAM role configuration]

To create a new IAM role or update an existing one, click Firehose Delivery IAM Role. This takes you to a new page; here we accepted the default policy and role name and clicked Allow.

You will then be taken back to the Configuration page, where you click Next.

The next page is the Review page, where you get a chance to review your configuration.

[Screenshot: the review page]

If everything looks OK, click Create Delivery Stream.

The stream will be created. You then have options to Edit, Delete, Monitor, and Send Data to the stream.

[Screenshot: the delivery stream list with Edit, Delete, Monitor, and Send Data options]
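If you prefer to confirm the stream programmatically rather than in the console, a quick status check with the Java SDK looks like the following sketch. The stream name is assumed; the status moves from CREATING to ACTIVE once the stream is ready:

import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient;
import com.amazonaws.services.kinesisfirehose.model.DescribeDeliveryStreamRequest;

public class DescribeDeliveryStreamSample {
    public static void main(String[] args) {
        AmazonKinesisFirehoseClient firehoseClient = new AmazonKinesisFirehoseClient(
                new BasicAWSCredentials("xxxxxxxxxxxxxx", "yyyyyyyyyyyyyyyyyyyy"));

        // The status is CREATING while the stream is being set up, then ACTIVE
        String status = firehoseClient.describeDeliveryStream(
                        new DescribeDeliveryStreamRequest().withDeliveryStreamName("test-stream"))
                .getDeliveryStreamDescription()
                .getDeliveryStreamStatus();
        System.out.println("Delivery stream status: " + status);
    }
}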
  • Your stream is ready, and you can send data to it using the aforementioned Amazon Kinesis Agent (setup is covered below).
  • Alternatively, you can use a Java program or the AWS CLI to submit data to Firehose, which is what we do here to keep things simple. In the real world, data comes from various sources at high velocity and volume.
  • AWS CLI for generating logs:
aws firehose put-record --delivery-stream-name <stream-name> --record Data="<log string>\n"
  • Sample Java program to send log records to the stream:
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient;
import com.amazonaws.services.kinesisfirehose.model.PutRecordRequest;
import com.amazonaws.services.kinesisfirehose.model.PutRecordResult;
import com.amazonaws.services.kinesisfirehose.model.Record;

public class KinesisFireHoseSampleApp {

    private static AmazonKinesisFirehoseClient firehoseClient;

    private static void init() throws Exception {
        // Replace the placeholders with your AWS access key and secret key
        AWSCredentials credentials = new BasicAWSCredentials("xxxxxxxxxxxxxx", "yyyyyyyyyyyyyyyyyyyy");
        firehoseClient = new AmazonKinesisFirehoseClient(credentials);
    }

    public static void main(String[] args) throws Exception {
        init();
        String data = "my log data" + "\n"; // add \n as a record separator

        // Wrap the payload in a Record and point the request at the delivery stream
        Record record = new Record();
        record.setData(ByteBuffer.wrap(data.getBytes(StandardCharsets.UTF_8)));
        PutRecordRequest putRecordRequest = new PutRecordRequest()
                .withDeliveryStreamName("test-stream")
                .withRecord(record);

        PutRecordResult result = firehoseClient.putRecord(putRecordRequest);
        System.out.println("Record inserted with ID: " + result.getRecordId());
    }
}

You can see the metrics associated with the stream in the CloudWatch dashboard. Here are a few samples:

[Screenshot: CloudWatch metrics for the delivery stream]

The above example is a very basic one: the Java client sends a single log record each time the program is run. Please note that you need aws-java-sdk-1.10.43 and amazon-kinesis-client-1.6.1 in the project library to run the application. After submitting requests, you can see graphs plotted against the submitted records. However, one record at a time is not the ideal scenario for Kinesis Firehose; you need a high-volume or high-velocity data stream to realize its benefits.
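If your data is produced programmatically, the PutRecordBatch API lets you send multiple records per request. Here is a minimal sketch along the lines of the sample above; the stream name and credentials are again placeholders:

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient;
import com.amazonaws.services.kinesisfirehose.model.PutRecordBatchRequest;
import com.amazonaws.services.kinesisfirehose.model.PutRecordBatchResult;
import com.amazonaws.services.kinesisfirehose.model.Record;

public class KinesisFireHoseBatchSample {
    public static void main(String[] args) {
        AmazonKinesisFirehoseClient firehoseClient = new AmazonKinesisFirehoseClient(
                new BasicAWSCredentials("xxxxxxxxxxxxxx", "yyyyyyyyyyyyyyyyyyyy"));

        // Build a batch of records; PutRecordBatch accepts up to 500 records per call
        List<Record> records = new ArrayList<Record>();
        for (int i = 0; i < 10; i++) {
            String data = "my log data " + i + "\n"; // \n as a record separator
            records.add(new Record().withData(
                    ByteBuffer.wrap(data.getBytes(StandardCharsets.UTF_8))));
        }

        PutRecordBatchResult result = firehoseClient.putRecordBatch(
                new PutRecordBatchRequest()
                        .withDeliveryStreamName("test-stream")
                        .withRecords(records));

        // A non-zero FailedPutCount means some records were rejected and should be retried
        System.out.println("Failed records: " + result.getFailedPutCount());
    }
}

For log files, one of the best ways to send data to Firehose is to install the Amazon Kinesis Agent, described next.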

Amazon Kinesis Agent Setup Process 

Amazon Kinesis Agent is a stand-alone Java application that offers an easy way to collect and ingest data into Firehose: it continuously monitors a set of files and sends new data records to your delivery stream. You can install the agent on Linux-based server environments such as web servers, front ends, log servers, and database servers, and configure it by specifying the log files to monitor and the delivery stream names. Afterwards, the agent continuously collects data from the log files and submits it to the delivery stream.

  • The supported Linux versions are Amazon Linux AMI version 2015.09 or later and Red Hat Enterprise Linux version 7 or later.
  • Run the following command to install the agent:
    sudo yum install -y aws-kinesis-agent
  • Alternatively, you can download and install it on RHEL:
    sudo yum install -y https://s3.amazonaws.com/streaming-data-agent/aws-kinesis-agent-1.0-1.amzn1.noarch.rpm
  • Start the agent manually:
    sudo service aws-kinesis-agent start
  • Optionally configure the agent to start on system startup:
    sudo chkconfig aws-kinesis-agent on
  • The agent is now running as a system service in the background. It continuously monitors the location specified in the configuration file and emits the data into Firehose, logging agent activity in /var/log/aws-kinesis-agent/aws-kinesis-agent.log.
  • You can configure the agent to use optional settings, which the agent loads from the file /etc/aws-kinesis/agent.json by default (a sample configuration appears after this list).
  • Any changes to the configuration file require stopping and restarting the agent, as shown in the following commands:
    sudo service aws-kinesis-agent stop
    sudo service aws-kinesis-agent start

or sudo service aws-kinesis-agent restart
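For reference, a minimal /etc/aws-kinesis/agent.json pairing a monitored file pattern with a delivery stream might look like the following; the log path and stream name are assumptions for illustration:

{
  "flows": [
    {
      "filePattern": "/var/log/app.log*",
      "deliveryStream": "test-stream"
    }
  ]
}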

You can learn more about the Kinesis Agent in the AWS documentation.

Next Steps

Kinesis Firehose is an efficient, must-try AWS service that handles both high data volumes and high velocities. Gigabytes of streaming data can be processed with ease and made simple to work with and analyze.

To get a solid foundation in Amazon analytics, check out our Analytics Fundamentals for AWS course. You’ll learn everything you need to know about the entire family of AWS analytics services, as well as develop the critical skills you need to bolster your career. Get started today with a free 7-day trial and start learning right away!

 


Written by

Chandan Patra

Cloud Computing and Big Data professional with 10 years of experience in pre-sales, architecture, design, build and troubleshooting with best engineering practices. Specialities: Cloud Computing - AWS, DevOps(Chef), Hadoop Ecosystem, Storm & Kafka, ELK Stack, NoSQL, Java, Spring, Hibernate, Web Service
