Everything You Ever Wanted to Know about Amazon Kinesis Firehose

Amazon Kinesis Firehose makes it easy to load streaming data into AWS. Here’s what you need to know.

In an earlier blog post, I introduced you to Amazon Kinesis, the real-time streaming data service from Amazon. Now we will discuss the equally important Amazon Kinesis Firehose service and how you can leverage it to easily load streaming data into AWS. We'll cover an overview of the service and the key concepts you need to know, and show you how to get started today. To make your life easier, I've included screenshots to illustrate the individual steps. Read on to learn more about Amazon Kinesis Firehose!

Introducing Amazon Kinesis Firehose

Amazon Kinesis Firehose is simple to use, and the process of analyzing massive volumes of streaming data requires only five easy steps:

  1. Capture and submit streaming data to Amazon Kinesis Firehose.
  2. Amazon Kinesis Firehose loads streaming data to Amazon S3.
  3. Analyze streaming data using any BI tools of your choice.
  4. ???
  5. Profit!

Amazon Kinesis Firehose is a managed service, so you won't have to wrestle with administrative tasks; it automatically scales to match your data throughput. To minimize storage use and for security purposes, Amazon Kinesis Firehose also batches, compresses, and encrypts data before loading it. The service manages all the underlying infrastructure, storage, networking, and configuration required to capture and load data into Amazon S3 or Amazon Redshift. Beyond the elastic nature of its underlying infrastructure, Amazon Kinesis Firehose synchronously replicates data across three facilities in an AWS Region, providing high availability and durability for the data as it is transported to its destination.

The Key Concepts of Amazon Kinesis Firehose You Need to Know

You need to be familiar with a few terms before we dive into using Firehose. Let's take a closer look at each of them.

Delivery Stream: Users submit data to Amazon Kinesis Firehose by creating a Delivery Stream.
Records: A Record is the data that a user submits to the delivery stream. Each record can be up to 1,000 KB.
Data Producers: In Amazon Kinesis parlance, producers are the entities that generate streaming data: a web server emitting log data, a Twitter stream submitting data about a particular trend, a clickstream, a log shipper such as Logstash, or anything else that fits your requirements.
Buffer Size and Buffer Interval: Amazon Kinesis Firehose buffers incoming streaming data to a certain size (Buffer Size, 1 to 128 MB) or for a certain period of time (Buffer Interval, 60 to 900 seconds) before delivering it to your destination. Whichever condition is satisfied first triggers the data delivery to your S3 bucket. In circumstances where delivery to the destination falls behind writes to the delivery stream, Firehose raises the buffer size dynamically to catch up and make sure that all data is delivered. A sketch of how these settings appear when creating a stream from the CLI follows this list.
Amazon Kinesis Agents: The Amazon Kinesis Agent is a pre-built Java application for Linux-based servers (only Amazon Linux and RHEL at the time of writing) that monitors files, such as log files, and continuously collects and sends data to your delivery stream.
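To make the buffering settings concrete, here is a minimal sketch of creating a delivery stream from the AWS CLI with explicit buffering hints. The stream name, account ID, bucket ARN, and role ARN are placeholders; substitute resources from your own account:

aws firehose create-delivery-stream \
    --delivery-stream-name test-stream \
    --s3-destination-configuration '{
        "RoleARN": "arn:aws:iam::123456789012:role/firehose_delivery_role",
        "BucketARN": "arn:aws:s3:::my-firehose-bucket",
        "Prefix": "firehose/",
        "BufferingHints": { "SizeInMBs": 5, "IntervalInSeconds": 300 },
        "CompressionFormat": "GZIP"
    }'

With these hints, Firehose flushes to S3 whenever 5 MB has accumulated or 300 seconds have elapsed, whichever comes first.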

Getting Started with Kinesis Firehose (Screenshots included)

Note: To use Kinesis Firehose, you should already have an AWS account.

Log in to the AWS console and go to the Kinesis service.

[Screenshot: Amazon Kinesis Firehose]

Go to Kinesis Firehose to create a Delivery Stream.

[Screenshot: Amazon Kinesis Firehose]

The next step is to create a delivery stream as follows. Choose either S3 or Redshift as your destination.

[Screenshot: Amazon Kinesis Firehose]

You have to provide a stream name, select a pre-existing S3 bucket (or create a new one in a region of your choice), and add a prefix, which becomes a folder inside the bucket.

[Screenshot: Amazon Kinesis Firehose]
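One detail worth knowing: before writing objects, Firehose also appends a UTC time prefix of the form YYYY/MM/DD/HH after your own prefix. So with a hypothetical bucket my-firehose-bucket and a prefix of firehose/, delivered objects land under keys like:

my-firehose-bucket/firehose/2019/09/12/10/test-stream-1-2019-09-12-10-15-21-<random-id>

The exact object-name suffix is generated by the service; the point is that your prefix behaves like a top-level folder.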

The next step is to configure various parameters such as buffer size, buffer interval, compression, encryption, and IAM roles. We have configured this as below:

[Screenshot: Amazon Kinesis Firehose]

To create a new IAM role or update an existing one, click on Firehose Delivery IAM Role. This takes you to a new page; here we have accepted the default policy and role name and clicked Allow. A sketch of the kind of S3 permissions this role grants appears below.
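For reference, the role grants Firehose write access to your bucket. A minimal sketch of such a policy document, with the bucket name as a placeholder (the actual generated policy may contain additional statements):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:GetBucketLocation",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-firehose-bucket",
        "arn:aws:s3:::my-firehose-bucket/*"
      ]
    }
  ]
}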

You will be taken back to the previous configuration page, where you click Next.

The next page is a review page, where you get a chance to double-check your configuration.

[Screenshot: Amazon Kinesis Firehose]

If everything looks OK, click "Create Delivery Stream".

The stream will be created. You then have options to Edit, Delete, Monitor, and Send Data to the stream.

[Screenshot: Amazon Kinesis Firehose]
  • Your stream is ready, and you can send data to it using the aforementioned Amazon Kinesis Agent, a Java program, or the AWS CLI.
  • We will use the AWS CLI and a small Java program here to keep things simple; in the real world, data arrives from many sources at high velocity and volume.
  • AWS CLI for generating log records (a one-off record; a sketch of a simple loop follows the command):
aws firehose put-record --delivery-stream-name <stream-name> --record Data="<log string>\n"
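To send more than a single record from the shell, you can simply wrap the same command in a loop. A quick sketch, assuming the stream is named test-stream to match the Java example below:

# Send 100 sample log lines to the delivery stream, one record per call.
for i in $(seq 1 100); do
  aws firehose put-record \
      --delivery-stream-name test-stream \
      --record Data="sample log line $i\n"
done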
  • Sample Java program to submit a log record to the stream:
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient;
import com.amazonaws.services.kinesisfirehose.model.PutRecordRequest;
import com.amazonaws.services.kinesisfirehose.model.PutRecordResult;
import com.amazonaws.services.kinesisfirehose.model.Record;

public class KinesisFireHoseSampleApp {

    private static AmazonKinesisFirehoseClient firehoseClient;

    // Build the Firehose client from static credentials.
    // Replace the placeholders with your own access key and secret key.
    private static void init() throws Exception {
        AWSCredentials credentials = new BasicAWSCredentials("xxxxxxxxxxxxxx", "yyyyyyyyyyyyyyyyyyyy");
        firehoseClient = new AmazonKinesisFirehoseClient(credentials);
    }

    public static void main(String[] args) throws Exception {
        init();

        String data = "my log data" + "\n"; // add \n as a record separator

        // Wrap the payload in a Record and submit it to the delivery stream.
        Record record = new Record();
        record.setData(ByteBuffer.wrap(data.getBytes(StandardCharsets.UTF_8)));

        PutRecordRequest putRecordRequest = new PutRecordRequest()
                .withDeliveryStreamName("test-stream")
                .withRecord(record);

        PutRecordResult result = firehoseClient.putRecord(putRecordRequest);
        System.out.println("Record inserted with ID: " + result.getRecordId());
    }
}

You can see the metrics associated with the stream in the CloudWatch dashboard. Here are a few samples:

[Screenshot: Amazon Kinesis Firehose]
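If you prefer numbers to graphs, the same metrics can be pulled from the CLI. A sketch, assuming you replace the stream name and time window below with your own:

aws cloudwatch get-metric-statistics \
    --namespace AWS/Firehose \
    --metric-name IncomingRecords \
    --dimensions Name=DeliveryStreamName,Value=test-stream \
    --start-time 2019-09-12T00:00:00Z \
    --end-time 2019-09-12T12:00:00Z \
    --period 300 \
    --statistics Sum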

The above example is a very basic one: the Java client sends a single log record each time the program is run. Please note that we need aws-java-sdk-1.10.43 and amazon-kinesis-client-1.6.1 in the project library to run the application. After submitting the requests, you can see the graphs plotted against the requested records. However, this is not the ideal scenario for Kinesis Firehose: the example only shows how a record is submitted to a Firehose stream, and you need a high-volume or high-velocity data stream to realize the benefits of Kinesis Firehose. One of the best ways to send your logs to Firehose is to install the Amazon Kinesis Agent, but first, the batch sketch below shows the more typical pattern of submitting many records per request.
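When you do have volume, batching records is far more efficient than one HTTP request per record. Here is a minimal sketch using the SDK's putRecordBatch call; the stream name and credential placeholders follow the earlier example:

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient;
import com.amazonaws.services.kinesisfirehose.model.PutRecordBatchRequest;
import com.amazonaws.services.kinesisfirehose.model.PutRecordBatchResult;
import com.amazonaws.services.kinesisfirehose.model.Record;

public class KinesisFireHoseBatchApp {
    public static void main(String[] args) {
        AmazonKinesisFirehoseClient firehoseClient = new AmazonKinesisFirehoseClient(
                new BasicAWSCredentials("xxxxxxxxxxxxxx", "yyyyyyyyyyyyyyyyyyyy"));

        // Collect up to 500 records per request (the PutRecordBatch limit).
        List<Record> records = new ArrayList<>();
        for (int i = 0; i < 500; i++) {
            String data = "my log data " + i + "\n"; // \n as a record separator
            records.add(new Record()
                    .withData(ByteBuffer.wrap(data.getBytes(StandardCharsets.UTF_8))));
        }

        PutRecordBatchResult result = firehoseClient.putRecordBatch(
                new PutRecordBatchRequest()
                        .withDeliveryStreamName("test-stream")
                        .withRecords(records));

        // PutRecordBatch can partially fail; always check the failed count.
        System.out.println("Failed records: " + result.getFailedPutCount());
    }
}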

Amazon Kinesis Agent Setup Process 

The Amazon Kinesis Agent is a stand-alone Java application that offers an easy way to collect and ingest data into Firehose: it continuously monitors a set of files and sends new data records to your delivery stream. You can install the agent on Linux-based server environments such as web servers, front ends, log servers, and database servers, and configure it by specifying the log files to monitor and the delivery stream names. Afterwards, the agent continuously collects data from the log files and submits it to the delivery stream.

  • The supported Linux versions are Amazon Linux AMI version 2015.09 or later and Red Hat Enterprise Linux version 7 or later.
  • Run the following command to install the agent:
    sudo yum install -y aws-kinesis-agent
  • Alternatively, you can download and install it on RHEL:
    sudo yum install -y https://s3.amazonaws.com/streaming-data-agent/aws-kinesis-agent-1.0-1.amzn1.noarch.rpm
  • Start the agent manually:
    sudo service aws-kinesis-agent start
  • Optionally, configure the agent to start on system startup:
    sudo chkconfig aws-kinesis-agent on
  • The agent is now running as a system service in the background. It continuously monitors the location specified in the configuration file and emits the data into Firehose, logging agent activity in /var/log/aws-kinesis-agent/aws-kinesis-agent.log.
  • You can configure the agent with optional settings, which it loads from the file /etc/aws-kinesis/agent.json by default; a sample configuration follows this list.
  • Any change to the configuration file requires stopping and restarting the agent:
    sudo service aws-kinesis-agent stop
    sudo service aws-kinesis-agent start

or, equivalently:

    sudo service aws-kinesis-agent restart
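For reference, here is a minimal sketch of an agent.json that tails an Apache access log into the delivery stream created earlier. The endpoint, file path, and stream name are placeholders for your own environment:

{
  "firehose.endpoint": "firehose.us-east-1.amazonaws.com",
  "flows": [
    {
      "filePattern": "/var/log/httpd/access_log*",
      "deliveryStream": "test-stream"
    }
  ]
}

Each entry in "flows" pairs a file pattern with a destination stream, so one agent can ship several logs to several streams.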

You can learn more about the Kinesis Agent in the AWS documentation.

Next Steps

Kinesis Firehose is an efficient, must-try service from AWS that handles both high data volumes and high velocities. Gigabytes of streaming data can be loaded with ease and made simple to work with and analyze.

To get a solid foundation in Amazon analytics, check out our Analytics Fundamentals for AWS course. You’ll learn everything you need to know about the entire family of AWS analytics services, as well as develop the critical skills you need to bolster your career. Get started today with a free 7-day trial and start learning right away!

 


Written by

Chandan Patra

Cloud Computing and Big Data professional with 10 years of experience in pre-sales, architecture, design, build, and troubleshooting with best engineering practices. Specialties: Cloud Computing (AWS), DevOps (Chef), Hadoop Ecosystem, Storm & Kafka, ELK Stack, NoSQL, Java, Spring, Hibernate, Web Services.
