Getting Started with Chef on Amazon AWS

This is a guest post from 47Line Technologies.
As explained in the last blog post, Chef consists of three main elements: a server, one or more nodes, and at least one workstation.

  • The server acts as a hub that is available to every node. All chef-client nodes are registered with the server, which holds all the cookbooks, recipes, and policies. Clients communicate with the server to fetch the right configuration elements and apply them to the nodes.
  • The workstation is the development machine on which configuration elements like cookbooks, recipes, and policies are defined. Configuration elements are synchronized with the chef-repo and uploaded to the server with the knife command.
  • Nodes run the chef-client, which performs all the infrastructure automation.

In this blog post, we will set up:

  • Chef server on an AWS EC2 Ubuntu instance
  • Workstation on an AWS EC2 Ubuntu instance

Chef Server Installation on Amazon EC2 Ubuntu 12.04

Download the Chef server from the Chef website, selecting the package appropriate for your platform; the steps below use the Ubuntu 12.04 64-bit package.

Launch an instance of Ubuntu Server 12.04 LTS (PV) – ami-3c39686e (64-bit) in your AWS account and SSH to the server with the key file and the username ubuntu, as shown below.
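
If you prefer the command line, the same instance can be launched with the AWS CLI. This is a minimal sketch, not part of the original walkthrough; the key pair name chef-key and the security group chef-server are assumptions that must already exist in your account, and the instance type is just a reasonable default:

# launch the Ubuntu 12.04 PV AMI (assumed key pair and security group)
aws ec2 run-instances --image-id ami-3c39686e --count 1 --instance-type m3.medium --key-name chef-key --security-groups chef-server
# SSH in with the key file and the ubuntu username
ssh -i chef-key.pem ubuntu@<<Public IP of the instance>>

Once you are connected, install the Chef server: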

# switch to home folder
cd ~
# Download the Chef Server Package
wget https://opscode-omnibus-packages.s3.amazonaws.com/ubuntu/12.04/x86_64/chef-server_11.0.10-1.ubuntu.12.04_amd64.deb
# Install the Chef Server
sudo dpkg -i chef-server*
# reconfigure the service for your machine
sudo chef-server-ctl reconfigure
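
After the reconfigure completes, you can optionally confirm that all of the server's components came up; chef-server-ctl provides a status subcommand for this:

# verify that all Chef server components are running
sudo chef-server-ctl status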

After the above step, you can access the web interface by typing https://<<ElasticIP of the Chef Server>> into your browser. Because the SSL certificate is signed by an authority not recognized by your browser, you will get a warning. Click on the “Proceed anyway” button. Ensure port 443 is open in the security group associated with the server.
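
If the port is not yet open, a rule can also be added from the command line. This is a minimal sketch using the AWS CLI, assuming the security group attached to the instance is named chef-server (substitute your own group name):

# allow inbound HTTPS to the Chef server from anywhere
aws ec2 authorize-security-group-ingress --group-name chef-server --protocol tcp --port 443 --cidr 0.0.0.0/0
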
Log in with the default admin credentials:

username: admin
password: p@ssw0rd1

Kindly change the default password immediately after logging in for the first time!

Setting up the Workstation

The first step in setting up the workstation is to install Git or another VCS of your choice; the Chef community heavily uses Git.

sudo apt-get update
sudo apt-get install git
# Download and run the client installation script from the Chef website.
curl -L https://www.opscode.com/chef/install.sh | sudo bash
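
To confirm that the installation script succeeded, you can check the client version:

# verify the Chef client installation
chef-client --version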

With the Chef package installed, the next step is to clone the chef-repo skeleton directory.

cd ~
git clone https://github.com/opscode/chef-repo.git

This will create a directory called chef-repo in your home directory. This is where the entire configuration will be contained.
Create a .chef directory inside chef-repo to save the authentication and configuration files.

mkdir ~/chef-repo/.chef

 

Log in to the Chef Server (https://<<ElasticIP of the Chef Server>>) with the admin credentials.
Click on the “Clients” tab in the top navigation bar.

Click on the “Edit” button associated with the chef-validator client. Regenerate the private key by selecting that check box and clicking “Save Client”.

Copy the private key and save it in the chef-validator.pem file in the ~/chef-repo/.chef directory.
Similarly, click on the “Users” tab in the navigation bar, click on the “Edit” hyperlink associated with the admin user, and regenerate the private key. Copy the private key and save it in the admin.pem file in the ~/chef-repo/.chef directory, as shown below.
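
One way to save each key from the shell is with a heredoc, pasting the key contents between the markers; the file names match the ones used above, and tightening the permissions is a good habit since these are private keys:

# save the regenerated key (paste the key contents between the EOF markers)
cat > ~/chef-repo/.chef/chef-validator.pem <<'EOF'
-----BEGIN RSA PRIVATE KEY-----
... (paste the key contents here) ...
-----END RSA PRIVATE KEY-----
EOF
# repeat for admin.pem, then lock the permissions down
chmod 600 ~/chef-repo/.chef/*.pem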

The next step is to configure the knife command.
Knife is a command-line tool that provides an interface between a local chef-repo and the Chef server. It helps you provision resources, manage recipes, cookbooks, and nodes, and more.

knife configure --initial

The command will prompt you for the path to the .pem files, the server URL, and a username and password, and then write the knife.rb configuration file, sketched below.
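
For reference, the generated ~/chef-repo/.chef/knife.rb should look roughly like this sketch; the exact entries depend on your answers to the prompts, and the paths simply reflect the directories created earlier:

# sketch of ~/chef-repo/.chef/knife.rb (values depend on your prompt answers)
log_level                :info
log_location             STDOUT
node_name                'admin'
client_key               '/home/ubuntu/chef-repo/.chef/admin.pem'
validation_client_name   'chef-validator'
validation_key           '/home/ubuntu/chef-repo/.chef/chef-validator.pem'
chef_server_url          'https://<<ElasticIP of the Chef Server>>'
cookbook_path            [ '/home/ubuntu/chef-repo/cookbooks' ]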

Add the .chef directory to the .gitignore list so that the private keys are not committed.
Set up your email and name with Git, and add the Ruby embedded with Chef to the PATH variable, as in the sketch below.
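
A minimal sketch of those steps; the email address, name, and the /opt/chef install location are assumptions to substitute with your own values:

# keep the private keys out of version control
echo '.chef' >> ~/chef-repo/.gitignore
# identify yourself to Git (substitute your own details)
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
# put Chef's embedded Ruby on the PATH (the omnibus package installs under /opt/chef)
echo 'export PATH="/opt/chef/embedded/bin:$PATH"' >> ~/.bash_profile
source ~/.bash_profile
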
To ensure everything is set up correctly, run the “knife user list” command; it will list all the users registered with the server.
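
For example:

cd ~/chef-repo
# should print the users known to the server; at this point, just "admin"
knife user list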

In the next blog post, we will look into bootstrapping EC2 instances with Chef.
Contributed by Santhosh Daivajna, Senior Cloud Consultant at 47Line Technologies

Written by

47Line builds solutions to critical business problems using the “cloud as the backbone”. The team has been working in the cloud computing domain for the last 6 years and has proven thought leadership in Cloud and Big Data technologies.
