The new AWS services announced at re:Invent 2014

It’s the week of re:Invent 2014 in Las Vegas, probably the most important AWS event of the whole year. During the first day yesterday, Andy Jassy, Senior Vice President of AWS, announced a number of new services and major improvements to existing ones. It’s good news for developers, since many of the announcements are aimed at code management and deployment, and overall they are a very nice addition to the set of services already available on AWS.
So, let’s take a closer look at the brand new services that Amazon announced. Some of them are already live, and you can start taking advantage of their new features.
Aurora for Amazon RDS
Aurora is a brand new MySQL-compatible relational database engine, designed to combine high performance with low cost. According to Amazon’s claims, it provides up to five times the performance of MySQL at one tenth the price of a commercial database, along with high availability. Being a database engine for RDS, it’s not a separate service but rather an option for the existing AWS relational DBMS, so it benefits from all the other features and characteristics of RDS. The performance gains claimed by Amazon come from tight integration of the database engine with an SSD-based virtualized storage layer purpose-built for database workloads, which reduces writes to the storage system, minimizes lock contention, and eliminates delays created by database process threads.

  • MySQL-compatible database engine.
  • Up to 5x better performance than standard MySQL.
  • Highly available, durable, scalable, and secure.
  • Available through Amazon RDS as an engine for your database.
  • One tenth the cost of the leading commercial database engines.
  • Upgrade an existing RDS instance or snapshot to Aurora with a single click.
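Since Aurora is exposed through the standard RDS APIs, launching an instance looks much like launching any other RDS database. A minimal sketch with the AWS CLI might look like the following; the instance identifier, instance class, and credentials are placeholders, and the exact parameters may differ, so check the RDS documentation for your account:

```shell
# Sketch only: identifiers and credentials below are placeholders.
aws rds create-db-instance \
    --db-instance-identifier my-aurora-db \
    --db-instance-class db.r3.large \
    --engine aurora \
    --master-username admin \
    --master-user-password <password>
```

Once the instance is up, you connect to it with your usual MySQL client and drivers, exactly as you would with an RDS MySQL instance.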

AWS CodeDeploy

Amazon has an internal code deployment tool called Apollo, which it uses for all of its own deployments and which has been a success across all AWS divisions. Apollo has pushed 50 million deployments over the last 12 months, an impressive average of about 95 deployments per minute. Amazon has now made this service available to all AWS customers under the name CodeDeploy. It is a fully managed, highly scalable service for deploying code to Amazon EC2 instances. AWS CodeDeploy allows you to rapidly release new features, avoid downtime during deployment, and handle the complexity of updating your applications.

  • Rolling deployments.
  • Deployment health tracking.
  • Stop and roll back if your latest deployment is broken.
  • Option to deploy to all of your instances or to groups of instances.
  • Centralized overview of all your deployments.
  • Works with virtually any programming language.
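CodeDeploy drives each deployment from an AppSpec file placed in the root of your application bundle, which maps files to their destinations on the instance and hooks shell scripts into the deployment lifecycle. A minimal sketch, where the destination path and script names are hypothetical:

```yaml
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/myapp
hooks:
  ApplicationStop:
    - location: scripts/stop_server.sh
  AfterInstall:
    - location: scripts/install_dependencies.sh
      runas: root
  ApplicationStart:
    - location: scripts/start_server.sh
```

Because the hooks are just scripts you write, this is also how CodeDeploy stays language-agnostic: it doesn’t care what your application is written in, only how to stop, install, and start it.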

AWS CodePipeline

CodePipeline is another new AWS service aimed at code deployment. It’s a continuous delivery and release automation service designed to perform continuous build, test, integration, and deployment of your whole environment. You can design your development workflow for checking in code, building it, deploying your application into staging, testing it, and releasing it to production. It also allows you to integrate third-party tools into any step of your release process if you have specific needs. This service hasn’t been released yet, but it should become available in early 2015.

  • Enables repeatable, automated integration.
  • Can take code from any repository and apply any kind of policy.
  • Good workflow modeling and visualization.
  • Integrates with the build and deployment tools you already use.

AWS CodeCommit

The third brand new service announced by Amazon for code management and operations is CodeCommit. It’s a managed, scalable source control service that hosts private Git repositories, eliminating the need to operate a separate source code repository. Being based on Git, it supports all the standard functionality of this very popular open source version control system, so it works seamlessly with your existing Git-based tools. It also has team management features that let everyone in your organization browse, edit, and collaborate on projects. This service, too, will only be available next year, but it’s clear that the combination of CodeCommit, CodePipeline, and CodeDeploy is a major advancement for AWS in the DevOps field, making for a very convenient set of solutions.

  • Git-based managed code repository in the cloud.
  • Fully managed, highly available, and scalable.
  • No size limits on repositories or files.
  • Full support for Git and Git-based third-party tools, with good integration with the other AWS Code* services.
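Because CodeCommit speaks standard Git, day-to-day use should amount to nothing more than pointing a remote at the hosted repository and working as usual. The commands below are ordinary Git; the repository URL is a hypothetical example, since the service isn’t available yet and the real endpoint format hasn’t been published:

```shell
# Hypothetical repository URL; the actual endpoint will come from the CodeCommit console.
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/MyRepo
cd MyRepo

# Edit, stage, commit, and push exactly as with any other Git remote.
git add .
git commit -m "First commit"
git push origin master
```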

AWS Key Management Service

AWS Key Management Service (KMS) fills a hole in encryption key management and compliance. It makes it easy to create and control the keys used to encrypt your data, and it uses Hardware Security Modules (HSMs) to protect those keys. It is well integrated with other AWS services, including AWS CloudTrail, which provides logs of key usage to help you meet your regulatory and compliance needs.

  • One-click encryption from the AWS Console, APIs, or SDKs.
  • Centralized key management.
  • Can enforce automatic key rotation.
  • Full logging to CloudTrail.
  • Highly available, durable, and well integrated with other AWS services.
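With the AWS CLI, creating a master key and encrypting a small piece of data against it takes just a couple of calls. A sketch, where the alias, key ID, and file names are placeholders; for anything larger than a few kilobytes you would normally have KMS generate a data key and encrypt the data locally instead:

```shell
# Create a master key and give it a friendly alias (names are placeholders).
aws kms create-key --description "Master key for my application"
aws kms create-alias --alias-name alias/my-app-key --target-key-id <key-id>

# Encrypt a small secret directly with the master key.
aws kms encrypt --key-id alias/my-app-key \
    --plaintext fileb://secret.txt \
    --query CiphertextBlob --output text > secret.encrypted
```

Every one of these calls is then recorded in CloudTrail, which is what makes the audit and compliance story work.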

AWS Config

The last service announced by Amazon is AWS Config, a new service for managing resource dependencies and auditing from a centralized location. It provides an inventory of your AWS resources, configuration history, and configuration change notifications to enable security and governance. With AWS Config you can discover your existing AWS resources, export a complete inventory of them with all configuration details, and determine how a resource was configured at any point in time.

  • It helps you to solve your CMDB issues in the cloud.
  • Full visibility of all the resources from a centralized location.
  • Infer and manage the relationships between resources.
  • Identify the blast radius of a configuration change.
  • Auditing and troubleshooting of configuration changes.
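Once the service is recording your resources, the configuration history becomes queryable through the CLI as well as the console. A sketch, with a placeholder instance ID:

```shell
# Check that a configuration recorder is set up in this region.
aws configservice describe-configuration-recorders

# Retrieve the configuration history of a single EC2 instance.
aws configservice get-resource-config-history \
    --resource-type AWS::EC2::Instance \
    --resource-id i-1234567890abcdef0
```

The second call is the troubleshooting workhorse: it shows what the resource looked like at each recorded point in time, so you can tie an outage back to the configuration change that caused it.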

