HashiCorp Vault provides a simple and effective way to manage security in cloud infrastructure. The HashiCorp Vault service secures, stores, and tightly controls access to tokens, passwords, certificates, API keys, and other secrets in modern computing.
This course will enable you to recognize, explain, and implement the services and functions provided by the HashiCorp Vault service.
In this course we learn to recognize and implement the core HashiCorp Vault services in cloud infrastructure. The topics we cover are as follows:
- Vault architecture and its core components
- Vault policies and how they are used to grant or forbid access to operations in Vault
- Secrets and secret management as performed within Vault
- Vault cubbyholes and how they can be utilized
- Vault dynamic secrets
- Vault authentication and Vault identities
This course will appeal to anyone looking to extend their knowledge of cloud security best practices, and to learn more about the tools and services available to help manage cloud security. If you are performing any of the roles below, we recommend completing this course.
- Architects and developers
- System administrators
- Security specialists
- DevOps specialists
- Anyone else interested in managing and maintaining secrets
At the end of this course you will be able to explain and implement the HashiCorp Vault service, and you will also be able to use the Vault CLI and API to execute tasks related to Vault administration. By completing this course, you will:
- Understand the core principles of Vault, including how Vault can be used to manage and maintain secrets
- Understand the key benefits of using Vault, including how to deploy and configure it within your own environments
- Be able to evaluate and select HashiCorp Vault services
- Know how to use the Vault CLI and API to execute tasks related to administration and configuration
We recommend completing the Cloud Academy DevOps Fundamentals Learning Path so you have a basic understanding of system administration and configuration tasks.
In this lecture we'll introduce you to Dynamic Secrets, a specialized type of secrets engine capable of dynamically generating secrets for different backends.
The agenda for this lecture is as follows: we'll cover use cases for which dynamic secret generation is beneficial; we'll show you how to set up and configure a secrets engine; we'll discuss roles and policies that help secure and control dynamic secret generation; we'll review how to acquire dynamically generated secret credentials; and finally, we'll examine the lifecycle management of these types of secrets. In this section, we'll set the scene for dynamic secrets by examining current challenges in managing secrets.
A common challenge that many organizations encounter is that of managing, isolating, and protecting secrets. Organizations evolve over time along many dimensions: company growth, divisional reorganizations, and staff turnover, to name a few. Secret management often fails to keep pace with this ongoing change, and when it is done poorly, the result is secret exposure and misuse. Rather than having statically assigned, hard-coded secrets spread around and unmanaged, what if we could dynamically generate these secrets, complete with lifetime and lease management? This is exactly what Vault provides with its Dynamic Secrets feature set, as we'll see in the next slide.
Vault can be configured to dynamically generate secrets and, in essence, act as Secrets as a Service. By leveraging this capability, you can take back control of the spread of secrets and their associated lifetimes. Dynamic secrets allow you to generate credentials on demand, complete with lifecycle management. Vault policies are used to control who has access to create dynamic secrets, ensuring the system isn't abused. To help clarify the benefits of dynamic secrets, let's walk through an example in which we have an application that reads and writes customer data to a database. Rather than store static database credentials within the application, we can configure the application to ask Vault for a database credential.
In this use case, a Vault administrator has set up a database dynamic secrets engine and has configured a TTL for any generated database credential, so that it is automatically revoked once it is no longer needed. Each application instance can request unique credentials, so credentials never need to be shared, and by making these credentials short-lived, you reduce the chance of them being compromised. If an application is compromised, the credentials used by that application can be revoked rather than rotating a global set of credentials. In another example, when an app needs to access an Amazon S3 bucket, it asks Vault for AWS credentials. Vault will generate an AWS credential granting permission to access the S3 bucket, and will automatically revoke that credential once its TTL expires. Vault comes with several dynamic secrets engines, the Database and AWS engines being two examples. Let's clarify the workflow associated with using a dynamic secrets engine:
- First, activate or enable the particular dynamic secrets engine.
- Next, configure the secrets engine; the specific configuration differs for each backend.
- Then, create Vault roles and policies to control and authorize who can use and generate dynamic credentials, maintaining the rule of least privilege across the overall system.
- With the configuration in place, request credentials to be dynamically generated.
- Finally, if required, renew the lease on the returned credential before it expires, assuming a TTL has been set on it.
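As a sketch of the database example above, the commands below show how an administrator might wire up Vault's database secrets engine for PostgreSQL. The connection name `my-postgres`, the role name `app-role`, the connection URL, and the TTLs are illustrative assumptions, not values from the lecture.

```shell
# Enable the database secrets engine (illustrative values throughout).
vault secrets enable database

# Configure a PostgreSQL connection; the name and URL are assumptions.
vault write database/config/my-postgres \
    plugin_name=postgresql-database-plugin \
    connection_url="postgresql://{{username}}:{{password}}@db.example.com:5432/app" \
    allowed_roles="app-role" \
    username="vault-admin" \
    password="admin-password"

# Define a role whose creation statements mint a short-lived database user.
vault write database/roles/app-role \
    db_name=my-postgres \
    creation_statements="CREATE ROLE \"{{name}}\" WITH LOGIN PASSWORD '{{password}}' VALID UNTIL '{{expiration}}';" \
    default_ttl="1h" \
    max_ttl="24h"

# Each application instance can now request its own unique credential.
vault read database/creds/app-role
```

Because each instance reads its own credential, revoking one compromised lease leaves every other instance unaffected.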
In this section we'll go step by step through the process of setting up a Vault dynamic secrets engine. Vault ships with several secrets engines, the majority of which are disabled by default. Before a particular engine can be used, it must first be enabled, which can be done using the Vault CLI. Running the command vault secrets -h provides help and lists the subcommands available for managing secrets engines:
- disable, which disables a secrets engine
- enable, which enables one
- list, which lists all the currently enabled secrets engines
- move, which moves an already enabled secrets engine to a different path
- tune, which tunes a secrets engine's configuration, for example changing the TTLs for the lease
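The subcommands above can be illustrated as follows; the custom path `aws-east` is an example of my own, not from the lecture:

```shell
# List the secrets engines currently enabled on this Vault server.
vault secrets list

# Enable the AWS secrets engine at its default path (aws/).
vault secrets enable aws

# Enable a second instance of the same engine at a custom path.
vault secrets enable -path=aws-east aws

# Move an enabled engine to a different path.
vault secrets move aws-east aws-us-east-1

# Tune an engine's default lease TTL.
vault secrets tune -default-lease-ttl=1h aws/

# Disable an engine; this also revokes the secrets it issued.
vault secrets disable aws-us-east-1
```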
Let's walk through an example where we use the Vault CLI to enable the AWS secrets engine. Before we start, we can familiarize ourselves with how to perform this enablement by consulting the help directly, executing the command vault secrets enable -h. Having consulted the help, we enable the AWS secrets engine by running the command vault secrets enable aws. With the AWS secrets engine successfully enabled, we next consult the contextual pathing information for the engine by running the command vault path-help aws. This provides us with all the possible paths supported by the AWS secrets engine. Knowing that we need to configure the AWS IAM credentials that the Vault AWS secrets engine will use, we can see here that we need to write them to the config/root path. To establish and configure those IAM credentials, we simply run the command vault write aws/config/root with the access_key and secret_key parameters supplied. Finally, let's adjust the security of the generated credentials returned by the AWS secrets engine. As an example, let's ensure that the AWS credentials dynamically created by the engine expire one hour after creation, but with a 24-hour renewal window; that is, the credential's lease can be renewed multiple times as needed within 24 hours of its creation. With this type of configuration in place, we protect ourselves from having any discovered AWS credentials floating around the system in a valid and usable state.
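Putting those steps together, a minimal sketch looks like this; the access key and secret key values are obvious placeholders:

```shell
# Consult the help for enabling secrets engines.
vault secrets enable -h

# Enable the AWS secrets engine at its default path.
vault secrets enable aws

# Inspect the paths supported by the AWS secrets engine.
vault path-help aws

# Configure the root IAM credentials the engine will use
# (placeholder values shown; never hard-code real keys).
vault write aws/config/root \
    access_key=AKIAEXAMPLEKEYID \
    secret_key=exampleSecretAccessKeyValue \
    region=us-east-1

# Generated credentials expire after 1 hour,
# renewable up to 24 hours from creation.
vault write aws/config/lease lease=1h lease_max=24h
```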
In this section we'll examine Vault roles and how they can be used to govern who can do what within a secrets engine. When setting up a dynamic secrets engine, you have the option of configuring roles. A role, in the context of a secrets engine, allows you to control and manage what privileges are granted within the backend. For example, if we were setting up the AWS secrets engine to provide credentials to access an S3 bucket, we could establish two roles: one for read-only access to the S3 bucket, and another for write-only access. When we are ready to request a dynamically generated AWS credential from Vault, we specify the name of the role and get back a credential mapped to the permissions associated with that particular role. This feature helps us enforce the rule of least privilege for the different actors within our systems. When setting up roles for the AWS dynamic secrets engine, we need to create IAM policies that specify the least-privilege rule sets for the various roles we want to establish. In the IAM policy example shown here, the support policy JSON file provides read-only access to an S3 bucket, whereas the webapp policy JSON file provides write-only access. Taking these IAM policies, we proceed by using the Vault CLI to upload them into Vault and bind them to named roles. In the examples shown here, the first command creates a named role called support, bound to the support policy JSON file, giving it read-only access to the S3 bucket. The second command creates a named role called webapp, bound to the webapp policy JSON file, giving it write-only access to the S3 bucket. Later in the lecture we'll show you how to request dynamically generated AWS credentials for these roles.
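A sketch of the role bindings described above; I'm assuming the IAM policy documents are saved locally as `support-policy.json` and `webapp-policy.json`, and the parameter names follow recent Vault versions:

```shell
# support-policy.json and webapp-policy.json are assumed local IAM policy
# documents granting read-only and write-only S3 access respectively.

# Create a "support" role bound to the read-only IAM policy.
vault write aws/roles/support \
    credential_type=iam_user \
    policy_document=@support-policy.json

# Create a "webapp" role bound to the write-only IAM policy.
vault write aws/roles/webapp \
    credential_type=iam_user \
    policy_document=@webapp-policy.json
```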
In this section we'll review Vault policies and how they can be used to control who can request and receive dynamically generated credentials. Given the power that dynamically generated credentials afford us, it is important to control which users and/or applications can use them. We do so by implementing Vault policies. Continuing with our AWS S3 bucket example from the previous slides, we want to grant read access to the generated AWS credentials that Vault creates for us. Additionally, we can establish policy that allows an expiring lease on those dynamically generated AWS credentials to be renewed. Let's see how we would do this using Vault policies. In the examples shown here, the left-hand support policy provides the read capability on the aws/creds/support path, allowing access to any credentials generated and bound to the support role, and additionally allows the lease on the credential to be renewed. The right-hand webapp policy provides the read capability on the aws/creds/webapp path, allowing access to any credentials generated and bound to the webapp role, and likewise allows lease renewal. To create either of these policies, you would use the Vault CLI and run the command vault policy write, providing both the policy name and the path to the policy file.
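A minimal sketch of the support policy and its upload; the file name `support.hcl` is my own choice:

```shell
# Write the support policy to a local file. It allows reading
# support-role credentials and renewing their leases.
cat > support.hcl <<'EOF'
path "aws/creds/support" {
  capabilities = ["read"]
}
path "sys/leases/renew" {
  capabilities = ["update"]
}
EOF

# Upload the policy into Vault under the name "support".
vault policy write support support.hcl
```

The webapp policy would be identical in shape, with `aws/creds/webapp` in place of `aws/creds/support`.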
In this section we'll show you how to actually acquire a dynamically generated credential. With the AWS secrets engine enabled and the roles and policies created, we are ready to dynamically generate a credential. Using the Vault CLI, we simply run the command vault read aws/creds/support to generate an AWS credential that we can use to perform read-only operations on the S3 bucket. In the example shown here, the AWS secrets engine returns an AWS credential consisting of an access_key and a secret_key. The credential has a one-hour lease set on it, and if required, the lease can be renewed.
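The request looks like this; the field values in the comments are illustrative, not real output:

```shell
# Request a read-only AWS credential from the support role.
vault read aws/creds/support

# Fields returned typically include (values here are placeholders):
#   lease_id           aws/creds/support/<generated-lease-id>
#   lease_duration     1h
#   lease_renewable    true
#   access_key         AKIA...
#   secret_key         ...
```

Each invocation returns a brand-new, unique credential, which is what lets every application instance hold its own revocable identity.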
In this section we'll dive deeper into secret leases. Almost everything in Vault has an associated lease, and when that lease expires, the secret is revoked. This includes tokens as well: a token, if not renewed, will automatically expire. When a new token or secret is created, it is a child of its creator. If the parent is revoked or expires, then so are all of its children, regardless of their own leases. A child may be a token, secret, or authentication created by a parent; a parent is almost always a token.
In this scenario, the lease with ID 1d3f will expire in an hour. If a token or secret with a lease is not renewed before the lease expires, it will be revoked by the Vault server. When it's revoked, it takes its child 794b with it, even though the child has one more hour before it expires. Then, two hours later, b519 will be revoked and will take its child 6a2c with it. Let's walk through an example where we perform a lease renewal. Using the Vault CLI, we'll consult the help for lease renewals by running the command vault lease renew -h. To perform a lease renewal, we simply run the command vault lease renew and pass in the lease_id parameter. The example shown here shows us renewing the lease for an existing support-role-bound credential; in doing so, the credential remains valid for another hour. Leases can also be revoked using the Vault CLI. Revoking a lease blocks the associated credential from exercising any further privilege within the configured backend. In the examples shown here, we can either revoke a specific lease by its lease ID, or a group of leases that share a common lease ID prefix. When working with leases, a best practice is to perform renewal at the halfway mark of the lease duration. For example, in our AWS scenario, the default TTL was set at one hour; if continuous use is required, we would attempt to renew the lease every 30 minutes. If renewal fails, fall back to re-reading the credentials path, which will generate a fresh credential.
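The lease operations described above can be sketched as follows; the lease ID shown is a made-up placeholder:

```shell
# Consult the help for lease renewal.
vault lease renew -h

# Renew a lease by its lease_id (placeholder ID shown).
vault lease renew aws/creds/support/abc123-placeholder-lease-id

# Optionally request a specific renewal increment.
vault lease renew -increment=1h aws/creds/support/abc123-placeholder-lease-id

# Revoke a single lease, immediately invalidating its credential.
vault lease revoke aws/creds/support/abc123-placeholder-lease-id

# Revoke every lease sharing a common prefix, e.g. all
# credentials ever issued for the support role.
vault lease revoke -prefix aws/creds/support
```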
Okay, that completes this lecture on Dynamic Secrets. Go ahead and close this lecture, and we'll see you shortly in the next one.
Jeremy is a Content Lead Architect and DevOps SME here at Cloud Academy where he specializes in developing DevOps technical training documentation.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 25+ years. In recent times, Jeremy has been focused on DevOps, Cloud (AWS, Azure, GCP), Security, Kubernetes, and Machine Learning.
Jeremy holds professional certifications for AWS, Azure, GCP, Terraform, Kubernetes (CKA, CKAD, CKS).