Overview
Difficulty: Intermediate
Duration: 40m
Students: 2075
Ratings: 4.7/5
Description

This course provides an overview of Redis Cache and how to create a Redis Cache instance in Azure. With Redis Cache deployed in Azure, we’ll then connect an application to the cache.
Next, we’ll walk through the process of storing and retrieving data in Redis Cache. After covering Redis Cache, we’ll look at what a CDN is and what it’s used for, and then develop some code for leveraging a CDN. As we wrap up the course, we’ll cover the process for invalidating data in both Redis Cache and a CDN.

Intended Audience

This course is intended for IT professionals who are interested in earning Azure certification and those who need to incorporate Redis Cache or CDN into their solutions. To get the most from this course, you should have at least a moderate understanding of what caching is and why it’s used.

Learning Objectives

By the end of this course, you should have a good understanding of what Redis Cache and CDN are and what purposes they serve. You’ll also know how to connect to each from applications and how to purge or invalidate data in both.



Transcript

- [Instructor] Azure Cache for Redis is usually used to improve the performance and scalability of systems and applications that rely on back-end data stores. Leveraging Redis Cache improves performance by temporarily copying frequently accessed data to fast storage that's located close to the running application. Azure Cache for Redis provides this storage in memory, instead of loading data from disk via a database. 

In addition to providing caching services, Azure Cache for Redis can also be used for other purposes. For example, it can be used as an in-memory data structure store or even a distributed non-relational database. By taking advantage of the Redis engine's high throughput and low latency, applications see improved performance. 

Leveraging Azure Cache for Redis provides organizations with access to a cache that's managed by Microsoft. Because it's hosted in Azure, Azure Cache for Redis is accessible to all applications, whether they reside within Azure or outside of it. 
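
To make that concrete, here's a minimal connection sketch using the open-source redis-py client; the cache name and access key are placeholders you'd replace with your own values from the Azure portal (the SSL port, 6380, is the standard one for Azure Cache for Redis):

```python
import redis

# Placeholder host name and access key; substitute the values shown
# for your own cache in the Azure portal.
cache = redis.Redis(
    host="contoso-cache.redis.cache.windows.net",  # hypothetical cache name
    port=6380,              # Azure Cache for Redis listens for SSL/TLS on 6380
    password="<access-key>",
    ssl=True,
    decode_responses=True,  # return strings instead of bytes
)

print(cache.ping())  # True if the connection and credentials are good
```

The same code works whether the application runs inside Azure or elsewhere, which is the point made above.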

There are several typical patterns where Azure Cache for Redis comes in handy, whether for supporting application architecture or for improving application performance. 

Common patterns include cache-aside, content caching, user session caching, job and message queuing, and distributed transactions. 

Because databases can be quite large, they should never be loaded in their entirety into a cache. Instead, a common strategy is to use the cache-aside pattern to load data items into the cache only as needed. When the back-end data is updated by the system, the cache can be updated as well, and those updates are then distributed to other clients. Through expiration settings or an eviction policy, the system can force updated data to be reloaded into the cache. 
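
As a rough illustration of cache-aside with redis-py (the load_product_from_db and save_product_to_db functions, the key names, and the one-hour expiration are all hypothetical stand-ins, not anything prescribed by the course):

```python
import json

CACHE_TTL_SECONDS = 3600  # example expiration setting; tune for your data

def get_product(cache, product_id):
    """Cache-aside read: try the cache first, fall back to the database on a miss."""
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                    # cache hit
    product = load_product_from_db(product_id)       # hypothetical DB call
    cache.set(key, json.dumps(product), ex=CACHE_TTL_SECONDS)
    return product

def update_product(cache, product):
    """On a write, update the back end and invalidate the cached copy."""
    save_product_to_db(product)                      # hypothetical DB call
    cache.delete(f"product:{product['id']}")         # next read reloads fresh data
```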

Because most web pages are generated from templates that contain static headers, footers, toolbars, et cetera, they don't change all that often. As such, generating them dynamically isn't recommended. By using an in-memory cache like Azure Cache for Redis, you can speed up access to web content by giving your web servers quicker access to this type of static content than a back-end data store can provide. The content caching pattern reduces the processing time and server load that would otherwise be needed to generate content dynamically. This allows web servers to be more responsive and can even reduce the number of servers needed to handle the same load. Azure Cache for Redis provides the Redis output cache provider to help support this pattern with ASP.NET. 
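
Content caching follows the same read-through shape, just applied to a rendered fragment rather than a data record. In this sketch, render_header and the key name are hypothetical, and the 24-hour expiration is only an example:

```python
def get_header_html(cache):
    """Serve a pre-rendered static fragment from the cache instead of re-rendering it."""
    html = cache.get("fragment:site-header")
    if html is None:
        html = render_header()                             # hypothetical template render
        cache.set("fragment:site-header", html, ex=86400)  # refresh once a day
    return html
```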

User session caching is typically used with shopping carts. It's also used in other applications that track user-history information and make use of cookies. Because storing too much in a cookie can negatively affect performance, you can instead use the cookie as a key to query the data that's stored in a back-end database. Using Azure Cache for Redis to associate information with a user is far faster than interacting with a full relational database. 
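
Here's a sketch of that idea with redis-py: the browser's cookie carries only a session ID, while the cart itself lives in the cache (the key layout and the 30-minute idle timeout are illustrative assumptions):

```python
import json

SESSION_TTL_SECONDS = 1800  # example: expire idle sessions after 30 minutes

def save_cart(cache, session_id, cart_items):
    """Store the shopping cart server-side, keyed by the session ID from the cookie."""
    key = f"session:{session_id}"
    cache.hset(key, mapping={"cart": json.dumps(cart_items)})
    cache.expire(key, SESSION_TTL_SECONDS)   # let idle sessions age out

def load_cart(cache, session_id):
    """Look the cart up with the cookie value; return an empty cart if it expired."""
    # Assumes decode_responses=True, as in the connection sketch above.
    data = cache.hgetall(f"session:{session_id}")
    return json.loads(data["cart"]) if "cart" in data else []
```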

Any time an application receives a request, there's a chance that the operations associated with that request might take additional time to execute. A common way to deal with these longer-running operations is to add them to a queue, which is then processed later, possibly even by a different server altogether. This type of deferment strategy is called task queuing, and Azure Cache for Redis serves this purpose well by acting as a distributed queue. 
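
A distributed queue along those lines can be sketched with a Redis list: one server pushes jobs, another pops and processes them (the queue name and the process_job handler below are hypothetical):

```python
import json

QUEUE_KEY = "queue:work"  # illustrative queue name

def enqueue_job(cache, job):
    """Producer: push the long-running work onto a list and return to the caller quickly."""
    cache.lpush(QUEUE_KEY, json.dumps(job))

def worker_loop(cache):
    """Consumer: a separate process (or server) blocks until a job arrives, then handles it."""
    while True:
        item = cache.brpop(QUEUE_KEY, timeout=5)  # blocking pop with a 5-second timeout
        if item is None:
            continue                              # nothing queued yet; keep waiting
        _, payload = item                         # brpop returns (queue name, value)
        process_job(json.loads(payload))          # hypothetical job handler
```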

An application will often need to execute multiple commands against a back-end data store as a single operation. If any of those commands fail, they must all be rolled back to the initial state. 

Azure Cache for Redis provides support for executing a batch of commands as a single operation, in the form of transactions. 
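
As a minimal sketch, redis-py exposes this through a pipeline run in transactional (MULTI/EXEC) mode, so the queued commands are applied as one atomic batch; the key names below are illustrative:

```python
def transfer_points(cache, from_user, to_user, amount):
    """Queue both updates and apply them together as a single MULTI/EXEC transaction."""
    with cache.pipeline(transaction=True) as pipe:
        pipe.decrby(f"points:{from_user}", amount)
        pipe.incrby(f"points:{to_user}", amount)
        pipe.execute()  # both commands are sent and applied atomically
```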

Azure Cache for Redis is available in three different tiers: basic, standard, and premium. The basic tier offers a single-node cache that supports multiple memory sizes, from 250 megabytes all the way up to 53 gigabytes. The basic tier is a good fit for development environments, testing, and non-critical workloads. It's important to note that the basic tier offers no service-level agreement. The standard tier offers a replicated cache in a two-node, primary-secondary configuration that's managed by Microsoft. 

The standard tier offers a high-availability SLA of 99.9%. 

The premium tier is enterprise-ready. Caches in the premium tier offer more features, higher throughput, and lower latency than standard or basic. Premium tier caches are deployed on more powerful hardware than the other tiers, which in turn offers better performance than basic or standard. 

It's important to note that a cache can be scaled to a higher tier after it's already been created, but it can't be scaled down to a lower tier. So with that in mind, when you're deploying a cache, make sure that you don't over-provision, because if you do, you may find yourself in a spot where you can't go backwards. It's always better to slightly under-provision and then scale up later if necessary.

About the Author
Students: 84185
Courses: 82
Learning Paths: 62

Tom is a 25+ year veteran of the IT industry, having worked in environments as large as 40k seats and as small as 50 seats. Throughout the course of a long and interesting career, he has built an in-depth skill set that spans numerous IT disciplines. Tom has designed and architected small, large, and global IT solutions.

In addition to the Cloud Platform and Infrastructure MCSE certification, Tom also carries several other Microsoft certifications. His ability to see things from a strategic perspective allows Tom to architect solutions that closely align with business needs.

In his spare time, Tom enjoys camping, fishing, and playing poker.