This course covers the core learning objectives required to meet the 'Architecting for Management & Governance in AWS - Level 2' skill
Learning Objectives:
- Understand the different AWS management services available to monitor the performance of a solution
- Apply Amazon CloudWatch monitoring controls to respond to system-wide performance changes
- Apply AWS Config controls to manage compliance based upon business guidelines
Probably one of the most impressive features of EventBridge is access to EventBridge archives. An archive gives you a place to store events so that you can easily replay them at a later time.
Events remain in the archive for as long as the retention period you set; after that time they are discarded. You can of course keep your events in the archive indefinitely: there is no limit to how long events can be retained, and since they are text-based anyway they don't take up much storage.
One of the most obvious benefits of archiving all of your events is for disaster recovery scenarios. Let's say your database is corrupted or somehow gets deleted. If you have a record of all the events that took place, you can reconstruct exactly where your database was before the catastrophe.
Disaster recovery isn't the only reason to archive. Even something as simple as updating an app with a new feature or a new underlying system can be a great reason to replay events. It might allow you to get new information out of old data that you didn't have the ability to retrieve before.
Amazon has made it very easy to replay these archived events. The functionality is already built into the service: all you have to do is create a new replay, select the archive you wish to draw your events from, and select the destination you wish them to go to. At the moment you can only send them back to the same event bus they originally came from, but that's OK. I imagine there will be more updates to this feature to allow replaying to many different targets... and of course, if you didn't want to keep these events streaming back from the archive, you can always stop a replay at any time.
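If you prefer to script this rather than use the console, a minimal sketch with boto3 might look like the following. The bus name, archive name, account number, and time window are all hypothetical placeholders.

import boto3
from datetime import datetime, timezone

events = boto3.client('events')

# Create an archive on an existing event bus (names/ARNs are placeholders).
events.create_archive(
    ArchiveName='order-events-archive',
    EventSourceArn='arn:aws:events:us-east-1:123456789012:event-bus/orders',
    RetentionDays=0,  # 0 = retain events indefinitely
    Description='Archive of all order events',
)

# Replay a time window of archived events back to the same event bus.
events.start_replay(
    ReplayName='orders-replay-jan',
    EventSourceArn='arn:aws:events:us-east-1:123456789012:archive/order-events-archive',
    EventStartTime=datetime(2024, 1, 1, tzinfo=timezone.utc),
    EventEndTime=datetime(2024, 1, 31, tzinfo=timezone.utc),
    Destination={'Arn': 'arn:aws:events:us-east-1:123456789012:event-bus/orders'},
)

# A running replay can be stopped at any time.
events.cancel_replay(ReplayName='orders-replay-jan')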
Schema Registry
When writing applications that deal with events and receive information from EventBridge, it's important to know the schema of the events you are going to use. A schema describes the structure of an event and helps you understand what to expect within the event in terms of attributes and data types.
For example, a customer review event might always contain two strings: one for the customer name, and one for the review itself.
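To make that concrete, such an event might look something like the sketch below. The envelope fields (version, id, detail-type, source, and so on) are the standard EventBridge ones; the source, detail-type, and detail values are made up for illustration.

# A hypothetical customer-review event as it would appear on an event bus.
sample_event = {
    "version": "0",
    "id": "6a7e8feb-b491-4cf7-a9f1-bf3703467718",
    "detail-type": "customer-review",   # the part a schema describes in detail
    "source": "custom.reviews",
    "account": "123456789012",
    "time": "2024-01-01T12:00:00Z",
    "region": "us-east-1",
    "resources": [],
    "detail": {
        "customerName": "Jane Doe",                      # string
        "review": "Great service, would buy again.",     # string
    },
}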
EventBridge has a schema registry built into the service, where you can see all the possible schemas that are available on your event bus. To help facilitate your development, every AWS service that is available to EventBridge has a prebuilt schema in this registry for you to search through.
The registry allows you to browse all of the available schemas by title or content. The search can match variable names within the schemas as well as the titles of the services themselves.
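The same search is available programmatically through the EventBridge Schemas API; here is a rough sketch, where the keyword is just an example.

import boto3

schemas = boto3.client('schemas')

# Search the prebuilt AWS schema registry by keyword
# (keywords can match attribute names as well as service names).
response = schemas.search_schemas(RegistryName='aws.events', Keywords='instance-state')
for schema in response['Schemas']:
    print(schema['SchemaName'])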
At the moment, the SaaS events do not have a prebuilt catalog available for each of their event types, but we can easily discover schemas based on these events. It is as simple as selecting one of your SaaS events – for example, a Zendesk new ticket event – and pressing the discover schema button.
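That console button corresponds to enabling schema discovery on the bus carrying those events. A minimal sketch, assuming a hypothetical partner event bus ARN:

import boto3

schemas = boto3.client('schemas')

# Enable schema discovery on a (hypothetical) partner event bus so that
# schemas are inferred automatically from the events flowing across it.
schemas.create_discoverer(
    SourceArn='arn:aws:events:us-east-1:123456789012:event-bus/aws.partner/zendesk.com/123/default',
    Description='Discover schemas for Zendesk events',
)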
Finally, the schema registry allows you to create your own custom event schemas from a JSON string that you create yourself, or one provided by your custom service or application.
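A minimal sketch of registering a custom schema might look like this; the registry name, schema name, and OpenAPI content are all made up for illustration.

import json
import boto3

schemas = boto3.client('schemas')

# Create a custom registry, then register a schema for the review event in it.
schemas.create_registry(RegistryName='custom-registry')

review_schema = {
    "openapi": "3.0.0",
    "info": {"version": "1.0.0", "title": "CustomerReview"},
    "paths": {},
    "components": {
        "schemas": {
            "CustomerReview": {
                "type": "object",
                "properties": {
                    "customerName": {"type": "string"},
                    "review": {"type": "string"},
                },
            }
        }
    },
}

schemas.create_schema(
    RegistryName='custom-registry',
    SchemaName='customer-review',
    Type='OpenApi3',
    Content=json.dumps(review_schema),
)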
Code Bindings
When developing for events and EventBridge you have the option to generate code bindings that can be used within your IDE, such as Visual Studio Code.
A code binding simply brings the schema into your editor, allowing it to check that your variables are of the right type and to expose the event's attributes, making programming a lot easier.
Code bindings can greatly increase development speed and are available for Java, Python, and TypeScript. Bindings can be created for any of the AWS services already supported within EventBridge, as well as your own custom and discovered schemas.
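Bindings can also be generated and downloaded programmatically through the Schemas API. A rough sketch, assuming a Python binding for one of the prebuilt AWS schemas; the schema name and language identifier shown are examples, and binding generation is asynchronous, so the download may need a short wait.

import boto3

schemas = boto3.client('schemas')

# Ask the registry to generate a Python code binding for a schema,
# then download the generated source as a zip file.
schemas.put_code_binding(
    RegistryName='aws.events',
    SchemaName='aws.ec2@EC2InstanceStateChangeNotification',  # example schema name
    Language='Python36',
)

source = schemas.get_code_binding_source(
    RegistryName='aws.events',
    SchemaName='aws.ec2@EC2InstanceStateChangeNotification',
    Language='Python36',
)
body = source['Body']
with open('bindings.zip', 'wb') as f:
    f.write(body.read() if hasattr(body, 'read') else body)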
EventBridge vs SNS
Now you may have noticed some similarities between EventBridge, which can push events to many subscribers, and the Amazon Simple Notification Service. And I would say you're 100% correct: there are many crossovers between these two services.
There are a few key differences, though, that will help you decide which is best for your solution. The Simple Notification Service is, as the name states, very simple: it operates on a limited set of parameters, but it does allow you to scale up to millions of subscribers. It doesn't, however, have direct connectivity to software-as-a-service providers, and it doesn't provide as much routing capability as Amazon EventBridge does.
For example, it's extremely hard to have SNS trigger a Step Functions state machine compared to Amazon EventBridge.
And even though SNS scales almost infinitely, its filtering is limited to message attributes only, not the content within an event.
So if you are looking for a dead simple service that can handle a pub/sub architecture, go for SNS; but if you need a more complex and sophisticated approach, take a look at EventBridge.
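To illustrate those two differences – content-based filtering and a Step Functions target – here is a rough sketch with boto3; the bus name, event pattern, state machine ARN, and IAM role are all hypothetical.

import json
import boto3

events = boto3.client('events')

# Rule that matches on the *content* of the event (a numeric field in detail),
# which SNS attribute-based filtering alone cannot express.
events.put_rule(
    Name='low-rating-reviews',
    EventBusName='orders',
    State='ENABLED',
    EventPattern=json.dumps({
        "source": ["custom.reviews"],
        "detail": {"rating": [{"numeric": ["<", 3]}]},
    }),
)

# Route matching events straight to a Step Functions state machine.
events.put_targets(
    Rule='low-rating-reviews',
    EventBusName='orders',
    Targets=[{
        'Id': 'escalation-workflow',
        'Arn': 'arn:aws:states:us-east-1:123456789012:stateMachine:EscalateReview',
        'RoleArn': 'arn:aws:iam::123456789012:role/eventbridge-invoke-sfn',
    }],
)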
EventBridge vs Kinesis
Kinesis actually does a fairly good job of doing what EventBridge does: it's able to route events as well as act as event storage, which is ideal for processing real-time data at large scale.
One of the problems, however, is that there's a limit to the number of consumers that can connect to a single stream. Additionally, each consumer is responsible for filtering out the messages that come through Kinesis to determine what is important to it.
While they are a very close comparison, EventBridge really does give you some fantastic flexibility when dealing with your SaaS providers, so if that's more of your area of concern, do some more reading into EventBridge for your solution.
Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data center and network infrastructure design, to cloud architecture and implementation.
To date, Stuart has created 150+ cloud-related courses, reaching over 180,000 students, mostly within the AWS category and with a heavy focus on security and compliance.
Stuart is a member of the AWS Community Builders Program for his contributions towards AWS.
He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.
In January 2016 Stuart was awarded ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge share within cloud services to the community.
Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.