Next-Generation Software Delivery Models on AWS
Software delivery has been evolving. Not too many years ago, most software lived on-premises. Then came web-hosted apps, and later robust cloud solutions built from various combinations of AWS services.
At this week’s re:Invent, Sajee Mathew – Solutions Architect at AWS – talked about the best approaches to building SaaS (Software as a Service) solutions. There are basically three possible approaches:
- Isolated customer stacks, which offer independent resources for each customer.
- Containerization on shared platforms, which uses EC2 Container Service (ECS) and Docker to provide “slices” of AWS.
- Pure SaaS shared architecture built on on-demand resources.
The isolated customer stacks model means that, for every new customer, you simply replicate the stack. This makes billing and provisioning quite simple, but it comes with a catch: thousands of customers translate into far too many stacks to manage.
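In practice, replicating a stack per customer usually means re-running one CloudFormation template with per-customer parameters. A minimal sketch of that idea, assuming hypothetical template and parameter names (the talk does not specify them); the dict is what you would hand to boto3's `cloudformation.create_stack()`:

```python
# Hypothetical sketch: one CloudFormation template, replicated per customer.
# Template URL and parameter names are illustrative assumptions.

def create_stack_request(customer_id: str, tier: str = "standard") -> dict:
    """Build the kwargs you would pass to boto3's cloudformation.create_stack()
    to stand up one customer's isolated stack."""
    return {
        "StackName": f"saas-{customer_id}",
        "TemplateURL": "https://s3.amazonaws.com/example-bucket/saas.template",  # placeholder
        "Parameters": [
            {"ParameterKey": "CustomerId", "ParameterValue": customer_id},
            {"ParameterKey": "Tier", "ParameterValue": tier},
        ],
        "Tags": [{"Key": "customer", "Value": customer_id}],
    }

# Onboarding N customers means N stacks -- the scaling catch described above.
requests = [create_stack_request(c) for c in ("acme", "globex", "initech")]
```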
Containerization requires fewer resources, since a considerable part of your infrastructure can be shared among all your customers. Assuming your code won’t require too many changes, it’s a good solution for new apps.
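In the containerization model, a shared ECS cluster hosts Docker tasks, and each customer or service gets its “slice” via a task definition. A sketch under assumed names and sizes (image, memory) not taken from the talk; the dict matches what boto3's `ecs.register_task_definition()` accepts:

```python
# Sketch of the "slices of AWS" model: a shared ECS cluster runs Docker
# containers, defined by per-service task definitions.
# Image name and memory size are illustrative assumptions.

def task_definition(service: str, image: str, memory_mb: int = 256) -> dict:
    """Kwargs for boto3's ecs.register_task_definition()."""
    return {
        "family": f"saas-{service}",
        "containerDefinitions": [{
            "name": service,
            "image": image,
            "memory": memory_mb,  # the size of this customer's "slice"
            "essential": True,
        }],
    }

td = task_definition("web", "example/saas-web:latest")
```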
Pure SaaS shared architecture is definitely the best approach for a brand new app. Although every part of the application must be multi-tenant-aware, you benefit from economies of scale: new customers are provisioned automatically onto a single autoscaling infrastructure.
As part of his presentation, Sajee defined each component of a pure SaaS shared architecture (along with AWS services built to deliver it):
- SaaS Ordering: the entry point for purchasing access to SaaS apps; Amazon Simple Workflow (SWF) can be useful for orchestrating the ordering process.
- SaaS Provisioning: this component manages the fully automated deployment of resources and represents the cornerstone of elasticity and scalability. CloudFormation is used to define the stack, while OpsWorks, Beanstalk, and ECS are used to deploy components.
- Application Lifecycle Management: the biggest challenge in traditional architectures. Operations need to be transparent and achieve zero downtime, with automation layers supporting continuous integration. CodePipeline, CodeCommit, and CodeDeploy are all powerful tools for managing this.
- SaaS Billing: this component aggregates per-customer metering and rate information. You can use DynamoDB to store bills and aggregated data, and EMR to process usage data and generate bills.
- SaaS Analytics: the aggregation point for all data sources in the development of a data warehouse. Analytics can provide useful analysis of app performance and usage that drives decisions.
- SaaS Authentication and Authorization: a single store for all user data, third-party SSO, and corporate directories. You could use IAM for policy-based access, KMS for key management, Cognito for mobile and web authentication, Directory Service, and RDS.
- SaaS Monitoring: real-time monitoring and awareness of application health require the highest scale and availability. There are plenty of off-the-shelf solutions if you don’t feel like using Amazon Kinesis and CloudWatch.
- SaaS Metering: this component gives your system the ability to understand and track usage and activity, and to support audit requirements for billing. You might use AWS Lambda to feed a metering queue on SQS.
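The metering and billing components above form a natural pipeline: usage events go onto a queue, and billing aggregates them per customer. A hedged sketch of that path, with an assumed rate card, event shape, and queue URL (none of which come from the talk); the handler builds the kwargs a Lambda function would pass to `sqs.send_message()`:

```python
import json
from collections import defaultdict

# Sketch of the metering -> billing path: a Lambda-style handler turns usage
# events into SQS messages, and billing folds them into per-customer totals
# (the kind of aggregation the talk assigns to EMR and DynamoDB).
# Metric names, rates, and the queue URL are illustrative assumptions.

RATE_PER_UNIT = {"api_call": 0.0001, "gb_stored": 0.023}  # assumed rate card

def metering_handler(event: dict) -> dict:
    """What a metering Lambda might send to the SQS queue: here we just
    build the kwargs for sqs.send_message()."""
    body = json.dumps({
        "customer": event["customer"],
        "metric": event["metric"],
        "quantity": event["quantity"],
    })
    return {"QueueUrl": "https://sqs.example/metering", "MessageBody": body}

def aggregate_bills(messages: list) -> dict:
    """Billing: reduce raw metering messages to per-customer totals."""
    totals = defaultdict(float)
    for raw in messages:
        m = json.loads(raw)
        totals[m["customer"]] += m["quantity"] * RATE_PER_UNIT[m["metric"]]
    return dict(totals)

msgs = [
    metering_handler({"customer": "acme", "metric": "api_call", "quantity": 1000})["MessageBody"],
    metering_handler({"customer": "acme", "metric": "gb_stored", "quantity": 10})["MessageBody"],
]
bills = aggregate_bills(msgs)  # per-customer totals, ready to store
```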
SaaS best practices
Sajee also provided some best practices for building SaaS solutions:
- Separate the platform from the program. Avoid tight coupling. Applications will change a lot over time, but core services should remain reusable so they can support a whole fleet of SaaS applications.
- Optimize for cost and performance. Go for horizontal scalability at every level and create small parallel resource units that scale more efficiently. Also, use scalable services such as DynamoDB and Aurora.
- Design for multi-multi-tenancy.
- Know your data lifecycle. Value and usage change over time, so leverage the storage options that fit each stage.
- Collect everything and learn from it. Reliably collect as many metrics as possible and store them long term. The goal is to know your customers in order to learn and profit through analytics.
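The multi-tenancy point above can also be enforced at the authorization layer. A sketch, assuming the hypothetical tenant-partitioned `saas-orders` table: an IAM policy document that only allows reads of items whose leading (partition) key matches the caller's tenant id, using the real `dynamodb:LeadingKeys` condition key:

```python
import json

# Hedged sketch: tenant isolation via IAM fine-grained access control.
# The table ARN is a placeholder; dynamodb:LeadingKeys is a real IAM
# condition key that constrains which partition keys a caller may touch.

def tenant_scoped_policy(tenant_id: str) -> str:
    """An IAM policy document (JSON) granting read access only to the
    given tenant's partition of a shared DynamoDB table."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:*:*:table/saas-orders",  # placeholder
            "Condition": {
                "ForAllValues:StringEquals": {
                    "dynamodb:LeadingKeys": [tenant_id]
                }
            },
        }],
    }
    return json.dumps(policy)
```

Attached to a tenant's role (or handed out via Cognito/STS), a policy like this means even shared infrastructure cannot leak one tenant's rows to another.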