Microservices architecture continues to be a trending topic in the application development world, largely because it makes apps easier to build, expand, and scale compared to traditional monolithic architecture.
But there are numerous considerations that go into the use of microservices. One of them is how to package and deploy the services.
In a previous blog, we discussed the use of containers for microservices.
In this one, the focus is on serverless.
Microservices and Serverless Defined
What are microservices? As a refresher, microservices refers to an architectural style in which a monolithic app is broken down into a collection of small, loosely coupled services. Each service is continuously developed and separately maintained and communicates through a clearly defined, lightweight framework to achieve a business objective.
A microservice may be broken up into functions. A function is a small piece of code that performs a single action in response to an event. Depending on how an app is divided up, a microservice may be equivalent to a function — it performs only one action. Or it may consist of multiple functions.
This brings us to serverless. What is serverless? Sometimes referred to as Function-as-a-Service (FaaS), serverless is a cloud-based execution model for building and running applications and services without having to manage the underlying infrastructure. It enables apps to run parts of their code through event triggers, invoking functions on demand.
Note: There’s another kind of serverless application architecture referred to as Backend as a Service (BaaS). It’s primarily used for front-end heavy apps and will be discussed in a future blog.
The Serverless Approach
Using the serverless model, microservices are deployed within a cloud service provider's (CSP) cloud environment. Although the name suggests otherwise, there are still servers involved. However, they're not running constantly, and their provisioning and maintenance are managed entirely by the CSP.
The CSP typically uses either containers or virtual machines (VMs) to isolate the services. The code is triggered by an event, such as an API request or file upload, and the necessary resources are dynamically allocated. When the action is completed, the server goes idle until another action is requested. Users pay only for the time the server is executing the action, so there's no money wasted on over-provisioned cloud resources.
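As a concrete illustration of this event-driven model, here is a minimal sketch of an event-triggered function using the AWS Lambda Python handler signature. The event shape below mimics an S3 "file uploaded" notification, simplified for illustration; it is not a complete S3 event.

```python
import json

# Sketch of an event-triggered serverless function. The platform invokes
# handler() once per event; between events, no compute is running.
def handler(event, context):
    results = []
    for record in event.get("Records", []):
        # Pull out which object was uploaded to which bucket.
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Perform the single action this function owns -- e.g. process the
        # uploaded object -- then the function goes idle until the next event.
        results.append(f"processed s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps(results)}
```

The key point is that the function holds no long-lived state and is billed only while it runs; the CSP allocates resources when the event arrives and reclaims them afterward.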
Serverless Benefits for Microservices
Serverless works well with microservices. It alleviates the need for developers to spin up their own server instances. Since they don’t have to worry about provisioning infrastructure, managing capacity, or avoiding downtime, developers can build microservices more quickly.
In addition, the serverless deployment process is simple. Developers simply package the code (as a ZIP file, for example), upload it to the deployment infrastructure, and describe the desired performance characteristics. There's no need for configuration management tools like Puppet or Chef, or for start/stop shell scripts.
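The packaging step can be sketched with nothing but the Python standard library: bundle a handler module into an in-memory ZIP archive, the format most FaaS platforms accept. The upload itself would then be a single SDK call; the AWS call shown in the trailing comment is indicative only and is not executed here.

```python
import io
import zipfile

# The code to deploy: a trivial handler module, inlined here for brevity.
HANDLER_SOURCE = '''\
def handler(event, context):
    return {"statusCode": 200, "body": "ok"}
'''

def package_function(source: str, filename: str = "app.py") -> bytes:
    """Bundle a source file into ZIP bytes ready for upload."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(filename, source)
    return buf.getvalue()

zip_bytes = package_function(HANDLER_SOURCE)

# Uploading and describing the desired performance characteristics would
# then be roughly one call -- for AWS Lambda via boto3 (not executed here,
# and ROLE_ARN is a placeholder):
#   boto3.client("lambda").create_function(
#       FunctionName="hello", Runtime="python3.12", Handler="app.handler",
#       Role=ROLE_ARN, Code={"ZipFile": zip_bytes},
#       MemorySize=128, Timeout=10)
```

Note how the memory and timeout settings are the "performance characteristics" the text mentions; there is no server image, AMI, or provisioning script anywhere in the flow.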
Among the biggest benefits of using serverless for microservices is cost savings. As previously noted, users only pay for compute resources when the server is executing an action. This can generate savings in many ways.
For example, serverless supports breaking down components by logic or domain even when the cost of such fine granularity would otherwise be prohibitive. As a result, developers can try out something new with low friction and minimal cost.
Then there's the automatic scaling functionality of serverless, which reduces both compute costs and operational overhead. Because the CSP scales capacity for every request or event, there's no need to think about how many concurrent requests can be handled before running out of memory or significantly degrading performance.
There are many other benefits as well, depending on the use case. But there are also disadvantages to consider. Among them: cold-start latency when new instances are created, and higher costs due to communication between the different services.
Resources Make a Difference
Thanks to the growing popularity of serverless, there are numerous serverless platforms to choose from, including Google Cloud Run, Microsoft Azure Functions, Alibaba Function Compute, IBM OpenWhisk, and many more. Choosing the right one, and using the right resources, can boost the benefits of a serverless microservices architecture and minimize or mitigate the disadvantages.
In our experience at ClearScale, AWS Lambda is among the best serverless platforms available. It natively supports Java, Go, PowerShell, Node.js, C#, Python, and Ruby, and provides a runtime API for additional flexibility. Users can reuse core logic from their existing app with minor refactoring.
AWS Lambda is also complemented by a wide range of AWS services that facilitate serverless microservice development and deployment:
- Amazon API Gateway acts as the event listener for API resource requests received over the web and triggers Lambda functions as required. These requests are typically forwarded from Amazon CloudFront.
- Amazon CloudFront is a fast content delivery network (CDN) service that securely delivers data, videos, applications, and APIs globally with low latency and high transfer speeds.
- AWS Serverless Application Model (SAM), an open-source framework for building serverless apps, provides shorthand syntax to express functions, APIs, databases, and event source mappings. During deployment, SAM transforms and expands the syntax into AWS CloudFormation syntax, making it faster to build serverless applications.
- AWS CodePipeline can be used with the AWS Serverless Application Model to automate building, testing, and deploying serverless apps.
- AWS CloudFormation treats infrastructure as code to quickly provision AWS and third-party resources.
- AWS CodeDeploy can be used to roll out and test new Lambda function versions, while AWS CodeBuild integrates with it to provide automated builds.
- Amazon Cognito is a fully managed identity service that provides user authentication and authorization, with support for OpenID Connect.
- Amazon DynamoDB is a fully managed NoSQL database service for storing and retrieving data.
- Amazon EventBridge, a serverless event bus service, makes it easy to access application data from a variety of sources and send it into an AWS environment.
- AWS Fargate is a serverless compute engine for containers that removes the need to provision and manage servers.
- Amazon RDS Proxy, a highly available database proxy, manages thousands of concurrent connections to relational databases.
- Amazon Aurora Serverless is an on-demand, auto-scaling configuration for Amazon Aurora (MySQL-compatible and PostgreSQL-compatible editions). It enables running databases in the cloud without managing database instances.
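To show how the first two services in the list above fit together, here is a sketch of a Lambda function behind API Gateway using the proxy integration, where the handler receives the HTTP request details in `event` and returns a `statusCode`/`headers`/`body` response. The route and payload are hypothetical examples, not part of any real API.

```python
import json

# Sketch of a Lambda handler behind an API Gateway proxy integration.
# API Gateway passes the HTTP method and path in the event; the handler
# must return a dict with statusCode, headers, and a string body.
def handler(event, context):
    method = event.get("httpMethod", "GET")
    path = event.get("path", "/")

    if method == "GET" and path == "/orders":
        # Illustrative payload only; a real service would query a data
        # store such as DynamoDB here.
        payload = {"orders": [{"id": 1, "status": "shipped"}]}
        status = 200
    else:
        payload = {"error": "not found"}
        status = 404

    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }
```

With this pattern, API Gateway handles routing, throttling, and TLS termination, while the Lambda function contains only the microservice's business logic.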
The Value of Experience
Of course, having the resources available to employ serverless microservices architecture is one thing. Knowing how to use the resources, as well as combine them and complement them with other resources to meet business requirements, is another.
That’s why there’s great value in partnering with an organization like ClearScale.
ClearScale has extensive experience developing apps using serverless microservices architecture, drawing on AWS services along with a variety of other resources and technologies.
Our experience, combined with our status as an AWS Premier Consulting Partner and the vast knowledge base of our team, means we know how to determine if a serverless deployment is a good fit for a project. We are well versed in dealing with the constraints associated with a serverless deployment environment compared to VM-based or container-based infrastructure.
We understand the importance of the geographical distribution of edge servers when application code can be delivered from the edge. We're experienced in the many different patterns that can be used with serverless applications. We know the best ways to handle orchestration within serverless applications. And we're adept at using asynchronous workloads whenever possible.
There’s a lot to know about both serverless and microservices to make the best use of them. Working with ClearScale can help ensure you do. Get started with a free serverless assessment.