Function as a Service (FaaS) Explained
Function as a service (FaaS), also known as “serverless” computing, is an option for deploying applications in the cloud.
It’s been around for almost a decade and has been available from the mainstream cloud providers for at least six years. For example, Amazon released AWS Lambda in late 2014, and Microsoft made Azure Functions available in early 2016.
But, what exactly is serverless computing? When is it the right choice? When you’re designing an application to use a serverless architecture, what do you need to consider?
Let’s take a look.
What Is FaaS?
FaaS simplifies deploying applications to the cloud. With serverless computing, you install a piece of business logic, a “function,” on a cloud platform.
The platform executes the function on demand. So you can run backend code without provisioning or maintaining servers.
But that’s only part of the story.
The cloud platform makes the function available and manages resource allocation for you. If the system needs to accommodate 100 simultaneous requests, it allocates 100 (or more) copies of your service.
If demand drops to two concurrent requests, it destroys the unneeded ones. You pay for the resources your functions use, and only when your functions need them.
“Serverless” computing has servers, but they’re not your problem. The cloud provider manages them for you.
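The programming model is deliberately small: you supply a single entry point, and the platform calls it once per request. Here's a minimal sketch using AWS Lambda's Python handler signature; the event payload shape is a hypothetical example, not a platform requirement:

```python
import json

def handler(event, context):
    """Entry point the platform invokes once per request.

    `event` carries the request payload; `context` carries runtime
    metadata. The platform, not your code, decides how many copies
    of this function run concurrently.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Because the handler is just a function, you can call it
# directly in a local test before uploading it:
if __name__ == "__main__":
    print(handler({"name": "FaaS"}, None))
```

Note that there's no server setup, routing, or process management in the file at all; that's exactly the part the provider takes over.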
SaaS? PaaS? IaaS? Where Does This Fit in?
There are a lot of different XaaS acronyms floating around out there. What are the differences? Where does FaaS fit in?
- Infrastructure as a Service (IaaS) gives you the building blocks for applications. It typically provides access to compute (virtual machines or dedicated hardware), networks, and storage. AWS EC2 and S3 are two commonly used examples. IaaS offers you the most control, but you have to build your systems and networks from scratch.
- Platform as a Service (PaaS) offers cloud components mainly for applications. It gives developers a framework for creating customized applications. Some prominent examples are AWS Elastic Beanstalk and Google App Engine. PaaS makes deploying applications easy, but at the cost of vendor lock-in and, often, a higher price.
- Software as a Service (SaaS) is complete applications delivered from the cloud, like Office 365 and Scalyr.
FaaS is closer to PaaS than IaaS but with some critical differences.
Instead of deploying an entire application to one or more servers, with FaaS you deploy individual functions: small parts of an application. The platform loads them only when needed and can execute them in parallel on demand.
Let’s take a closer look at how this works and why it’s effective.
Why FaaS?
So we’ve already covered three of the main advantages of FaaS:
- Managing servers is no longer your problem
- The platform manages horizontal scaling for you
- You only pay for what you use
By managing the servers for you, FaaS abstracts the server platform away from your application too. You can write your functions in almost any language. You can access other cloud resources like databases and caches.
If you conform to the platform’s defined interfaces, your service will work. This freedom doesn’t come for free, though. FaaS places constraints on functions, and it’s not always the best option.
Automatic Horizontal Scaling
The automatic scaling you get with serverless computing is a significant benefit. It saves you money and protects you from unexpected spikes in usage. As long as you pay your bill, your application remains available.
Without dynamic scaling, you have to size your system for peak utilization, not the average. This means paying for resources that spend most of their time idle.
Even then, the sizing is an estimate based on past usage. What happens when demand exceeds that estimate?
Another option is to roll your own cloud scaling using technology like Docker. This solution still means incurring a great deal of overhead in both cloud resources and personnel.
Containers and orchestration provide you with dynamic scaling and excellent recovery capabilities, but you still need servers and skilled DevOps people. Even containers need security patches.
FaaS applications are simple to deploy and update. They are, as the name implies, functions.
You don’t need to enable extra systems or be a cloud expert. All you need to do is upload your compiled code and tell the platform how to provision it. You can focus on your application instead of worrying about cloud infrastructure.
Functions scale horizontally. Your service provider provisions new instances on-demand and shuts them down when they are no longer needed. Think about the power this gives you!
Instead of dividing functionality over one or more REST servers, you can decompose your application into discrete functions. FaaS almost makes REST services look like old-fashioned application servers!
When Does This Work Well?
So why isn’t everyone migrating their applications to serverless?
FaaS isn’t always the best option, or even possible, for some applications. There are design constraints. But first, let’s look at when it works well.
The name “Function as a Service” isn’t an accident or affectation.
Your service needs to operate like a mathematical function. Each invocation must be stateless; you can’t assume that information about one call to your service will be available in a subsequent request. Any state your application needs must be externalized to a database or filesystem.
This restriction makes perfect sense. FaaS provides scaling for you but without demanding any intimate application knowledge. It can only do this by assuming that it doesn’t have to manage any application state for you.
So if your functions don’t maintain state or solely rely on external resources for it, they’re a good fit for FaaS. RESTful applications are a good example. The functions externalize resource state while clients bear responsibility for maintaining their own.
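To make the statelessness constraint concrete, here's a sketch of a handler that keeps nothing between invocations: every piece of state lives in an external store. The in-memory dict below is just a stand-in for a real database such as DynamoDB or Redis, so the example stays self-contained:

```python
# Stand-in for an external store such as DynamoDB or Redis.
# In production this would be a network call, not a module global;
# the point is that the *function itself* holds no state between calls.
FAKE_DB = {}

def increment_counter(event, context):
    """Stateless handler: everything it needs is read from,
    and written back to, the external store on every invocation."""
    key = event["counter_id"]
    value = FAKE_DB.get(key, 0) + 1
    FAKE_DB[key] = value           # externalize the new state
    return {"counter_id": key, "value": value}
```

Because the function never assumes a previous call happened on the same instance, the platform is free to run any invocation on any copy of it.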
An event-driven service that needs horizontal scaling can enjoy running as a function. FaaS platforms use the events to create instances of the functions and react based on the volume of requests.
RESTful and other event-driven applications are a good fit, and so is work that runs on a schedule. Instead of paying for one or more servers that sit idle most of the time, you can write the job as a function.
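A scheduled job shaped for FaaS looks like any other handler; the only difference is that a cron-style trigger (for example, an EventBridge rule on AWS) fires the event. The event field below is a hypothetical example of what such a trigger might pass in:

```python
import datetime

def nightly_report(event, context):
    """Runs only when the platform's scheduler fires the trigger.
    Between runs, nothing is provisioned and nothing is billed.
    The `time` field is a hypothetical example of a trigger payload."""
    as_of = event.get("time", datetime.datetime.now(datetime.timezone.utc).isoformat())
    # ... gather data, build the report, upload it to storage ...
    return {"status": "done", "as_of": as_of}
```

Compare that with a dedicated server that runs this job once a night and sits dormant the other 23-plus hours.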
When Is FaaS a Bad Fit?
The limitation on application state isn’t the only constraint on serverless computing. There are a few more, and they may prevent your application from running as one or more functions.
Or, they might mean you need to rethink your design.
The platform loads functions on demand. They should start up quickly, usually in milliseconds.
Then the platform immediately gives them a request. When processing completes, it terminates them. The platform may reuse an instance with a “warm start” to save time, but the function cannot rely on this.
This is where the constraint on state comes from. But it also means that an application that performs a lot of initialization will not work well with FaaS.
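A common pattern for keeping startup cheap is to do one-time setup at module scope, where a warm start can reuse it, while never letting correctness depend on whether you got a warm or cold start. A minimal sketch, with the "expensive" setup faked as a dict:

```python
import time

# Setup done once per *instance*, at load time. A warm start reuses
# it; a cold start pays for it again. Correctness must never depend
# on which one you get.
_START = time.time()
_CONFIG = {"greeting": "hello"}   # imagine loading this from storage

def handler(event, context):
    # Only cheap per-request work here; the heavy lifting stayed
    # at module scope.
    return {
        "greeting": _CONFIG["greeting"],
        "instance_age_s": time.time() - _START,
    }
```

If the setup itself takes many seconds, no amount of warm-start luck will hide it from the first caller, which is why heavyweight initialization and FaaS mix poorly.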
Functions also have hard limits on execution time. AWS limits Lambdas to 15 minutes of execution time; Azure limits its Functions to 10.
That's plenty of time for an API call, but it might not be for a long-running scheduled job.
FaaS might also be a bad fit if you’re concerned about vendor lock-in and can’t figure out how to code around it. If you’re going to have someone else run your code on their platform, you need to write to their API.
Depending on how you structure your code, you may be able to limit your exposure. But if you can't, serverless might not be the right solution. Or you might decide you don't care.
Decreased Cost and Increased Efficiency
Cloud vendors bill FaaS based on consumption. You pay for what you use after you use it. You can even control this by setting limits on usage if your application is amenable to that.
This is in stark contrast to provisioning servers in advance based on anticipated load, where you end up paying for more capacity than you usually need.
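The arithmetic behind this is simple. Here's a back-of-the-envelope comparison using made-up numbers; real prices vary by provider, region, and function size:

```python
# Hypothetical numbers to illustrate peak-sized provisioning vs.
# pay-per-use billing; real prices vary by provider and region.
HOURS_PER_MONTH = 730

# Provisioned: size for the peak, pay around the clock.
peak_servers = 10
server_hourly_rate = 0.10
provisioned_cost = peak_servers * server_hourly_rate * HOURS_PER_MONTH

# FaaS: pay per request, only when requests actually arrive.
requests_per_month = 2_000_000
price_per_million_requests = 0.20
compute_cost_per_request = 0.000002
faas_cost = (requests_per_month / 1_000_000) * price_per_million_requests \
            + requests_per_month * compute_cost_per_request

print(f"provisioned: ${provisioned_cost:.2f}/month")  # $730.00/month
print(f"faas:        ${faas_cost:.2f}/month")         # $4.40/month
```

The gap narrows for workloads with steady, high utilization, which is one reason FaaS isn't automatically cheaper for everything.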
Not having to run servers also means fewer staff, or none, to maintain them. In a serverless architecture the cloud provider handles the system administration for you.
Even if you can only offload part of your application to serverless, you can save on staff or allow them to focus on the critical parts of your mission.
What Are Some FaaS Best Practices?
There are a few simple rules you can follow to make deploying FaaS work for you.
- Remember the limitations on application state. If you find yourself planning workarounds so that your functions can hold state, you’re using the wrong architecture.
- FaaS is a powerful technology, and once you’ve seen it in action, you may want to use it everywhere. You can’t.
- Make sure your functions perform one and only one action. Think in terms of a single request that yields a single response. Keep it simple.
- Resist the urge to overcome the one-and-only-one operation rule by having one call another. One of the primary benefits of serverless architecture is isolation. Creating dependencies between them negates that advantage.
- Finally, keep an eye on that load time. Again, keep it simple! Don’t use too many libraries or write a function that requires a lot of memory.
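The "one and only one action" rule is easiest to see in code. In this sketch, each handler does a single job and returns a single response; if one needs to trigger the other, that should happen through a queue or event, never a direct call. Both handlers and their event shapes are hypothetical examples:

```python
def resize_image(event, context):
    """Does exactly one thing: computes scaled dimensions for an
    upload. It does NOT call record_upload directly."""
    w, h = event["width"], event["height"]
    scale = event.get("max_side", 1024) / max(w, h)
    return {"width": int(w * scale), "height": int(h * scale)}

def record_upload(event, context):
    """A separate single-purpose function. Wire it to resize_image
    through a queue or event trigger, not a function-to-function
    call, so the two stay independently deployable and scalable."""
    return {"logged": event["filename"]}
```

Keeping the functions decoupled like this preserves the isolation that makes serverless scaling and deployment simple.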
How Do I Start Using FaaS?
Amazon launched AWS Lambda in 2014. Since then, it’s grown into one of their most important services. It’s the platform for Alexa Skills Development and a useful mechanism for accessing many of AWS’s monitoring features. Lambda has native support for Java, Go, PowerShell, Node.js, C#, Python, and Ruby code. You can get started with one of their tutorials here.
Microsoft launched Azure Functions a couple of years after Lambda, but it’s made up a lot of ground since then. You can code Functions in C#, JavaScript/TypeScript, F#, Java, PowerShell, PHP, and Python. Azure also has support for “Workflows,” which add a limited notion of state to services. You can follow links to different tutorials from this page.
The Google Cloud Platform has Cloud Functions. Google’s Functions support JavaScript, Python, Ruby, Java, .NET, Go, and PHP. Functions support an event model that you can extend with plugins. Google’s getting started guide is here.
Cloudflare Workers lets you write JavaScript-based workers that offload work from your web infrastructure into Cloudflare’s network, which runs instances close to where they’re needed. This is a compelling feature if your clients span the globe.
If you don’t want to tie yourself to a specific cloud vendor, you can implement FaaS with Kubernetes and Knative. Kubernetes is an open-source orchestration tool for Docker. You can run Kubernetes on cloud architectures like AWS and GCP, or you can run it on-premises.
With Knative, you run FaaS on your Kubernetes cluster. This powerful combination provides you with horizontal scaling, powerful monitoring tools, and a very high level of fault tolerance.
What Are Some Examples of FaaS?
You may not realize it, but you’ve used serverless architectures.
Have you ever used Amazon’s Alexa? Alexa skills are typically implemented as AWS Lambda functions. If you think about it, it makes perfect sense. Skills need to load quickly and don’t require any state. Amazon also needs to be able to scale them based on demand. Some skills, like news and weather, are used very heavily. Others, much less so.
Extract, Transform, Load (ETL) processes lend themselves to FaaS. Retrieving data, processing it, and storing the results in a database (or any other store) works well as a function that can be triggered remotely or set up on a schedule.
FaaS provides you with a way to process these jobs in parallel with multiple functions—as many as your store can handle—and then stop paying for the functions when they’re done.
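One ETL stage fits naturally into a single function: each invocation extracts a chunk of records from its event, transforms them, and hands back rows ready for loading. The record format below is a hypothetical example; a real pipeline would read from and write to actual stores:

```python
def transform_chunk(event, context):
    """One ETL step as a function. Many copies can run in
    parallel, one per chunk, then disappear when the batch is done."""
    rows = []
    for record in event["records"]:          # extract
        cleaned = record.strip().lower()     # transform
        if cleaned:                          # drop empty records
            rows.append({"value": cleaned})  # ready to load
    return {"rows": rows, "count": len(rows)}
```

Fan the chunks out across as many invocations as your data store can absorb, and the bill drops to zero the moment the last chunk finishes.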
Does FaaS Work For You?
Serverless computing has a lot to offer. It provides an easy path to migrating or building new services in the cloud.
Without the overhead of managing servers, and with the efficiency of paying only for what you need, you can focus on your business, your application, and your logs.
Scalyr integrates perfectly with FaaS, since it gives you a central store of your logs, with fast ingestion, embedded metrics, and an unparalleled query language.