The next generation of cloud infrastructure – Serverless explainer

The first thing to know about serverless computing is that “serverless” is a pretty bad name for it.

Contrary to what the name suggests, the technology that has burst onto the cloud computing scene in the past two years still does in fact run on servers. The name serverless instead highlights the fact that end users no longer have to manage the servers that run their code.

Perhaps this sounds familiar. Technically, in a public infrastructure-as-a-service (IaaS) environment the end user isn’t physically managing servers either; that’s up to the Amazon Web Services and Microsoft Azures of the world.

But so-called serverless computing takes that idea a step further and executes code that developers write using only the precise amount of compute resources needed to complete the task, no more, no less. When a pre-defined event occurs that triggers that code, the serverless platform executes the task. The end user doesn’t need to tell the serverless provider how many times these events or functions will occur. Customers pay a fraction of a penny every time a function is executed. Some believe Functions as a Service (FaaS) or event-driven computing is a better name.

“The way we came to think about it is there are different levels of abstraction that developers can interact with from an infrastructure perspective,” explains IBM Vice President of Cloud Product Management Damion Heredia, who manages IBM’s serverless computing offering named OpenWhisk. There’s bare metal, virtual machines and containers. “For certain workloads, we wanted to abstract away all that management so that you can execute your code without worrying about the infrastructure or management of the servers. That’s serverless.”

Now industry analysts, proponents and skeptics are debating just how big of a deal this technology is. Is it evolutionary or revolutionary? Will it be used to power most future applications, or just a subset of use cases? The answer, for now, is that the market is in its earliest days, so it’s difficult to say. But the hype, interest and potential benefits of this technology should not be ignored.

Pros of serverless

Amazon Web Services is largely credited with starting the serverless market hype in 2014 when the company introduced Lambda, its serverless computing product.

General Manager of AWS Strategy Matt Wood said the product was inspired by one of the company’s most popular products: Simple Storage Service (S3).

Blogger Sam Kroonenburg says the relationship between S3 and Lambda is an important analogy. “S3 deals in objects for storage. You provide an object and S3 stores it. You don’t know how, you don’t know where. You don’t care. There are no drives to concern yourself with. There’s no such thing as disk space… All of this is abstracted away. You cannot over-provision or under-provision storage capacity in S3. It just is,” Kroonenburg explains on his A Cloud Guru blog.

Wood says AWS wanted to take that same philosophy to computing. “Lambda deals in functions. You provide function code and Lambda executes it on demand…. You cannot over provision, or under provision execution capacity in Lambda. It just is.”

In a traditional IaaS cloud environment, customers provision virtual machines, storage, databases and all the security and management tools that go along with them. They load applications onto those VMs, and then they use tools like load balancers to scale them. They use management software to optimize their instance sizes and find virtual machines that have been left on by accident. Lambda and other FaaS platforms offer a different model. Code is written in functions. When an event happens that triggers a function, Lambda runs it. That’s it. No capacity planning, no load balancing; just tasks being executed.
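
To make the programming model concrete, here is a minimal sketch of what an event-driven function might look like, written in Python against Lambda’s handler convention. The event fields and the greeting logic are purely illustrative, not part of any AWS API.

    import json

    def handler(event, context):
        # The platform invokes this function each time a configured event fires.
        # 'event' carries the trigger's payload; 'context' carries runtime metadata.
        name = event.get("name", "world")  # illustrative field, not a real AWS event key
        # The return value is handed back to whatever invoked the function.
        return {"statusCode": 200, "body": json.dumps({"message": f"Hello, {name}"})}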

Wood, the AWS GM, says this is helpful in a variety of use cases. For example, Lambda functions can be written so that every time a photo is uploaded to S3, Lambda creates copies in different sizes, optimized for desktop, mobile and tablets. Or, every time an entry is uploaded into a database, a Lambda function can be written to load the data into a data warehouse like Amazon Redshift for analysis at a later time. Wood says many customers use Lambda to “glue” AWS services together and perform tasks of reporting, scheduling and altering data in preparation for analysis.
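
A rough sketch of that photo-resizing flow might look like the following, assuming the function is wired to S3 event notifications and that the Pillow imaging library is bundled with the deployment package; the output bucket name and the target sizes are placeholders, not anything AWS prescribes.

    import io
    import boto3
    from PIL import Image  # Pillow must be packaged with the function

    s3 = boto3.client("s3")
    TARGET_SIZES = {"desktop": (1920, 1080), "tablet": (1024, 768), "mobile": (640, 360)}

    def handler(event, context):
        # S3 event notifications list the uploaded objects under 'Records'.
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            original = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            for label, size in TARGET_SIZES.items():
                img = Image.open(io.BytesIO(original)).convert("RGB")
                img.thumbnail(size)  # resize in place, preserving aspect ratio
                out = io.BytesIO()
                img.save(out, format="JPEG")
                out.seek(0)
                # Write each resized copy to a hypothetical output bucket.
                s3.put_object(Bucket=f"{bucket}-resized", Key=f"{label}/{key}", Body=out)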

Another prime use case, Wood says, is in the Internet of Things world, where real-time responses are needed at large scale. AWS recently introduced the ability to run Lambda functions on Internet of Things devices through its Greengrass platform, which can execute Lambda functions on devices in low-connectivity areas without a round trip back to the cloud data center. A security camera can run Lambda so that every time motion is detected, it records the data and sends it to a database. There is no virtual machine server sitting idle 24 hours a day; the event-driven code runs only when it’s triggered. AWS customer FireEye says it saved up to 80% compared with its EC2 virtual machine pricing by using Lambda.
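
As a rough illustration of the camera scenario, the sketch below handles a hypothetical motion-detection event and writes a record to a DynamoDB table. The event fields, the table name and the deployment details (Greengrass configuration, credentials, connectivity) are assumptions made for the sake of the example.

    import time
    import boto3

    table = boto3.resource("dynamodb").Table("MotionEvents")  # hypothetical table name

    def handler(event, context):
        # 'event' is assumed to carry the camera's motion-detection payload.
        item = {
            "camera_id": event.get("camera_id", "unknown"),  # illustrative field
            "detected_at": int(time.time()),
            "confidence": event.get("confidence", 0),        # illustrative field
        }
        # Persist the detection so downstream systems can query or alert on it.
        table.put_item(Item=item)
        return {"recorded": True}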

“Serverless is at its most simple an outsourcing solution. It allows you to pay someone to manage servers, databases and even application logic that you might otherwise manage yourself,” explains a deep-dive explainer on what serverless is on Martin Fowler’s blog. The big difference with serverless is that “you only pay for the compute that you need, down to a 100ms boundary.”

There is no waiting for servers to boot up, or load balancing to configure. Tasks simply execute, at whatever scale is needed. The post argues this model allows developers and companies to test ideas and bring them to market faster than other models.

Cons

Serverless computing is not a panacea, and there are drawbacks. For one, this is a very immature market. Managing serverless use cases at scale is difficult, says Gartner Research Director Craig Lowery. There are scant management tools for coordinating groups of functions, and the security, monitoring and optimization software supporting this technology is nascent. Perhaps most importantly, it requires developers to write apps in a different way. “A lot of the limitations have to do with the architectural constraints it places on software design,” he explains.

Serverless functions are also stateless. They can be re-used and re-executed, but they don’t store state; they execute their task and that’s it. Vendors price FaaS platforms at fractions of a penny per 1 million function executions, which illustrates the scale at which vendors expect developers to run them. “It gets messy to manage when you have a lot of functions,” Lowery says.
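
Statelessness has practical consequences for how functions are written. The sketch below, using a hypothetical DynamoDB counter table, shows why: anything kept in process memory can disappear between invocations, so durable state has to live in an external store.

    import boto3

    # Anything stored at module level lives only as long as the underlying
    # container; the platform may discard it between invocations at any time.
    calls_in_this_container = 0

    table = boto3.resource("dynamodb").Table("Counters")  # hypothetical table

    def handler(event, context):
        global calls_in_this_container
        calls_in_this_container += 1  # unreliable: resets whenever a fresh container starts

        # Durable state belongs in an external store instead.
        response = table.update_item(
            Key={"name": "invocations"},
            UpdateExpression="ADD #c :one",
            ExpressionAttributeNames={"#c": "count"},  # 'count' is a reserved word in DynamoDB
            ExpressionAttributeValues={":one": 1},
            ReturnValues="UPDATED_NEW",
        )
        return {
            "in_memory_count": calls_in_this_container,
            "durable_count": int(response["Attributes"]["count"]),
        }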

There’s also a concern about vendor lock-in. It’s not exactly easy to take an application built on a FaaS platform and port it to another platform, either on premises or in another public cloud, Lowery notes. Because the market is so young, the tooling for serverless platforms is customized to the environment in which it lives. AWS Lambda integrates deeply with many other AWS products. Wood counters that because Lambda supports common programming languages like Node.js, Python and Java, that code can be transferred out. “There is no special Lambda language,” he notes.

Overall, Lowery says serverless computing or FaaS is “a very powerful addition to existing compute paradigms of virtual machines and containers. Lambda is a whole new thing. I think we’ll see a lot of people successfully create entire apps based on serverless. At the same time, it’s not the right fit for every application.” Databases will not run in Lambda, nor will any other application that requires state to be maintained.

Market for serverless

While AWS is credited with being the first to market with a serverless computing platform, the other major public IaaS cloud providers have since followed suit. Wood even makes the argument that many AWS services are “serverless,” including Lambda and S3, as well as its NoSQL DynamoDB database and its SQL-compatible Aurora database platform. Each of these products requires no pre-planning of resource usage or ongoing management of infrastructure.

Lambda includes up to 1 million requests per month for free; thereafter, each 1 million requests costs $0.20. Lambda also charges for compute time at a rate of $0.00001667 per GB-second, with each function’s duration rounded up to the nearest 100 milliseconds.
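
The arithmetic is straightforward to work through. The snippet below estimates a monthly Lambda bill from the rates quoted above ($0.20 per million requests after the first million, $0.00001667 per GB-second, durations rounded up to 100 milliseconds); the workload figures in the example are made up for illustration, and the free compute-time allowance is ignored for simplicity.

    import math

    FREE_REQUESTS = 1_000_000          # free requests per month
    REQUEST_PRICE = 0.20 / 1_000_000   # dollars per request beyond the free tier
    GB_SECOND_PRICE = 0.00001667       # dollars per GB-second of compute

    def monthly_lambda_cost(invocations, avg_duration_ms, memory_mb):
        # Billed duration is rounded up to the nearest 100 milliseconds.
        billed_ms = math.ceil(avg_duration_ms / 100) * 100
        gb_seconds = invocations * (billed_ms / 1000) * (memory_mb / 1024)
        request_cost = max(invocations - FREE_REQUESTS, 0) * REQUEST_PRICE
        return request_cost + gb_seconds * GB_SECOND_PRICE

    # Example: 5 million invocations a month, 120ms average runtime and 512MB of memory
    # comes to a little over $9 ($0.80 for requests plus about $8.34 for compute time).
    print(f"${monthly_lambda_cost(5_000_000, 120, 512):.2f}")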

Microsoft Azure charges the same prices for its Azure Functions product, which the company made generally available in November 2016. Google Cloud Functions is in beta and provides up to 2 million requests for free each month, with slightly more expensive per-request costs but lower compute rates. IBM does not explicitly list pricing for OpenWhisk, but Heredia, the OpenWhisk executive, says the big differentiator for IBM’s serverless computing platform is that it’s open source, hosted at the Apache Software Foundation. In theory, that gives customers the ability to run the OpenWhisk code wherever they want.

There are also a handful of startups in the market attempting to offer serverless computing platforms, components and management tooling. Perhaps the best known is Iron.io, which provides a serverless computing platform based on Docker containers.

Lowery, the Gartner analyst, says the market is so young that there are not yet clear winners and losers, but AWS has had a generally available product on the market for the longest time. The real key, he says, is determining what the serverless system will be used for. FaaS can be a powerful tool for “gluing” together various services within a specific vendor’s cloud. Other use cases, such as event-driven Internet of Things applications, may be less tied to a specific vendor’s cloud.

This story, “Serverless explainer: The next generation of cloud infrastructure,” was originally published by Network World and CIO.