AWS Lambda: BYORuntime to serverless and stop replicating basics

In a bid to make serverless development easier, AWS has introduced Lambda Layers and a Lambda Runtime API to its serverless offering, promising to make code more manageable and the platform almost polyglot in the process.

Components like standard libraries that are shared across Lambda functions can now be packaged into a ZIP file and uploaded as a Lambda Layer, which is then accessible to all functions. With the new approach, common code no longer has to be deployed several times, once alongside each function. A Lambda function can reference up to five layers, one of which can be a runtime, thanks to the newly available Runtime API (more on that below).
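For illustration, such a layer archive can be assembled in a few lines of Python. The `python/` prefix is the documented location Lambda adds to the import path for Python functions; the file contents and the layer name in the publish command are made up for the example.

```python
import io
import zipfile

def build_layer_zip(files):
    """Build an in-memory Lambda layer ZIP.

    `files` maps archive paths to file contents. For a Python layer,
    shared libraries have to live under the `python/` prefix so Lambda
    puts them on the import path once the layer is extracted.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for path, content in files.items():
            zf.writestr(path, content)
    buf.seek(0)
    return buf

# A hypothetical shared helper packaged as a layer.
layer = build_layer_zip(
    {"python/common_utils.py": "def greet():\n    return 'hello'\n"}
)

# The resulting archive can then be published, e.g. with the AWS CLI:
#   aws lambda publish-layer-version --layer-name common-utils \
#       --zip-file fileb://layer.zip
```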

Since all layers are extracted into the same path (/opt) and can therefore overwrite each other, the order in which they are referenced should be considered, though. Layers can be managed from the Lambda console, where users will also find ready-made layers from AWS partners such as Datadog, Serverless, and Stackery for specific purposes like application monitoring and security.
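The overwrite behaviour is easy to simulate locally: extracting two stand-in layer archives that ship the same file path into one directory shows that whichever layer is referenced last wins. All file and layer names here are invented for the sketch.

```python
import io
import tempfile
import zipfile
from pathlib import Path

def make_layer(files):
    """Create an in-memory ZIP standing in for one Lambda layer."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        for path, content in files.items():
            zf.writestr(path, content)
    buf.seek(0)
    return buf

def extract_layers(layers, target):
    """Extract layers in referenced order; later layers overwrite earlier ones."""
    for layer in layers:
        with zipfile.ZipFile(layer) as zf:
            zf.extractall(target)

# Two layers shipping the same path with different contents.
layer_a = make_layer({"lib/util.py": "VERSION = 'from layer A'\n"})
layer_b = make_layer({"lib/util.py": "VERSION = 'from layer B'\n"})

with tempfile.TemporaryDirectory() as opt:  # stands in for /opt
    extract_layers([layer_a, layer_b], opt)
    final = Path(opt, "lib/util.py").read_text()

print(final)  # the copy from layer B survived
```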

Developers who haven’t made the jump to serverless because their language of choice isn’t supported can take advantage of the newly introduced Lambda Runtime API. With it, Lambda users can choose a custom runtime when creating or updating functions instead of having to wait for AWS to add a language. Open source runtimes for C++ and Rust are already available, while AWS is working with a couple of partners on adding Erlang, Elixir, COBOL, Node.js, and PHP into the mix.

Those interested in publishing their own runtime need to implement a so-called bootstrap executable that handles communication between a user’s code and the Lambda environment. The bootstrap has to be included either in the function’s code or in one of the newly introduced layers. Runtime bootstraps receive the event payload for each invocation via an HTTP interface, which is also used to return the function’s response; additional information is shared via environment variables.
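A single iteration of that loop can be sketched in Python. The endpoint paths and the request-id header follow Lambda’s documented Runtime API; everything else (the function and variable names, and the omission of the error endpoint and of the endless loop a real bootstrap runs) is a simplification for illustration. In a real bootstrap, the API address comes from the AWS_LAMBDA_RUNTIME_API environment variable.

```python
import json
import urllib.request

API_VERSION = "2018-06-01"  # version prefix of the Runtime API paths

def process_one_event(api_base, handler):
    """Fetch the next invocation, run the handler, post the response.

    In a real bootstrap, api_base would be read from the
    AWS_LAMBDA_RUNTIME_API environment variable and this would run
    in an endless loop.
    """
    # Long-poll for the next event; the request id arrives in a header.
    with urllib.request.urlopen(
        f"http://{api_base}/{API_VERSION}/runtime/invocation/next"
    ) as resp:
        request_id = resp.headers["Lambda-Runtime-Aws-Request-Id"]
        event = json.loads(resp.read())

    result = handler(event)

    # Return the handler's result for this specific invocation.
    req = urllib.request.Request(
        f"http://{api_base}/{API_VERSION}/runtime/invocation/"
        f"{request_id}/response",
        data=json.dumps(result).encode(),
        method="POST",
    )
    urllib.request.urlopen(req).close()
    return result
```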

The team behind the Serverless framework has already made use of the Runtime API by setting up a Serverless Open Runtime for AWS Lambda, which can be found on GitHub.

It follows a pipeline approach and is meant to help developers share solutions to commonly encountered problems, for example via middleware. Although it is only a proof of concept at this point, planned features include transforming AWS events into CloudEvents or HTTP requests for further processing, as well as security features and ways to build more detailed tracing and debugging into serverless applications.
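Since the CloudEvents transformation is only planned, the following is speculative: a sketch of what mapping a single S3 event record onto CloudEvents-style attributes could look like. The attribute names come from the CloudEvents spec; the concrete field mapping and the sample record are assumptions, not the Open Runtime’s actual behaviour.

```python
def s3_record_to_cloudevent(record):
    """Map one S3 event record to a CloudEvents-style dict.

    Attribute names follow the CloudEvents spec (specversion, type,
    source, id, time, data); how the Open Runtime will actually do
    this mapping is not settled, so this is purely illustrative.
    """
    return {
        "specversion": "0.2",
        "type": record["eventName"],
        "source": record["eventSource"],
        "id": record["s3"]["object"].get("eTag", ""),
        "time": record["eventTime"],
        "data": record["s3"],
    }

# A trimmed-down, hypothetical S3 put-event record.
record = {
    "eventSource": "aws:s3",
    "eventName": "ObjectCreated:Put",
    "eventTime": "2018-11-29T10:00:00.000Z",
    "s3": {
        "bucket": {"name": "my-bucket"},
        "object": {"key": "report.csv", "eTag": "abc123"},
    },
}
cloudevent = s3_record_to_cloudevent(record)
```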

The Open Runtime fetches requests from the AWS Runtime API as soon as Lambda receives an execution request. Middleware is invoked as an executable, so if any is configured, request events are fed into it via STDIN. The processed result is read back via STDOUT and passed to the language-specific runtime, which in turn invokes the developer’s business logic. Responses are handled in a similar way to the initiating event and transported back to the service platform.
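That STDIN/STDOUT contract is easy to model. The snippet below (an illustration, not actual Open Runtime code) spawns a stand-in middleware executable, feeds it an event on STDIN, and reads the processed event back from STDOUT; the middleware command itself is invented for the example.

```python
import json
import subprocess
import sys

def run_middleware(command, event):
    """Feed an event to a middleware executable via STDIN and read the
    processed event back from STDOUT, as the Open Runtime pipeline does."""
    proc = subprocess.run(
        command,
        input=json.dumps(event).encode(),
        stdout=subprocess.PIPE,
        check=True,
    )
    return json.loads(proc.stdout)

# A stand-in middleware: a tiny script that tags the event. In the Open
# Runtime this could be any executable placed in the pipeline.
tagging_middleware = [
    sys.executable,
    "-c",
    "import json,sys; e=json.load(sys.stdin); e['tag']='seen'; "
    "json.dump(e, sys.stdout)",
]

processed = run_middleware(tagging_middleware, {"body": "hello"})
print(processed)  # {'body': 'hello', 'tag': 'seen'}
```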