AWS is considering a further extension of the execution timeout on its Lambda service, on the assumption that companies will increasingly use the serverless architecture for data-heavy tasks such as machine learning and video processing.
AWS jacked up the timeout limit for Lambda functions to 15 minutes earlier this month, up from its earlier paltry five. The move sparked jubilation amongst most Lambda fans, though some suggested the limit was “excessive” and that heavy data tasks should instead be broken up into smaller ones.
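For an existing function, the higher ceiling can be applied by updating its timeout setting. A minimal sketch using the AWS SDK for Python (boto3), assuming configured AWS credentials and a hypothetical function name:

```python
import boto3  # AWS SDK for Python; requires configured AWS credentials

# "my-function" is a placeholder; substitute a real function name.
client = boto3.client("lambda")
client.update_function_configuration(
    FunctionName="my-function",
    Timeout=900,  # seconds; 900 (15 minutes) is the new maximum, up from 300
)
```

The same change can be made in the Lambda console or via the CLI; the timeout is a per-function setting, so each function can be raised independently.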
Danilo Poccia, principal evangelist for serverless at AWS, told Devclass he expected customers would quickly soak up their extra 15 minutes with new use cases, prompting a further extension.
Poccia said the 15-minute extension had been prompted by user demand: customers were increasingly running serverless functions and applications that also touched legacy systems, which sometimes led to longer execution times.
At the same time, customers were already using Lambda for data-heavy tasks such as large file uploads, video processing and machine learning inference.
Using a dedicated machine learning service to build models, then using serverless to run inference, could be much more cost-effective, he added.
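The inference pattern he describes can be sketched as a Lambda handler that loads a model once at cold start and reuses it across invocations. This is a hypothetical illustration, not AWS code: the stand-in "model" below is a trivial linear function, where a real deployment would deserialize a trained artifact (e.g. fetched from S3) in the same cold-start position.

```python
import json


def load_model():
    # Placeholder for real model loading, e.g. downloading weights
    # from S3 and deserializing them with an ML framework.
    return lambda features: sum(features) * 0.5


# Module-level initialization runs once per container (cold start),
# so the model is reused across warm invocations.
MODEL = load_model()


def lambda_handler(event, context):
    # Run inference on features supplied in the invocation event.
    features = event.get("features", [])
    prediction = MODEL(features)
    return {"statusCode": 200, "body": json.dumps({"prediction": prediction})}
```

Keeping the expensive load outside the handler is the key design choice: the per-request work is just the inference call, which fits comfortably inside even short timeouts.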
Customers would inevitably push the new limits of the service, he said, and AWS would have to respond. “We will start to see new use cases that were just not possible two weeks ago, and these customers will probably ask us to raise the limit again. We will try to do that.”
Extending the limit was not a trivial task, he said. “The internal architecture of our service is full of interactions. To raise the timeout we need to make certain all those interactions can sustain the new timeout. It took us some time.”
“We’re not 100 per cent sure what is the highest unit we can reach,” he continued. “It depends on what our customer asks.”
But he said, “you’ll probably see something new coming shortly.”
Time limits had nothing to do with being serverless or not, he said: “It’s still event driven, it’s still fully managed infrastructure, so from my point of view, it’s 100 per cent serverless.”