AWS throws deep learning into containers


AWS used the Santa Clara stop of its summit series to introduce customers to new ways of doing machine learning, along with a few advances in networking and routing.

Having noticed that some companies have started deploying TensorFlow workloads to the cloud, for example via Amazon’s managed Kubernetes service, AWS now offers ready-made Docker images for machine learning tasks.

AWS Deep Learning Containers can, for now, be used with either TensorFlow or Apache MXNet for deep learning training or inference (that is, applying a trained model to new data).

To reduce training time and increase performance, the images offer training on single nodes or multi-node clusters, access to GPUs, and help in using the Horovod framework for distributed training. They support Python 2.7 and 3.6, and can be customised with additional libraries or packages.

Deep Learning Containers are available on the AWS Marketplace and in the Elastic Container Registry.

In other cloud-related news, AWS has extended host-based routing to let ops folks use multiple conditions in their routing rules and match on multiple values. Rules can now also take standard and custom HTTP headers, HTTP methods, the query string, or the source IP address into account.

Load balancers can have up to 100 rules, each of which can reference up to five values and use up to five wildcards. Developers who want to use this advanced request routing feature with existing application load balancers will have to edit their rules accordingly.
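To get a feel for what such a rule looks like, here is a sketch of the condition and rule structures in the shape the ELBv2 API expects (for instance via boto3’s `create_rule`). This is illustrative only: no AWS call is made, and the header name, query-string key, CIDR range, and target group ARN are placeholder values, not taken from the announcement.

```python
# Sketch of an advanced request-routing rule as plain data, in the shape
# the ELBv2 API expects (e.g. boto3's elbv2.create_rule). No AWS call is
# made here; the ARN and match values are hypothetical placeholders.
conditions = [
    {   # match a standard or custom HTTP header, wildcards allowed
        "Field": "http-header",
        "HttpHeaderConfig": {"HttpHeaderName": "User-Agent",
                             "Values": ["*Mobile*"]},
    },
    {   # match the HTTP method
        "Field": "http-request-method",
        "HttpRequestMethodConfig": {"Values": ["GET", "HEAD"]},
    },
    {   # match key/value pairs in the query string
        "Field": "query-string",
        "QueryStringConfig": {"Values": [{"Key": "beta", "Value": "true"}]},
    },
    {   # match the source IP in CIDR form
        "Field": "source-ip",
        "SourceIpConfig": {"Values": ["203.0.113.0/24"]},
    },
]

# A rule combines the conditions with an action and a priority:
rule = {
    "Priority": 10,
    "Conditions": conditions,
    "Actions": [{"Type": "forward",
                 "TargetGroupArn": "arn:aws:elasticloadbalancing:example"}],
}
print(len(rule["Conditions"]))  # 4 conditions on this one rule
```

A request has to satisfy every condition on a rule for the rule to fire, which is why combining several condition types on one rule, up to the five-value and five-wildcard limits mentioned above, makes the feature useful for things like routing mobile traffic or canary query parameters to a dedicated target group.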

Those interested in AWS App Mesh will be pleased to learn that the tool is now generally available for running and monitoring HTTP and TCP services in the following regions: US East (N. Virginia), US East (Ohio), US West (Oregon), US West (N. California), Canada (Central), Europe (Ireland), Europe (Frankfurt), Europe (London), Asia Pacific (Mumbai), Asia Pacific (Tokyo), Asia Pacific (Sydney), Asia Pacific (Singapore), and Asia Pacific (Seoul).

Services can run on AWS Fargate, Amazon EC2, Amazon ECS, Amazon Elastic Container Service for Kubernetes, or self-managed Kubernetes, with traffic proxied via the CNCF-backed Envoy project. The whole service is available through a web console, an App Mesh CLI, or an API.
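As a rough illustration of how App Mesh describes traffic rules for those Envoy proxies, the following sketches an HTTP route specification as plain data, in the shape the App Mesh API takes it (for example via `aws appmesh create-route`). The virtual-node names and the 90/10 traffic split are hypothetical, chosen only to show the structure.

```python
# Sketch of an App Mesh HTTP route specification as plain data. The
# virtual nodes "service-v1" and "service-v2" are hypothetical; in
# practice they would be registered in the mesh beforehand.
route_spec = {
    "httpRoute": {
        "match": {"prefix": "/"},          # match all HTTP traffic
        "action": {
            "weightedTargets": [           # split traffic between versions
                {"virtualNode": "service-v1", "weight": 9},
                {"virtualNode": "service-v2", "weight": 1},
            ]
        },
    }
}
weights = [t["weight"] for t in
           route_spec["httpRoute"]["action"]["weightedTargets"]]
print(sum(weights))  # total weight across targets: 10
```

Weighted targets like these are what make the mesh useful for gradual rollouts: shifting the weights moves traffic between service versions without touching the services themselves.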