Docker adds AI agents to Compose along with GPU-powered cloud Offload service

Docker has added AI agent support to its Compose command, plus a new GPU-enabled Offload service which enables compute-intensive AI workloads in ephemeral cloud environments.

Docker Compose is a widely used tool for defining and running multi-container applications, configured via a Compose file written in YAML.

Compose now supports AI agents, models, and tools for running agentic workloads, including deployment to cloud container services. The enhanced Compose is available now, with support for deployment to Google Cloud Run, invoked for example with the command:

gcloud run compose up

In this command, gcloud is the Google Cloud CLI (command-line interface). Support for Azure Container Apps is coming soon, according to Docker. There is no mention of AWS as yet, though this and other deployment options should be possible using the generic agent support. Compose now supports AI agent orchestration frameworks such as CrewAI, LangGraph, and Spring AI.
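A Compose file that declares a model alongside an agent service might look like the following sketch, which assumes the `models` top-level element in recent Compose releases; the service name, build context, and model identifier are illustrative, not taken from Docker's announcement:

```yaml
# Illustrative compose.yaml: an agent service wired to a local model.
# The `models` top-level element lets a service declare which model it uses.
services:
  chat-agent:
    build: .        # agent code, e.g. a CrewAI or LangGraph application
    models:
      - llm         # reference to the model defined below
models:
  llm:
    model: ai/smollm2   # example model name; substitute your own
```

Declaring the model in the Compose file, rather than in application code, is what allows the same definition to run locally or be deployed to a cloud container service.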

A Compose configuration file including agentic tools

Along with the update to Compose, Docker has previewed a remote compute and build service called Docker Offload, now in closed beta. Docker Offload is similar to the existing Build Cloud service but adds Nvidia L4 Tensor Core GPUs, which are designed for AI workloads. Build Cloud will continue as a separate service. Docker Offload is suitable not only for builds but also for running AI applications in test and development, or for occasional use.
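For GPU-backed workloads, Compose already has a standard way to request GPU devices, which is the kind of declaration a service would carry whether it runs locally or on Offload's cloud GPUs. A minimal sketch, with an illustrative image name:

```yaml
# Illustrative snippet: reserving one Nvidia GPU for a service
# using Compose's standard device-reservation syntax.
services:
  inference:
    image: myorg/inference:latest   # placeholder image name
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```

Locally this requires an Nvidia GPU and the Nvidia container toolkit; the appeal of Offload is that the same workload can run against cloud GPUs instead.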

Features of Docker Offload include cloud builds, ephemeral cloud runners, and hybrid workflows that combine local and remote compute. Developers get a local-like experience via port forwarding and bind mounts, and Offload works with the existing buildx and build CLI commands. Multi-platform output is possible. Using Offload requires Docker Desktop as well as a Pro subscription or higher. The service is billed by the minute, though prices are not yet available.

Under the Build Cloud pricing model, each subscription includes some free minutes, with additional minutes starting at $25 for 500 minutes; Offload may cost more because of the GPU support.

As with Build Cloud, Docker Offload currently only uses the US East region, which means that developers elsewhere may see high latency.

In its press information, Docker made no mention of security issues, though this is a hot topic for AI agents. There is an inherent conflict between the demands of automation, which implies a hands-off experience, and AI agent best practice, which is to include human approval steps for security-critical tasks. Ephemeral environments like those in Docker Offload, or those created via containers on a local computer, mitigate this risk by providing isolation, but they are not a complete solution, both because of the capabilities of many MCP (Model Context Protocol) agents and because the output will be used in production.

Docker also offers an MCP Gateway which connects to a curated MCP catalog, giving more assurance that the tools themselves are not compromised, but this does not solve the issue of potentially malicious prompts being given to trusted agents.