
AWS has added Model Context Protocol (MCP) support to Q Developer, its AI-driven coding assistant, initially only for the command line (CLI), with support within IDEs (via the Q Developer plugin) promised “in the coming weeks.”
MCP is an open protocol introduced by Anthropic to standardize how AI assistants and agents communicate with external tools and data sources, giving the AI important new capabilities but also introducing new risks.
Q Developer CLI runs on macOS or Linux, or on Windows via WSL (Windows Subsystem for Linux). The product already has access to CLI commands, but MCP extends this further; the example given in the introductory blog post is querying a PostgreSQL database using the official MCP server for the open source database. AWS also already offers a range of MCP servers for interacting with its cloud services. AWS puts the emphasis on MCP as a source of context and data for AI rather than as a means of empowering agentic AI, though both aspects are important.

MCP servers for Q Developer can be configured globally or just for the current workspace, using JSON files which describe how to start the server, its supported arguments, and environment variables, including secrets that grant access to services. In Q Developer CLI, tools can then be triggered either through natural language requests or by invoking a specific tool directly.
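As an illustration, the configuration follows the mcpServers JSON layout used across MCP clients. The sketch below is an assumption based on that convention and on the PostgreSQL example from the announcement; the server name, connection string, and PGPASSWORD secret are placeholders rather than values taken from the AWS documentation.

{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost:5432/mydb"
      ],
      "env": {
        "PGPASSWORD": "replace-with-secret"
      }
    }
  }
}

With a server declared this way in the workspace or global configuration file, a Q Developer CLI chat session could answer a natural-language question such as “which tables are in mydb?” by calling the PostgreSQL tool, or the tool could be invoked directly by name.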
Regarding security, the documentation states that MCP support is designed to require specific user permission, to execute locally only, and to isolate each MCP server in a separate process. However, most of the security burden falls on the user, who is urged to install MCP servers only from trusted sources and to examine them carefully, to keep them up to date with the latest versions, and to monitor logs for “unexpected activity.”
Many worry about MCP risks. Security outfit Wiz said in a post that “MCP introduces a broad range of security risks.” However, vendors appear to worry more about being left behind as their competitors add MCP support. AWS, Microsoft, Google and many others have added both MCP support in their AI offerings and MCP servers enabling access to their services.