New Relic aims to crack open MCP servers

New Relic has added Model Context Protocol support to its existing AI monitoring portfolio, giving AI teams at least some insight into interactions between their applications and language model agents.

MCP was only open sourced by Anthropic last year. At the time, Anthropic described it as a standard for “connecting AI assistants to the systems where data lives, including content repositories, business tools, and development environments.”

The aim was to help frontier models produce “better, more relevant responses.” With the increased focus on agentic AI, not least as a way of delivering value on all those investments in Gen AI, the timing was opportune to say the least. The protocol has since been adopted by a raft of providers, with OpenAI getting on board earlier this year and Google pledging support in Gemini.

At the same time, this leaves developers and AI teams with yet another black box to prise open when it comes to pinpointing problems and bottlenecks, or assessing performance so they can improve their systems.

New Relic said MCP servers could result in a lack of visibility and performance monitoring for the tools being used by AI agents, while MCP providers struggle to identify bottlenecks or pin down errors.

New Relic CTO Siva Padisetty said, “MCP has quickly become the standard protocol for agentic AI. Once again meeting our customers where and how they work, our new MCP integration is a game-changer for anyone building or operating AI systems that rely on this protocol.”

In practice, New Relic will provide instant MCP tracing visibility, which it said will “uncover specific usage and patterns of the entire lifecycle of an MCP request.” It will also offer “proactive MCP optimization,” analyzing and evaluating which tools agents select, along with usage patterns and metrics including latency and errors.
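New Relic has not published its instrumentation in this announcement, but the kind of per-tool telemetry described here, counting calls and tracking latency and errors for each tool an agent invokes, can be sketched generically in Python. The `traced_tool` decorator and metric names below are illustrative assumptions, not New Relic's actual API:

```python
import time
from collections import defaultdict

# Illustrative in-memory telemetry store -- a real agent would export
# these figures to an observability backend instead.
metrics = defaultdict(lambda: {"calls": 0, "errors": 0, "total_ms": 0.0})

def traced_tool(name):
    """Wrap an MCP tool handler to record call count, latency, and errors."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            except Exception:
                metrics[name]["errors"] += 1
                raise
            finally:
                metrics[name]["calls"] += 1
                metrics[name]["total_ms"] += (time.perf_counter() - start) * 1000
        return wrapper
    return decorator

@traced_tool("search_docs")
def search_docs(query):
    # Stand-in for a real MCP tool implementation.
    return f"results for {query!r}"

search_docs("tracing")
```

After the call, `metrics["search_docs"]` holds one recorded invocation with its elapsed time, the raw material for exactly the latency and error analysis described above.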

This is all correlated with the entire application ecosystem, such as databases and microservices.

Padisetty said the company had moved beyond “siloed LLM monitoring” to “connecting insights from AI interactions directly with the performance of the entire application stack for a holistic view. All this is offered as an integral part of our industry-leading APM technology.”