Anthropic is proposing a new standard for connecting AI assistants to the systems where data resides.
Called the Model Context Protocol, or MCP for short, the standard, which Anthropic open sourced today, could help AI models produce better, more relevant responses to queries, the company says.
MCP lets models, from any provider, not just Anthropic, draw on data from business tools and software, as well as content repositories and app development environments, to carry out tasks.
"As AI assistants become commonly adopted, the industry has invested in capabilities within its models, leading to rapid improvements in reasoning and quality," Anthropic outlined in a blog post. "Yet even the most sophisticated models are constrained by their isolation from data — trapped behind information silos and legacy systems. Every new data source requires its own custom implementation, making truly connected systems difficult to scale."
MCP ostensibly solves this problem through a protocol that enables developers to build two-way connections between data sources and AI-powered applications (e.g., chatbots). Developers can expose data through “MCP servers” and build “MCP clients” — for instance, apps and workflows — that connect to those servers on command.
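To make the server/client split concrete, here is a minimal sketch of an MCP server written against Anthropic's TypeScript SDK (@modelcontextprotocol/sdk). The server name, the notes:// URI, and the hard-coded note text are illustrative placeholders, and the import paths assume the SDK's layout at launch.

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  ListResourcesRequestSchema,
  ReadResourceRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// Declare the server and the capability it exposes (resources, i.e., readable data).
const server = new Server(
  { name: "example-notes-server", version: "0.1.0" },
  { capabilities: { resources: {} } }
);

// Advertise the data this server makes available to clients.
server.setRequestHandler(ListResourcesRequestSchema, async () => ({
  resources: [
    { uri: "notes://welcome", name: "Welcome note", mimeType: "text/plain" },
  ],
}));

// Return the contents when a client (for example, the Claude desktop app) reads a resource.
server.setRequestHandler(ReadResourceRequestSchema, async (request) => ({
  contents: [
    {
      uri: request.params.uri,
      mimeType: "text/plain",
      text: "Hello from an MCP server.",
    },
  ],
}));

// Talk to the client over stdio; the client launches this script as a subprocess.
const transport = new StdioServerTransport();
await server.connect(transport);
```

Because the protocol, not the assistant, defines the interface, the same server can in principle be used by any MCP-aware client.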
Here’s a quick demo using the Claude desktop app, where we’ve configured MCP:
Watch Claude connect directly to GitHub, create a new repo, and make a PR through a simple MCP integration.
Building this integration took less than an hour after setting up MCP in Claude desktop. pic.twitter.com/xseX89Z2PD
— Alex Albert (@alexalbert__) November 25, 2024
Anthropic also notes that companies such as Block and Apollo have already integrated MCP into their systems, while dev tooling firms like Replit, Codeium, and Sourcegraph are adding MCP support to their platforms.
"Instead of maintaining separate connectors for each data source, developers can now build against a standard protocol," Anthropic wrote. "As the ecosystem matures, AI systems will maintain context as they move between different tools and data sets, replacing today's fragmented integrations with a more sustainable architecture."
Developers can now begin building with MCP connectors, and subscribers to Anthropic's Claude Enterprise plan can connect Claude to internal systems via MCP servers. Anthropic has shared prebuilt MCP servers for enterprise systems like Google Drive, Slack, and GitHub, and says it'll soon provide toolkits for deploying production MCP servers that can serve entire organizations.
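For reference, wiring the Claude desktop app to one of those prebuilt servers comes down to a short JSON config. The sketch below assumes the reference GitHub server from Anthropic's open source repo (@modelcontextprotocol/server-github), run via npx, with a placeholder access token.

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
      }
    }
  }
}
```

With that entry in place, the desktop app starts the server itself and exposes its tools to Claude, which is roughly the setup behind the GitHub demo above.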
"We are dedicated to making MCP a collaborative, open-source project and ecosystem," Anthropic wrote. "We invite [developers] to build the future of context-aware AI together."
MCP sounds like a good idea in theory. But it's far from clear that it'll gain much traction, particularly among rivals like OpenAI, which would surely prefer that customers and ecosystem partners use its own data-connecting approaches and specifications.
In fact, OpenAI just added a data-connecting feature to its AI-powered chatbot platform, ChatGPT. This allows ChatGPT to read code in dev-focused coding apps, and the use cases for MCP align closely with what's possible through this feature. OpenAI has stated that it will add the capability, called Work with Apps, to other categories of apps but is exploring implementations directly with close partners rather than open sourcing the underlying tech.
It also remains to be seen whether MCP is as beneficial and performant as Anthropic claims. The company says, for example, that MCP can enable an AI bot to "better retrieve relevant information to further understand the context around a coding task," but it offers no benchmarks to back up this assertion.