When we started building SLEDS, we faced a fundamental question: how do you connect AI tools that were never designed to talk to each other? REST APIs were the obvious answer, and they work — but they're not enough. Enter MCP.
What Is MCP?
Model Context Protocol (MCP) is an open standard introduced by Anthropic for connecting AI models to external tools and data sources. Think of it as a universal adapter for AI integrations. Instead of building a custom integration for each AI tool, you build one MCP server, and any MCP-compatible client can connect to it.
For SLEDS, this means that Claude Desktop, Claude Code, and any future MCP-compatible tool can connect to a sled using just a URL and an API key. No custom plugin, no platform-specific integration, no OAuth dance.
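As a sketch, a client-side entry for a remote MCP server tends to look something like this. The server name, URL, and header are placeholders, and the exact schema varies by client, so treat this as illustrative rather than copy-paste config:

```json
{
  "mcpServers": {
    "sleds": {
      "url": "https://example-sleds-host/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}
```

That single entry is the whole integration: no plugin to install and no OAuth flow to complete.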
Why MCP Over REST
REST APIs are great for request-response interactions. Ask for data, get data. But AI tool integration needs more than that. It needs tool discovery — the AI needs to know what capabilities are available. It needs structured input/output — the AI needs to understand what parameters to provide and what response to expect. And it needs context-aware interaction — the tools should provide relevant context proactively, not just respond to queries.
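Tool discovery is worth making concrete. MCP clients issue a JSON-RPC `tools/list` request, and the server answers with each tool's name, description, and a JSON Schema for its inputs. The sketch below shows the shape of that exchange; the `search` tool and its schema are hypothetical stand-ins, not the actual SLEDS definitions:

```python
import json

# Illustrative MCP "tools/list" exchange (JSON-RPC 2.0).
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# The server's response describes every available tool, including a
# JSON Schema ("inputSchema") for the parameters each tool accepts.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "search",
                "description": "Search the space's knowledge base.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            }
        ]
    },
}

# A client can enumerate capabilities with no prior knowledge of the server.
for tool in response["result"]["tools"]:
    print(tool["name"], "->", tool["inputSchema"]["required"])
```

Because the schema travels with the tool, the AI knows what parameters are valid before it ever makes a call.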
MCP handles all of this. When an AI tool connects to our MCP server, it discovers available tools (search, observe, read threads, share assets), understands their parameters, and can use them naturally in conversation. The AI doesn't need special prompting or formatting — it just has new capabilities.
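Invoking a discovered tool is just as structured. The client sends a `tools/call` request naming the tool and its arguments, and the server replies with typed content blocks. Again, the `search` tool and the example query are hypothetical; only the envelope follows the MCP spec:

```python
# Illustrative MCP "tools/call" invocation (JSON-RPC 2.0).
call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "search",
        "arguments": {"query": "deployment checklist"},
    },
}

# Per the spec, results arrive as a list of content blocks plus an
# error flag, so clients can handle text, images, etc. uniformly.
result = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "content": [
            {"type": "text", "text": "Found 3 threads matching the query."}
        ],
        "isError": False,
    },
}

print(result["result"]["content"][0]["text"])
```

The AI never sees any of this plumbing; its client translates between conversation and these structured calls.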
Our MCP Implementation
SLEDS exposes 29 MCP tools across several categories: connecting to spaces, reading threads and assets, writing observations, sharing artifacts, searching across the knowledge base, managing dispatches, and querying integrated services like Linear.
The connect flow is intentionally simple. You add the SLEDS MCP URL to your AI tool's configuration with your API key, and you're connected. On first connect, the tool receives a summary of the space's current state — active threads, pending dispatches, recent activity — so the AI starts with context, not a blank slate.
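To make the first-connect summary concrete, here is a minimal sketch of how such a context bundle could be assembled. The field names, statuses, and helper function are all hypothetical; the actual SLEDS payload may differ:

```python
def summarize_space(threads, dispatches, events, limit=5):
    """Build a compact context bundle for a newly connected AI tool.

    Hypothetical sketch: filters to active threads and pending
    dispatches, and keeps only the most recent activity entries.
    """
    return {
        "active_threads": [t for t in threads if t["status"] == "active"],
        "pending_dispatches": [d for d in dispatches if d["status"] == "pending"],
        "recent_activity": events[-limit:],  # newest events are last
    }


summary = summarize_space(
    threads=[{"id": "t1", "status": "active"}, {"id": "t2", "status": "closed"}],
    dispatches=[{"id": "d1", "status": "pending"}],
    events=["thread t1 updated", "asset a9 shared"],
)
print(summary["active_threads"])  # only t1 survives the filter
```

Handing the AI a bundle like this on connect is what turns "connected" into "oriented": the first message it sends can already reference real state.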
The Cross-Tool Future
MCP is still early, but the trajectory is clear. As more AI tools adopt the protocol, sharing context across them becomes straightforward. Today, connecting Claude to SLEDS takes one configuration entry. As ChatGPT, Cursor, and other tools adopt MCP natively, the same will be true for them.
In the meantime, we maintain a REST API for tools that don't support MCP yet. ChatGPT Actions, custom integrations, and web-based tools all connect through REST. But MCP is our primary integration surface because it provides the richest, most natural interaction model for AI tools.