MCP Integration
What is MCP?
Model Context Protocol (MCP) is an open standard that enables secure connections between large language models and external data sources. MCP servers expose tools and resources that agents can use to access external systems and APIs.
Why Use MCP with LangCrew?
MCP integration in LangCrew provides:
- Standardized Connections - Connect to any MCP-compatible server
- Secure Access - Built-in security and permission controls
- Multiple Transports - Support for SSE, HTTP streaming, and stdio
- Tool Filtering - Control which tools agents can access
- Easy Scaling - Add new capabilities without code changes
Supported Transport Types
LangCrew supports three MCP transport methods:
1. Server-Sent Events (SSE)
Best for real-time data and live updates:
server_config = { "url": "https://api.example.com/mcp/sse?key=your_key", "transport": "sse"}2. Streamable HTTP
Ideal for REST API integrations:
server_config = { "url": "https://api.example.com/mcp", "transport": "streamable_http"}3. Standard I/O (stdio)
Perfect for local tools and scripts:
server_config = { "command": "python3", "args": ["path/to/your/tool.py"], "transport": "stdio"}Basic Usage
1. Configure MCP Server
Define your MCP server configuration:
```python
from langcrew import Agent

# Configure your MCP server
mcp_server_configs = {
    "my_server": {
        "url": "https://api.example.com/mcp",
        "transport": "streamable_http"
    }
}
```
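Multiple servers can be registered in the same mapping, one entry per server, with each key presumably serving as the server's name. Here is a minimal sketch that combines the three transport shapes shown above; the URLs, the API_KEY environment variable, and the script path are placeholders:

```python
import os

# Hypothetical multi-server configuration: one entry per MCP server,
# reusing the SSE, streamable HTTP, and stdio shapes from above.
mcp_server_configs = {
    "live_data": {
        "url": f"https://api.example.com/mcp/sse?key={os.getenv('API_KEY')}",
        "transport": "sse"
    },
    "rest_api": {
        "url": "https://api.example.com/mcp",
        "transport": "streamable_http"
    },
    "local_tools": {
        "command": "python3",
        "args": ["path/to/your/tool.py"],
        "transport": "stdio"
    }
}
```

The whole mapping is then passed to the agent's mcp_servers parameter, just as in the single-server case below.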
2. Create Agent with MCP
Add MCP servers to your agent:
```python
@agent
def my_agent(self) -> Agent:
    return Agent(
        config=self.agents_config["my_agent"],
        mcp_servers=mcp_server_configs,
        llm=self._get_default_llm(),
        verbose=True
    )
```
3. Optional: Filter Available Tools
Control which tools the agent can access:
```python
@agent
def restricted_agent(self) -> Agent:
    return Agent(
        config=self.agents_config["my_agent"],
        mcp_servers=mcp_server_configs,
        mcp_tool_filter=["search", "calculator"],  # Only these tools
        llm=self._get_default_llm()
    )
```
Complete Example
Here’s a full working example using different MCP transport types:
```python
import os

from langchain_openai import ChatOpenAI

from langcrew import Agent, CrewBase, agent, task, crew
from langcrew.task import Task
from langcrew.crew import Crew


@CrewBase
class MyCrew:
    agents_config = "config/agents.yaml"
    tasks_config = "config/tasks.yaml"

    def _get_default_llm(self):
        return ChatOpenAI(
            model="gpt-4o-mini",
            temperature=0.1,
            api_key=os.getenv("OPENAI_API_KEY")
        )

    @agent
    def web_agent(self) -> Agent:
        # SSE transport for real-time data
        server_config = {
            "url": f"https://api.example.com/sse?key={os.getenv('API_KEY')}",
            "transport": "sse"
        }
        return Agent(
            config=self.agents_config["web_agent"],
            mcp_servers={"web_server": server_config},
            llm=self._get_default_llm()
        )

    @agent
    def calculator_agent(self) -> Agent:
        # stdio transport for local tools
        current_dir = os.path.dirname(os.path.abspath(__file__))
        calc_script = os.path.join(current_dir, "tools", "calculator.py")

        server_config = {
            "command": "python3",
            "args": [calc_script],
            "transport": "stdio"
        }
        return Agent(
            config=self.agents_config["calculator"],
            mcp_servers={"calc_server": server_config},
            mcp_tool_filter=["add", "multiply"],  # Restrict tools
            llm=self._get_default_llm()
        )

    @task
    def web_search_task(self) -> Task:
        return Task(
            config=self.tasks_config["web_search"],
            agent=self.web_agent()
        )

    @task
    def calculation_task(self) -> Task:
        return Task(
            config=self.tasks_config["calculation"],
            agent=self.calculator_agent()
        )

    @crew
    def crew(self) -> Crew:
        return Crew(
            agents=self.agents,
            tasks=self.tasks,
            verbose=True
        )
```
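The stdio configuration above launches tools/calculator.py as a subprocess, but the script itself is not part of LangCrew and is not shown. Below is a rough sketch of what such a local MCP server could look like, written with the FastMCP helper from the official MCP Python SDK (the mcp package); the tool names are illustrative and chosen to line up with the mcp_tool_filter above:

```python
# tools/calculator.py — sketch of a minimal stdio MCP server using the
# official MCP Python SDK (not part of LangCrew itself).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("calculator")


@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b


@mcp.tool()
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b


@mcp.tool()
def divide(a: float, b: float) -> float:
    """Divide a by b."""
    return a / b


if __name__ == "__main__":
    # Serve over stdio so the agent can launch this script as a subprocess.
    mcp.run(transport="stdio")
```

With mcp_tool_filter=["add", "multiply"] set on calculator_agent, only those two tools would be available to the agent; divide would be filtered out even though the server advertises it.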