# Working with MCP Servers
## What is MCP?
The Model Context Protocol (MCP) is an open standard that enables AI models to securely interact with external tools and data sources. MCP servers expose capabilities that AI clients can discover and use.
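Under the hood, clients and servers exchange JSON-RPC 2.0 messages; a client discovers a server's tools with a `tools/list` request, for example. A sketch of what that exchange looks like (the tool shown is illustrative, not part of any real server):

```python
import json

# The client asks the server which tools it offers (JSON-RPC 2.0 method "tools/list")
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# An illustrative server reply advertising a single tool
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [{"name": "my_tool", "description": "Example tool"}]},
}

print(json.dumps(request))
```

The SDKs below handle this message exchange for you; you only declare the tools.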
## MCP Server Basics
An MCP server typically provides:

- **Tools** - Functions the AI can call to perform actions
- **Resources** - Data sources the AI can query
- **Prompts** - Pre-defined prompt templates
- **Sampling** - AI model interactions
## Transport Modes

MCP supports two transport modes. For Targetly deployments, you must use SSE:

### SSE (Server-Sent Events) - Required for Targetly
```python
from mcp.server import Server
from mcp.server.sse import SseServerTransport
import uvicorn

app = Server("my-mcp-server")

# Define your tools
@app.call_tool()
async def my_tool(param: str) -> str:
    return f"Result: {param}"

# Use SSE transport on port 8080
if __name__ == "__main__":
    transport = SseServerTransport("/messages")
    uvicorn.run(transport.asgi_app(app), host="0.0.0.0", port=8080)
```
### STDIO - Not Suitable for Web Deployments

```python
# ❌ Don't use stdio for Targetly deployments
from mcp.server.stdio import stdio_server

# This only works for local, command-line MCP servers
```

STDIO transport is for local processes only. Always use SSE transport when deploying to Targetly.
## Dockerfile Requirements

Your Dockerfile must:

- Expose port 8080
- Listen on 0.0.0.0 (not localhost)
- Use SSE transport

Example Dockerfile:
```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application
COPY . .

# Expose the port
EXPOSE 8080

# Run server
CMD ["python", "server.py"]
```
## Port Configuration
Targetly expects your MCP server to run on port 8080. This is the standard port for all deployments.
Configure your server to listen on port 8080:
**Python:**

```python
uvicorn.run(
    transport.asgi_app(app),
    host="0.0.0.0",  # Important: use 0.0.0.0, not 127.0.0.1
    port=8080
)
```

**Node.js:**

```javascript
const express = require('express');
const app = express();

app.listen(8080, '0.0.0.0', () => {
  console.log('MCP server listening on port 8080');
});
```

**Go:**

```go
http.ListenAndServe("0.0.0.0:8080", handler)
```
## Health Checks
Implement a health check endpoint for monitoring:
```python
@app.get("/health")
async def health():
    return {"status": "healthy"}
```
Test your health endpoint after deployment:
```bash
curl https://your-deployment.prod.targetly.io/health
```
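In a deploy or CI script, it can help to poll the health endpoint until the server is actually ready rather than checking once. A minimal sketch using only the standard library (the URL and timeouts are illustrative):

```python
import time
import urllib.error
import urllib.request

def wait_for_health(url: str, timeout: float = 30.0, interval: float = 1.0) -> bool:
    """Poll `url` until it returns HTTP 200 or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not up yet; retry after a short pause
        time.sleep(interval)
    return False
```

For example, `wait_for_health("http://localhost:8080/health")` returns `True` once the endpoint starts responding, and `False` if it never does within the timeout.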
## Environment Variables
Use environment variables for configuration:
```python
import os

DATABASE_URL = os.getenv("DATABASE_URL", "default_value")
API_KEY = os.getenv("API_KEY")
In future versions, you'll be able to set environment variables via the Targetly dashboard.
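Until then, it is worth failing fast at startup when required configuration is missing, rather than hitting a confusing error later. A small helper sketch (the variable name in the usage example is illustrative):

```python
import os

def require_env(name: str) -> str:
    """Return the value of a required environment variable, or fail loudly."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Required environment variable {name} is not set")
    return value
```

Used as `API_KEY = require_env("API_KEY")`, the server refuses to start with a clear message instead of running half-configured.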
## Best Practices

### 1. Error Handling
Implement robust error handling:
```python
@app.call_tool()
async def my_tool(param: str) -> str:
    try:
        result = perform_operation(param)
        return result
    except Exception as e:
        logger.error(f"Tool failed: {e}")
        raise
```
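For transient failures such as network hiccups or rate limits, a retry with exponential backoff inside the tool often resolves the error without surfacing it to the client. A sketch, assuming the wrapped operation is safe to repeat (the function and retry counts are illustrative):

```python
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 0.5):
    """Call fn(), retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))
```

Only retry operations that are idempotent; retrying a non-idempotent action (e.g. a payment) can do more harm than failing.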
### 2. Logging
Use structured logging:
```python
import logging

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s %(levelname)s %(message)s'
)
logger = logging.getLogger(__name__)

@app.call_tool()
async def my_tool(param: str) -> str:
    logger.info(f"Tool called with param: {param}")
    # ... tool logic
```
View logs with:
```bash
tly logs <deployment-id>
```
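If your log pipeline prefers machine-readable output, a custom formatter can emit one JSON object per line instead of the plain format above. A minimal sketch (the field names are illustrative):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON object."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "time": self.formatTime(record),
            "level": record.levelname,
            "message": record.getMessage(),
        })
```

Attach it with `handler.setFormatter(JsonFormatter())` on whichever handler you configure.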
### 3. Testing Locally
Always test your MCP server locally before deploying:
```bash
# Build Docker image
docker build -t my-mcp-server .

# Run locally
docker run -p 8080:8080 my-mcp-server

# Test health endpoint
curl http://localhost:8080/health

# Test your tools
# (use your AI client configured to http://localhost:8080)
```
### 4. Dependencies
Pin your dependencies for reproducible builds:
```text
# Good: Pinned versions
mcp==1.0.2
uvicorn==0.24.0
pydantic==2.5.0

# Bad: Unpinned versions
mcp
uvicorn
pydantic
```
### 5. Security
Never hardcode API keys or secrets in your code. Use environment variables instead.
```python
# ❌ Bad
API_KEY = "sk-1234567890abcdef"

# ✅ Good
API_KEY = os.getenv("API_KEY")
if not API_KEY:
    raise ValueError("API_KEY environment variable not set")
```
## Example MCP Servers

### Simple Tool Server
```python
from mcp.server import Server
from mcp.server.sse import SseServerTransport
import uvicorn

app = Server("calculator")

@app.call_tool()
async def add(a: float, b: float) -> float:
    """Add two numbers"""
    return a + b

@app.call_tool()
async def multiply(a: float, b: float) -> float:
    """Multiply two numbers"""
    return a * b

if __name__ == "__main__":
    transport = SseServerTransport("/messages")
    uvicorn.run(transport.asgi_app(app), host="0.0.0.0", port=8080)
```
### Resource Server
```python
from mcp.server import Server
from mcp.server.sse import SseServerTransport
import uvicorn

app = Server("data-server")

@app.list_resources()
async def list_data():
    """List available data resources"""
    return [
        {"uri": "data://users", "name": "Users"},
        {"uri": "data://products", "name": "Products"}
    ]

@app.read_resource()
async def read_data(uri: str):
    """Read a data resource"""
    # get_users() / get_products() are your application's data accessors
    if uri == "data://users":
        return get_users()
    elif uri == "data://products":
        return get_products()
    raise ValueError(f"Unknown resource: {uri}")

if __name__ == "__main__":
    transport = SseServerTransport("/messages")
    uvicorn.run(transport.asgi_app(app), host="0.0.0.0", port=8080)
```
## Connecting AI Clients
Configure your AI client to use your deployed MCP server:
```json
{
  "mcpServers": {
    "my-server": {
      "url": "https://your-deployment.prod.targetly.io/messages",
      "transport": "sse"
    }
  }
}
```
## Troubleshooting

### Server not responding at /messages

**Problem:** AI client can't connect

**Solutions:**

- Verify SSE transport is configured correctly
- Check server is listening on the SSE endpoint path
- View logs: `tly logs <deployment-id>`
### Port binding error

**Problem:** Address already in use

**Solutions:**

- Ensure you're binding to 0.0.0.0:8080
- Check no other process is using port 8080
- Use the exact port 8080 (not 3000, 5000, etc.)
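To check whether another process already holds port 8080 on your machine, you can attempt a bind before starting the server. A quick standard-library sketch (the helper name is illustrative):

```python
import socket

def port_is_free(port: int, host: str = "0.0.0.0") -> bool:
    """Return True if we can bind to host:port, i.e. no other process holds it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            sock.bind((host, port))
            return True
        except OSError:
            return False
```

Running `port_is_free(8080)` before launch turns a cryptic "Address already in use" crash into an explicit check you can act on.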
### Tools not discovered

**Problem:** AI client can't see your tools

**Solutions:**

- Verify the `@app.call_tool()` decorator is used
- Check tool functions are async
- Ensure transport is SSE, not stdio
## Resources

- **MCP Documentation** - Official MCP protocol documentation
- **MCP Python SDK** - Python SDK for building MCP servers
- **MCP TypeScript SDK** - TypeScript SDK for building MCP servers
- **Example Servers** - Awesome MCP Servers collection