Already have HTTP support? If your MCP server already supports HTTP transport, you can skip ahead to Step 5: Update smithery.yaml to configure your deployment settings.
Here’s a typical STDIO-based MCP server that you might be starting with:
```python
# src/main.py
import os
from typing import Optional

from mcp.server.fastmcp import FastMCP

# Create STDIO server
mcp = FastMCP("Character Counter")

# Get configuration from environment variables
server_token = os.getenv("SERVER_TOKEN")
case_sensitive = os.getenv("CASE_SENSITIVE", "false").lower() == "true"

def validate_server_access(server_token: Optional[str]) -> bool:
    """Validate server token - accepts any string including empty ones for demo."""
    # In a real app, you'd validate against your server's auth system
    # For demo purposes, we accept any non-empty token
    return server_token is not None and len(server_token.strip()) > 0 if server_token else True

@mcp.tool()
def count_characters(text: str, character: str) -> str:
    """Count occurrences of a specific character in text"""
    # Validate server access (your custom validation logic)
    if not validate_server_access(server_token):
        raise ValueError("Server access validation failed. Please provide a valid serverToken.")
    # Apply user preferences from config
    search_text = text if case_sensitive else text.lower()
    search_char = character if case_sensitive else character.lower()
    # Count occurrences
    count = search_text.count(search_char)
    return f'The character "{character}" appears {count} times in the text.'

# Run with STDIO transport
if __name__ == "__main__":
    mcp.run()
```
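The one-line conditional in `validate_server_access` is easy to misread. A standalone sketch of just that check (with hypothetical inputs, same expression as above) shows what it accepts and rejects:

```python
from typing import Optional

def validate_server_access(server_token: Optional[str]) -> bool:
    """Demo check: reject only whitespace-only tokens."""
    # Parses as: (token is not None and len(token.strip()) > 0) if token else True
    return server_token is not None and len(server_token.strip()) > 0 if server_token else True

print(validate_server_access("abc123"))  # True: non-empty token
print(validate_server_access(None))      # True: the else branch accepts missing tokens
print(validate_server_access(""))        # True: empty string is falsy, so the else branch runs
print(validate_server_access("   "))     # False: strips to empty
```

In other words, the demo only rejects tokens made entirely of whitespace; a real server would replace this with a lookup against its auth system.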
Skip this step if no configuration is needed: If your MCP server doesn’t need any configuration (API keys, settings, etc.), you can skip this step entirely, including the middleware setup. Your server will work perfectly fine without any configuration handling.
If your server needs configuration (API keys, user preferences, etc.), prefer the Smithery Python SDK's parsing utilities over hand-rolled parsing. The helpers and accessors below show how to use them:
```python
# src/main.py (continued - only add if you need configuration)

def handle_config(config: dict):
    """Handle configuration from Smithery - for backwards compatibility with stdio mode."""
    global _server_token
    if server_token := config.get('serverToken'):
        _server_token = server_token
    # You can handle other session config fields here

# Store server token only for stdio mode (backwards compatibility)
_server_token: Optional[str] = None

def get_request_config() -> dict:
    """Get full config from current request context."""
    try:
        # Access the current request context from FastMCP
        import contextvars
        # Try to get from request context if available
        request = contextvars.copy_context().get('request')
        if hasattr(request, 'scope') and request.scope:
            return request.scope.get('smithery_config', {})
    except Exception:
        pass
    # Fall back to an empty config so callers can safely call .get() on the result
    return {}

def get_config_value(key: str, default=None):
    """Get a specific config value from current request."""
    config = get_request_config()
    return config.get(key, default)

def validate_server_access(server_token: Optional[str]) -> bool:
    """Validate server token - accepts any string including empty ones for demo."""
    # In a real app, you'd validate against your server's auth system
    # For demo purposes, we accept any non-empty token
    return server_token is not None and len(server_token.strip()) > 0 if server_token else True
```
```python
# src/main.py
# MCP Tool - demonstrates per-request config access
@mcp.tool()
def count_characters(text: str, character: str) -> str:
    """Count occurrences of a specific character in text"""
    # Example: Get various config values that users can pass to your server session
    server_token = get_config_value("serverToken")
    case_sensitive = get_config_value("caseSensitive", False)
    # Validate server access (your custom validation logic)
    if not validate_server_access(server_token):
        raise ValueError("Server access validation failed. Please provide a valid serverToken.")
    # Apply user preferences from config
    search_text = text if case_sensitive else text.lower()
    search_char = character if case_sensitive else character.lower()
    # Count occurrences
    count = search_text.count(search_char)
    return f'The character "{character}" appears {count} times in the text.'
```
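To see how per-request config changes the tool's behavior without running a server, here is a hypothetical stand-in that takes the config as a plain dict instead of calling `get_config_value`; the counting logic is the same as in the tool above:

```python
def count_with_config(text: str, character: str, config: dict) -> int:
    """Same counting logic as count_characters, with config passed explicitly."""
    case_sensitive = config.get("caseSensitive", False)
    # Normalize both sides unless the session asked for case-sensitive matching
    search_text = text if case_sensitive else text.lower()
    search_char = character if case_sensitive else character.lower()
    return search_text.count(search_char)

print(count_with_config("Hello World", "L", {"caseSensitive": False}))  # 3
print(count_with_config("Hello World", "L", {"caseSensitive": True}))   # 0
```

The same input yields different counts depending on the session's `caseSensitive` setting, which is exactly what per-request config enables.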
```python
# src/main.py
import uvicorn
from starlette.middleware.cors import CORSMiddleware
# Note: SmitheryConfigMiddleware wraps the ASGI app to extract per-request
# config; import it from wherever it is defined in your project.

def main():
    transport_mode = os.getenv("TRANSPORT", "stdio")
    if transport_mode == "http":
        # HTTP mode with config extraction from URL parameters
        print("Character Counter MCP Server starting in HTTP mode...")
        # Setup Starlette app with CORS for cross-origin requests
        app = mcp.streamable_http_app()
        # IMPORTANT: add CORS middleware for browser based clients
        app.add_middleware(
            CORSMiddleware,
            allow_origins=["*"],
            allow_credentials=True,
            allow_methods=["GET", "POST", "OPTIONS"],
            allow_headers=["*"],
            expose_headers=["mcp-session-id", "mcp-protocol-version"],
            max_age=86400,
        )
        # Apply custom middleware for config extraction (per-request API key handling)
        app = SmitheryConfigMiddleware(app)
        # Use Smithery-required PORT environment variable
        port = int(os.environ.get("PORT", 8081))
        print(f"Listening on port {port}")
        uvicorn.run(app, host="0.0.0.0", port=port, log_level="debug")
    else:
        # Optional: add stdio transport for backwards compatibility
        # You can publish this to uv for users to run locally
        print("Character Counter MCP Server starting in stdio mode...")
        server_token = os.getenv("SERVER_TOKEN")
        # Set the server token for stdio mode (can be None)
        handle_config({"serverToken": server_token})
        # Run with stdio transport (default)
        mcp.run()

if __name__ == "__main__":
    main()
```
How it works: When you deploy with custom containers, your FastMCP server handles HTTP requests directly through the streamable HTTP app. The main() function falls back to STDIO mode whenever the TRANSPORT environment variable is not set to http (for example, when you run python src/main.py locally), preserving STDIO support for local development and backward compatibility.
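The branching in main() can be reduced to a single function to make the transport selection explicit (this is an illustration, not part of the server):

```python
def select_transport(environ: dict) -> str:
    """Mirror main()'s branch: HTTP only when TRANSPORT=http, stdio otherwise."""
    return "http" if environ.get("TRANSPORT", "stdio") == "http" else "stdio"

print(select_transport({"TRANSPORT": "http"}))   # http
print(select_transport({"TRANSPORT": "stdio"}))  # stdio
print(select_transport({}))                      # stdio (the default)
```

Smithery deployments should set TRANSPORT=http; anything else, including an unset variable, lands in the STDIO branch.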
You can create your own Dockerfile or use this recommended template:
```dockerfile
# Dockerfile
# Use a Python image with uv pre-installed
FROM ghcr.io/astral-sh/uv:python3.12-alpine

# Install the project into `/app`
WORKDIR /app

# Enable bytecode compilation
ENV UV_COMPILE_BYTECODE=1

# Copy from the cache instead of linking since it's a mounted volume
ENV UV_LINK_MODE=copy

# Install the project's dependencies using the lockfile and settings
RUN --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync --locked --no-install-project --no-dev

# Then, add the rest of the project source code and install it
COPY . /app
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --locked --no-dev

# Place executables in the environment at the front of the path
ENV PATH="/app/.venv/bin:$PATH"

# Reset the entrypoint, don't invoke `uv`
ENTRYPOINT []

# STDIO servers don't expose ports
CMD ["python", "src/main.py"]
```
uv Docker Best Practices: For more examples and best practices on using uv with Docker, including multistage builds and development workflows, check out the uv Docker example repository.
Local vs Deployed Ports: For local testing, you can use any port. However, when deployed to Smithery, your server must listen on the PORT environment variable, which Smithery will set to 8081.
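To try the container locally before deploying, you can build it and run it in HTTP mode. The image name character-counter below is just an example, and the TRANSPORT/PORT variables match what main() reads:

```shell
# Build the image from the Dockerfile above
docker build -t character-counter .

# Run locally in HTTP mode on port 8081 (the port Smithery sets in deployment)
docker run --rm -e TRANSPORT=http -e PORT=8081 -p 8081:8081 character-counter
```

Without -e TRANSPORT=http the container starts in STDIO mode and will not listen on any port.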
Test interactively:
Once your server is running in HTTP mode, you can test it interactively using the Smithery playground:
```shell
npx @smithery/cli playground --port 8081
```
Config Handling Limitation: The Smithery playground doesn’t currently support config handling for custom containers (Python/TypeScript). Your deployed server will support configuration, but local testing with the playground will use default/empty config values. We’re actively working on adding this support.
This guide showed how to migrate a Python MCP server from STDIO to HTTP transport using custom Docker containers and FastMCP. We implemented a character counter tool with proper CORS configuration, used SmitheryConfigMiddleware for configuration handling, and supported both HTTP and STDIO transport modes. This approach gives us full control over our Python environment and middleware while maintaining backward compatibility.

Need help? Join our Discord or email support@smithery.ai for assistance.