Vertex Flow now supports the Model Context Protocol (MCP), an open standard that enables secure, standardized connections between LLM applications and external data sources and tools. MCP defines four primitives:
- Resources: Context and data for AI models to use
- Tools: Functions for AI models to execute
- Prompts: Templated messages and workflows for users
- Sampling: Server-initiated agentic behaviors
MCP uses a client-server architecture where:
- Hosts: LLM applications (like Vertex Flow) that initiate connections
- Clients: Connectors within the host application
- Servers: Services that provide context and capabilities
Vertex Flow can act as both an MCP client (consuming external MCP servers) and an MCP server (exposing its own capabilities).
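Under the hood, MCP traffic is JSON-RPC 2.0. As a rough sketch of the wire format (the exact capability fields vary by protocol revision), a client's `initialize` request looks roughly like:

```python
import json

# Sketch of the JSON-RPC 2.0 envelope used by MCP's initialize handshake.
# The capabilities payload is left empty here; real clients advertise
# supported features (e.g. sampling) in this object.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "VertexFlow", "version": "1.0.0"},
    },
}

print(json.dumps(initialize_request, indent=2))
```

The server answers with its own capabilities, after which resources, tools, and prompts can be listed and invoked over the same channel.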
Create or modify `vertex_flow/config/mcp.yml`:

```yaml
mcp:
  enabled: true
  clients:
    filesystem:
      enabled: true
      transport: "stdio"
      command: "npx"
      args: ["@modelcontextprotocol/server-filesystem", "/path/to/data"]
  server:
    enabled: true
    name: "VertexFlow"
    version: "1.0.0"
```

Configure MCP clients to connect to external servers:
```yaml
clients:
  # Filesystem access
  filesystem:
    enabled: true
    transport: "stdio"
    command: "npx"
    args: ["@modelcontextprotocol/server-filesystem", "/data"]
    description: "Access filesystem resources"

  # GitHub integration
  github:
    enabled: true
    transport: "stdio"
    command: "npx"
    args: ["@modelcontextprotocol/server-github"]
    env:
      GITHUB_PERSONAL_ACCESS_TOKEN: "your-token"
    description: "Access GitHub repositories"

  # Database access
  database:
    enabled: true
    transport: "stdio"
    command: "python"
    args: ["-m", "mcp_server_database", "--connection-string", "sqlite:///data.db"]
    description: "Database operations"
```

Configure Vertex Flow as an MCP server:
```yaml
server:
  enabled: true
  name: "VertexFlow"
  version: "1.0.0"
  transport:
    stdio:
      enabled: true
    http:
      enabled: false
      host: "localhost"
      port: 8080
  resources:
    enabled: true
    workflows:
      enabled: true
      path: "vertex_flow/workflow"
      pattern: "*.py"
  tools:
    enabled: true
    function_tools:
      enabled: true
      auto_discover: true
  prompts:
    enabled: true
    custom_prompts:
      - name: "code_review"
        template: "Review this code: {code}"
        description: "Code review prompt"
```

Use Vertex Flow as an MCP client to consume external servers:

```python
from vertex_flow.mcp.vertex_integration import MCPVertexFlowClient

# Create and connect client
client = MCPVertexFlowClient("MyClient", "1.0.0")
await client.connect_stdio("npx", "@modelcontextprotocol/server-filesystem", "/data")

# List available resources
resources = await client.get_resources()
print(f"Available resources: {[r.name for r in resources]}")

# Read a resource
content = await client.read_resource("file:///data/example.txt")
print(f"Content: {content}")

# List available tools
tools = await client.get_tools()
print(f"Available tools: {[t.name for t in tools]}")

# Call a tool
result = await client.call_tool("search_files", {"pattern": "*.py"})
print(f"Search result: {result.content}")
```

Expose Vertex Flow's own capabilities as an MCP server:

```python
from vertex_flow.mcp.vertex_integration import MCPVertexFlowServer
from vertex_flow.workflow.tools.functions import FunctionTool

# Create server
server = MCPVertexFlowServer("VertexFlow", "1.0.0")

# Add resources
server.add_resource(
    "workflow://config",
    "workflow_config",
    "workflow configuration content"
)

# Add tools
def my_function(text: str) -> str:
    return f"Processed: {text}"

tool = FunctionTool(
    name="process_text",
    description="Process text input",
    func=my_function
)
server.add_function_tool(tool)

# Add prompts
server.add_prompt(
    "summarize",
    "Summarize this text: {text}",
    "Text summarization prompt"
)

# Run server
await server.run_stdio()
```

Combine MCP context with LLM vertices inside a workflow:

```python
from vertex_flow.mcp.vertex_integration import MCPLLMVertex, MCPVertexFlowClient

# Create MCP-enabled LLM vertex
llm_vertex = MCPLLMVertex("llm_with_mcp", model=my_model)

# Add MCP clients
filesystem_client = MCPVertexFlowClient("filesystem", "1.0.0")
await filesystem_client.connect_stdio("npx", "@modelcontextprotocol/server-filesystem", "/data")
await llm_vertex.add_mcp_client("filesystem", filesystem_client)

# Process with MCP context
result = llm_vertex.process(
    {"input": "Analyze the data files"},
    {"workflow_context": context}
)
```

The most common transport is stdio, where the MCP server runs as a child process of the host:
```yaml
transport: "stdio"
command: "npx"
args: ["@modelcontextprotocol/server-filesystem", "/data"]
```

For servers that support HTTP connections:
```yaml
transport: "http"
base_url: "http://localhost:8080"
```

Official MCP servers:

- Filesystem: `@modelcontextprotocol/server-filesystem`
- GitHub: `@modelcontextprotocol/server-github`
- GitLab: `@modelcontextprotocol/server-gitlab`
- Google Drive: `@modelcontextprotocol/server-gdrive`
- Slack: `@modelcontextprotocol/server-slack`
- PostgreSQL: `@modelcontextprotocol/server-postgres`
- SQLite: `@modelcontextprotocol/server-sqlite`
Community MCP servers:

- Puppeteer: Web automation and scraping
- Brave Search: Web search capabilities
- AWS: AWS services integration
- Docker: Container management
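As an illustration, wiring one of these in follows the same `clients` pattern shown earlier. The entry below assumes the Puppeteer server is published as `@modelcontextprotocol/server-puppeteer`; verify the current package name on npm:

```yaml
clients:
  puppeteer:
    enabled: true
    transport: "stdio"
    command: "npx"
    args: ["@modelcontextprotocol/server-puppeteer"]
    description: "Web automation and scraping"
```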
Install MCP servers using npm:

```bash
# Install filesystem server
npm install -g @modelcontextprotocol/server-filesystem

# Install GitHub server
npm install -g @modelcontextprotocol/server-github

# Install database servers
npm install -g @modelcontextprotocol/server-postgres
npm install -g @modelcontextprotocol/server-sqlite
```

Configure allowed and blocked resource patterns:
```yaml
security:
  allowed_resources:
    - "file://*"
    - "workflow://*"
    - "config://*"
  blocked_resources:
    - "file:///etc/*"
    - "file:///root/*"
```

Set limits on tool execution:
```yaml
tool_limits:
  max_execution_time: 60  # seconds
  max_memory_usage: 100   # MB
```

Require explicit approval for sensitive operations:
```yaml
security:
  require_approval: true
```

Wrap MCP operations in explicit error handling:
```python
try:
    result = await client.call_tool("risky_operation", {"param": "value"})
    if result.isError:
        print(f"Tool execution failed: {result.content}")
    else:
        print(f"Success: {result.content}")
except Exception as e:
    print(f"MCP operation failed: {e}")
```

Enable detailed logging for MCP operations:
```yaml
integration:
  logging:
    level: "DEBUG"
    log_mcp_messages: true
    log_tool_calls: true
```

Reuse MCP connections where possible:
```python
# Keep clients connected for multiple operations
client = MCPVertexFlowClient("persistent", "1.0.0")
await client.connect_stdio("server-command")

# Perform multiple operations
resources = await client.get_resources()
for resource in resources:
    content = await client.read_resource(resource.uri)
    # Process content...

# Close when done
await client.close()
```

Cache frequently accessed resources:
```python
# Note: functools.lru_cache cannot be used here. Applied to an async
# function it caches the coroutine object, which can only be awaited once.
_resource_cache = {}

async def get_cached_resource(uri: str) -> str:
    if uri not in _resource_cache:
        _resource_cache[uri] = await client.read_resource(uri)
    return _resource_cache[uri]
```

Common issues:

- Server Not Found: Ensure the MCP server is installed and accessible
- Permission Denied: Check file/directory permissions
- Connection Timeout: Increase timeout settings
- Protocol Version Mismatch: Ensure compatible MCP versions
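For Connection Timeout issues in particular, a small wrapper around `asyncio.wait_for` makes operations fail fast instead of hanging. This is a sketch using a stand-in coroutine; in practice it would wrap calls such as `client.connect_stdio(...)` or `client.call_tool(...)`:

```python
import asyncio

async def with_timeout(coro, seconds: float = 30.0):
    """Run an awaitable, raising TimeoutError if it exceeds `seconds`."""
    return await asyncio.wait_for(coro, timeout=seconds)

# Stand-in for a real MCP operation
async def slow_operation():
    await asyncio.sleep(0.01)
    return "ok"

result = asyncio.run(with_timeout(slow_operation(), seconds=5.0))
print(result)  # → ok
```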
Enable debug mode for detailed error information:

```python
import logging
logging.basicConfig(level=logging.DEBUG)

# MCP operations will now show detailed logs
```

Test MCP server connectivity:
```bash
# Test filesystem server
echo '{"jsonrpc": "2.0", "id": 1, "method": "ping"}' | npx @modelcontextprotocol/server-filesystem /data
```

Best practices:

- Resource Management: Always close MCP clients when done
- Error Handling: Implement comprehensive error handling
- Security: Follow principle of least privilege
- Performance: Cache frequently accessed resources
- Monitoring: Log MCP operations for debugging
- Configuration: Use environment variables for sensitive data
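For the last point, a minimal sketch (the helper name `load_token` is illustrative, not part of the Vertex Flow API) reads secrets from the environment at startup instead of hard-coding them in `mcp.yml`:

```python
import os

def load_token(var_name: str = "GITHUB_PERSONAL_ACCESS_TOKEN") -> str:
    """Read a secret from the environment, failing loudly if missing."""
    token = os.environ.get(var_name)
    if token is None:
        raise RuntimeError(f"Set {var_name} before starting Vertex Flow")
    return token

# Demo with a dummy variable so the sketch is self-contained:
os.environ.setdefault("DEMO_TOKEN", "s3cret")
print(load_token("DEMO_TOKEN"))  # → s3cret
```

The loaded value can then be passed into the client's `env` mapping rather than committing the token to the config file.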
MCPVertexFlowClient methods:

- `connect_stdio(command, *args)`: Connect via stdio
- `connect_http(base_url)`: Connect via HTTP
- `list_resources()`: List available resources
- `read_resource(uri)`: Read resource content
- `list_tools()`: List available tools
- `call_tool(name, args)`: Execute a tool
- `list_prompts()`: List available prompts
- `get_prompt(name, args)`: Get prompt content
MCPVertexFlowServer methods:

- `set_resource_provider(provider)`: Set resource provider
- `set_tool_provider(provider)`: Set tool provider
- `set_prompt_provider(provider)`: Set prompt provider
- `run_stdio()`: Run server on stdio
- `run_http(host, port)`: Run HTTP server
- `add_resource(uri, name, content)`: Add resource
- `add_function_tool(tool)`: Add function tool
- `add_prompt(name, template, desc)`: Add prompt template
Find more examples in the `vertex_flow/examples/` directory:

- `mcp_example.py`: Basic MCP usage
- `mcp_filesystem_example.py`: Filesystem integration
- `mcp_workflow_example.py`: Workflow integration
- `mcp_server_example.py`: Custom MCP server
To contribute MCP integrations:
- Follow the MCP specification
- Add comprehensive tests
- Update documentation
- Consider security implications
- Provide usage examples