New Op, but SimpleChatOp not found #7
Description
Step 1: Build the Op
```python
from flowllm.core.context import C
from flowllm.core.enumeration import Role
from flowllm.core.op import BaseAsyncOp
from flowllm.core.schema import Message


@C.register_op()
class SimpleChatOp(BaseAsyncOp):
    async def async_execute(self):
        query = self.context.get("query", "")
        messages = [Message(role=Role.USER, content=query)]

        # Use the token_count method to calculate the token count
        token_num = self.token_count(messages)
        print(f"Input tokens: {token_num}")

        response = await self.llm.achat(messages=messages)
        self.context.response.answer = response.content.strip()
```
For details, refer to the Simple Op Guide, LLM Op Guide, and Advanced Op Guide (including Embedding, VectorStore, and concurrent execution).
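For context, `@C.register_op()` presumably follows the standard class-registry decorator pattern, so the service can later resolve an op class by name. A minimal standalone sketch of that pattern (not flowllm's actual implementation; `OP_REGISTRY` and `register_op` are hypothetical names used only for illustration):

```python
# Minimal sketch of a class-registry decorator, similar in spirit to
# what @C.register_op() presumably does (names here are hypothetical).
OP_REGISTRY: dict = {}


def register_op():
    def decorator(cls):
        # Store the class under its own name so it can be looked up later
        OP_REGISTRY[cls.__name__] = cls
        return cls
    return decorator


@register_op()
class SimpleChatOp:
    pass


print("SimpleChatOp" in OP_REGISTRY)  # → True
```

The key consequence of this pattern: the class only appears in the registry once the module that defines it has actually been imported, which is why the op module must be loaded before the flow expression is evaluated.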
Step 2: Configure the Config
The following example demonstrates building an MCP (Model Context Protocol) service. Create a configuration file my_mcp_config.yaml:
```yaml
backend: mcp

mcp:
  transport: sse
  host: "0.0.0.0"
  port: 8001

flow:
  demo_mcp_flow:
    flow_content: SimpleChatOp()
    description: "Search results for a given query."
    input_schema:
      query:
        type: string
        description: "User query"
        required: true

llm:
  default:
    backend: openai_compatible
    model_name: qwen3-30b-a3b-instruct-2507
    params:
      temperature: 0.6

token_count:  # Optional, configure the token-counting backend
  model_name: Qwen/Qwen3-30B-A3B-Instruct-2507
  backend: hf  # Supports base, openai, hf, etc.
  params:
    use_mirror: true
```
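Because the config above nests several levels deep and YAML is indentation-sensitive, it can help to sanity-check the file before starting the service. A small sketch using PyYAML (assumed installed; the snippet inlines a fragment of the config rather than reading my_mcp_config.yaml from disk):

```python
import yaml  # PyYAML, assumed to be available

# A fragment of the config above, inlined for illustration
config_text = """
backend: mcp
flow:
  demo_mcp_flow:
    flow_content: SimpleChatOp()
"""

config = yaml.safe_load(config_text)
# Verify the flow expression landed where the service expects it
print(config["flow"]["demo_mcp_flow"]["flow_content"])  # → SimpleChatOp()
```

Note that `flow_content` is parsed as a plain string here; it only becomes an op instance when flowllm evaluates the flow expression at startup.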
Step 3: Start the MCP Service
```shell
flowllm \
  config=my_mcp_config \
  backend=mcp \ # Optional, overrides config
  mcp.transport=sse \ # Optional, overrides config
  mcp.port=8001 \ # Optional, overrides config
  llm.default.model_name=qwen3-30b-a3b-thinking-2507 # Optional, overrides config
```
```
╭─ ReMe ───────────────────────────────╮
│                                      │
│      ____       __  ___              │
│     / __ \___  /  |/  /__            │
│    / /_/ / _ \/ /|_/ / _ \           │
│   / _, _/  __/ /  / /  __/           │
│  /_/ |_|\___/_/  /_/\___/            │
│                                      │
│  Backend: mcp                        │
│  Transport: sse                      │
│  URL: http://0.0.0.0:8001/sse        │
│                                      │
│  FlowLLM version: 0.2.0.3            │
│  FastMCP version: 2.13.1             │
╰──────────────────────────────────────╯
```
Traceback (most recent call last):
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/bin/flowllm", line 8, in <module>
sys.exit(main())
^^^^^^
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/main.py", line 87, in main
app.run_service()
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/core/application.py", line 384, in run_service
service.run()
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/core/service/mcp_service.py", line 50, in run
super().run()
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/core/service/base_service.py", line 67, in run
if self.integrate_tool_flow(flow):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/core/service/mcp_service.py", line 29, in integrate_tool_flow
request_model = create_pydantic_model(flow.name, flow.tool_call.input_schema)
^^^^^^^^^^^^^^
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/core/flow/base_tool_flow.py", line 28, in tool_call
self._tool_call = self.build_tool_call()
^^^^^^^^^^^^^^^^^^^^^^
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/core/flow/expression_tool_flow.py", line 43, in build_tool_call
if hasattr(self.flow_op, "tool_call"):
^^^^^^^^^^^^
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/core/flow/base_flow.py", line 164, in flow_op
self._flow_op = self.build_flow()
^^^^^^^^^^^^^^^^^
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/core/flow/expression_tool_flow.py", line 35, in build_flow
return parse_flow_expression(self.flow_config.flow_content)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/banma-3420/Downloads/reme_ai_test_4/.venv/lib/python3.12/site-packages/flowllm/core/utils/common_utils.py", line 221, in parse_flow_expression
result = eval(last_line_expr, {"__builtins__": {}}, env)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<string>", line 1, in <module>
NameError: name 'SimpleChatOp' is not defined
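The traceback shows that `parse_flow_expression` evaluates the `flow_content` string with `eval` against an environment of registered ops (`eval(last_line_expr, {"__builtins__": {}}, env)`), so the `NameError` indicates that `SimpleChatOp` was never imported, and therefore never registered, in the process started by the `flowllm` CLI. The failure mechanism can be reproduced in plain Python (the `env` dict below merely stands in for flowllm's op registry):

```python
# eval() resolves names only from the globals/locals it is given;
# with an empty environment, SimpleChatOp cannot be found.
env = {}  # stands in for flowllm's registry of imported ops
try:
    eval("SimpleChatOp()", {"__builtins__": {}}, env)
except NameError as e:
    print(e)  # → name 'SimpleChatOp' is not defined

# Once the class is present in the environment (i.e. the module that
# defines and registers the op has been imported), the same expression
# evaluates without error.
class SimpleChatOp:
    pass

env["SimpleChatOp"] = SimpleChatOp
op = eval("SimpleChatOp()", {"__builtins__": {}}, env)
print(type(op).__name__)  # → SimpleChatOp
```

In other words, the fix is likely to ensure the module containing the `SimpleChatOp` definition from Step 1 is imported before the service starts parsing flows, so that `@C.register_op()` has a chance to run.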