---
title: Using with Agents
sidebarTitle: Agent Integration
description: Connect your knowledge base to voice agents.
---

Knowledge Base retrieval is automatic. When your agent receives a user query, the platform searches the KB and includes relevant content in the LLM's context. You don't need to write any retrieval code.

## How Retrieval Works

```
1. User speaks: "What's your return policy?"
         │
2. Platform embeds the query
         │
3. Searches agent's KB for similar content
         │
4. Finds: "Return Policy: Items can be returned within 30 days..."
         │
5. Injects into LLM context alongside the conversation
         │
6. LLM generates response using the retrieved content
```
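The steps above can be sketched with a toy in-memory version. Bag-of-words counts stand in for the platform's real dense embeddings, and `embed`, `cosine`, and `retrieve` are illustrative helpers, not platform APIs:

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': bag-of-words counts (the platform uses dense vectors)."""
    return Counter(w.strip("?:.,!").lower() for w in text.split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, chunks):
    """Steps 2-4: embed the query, return the most similar KB chunk."""
    q = embed(query)
    return max(chunks, key=lambda c: cosine(q, embed(c)))

kb = [
    "Return Policy: Items can be returned within 30 days of purchase.",
    "Shipping: Standard delivery takes 3-5 business days.",
]
best = retrieve("What's your return policy?", kb)

# Step 5: inject the retrieved chunk into the LLM context
messages = [
    {"role": "system", "content": f"Context: {best}"},
    {"role": "user", "content": "What's your return policy?"},
]
```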

Your agent code stays the same — just write the response generator:

```python
class SupportAgent(OutputAgentNode):
    def __init__(self):
        super().__init__(name="support-agent")
        self.llm = OpenAIClient(model="gpt-4o-mini")

        self.context.add_message({
            "role": "system",
            "content": "You are a helpful support agent. Use the provided context to answer questions accurately."
        })

    async def generate_response(self):
        # KB content is automatically included in the messages
        response = await self.llm.chat(
            messages=self.context.messages,
            stream=True
        )

        async for chunk in response:
            if chunk.content:
                yield chunk.content
```
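For intuition, here is a hypothetical shape of `self.context.messages` after the platform injects retrieved content. The exact injection format is an assumption for illustration, not the platform's documented spec:

```python
# Hypothetical message list after automatic KB retrieval: your system prompt,
# then the injected KB context, then the user's turn.
messages = [
    {"role": "system",
     "content": "You are a helpful support agent. Use the provided context "
                "to answer questions accurately."},
    {"role": "system",
     "content": "Relevant knowledge base content:\n"
                "Return Policy: Items can be returned within 30 days..."},
    {"role": "user", "content": "What's your return policy?"},
]
```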

## Agent's Default KB

Every agent has a `global_knowledge_base_id`. This is where you upload content:

```python
from smallestai.atoms import AtomsClient

client = AtomsClient()

# Get your agent
agent = client.get_agent_by_id(id="agent-123").data

# Upload content to the agent's KB
client.upload_text_to_knowledge_base(
    id=agent.global_knowledge_base_id,
    knowledgebase_id_items_upload_text_post_request={
        "title": "Product Pricing",
        "content": "Basic plan: $9.99/month. Premium: $19.99/month. Enterprise: Contact sales."
    }
)
```

## System Prompt Tips

Help the LLM use KB content effectively:

```python
self.context.add_message({
    "role": "system",
    "content": """You are a customer support agent for Acme Corp.

Instructions:
- Answer questions using the provided knowledge base content
- If information isn't in the KB, say "I don't have that information, but I can connect you with a specialist"
- Keep responses concise (2-3 sentences max)
- Never make up product details or policies"""
})
```

## When KB Content Is Used

The platform retrieves KB content when:

| Scenario | KB Used |
| --- | --- |
| User asks a question | ✓ Yes |
| User provides information | Maybe (depends on the query) |
| Agent makes a statement | Usually not |
| Tool results | No |
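As a rough mental model of the table above, the trigger logic can be sketched as a toy heuristic. The platform's actual decision logic is internal; `should_retrieve` is illustrative only, and it collapses the "maybe" cases to `False` for simplicity:

```python
def should_retrieve(role, text):
    """Simplified sketch: retrieve only for user turns that look like
    questions. Agent turns and tool results never trigger retrieval."""
    if role != "user":
        return False
    question_words = ("what", "how", "why", "when", "where", "who",
                      "can", "do", "is")
    t = text.strip().lower()
    return t.endswith("?") or t.startswith(question_words)
```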

## Fallback Handling

When KB doesn't have relevant content:

```python
self.context.add_message({
    "role": "system",
    "content": """If you can't find information in the knowledge base:
1. Acknowledge you don't have that specific info
2. Offer to transfer to a human agent
3. Or ask clarifying questions"""
})
```
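On the retrieval side, the same idea can be sketched as a score threshold: if no chunk is similar enough, return a scripted fallback instead of padding the context with weak matches. `answer_with_fallback` and the threshold value are illustrative, not platform APIs:

```python
FALLBACK = ("I don't have that information, "
            "but I can connect you with a specialist.")

def answer_with_fallback(scored_chunks, threshold=0.5):
    """scored_chunks: list of (similarity, text) pairs. If nothing clears
    the threshold, fall back rather than answer from weak context."""
    relevant = [(score, text) for score, text in scored_chunks
                if score >= threshold]
    if not relevant:
        return FALLBACK
    best = max(relevant)[1]  # highest-scoring chunk
    return f"Based on our docs: {best}"
```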

## Testing KB Retrieval

To verify your KB content is being retrieved correctly:

1. Upload test content:

   ```python
   client.upload_text_to_knowledge_base(
       id=kb_id,
       knowledgebase_id_items_upload_text_post_request={
           "title": "Test Item",
           "content": "The secret code is BANANA42."
       }
   )
   ```

2. Ask about it in chat:

   ```
   User: "What's the secret code?"
   Agent: "The secret code is BANANA42."
   ```

If the agent responds with the correct info, retrieval is working!
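This check can be automated with a small helper that probes the agent and looks for the planted fact. `verify_retrieval` is a sketch; wire `ask` to however you call your agent in tests:

```python
def verify_retrieval(ask, question, expected_fragment):
    """ask: a callable mapping a question to the agent's reply text,
    e.g. a chat test client. Returns True if the planted fact surfaced."""
    return expected_fragment.lower() in ask(question).lower()

# Stand-in for a real agent call while wiring up tests:
fake_agent = lambda q: "The secret code is BANANA42."
ok = verify_retrieval(fake_agent, "What's the secret code?", "BANANA42")
```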

## Best Practices

- Outdated info means wrong answers: update your KB whenever information changes.
- Ambiguous content leads to ambiguous answers: be specific.
- Chat with your agent regularly to verify it retrieves the right content.
- Track questions your KB can't answer, then add that content.