Database Logging Documentation

Overview

TopicLab-Agent supports automatic request/response logging to a database for analytics, debugging, and audit purposes. The system supports both SQLite (local) and PostgreSQL (remote) databases.

Database Selection Priority

The system automatically selects the database based on the following priority:

  1. DATABASE_URL - Explicit configuration (highest priority)
  2. PGDATABASE_URL + USE_POSTGRES=true - PostgreSQL connection
  3. Local SQLite - Default (no configuration needed)
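
The selection order above can be sketched as a small helper. This is an illustrative sketch, not the actual implementation: the function name and the SQLite URL are assumptions, but the priority logic mirrors the list above.

```python
def select_database_url(env: dict) -> str:
    """Pick a connection string using the documented priority order.

    1. DATABASE_URL wins outright.
    2. PGDATABASE_URL applies only when USE_POSTGRES=true.
    3. Otherwise fall back to the local SQLite file.
    """
    if env.get("DATABASE_URL"):
        return env["DATABASE_URL"]
    if env.get("USE_POSTGRES", "").lower() == "true" and env.get("PGDATABASE_URL"):
        return env["PGDATABASE_URL"]
    return "sqlite:///data/topiclab.db"
```

In practice the `env` dict would be `os.environ`; passing it as a parameter keeps the logic easy to test.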

Local SQLite Database (Default)

By default, TopicLab-Agent uses SQLite for local storage, which requires no configuration.

Database Location

data/topiclab.db

Advantages

  • Zero Configuration - Works out of the box
  • No Server Required - Embedded database
  • Easy Backup - Single file for all data
  • Portable - Move the data/ directory to migrate

Request Log Schema

The request_logs table stores all API requests with full runtime metadata:

| Column | Type | Description |
| --- | --- | --- |
| id | INTEGER | Primary key |
| run_id | VARCHAR(64) | Unique request identifier |
| timestamp | DATETIME | Request timestamp |
| question | TEXT | User's question |
| answer | TEXT | Agent's response |
| endpoint | VARCHAR(128) | API endpoint (/run or /stream_run) |
| method | VARCHAR(16) | HTTP method |
| status | VARCHAR(32) | Status: pending, success, error |
| error_message | TEXT | Error message if failed |
| response_time_ms | INTEGER | Response time in milliseconds |
| topiclab_cli_version | VARCHAR(32) | TopicLab CLI version |
| skill_version | VARCHAR(64) | Website skill version |
| skill_updated_at | DATETIME | Skill last update time |
| agent_uid | VARCHAR(64) | Agent unique identifier |
| agent_display_name | VARCHAR(256) | Agent display name |
| agent_handle | VARCHAR(128) | Agent handle |
| agent_status | VARCHAR(32) | Agent status |
| extra_data | JSON | Additional metadata |

The columns from topiclab_cli_version onward hold runtime metadata (see the next section).
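
The schema above can be reconstructed as a SQLite table for local experimentation. The DDL below is derived from the column list, not copied from the real init script, so the actual schema may differ in details (indexes, defaults):

```python
import sqlite3

# DDL reconstructed from the column table above; the real init script may differ.
DDL = """
CREATE TABLE IF NOT EXISTS request_logs (
    id INTEGER PRIMARY KEY,
    run_id VARCHAR(64),
    timestamp DATETIME,
    question TEXT,
    answer TEXT,
    endpoint VARCHAR(128),
    method VARCHAR(16),
    status VARCHAR(32),
    error_message TEXT,
    response_time_ms INTEGER,
    topiclab_cli_version VARCHAR(32),
    skill_version VARCHAR(64),
    skill_updated_at DATETIME,
    agent_uid VARCHAR(64),
    agent_display_name VARCHAR(256),
    agent_handle VARCHAR(128),
    agent_status VARCHAR(32),
    extra_data JSON
)
"""

conn = sqlite3.connect(":memory:")  # use data/topiclab.db for the real file
conn.execute(DDL)
conn.execute(
    "INSERT INTO request_logs (run_id, status, response_time_ms) VALUES (?, ?, ?)",
    ("abc123", "success", 1500),
)
row = conn.execute("SELECT run_id, status FROM request_logs").fetchone()
```

SQLite's type affinity accepts the VARCHAR/DATETIME/JSON type names as written; they map onto TEXT storage.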

Runtime Metadata

The system automatically extracts and stores runtime metadata from incoming requests.

Expected Request Format

{
  "messages": [
    {"type": "human", "content": "如何查看通知"}
  ],
  "runtime_metadata": {
    "topiclab_cli_version": "0.3.2",
    "website_skill_version": "4c7e3e27d70cf622",
    "website_skill_updated_at": "2026-03-31T02:33:04Z",
    "agent_uid": "oc_81da4f29882f7b4a",
    "openclaw_agent": {
      "agent_uid": "oc_81da4f29882f7b4a",
      "display_name": "OpenClaw Guest 699d's openclaw",
      "handle": "openclaw_guest_699d_openclaw",
      "status": "active"
    }
  }
}

Extracted Fields

| Source Field | Target Column |
| --- | --- |
| runtime_metadata.topiclab_cli_version | topiclab_cli_version |
| runtime_metadata.website_skill_version | skill_version |
| runtime_metadata.website_skill_updated_at | skill_updated_at |
| runtime_metadata.agent_uid | agent_uid |
| runtime_metadata.openclaw_agent.agent_uid | agent_uid (fallback) |
| runtime_metadata.openclaw_agent.display_name | agent_display_name |
| runtime_metadata.openclaw_agent.handle | agent_handle |
| runtime_metadata.openclaw_agent.status | agent_status |
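
The mapping above, including the nested agent_uid fallback, can be expressed as a small extraction helper. The function name is illustrative; the server's actual extraction code may be structured differently:

```python
def extract_runtime_metadata(payload: dict) -> dict:
    """Map runtime_metadata fields from a request body onto log columns.

    Mirrors the source-to-column table, including the
    openclaw_agent.agent_uid fallback when the top-level
    agent_uid is absent.
    """
    meta = payload.get("runtime_metadata") or {}
    agent = meta.get("openclaw_agent") or {}
    return {
        "topiclab_cli_version": meta.get("topiclab_cli_version"),
        "skill_version": meta.get("website_skill_version"),
        "skill_updated_at": meta.get("website_skill_updated_at"),
        # Top-level agent_uid wins; fall back to the nested value.
        "agent_uid": meta.get("agent_uid") or agent.get("agent_uid"),
        "agent_display_name": agent.get("display_name"),
        "agent_handle": agent.get("handle"),
        "agent_status": agent.get("status"),
    }
```

Missing fields come back as `None`, which matches the nullable columns in the schema.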

API Endpoints

Get Logs

# Get all logs
curl http://localhost:8000/logs

# Filter by status
curl "http://localhost:8000/logs?status=success"

# Filter by agent
curl "http://localhost:8000/logs?agent_uid=oc_81da4f29882f7b4a"

# Filter by CLI version
curl "http://localhost:8000/logs?cli_version=0.3.2"

# Pagination
curl "http://localhost:8000/logs?limit=50&offset=0"

Response Format

{
  "total": 1,
  "limit": 100,
  "offset": 0,
  "logs": [
    {
      "id": 1,
      "run_id": "abc123",
      "timestamp": "2026-04-01T10:30:00",
      "question": "如何查看通知",
      "answer": "topiclab notifications list --json",
      "endpoint": "/run",
      "method": "POST",
      "status": "success",
      "response_time_ms": 1500,
      "runtime_metadata": {
        "topiclab_cli_version": "0.3.2",
        "skill_version": "4c7e3e27d70cf622",
        "skill_updated_at": "2026-03-31T02:33:04+00:00",
        "agent_uid": "oc_81da4f29882f7b4a",
        "agent_display_name": "OpenClaw Guest 699d's openclaw",
        "agent_handle": "openclaw_guest_699d_openclaw",
        "agent_status": "active"
      }
    }
  ]
}
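
The filters and response shape above can also be consumed from a script. The helpers below are a sketch (the base URL is the local default from the examples, and the function names are illustrative): one builds a filtered `/logs` URL, the other averages latency over a returned page.

```python
from urllib.parse import urlencode

BASE = "http://localhost:8000/logs"  # adjust host/port for your deployment

def logs_url(**filters) -> str:
    """Build a filtered /logs URL from the query parameters shown above."""
    return f"{BASE}?{urlencode(filters)}" if filters else BASE

def average_latency(response: dict) -> float:
    """Mean response_time_ms across one page of logs."""
    times = [log["response_time_ms"] for log in response["logs"]]
    return sum(times) / len(times) if times else 0.0
```

For example, `logs_url(status="success", limit=50)` reproduces the curl filter examples, and `average_latency` can be applied to the JSON body returned by `/logs`.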

Get Statistics

curl http://localhost:8000/logs/stats

Response Format

{
  "total_requests": 100,
  "status_counts": {
    "success": 95,
    "error": 5
  },
  "avg_response_time_ms": 1200,
  "unique_agents": 3
}
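
A derived metric such as the error rate follows directly from this payload. A minimal sketch (function name is illustrative):

```python
def error_rate(stats: dict) -> float:
    """Fraction of requests that errored, from the /logs/stats payload."""
    total = stats.get("total_requests", 0)
    errors = stats.get("status_counts", {}).get("error", 0)
    return errors / total if total else 0.0
```

Applied to the sample response above, this yields 5 / 100 = 0.05.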

Query Examples

Via SQLite CLI

# Connect to the database
sqlite3 data/topiclab.db

Then, inside the sqlite3 shell (SQL comments use --, not #):

-- Get recent requests
SELECT id, question, status, response_time_ms
FROM request_logs
ORDER BY timestamp DESC LIMIT 10;

-- Count by CLI version
SELECT topiclab_cli_version, COUNT(*)
FROM request_logs
GROUP BY topiclab_cli_version;

-- Get unique agents
SELECT DISTINCT agent_uid, agent_display_name
FROM request_logs
WHERE agent_uid IS NOT NULL;

-- Average response time by status
SELECT status, AVG(response_time_ms)
FROM request_logs
GROUP BY status;
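
The same queries can be scripted with Python's stdlib sqlite3 module, which is useful for recurring reports. The snippet below runs the version-count query against an in-memory database seeded with sample rows; point the connection at data/topiclab.db to query real logs:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use data/topiclab.db in practice
conn.execute(
    "CREATE TABLE request_logs (topiclab_cli_version TEXT, status TEXT, response_time_ms INTEGER)"
)
conn.executemany(
    "INSERT INTO request_logs VALUES (?, ?, ?)",
    [("0.3.2", "success", 1200), ("0.3.2", "error", 300), ("0.3.1", "success", 900)],
)

# Count by CLI version, mirroring the sqlite3 CLI query above
by_version = dict(
    conn.execute(
        "SELECT topiclab_cli_version, COUNT(*) FROM request_logs GROUP BY topiclab_cli_version"
    ).fetchall()
)
```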

Use Cases

1. Monitor Usage

# Total logged requests
curl "http://localhost:8000/logs" | jq '.total'

# Status counts (error rate = error / total)
curl "http://localhost:8000/logs/stats" | jq '.status_counts'

2. Track Agent Usage

# Filter by specific agent
curl "http://localhost:8000/logs?agent_uid=oc_81da4f29882f7b4a"

# Get all unique agents
sqlite3 data/topiclab.db "SELECT DISTINCT agent_display_name FROM request_logs"

3. Version Tracking

# Check which CLI versions are being used
curl "http://localhost:8000/logs" | jq '.logs[].runtime_metadata.topiclab_cli_version' | sort | uniq -c

4. Performance Analysis

# Get stats
curl "http://localhost:8000/logs/stats"

# Query slow requests
sqlite3 data/topiclab.db "SELECT question, response_time_ms FROM request_logs WHERE response_time_ms > 5000"

Backup and Restore

SQLite Backup

# Simple file copy
cp data/topiclab.db data/topiclab.db.backup

# Or use SQLite's backup command
sqlite3 data/topiclab.db ".backup 'data/topiclab.db.backup'"

Restore

cp data/topiclab.db.backup data/topiclab.db
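
Plain `cp` can copy a half-written page if the server is writing at that moment; SQLite's backup API takes a consistent snapshot even while the database is in use. A sketch using the stdlib (function name is illustrative):

```python
import sqlite3

def backup_sqlite(src_path: str, dest_path: str) -> None:
    """Consistent online backup via SQLite's backup API."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    try:
        src.backup(dest)  # copies all pages atomically with respect to writers
    finally:
        src.close()
        dest.close()

# Example: backup_sqlite("data/topiclab.db", "data/topiclab.db.backup")
```

This is equivalent to the `.backup` dot-command shown above, but callable from a cron job or maintenance script.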

Migration

When new columns are added to the schema, the application creates them automatically on startup. For breaking schema changes, back up first and reinitialize:

# Backup first
cp data/topiclab.db data/topiclab.db.backup

# Reinitialize (will lose data!)
rm data/topiclab.db
python scripts/init_database.py

Troubleshooting

Database Not Created

  1. Check write permissions in the project directory
  2. Manually create the data/ directory: mkdir -p data
  3. Run the initialization script: python scripts/init_database.py

No Logs Being Recorded

  1. Check application logs for database errors
  2. Verify the request_logs table exists
  3. Ensure the request payload follows the expected format (a messages array plus optional runtime_metadata)

Missing Runtime Metadata

  1. Verify the request includes runtime_metadata field
  2. Check the field names match expected format
  3. Review application logs for parsing errors