48 changes: 48 additions & 0 deletions c1-viz-data/.gitignore
@@ -0,0 +1,48 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# dependencies
/node_modules
/.pnp
.pnp.*
.yarn/*
!.yarn/patches
!.yarn/plugins
!.yarn/releases
!.yarn/versions

# testing
/coverage

# next.js
/.next/
/out/

# production
/build

# misc
.DS_Store
*.pem

# debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*
.pnpm-debug.log*

# env files (can opt-in for committing if needed)
.env*

# vercel
.vercel

# typescript
*.tsbuildinfo
next-env.d.ts

# Prisma SQLite database file
prisma/dev.db
prisma/dev.db-journal

# Generated Prisma Client
src/generated/prisma/
187 changes: 187 additions & 0 deletions c1-viz-data/README.md
@@ -0,0 +1,187 @@
# C1 Visualize Data Analyst

An example Next.js project using the Thesys GenUI SDK and the C1 Visualize API, demonstrating how to build a data analyst assistant that queries Supabase databases over the Model Context Protocol (MCP), visualizes the results, and persists chat history.

[![Built with Thesys](https://thesys.dev/built-with-thesys-badge.svg)](https://thesys.dev)

### Overview

This is a data analyst assistant that helps users query, analyze, and visualize data from Supabase databases. The overall flow:

1. The application sends a request to an LLM (OpenAI) to generate a text/markdown response.
2. The LLM uses Model Context Protocol (MCP) tools to query a live Supabase database.
3. The application sends the final LLM response (after any tool calls) to the C1 Visualize API, which converts it into a Generative UI response with visualizations.
4. The complete interaction history with the LLM (requests, responses, and tool-call messages) is stored to maintain context for future interactions.
5. User messages and the corresponding assistant Generative UI responses are stored in the Thread Store so the conversation displays and persists across page refreshes.
6. Raw query results are displayed alongside the visualizations for transparency.

```mermaid
sequenceDiagram
    participant User
    participant App as Application
    participant LLM as LLM (OpenAI)
    participant MCP as MCP/Supabase
    participant C1 as C1 Visualize API
    participant Store as ThreadStore

    User->>App: Sends message
    App->>LLM: Sends request (with history)
    Note over LLM: Processes query, calls MCP tools
    LLM->>MCP: Executes database queries
    MCP-->>LLM: Returns query results
    LLM-->>App: Returns final text response
    App->>Store: Stores LLM interaction history (AIMessage)
    App->>C1: Sends final text response
    C1-->>App: Returns GenUI response
    App->>Store: Stores user message & GenUI response (UIMessage)
    App-->>User: Displays GenUI response & raw data
```
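The hand-off from OpenAI to the Visualize endpoint (steps 3 and onward above) can be sketched as a plain payload transformation. The helper name and exact payload shape below are illustrative assumptions, not this repo's implementation; what is documented (see `Visualize.md`) is that the endpoint accepts Chat Completions-style messages and visualizes the last assistant message, with earlier messages as optional context.

```typescript
// Hypothetical sketch: build the body sent to POST /v1/embed/chat/completions.
// The Visualize API renders the LAST assistant message; earlier messages are
// optional context. Names and shapes here are illustrative assumptions.
type ChatMessage = { role: "user" | "assistant"; content: string };

export function buildVisualizePayload(
  history: ChatMessage[],
  finalAssistantText: string,
  stream = true
) {
  return {
    messages: [
      ...history,
      // The message the Visualize API will actually turn into UI:
      { role: "assistant" as const, content: finalAssistantText },
    ],
    stream,
  };
}
```

A `fetch` to the endpoint with an `Authorization: Bearer <THESYS_API_KEY>` header would then carry this body; streaming chunks come back when `stream` is true.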

### Setup Instructions

1. Copy `example.env` to `.env` and configure the required environment variables:

   ```bash
   cp example.env .env
   ```

   Required environment variables:

   - `THESYS_API_KEY`: Your Thesys API key for C1 Visualize
   - `OPENAI_API_KEY`: Your OpenAI API key
   - `MCP_SERVER_URL`: URL of your Supabase MCP server
   - `SUPABASE_ACCESS_TOKEN`: Access token for Supabase MCP authentication

2. Install dependencies:

   ```bash
   npm install
   ```

3. Set up the database:

   ```bash
   npm run prisma:migrate
   npm run prisma:generate
   ```

4. Run the development server:

   ```bash
   npm run dev
   ```

## Project Structure

```
├── src/
│   ├── app/
│   │   ├── api/           # API routes
│   │   ├── page.tsx       # Main page component
│   │   ├── layout.tsx     # Root layout
│   │   └── globals.css    # Global styles
│   ├── generated/         # Generated code
│   ├── services/          # Service layer
│   └── apiClient.ts       # API client configuration
├── prisma/                # Database schema and migrations
├── public/                # Static assets
├── .env                   # Environment variables
├── example.env            # Example environment variables
├── next.config.ts         # Next.js configuration
├── package.json           # Project dependencies
├── tsconfig.json          # TypeScript configuration
└── README.md              # Project documentation
```

## Features

This example is a data analyst assistant built with Next.js, Thesys GenUI SDK, Prisma, and Model Context Protocol (MCP). It allows users to:

- Query live Supabase databases using natural language
- Generate AI-powered visualizations from query results
- View raw data alongside visualizations
- Persist chat history across sessions
- Maintain conversation context for follow-up questions

### Example Prompts

```
Show me sales by product category
What are the top 5 best-selling products?
List all customers by country
Show me sales trends over time
```

### MCP Tools (`src/app/api/chat/tools.ts`)

The OpenAI assistant connects to Supabase via Model Context Protocol (MCP) and has access to Supabase MCP tools, which are dynamically discovered from the MCP server. Common tools include:

1. **`list_tables`**:

   - **Description**: Lists all available tables in the Supabase database and their schemas.
   - **Parameters**: Optional schema names.
   - **Returns**: List of tables with their structure.

2. **`execute_sql`**:

   - **Description**: Executes SQL queries against the Supabase database.
   - **Parameters**: SQL query string.
   - **Returns**: Query results as JSON.

3. **Other Supabase MCP tools**:

   - Additional tools are dynamically loaded from the MCP server based on your Supabase configuration.
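Dynamic discovery boils down to mapping MCP tool descriptors onto OpenAI function-tool definitions. The sketch below is a hedged illustration, not the repo's code: the descriptor fields (`name`, `description`, `inputSchema`) follow the MCP tool-listing shape, and the output matches the OpenAI Chat Completions `tools` format.

```typescript
// Sketch: convert tools discovered from an MCP server into the
// function-tool format the OpenAI Chat Completions API expects.
// Field names are assumptions based on the MCP tool-listing shape.
type McpTool = {
  name: string;
  description?: string;
  inputSchema: Record<string, unknown>; // JSON Schema for the tool's arguments
};

export function toOpenAiTools(mcpTools: McpTool[]) {
  return mcpTools.map((t) => ({
    type: "function" as const,
    function: {
      name: t.name,
      description: t.description ?? "",
      parameters: t.inputSchema, // OpenAI also takes JSON Schema here
    },
  }));
}
```

Because both sides speak JSON Schema for arguments, the mapping is mostly a relabeling; whatever the MCP server exposes becomes callable by the model without hand-written definitions.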

### Persistence

This application persists chat conversations to ensure history is maintained across sessions.

- **Database**: Uses Prisma (`prisma/schema.prisma`) to manage the database schema and interact with the underlying database (SQLite in development; note `prisma/dev.db` in `.gitignore`). The schema defines models for `Thread`, `AIMessage`, and `UIMessage`.

- **Backend Logic (`src/services/threadService.ts`)**: This service contains the core functions for interacting with the database via Prisma to:

  - Create new chat threads.
  - Fetch existing threads and their messages.
  - Add new messages (`AIMessage` and `UIMessage`) to a specific thread.

- **API Endpoints (`src/app/api/...`)**:

  - `chat/route.ts`: Orchestrates the chat flow. After receiving responses from both OpenAI (including tool interactions) and Thesys Visualize, it calls `threadService` to save the relevant messages.
  - `thread/**` & `threads/**`: Provide RESTful APIs for the frontend to fetch the list of threads and the messages belonging to a specific thread.

- **Data Stored**: Two types of message history are stored per thread:

  - **`AIMessage`**: Stores the complete conversation history with the primary LLM (OpenAI): the user's prompts, the assistant's intermediate steps, any tool calls made by the assistant (including MCP Supabase queries), and the results those tools returned. This full history is crucial for maintaining context in subsequent calls to the OpenAI API within the same thread.
  - **`UIMessage`**: Stores the messages intended for display in the user interface: the user's original prompt, the final formatted response generated by the Thesys Visualize API (based on the OpenAI assistant's final textual output), and raw data from database queries for display in data tables.

- **Frontend Integration (`src/app/page.tsx`)**:

  - Uses hooks such as `useThreadManager` and `useThreadListManager` from the Thesys GenUI SDK (or custom equivalents).
  - These hooks call the backend API endpoints (`/api/thread/...`, `/api/threads/...`) to fetch the list of threads and the `UIMessage` history for the currently selected thread, rendering the conversation in the chat interface.
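A minimal sketch of what `prisma/schema.prisma` might define for these three models. Field names and types here are illustrative assumptions; the repo's actual schema is authoritative.

```prisma
// Illustrative sketch only; the real schema lives in prisma/schema.prisma.
model Thread {
  id         String      @id @default(uuid())
  title      String?
  createdAt  DateTime    @default(now())
  aiMessages AIMessage[]
  uiMessages UIMessage[]
}

model AIMessage {
  id       String @id @default(uuid())
  threadId String
  thread   Thread @relation(fields: [threadId], references: [id])
  message  String // serialized OpenAI message, incl. tool calls and results
}

model UIMessage {
  id       String @id @default(uuid())
  threadId String
  thread   Thread @relation(fields: [threadId], references: [id])
  message  String // serialized user prompt or GenUI response + raw data
}
```

Keeping `AIMessage` and `UIMessage` as separate tables mirrors the split described above: one stream feeds the LLM context, the other feeds the chat UI.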

## Key Components

Check out the backend code in:

1. **`src/app/api/chat/route.ts`**: Main chat route that orchestrates the flow:

   - Connects to Supabase via MCP
   - Calls OpenAI with MCP tools
   - Processes tool results and extracts raw data
   - Sends final response to C1 Visualize API
   - Persists messages to the database

2. **`src/app/api/chat/tools.ts`**: MCP client implementation:

   - Manages connection to Supabase MCP server
   - Dynamically loads available tools
   - Executes tool calls and formats results

3. **`src/app/api/thread`** and **`src/app/api/threads`**: REST APIs for Thread and Message CRUD operations

4. **`src/services/threadService.ts`**: Service layer for database operations using Prisma

5. **`src/components/RawDataModal.tsx`**: Component for displaying raw query results in a table format

## Additional Resources

- See `USAGE_EXAMPLES.md` for more example queries and use cases
- See `Visualize.md` for details about the visualization capabilities
78 changes: 78 additions & 0 deletions c1-viz-data/USAGE_EXAMPLES.md
@@ -0,0 +1,78 @@
# Data Visualization Examples

This application demonstrates querying data from Supabase, generating AI-powered visualizations with Thesys, and displaying both the visualization and raw data in the frontend.

## Sample Queries to Try

### Sales Analysis

- "Show me the top 5 best-selling products by revenue"
- "What are the total sales by product category?"
- "Show me sales trends over time"

### Product Analysis

- "List all products with their current stock levels"
- "What's the average price by product category?"
- "Show me products that are low in stock"

### Customer Analysis

- "How many customers do we have by country?"
- "Show me customer distribution by city"
- "List recent customer registrations"

### Combined Analysis

- "Show me sales performance with customer and product details"
- "Which customers have made the most purchases?"
- "What products are popular in different countries?"

## How It Works

1. **User Query**: Enter a natural language query about the business data
2. **AI Processing**: OpenAI processes the query and uses Supabase MCP tools to fetch data
3. **Data Analysis**: The AI analyzes the data and provides insights
4. **Visualization**: Thesys Visualize generates charts, graphs, and interactive components
5. **Dual Display**:
   - **C1 Component**: Shows the AI-generated visualization and insights
   - **Data Table**: Displays the raw query results in a structured table
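The "Data Table" half of the dual display implies pulling structured rows out of the tool results. A hedged sketch of that step follows; the message shape and helper name are assumptions for illustration (the repo's actual extraction lives in `src/app/api/chat/route.ts`).

```typescript
// Sketch: collect JSON-array payloads from tool-result messages so the
// frontend can render them as a table. Shapes are illustrative assumptions.
export function extractRawRows(
  messages: Array<{ role: string; content: string }>
): Record<string, unknown>[] {
  const rows: Record<string, unknown>[] = [];
  for (const m of messages) {
    if (m.role !== "tool") continue;
    try {
      const parsed = JSON.parse(m.content);
      if (Array.isArray(parsed)) rows.push(...parsed);
    } catch {
      // Non-JSON tool output (e.g. plain text) is skipped for the table view.
    }
  }
  return rows;
}
```

The visualization and the table then come from the same tool results, which is what keeps the two displays consistent with each other.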

## Database Schema

### Products Table

- `id`: Product ID
- `name`: Product name
- `category`: Product category (Electronics, Furniture, Appliances)
- `price`: Product price
- `stock_quantity`: Current inventory
- `created_at`: Creation timestamp

### Customers Table

- `id`: Customer ID
- `name`: Customer name
- `email`: Customer email
- `city`: Customer city
- `country`: Customer country
- `created_at`: Registration timestamp

### Sales Table

- `id`: Sale ID
- `product_id`: Reference to product
- `customer_id`: Reference to customer
- `quantity`: Items sold
- `unit_price`: Price per unit
- `total_amount`: Total sale amount
- `sale_date`: Sale timestamp
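Given this schema, the SQL the assistant might generate for "top 5 best-selling products by revenue" could look like the following. This is an illustration only: the lowercase table names and the exact query are assumptions, not output captured from the app.

```sql
-- Illustrative only: revenue per product, highest first.
SELECT p.name,
       p.category,
       SUM(s.total_amount) AS revenue
FROM sales AS s
JOIN products AS p ON p.id = s.product_id
GROUP BY p.id, p.name, p.category
ORDER BY revenue DESC
LIMIT 5;
```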

## Features

- **Real-time Data**: Queries live data from Supabase
- **AI-Powered Insights**: OpenAI analyzes data and provides meaningful insights
- **Rich Visualizations**: Thesys generates appropriate charts and graphs
- **Raw Data Access**: View the underlying data in formatted tables
- **Natural Language**: Ask questions in plain English
- **Responsive Design**: Works on desktop and mobile devices
42 changes: 42 additions & 0 deletions c1-viz-data/Visualize.md
@@ -0,0 +1,42 @@
## Thesys Visualize API

### Endpoint

```
POST /v1/embed/chat/completions
```

### Overview
The Thesys Visualize API enables you to generate dynamic user interfaces based on the final output of other LLM services.

- It visualizes the **last assistant message** you provide, turning it into a rich, generative UI.
- It **does not support tool calls**.
- You can **optionally include previous conversation messages** in your request to provide more context for the visualization.
- Designed to work seamlessly with LLM workflows: after your LLM (e.g., OpenAI) completes all tool calls and generates a final response, you send that response (and optionally, the conversation history) to the Visualize API for UI generation.

This makes it easy to turn LLM-generated content into interactive user experiences, without manual UI coding.

```mermaid
sequenceDiagram
    participant User
    participant App as Application
    participant LLM as LLM (OpenAI)
    participant Viz as Visualize API

    User->>App: Sends message
    App->>LLM: Sends request (with history)
    LLM-->>App: Returns final text response (after tool calls)
    App->>Viz: Sends last assistant message (+ optional history)
    Viz-->>App: Returns Generative UI response
    App-->>User: Displays UI response
```

### Request
Supports both streaming and non-streaming payloads.

### Response
Returns streaming chunks in streaming mode or a message object in non-streaming mode.

### Tip
For best results, we recommend including a note in your system prompt to instruct the LLM to structure its final response clearly for UI generation. For example:
> “Your final response will be processed by another assistant to generate a user interface (e.g., product lists, forms). Structure your responses clearly for this purpose. If the user asks for output in a specific component (for example, a graph or table), try your best to generate the output in the requested format so that the other assistant can use it to generate the UI.”
16 changes: 16 additions & 0 deletions c1-viz-data/eslint.config.mjs
@@ -0,0 +1,16 @@
import { dirname } from "path";
import { fileURLToPath } from "url";
import { FlatCompat } from "@eslint/eslintrc";

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

const compat = new FlatCompat({
baseDirectory: __dirname,
});

const eslintConfig = [
...compat.extends("next/core-web-vitals", "next/typescript"),
];

export default eslintConfig;
4 changes: 4 additions & 0 deletions c1-viz-data/example.env
@@ -0,0 +1,4 @@
THESYS_API_KEY=<thesys-api-key>
OPENAI_API_KEY=<openai-api-key>
MCP_SERVER_URL=<mcp-server-url>
SUPABASE_ACCESS_TOKEN=<supabase-access-token>
7 changes: 7 additions & 0 deletions c1-viz-data/next.config.ts
@@ -0,0 +1,7 @@
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
/* config options here */
};

export default nextConfig;