feat(ui): migrate to Vite React with Heidi backend integration (#78)

heidi-dang merged 1 commit into main
Conversation
- Replace legacy UI with Vite-based React application (port 3002)
- Implement Heidi API client layer (src/api/heidi.ts) with typed methods: health(), listAgents(), listRuns(), getRun(), runOnce(), runLoop(), chat(), cancelRun()
- Add SSE streaming support (src/api/stream.ts) with polling fallback
- Configure Vite: allowedHosts for heidiai.com.au, proxy to backend :7777
- Add CLI commands: heidi ui build, heidi ui status, heidi ui path
- Serve UI at /ui/ with SPA fallback routing
- Include ui_dist in Python package via setuptools package-data
- Add CI ui-build job with Node 20 and npm caching
- Update README with dev/prod workflow and port reference
- Add worklog entry (2026-02-17) and safe .local/ gitignore policy

Testing:
- Clean install smoke test passed (pip install -> heidi ui build -> heidi serve)
- UI loads at /ui/, assets serve correctly, SPA routing works
- No built artifacts committed; source UI only in git

Relates to: dev_1_ui_migrator, dev_3_packaging_release
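The Vite configuration described above can be sketched roughly as follows. This is an illustrative sketch, not the file from the PR: the `/api` proxy prefix is an assumption (the description only states the backend port), and comments mark the other assumed details.

```typescript
// vite.config.ts -- illustrative sketch, not the exact file from the PR
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  plugins: [react()],
  base: '/ui/',                        // assets resolve under /ui/ when served by heidi
  server: {
    port: 3002,                        // dev-server port from the PR description
    allowedHosts: ['heidiai.com.au'],  // permit the public hostname in dev
    proxy: {
      // assumed API prefix; forwards dev requests to the backend on :7777
      '/api': { target: 'http://localhost:7777', changeOrigin: true },
    },
  },
});
```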
Summary of Changes

Hello @heidi-dang, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request introduces a significant overhaul of the Heidi CLI's user interface, transitioning it to a modern Vite-based React application. The changes encompass a complete restructuring of the UI, robust integration with the Heidi backend API for all core functionalities, and the establishment of a streamlined packaging and continuous integration pipeline. This migration aims to provide a more responsive and maintainable frontend experience, while ensuring proper deployment and operational workflows.
Code Review
This pull request successfully migrates the UI to a modern Vite-based React application and integrates it with the Heidi backend. However, a critical security concern was identified: the API key is transmitted as a query parameter in the SSE stream URL, which could lead to credential leakage. It is recommended to use a more secure method for transmitting sensitive information, such as a fetch-based streaming approach with headers or short-lived tokens. Additionally, I've identified a few other issues, including a significant discrepancy in the ui/README.md that could mislead developers, a potential bug in the event streaming fallback logic, and a minor typo in a worklog file. Overall, this is a great step forward for the project, and addressing these points will further enhance its quality and security.
```typescript
export const getStreamUrl = (runId: string, apiKey?: string): string => {
  const baseUrl = getBaseUrl();
  const key = apiKey || getSettings().apiKey;
  const url = `${baseUrl}/runs/${runId}/stream`;
  return key ? `${url}?key=${encodeURIComponent(key)}` : url;
};
```
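The header-based alternative the review recommends can be sketched with `fetch()` and a manual stream reader. This is a minimal sketch under assumed names (the `X-API-Key` header and plain `data:`-prefixed SSE lines are illustrative assumptions), not the project's actual implementation:

```typescript
// Split a decoded SSE buffer into complete `data:` payloads plus the
// unfinished remainder to carry into the next read.
function parseSseBuffer(buffer: string): { events: string[]; rest: string } {
  const lines = buffer.split('\n');
  const rest = lines.pop() ?? '';
  const events = lines
    .filter((line) => line.startsWith('data: '))
    .map((line) => line.slice('data: '.length));
  return { events, rest };
}

// Stream run events with the API key in a request header, so it never
// appears in URLs, proxy logs, or browser history.
async function streamRun(
  baseUrl: string,
  runId: string,
  apiKey: string,
  onEvent: (payload: string) => void,
): Promise<void> {
  const res = await fetch(`${baseUrl}/runs/${encodeURIComponent(runId)}/stream`, {
    headers: { 'X-API-Key': apiKey, Accept: 'text/event-stream' },
  });
  if (!res.ok || !res.body) throw new Error(`stream failed: ${res.status}`);
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const parsed = parseSseBuffer(buffer);
    buffer = parsed.rest;
    parsed.events.forEach(onEvent);
  }
}
```

The trade-off is losing `EventSource`'s built-in auto-reconnect, which would need to be reimplemented; short-lived tokens in the URL are the other option the review mentions.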
# 🌌 HEIFI Neural Interface: Matrix v3

HEIFI is a high-fidelity, enterprise-grade command center that bridges Large Language Models with infrastructure orchestration. Designed for the "Power Operator," it offers a unified interface for AI reasoning and autonomous SSH workflows.



## 🎨 Design Language & Theming

The **HEIFI Matrix v3** aesthetic is built on a "Premium Industrial" design language, prioritizing high-density information without visual clutter.

### Color Palette
- **Obsidian Backbone**: `#000000` (Pure Black) for maximum contrast and OLED optimization.
- **Neural Accents**: `#1d9bf0` (Grok Blue) for primary interactive elements and active neural links.
- **Semantic Feedback**:
  - `Success`: `#10b981` (Emerald) for successful handshakes and stable links.
  - `Error`: `#ef4444` (Rose Red) for critical faults and security breaches.
- **Typography**:
  - `Sans`: **Inter** for clean, readable UI labels.
  - `Mono`: **Geist Mono** for code, terminal streams, and low-level logical identifiers.

### Visual Effects
- **Glassmorphism 2.0**: Uses deep `backdrop-blur(45px)` combined with low-opacity backgrounds (`rgba(10, 10, 10, 0.15)`) to create a sense of depth and focus.
- **Ambient Mesh**: A fixed radial gradient system (`radial-gradient`) simulates an ambient glow emanating from the corners of the interface, reducing eye strain in dark environments.
- **Glow-Pulse Feedback**: Interactive cards use a custom `glowPulse` animation that shifts border-color and box-shadow based on system state.

## 🚀 Core Technology Stack

### Intelligence Layer
- **Google Gemini 3 API**: Specifically **Gemini 3 Pro Preview** for its superior tool-calling accuracy and **Gemini 2.5 Flash** for high-speed advisory tasks.
- **Thinking Budget**: Leverages the model's reasoning tokens (up to 32k) for "Deep Mode" architectural planning.
- **Local Neural Bridge**: Native integration for **Ollama** and **LM Studio** via a custom local proxy.

### Orchestration & Communication
- **Real-time Streams**: **Socket.io** handles the low-latency duplex stream between the browser and remote SSH targets.
- **SSH Logic**: Powered by the **ssh2** library on the backend, supporting RSA/Ed25519 keys and password-based auth.
- **Terminal UI**: **Xterm.js** with `FitAddon` and `WebLinksAddon` for a native-feeling TTY experience.

### Frontend Architecture
- **React 19**: Utilizing the latest concurrent rendering features.
- **Tailwind CSS**: Custom configuration extending Grok-style spacing and animations.
- **React Virtuoso**: High-performance virtualization for infinite chat history and agent reasoning logs.
- **Lucide Icons**: Feather-weight vector iconography for clear operational signals.

## 🏗️ Architectural Breakdown

### 1. The Autonomous Loop
When in **Agent Mode**, HEIFI enters a recursive reasoning loop:
1. **Perception**: The model analyzes the user's objective and current file system state.
2. **Planning**: It emits `functionCall` objects (e.g., `list_dir`, `read_file`).
3. **Validation**: The UI presents these actions to the operator (Human-in-the-Loop).
4. **Execution**: The backend executes the command via the established SSH socket.
5. **Synthesis**: The output is fed back into the model to refine the next step.

### 2. Security Enclave
- **Zero-Knowledge Auth**: Credentials (SSH keys/passwords) are held in ephemeral memory and never persisted to a database.
- **RSA-4096 Encryption**: All device-flow links (GitHub/Copilot) use standard OAuth 2.0 security protocols.
- **Termination Purge**: Clicking "Terminate Node" triggers a complete memory sweep of the socket session on the backend.

## 🛠️ Operational Commands

| Command | Description |
| :--- | :--- |
| `npm run dev` | Initialize local neural hub and next.js server. |
| `npm run build` | Compile the matrix for production deployment. |
| `docker-compose up` | Launch the fully containerized HEIFI stack. |

---
*HEIFI: The thin layer between human intent and machine execution.*
The content of this README appears to be for a different or future version of the project. It mentions technologies like Socket.io, ssh2, and Next.js, which the pull request description states were intentionally removed. This discrepancy can be very misleading for developers. Please update this file to accurately reflect the current state of the UI architecture as implemented in this pull request (e.g., Vite, React, SSE for streaming).
```typescript
// Send new events
details.events.forEach((event, index) => {
  // Simple deduplication: only send last event if it's new
  if (index === details.events.length - 1) {
    onEvent?.(event);
  }
});
```
The polling fallback logic is flawed. It only processes the last event from the fetched list in each polling interval (`if (index === details.events.length - 1)`). If multiple events occur between polls, all but the last one will be missed by the UI.
To fix this, you should process all events that are new since the last poll. You can achieve this by tracking the number of events already processed.
First, declare a counter at the top of the `subscribeRunStream` function:

```typescript
let lastEventCount = 0;
```

Then, update the polling logic to process only new events:

```typescript
// Process only new events
if (details.events.length > lastEventCount) {
  const newEvents = details.events.slice(lastEventCount);
  newEvents.forEach(event => onEvent?.(event));
  lastEventCount = details.events.length;
}
```
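The counter-based deduplication suggested above can be exercised in isolation. A minimal self-contained sketch (all names illustrative) that simulates two polls over a growing event list:

```typescript
type RunEvent = { seq: number; msg: string };

// Minimal poller state: forwards only events not yet seen, tracked by count.
function makeEventForwarder(onEvent: (e: RunEvent) => void) {
  let lastEventCount = 0;
  return (events: RunEvent[]): void => {
    if (events.length > lastEventCount) {
      events.slice(lastEventCount).forEach(onEvent);
      lastEventCount = events.length;
    }
  };
}

// Simulate two polls: the second brings two new events; both are forwarded.
const seen: number[] = [];
const forward = makeEventForwarder((e) => seen.push(e.seq));
forward([{ seq: 1, msg: 'start' }]);
forward([
  { seq: 1, msg: 'start' },
  { seq: 2, msg: 'step' },
  { seq: 3, msg: 'done' },
]);
// seen is now [1, 2, 3]
```

This assumes the backend returns the full event list on each poll in a stable order, which is what the `details.events` access in the original code implies.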
# Heidi CLI UI Migration Worklog

## Date: 2026-02-17

## Part 1: UI Migration + Backend Integration

What Changed:
- `ui/src/` with components, API layer, and types
- API client (`src/api/heidi.ts`): health, listAgents, listRuns, getRun, runOnce, runLoop, chat, cancelRun
- Streaming (`src/api/stream.ts`): SSE with polling fallback

What Did NOT Change (Intentionally Excluded):

## Part 2: Packaging, Serving, CI, and Documentation

What Changed:
- `heidi ui build`: builds with `--base=/ui/`, outputs to `~/.cache/heidi/ui/dist/`
- `/ui/` served with SPA fallback routing (redirects `/ui` → `/ui/`, serves index.html for unknown paths)
- `pyproject.toml`: includes `ui_dist/**/*` in package-data for wheel/sdist
- `.github/workflows/ci.yml`: added `ui-build` job (Node 20, npm caching, verifies dist artifacts)
- `README.md`: added Web UI section with dev/prod workflows and port reference table
- `.local/worklog`: added with migration summary and date (2026-02-17)
- `.gitignore`: granular policy for `.local/` (tracks docs/worklog, ignores secrets/state/cache)

How to Test:

Port Reference:

Backward Compatibility:
- `HEIDI_UI_DIST` can override dist location

Commit SHA: 5b4ee55

Relates to: dev_1_ui_migrator, dev_3_packaging_release
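The typed client surface listed in this worklog can be sketched as a small fetch wrapper. Endpoint paths, response shapes, and the injectable `doFetch` parameter are illustrative assumptions, not the actual `src/api/heidi.ts` implementation:

```typescript
// Hedged sketch of the typed API client surface; names are illustrative.
interface RunSummary {
  id: string;
  status: 'pending' | 'running' | 'done' | 'error';
}

// Pure helper so run IDs are always URL-safe.
function runPath(runId: string): string {
  return `/runs/${encodeURIComponent(runId)}`;
}

function makeHeidiClient(baseUrl: string, doFetch: typeof fetch = fetch) {
  const getJson = async <T>(path: string): Promise<T> => {
    const res = await doFetch(`${baseUrl}${path}`);
    if (!res.ok) throw new Error(`GET ${path} failed: ${res.status}`);
    return res.json() as Promise<T>;
  };
  return {
    health: () => getJson<{ ok: boolean }>('/health'),
    listAgents: () => getJson<string[]>('/agents'),
    listRuns: () => getJson<RunSummary[]>('/runs'),
    getRun: (runId: string) => getJson<RunSummary>(runPath(runId)),
  };
}
```

Accepting `doFetch` as a parameter keeps the client testable without a live backend, which matches the clean-install smoke-test workflow described above.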