An AI-powered, extensible UI framework built on the OpenShift Console dynamic plugin, enabling unified, intelligent experiences across Red Hat products.
- Node.js 20+ and yarn - For frontend development
- Python 3.12+ (requires >=3.12, <3.14) - For backend (Lightspeed services). See lightspeed-stack requirements
- Go 1.24.6+ - For obs-mcp server. See obs-mcp requirements
- OpenShift CLI (oc) - To connect to a cluster
- Podman 3.2.0+ or Docker - To run the console
- OpenAI API Key - Or compatible LLM provider
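To confirm the toolchain before you start, the version checks below cover each prerequisite (a quick sketch; swap podman for docker if that is what you use):
node --version        # expect v20 or newer
python3 --version     # expect 3.12 or 3.13
go version            # expect go1.24.6 or newer
oc version --client   # confirms the OpenShift CLI is installed
podman --version      # expect 3.2.0+; or: docker --version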
Genie Web Client requires both a frontend (this repo) and a backend (AI service). Follow these steps:
The backend provides AI capabilities. See detailed instructions in backend/README.md.
Quick Start:
# First, clone and start obs-mcp server (terminal 1)
# Make sure you're logged into your OpenShift cluster
oc login
# Clone obs-mcp (one time only, skip if you already have it)
cd ~/Documents/GHRepos # or wherever you keep repos
git clone https://github.com/rhobs/obs-mcp.git
cd obs-mcp
# Start obs-mcp (auto-discovers Prometheus in the cluster)
go run cmd/obs-mcp/main.go --listen 127.0.0.1:9100 --auth-mode kubeconfig --insecure --guardrails none
# Runs on port 9100 - keep running
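# Optional sanity check (a sketch; assumes the nc utility is installed)
nc -z 127.0.0.1 9100 && echo "obs-mcp is listening on 9100"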
# Then in another terminal, setup lightspeed-stack
# Clone lightspeed-stack
cd ~/Documents/GHRepos # or your preferred location
git clone https://github.com/lightspeed-core/lightspeed-stack.git
cd lightspeed-stack
# Copy our configs
cp ~/Documents/GHRepos/genie-web-client/backend/lightspeed-stack/*.yaml .
# Install and start
uv sync
export OPENAI_API_KEY="sk-your-key-here"
uv run python -m src.lightspeed_stack
# Runs on port 8080 - keep this terminal running

In separate terminal windows, run:
Terminal 3: Plugin Dev Server
cd ~/Documents/GHRepos/genie-web-client
yarn install
yarn run start
# Runs on port 9001 - keep running

Terminal 4: OpenShift Console
cd ~/Documents/GHRepos/genie-web-client
oc login # Connect to your cluster
yarn run start-console
# Runs on port 9000 - keep running

Access the app: http://localhost:9000/genie
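Before opening the browser, you can sanity-check that all four processes are listening (a sketch using standard tools; plugin-manifest.json is the manifest the console plugin dev server typically serves, so adjust the path if your setup differs):
nc -z 127.0.0.1 9100 && echo "obs-mcp up"
nc -z 127.0.0.1 8080 && echo "lightspeed-stack up"
curl -sf http://localhost:9001/plugin-manifest.json > /dev/null && echo "plugin dev server up"
curl -sf http://localhost:9000/ > /dev/null && echo "console up"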
Once everything is running, you can test obs-mcp integration with these queries:
- "What alerts are firing in the cluster?"
- "Show me CPU usage metrics"
- "What pods are running in the openshift-monitoring namespace?"
In one terminal window, run:
yarn install
yarn run start
In another terminal window, run:
oc login (requires oc and an OpenShift cluster)
yarn run start-console (requires Docker or podman 3.2.0+)
This will run the OpenShift console in a container connected to the cluster you've logged into. The plugin HTTP server runs on port 9001 with CORS enabled.
Note: Make sure the backend is running (see "Getting Started" section above) for full AI functionality.
Navigate to http://localhost:9000/genie to see the running plugin.
If you are using podman on a Mac with Apple silicon, yarn run start-console
might fail since it runs an amd64 image. You can work around the problem with
qemu-user-static by running these commands:
podman machine ssh
sudo -i
rpm-ostree install qemu-user-static
systemctl reboot

Make sure the Remote Containers extension is installed. This method uses Docker Compose: one container runs the OpenShift console and the other runs the plugin. It requires access to an existing OpenShift cluster. After the initial build, the cached containers let you start developing in seconds.
- Create a dev.env file inside the .devcontainer folder with the correct values for your cluster:

OC_PLUGIN_NAME=console-plugin-template
OC_URL=https://api.example.com:6443
OC_USER=kubeadmin
OC_PASS=<password>

- (Ctrl+Shift+P) => Remote Containers: Open Folder in Container...
- yarn run start
- Navigate to http://localhost:9000/genie
This project uses Jest and React Testing Library for unit testing.
# Run all tests once
yarn test
# Run tests in watch mode (re-runs on file changes)
yarn test:watch
# Run tests with coverage report
yarn test:coverage

Tests should be placed alongside the components they test with a .test.tsx extension. For components with multiple test files, use a __tests__/ directory.
File Organization:
- Single test file: src/components/MyComponent.test.tsx (co-located)
- Multiple test files: src/components/my-component/__tests__/ (organized)
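As a concrete sketch of the second layout (the component and file names here are hypothetical), a component with several test files might look like:
src/components/my-component/
├── MyComponent.tsx
└── __tests__/
    ├── MyComponent.render.test.tsx
    └── MyComponent.interaction.test.tsx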
Example test:
import { render, screen } from '@testing-library/react';
import MyComponent from './MyComponent';
describe('MyComponent', () => {
it('renders correctly', () => {
render(<MyComponent />);
expect(screen.getByText('Expected Text')).toBeInTheDocument();
});
});

Integration tests using Cypress are available. See the integration-tests directory for more details.
# Run Cypress in interactive mode
yarn test-cypress
# Run Cypress in headless mode
yarn test-cypress-headless

If you get this error when starting lightspeed-stack:
ModuleNotFoundError: No module named 'mcp'
Solution: Install the required dependencies:
cd ~/Documents/GHRepos/lightspeed-stack
uv pip install mcp
# Or install all optional dependencies:
uv pip install pandas psycopg2-binary redis aiosqlite pillow "mcp>=1.23.0" scikit-learn pymongo matplotlib

This happens because uv sync only installs the dependencies declared in pyproject.toml, while llama-stack requires additional packages for MCP support.
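After installing, you can confirm the module resolves inside the project environment (a quick sketch; run it from the lightspeed-stack directory):
uv run python -c "import mcp; print('mcp import OK')"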
For backend-specific troubleshooting (port conflicts, API keys, etc.), see backend/README.md.
See CONTRIBUTING.md for guidelines. A PR template is in place (see .github/pull_request_template.md) prompting for a summary and testing details.