InnerBoard-local

A 100% offline meeting prep assistant that turns your console history into team/manager-ready updates.

InnerBoard-local analyzes your terminal sessions and produces concise, professional talking points for your next standup or 1:1. Everything runs locally on your machine; no data ever leaves your device.

🎯 What It Does

  • Record Console Sessions: Capture interactive terminal activity with timing
  • Extract Insights (SRE): Turn raw logs into structured successes, blockers, and resources
  • Compose Meeting Prep (MAC): Generate team/manager updates and concrete recommendations
  • 100% Local: Uses local LLMs (Ollama); no data leaves your device
  • Modern CLI: Beautiful terminal interface with progress indicators and rich formatting

πŸ—οΈ Architecture

┌─────────────────┐   process    ┌──────────────────┐   generate    ┌─────────────────┐
│  Terminal Logs  │ ────────────▶│  SRE Sessions    │ ────────────▶ │  MAC Meeting    │
│  (raw console)  │              │  (structured     │               │  Prep Output    │
└─────────────────┘              │   insights)      │               └─────────────────┘
         │                       └──────────────────┘                        │
         │                                │                                  │
         ▼                                ▼                                  ▼
┌─────────────────┐              ┌──────────────────┐              ┌─────────────────┐
│ Encrypted Vault │◀─────────────│ Local Ollama     │─────────────▶│ Team Updates    │
│ (SQLite+Fernet) │              │ Model Processing │              │ Recommendations │
└─────────────────┘              │ (localhost:11434)│              │ Manager Reports │
                                 └──────────────────┘              └─────────────────┘
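The flow above can be sketched as two transformations, raw log in, meeting prep out. All function names and heuristics below are illustrative placeholders, not InnerBoard's actual API:

```python
# Hypothetical sketch of the SRE -> MAC pipeline from the diagram.
# The real extraction is LLM-driven; these stubs only show the data flow.

def extract_insights(raw_log: str) -> dict:
    # "SRE" step: turn raw console lines into structured insights.
    lines = [ln for ln in raw_log.splitlines() if ln.strip()]
    return {
        "summary": f"Session with {len(lines)} logged lines",
        "key_successes": [ln for ln in lines if "ok" in ln.lower()],
        "blockers": [ln for ln in lines if "error" in ln.lower()],
    }

def compose_meeting_prep(insights: dict) -> dict:
    # "MAC" step: turn structured insights into talking points.
    return {
        "team_update": [insights["summary"]] + insights["key_successes"],
        "manager_update": insights["blockers"] or ["No current blockers"],
    }

prep = compose_meeting_prep(extract_insights("make test ... ok\nnpm build error"))
```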

✨ Features

🔒 Industry-Grade Security

  • PBKDF2 Key Derivation: 100,000 iterations for cryptographically secure key generation
  • Fernet AES-128 Encryption: Industry-standard encryption for all stored data
  • Password-Protected Keys: Optional password encryption of master keys with integrity validation
  • Input Validation: Comprehensive protection against SQL injection, XSS, path traversal attacks
  • Data Integrity: SHA256 checksums ensure data hasn't been tampered with
  • Network Isolation: Only allows loopback connections to Ollama (port 11434)
  • Secure File Deletion: Overwrites sensitive files before deletion
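A minimal sketch of the key-derivation scheme described above: PBKDF2-HMAC-SHA256 at 100,000 iterations producing a key in the format Fernet expects. The parameters mirror the bullet points; the helper itself is illustrative, not InnerBoard's internal code:

```python
import base64
import hashlib
import os

def derive_fernet_key(password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256, 100,000 iterations -> 32-byte key,
    # urlsafe-base64 encoded as cryptography.fernet.Fernet expects.
    raw = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000, dklen=32)
    return base64.urlsafe_b64encode(raw)

salt = os.urandom(16)           # stored alongside the encrypted key material
key = derive_fernet_key("my vault password", salt)
assert len(key) == 44           # 32 raw bytes -> 44 base64 characters
```

The resulting `key` can be handed to `cryptography.fernet.Fernet` to encrypt and decrypt vault contents.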

🧠 AI-Powered Analysis

  • SRE (Structured Reflection Extraction): Advanced AI extracts structured insights
    • Key successes identification with specifics and context
    • Blocker identification with impact assessment and resolution hints
    • Resource needs assessment and recommendations
    • Session summaries with actionable details
  • MAC (Meeting Advice Composer): Generates professional meeting prep content
    • Team updates with progress highlights and next focus areas
    • Manager updates with outcomes, risks, and resource needs
    • Concrete recommendations with actionable next steps
    • Multi-session synthesis for comprehensive reporting
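The SRE output shape (see the sample JSON near the end of this README) can be modeled roughly as dataclasses. Field names match the sample output; the classes themselves are an illustrative sketch, not the project's actual schema code:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class KeySuccess:
    desc: str              # what went well
    specifics: str         # the concrete command or artifact
    adjacent_context: str  # surrounding detail worth mentioning

@dataclass
class SRESession:
    summary: str
    key_successes: List[KeySuccess] = field(default_factory=list)
    blockers: List[str] = field(default_factory=list)
    resources: List[str] = field(default_factory=list)

session = SRESession(
    summary="Investigated auth service",
    key_successes=[KeySuccess("Accessed cluster", "kubectl get pods", "pod stable")],
)
```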

🎨 User-Friendly Low-Touch Experience

  • Rich Terminal Interface: Beautiful tables, progress bars, and color-coded output
  • Comprehensive Help: Detailed command help and usage guidance
  • Error Handling: User-friendly messages with actionable solutions
  • Progress Indicators: Real-time feedback during AI processing
  • Multi-format Display: Tables, panels, structured data, and formatted text

⚡ High Performance

  • Intelligent Caching: TTL-based caching (responses, models, reflections)
  • Connection Pooling: Efficient Ollama client management (max 5 connections)
  • SQLite Optimization: WAL mode, foreign keys, indexed queries
  • GPU Acceleration: Leverages platform capabilities via Ollama
  • Memory Management: Automatic cache cleanup and size limits
  • Thread-Safe Operations: All caching and connections are thread-safe
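A thread-safe TTL cache along the lines described above might look like the following. This is a simplified sketch under assumed semantics (lazy expiry, oldest-first eviction), not the project's actual cache implementation:

```python
import threading
import time

class TTLCache:
    """Tiny thread-safe cache with per-entry expiry and a size cap."""

    def __init__(self, ttl_seconds: float, max_size: int = 128):
        self.ttl = ttl_seconds
        self.max_size = max_size
        self._store = {}          # key -> (value, expires_at)
        self._lock = threading.Lock()

    def get(self, key):
        with self._lock:
            entry = self._store.get(key)
            if entry is None:
                return None
            value, expires_at = entry
            if time.monotonic() > expires_at:
                del self._store[key]   # lazily evict expired entries
                return None
            return value

    def set(self, key, value):
        with self._lock:
            if len(self._store) >= self.max_size:
                # Evict the oldest inserted entry (dicts preserve order).
                self._store.pop(next(iter(self._store)))
            self._store[key] = (value, time.monotonic() + self.ttl)
```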

⚙️ Flexible Configuration

  • Environment Variables: Full configuration via env vars and .env files
  • Auto-loading: .env files loaded automatically with python-dotenv
  • Dynamic Model Switching: Easy switching between Ollama models
  • Validation: Configuration validation with helpful error messages
  • Runtime Updates: Configuration changes take effect immediately
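Configuration lookup with `.env` auto-loading might be sketched like this. The project uses python-dotenv; the stand-in parser below is stdlib-only, and the `INNERBOARD_MODEL` name and its default are illustrative assumptions (only `INNERBOARD_KEY_PASSWORD` is documented elsewhere in this README):

```python
import os
from pathlib import Path

DEFAULTS = {"INNERBOARD_MODEL": "gpt-oss:20b"}  # hypothetical setting name

def load_dotenv_simple(path: str = ".env") -> None:
    # Minimal stand-in for python-dotenv: KEY=VALUE lines, '#' comments,
    # existing environment variables win over file values.
    p = Path(path)
    if not p.exists():
        return
    for line in p.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip().strip('"'))

def get_setting(name: str) -> str:
    return os.environ.get(name, DEFAULTS.get(name, ""))
```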

🚀 Getting Started

Choose the installation method that works best for you:

⚡ Quick Start (Recommended: Native Install)

Run one command per OS to install natively with a virtual environment and local Ollama:

macOS / Linux:

curl -fsSL "https://raw.githubusercontent.com/ramper-labs/InnerBoard-local/main/quickstart.sh" | bash

Windows (PowerShell):

iwr "https://raw.githubusercontent.com/ramper-labs/InnerBoard-local/main/quickstart.ps1" -UseBasicParsing | iex

The script will:

  • ✅ Create/activate a Python virtual environment
  • ✅ Install InnerBoard-local
  • ✅ Ensure Ollama is installed and running locally
  • ✅ Pull the default AI model (gpt-oss:20b)
  • ✅ Initialize your encrypted vault
  • ✅ Run health checks

During initialization you'll be prompted to set a password for your encrypted vault. Optionally, the quickstart can save your password to .env (plaintext) for convenience; you can decline if you prefer not to store it.

That's it! You're ready to start using InnerBoard-local.


📦 Alternative Installation Methods

Option A: Manual Installation

Steps:

# 1. Clone the repository
git clone https://github.com/ramper-labs/InnerBoard-local.git
cd InnerBoard-local

# 2. Environment
python3 -m venv .venv
source .venv/bin/activate

# 3. Install InnerBoard
pip install -e .

# 4. Run automated setup
innerboard setup

Option B: Docker Installation (Flaky at the moment)

# 1. Clone the repository
git clone https://github.com/ramper-labs/InnerBoard-local.git
cd InnerBoard-local

# 2. Run Docker setup script
./docker-setup.sh

Or manually:

# Build and start services
docker-compose up -d

# Initialize InnerBoard (run once)
docker-compose exec innerboard innerboard init

🔧 Post-Installation Setup

After installation, verify everything is working:

# If using virtual environment (created automatically on some systems):
source innerboard_venv/bin/activate

# Check system health
innerboard health

# View available commands
innerboard --help

Note: If the quickstart script created a virtual environment (innerboard_venv), you'll need to activate it each time you want to use InnerBoard-local:

# Activate virtual environment
source innerboard_venv/bin/activate

# Use InnerBoard normally
innerboard add "My reflection"
innerboard list

# Deactivate when done
deactivate

Optional: Add this to your ~/.bashrc for convenience:

echo "alias innerboard='source $PWD/innerboard_venv/bin/activate && innerboard'" >> ~/.bashrc
source ~/.bashrc

System Requirements:

  • RAM: 8GB minimum, 16GB recommended
  • Storage: 20GB free space for AI models
  • OS: Linux, macOS, or Windows (WSL)

🕹️ Record a Console Session

Use the built-in recorder to capture an interactive shell session. By default, writes are flushed frequently for near-real-time updates; disable this with --no-flush.

# Start recording (type `exit` to finish)
innerboard record

# Useful options
# --dir PATH      Save session under a custom directory
# --name NAME     Filename (auto-generated if omitted)
# --shell PATH    Shell to launch (defaults to $SHELL or /bin/bash)
# --flush/--no-flush  Control write flushing (default: --flush)

Session artifacts (raw log, timing, segments, SRE JSON) are saved under your app data directory, e.g. ~/.local/share/InnerBoard/sessions/ on Linux/WSL.


🗣️ Generate Meeting Prep

Aggregate all recorded sessions and produce crisp talking points.

# Generate concise meeting prep
innerboard prep

# Include detailed SRE insights (verbose)
innerboard prep --show-sre

✍️ Optional: Save a Private Note

Initialize Vault and Add a Note

# One-time: create your encrypted vault for notes (password prompt)
innerboard init

# Optional: avoid prompts by exporting your password
export INNERBOARD_KEY_PASSWORD="your_secure_password_here"
# Save a short private note alongside your sessions to remind your future self
innerboard add "Investigated auth token validation; planning staging tests next."

🔧 Suggested Workflow

  1. Record meaningful work sessions: innerboard record
  2. Optionally jot private notes to remind yourself: innerboard add "Short note"
  3. Generate prep before standup/1:1: innerboard prep --show-sre

🎯 Next Steps

Before Standup / 1:1

  • Run innerboard prep for concise talking points
  • Use --show-sre to include detailed context when needed

Weekly Review

  • Skim SRE session summaries to spot patterns
  • Capture follow-ups as private notes with innerboard add "..."

📋 Quick Reference

Installation & Setup

| Command | Description | Example |
| --- | --- | --- |
| `curl -fsSL https://raw.githubusercontent.com/ramper-labs/InnerBoard-local/main/quickstart.sh \| bash` | One-command installation | Quick start script |
| `innerboard setup` | Interactive setup wizard | `innerboard setup --docker` |
| `innerboard health` | Comprehensive health check | `innerboard health --detailed` |
| `./docker-setup.sh` | Docker deployment setup | Automated Docker setup |

Core Commands

| Command | Description | Example |
| --- | --- | --- |
| `innerboard init` | Initialize encrypted vault | `innerboard init` |
| `innerboard add "text"` | Add private reflection | `innerboard add "Struggling with auth tokens"` |
| `innerboard list` | View saved reflections | `innerboard list --limit 10` |
| `innerboard delete <id>` | Delete specific reflection | `innerboard delete 5 --force` |
| `innerboard del <id>` | Alias for delete | `innerboard del 5 --force` |
| `innerboard clear` | Delete ALL reflections | `innerboard clear --force` |
| `innerboard status` | Vault status and stats | `innerboard status` |

Session Recording & Analysis

| Command | Description | Example |
| --- | --- | --- |
| `innerboard record` | Record terminal session | `innerboard record --name standup` |
| `innerboard prep` | Generate meeting prep | `innerboard prep --show-sre` |
| `innerboard models` | List available AI models | `innerboard models` |

Help Commands

| Command | Description | Example |
| --- | --- | --- |
| `innerboard --help` | Show all commands | `innerboard add --help` |

🚨 Important Security Notes

  • Your data stays local: No information leaves your device
  • Encryption: All reflections are encrypted at rest
  • Network isolation: Only local Ollama connections allowed during processing
  • Password protection: Your master key is password-protected
  • Secure deletion: Sensitive files are overwritten before deletion

Configuration File

# Create your config file
cp env.example .env

Available Models

# See what models are available
innerboard models

# Expected output:
#    Available   
#  Ollama Models
# ┏━━━━━━━━━━━━━┓
# ┃ Model Name  ┃
# ┡━━━━━━━━━━━━━┩
# │ gpt-oss:20b │
# └─────────────┘
#
# Current model: gpt-oss:20b

🛠️ Troubleshooting Guide

Common Issues

Setup Issues

"externally-managed-environment" error:

# The quickstart script automatically handles this by creating a virtual environment
# If you need to do it manually:
python3 -m venv innerboard_venv
source innerboard_venv/bin/activate
pip install -e .

Quick start script fails:

# Try manual installation
git clone https://github.com/ramper-labs/InnerBoard-local.git
cd InnerBoard-local
python3 -m venv .venv && source .venv/bin/activate
pip install -e .
innerboard setup

Docker setup fails:

# Check Docker status
docker --version
docker-compose --version

# Clean and retry
docker system prune -a
./docker-setup.sh

Health Check Failures

Run detailed health check:

innerboard health --detailed

Common health check issues:

  • ❌ Python Environment: Upgrade to Python 3.8+
  • ❌ Ollama Service: Run ollama serve
  • ❌ AI Model: Run ollama pull gpt-oss:20b
  • ❌ Vault System: Run innerboard init

"Model not available" Error

# Solution: Pull the model
ollama pull gpt-oss:20b

# Check if Ollama is running
ollama list

"Password required" Error

# Set environment variable
export INNERBOARD_KEY_PASSWORD="your_password"

# Or use the --password flag
innerboard status --password your_password

"No such option: --password" Error

# Some commands don't support --password flag
# Use environment variable instead:
INNERBOARD_KEY_PASSWORD="your_password" innerboard list

Permission Errors

# Ensure write permissions in current directory
chmod 755 .

Ollama Not Starting

# Restart Ollama service
# macOS:
brew services restart ollama

# Linux:
sudo systemctl restart ollama

# Or start manually:
ollama serve

Getting Help

# General help
innerboard --help

# Command-specific help
innerboard add --help
innerboard list --help

Backup your vault:

# Your encrypted data is in:
# - vault.db (encrypted reflections)
# - vault.key (encrypted master key)
cp vault.db vault.db.backup
cp vault.key vault.key.backup
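A timestamped variant of the backup commands above, as an illustrative helper. The file names follow this README; the function itself is not part of the CLI:

```python
import shutil
import time
from pathlib import Path

def backup_vault(src_dir: Path, dest_dir: Path) -> list:
    # Copies vault.db and vault.key with a timestamp suffix so repeated
    # backups never overwrite each other.
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for name in ("vault.db", "vault.key"):
        src = src_dir / name
        if src.exists():
            dst = dest_dir / f"{name}.{stamp}.backup"
            shutil.copy2(src, dst)   # copy2 preserves timestamps/permissions
            copied.append(dst)
    return copied
```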

Test Coverage Areas

| Component | Tests | Status |
| --- | --- | --- |
| Security | 16 tests | ✅ All passing |
| Caching | 17 tests | ✅ All passing |
| Integration | 11 tests | ✅ All passing |
| Network Safety | 6 tests | ✅ All passing |
| Total | 50 tests | ✅ All passing |

Manual Testing Verified

  • ✅ First-run setup and recording flow
  • ✅ AI analysis with real Ollama models
  • ✅ Security validation (blocks SQL injection, XSS)
  • ✅ Configuration file loading
  • ✅ Error handling and recovery
  • ✅ Data persistence and integrity
  • ✅ Performance optimization
  • ✅ Multi-reflection management

📄 License

Licensed under the Apache License 2.0. See LICENSE for details.

🙏 Acknowledgments

  • Powered by Ollama for local model serving
  • Inspired by the need for private, offline meeting preparation
  • Built with Click for CLI, Rich for UI, Cryptography for security

📊 Example Output Formats

Input Terminal Log:

~/project $ kubectl get pods -n production
NAME                     READY   STATUS    RESTARTS   AGE
auth-service-7d4b8f9c6-x2p9q   1/1     Running   0          2d
~/project $ kubectl describe pod auth-service-7d4b8f9c6-x2p9q -n production
# ... detailed pod information ...
~/project $ git log --oneline -5
a1b2c3d Fix authentication token validation
e4f5g6h Update user service endpoints
# ... more git history ...

SRE Session Output:

[
  {
    "summary": "Investigated authentication service in production, reviewed recent commits for token validation fixes.",
    "key_successes": [
      {
        "desc": "Successfully accessed production Kubernetes cluster",
        "specifics": "kubectl get pods -n production",
        "adjacent_context": "Authentication service pod running stable for 2 days"
      },
      {
        "desc": "Identified recent authentication fixes in git history",
        "specifics": "git log --oneline -5",
        "adjacent_context": "Found commit a1b2c3d fixing token validation"
      }
    ],
    "blockers": [],
    "resources": [
      "kubectl describe pod auth-service-7d4b8f9c6-x2p9q -n production",
      "Git commit a1b2c3d - authentication token validation fix"
    ]
  }
]

MAC Meeting Prep Output:

{
  "team_update": [
    "✅ Successfully accessed production K8s cluster and verified auth service stability",
    "📊 Reviewed recent authentication fixes, found token validation improvements",
    "🎯 Next focus: testing token validation changes in staging environment"
  ],
  "manager_update": [
    "Team has good access to production systems and can debug effectively",
    "Recent authentication fixes appear stable with 2+ days uptime",
    "No current blockers, investigation skills developing well"
  ],
  "recommendations": [
    "Test the authentication token validation fix in staging environment",
    "Document the debugging process for future authentication issues",
    "Schedule follow-up review of authentication service architecture"
  ]
}
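To turn that JSON into printable talking points, a consumer script could do something like the following. The rendering helper is illustrative, not a documented InnerBoard API:

```python
import json

def render_prep(prep: dict) -> str:
    # Walk the three known sections in a fixed order, skipping empty ones.
    sections = [("Team update", "team_update"),
                ("Manager update", "manager_update"),
                ("Recommendations", "recommendations")]
    lines = []
    for title, key in sections:
        items = prep.get(key, [])
        if items:
            lines.append(f"{title}:")
            lines.extend(f"  - {item}" for item in items)
    return "\n".join(lines)

prep = json.loads('{"team_update": ["Verified auth service stability"], '
                  '"recommendations": ["Test fix in staging"]}')
print(render_prep(prep))
```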

Your offline meeting prep companion that stays on your device. ✨
