Create immersive fiction with an AI that remembers.
Novelist is an agentic writing system designed for long-form fiction. Unlike chatty assistants that forget details after a few pages, Novelist uses a dedicated Database Memory and a Parallel Critic Tribunal to maintain consistency, tone, and plot threads across tens of thousands of words.
- SQLite Backend: Migrated from fragile JSON files to a robust SQL database (`story.db`).
- Entity Tracking: Automatically updates the "World State" (Time, Location, Inventory) and "Character Bible" (Relationships, Hidden Agendas).
- Arc Ledger: Tracks unresolved plot threads, promises to the reader, and thematic resonance.
- "Best of 3" Generation: The agent drafts three variations of every scene simultaneously.
- Agentic Tribunal: Three distinct Critic Agents (Prose, Redundancy, Arc) vote on the best draft.
- Self-Correction: The system automatically fixes common AI tics (purple prose, repetition) before you ever see the text.
- Real-Time Monitoring: Watch your story grow with a Streamlit-based dashboard.
- Structure Visualization: Track word counts, arc progression, and character status visually.
- Project Management: Create and switch between multiple story projects seamlessly.
- Log Control: Real-time agent logs and control directly from the browser.
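The Database Memory is easiest to picture as a handful of relational tables. The sketch below is illustrative only; the actual schema inside `story.db` may differ:

```python
import sqlite3

# Hypothetical sketch of the Database Memory layout. The real schema in
# story.db may differ; this only illustrates what gets tracked.
def init_memory(path=":memory:"):
    db = sqlite3.connect(path)
    db.executescript("""
        CREATE TABLE world_state (
            scene_id INTEGER PRIMARY KEY,
            story_time TEXT,      -- e.g. 'Day 3, dusk'
            location TEXT,
            inventory TEXT        -- JSON list of items the POV character holds
        );
        CREATE TABLE character_bible (
            name TEXT PRIMARY KEY,
            relationships TEXT,   -- JSON map of name -> relationship
            hidden_agenda TEXT
        );
        CREATE TABLE arc_ledger (
            thread TEXT,          -- unresolved plot thread / promise to reader
            status TEXT DEFAULT 'open'
        );
    """)
    return db

db = init_memory()
db.execute("INSERT INTO arc_ledger (thread) VALUES ('Who sent the letter?')")
open_threads = [row[0] for row in db.execute(
    "SELECT thread FROM arc_ledger WHERE status = 'open'")]
```

Keeping threads in a table (rather than in the prompt alone) is what lets the Arc critic query for unresolved promises tens of thousands of words later.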
*The agent reasoning in real time.*
Option A: Cloud Intelligence (Recommended for most users)
- Machine: Any standard laptop/desktop (Windows, Mac, Linux).
- RAM: 8GB+
- Backend: Uses APIs (OpenAI, DeepSeek, Anthropic, etc.). Fast and lightweight.
Option B: Local Intelligence (For high-end workstations)
- Machine: High-performance PC or Mac (M-series).
- RAM: 32GB+ System RAM.
- GPU: NVIDIA RTX 3060 (12GB VRAM) or better.
- Backend: Runs Ollama locally. Total privacy, but requires heavy hardware.
- Dev Note: This system was developed on a high-end rig with 64GB RAM and RTX 4090 to support local 32B parameter models.
If you run the agent from WSL (Windows Subsystem for Linux) but want to use Windows Ollama with CUDA/GPU acceleration, you must configure it correctly:
- Run Ollama on Windows only. Do NOT install or run `ollama serve` inside WSL; that will load models onto your CPU instead of your GPU.
- Start Ollama with all-interfaces binding:

  ```powershell
  # In Windows PowerShell (Admin)
  $env:OLLAMA_HOST="0.0.0.0:11434"; ollama serve
  ```

- Add a Windows Firewall rule (one-time):

  ```powershell
  netsh advfirewall firewall add rule name="Ollama WSL Access" dir=in action=allow protocol=TCP localport=11434
  ```

- Find your Windows host IP from WSL:

  ```bash
  ip route | grep default | awk '{print $3}'
  ```

- Update your `.env` file with the Windows IP:

  ```
  OLLAMA_BASE_URL=http://<YOUR_WINDOWS_IP>:11434
  ```

Why? WSL2 runs in a virtual network: `localhost` inside WSL does not automatically reach Windows. You must use the Windows host IP to ensure requests hit the GPU-enabled Windows Ollama instance.
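The `ip route` lookup above can also be done programmatically. The helper below is a hypothetical sketch (not part of the project) that extracts the Windows host IP the same way the awk one-liner does:

```python
import re
import subprocess

def windows_host_ip(route_output=None):
    """Extract the Windows host IP (WSL2 default gateway) from `ip route`.

    Hypothetical helper, equivalent to `ip route | grep default | awk '{print $3}'`.
    Pass route_output for testing; otherwise it shells out to `ip route`.
    """
    if route_output is None:
        route_output = subprocess.run(
            ["ip", "route"], capture_output=True, text=True).stdout
    match = re.search(r"^default via (\S+)", route_output, re.MULTILINE)
    return match.group(1) if match else None

# Example: build the OLLAMA_BASE_URL value for the .env file.
sample = "default via 172.29.16.1 dev eth0\n172.29.16.0/20 dev eth0 scope link"
base_url = f"http://{windows_host_ip(sample)}:11434"
```

Note that the gateway IP can change between WSL restarts, so re-run the lookup if connections to Ollama suddenly fail.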
- Python 3.10+
- LLM backend (choose one):
  - Local (Ollama): Free, private. Requires Ollama installed on Windows (for CUDA support).
  - Cloud (BYOK): Fast, powerful. Requires an API key (OpenAI, Groq, Together, etc.).
- Clone the Repository

  ```bash
  git clone https://github.com/AxolDad/novelist.git
  cd novelist
  ```

- Setup Environment (BYOK): Create a `.env` file in the root directory. Choose your path:

  Option A: Local Power (Ollama)

  ```
  LLM_PROVIDER=ollama
  OLLAMA_HOST=http://localhost:11434
  WRITER_MODEL=mistral
  CRITIC_MODEL=mistral
  ```

  Option B: Cloud Speed (OpenAI / Compatible)

  ```
  LLM_PROVIDER=openai
  OPENAI_API_KEY=sk-your-key-here
  WRITER_MODEL=gpt-4o
  CRITIC_MODEL=gpt-4o-mini
  # Optional: Custom Base URL for Groq/Together
  # OPENAI_BASE_URL=https://api.groq.com/openai/v1
  ```

- Install Dependencies

  ```
  pip install -r requirements.txt
  ```
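For illustration, here is how the `.env` keys above could be read with no extra dependencies. The project itself may use `python-dotenv` or similar, so treat this as a sketch of the format, not the actual loader:

```python
# Minimal .env parser: KEY=VALUE lines, blanks and '#' comments ignored.
# Illustrative only; not the project's real configuration code.
def parse_env(text):
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and commented-out optional keys
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings

env = parse_env("""
LLM_PROVIDER=ollama
OLLAMA_HOST=http://localhost:11434
WRITER_MODEL=mistral
# OPENAI_BASE_URL=https://api.groq.com/openai/v1
CRITIC_MODEL=mistral
""")
```

Because WRITER_MODEL and CRITIC_MODEL are separate keys, you can pair a strong writer with a cheaper critic (as in the gpt-4o / gpt-4o-mini example).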
Double-click `start.bat` on Windows to launch both the Dashboard and the Agent.
Start the Dashboard:

```
streamlit run dashboard.py
```

Go to the Home Tab to create a new Story Project.
💡 Note: "Create Project" vs. "Story Title"
- Create Project (Home Tab): Creates a physical folder on your disk. Do this once to start a book. The name you choose here is the folder name.
- Story Title (Story Setup Tab): Changes the display title of your book in `story_manifest.json`. Use this to rename your book creatively without breaking file paths.
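To see what the rename amounts to, here is a hypothetical helper that edits `story_manifest.json` in place. The `title` key name is an assumption, so check your manifest for the actual field:

```python
import json
import tempfile
from pathlib import Path

def rename_story(project_dir, new_title):
    """Update the display title in story_manifest.json without touching
    the project folder name. The 'title' key is an assumption; the real
    manifest may use a different field name."""
    manifest_path = Path(project_dir) / "story_manifest.json"
    manifest = json.loads(manifest_path.read_text(encoding="utf-8"))
    manifest["title"] = new_title
    manifest_path.write_text(json.dumps(manifest, indent=2), encoding="utf-8")
    return manifest["title"]

# Demo against a throwaway project folder, not a real one.
with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / "story_manifest.json").write_text('{"title": "Draft One"}')
    new_title = rename_story(tmp, "The Letter at Dusk")
```

The folder name stays fixed, which is why `--project "projects/..."` paths keep working after a rename.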
Start the Agent:

```
python novelist.py --project "projects/my_story_title"
```

The agent will begin drafting scenes based on your Manifest.
```mermaid
graph TD
    A[User Manifest] -->|Config| B(Novelist Agent)
    B -->|Generates 3x| C{Parallel Drafts}
    C -->|Review| D[Tribunal Critics]
    D -->|Vote & Select| E[Best Draft]
    E -->|Update| F[(SQLite Story DB)]
    F -->|Persist| G[Manuscript.md]
    F -->|Visualise| H[Streamlit Dashboard]
```
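The Tribunal step in the diagram can be sketched as a simple scoring vote: each critic scores every parallel draft, and the draft with the highest combined score wins. The critics below are toy stand-ins (the real ones are LLM agents judging prose, redundancy, and arc continuity):

```python
# Toy sketch of the "Best of 3" selection step. Critic logic is
# illustrative only; real critics would be LLM calls.
def tribunal_select(drafts, critics):
    scores = [0.0] * len(drafts)
    for critic in critics:
        for i, draft in enumerate(drafts):
            scores[i] += critic(draft)
    return scores.index(max(scores))  # index of the winning draft

def prose_critic(d):
    return -d.count("very")  # penalise a common purple-prose tic

def redundancy_critic(d):
    words = d.split()
    return len(set(words)) / max(len(words), 1)  # reward varied vocabulary

def arc_critic(d):
    return 1.0 if "letter" in d else 0.0  # reward touching an open thread

drafts = [
    "She very very slowly opened the very old letter.",
    "She broke the seal on the letter, hands steady.",
    "The room was quiet and nothing happened.",
]
winner = tribunal_select(drafts, [prose_critic, redundancy_critic, arc_critic])
```

Summing scores rather than taking a strict majority lets one strongly negative critic (e.g. Redundancy) veto a draft the others were lukewarm about.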
Contributions are welcome! Since we are in Alpha, please open an issue before submitting a PR for major architectural changes.
MIT License. Build something beautiful.

