AIs Discussion Platform

A web application that lets two AI models hold autonomous conversations while you keep full control over how each one is configured.

Screenshots

Desktop Interface

Desktop Light Mode / Desktop Dark Mode

Mobile Interface

Mobile Light Mode / Mobile Dark Mode

What Is This?

An interactive platform that orchestrates conversations between two AI models powered by Ollama. Watch AI models discuss, debate, and explore topics autonomously while you maintain full control over their behavior, personality, and constraints through customizable system prompts and model parameters.

Key Features

  • Model Flexibility: Choose any two Ollama models to converse
  • Custom System Prompts: Define personality, behavior, and constraints for each AI
  • Advanced Configuration: Adjust temperature, context length, and other model parameters
  • Real-time Monitoring: Watch conversations unfold live in the web interface
  • Conversation Management: Start, pause, resume, or stop discussions at any time
  • History & Archives: Save, load, and review past conversations
  • Search Functionality: Find specific messages across conversation history
  • Dark Mode: Eye-friendly interface for extended sessions
  • Progressive Web App: Install as a native app on desktop or mobile devices
  • Dual Interface: Web GUI or command-line execution

How It Works

  1. Select Models: Choose two AI models from your Ollama installation
  2. Configure Behavior: Set system prompts and parameters for each model
  3. Initiate Conversation: Start the discussion with a custom opening message
  4. Real-time Exchange: Models take turns responding to each other
  5. Monitor & Control: Pause, resume, or stop the conversation anytime
  6. Archive & Review: Save conversations with full metadata for later analysis

The platform runs all inference locally through Ollama, so conversations stay on your machine and there are no external API dependencies.
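
For context, a minimal sketch of what such a turn-taking loop looks like with the Ollama Python SDK is shown below. This is not the project's actual code: the model names, system prompts, opening message, and the options dict (temperature and context length) are placeholders.

# Minimal sketch of a two-model turn-taking loop using the Ollama Python SDK.
# Model names, prompts, and options below are placeholders, not project code.
import ollama

MODEL_A, MODEL_B = "llama3.1", "mistral"
SYSTEM_A = "You are a curious scientist who asks probing questions."
SYSTEM_B = "You are a skeptical philosopher who challenges assumptions."
OPTIONS = {"temperature": 0.8, "num_ctx": 4096}  # per-model parameters

def view_of(transcript, me, system_prompt):
    """Render the shared transcript from one model's point of view."""
    messages = [{"role": "system", "content": system_prompt}]
    for speaker, text in transcript:
        role = "assistant" if speaker == me else "user"
        messages.append({"role": role, "content": text})
    return messages

transcript = [("opener", "Is creativity computable?")]  # custom opening message
for turn in range(6):  # three exchanges per model
    me, system_prompt = (MODEL_A, SYSTEM_A) if turn % 2 == 0 else (MODEL_B, SYSTEM_B)
    response = ollama.chat(
        model=me,
        messages=view_of(transcript, me, system_prompt),
        options=OPTIONS,
    )
    text = response["message"]["content"]
    print(f"[{me}] {text}\n")
    transcript.append((me, text))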

Installation

Prerequisites

  • Python 3.13+
  • Ollama installed with at least two models
  • Poetry (recommended) or pip
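
To confirm that Ollama is running and at least two models are pulled, you can query it from Python (assuming a recent version of the ollama package; running ollama list on the command line gives the same information):

# Quick check that the Ollama server is reachable and has two or more models.
import ollama

models = ollama.list().models  # raises if the Ollama server is not running
for m in models:
    print("-", m.model)
if len(models) < 2:
    print("Pull another model first, e.g.: ollama pull mistral")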

Setup

# Clone the repository
git clone https://github.com/JustMrNone/Dual_Artificial_Intelligence_Inquiry.git
cd Dual_Artificial_Intelligence_Inquiry

# Install GUI dependencies
cd GUI
poetry install

# Install Conversation module dependencies
cd ../Conversation
poetry install

Usage

Web Interface (Recommended)

cd GUI
poetry run python app.py

Open http://localhost:5000 in your browser.

To generate PWA icons:

python generate_icons.py

Command Line

cd Conversation
poetry run python start.py

With custom models:

poetry run python start.py llama3.1 mistral
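
start.py takes the two model names as positional arguments. A hypothetical sketch of that kind of entry point follows; the repository's actual script and its defaults may be structured differently.

# Hypothetical argument handling for a start.py-style entry point; the real
# script may differ.
import argparse

parser = argparse.ArgumentParser(description="Run a discussion between two Ollama models.")
parser.add_argument("model_one", nargs="?", default="llama3.1", help="first model")
parser.add_argument("model_two", nargs="?", default="mistral", help="second model")
args = parser.parse_args()
print(f"Starting discussion between {args.model_one} and {args.model_two}")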

Technical Architecture

  • Backend: Flask REST API with JSON file storage
  • Frontend: Vanilla JavaScript with server-sent events for real-time updates (see the sketch after this list)
  • AI Integration: Ollama Python SDK for local model inference
  • PWA: Service Worker for app installation
  • Responsive Design: Mobile-first CSS with desktop optimization
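
As a rough illustration of the server-sent-events piece, a Flask streaming endpoint can look like the sketch below. This is an assumption about the approach, not the project's actual route; the in-memory queue stands in for wherever new turns are produced. On the client side, the browser's built-in EventSource API is enough to subscribe.

# Sketch of a server-sent-events endpoint in Flask; the route name and the
# in-memory queue are illustrative, not taken from the project.
import json
import queue

from flask import Flask, Response

app = Flask(__name__)
updates = queue.Queue()  # the conversation loop would push each new turn here

@app.route("/events")
def events():
    def stream():
        while True:
            turn = updates.get()  # blocks until a new message is available
            yield f"data: {json.dumps(turn)}\n\n"  # SSE frame: "data: ...\n\n"
    return Response(stream(), mimetype="text/event-stream")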

Project Structure

├── GUI/                    # Web application
│   ├── app.py             # Flask backend
│   ├── static/            # CSS, JS, PWA assets
│   └── templates/         # HTML templates
├── Conversation/          # Core conversation module
│   ├── Module/            # Discussion logic
│   ├── System_Prompts/    # Default prompts
│   └── History/           # Saved conversations
└── README.md              # This file

Configuration

Environment Variables

Create a .env file in the GUI folder:

SECRET_KEY=your-secret-key-here
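
One common way for a Flask app to pick this up is through python-dotenv; whether app.py does exactly this is an assumption:

# Sketch of reading SECRET_KEY from .env (assumes the python-dotenv package;
# the project may load its configuration differently).
import os

from dotenv import load_dotenv
from flask import Flask

load_dotenv()  # copies GUI/.env entries into os.environ
app = Flask(__name__)
app.config["SECRET_KEY"] = os.environ["SECRET_KEY"]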

System Prompts

Customize AI behavior by editing:

  • Conversation/System_Prompts/One/system_prompt_one.json
  • Conversation/System_Prompts/Two/system_prompt_two.json

Or edit them via the GUI.
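
The prompt files are plain JSON, so they can also be inspected or edited programmatically without assuming anything about their schema:

# Print a system prompt file and write it back after editing; the path is one
# of those listed above, run from the repository root.
import json
from pathlib import Path

path = Path("Conversation/System_Prompts/One/system_prompt_one.json")
prompt = json.loads(path.read_text(encoding="utf-8"))
print(json.dumps(prompt, indent=2))  # inspect whatever fields it defines

# ...modify the loaded structure, then save:
path.write_text(json.dumps(prompt, indent=2), encoding="utf-8")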

Use Cases

  • Research: Study AI interaction patterns and emergent behaviors
  • Education: Demonstrate AI capabilities and limitations
  • Content Generation: Brainstorm ideas through AI dialogue
  • Testing: Compare model capabilities and response styles
  • Entertainment: Create interesting conversations on any topic

License

MIT License. See the LICENSE file for details.

Acknowledgments

Built with Ollama for local AI inference, ensuring privacy and control over your conversations.


Note: Requires Ollama to be installed and running. Download models using ollama pull <model-name>.
