A lightweight, modern Ollama Flask server template, well suited to building RESTful APIs and web applications that send prompts to LLMs.
- Clean project structure
- Ready for API development
- Easy to extend and customize
- Ollama - an LLM server that must be installed locally
- This application uses the `gemma3:12b` model
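For reference, Ollama serves an HTTP API on port 11434. Below is a minimal, stdlib-only sketch of prompting the model above through Ollama's documented `/api/generate` endpoint; the helper function names are illustrative, not part of this template's code.

```python
# Minimal sketch of prompting Ollama's HTTP API directly (stdlib only).
# /api/generate and the "model"/"prompt"/"stream" fields come from
# Ollama's documented REST API; the helpers here are illustrative.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_payload(prompt: str, model: str = "gemma3:12b") -> dict:
    """Body for a single, non-streaming generation request."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str) -> str:
    """POST the prompt to the local Ollama server and return the text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A Flask route in this template can wrap a call like `generate(...)` to expose the model over its own API.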
- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/flask-template.git
  cd flask-template
  ```
- Install and run Ollama:
  - Download from https://ollama.com/download
  - After installation, pull the Gemma 3 12B model:

    ```bash
    ollama pull gemma3:12b
    ```
- Create a virtual environment and activate it:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Run the development server:

  ```bash
  python main.py
  ```

The server will start at http://localhost:5000.
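Once the server is up, a client can exercise its API. The sketch below is hypothetical: the `/generate` path and the `{"prompt": ...}` body are assumptions, so check the routes actually defined in `main.py` before using it.

```python
# Hypothetical client sketch: the "/generate" path and {"prompt": ...}
# request body are assumptions -- adjust to the routes in main.py.
import json
import urllib.request

SERVER = "http://localhost:5000"

def make_prompt_request(path: str, prompt: str) -> urllib.request.Request:
    """Build a JSON POST request for the running Flask server."""
    return urllib.request.Request(
        SERVER + path,
        data=json.dumps({"prompt": prompt}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = make_prompt_request("/generate", "Summarize Flask in one sentence.")
# urllib.request.urlopen(req) sends the request once the server is running.
```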
This project is licensed under the MIT License - see the LICENSE file for details.