Releases: varadanvk/llm-cli

v0.0.7

10 Jul 06:14

v0.0.7 Update

New: OpenRouter Support

  • Added OpenRouter API support, with access to models such as Gemini 2.5 Pro, Grok-3, and Grok-4
  • Select "custom" to use any OpenRouter model

Easier Setup

  • Load API keys from .env file automatically
  • Better error messages and setup feedback

Install/Update

pip install --upgrade llm-chat-cli
lmci setup

Add these to your .env file:

OPENROUTER_API_KEY=your_key
GROQ_API_KEY=your_key
OPENAI_API_KEY=your_key
ANTHROPIC_API_KEY=your_key
CEREBRAS_API_KEY=your_key
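The automatic .env loading can be pictured with a minimal stdlib-only sketch. This is not the package's actual loader (it may well use python-dotenv), and `load_env` is a hypothetical name; it just shows the general behavior: read KEY=value lines and let already-set environment variables take precedence.

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: KEY=value lines, '#' comments, existing vars win."""
    if not os.path.exists(path):
        return
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault keeps any value already exported in the shell.
            os.environ.setdefault(key.strip(), value.strip().strip("'\""))
```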

v0.0.6

07 Jul 05:51

What's Changed

🐛 Bug Fixes

  • Fixed markdown rendering for code blocks
  • Code blocks now properly accumulate content instead of rendering empty blocks
  • Added support for inline code highlighting with backticks
  • Improved buffering logic for multi-line markdown content (tables, lists, etc.)

🔧 Technical Details

  • Simplified the markdown rendering function
  • Fixed state machine logic that was causing code to appear outside markdown blocks
  • Better handling of streaming chunks across code block boundaries

This release fixes the issue where code blocks would render incorrectly during streaming, especially when chunks were split across code block markers.
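The accumulation technique described above can be sketched as follows. This is illustrative, not the actual llm-cli code, and `CodeBlockAccumulator` is a hypothetical name: the key idea is to buffer incoming chunks and only toggle in or out of a code block once a complete ``` fence has arrived, so a marker split across two chunks never produces an empty or misplaced block.

```python
class CodeBlockAccumulator:
    """Buffers streamed text and emits (kind, text) segments,
    toggling between 'text' and 'code' at complete ``` fences."""

    FENCE = "```"

    def __init__(self):
        self.buffer = ""
        self.in_code = False
        self.segments = []

    def feed(self, chunk):
        self.buffer += chunk
        # A partial marker ("`" or "``") at the buffer's end simply isn't
        # found here, so it stays buffered until the rest arrives.
        while (idx := self.buffer.find(self.FENCE)) != -1:
            if idx:
                self.segments.append(
                    ("code" if self.in_code else "text", self.buffer[:idx])
                )
            self.buffer = self.buffer[idx + len(self.FENCE):]
            self.in_code = not self.in_code

    def close(self):
        """Flush whatever remains once the stream ends."""
        if self.buffer:
            self.segments.append(
                ("code" if self.in_code else "text", self.buffer)
            )
            self.buffer = ""
        return self.segments
```

Feeding chunks that split a fence (e.g. ``"`"`` at the end of one chunk and ``"``"`` at the start of the next) still yields one complete code segment, which is exactly the failure mode this release fixes.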

v0.0.5

19 Mar 18:55

Streaming support (for real this time)

v0.0.4

10 Jan 16:49

Added streaming support to all APIs

v0.0.3

20 Nov 04:15

Pre-release version #3. Please work this time

v0.0.2

20 Nov 04:12

A new published version of the package

v0.0.1

20 Nov 04:06

v0.0.1