[Feature Request]: Add support for local LLM in CLI #51

@GregorBiswanger

Description

I would like to request an enhancement to the vectra CLI: the option to use local Large Language Models (LLMs) instead of relying solely on OpenAI's API.

Feature Details:

  1. Local LLM Integration:

    • Add support for integrating local LLMs, such as those served by LM Studio, which exposes an OpenAI-compatible API.
    • The integration should be flexible enough to accommodate other LLM providers as well.
  2. Global Configuration:

    • Implement a global configuration option that allows users to specify their preferred LLM provider.
    • The configuration should include necessary parameters for the chosen LLM, such as API keys or local model paths.
  3. CLI Commands:

    • Ensure that CLI commands can easily switch between different LLM providers based on the global configuration.
    • Provide clear documentation and examples on how to set up and use local LLMs within the vectra CLI.
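To illustrate the idea, here is a minimal sketch of what the global configuration and provider switching could look like. All names here (`LlmConfig`, `resolveClientOptions`) are hypothetical, not part of vectra today; the only assumptions baked in are that LM Studio's local server speaks the OpenAI wire protocol (its default endpoint is `http://localhost:1234/v1`) and that the official OpenAI SDK accepts a `baseURL` override in its constructor.

```typescript
// Hypothetical shape of a global CLI config (e.g. ~/.vectra/config.json)
// that selects an LLM provider. Names are illustrative, not vectra's API.
interface LlmConfig {
  provider: "openai" | "local";
  apiKey?: string;   // required for OpenAI's hosted API
  baseURL?: string;  // required (or defaulted) for a local server
  model: string;
}

// Resolve the options a CLI command would pass to the OpenAI SDK,
// e.g. `new OpenAI({ apiKey, baseURL })`. Because the SDK honors a
// baseURL override, the same code path can target OpenAI or any
// OpenAI-compatible local server such as LM Studio.
function resolveClientOptions(
  config: LlmConfig
): { apiKey: string; baseURL?: string } {
  if (config.provider === "local") {
    return {
      // Local servers typically ignore the key, but the SDK requires one.
      apiKey: config.apiKey ?? "not-needed",
      // LM Studio's default local endpoint (assumption: default port 1234).
      baseURL: config.baseURL ?? "http://localhost:1234/v1",
    };
  }
  if (!config.apiKey) {
    throw new Error("An API key is required when provider is 'openai'");
  }
  return { apiKey: config.apiKey };
}

// Example: point the CLI at a local LM Studio instance.
const local = resolveClientOptions({ provider: "local", model: "llama-3" });
console.log(local.baseURL); // "http://localhost:1234/v1"
```

A command could then read this config once at startup and construct a single client, so every subcommand works unchanged regardless of which provider is selected.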

Benefits:

  • This feature will provide users with more flexibility and control over their LLM integrations.
  • It will allow users to utilize local models, which can be beneficial for privacy, cost, and offline usage scenarios.

Thank you for considering this feature request. I believe it will greatly enhance the functionality and usability of the vectra CLI.

Cheers,
Gregor
