This repository was archived by the owner on Nov 13, 2024. It is now read-only.

[Bug] I can't find out how to configure a new LLM #349

@augmentedstartups

Description


Is this a new bug?

  • I believe this is a new bug
  • I have searched the existing issues, and I could not find an existing issue for this bug

Current Behavior

Right now it's using the default LLM. I want to switch between GPT-4o and the mini models, and also test some open-source alternatives.

I need access to this.
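For anyone with the same question: Canopy's LLM is normally selected through the YAML config file passed to `canopy start`. The sketch below assumes the `chat_engine.llm` keys used by Canopy's bundled config templates; the model names are illustrative, so verify the keys against the docs for your Canopy version.

```yaml
# Hypothetical Canopy config sketch (keys follow the bundled config
# templates; check them against your installed Canopy version).
chat_engine:
  llm:
    type: OpenAILLM          # Canopy's OpenAI-backed LLM class
    params:
      model_name: gpt-4o     # swap to gpt-4o-mini or another model here
```

Then start the server with `canopy start --config config.yaml`. Open-source models served behind an OpenAI-compatible endpoint can in principle reuse the same class by pointing it at a custom API base URL, but that parameter name varies by version and should be confirmed in the Canopy documentation.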

Expected Behavior

N/A

Steps To Reproduce

N/A

Relevant log output

N/A

Environment

- **OS**:
- **Language version**:
- **Canopy version**:

Additional Context

N/A

Labels: bug (Something isn't working)