Roc package for building with LLMs


This package is still a work in progress 🛠️, so the interface may change. With that said, the package currently supports:

NOTE: roc-ai is currently undergoing a major refactoring. When complete, it should include better support for Anthropic (including function calling), improved encoding performance, and a structure better suited to long-term maintenance and support. The goal is to keep changes to the public-facing interface minimal, but some interface changes may be inbound as well.

  • 🚀 NEW! Support for many APIs (see the configuration sketch after this list):
    • Anthropic
    • OpenAI
    • OpenAI-compatible, with a custom URL
      • (Includes local providers, such as Ollama or LM Studio)
    • OpenRouter, with support for hundreds of models and many OpenRouter-exclusive features.
  • Creating and parsing ChatML-style requests and responses.
  • Creating and parsing raw prompt-style requests and responses.
  • Formatting prompt strings with [INST], <<SYS>>, and <s> tags for models with Llama-style fine-tuning.
  • Most common LLM parameters, such as temperature, top_p, top_k, repetition_penalty, etc.
  • OpenRouter-specific features like fallback models and provider preferences.
  • LLM tool use - this enables the AI model to call Roc functions and use the results in its answers.
    • Includes a collection of prebuilt tools, or you can build your own.
  • Prompt caching on supported models.
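
As a rough illustration of the multi-API support above, the sketch below constructs clients for several providers. Only the OpenAI variant mirrors the example later in this README; the Anthropic, OpenRouter, and custom-URL option names are assumptions rather than confirmed parts of the interface, so consult the package documentation for the exact tags and fields.

# A minimal sketch: the Anthropic and OpenRouter tags, the OpenAICompliant
# tag, and its url field are assumptions based on the feature list above.
openai_client = Chat.new_client({ api: OpenAI, api_key, model: "gpt-4o" })
anthropic_client = Chat.new_client({ api: Anthropic, api_key, model: "claude-3-5-sonnet-latest" })
openrouter_client = Chat.new_client({ api: OpenRouter, api_key, model: "anthropic/claude-3.5-sonnet" })
# A local OpenAI-compatible server (such as Ollama) reached through a custom URL:
local_client = Chat.new_client({ api: OpenAICompliant({ url: "http://localhost:11434/v1/chat/completions" }), api_key: "", model: "llama3.1" })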

Known issues:

  • Tool use is currently not supported with the Anthropic API.
    • This is due to missing support in Roc for decoding JSON dictionaries (roc#5294).
    • Workaround: Anthropic models can be accessed through OpenRouter, with full tool-calling support.
  • Prompt caching has currently only been tested through OpenRouter.

Example

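The minimal program below creates an OpenAI client, appends a user message, sends the request, and prints the model's reply. The app header and imports are omitted for brevity; see the examples folder for complete, runnable programs.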
main! = |_|
    # Read the API key from the environment
    api_key = Env.var!("OPENAI_API_KEY")?
    # Create a client for the OpenAI API and append a user message
    client =
        Chat.new_client({ api: OpenAI, api_key, model: "gpt-4o" })
        |> Chat.append_user_message("Hello, computer!", {})
    # Send the request and fold the model's reply back into the message list
    response = Http.send!(Chat.build_http_request(client, {}))?
    messages = Chat.update_messages(client, response)? |> .messages
    # Print the latest message (the model's reply)
    when List.last(messages) is
        Ok(message) -> Stdout.line!(message.content)
        _ -> Ok({})
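
To run a program like this, set the OPENAI_API_KEY environment variable to a valid key first, then build and run it with the roc CLI (for example, roc main.roc, where the file name is only illustrative).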

For complete example apps, including a full chatbot app with tool use, see the examples folder.
