This was very badly designed. From startup to looming took several keypresses, and everything was hidden behind options menus. I have made a new, better one, but it's not released. I'll probably incorporate the learnings from these two interfaces into tinyloom. Long story short, the terminal loom interface in general gets unwieldy with larger trees, and you really want a network visualization at some point. Gotta head back to the browser with your tail between your legs.
Command Line Loom is a Python command-line tool for generating text using OpenAI's GPT-3 and other models. It includes a modular model system that allows for easy integration of new models and customization of existing ones.
Templates are included; see the Turbo Text Transformer Prompts repository for more documentation and a list of the available templates.
To install Command Line Loom, you can use pip:
pip install command-line-loom

or clone the repository and install it manually:
git clone https://github.com/fergusfettes/command-line-loom.git
cd command-line-loom
poetry install

The interface should be self-documenting: just type 'help'.
Some node manipulation: vid
Watch this for more advanced usage: vid.
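For context on what node manipulation means here: a loom keeps generated continuations in a tree, where each node stores a text fragment and the full text at a node is the concatenation of fragments along the path from the root. A minimal illustrative sketch of that structure (not cll's actual implementation):

```python
# Illustrative sketch of the tree a loom maintains: each node holds a
# text fragment, and the text shown at a node is the concatenation of
# fragments from the root down to that node.
class Node:
    def __init__(self, text, parent=None):
        self.text = text
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def full_text(self):
        """Walk up to the root, then join fragments in root-to-leaf order."""
        parts = []
        node = self
        while node is not None:
            parts.append(node.text)
            node = node.parent
        return "".join(reversed(parts))

root = Node("Once upon a time")
a = Node(", a fox", root)   # one branch of the tree
b = Node(", a crow", root)  # a sibling branch
```

Here `a.full_text()` yields "Once upon a time, a fox", and the sibling branches under `root` are the alternative continuations you switch between in the interface.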
Before using Command Line Loom, you need to set up a configuration file. This should happen automatically when you run the cll command for the first time.
This will create a configuration file in your home directory. See the documentation for each model to learn how to obtain an API key.
api_key: sk-<your api key here>
engine_params:
  frequency_penalty: 0
  logprobs: null
  max_tokens: 1000
  model: davinci
  n: 4
  presence_penalty: 0
  stop: null
  temperature: 0.9
  top_p: 1
models:
- babbage
- davinci
- gpt-3.5-turbo-0301
- text-davinci-003
etc.

If you find a bug or would like to contribute to Command Line Loom, please create a new GitHub issue or pull request.
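The names under engine_params in the configuration mirror the keyword arguments of an OpenAI completion request. As a rough sketch of how config defaults might be merged with per-call overrides (a hypothetical helper for illustration, not cll's actual code):

```python
# Hypothetical sketch: merge the engine_params defaults from the config
# with per-call overrides to build request keyword arguments. The field
# names mirror the YAML config above; this is not cll's actual code.
DEFAULTS = {
    "frequency_penalty": 0,
    "logprobs": None,
    "max_tokens": 1000,
    "model": "davinci",
    "n": 4,
    "presence_penalty": 0,
    "stop": None,
    "temperature": 0.9,
    "top_p": 1,
}

def build_request(prompt, overrides=None):
    """Overrides win over defaults; the prompt is attached last."""
    params = {**DEFAULTS, **(overrides or {})}
    params["prompt"] = prompt
    return params

request = build_request("Once upon a time", {"temperature": 0.7, "n": 1})
```

With n: 4 in the defaults, each generation step requests four candidate continuations, which is what populates the sibling branches of the tree.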
Inspired by Loom.
Command Line Loom is released under the MIT License. See LICENSE for more information.