- Stage 1 - This is a dockerized pen.el that makes use of both GPT-3 and GPT-J.
  - Implementation [2/4]
    - [X] emacs-lisp functions are generated from YAML .prompt files
    - [X] Use built-in elisp functions for portability
    - [ ] Select backend interface in emacs
    - [ ] Convert the shell interface to OpenAI to Python
    - [ ] GPT-J interface
    - [ ] Dockerize Pen
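The YAML-to-function pipeline above could be sketched as follows. This is a minimal illustration, not Pen's actual implementation: the field names (`title`, `prompt`, `engine`) and the `$selection` placeholder are assumptions, and a real prompt definition would be parsed from a YAML .prompt file rather than written as a dict.

```python
from string import Template

# Hypothetical .prompt definition; in Pen these live in YAML files,
# and the field names here are assumptions for illustration.
PROMPT_DEF = {
    "title": "translate-to-french",
    "prompt": "Translate the following into French:\n$selection\n",
    "engine": "gpt-3",
}

def make_prompt_function(defn, complete):
    """Build a callable command from a prompt definition.

    `complete` is the backend completer (GPT-3, GPT-J, ...);
    injecting it keeps the sketch backend-agnostic, in the spirit
    of the 'select backend interface' item above.
    """
    template = Template(defn["prompt"])

    def fn(selection):
        # Fill the template with the user's selection, then hand
        # the rendered prompt to the language model.
        return complete(template.substitute(selection=selection))

    fn.__name__ = defn["title"].replace("-", "_")
    return fn

# A stub completer stands in for a real API call.
translate = make_prompt_function(PROMPT_DEF, lambda p: "[completion of] " + p)
```

In Pen itself the generated functions are emacs-lisp commands, but the shape is the same: one template plus one backend completer yields one interactive command per .prompt file.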
- Stage 2 - This is a collaborative pen.el with an exploration tool like loom (https://github.com/socketteer/loom)
  - [ ] Generations stored in Datomic
  - [ ] Prompts stored in Datomic
  - [ ] Prompt Catalogue
  - [ ] Make use of arbitrarily many git repositories that store prompts
    - Load a repository such as this: https://github.com/semiosis/prompts
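Aggregating prompts from many repositories could look like the sketch below. It assumes the repositories are already checked out (e.g. with `git clone`) and that each prompt is a `.prompt` file; both are assumptions for illustration.

```python
import pathlib

def collect_prompts(repo_dirs):
    """Gather every .prompt file from a list of checked-out
    repositories (e.g. a clone of github.com/semiosis/prompts)
    into one catalogue, keyed by file stem."""
    catalogue = {}
    for repo in repo_dirs:
        for path in sorted(pathlib.Path(repo).rglob("*.prompt")):
            # Later repositories win on name clashes; a real
            # catalogue would need a smarter merge policy.
            catalogue[path.stem] = path.read_text()
    return catalogue
```

A Datomic-backed catalogue would then index this merged map (prompt name, source repository, generations) rather than re-reading the raw files.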
- Stage 3
  - [ ] Incorporate OpenAI’s Codex model, which should be available at this stage.
  - [ ] vim
  - [ ] Standalone application based in emacs, available through a web browser
- Use mermaid for the Gantt chart in emacs
  - Review of ‘mermaid - flowcharts, diagrams, etc.’ // Bodacious Blog https://mermaid-js.github.io/mermaid-live-editor/
gantt
title Project timeline
dateFormat YYYY-MM-DD
section Stage 1
Stage 1 :done, s1, 2021-03-01, 120d
Generate elisp functions from YAML :done, a1, 2021-03-01, 30d
Create a bunch of prompts :done, a3, 2021-03-30, 30d
Integrate helm, ivy and counsel :done, a4, 2021-04-30, 30d
Integrate org-brain :done, a5, 2021-05-30, 30d
Use elisp for portability :done, b2, 2021-07-02, 2d
Dockerize Pen :active, b3, after b2, 5d
Excise Pen from emacs.d :active, b3a, after b2, 5d
Deploy Pen to straight.el :active, b3b, after b2, 5d
Convert shell to Python :active, b4, after b2, 5d
Incorporate OpenAI parameters from loom :active, b5, after b2, 5d
Select backend interface in emacs :crit, after b3, 2d
section Stage 2
Stage 2 :s2, 2021-07-12, 120d
Prompt Catalog : b6a, 2021-07-12, 20d
Create prompt-description-mode schema in schemastore : b6b, 2021-07-12, 20d
lm-complete (backend completer) : b6c, 2021-07-12, 20d
Imaginary interpreter + imaginary-mode : b11, 2021-07-12, 20d
Incorporate semantic search : b12, 2021-07-12, 20d
Connect arbitrary prompts repositories : b6, 2021-07-12, 20d
Generations stored in Datomic : b7, after b6, 20d
Connect to more emacs packages : b8, after b7 , 20d
Select from huggingface transformers : b15, after b7 , 20d
melpa : b14, after b7 , 5d
Real-time completion of tokens with interrupt : b9, after b8, 20d
Multiversal viewer : b10, after b9, 20d
section Stage 3
Stage 3 :s3, 2021-12-12, 120d
Incorporate OpenAI Codex model : c1, 2021-12-12, 1d
Incorporate Ocean protocol and Posthuman AI Market : c2, 2021-12-12, 20d
Butterfly web service : c3, 2021-12-12, 20d
- Completed [3/7]
- [X] Default behaviour of generated functions
  - [X] First argument may be selection
  - [ ] If there is a selection, then the text is replaced by default
- [X] Prompt Catalogue tablist prototype
- [X] Generate functions of composed prompts
- [ ] Generate prompts from interactive keyboard macros
- [ ] Prompt development workflow
- [ ] Prompt catalogue tablist
- [ ] Search workflow
  - Semantic search for straight.el
  - Semantic search for nixos.el
  - Semantic concordance for KJV bible
  - Semantic search for
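The semantic-search items above could be prototyped with a toy ranking function like the one below. The bag-of-words "embedding" is a stand-in assumption; a real pipeline would swap it for a sentence-embedding model (e.g. one of the huggingface transformers the Gantt chart mentions) over package descriptions or Bible verses.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words vector; a placeholder for a real
    # sentence-embedding model.
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def semantic_search(query, docs):
    """Rank documents (package descriptions, verses, ...) by
    similarity to the query, best match first."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
```

The same ranking interface works for each target corpus (straight.el recipes, nixos.el packages, the KJV concordance); only the `docs` collection and the embedding model change.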
- Posthuman AI Market on Ocean Protocol
- https://port.oceanprotocol.com/t/posthuman-ai-market-v1-1-luci-integration/675
- Butterfly
- https://github.com/paradoxxxzero/butterfly
