11 changes: 11 additions & 0 deletions README.md
@@ -95,6 +95,17 @@ Here's the steps to set up the project locally:
7. Fill in the model names and check the "active" checkbox in the red boxes in the image below, then click "Save" (pay attention to the LLM and Embeddings model name placement):
![local_models_admin_fields](docs/assets/local_models_admin_fields.png)

### Using DeepSeek R1 with nomic

1. `make up`
2. `make createuser`
3. `make ollama model=nomic-embed-text:latest`
4. `make ollama model=deepseek-r1` (or `deepseek-coder` / `deepseek-coder-v2`; `deepseek-coder-v2` performs best if your machine can handle it; see the sanity check below this list)
5. Go to [http://localhost:8000/admin](http://localhost:8000/admin) in your browser and log in (admin/admin)
6. Select "Models" option in the menu on the left
7. Fill in the model names and check the "active" checkbox in the red boxes in the image below, then click "Save" (pay attention to the LLM and Embeddings model name placement):
![local_models_admin_fields](docs/assets/local_models_admin_fields.png)
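
Before filling in the admin form, it can help to confirm that both pulls actually completed. A minimal sanity check, assuming the official `ollama` Python client is installed and the Ollama server started by `make up` is reachable on its default port:

```python
import ollama

# Prints the locally available models; the output should include
# nomic-embed-text:latest and whichever DeepSeek model you pulled.
print(ollama.list())
```

(Exact field names in the response differ between client versions, so printing the whole listing is the least fragile check; `ollama list` on the command line works too.)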

## Prompt Examples

There are some prompt examples [here](docs/prompt_examples.md). They are sorted by complexity and give us a way to measure the quality of the LLM's responses.
1 change: 1 addition & 0 deletions labs/llm/ollama.py
@@ -12,6 +12,7 @@ def completion_without_proxy(self, messages, *args, **kwargs):
model=self._model_name,
messages=messages,
format="json",
options={"num_ctx": 8192},
*args,
**kwargs,
)
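
For context, `num_ctx` sets the size of the context window Ollama allocates for a request; the server default is small (2048 tokens in older releases), so raising it to 8192 lets longer prompts fit without silent truncation. A minimal sketch of the same option used standalone with the official `ollama` Python client; the model name and prompt are illustrative, not taken from the project:

```python
import ollama

response = ollama.chat(
    model="deepseek-r1",  # illustrative; any model pulled via `make ollama model=...`
    messages=[{"role": "user", "content": 'Reply with a JSON object {"ok": true}.'}],
    format="json",  # mirrors the project's JSON-mode setting
    options={"num_ctx": 8192},  # the same context-window override as the diff above
)
print(response["message"]["content"])
```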