
Issues running project: errors in ollama serve logs #1

@gwpl

Description


Regarding: https://github.com/punnerud/Local_Knowledge_Graph

I ran into a problem trying your project.
I installed the dependencies and ran it with `python3 app.py` on Arch Linux, but I get errors in the ollama server logs and no output in the web browser:

$ ollama serve
(...)
time=2024-09-20T22:44:48.892+02:00 level=WARN source=server.go:597 msg="client connection closed before server finished loading, aborting load"
time=2024-09-20T22:44:48.892+02:00 level=ERROR source=sched.go:456 msg="error loading llama server" error="timed out waiting for llama runner to start: context canceled"
[GIN] 2024/09/20 - 22:44:48 | 499 |  1.057118814s |       127.0.0.1 | POST     "/api/embed"

Of course, I first pulled the model:

ollama pull llama3.1:8b
pulling manifest 
pulling 8eeb52dfb3bb... 100% ▕███████████████████████▏ 4.7 GB                         
pulling 948af2743fc7... 100% ▕███████████████████████▏ 1.5 KB                         
pulling 0ba8f0e314b4... 100% ▕███████████████████████▏  12 KB                         
pulling 56bb8bd477a5... 100% ▕███████████████████████▏   96 B                         
pulling 1a4c3c319823... 100% ▕███████████████████████▏  485 B                         
verifying sha256 digest 
writing manifest 
success
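For what it's worth, the 499 status and the "client connection closed before server finished loading" warning suggest the HTTP client in the app gave up before the model finished loading into memory on the first request. Below is a minimal standalone sketch (the function names, the 300-second timeout, and the default host are my own assumptions, not taken from the project) that hits the same `/api/embed` endpoint the log shows, with a generous timeout:

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"  # default address used by `ollama serve`

def build_embed_payload(text, model="llama3.1:8b"):
    # Request body for Ollama's /api/embed endpoint (the endpoint
    # shown failing with 499 in the log above).
    return {"model": model, "input": text}

def embed(text, model="llama3.1:8b", timeout=300):
    # A long timeout (assumed value) lets the first request wait for the
    # model to load instead of the client aborting, which is what the
    # "client connection closed before server finished loading" warning
    # indicates happened here.
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/embed",
        data=json.dumps(build_embed_payload(text, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())
```

If this call succeeds after a long first wait, the fix may be raising the request timeout in app.py, or pre-loading the model with `ollama run llama3.1:8b` before starting the app.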
