Question: would this work on RPIs i.e. ARM CPUs? #1

@stevef1uk

Description

Hi,

It is possible to run some LLMs on the new Raspberry Pis with 8 GB of RAM, but it would be nice to try a larger model distributed across a number of RPIs, since some of us have quite a few of them available.

It would be nice if the tool supported this llama.cpp branch: https://github.com/ggerganov/llama.cpp/tree/gg/phi-2

Regards

Steve
