Any plans to support CPU / MPS for easier learning? #61

@Kingwl

Hi, thanks for open-sourcing this great project. mini-vLLM is a very helpful resource for understanding the core ideas behind vLLM.

I was wondering if there are any plans to support additional backends such as CPU or Apple MPS. Since this project is educational in nature, being able to run it on more platforms (e.g., laptops without CUDA GPUs) would make it much easier for people to experiment with the code and learn the concepts.

Even a slower fallback implementation (e.g., pure PyTorch attention) might already be very useful for learning purposes.
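To illustrate what such a fallback could look like: a minimal sketch of backend selection plus unfused scaled dot-product attention in plain PyTorch. Note this is not code from mini-vLLM; `pick_device` and `naive_attention` are hypothetical names for illustration only.

```python
import torch

def pick_device() -> torch.device:
    # Hypothetical helper: prefer CUDA, fall back to Apple MPS, then CPU.
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

def naive_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    # Unfused scaled dot-product attention: slower than fused CUDA kernels,
    # but runs on any backend PyTorch supports (CPU, MPS, CUDA).
    scale = q.shape[-1] ** -0.5
    scores = (q @ k.transpose(-2, -1)) * scale  # (batch, heads, seq, seq)
    probs = torch.softmax(scores, dim=-1)
    return probs @ v                            # (batch, heads, seq, head_dim)

device = pick_device()
# Shapes: (batch, heads, seq_len, head_dim)
q = torch.randn(1, 4, 8, 16, device=device)
k = torch.randn(1, 4, 8, 16, device=device)
v = torch.randn(1, 4, 8, 16, device=device)
out = naive_attention(q, k, v)
print(out.shape)  # torch.Size([1, 4, 8, 16])
```

Something like this could live behind a flag or a device check, so the existing fused path stays the default on CUDA.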

Thanks again for the great work!
