Runpod Serverless Version of the Llama4-Scout Deployment Recipe #37

@celsofranssa

Description

I was going through the excellent Llama 4 Scout on vLLM recipe.

Would it be possible to provide a version of this recipe adapted for Runpod vLLM Serverless? In particular, guidance on the required GPUs and the appropriate vLLM configuration would be very helpful.

This would make it much easier for those of us looking to deploy on Runpod’s serverless platform directly.
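For context, the kind of configuration guidance I have in mind would look something like the sketch below. The environment-variable names follow the conventions I understand Runpod's vllm-worker image to use, but the GPU count, parallelism, and context-length values are assumptions on my part, not tested settings:

```shell
# Hypothetical Runpod Serverless (vllm-worker) endpoint configuration
# for Llama 4 Scout. All values below are guesses, not verified settings.
MODEL_NAME=meta-llama/Llama-4-Scout-17B-16E-Instruct
TENSOR_PARALLEL_SIZE=8        # assumed multi-GPU worker, e.g. 8x 80 GB cards
MAX_MODEL_LEN=131072          # assumed context length; reduce if KV cache overflows
GPU_MEMORY_UTILIZATION=0.95   # leave headroom for activation memory
```

A recipe that confirms (or corrects) values like these for the serverless worker would be exactly what I'm after.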

Thanks a lot for considering this! 🙏
