Token limit error  #8

@suyashnmims

Description

I am not a professional programmer. I am trying to run this program and I am getting the error below:

This model's maximum context length is 4097 tokens, however you requested 11215 tokens (10959 in your prompt; 256 for the completion). Please reduce your prompt; or completion length.
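The error means the prompt plus the requested completion exceeded the model's 4097-token context window, so the prompt has to be shortened before the API call. A minimal sketch of one way to do that, assuming a rough heuristic of about 4 characters per token (exact counts require the model's actual tokenizer, e.g. the tiktoken library; the function name and constants here are illustrative, not part of this project's code):

```python
MAX_CONTEXT = 4097        # model's context window, from the error message
COMPLETION_TOKENS = 256   # tokens reserved for the completion, from the error message
CHARS_PER_TOKEN = 4       # rough heuristic; real counts need the model's tokenizer

def truncate_prompt(prompt: str,
                    max_context: int = MAX_CONTEXT,
                    completion_tokens: int = COMPLETION_TOKENS) -> str:
    """Trim the prompt so (estimated prompt tokens + completion tokens) fits the window."""
    budget_tokens = max_context - completion_tokens
    budget_chars = budget_tokens * CHARS_PER_TOKEN
    # keep the start of the prompt; other strategies (keep the end,
    # summarize, or chunk the input) may suit a given program better
    return prompt if len(prompt) <= budget_chars else prompt[:budget_chars]

# example: a prompt far larger than the window gets cut down to the budget
big_prompt = "x" * 50_000
trimmed = truncate_prompt(big_prompt)
print(len(trimmed) // CHARS_PER_TOKEN <= MAX_CONTEXT - COMPLETION_TOKENS)
```

Alternatively, reducing the amount of input text fed into the prompt (for example, processing a long document in smaller chunks) avoids the error without truncating mid-thought.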
