Fix Required Transformers Version in requirements.txt to Ensure Compatibility #7

@Comtive-Wdson

Description

The current `requirements.txt` specifies an incorrect version constraint for the `transformers` library, leading to compatibility issues:

- Versions earlier than 4.45.0 lack modules such as `transformers.models.mixtral` and `transformers.models.mllama`, causing a `ModuleNotFoundError` at import time.
- Later versions refactor the attention-layer attributes, introducing breaking changes such as `AttributeError: 'LlamaAttention' object has no attribute 'num_heads'`.

To fix this, pin `transformers` to exactly 4.45.0 in `requirements.txt`: `transformers==4.45.0`
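
As a quick sanity check before opening a PR, the exact-pin requirement can be verified mechanically. Below is a minimal, hypothetical stdlib-only sketch (the `check_pin` helper and its regex are illustrative, not part of this repo) that confirms a `requirements.txt` line pins `transformers` to exactly 4.45.0 rather than using a range specifier:

```python
import re

# Matches only exact pins of the form "name==version";
# range specifiers like ">=", "~=", or bare names do not match.
PIN_RE = re.compile(r"^(?P<name>[A-Za-z0-9_.-]+)==(?P<version>[\w.]+)$")

def check_pin(line: str, name: str, version: str) -> bool:
    """Return True if `line` pins `name` exactly to `version`."""
    m = PIN_RE.match(line.strip())
    return bool(m) and m["name"] == name and m["version"] == version

print(check_pin("transformers==4.45.0", "transformers", "4.45.0"))  # True
print(check_pin("transformers>=4.40",   "transformers", "4.45.0"))  # False
```

Running this against each line of `requirements.txt` would flag any loose constraint that could pull in an incompatible release.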
