build(deps): Bump transformers from 4.53.0 to 5.0.0rc3 in /examples/flowertune-llm #6968
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 837da66a45
ℹ️ About Codex in GitHub
Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you:
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".
| "peft==0.6.2", | ||
| "fschat[model_worker,webui]==0.2.35", | ||
| "transformers==4.53.0", | ||
| "transformers==5.0.0rc3", |
Align transformers pin with the existing PEFT stack
Changing this pin to transformers==5.0.0rc3 while the same file still pins peft==0.6.2 introduces a dependency mismatch that breaks this example’s training path: PEFT’s maintainers document Transformers v5 as incompatible with PEFT versions below 0.18, and this project imports/uses PEFT directly in flowertune_llm/models.py, flowertune_llm/client_app.py, and test.py. In practice, users installing this template will get an environment that fails at import/runtime unless the PEFT/TRL stack is upgraded together (or Transformers stays on 4.x).
Bumps [transformers](https://github.com/huggingface/transformers) from 4.53.0 to 5.0.0rc3.
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](huggingface/transformers@v4.53.0...v5.0.0rc3)

---
updated-dependencies:
- dependency-name: transformers
  dependency-version: 5.0.0rc3
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Force-pushed 837da66 to 0e7940b
Bumps transformers from 4.53.0 to 5.0.0rc3.
Release notes
Sourced from transformers' releases.
... (truncated)
Commits
- cb5079f v5.0.0rc3
- d1808f2 [ci] Fixing some failing tests for important models (#43231)
- 3d27645 Add LightOnOCR model implementation (#41621)
- 77146cc fix crash in when running FSDP2+TP (#43226)
- 61317f5 [CB] Ensure parallel decoding test passes using FA (#43277)
- 1efe1a6 Fix failing PegasusX, Mvp & LED model integration tests (#43245)
- e8ae373 [consistency] Ensure models are added to the _toctree.yml (#43264)
- c85be98 [docs] tensorrt-llm (#43176)
- 38022fd [style] Fix init isort and align makefile and CI (#43260)
- e977446 Fix failing Hiera, SwiftFormer & LEDModel integration tests (#43225)