feat: transformers to v5 #151
Merged
PeterStaar-IBM merged 6 commits into main on Mar 27, 2026
Conversation
Signed-off-by: Peter Staar <taa@zurich.ibm.com>
….3 patch ranges which do not support granite-docling correctly. Signed-off-by: Peter Staar <taa@zurich.ibm.com>
Contributor
✅ DCO Check Passed. Thanks @PeterStaar-IBM, all your commits are properly signed off. 🎉
Contributor
Merge Protections: Your pull request matches the following merge protections and will not be merged until they are valid.
🟢 Enforce conventional commit: Wonderful, this rule succeeded. Make sure that we follow https://www.conventionalcommits.org/en/v1.0.0/
Codecov Report: ❌ Patch coverage is
dolfim-ibm (Member) approved these changes on Mar 27, 2026:
We could add the following, but it is not needed at runtime:

```python
from typing import TYPE_CHECKING, Optional

if TYPE_CHECKING:
    from transformers import GenerationConfig

class TableFormerV2(PreTrainedModel, GenerationMixin):
    # Annotate generation_config to satisfy mypy:
    generation_config: Optional["GenerationConfig"]
    # ... rest of class ...
```
This allows transformers >=4.42.0,<5.0.0 and >=5.4.0,<6.0.0 by excluding the 5.0–5.3 release series, which does not support granite-docling correctly.
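The effective constraint (two allowed ranges with the 5.0–5.3 series carved out) can be sketched as a small version check; the function name is illustrative, not part of the PR:

```python
def transformers_version_ok(version: str) -> bool:
    """Return True if a transformers version falls in the ranges this PR
    allows: >=4.42.0,<5.0.0 or >=5.4.0,<6.0.0 (i.e. the 5.0-5.3 series,
    which does not support granite-docling correctly, is excluded)."""
    v = tuple(int(part) for part in version.split("."))
    return (4, 42, 0) <= v < (5, 0, 0) or (5, 4, 0) <= v < (6, 0, 0)

assert transformers_version_ok("4.42.0")      # old supported range
assert not transformers_version_ok("5.3.2")   # excluded 5.0-5.3 series
assert transformers_version_ok("5.4.1")       # new supported range
```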
Two changes were made to docling_ibm_models/tableformer_v2/model.py:
1. Added GenerationMixin to the class inheritance
In transformers v5, PreTrainedModel no longer inherits from GenerationMixin. Any model that defines prepare_inputs_for_generation is expected to explicitly inherit from GenerationMixin (placed after PreTrainedModel in the MRO).
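The inheritance change can be illustrated with stand-in classes (placeholders for the real transformers classes, whose internals are much larger): the mixin placed after the base class supplies generate, which dispatches to the model's own prepare_inputs_for_generation.

```python
# Stand-ins for illustration only; the real classes are
# transformers.PreTrainedModel and transformers.GenerationMixin.
class PreTrainedModelStub:
    def save_pretrained(self):
        return "saved"

class GenerationMixinStub:
    def generate(self, **kwargs):
        # generate() relies on the model defining
        # prepare_inputs_for_generation, which in v5 no longer
        # arrives via PreTrainedModel itself.
        return self.prepare_inputs_for_generation(**kwargs)

# Mixin placed after the base class, as in the PR.
class TableFormerV2(PreTrainedModelStub, GenerationMixinStub):
    def prepare_inputs_for_generation(self, **kwargs):
        return kwargs

model = TableFormerV2()
assert model.generate(input_ids=[1, 2]) == {"input_ids": [1, 2]}
```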
2. Added self.post_init() at the end of __init__
In transformers v5, post_init() is no longer called automatically by PreTrainedModel.__init__. Each model must call it explicitly at the end of its own __init__. This method sets several instance attributes that transformers v5 relies on during from_pretrained, including all_tied_weights_keys, the dict whose absence was causing the AttributeError.
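The failure mode and the fix can be sketched with a stub base class (a placeholder for the real PreTrainedModel; the real post_init does much more than shown here):

```python
# Stub mimicking the v5 behavior: the base __init__ no longer
# calls post_init() on the subclass's behalf.
class PreTrainedModelV5Stub:
    def __init__(self, config):
        self.config = config
        # Note: no self.post_init() call here in v5.

    def post_init(self):
        # Sets attributes that from_pretrained later reads; without this,
        # accessing all_tied_weights_keys raises AttributeError.
        self.all_tied_weights_keys = {}

class TableFormerV2(PreTrainedModelV5Stub):
    def __init__(self, config):
        super().__init__(config)
        # ... build submodules ...
        self.post_init()  # explicit call required in v5

model = TableFormerV2(config={})
assert model.all_tied_weights_keys == {}  # present, no AttributeError
```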