
[bugfix] Fix the multi-LoRA issue in Twinkle #24

Open
Jintao-Huang wants to merge 2 commits into modelscope:main from Jintao-Huang:fix_twinkle_multi_lora

Conversation

@Jintao-Huang
Collaborator

No description provided.


@gemini-code-assist bot left a comment


Code Review

This pull request introduces two changes to src/mcore_bridge/bridge/gpt_bridge.py. First, it refines how PEFT-related state-dictionary keys are processed in the _set_module function by adding a check against _adapter_name, ensuring that only keys belonging to the current adapter are processed. Second, it adds a None check for the value v before invoking the converter function in export_weights, which improves robustness. There are no review comments to address.



