
Commit 6b48d61

fix TEGroupedLinear (#94)

1 parent 2c24077


src/twinkle/model/megatron/tuners/lora.py

Lines changed: 0 additions & 2 deletions
```diff
@@ -136,8 +136,6 @@ def update_layer(self, adapter_name: str, r: int, *, lora_alpha: int, lora_dropo
             'config': self.config,
             'is_expert': self.is_expert,
         }
-        if exists('megatron_core>=0.13'):
-            kwargs['tp_group'] = self.base_layer.tp_group
 
         if isinstance(self.base_layer, TopKRouter):
             # Router layer - no parallelism needed
```
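The two removed lines forwarded a `tp_group` keyword to the wrapped base layer on newer megatron-core versions. A hypothetical minimal sketch of the failure mode such a fix typically addresses (the stub class below is invented for illustration and is not the project's real code): passing a keyword argument that the wrapped layer's constructor does not declare raises a `TypeError` at construction time.

```python
# Hypothetical reproduction (GroupedLinearStub is an invented stand-in,
# not a Megatron or Transformer Engine class): forwarding a kwarg the
# wrapped layer's __init__ does not accept fails immediately.

class GroupedLinearStub:
    """Stand-in for a grouped linear layer that takes no tp_group kwarg."""

    def __init__(self, config, is_expert):
        self.config = config
        self.is_expert = is_expert


kwargs = {'config': {}, 'is_expert': False}
layer = GroupedLinearStub(**kwargs)  # constructs fine with the base kwargs

kwargs['tp_group'] = object()  # the extra kwarg the removed lines added
try:
    GroupedLinearStub(**kwargs)
except TypeError as exc:
    print(type(exc).__name__)  # prints: TypeError
```

Dropping the version-gated forwarding, as this commit does, presumably keeps the shared `kwargs` dict compatible with layers such as `TEGroupedLinear` that do not accept `tp_group`.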

0 commit comments
