
[论文讨论] Shared LoRA Subspaces for almost Strict Continual Learning #56

@gqy20

Description


Paper Information

Title: Shared LoRA Subspaces for almost Strict Continual Learning
Authors: Prakhar Kaushik, Ankit Vaidya, Shravan Chaudhari, Rama Chellappa, Alan Yuille
Published: 2026-02-05
Category: cs.AI
PDF: Download

Abstract

Adapting large pretrained models to new tasks efficiently and continually is crucial for real-world deployment but remains challenging due to catastrophic forgetting and the high cost of retraining. While parameter-efficient tuning methods like low rank adaptation (LoRA) reduce computational demands, they lack mechanisms for strict continual learning and knowledge integration without relying on data replay or multiple adapters. We propose Share, a novel approach to parameter-efficient continual finetuning that learns and dynamically updates a single, shared low-rank subspace, enabling seamless adaptation across multiple tasks and modalities. Share constructs a foundational subspace that extracts core knowledge from past tasks and incrementally integrates new information by identifying essential subspace directions. Knowledge from each new task is incorporated into this evolving subspace, facilitating forward knowledge transfer while minimizing catastrophic interference. This approach achieves up to 100x parameter reduction and 281x memory savings over traditional LoRA methods, maintaining performance comparable to jointly trained models. A single Share model can replace hundreds of task-specific LoRA adapters, supporting scalable, asynchronous continual learning...
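To make the idea more concrete for discussion, here is a minimal, illustrative sketch of how a single shared low-rank subspace could absorb successive task updates by keeping only the leading singular directions. This is not the paper's actual algorithm: the names (`SharedSubspace`, `integrate_task`) are hypothetical, and using SVD truncation to pick "essential subspace directions" is an assumption made for illustration.

```python
# Illustrative sketch only: one shared low-rank subspace that folds in each
# new task's weight update by keeping the leading singular directions.
# Not the paper's method; names and the SVD-truncation step are assumptions.
import numpy as np

class SharedSubspace:
    def __init__(self, d_out: int, d_in: int, rank: int):
        self.rank = rank
        # B @ A approximates the accumulated low-rank update (as in LoRA).
        self.B = np.zeros((d_out, rank))
        self.A = np.zeros((rank, d_in))

    def integrate_task(self, delta_w: np.ndarray) -> None:
        """Merge a new task's dense update delta_w into the shared subspace.

        The current subspace and the new update are summed, then truncated
        back to `rank` via SVD, so only the dominant directions survive.
        """
        combined = self.B @ self.A + delta_w
        u, s, vt = np.linalg.svd(combined, full_matrices=False)
        k = self.rank
        self.B = u[:, :k] * s[:k]   # (d_out, k)
        self.A = vt[:k, :]          # (k, d_in)

    def delta(self) -> np.ndarray:
        """Low-rank update to add onto the frozen pretrained weight."""
        return self.B @ self.A


# Usage: three synthetic "task" updates folded into one rank-8 subspace.
rng = np.random.default_rng(0)
sub = SharedSubspace(d_out=64, d_in=32, rank=8)
for _ in range(3):
    task_update = rng.normal(scale=0.01, size=(64, 32))
    sub.integrate_task(task_update)
print(sub.delta().shape)  # (64, 32), parameterized by only 8 shared directions
```

The point of the sketch is that every task shares the same rank-k parameterization, which is where the abstract's claimed parameter and memory savings over per-task LoRA adapters would come from.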

Reason for Recommendation

This paper is well suited for discussion: LoRA is already an industry standard, and how Share balances old- and new-task knowledge within a single shared subspace is worth exploring in depth.

Discussion

Please share your thoughts on this paper:

  • What are the paper's main contributions or novel ideas?
  • Is the proposed method sound?
  • Are the experimental results convincing?
  • What could be improved?

Automatically created by arXiv Monitor
