
[feat] add podnet #5

Open

Moenupa wants to merge 5 commits into main from feat/podnet

Conversation


@Moenupa (Owner) commented Feb 7, 2026

What does this PR do?

Add a PODNet learner based on iCaRL.

CIFAR100 training: https://api.wandb.ai/links/moenupa/vegphl6w

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline?
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

@Moenupa Moenupa requested a review from SmallPigPeppa February 7, 2026 12:43

Copilot AI left a comment


Pull request overview

This PR adds a PODNet (Pooled Outputs Distillation) learner to the LyCIL library, implementing the method from Douillard et al., ECCV 2020. The implementation extends the existing iCaRL learner and introduces spatial and flat distillation losses alongside a Neighborhood Component Analysis (NCA) classification loss. The PR also includes the infrastructure changes needed to support buffer-only training phases, as well as cosine classifier head improvements.
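To make the spatial distillation term concrete, here is a minimal sketch of a POD-spatial loss in PyTorch: each intermediate feature map is pooled along height and along width, and the L2 distance between the normalized pooled vectors of the old and new backbones is penalized. The function and argument names are illustrative, not the API of the PR's `podnet.py`.

```python
import torch
import torch.nn.functional as F

def pod_spatial_loss(old_feats, new_feats):
    """POD-spatial distillation sketch.

    old_feats / new_feats: lists of (B, C, H, W) feature maps from the
    frozen old backbone and the current backbone, stage by stage.
    """
    loss = torch.tensor(0.0)
    for a, b in zip(old_feats, new_feats):
        a, b = a.pow(2), b.pow(2)  # work on activation energy
        # width-wise pooling (collapse H) and height-wise pooling (collapse W)
        a_pooled = torch.cat([a.sum(dim=2).flatten(1), a.sum(dim=3).flatten(1)], dim=1)
        b_pooled = torch.cat([b.sum(dim=2).flatten(1), b.sum(dim=3).flatten(1)], dim=1)
        # L2 distance between normalized pooled embeddings
        diff = F.normalize(a_pooled, dim=1) - F.normalize(b_pooled, dim=1)
        loss = loss + diff.norm(dim=1).mean()
    return loss / len(old_feats)
```

The "flat" variant of the same idea simply compares the final flattened embeddings instead of per-stage pooled slices.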

Changes:

  • Adds PODNet learner with NCA loss, spatial distillation, and flat distillation
  • Refactors base learner to support buffer-only training with special task_id handling
  • Updates data module to expose train_filter_fn and use_buffer for flexible buffer control
  • Enhances cosine classifier head with proxy support and explicit gradient control
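The NCA classification loss listed above can be sketched as follows: the margin-adjusted true-class similarity is maximized against the log-sum-exp of all other class similarities, with negative per-sample losses clamped at zero (the "hinged" variant used by PODNet). The `scale` and `margin` defaults here are illustrative, not the values chosen in the PR.

```python
import torch

def nca_loss(sim, targets, scale=10.0, margin=0.6):
    """Hinged NCA loss sketch: -log exp(eta*(s_y - delta)) / sum_{i!=y} exp(eta*s_i),
    clamped at zero per sample.

    sim: (B, n_classes) class similarities; targets: (B,) class indices.
    """
    idx = torch.arange(sim.size(0))
    pos = scale * (sim[idx, targets] - margin)   # eta * (s_y - delta)
    neg = scale * sim.clone()
    neg[idx, targets] = float("-inf")            # exclude y from the denominator
    per_sample = -(pos - torch.logsumexp(neg, dim=1))
    return per_sample.clamp(min=0.0).mean()
```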

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 6 comments.

Summary per file:

  • src/lycil/learner/podnet.py - New PODNet learner implementation with NCA loss and spatial/flat distillation
  • tests/training/test_podnet_cifar.py - Test file for PODNet on CIFAR datasets with two-phase training (task + memory)
  • src/lycil/learner/base.py - Refactored sync_with_datamodule to support a buffer-only training bypass; moved the head-expansion logic
  • src/lycil/data/hfmodule.py - Added train_filter_fn and use_buffer attributes for flexible buffer control during training
  • src/lycil/classifier/__init__.py - Updated cosine head defaults and added explicit gradient requirements for new heads
  • src/lycil/classifier/linears.py - Minor import-order fix
  • tests/training/test_lwf_cifar.py - Import-order adjustment for consistency
  • tests/training/test_icarl_cifar.py - Import-order adjustment for consistency
  • pyproject.toml - Updated the wandb version range and commented out the NPU lightning dependency
  • Makefile - Changed the test environment from WANDB_DISABLED to WANDB_MODE=offline
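The "proxy support" added to the cosine classifier head can be illustrated with a small sketch in the style of PODNet's Local Similarity Classifier: each class owns k learnable proxies, and the class score is a softmax-weighted mix of the per-proxy cosine similarities. The class name, constructor signature, and k default are hypothetical, not the PR's actual head.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineProxyHead(nn.Module):
    """Sketch of a cosine classifier with k proxies per class."""

    def __init__(self, in_dim: int, n_classes: int, k: int = 10):
        super().__init__()
        self.k = k
        # n_classes * k proxy vectors, each of dimension in_dim
        self.weight = nn.Parameter(torch.randn(n_classes * k, in_dim) * 0.01)

    def forward(self, x):
        # cosine similarity between normalized features and normalized proxies
        sim = F.linear(F.normalize(x, dim=1), F.normalize(self.weight, dim=1))
        sim = sim.view(x.size(0), -1, self.k)  # (B, n_classes, k)
        attn = F.softmax(sim, dim=2)           # per-class weights over proxies
        return (attn * sim).sum(dim=2)         # (B, n_classes)
```

Because the output is a convex combination of cosine similarities, every class score stays in [-1, 1], which pairs naturally with the scaled NCA loss.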


