
Add ability to load in pretraining weights for finetuning in chemprop #468

Open

dwwest wants to merge 1 commit into main from pretraining

Conversation


dwwest (Contributor) commented Jan 16, 2026

We want to be able to load pretraining weights from external sources.

This PR affects only chemprop, but it could eventually be extended to other foundation models.


codecov-commenter commented Jan 16, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.


hmacdope (Contributor) commented:

Does this not already exist, @smcolby, or am I confused?


smcolby (Contributor) commented Jan 19, 2026

The current load path uses deserialize, so it needs paths to both the hyperparameters and the weights. Devany might be referring to a situation where we have only the weights, from a model not trained by us?
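To illustrate the distinction being discussed: when only a raw weight checkpoint is available (no serialized hyperparameters), a common approach is to filter the external state dict down to the entries whose names and shapes match the freshly built model, then merge them over the model's random initialization. The sketch below is a minimal, framework-free illustration of that filtering logic; the function name `match_pretrained_weights` and the dict-of-shapes representation are hypothetical and not part of chemprop's API.

```python
def match_pretrained_weights(model_state, pretrained_state):
    """Return a merged state: keep pretrained entries whose key and
    shape match the model; fall back to the model's own values for
    everything else. Shapes are compared so a mismatched layer (e.g.
    a different output head) is silently skipped rather than loaded."""
    matched = {
        key: value
        for key, value in pretrained_state.items()
        if key in model_state and value["shape"] == model_state[key]["shape"]
    }
    # Start from the model's (randomly initialized) state and overwrite
    # only the layers covered by the external checkpoint.
    merged = dict(model_state)
    merged.update(matched)
    return merged


# Toy states: layer names and shapes are illustrative, not chemprop's.
model = {
    "encoder.weight": {"shape": (300, 133), "source": "random"},
    "ffn.weight": {"shape": (1, 300), "source": "random"},
}
pretrained = {
    "encoder.weight": {"shape": (300, 133), "source": "pretrained"},  # matches, loaded
    "readout.weight": {"shape": (2, 300), "source": "pretrained"},    # unknown key, skipped
}

merged = match_pretrained_weights(model, pretrained)
```

In a real PyTorch workflow the same effect is what `load_state_dict(..., strict=False)` provides, but the explicit filter makes it easy to log exactly which layers were initialized from the external checkpoint.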

