v2.1
zhjai released this on 02 Mar 13:13
What's New
Cross-Domain Autoregressive Pretraining
Added run_pretrain_universal.py for multi-domain joint pretraining across all five signal domains (ECG, EEG, FD, HAR, RWC)
Supports weighted domain sampling for balanced cross-domain training
Mixed-precision training with cosine warmup scheduling and early stopping
Deterministic reproducibility setup for consistent results
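As a rough illustration of the weighted domain sampling mentioned above, here is a minimal sketch of drawing the domain for each batch in proportion to a per-domain weight. The function name, the 5:1:1:1:1 weighting, and the sampling mechanism are illustrative assumptions, not the script's actual implementation.

```python
import random

# Hypothetical sketch of weighted domain sampling; domain names come from
# the release notes, everything else is illustrative.
DOMAINS = ["ECG", "EEG", "FD", "HAR", "RWC"]

def sample_domain(weights, rng):
    """Pick the domain for the next batch, proportional to its weight."""
    return rng.choices(DOMAINS, weights=weights, k=1)[0]

rng = random.Random(0)
counts = {d: 0 for d in DOMAINS}
for _ in range(10_000):
    # An over-weighted domain (here ECG at 5:1:1:1:1) is sampled more often,
    # which is how under-represented domains can be rebalanced.
    counts[sample_domain([5, 1, 1, 1, 1], rng)] += 1
```

In a real training loop the weights would typically be set inversely to each domain's dataset size, so that smaller domains are not drowned out during joint pretraining.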
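"Cosine warmup scheduling" plausibly means a linear learning-rate warmup followed by cosine decay; the sketch below shows that common pattern under that assumption. The function name and constants are hypothetical, not taken from the script.

```python
import math

# Hypothetical sketch: linear warmup, then cosine decay to zero.
def lr_at(step, warmup_steps, total_steps, base_lr):
    if step < warmup_steps:
        # Linear warmup from base_lr / warmup_steps up to base_lr.
        return base_lr * (step + 1) / warmup_steps
    # Cosine decay over the remaining steps: base_lr at progress 0,
    # zero at progress 1.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

With `warmup_steps=100`, `total_steps=1000`, `base_lr=1e-3`, the rate ramps up to 1e-3 at step 100 and decays smoothly toward zero by step 1000.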
Bug Fixes
Fixed an off-by-one error in the average loss and metric calculation in the TStokenizer `Trainer` class (`loss_sum / idx` → `loss_sum / (idx + 1)`)
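The off-by-one arises because `enumerate` starts at 0, so after the loop `idx` is one less than the number of batches. A minimal sketch (the loss values are illustrative, not from the repository):

```python
# Illustrative batch losses; the averaging pattern is what matters.
losses = [2.0, 1.0, 0.5, 0.5]

loss_sum = 0.0
for idx, loss in enumerate(losses):
    loss_sum += loss

# Fixed: idx ends at 3 for 4 batches, so the divisor must be idx + 1.
avg = loss_sum / (idx + 1)
# The buggy form, loss_sum / idx, would divide by 3 and overstate the average.
```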
Documentation
Updated README to reflect the correct three-stage training pipeline:
TStokenizer training
Cross-domain autoregressive pretraining (run_pretrain_universal.py)
Supervised fine-tuning (Universal + Adaptation)
Added run_pretrain_universal.py to project structure
Changed Files
run_pretrain_universal.py (new)
TStokenizer/process.py
README.md