This repository contains an implementation of TimesBERT (from the paper TimesBERT: A BERT-Style Foundation Model for Time Series Understanding) adapted for ad performance data.
TimesBERT is a BERT-style encoder-only Transformer designed to learn structured representations of multivariate time series.
Instead of focusing only on forecasting, TimesBERT enables a broader paradigm called time series understanding, including:
- Classification
- Anomaly detection
- Imputation
- Short-term forecasting
This implementation adapts TimesBERT to Facebook Ads performance datasets, pretraining on ad-level time series features (spend, impressions, CTR, ROAS, etc.).
- Masked Patch Modeling (MPM): Learns temporal representations by masking and reconstructing patches.
- Functional Token Prediction (FTP): Predicts variate-level and domain-level tokens for better multi-granularity structure.
- Multi-domain Training: Handles ad campaigns across different domains (campaign names).
- Scalable Training: Supports multi-GPU training with PyTorch `DataParallel`.
- SafeTensors Saving: Uses `safetensors` for efficient checkpoint saving.
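The MPM objective above can be sketched as follows: split each multivariate series into fixed-length patches, zero out a random subset, and train the encoder to reconstruct the masked patches. This is an illustrative sketch only; `mask_patches`, the patch length, and the mask ratio are assumptions for demonstration, not `train.py`'s actual API.

```python
import numpy as np

def mask_patches(series, patch_len=8, mask_ratio=0.4, seed=0):
    """Split a (time, variates) series into non-overlapping patches and
    zero out a random subset. Returns (masked, mask, patches).
    Hypothetical helper for illustration, not part of train.py."""
    rng = np.random.default_rng(seed)
    t, c = series.shape
    n_patches = t // patch_len
    # shape: (n_patches, patch_len, variates)
    patches = series[: n_patches * patch_len].reshape(n_patches, patch_len, c)
    mask = rng.random(n_patches) < mask_ratio   # True = patch to reconstruct
    masked = patches.copy()
    masked[mask] = 0.0  # a real model would use a learned mask token instead
    return masked, mask, patches

# 64 time steps of 3 ad metrics (e.g. spend, impressions, CTR)
series = np.arange(64 * 3, dtype=float).reshape(64, 3)
masked, mask, patches = mask_patches(series)
# the MPM loss would reconstruct patches[mask] from the encoder output
```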
```
.
├── data/                       # Input CSV files (one per ad/campaign)
├── timesbert_ad_performance/   # Output directory for checkpoints & final models
│   ├── checkpoints/            # Intermediate checkpoints
│   ├── model.safetensors       # Final trained model
│   ├── config.json             # Saved model config
│   └── domain_map.json         # Domain (campaign) mappings
├── train.py                    # Main training script (this repo's core file)
└── README.md                   # Project documentation
```
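`domain_map.json` stores the mapping from campaign names to integer domain ids used by the domain-level tokens. The exact structure written by `train.py` may differ; the sketch below assumes a flat name-to-id mapping with hypothetical campaign names, purely for illustration.

```python
import json
import os
import tempfile

# Assumed structure: campaign name -> integer domain id.
domain_map = {"summer_sale": 0, "retargeting_q3": 1, "brand_awareness": 2}

path = os.path.join(tempfile.gettempdir(), "domain_map.json")
with open(path, "w") as f:
    json.dump(domain_map, f, indent=2)

with open(path) as f:
    loaded = json.load(f)
# these ids are what FTP's domain-level token prediction would target
```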
- Install dependencies:

  ```
  pip install -r requirements.txt
  ```

- Place your ad performance CSV files under the `data/` folder.

- Run training:

  ```
  python train.py
  ```

The model will:
- Pretrain using MPM + FTP
- Save checkpoints every 10 epochs
- Save the final model at the end
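The periodic checkpointing above can be sketched with a small helper that returns a checkpoint path only on save epochs. The naming scheme follows the `model_epoch_N.safetensors` pattern used elsewhere in this README; the interval and directory defaults are assumptions about `train.py`, not its verified behavior.

```python
def checkpoint_path(epoch, every=10,
                    out_dir="timesbert_ad_performance/checkpoints"):
    """Return the checkpoint file path when `epoch` hits the save interval,
    else None. Hypothetical helper mirroring the assumed save schedule."""
    if epoch % every == 0:
        return f"{out_dir}/model_epoch_{epoch}.safetensors"
    return None

# over 30 epochs, checkpoints land at epochs 10, 20, and 30
saved = [checkpoint_path(e) for e in range(1, 31) if checkpoint_path(e)]
```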
After training, you can run masked patch reconstruction on validation data:
```python
from train import run_inference

plot_path = run_inference("timesbert_ad_performance/checkpoints/model_epoch_10.safetensors", epoch=10)
print("Saved reconstruction plot:", plot_path)
```

If you use this repo, please cite the original paper:
TimesBERT: A BERT-Style Foundation Model for Time Series Understanding
Haoran Zhang, Yong Liu, Yunzhong Qiu, Haixuan Liu, Zhongyi Pei, Jianmin Wang, Mingsheng Long
arXiv:2502.21245
This implementation is for research purposes only.
Please check the original paper for licensing details.