TimesBERT-AdPerformance

This repository contains an implementation of TimesBERT (from the paper TimesBERT: A BERT-Style Foundation Model for Time Series Understanding) adapted for ad performance data.


πŸ“Œ Overview

TimesBERT is a BERT-style encoder-only Transformer designed to learn structured representations of multivariate time series.
Instead of focusing only on forecasting, TimesBERT enables a broader paradigm called time series understanding, including:

  • Classification
  • Anomaly detection
  • Imputation
  • Short-term forecasting

This implementation adapts TimesBERT to Facebook Ads performance datasets, pretraining on ad-level time series features (spend, impressions, CTR, ROAS, etc.).
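As a minimal sketch of how one ad's CSV might be turned into the (time, variates) array the model consumes — the column names below come from the feature list above, but the exact schema of your export is an assumption:

```python
import numpy as np
import pandas as pd

# Hypothetical column names; adjust to match your actual ad-performance export.
FEATURES = ["spend", "impressions", "ctr", "roas"]

def load_ad_series(csv_path: str) -> np.ndarray:
    """Load one ad/campaign CSV into a (time, variates) float array."""
    df = pd.read_csv(csv_path)
    return df[FEATURES].to_numpy(dtype=np.float32)
```

Each CSV under data/ would yield one such array, with the campaign name used as the domain label.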


βš™οΈ Features

  • Masked Patch Modeling (MPM): Learns temporal representations by masking and reconstructing patches.
  • Functional Token Prediction (FTP): Predicts variate-level and domain-level tokens for better multi-granularity structure.
  • Multi-domain Training: Handles ad campaigns across different domains (campaign names).
  • Scalable Training: Supports multi-GPU training with PyTorch DataParallel.
  • SafeTensors Saving: Uses safetensors for efficient checkpoint saving.
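To illustrate the MPM objective, here is a minimal sketch (not the repo's actual train.py code): the series is split into fixed-length patches, a random subset is masked out, and the model is trained to reconstruct only the masked patches.

```python
import numpy as np

def mask_patches(series: np.ndarray, patch_len: int = 4,
                 mask_ratio: float = 0.3, seed: int = 0):
    """Split a (time, variates) series into patches and zero a random subset.

    Returns (masked_series, mask), where mask[i] is True for masked patches.
    Zeroing stands in for a learned [MASK] embedding.
    """
    rng = np.random.default_rng(seed)
    t, _ = series.shape
    n_patches = t // patch_len
    mask = rng.random(n_patches) < mask_ratio
    masked = series.copy()
    for i in np.flatnonzero(mask):
        masked[i * patch_len:(i + 1) * patch_len] = 0.0
    return masked, mask

# The pretraining loss would be MSE between the model's reconstruction and the
# original values on masked patches only; FTP adds variate- and domain-token
# prediction on top of this and is omitted here.
```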

πŸ“‚ Project Structure

β”œβ”€β”€ data/                          # Input CSV files (one per ad/campaign)
β”œβ”€β”€ timesbert_ad_performance/      # Output directory for checkpoints & final models
β”‚   β”œβ”€β”€ checkpoints/               # Intermediate checkpoints
β”‚   β”œβ”€β”€ model.safetensors          # Final trained model
β”‚   β”œβ”€β”€ config.json                # Saved model config
β”‚   └── domain_map.json            # Domain (campaign) mappings
β”œβ”€β”€ train.py                       # Main training script (this repo's core file)
└── README.md                      # Project documentation
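The domain_map.json file stores the mapping from campaign names to integer domain IDs used by FTP. Its exact format depends on how train.py serializes it; a plausible shape (hypothetical campaign names) would be:

```json
{
  "Campaign A": 0,
  "Campaign B": 1
}
```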

πŸš€ Training

  1. Install dependencies:

pip install -r requirements.txt

  2. Place your ad performance CSV files under the data/ folder.

  3. Run training:

python train.py

The model will:

  • Pretrain using MPM + FTP
  • Save checkpoints every 10 epochs
  • Save the final model at the end

πŸ”Ž Inference

After training, you can run masked patch reconstruction on validation data:

from train import run_inference
plot_path = run_inference("timesbert_ad_performance/checkpoints/model_epoch_10.safetensors", epoch=10)
print("Saved reconstruction plot:", plot_path)

πŸ“š Reference

If you use this repo, please cite the original paper:

TimesBERT: A BERT-Style Foundation Model for Time Series Understanding
Haoran Zhang, Yong Liu, Yunzhong Qiu, Haixuan Liu, Zhongyi Pei, Jianmin Wang, Mingsheng Long
arXiv:2502.21245


πŸ“ License

This implementation is for research purposes only.
Please check the original paper for licensing details.

