CHiLD: Towards Identifiability of Hierarchical Temporal Causal Representation Learning
Zijian Li1,2,*, Minghao Fu4,2,*, Junxian Huang3, Yifan Shen2, Ruichu Cai3, Yuewen Sun1,2, Guangyi Chen1,2, Kun Zhang1,2

1 Carnegie Mellon University  2 Mohamed bin Zayed University of Artificial Intelligence  3 Guangdong University of Technology  4 University of California, San Diego


Motivation

Modeling the hierarchical latent dynamics behind time series data is critical for capturing temporal dependencies across multiple levels of abstraction in real-world tasks. However, existing temporal causal representation learning methods fail to capture such dynamics, because they cannot recover the joint distribution of hierarchical latent variables from single-timestep observed variables. Interestingly, we find that the joint distribution of hierarchical latent variables can be uniquely determined using three conditionally independent observations. Building on this insight, we propose the Causally Hierarchical Latent Dynamic (CHiLD) identification framework. The time series generation process with a hierarchical latent causal process is shown in Figure 1.



Figure 1. Illustration of the data generation process with hierarchical temporal dynamics consisting of $L$ layers of latent variables. The observed variables $\mathbf{x}_t$ are generated by $\mathbf{x}_t=\mathbf{g}(\mathbf{z}_t^1,\epsilon_t^0)$, where $\mathbf{g}$ and $\epsilon_t^0$ denote the nonlinear mixing function and noise, respectively. Each $\mathbf{z}_t^l$ is influenced by its time-delayed parent $\mathbf{z}_{t-1}^l$ and, for $l\leq L-1$, its hierarchical parent $\mathbf{z}_t^{l+1}$.
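As a concrete illustration, the generation process in Figure 1 can be simulated with a toy NumPy sketch. The specific transition and mixing functions below (`tanh` maps with random linear weights) are arbitrary placeholders for illustration, not the functions used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
L, T, d = 3, 10, 4  # number of layers, timesteps, latent dimension (toy sizes)

def f(parent_prev, parent_up):
    # placeholder nonlinear transition mixing both parents, plus noise
    return np.tanh(parent_prev + 0.5 * parent_up) + 0.1 * rng.standard_normal(d)

z = np.zeros((T, L, d))
z[0] = rng.standard_normal((L, d))
for t in range(1, T):
    # top layer L has only a time-delayed parent z_{t-1}^L
    z[t, L - 1] = np.tanh(z[t - 1, L - 1]) + 0.1 * rng.standard_normal(d)
    for l in range(L - 2, -1, -1):
        # layer l depends on its own past z_{t-1}^l and the higher layer z_t^{l+1}
        z[t, l] = f(z[t - 1, l], z[t, l + 1])

# observations: x_t = g(z_t^1, eps_t^0); here g is a fixed random linear map + tanh
W = rng.standard_normal((d, d))
x = np.tanh(z[:, 0] @ W) + 0.1 * rng.standard_normal((T, d))
print(x.shape)  # prints: (10, 4)
```

Only the bottom-layer latents $\mathbf{z}_t^1$ (index `z[:, 0]` above) enter the mixing function, which is what makes recovering the full hierarchy from single-timestep observations nontrivial.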

Model

Based on these theoretical results, we develop the CHiLD model, shown in Figure 2.



Figure 2. The overall framework of CHiLD, which incorporates a contextual hierarchical encoder (Enc), a step-wise decoder (Dec), and hierarchical prior networks.

Requirements

  • Python==3.10
  • torch==2.4.1
  • tqdm==4.64.1
  • einops==0.8.0
  • numpy==1.24.4

Dependencies can be installed using the following command:

pip install -r requirements.txt

Data

The datasets are provided in the dataset.zip file. Please unzip it before running the code:

unzip dataset.zip
rm dataset.zip # optional

Reproducibility

To reproduce the results, you can run the provided scripts, for example:

bash ./scripts/ETT/ETTh1.sh

The important parameters are explained below:

| Parameter name | Description |
| --- | --- |
| data | The dataset name. |
| root_path | The root path of the data file (defaults to ./dataset/human/). |
| data_path | The data file name (defaults to WalkDog_all.npy). |
| features | The forecasting task (defaults to M). Options: M (multivariate predict multivariate), S (univariate predict univariate), MS (multivariate predict univariate). |
| seq_len | Input sequence length (defaults to 24). |
| des | Experiment description. |
| itr | Number of experiment repetitions. |
| train_epochs | Number of training epochs. |
| layer | Number of hierarchical layers in the model. |

For more parameter information, please refer to main.py.
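For orientation, the parameters above correspond to command-line flags parsed in main.py. The argparse sketch below is a hypothetical reconstruction, not the actual code: flag names and the root_path, data_path, features, and seq_len defaults come from the table above, while the remaining defaults are placeholders:

```python
import argparse

# hypothetical reconstruction of the flags listed above; see main.py for the full set
parser = argparse.ArgumentParser(description="CHiLD")
parser.add_argument("--data", type=str, default="human", help="dataset name (default is a placeholder)")
parser.add_argument("--root_path", type=str, default="./dataset/human/", help="root path of the data file")
parser.add_argument("--data_path", type=str, default="WalkDog_all.npy", help="data file name")
parser.add_argument("--features", type=str, default="M", choices=["M", "S", "MS"], help="forecasting task")
parser.add_argument("--seq_len", type=int, default=24, help="input sequence length")
parser.add_argument("--des", type=str, default="test", help="experiment description (placeholder default)")
parser.add_argument("--itr", type=int, default=1, help="number of experiment repetitions (placeholder default)")
parser.add_argument("--train_epochs", type=int, default=10, help="number of training epochs (placeholder default)")
parser.add_argument("--layer", type=int, default=3, help="number of hierarchical layers (placeholder default)")

args = parser.parse_args([])  # parse defaults only
print(args.seq_len, args.features)  # prints: 24 M
```

In a script, these flags would be passed as, e.g., `--data human --seq_len 24 --layer 3`.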

Results

The main results are shown in Table 1.

Table 1. Main results on real-world datasets.


Citation

If you find this repository useful in your research, please consider citing the following paper:

@article{li2025towards,
  title={Towards Identifiability of Hierarchical Temporal Causal Representation Learning},
  author={Li, Zijian and Fu, Minghao and Huang, Junxian and Shen, Yifan and Cai, Ruichu and Sun, Yuewen and Chen, Guangyi and Zhang, Kun},
  journal={arXiv preprint arXiv:2510.18310},
  year={2025}
}

About

[NeurIPS 2025] Towards Identifiability of Hierarchical Temporal Causal Representation Learning
