Drive Accuracy and Efficiency with Intelligence.
For the most comprehensive usage documentation, please visit https://docs.deeph-pack.com/deeph-pack.
The modernized DeepH-pack is built upon the solid foundation of its predecessor and has been re-engineered with JAX and FLAX to unlock new levels of efficiency and flexibility.
Please visit the DeepH-pack official website to apply for and obtain the software.
Before installing DeepH-pack, ensure that uv — a fast and versatile Python package manager — is properly installed and configured, and that your Python 3.13 environment is set up. If you plan to run DeepH in a GPU-accelerated environment, you must also pre-install CUDA 12.8 or 12.9.
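As a concrete starting point, the environment can be prepared roughly as follows (a minimal sketch; the environment name is a placeholder, and the CUDA toolkit itself must be installed separately for the GPU build):

```bash
# Create a Python 3.13 virtual environment with uv and activate it
# ("deeph-env" is an example name, not mandated by DeepH-pack)
uv venv deeph-env --python 3.13
source deeph-env/bin/activate

# Confirm the interpreter version before installing the wheel
python --version   # expect Python 3.13.x
```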
```bash
pip install ./deepx-1.0.6+light-py3-none-any.whl[gpu] --extra-index-url https://download.pytorch.org/whl/cpu
```

For step-by-step detailed procedures, please refer to the documentation.
Parameter explanation:
- `./deepx-1.0.6+light-py3-none-any.whl` is the Python wheel file available for download from the official DeepH-pack website.
- The `[gpu]` extra dependency tag indicates the GPU-accelerated version of the package, which is strongly recommended for optimal performance. If your system only supports CPU computation, replace `[gpu]` with `[cpu]`.
- The `--extra-index-url` flag specifies an additional package index (in this case, PyTorch's official repository) for resolving certain dependencies.
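For a CPU-only machine, the corresponding command would look roughly like this (a sketch derived from the command above; keep the wheel filename identical to the file you downloaded):

```bash
# CPU-only installation; quoting the argument keeps the shell from
# interpreting the brackets as a glob pattern
pip install "./deepx-1.0.6+light-py3-none-any.whl[cpu]" --extra-index-url https://download.pytorch.org/whl/cpu
```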
```bash
deeph-train train.toml
deeph-infer infer.toml
```

For detailed instructions, see the DeepH-pack online documentation.
Any and all use of this software, in whole or in part, should clearly acknowledge and link to this repository.
If you use DeepH-pack in your work, please cite the following publications.
- The original framework paper introducing the foundational methodology:
- Complete package featuring the latest implementation, methodology, and workflow:
```bibtex
@article{li2022deep,
  title={Deep-learning density functional theory Hamiltonian for efficient ab initio electronic-structure calculation},
  author={Li, He and Wang, Zun and Zou, Nianlong and Ye, Meng and Xu, Runzhang and Gong, Xiaoxun and Duan, Wenhui and Xu, Yong},
  journal={Nat. Comput. Sci.},
  volume={2},
  number={6},
  pages={367},
  year={2022},
  publisher={Nature Publishing Group US New York}
}
```
```bibtex
@article{li2026deeph,
  title={DeepH-pack: A general-purpose neural network package for deep-learning electronic structure calculations},
  author={Li, Yang and Wang, Yanzhen and Zhao, Boheng and Gong, Xiaoxun and Wang, Yuxiang and Tang, Zechen and Wang, Zixu and Yuan, Zilong and Li, Jialin and Sun, Minghui and Chen, Zezhou and Tao, Honggeng and Wu, Baochun and Yu, Yuhang and Li, He and da Jornada, Felipe H. and Duan, Wenhui and Xu, Yong},
  journal={arXiv preprint arXiv:2601.02938},
  year={2026}
}
```

For a comprehensive overview of publications and research employing DeepH methods, please see the relevant section below. We also warmly welcome citations to our foundational papers if your work utilizes the DeepH framework or any of its modules (e.g., DeepH-E3, HPRO).
- Latest Software Implementation
- Architecture advancements
  - DeepH: Original framework, Nat. Comput. Sci. 2, 367 (2022)
  - DeepH-E3: Integrating equivariant neural network, Nat. Commun. 14, 2848 (2023)
  - DeepH-2: Incorporating eSCN tensor product, arXiv:2401.17015 (2024)
  - DeepH-Zero: Leveraging physics-informed unsupervised learning, Phys. Rev. Lett. 133, 076401 (2024)
- Improved compatibility with first-principles codes
  - HPRO: Compatibility with plane-wave DFT, Nat. Comput. Sci. 4, 752 (2024)
  - DeepH-hybrid: Extension to hybrid DFT, Nat. Commun. 15, 8815 (2024)
- Exploration of application scenarios
  - xDeepH: Dealing with magnetism with extended DeepH, Nat. Comput. Sci. 3, 321 (2023)
  - DeepH-DFPT: Investigating density functional perturbation theory, Phys. Rev. Lett. 132, 096401 (2024)
  - DeepH-UMM: Developing the universal model for electronic structures, Sci. Bull. 69, 2514 (2024)
- Review of Recent Advancement
  - From DeepH and ML-QMC to fast, accurate materials computation, Nat. Comput. Sci. 5, 1133 (2025)
DeepH-pack is a general-purpose neural network package designed for deep-learning electronic structure calculations, empowering computational materials science with accelerated speed and enhanced efficiency through intelligent algorithms.