Pyperch is a lightweight, modular library for neural network weight optimization using randomized search algorithms, built directly on top of PyTorch. It is research- and teaching-oriented, supporting randomized optimization methods (RHC, SA, GA), gradient-based methods, and hybrid combinations for training neural networks.
Randomized Optimization Algorithms
- Randomized Hill Climbing (RHC)
- Simulated Annealing (SA)
- Genetic Algorithm (GA)
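These gradient-free algorithms share a simple propose-and-accept loop. The sketch below is illustrative and library-agnostic (the `loss` and `neighbor` helpers are toy assumptions, not Pyperch's API): RHC accepts only improving perturbations, while SA also accepts some worse moves with probability exp(-Δ/T); GA follows the same spirit but evolves a population instead of a single candidate.

```python
# Illustrative sketch (not Pyperch's API): randomized hill climbing (RHC)
# and the simulated-annealing (SA) acceptance rule on a toy loss over a
# flat weight vector.
import math
import random

def loss(w):
    # Toy quadratic "training loss": minimized at w = (1, -2)
    return (w[0] - 1.0) ** 2 + (w[1] + 2.0) ** 2

def neighbor(w, step=0.1, rng=random):
    # Perturb one randomly chosen weight, as RHC/SA typically do
    i = rng.randrange(len(w))
    w2 = list(w)
    w2[i] += rng.uniform(-step, step)
    return w2

def rhc(w, iters=5000, rng=None):
    rng = rng or random.Random(0)
    best, best_loss = list(w), loss(w)
    for _ in range(iters):
        cand = neighbor(best, rng=rng)
        cand_loss = loss(cand)
        if cand_loss < best_loss:      # accept only improvements
            best, best_loss = cand, cand_loss
    return best, best_loss

def sa(w, iters=5000, t0=1.0, decay=0.999, rng=None):
    rng = rng or random.Random(0)
    cur, cur_loss = list(w), loss(w)
    t = t0
    for _ in range(iters):
        cand = neighbor(cur, rng=rng)
        cand_loss = loss(cand)
        # SA also accepts some worse moves, with probability exp(-delta/T)
        if cand_loss < cur_loss or rng.random() < math.exp((cur_loss - cand_loss) / t):
            cur, cur_loss = cand, cand_loss
        t = max(t * decay, 1e-6)       # cool the temperature each step
    return cur, cur_loss

w0 = [0.0, 0.0]
print(rhc(w0)[1] < loss(w0))  # True: RHC improved the toy loss
print(sa(w0)[1] < loss(w0))   # True: SA improved it too
```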
Hybrid Training Support
Combine layer-wise modes (freeze, grad, meta) to mix gradient-free and gradient-based optimization in the same network.
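As a conceptual illustration of layer-wise modes (a toy sketch, not Pyperch's implementation): each "layer" below is a single scalar weight; `freeze` leaves it untouched, `grad` applies a gradient-descent step, and `meta` applies a gradient-free accept-if-better perturbation.

```python
# Conceptual sketch of layer-wise hybrid modes (freeze / grad / meta).
# NOT Pyperch's implementation: each "layer" is one scalar weight, and the
# loss is a sum of quadratics with known gradients.
import random

TARGETS = [1.0, -2.0, 3.0]  # each layer's weight wants to reach its target

def layer_loss(i, w):
    return (w - TARGETS[i]) ** 2

def layer_grad(i, w):
    return 2.0 * (w - TARGETS[i])

def hybrid_step(weights, modes, lr=0.1, step=0.5, rng=random):
    new = list(weights)
    for i, mode in enumerate(modes):
        if mode == "freeze":
            continue                                     # layer never moves
        elif mode == "grad":
            new[i] = new[i] - lr * layer_grad(i, new[i]) # gradient descent
        elif mode == "meta":
            cand = new[i] + rng.uniform(-step, step)     # gradient-free probe
            if layer_loss(i, cand) < layer_loss(i, new[i]):
                new[i] = cand                            # keep only improvements
    return new

rng = random.Random(0)
weights = [0.0, 0.0, 0.0]
modes = ["freeze", "grad", "meta"]
for _ in range(200):
    weights = hybrid_step(weights, modes, rng=rng)

print(weights[0])  # 0.0 — the frozen layer never moves
```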
Unified Trainer API
An interface for classification, regression, batching, metrics, early stopping, and reproducibility.
Pure PyTorch (No Skorch Dependency)
All examples are built on native PyTorch modules and DataLoader.
Modern Configuration System
Structured configs (TrainConfig, OptimizerConfig, etc.) keep experiments consistent and explicit.
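The field names in the sketch below are illustrative guesses, not Pyperch's actual schema; it only shows how nested, frozen dataclass configs can keep an experiment explicit and reproducible.

```python
# Hedged sketch of structured configs like TrainConfig / OptimizerConfig.
# All field names here are illustrative assumptions, not Pyperch's schema.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class OptimizerConfig:
    algorithm: str = "rhc"   # e.g. "rhc", "sa", "ga", or a gradient method
    step_size: float = 0.1
    max_iters: int = 1000

@dataclass(frozen=True)
class TrainConfig:
    epochs: int = 10
    batch_size: int = 32
    seed: int = 42
    optimizer: OptimizerConfig = field(default_factory=OptimizerConfig)

# Frozen dataclasses make the full experiment setup explicit and immutable
cfg = TrainConfig(epochs=5, optimizer=OptimizerConfig(algorithm="sa"))
print(cfg.optimizer.algorithm)  # sa
```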
Utility Functions Included
Metrics, plotting helpers, seed control, and structured outputs.

Search Integration
Optuna-based hyperparameter grid search (parallel-ready) for RHC/SA/GA tuning.
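To show the shape of such a search without the Optuna dependency, here is a plain grid-search stand-in (the function names and toy objective are illustrative assumptions): every hyperparameter combination is scored and the best kept, whereas the real integration delegates sampling and parallelism to Optuna.

```python
# Plain grid-search stand-in (NOT the Optuna integration itself).
import itertools
import random

def train_and_score(step_size, iters, seed=0):
    # Toy stand-in for "train with RHC and return validation loss"
    rng = random.Random(seed)
    w = 0.0
    for _ in range(iters):
        cand = w + rng.uniform(-step_size, step_size)
        if (cand - 2.0) ** 2 < (w - 2.0) ** 2:  # accept-if-better
            w = cand
    return (w - 2.0) ** 2

# Every combination in the grid is evaluated; the best loss wins
grid = {"step_size": [0.05, 0.5], "iters": [50, 500]}
results = {
    combo: train_and_score(*combo)
    for combo in itertools.product(grid["step_size"], grid["iters"])
}
best_combo = min(results, key=results.get)
print(len(results))  # 4 combinations evaluated
```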
Modern Project Tooling
- Poetry for dependencies, builds, and publishing
- Black for code formatting
- Ruff for linting and import sorting
- CircleCI for automated testing
```bash
pip install pyperch
```

If developing locally:
```bash
poetry install
```

The public API and usage examples are organized as follows. Key entry points include:
- Perch Builder API - experiment construction, training, and hybrid optimization
- Optuna Search API - hyperparameter search using an adapter-based Optuna integration
Notebook/Colab examples demonstrating common workflows cover:
- Classification and regression
- Randomized optimization (RHC, SA, GA)
- Gradient and hybrid optimization
- Layer freezing and meta-optimization
- Optuna-based hyperparameter search
If you are upgrading from Pyperch ≤ 0.1.6, the original standalone (functional) optimizers have been preserved for backward compatibility. You can find the previous implementations here:
- Git tag: `v1-legacy`
- Directory: `pyperch/optim/`

The new refactored optimizers can be found under `pyperch.optim.*`.
Contributions are welcome. To submit a change:
- Fork the repository
- Create a feature branch: `git checkout -b feature/my-change`
- Commit your work: `git commit -m "feat: describe your change"`
- Push your branch: `git push origin feature/my-change`
- Open a pull request on GitHub
Before opening a PR, run:

```bash
poetry run black pyperch
poetry run ruff check pyperch --fix
```

This ensures consistent formatting and linting across the project.
MIT License