jlm429/pyperch

pyperch

PyPI Python Versions License: MIT Code Style: Black Linter: Ruff CircleCI

A lightweight, modular library for neural network weight optimization using randomized search algorithms, built directly on top of PyTorch. Pyperch is a research- and teaching-oriented library for training neural networks with randomized optimization methods (RHC, SA, GA), gradient-based methods, and hybrid combinations of the two.


Key Features

  • Randomized Optimization Algorithms

    • Randomized Hill Climbing (RHC)
    • Simulated Annealing (SA)
    • Genetic Algorithm (GA)
  • Hybrid Training Support
    Combine layer-wise modes (freeze, grad, meta) to mix gradient-free and gradient-based optimization in the same network.

  • Unified Trainer API
    A single interface covering classification, regression, batching, metrics, early stopping, and reproducibility.

  • Pure PyTorch (No Skorch Dependency)
    All examples are built on native PyTorch modules and DataLoader.

  • Modern Configuration System
    Structured configs (TrainConfig, OptimizerConfig, etc.) keep experiments consistent and explicit.

  • Utility Functions Included
    Metrics, plotting helpers, seed control, and structured outputs.

  • Search Integration
    Optuna-based hyperparameter grid search (parallel-ready) for RHC/SA/GA tuning.

  • Modern Project Tooling

    • Poetry for dependencies, builds, and publishing
    • Black for code formatting
    • Ruff for linting and import sorting
    • CircleCI for automated testing
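As a concrete illustration of the simplest of the three algorithms, here is a minimal, library-free sketch of randomized hill climbing on a flat weight vector. The function and names below are illustrative only, not pyperch's API:

```python
import random

def rhc(loss_fn, weights, step_size=0.1, iters=1000, seed=0):
    """Randomized hill climbing: perturb one weight at a time and
    keep the change only if the loss improves."""
    rng = random.Random(seed)
    best = list(weights)
    best_loss = loss_fn(best)
    for _ in range(iters):
        cand = list(best)
        i = rng.randrange(len(cand))            # pick one weight
        cand[i] += rng.uniform(-step_size, step_size)
        cand_loss = loss_fn(cand)
        if cand_loss < best_loss:               # greedy acceptance
            best, best_loss = cand, cand_loss
    return best, best_loss

# Toy quadratic loss with minimum at (1, -2, 3)
target = [1.0, -2.0, 3.0]
loss = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, target))
w, l = rhc(loss, [0.0, 0.0, 0.0])
```

In pyperch the same loop operates over a network's flattened parameters instead of a plain list, but the accept-if-better structure is the defining feature of RHC.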


Installation

pip install pyperch

If developing locally:

poetry install

API and Examples

The public API and usage examples are documented and organized as follows:

API Documentation

The user-facing APIs are documented under:

Key entry points include:

Examples

Notebook/Colab examples showing common workflows can be found in:

The examples cover:

  • Classification and regression
  • Randomized optimization (RHC, SA, GA)
  • Gradient and hybrid optimization
  • Layer freezing and meta-optimization
  • Optuna-based hyperparameter search
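The SA workflow above rests on the standard Metropolis acceptance rule: a worse candidate is accepted with probability exp(-Δ/T), where the temperature T decays each step. A minimal, library-free sketch of that rule (names are illustrative, not pyperch's API):

```python
import math
import random

def anneal(loss_fn, weights, t0=1.0, cooling=0.99, step=0.1, iters=2000, seed=0):
    """Simulated annealing: accept worse candidates with probability
    exp(-delta / T); T decays geometrically each iteration."""
    rng = random.Random(seed)
    cur = list(weights)
    cur_loss = loss_fn(cur)
    best, best_loss = list(cur), cur_loss
    t = t0
    for _ in range(iters):
        cand = [w + rng.uniform(-step, step) for w in cur]
        cand_loss = loss_fn(cand)
        delta = cand_loss - cur_loss
        # Always accept improvements; accept regressions stochastically
        if delta < 0 or rng.random() < math.exp(-delta / t):
            cur, cur_loss = cand, cand_loss
            if cur_loss < best_loss:
                best, best_loss = list(cur), cur_loss
        t *= cooling
    return best, best_loss

target = [0.5, -1.5]
loss = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, target))
w, l = anneal(loss, [2.0, 2.0])
```

Early on (high T) the search escapes local minima freely; as T cools it behaves like hill climbing.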

Legacy Standalone Optimizers (RHC, SA, GA)

If you are upgrading from Pyperch ≤ 0.1.6, the original standalone (functional) optimizers have been preserved for backward compatibility.

You can find the previous implementations here:

The new refactored optimizers can be found under:

pyperch.optim.*
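For readers comparing the two layouts, the classic standalone loop for a genetic algorithm looks roughly like the following library-free sketch (tournament selection, one-point crossover, per-gene mutation). This is illustrative only, not pyperch's implementation:

```python
import random

def ga(loss_fn, dim, pop_size=30, gens=60, mut_rate=0.2, mut_step=0.3, seed=0):
    """Minimal genetic algorithm over flat weight vectors: tournament
    selection, one-point crossover, and uniform per-gene mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2) for _ in range(dim)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if loss_fn(a) < loss_fn(b) else b

    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, dim)              # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [g + rng.uniform(-mut_step, mut_step)
                     if rng.random() < mut_rate else g
                     for g in child]                 # per-gene mutation
            nxt.append(child)
        pop = nxt
    best = min(pop, key=loss_fn)
    return best, loss_fn(best)

target = [1.0, -1.0, 0.5]
loss = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, target))
w, l = ga(loss, dim=3)
```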

Contributing

Contributions are welcome. To submit a change:

  1. Fork the repository
  2. Create a feature branch:
     git checkout -b feature/my-change
  3. Commit your work:
     git commit -m "feat: describe your change"
  4. Push your branch:
     git push origin feature/my-change
  5. Open a pull request on GitHub

Code Style

Before opening a PR:

poetry run black pyperch
poetry run ruff check pyperch --fix

This ensures consistent formatting and linting across the project.


License

MIT License
