Auto-configure (not only) torch experiments from the CLI.
Parsonaut makes your experiments:
- Configurable - configure any parameter of your experiment from the CLI
- Reproducible - easily store your full experiment configuration to disk
- Boilerplate-free - make model checkpointing seamless
To install the library, use pip to install it directly from the repository:
pip install git+https://github.com/janvainer/parsonaut.git

Let's supercharge a simple torch experiment with automatic CLI configuration.
"""
usage: script.py [-h] [--in_channels int] [--out_channels int]

options:
  -h, --help          show this help message and exit
  --in_channels int
  --out_channels int
"""
import torch.nn as nn
from parsonaut import Parsable

class Model(nn.Module, Parsable):
    def __init__(
        self,
        in_channels: int = 4,
        out_channels: int = 2,
    ):
        super().__init__()

# Parse user CLI args to get a partially initialized model
partial_model = Model.parse_args()
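# For example, the script could be launched with CLI overrides like this
# (hypothetical values; the flags mirror the __init__ signature above):
#
#   python script.py --in_channels 8 --out_channels 3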
# Serialize model configuration
partial_model.to_file("model_config.yaml")

Now we can do some training. We instantiate the model configuration into a torch model.
model = partial_model.to_eager()
# Training code here ...

Finally, serialize the model configuration AND weights.
model.to_checkpoint("ckpt_dir")

We can now load the experiment configuration and model weights later:
model_with_weights = Model.from_checkpoint("ckpt_dir")
just_config = Model.from_file("model_config.yaml")

Parsonaut allows configuring multiple, possibly nested classes. Moreover, you can dynamically select which classes to use via enums.
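As a rough sketch of the enum-based selection idea, the example below exposes an enum-valued constructor argument and builds a different submodule depending on its value. It reuses only the calls shown above (Parsable, parse_args); the BlockType enum, the --block flag, and how enum values are parsed from the CLI are assumptions for illustration, so see the tutorials below for the library's actual interface.

from enum import Enum

import torch.nn as nn

from parsonaut import Parsable


class BlockType(Enum):
    CONV = "conv"
    LINEAR = "linear"


class Encoder(nn.Module, Parsable):
    def __init__(
        self,
        block: BlockType = BlockType.CONV,  # which submodule class to build
        in_channels: int = 4,
        hidden_channels: int = 8,
    ):
        super().__init__()
        # Dynamically pick the submodule class based on the enum value.
        if block is BlockType.CONV:
            self.layer = nn.Conv1d(in_channels, hidden_channels, kernel_size=3)
        else:
            self.layer = nn.Linear(in_channels, hidden_channels)


# Hypothetical invocation: python encoder.py --block linear --hidden_channels 16
partial_encoder = Encoder.parse_args()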
To explore more advanced features, please see the following tutorials:
