riverMoE (/ˈrɪvər moʊ/) is a framework that brings Mixture of Experts (MoE) to online machine learning. It combines the river API and deep-river with the ability to design MoE architectures from different machine learning and deep learning approaches.
MoE gating works like an adaptive, trainable VotingClassifier! A short sketch of the idea follows the feature list below.
- Works with adaptive streaming data
- Compatible with PyTorch modules
- Different types of MoE: Soft MoE, Top-K MoE, SAMoE¹
- Easy API for simple usage
- Modular framework for easy expansion
¹ This framework implements the Streaming Adaptive Mixture of Experts (SAMoE) architecture, which is introduced as part of the riverMoE publication.
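To make the "adaptive VotingClassifier" analogy concrete, here is a minimal NumPy sketch of how soft and top-k gating combine expert predictions. This is an illustration only, not the riverMoE API; all names and values below are made up:

```python
# Illustrative sketch only (not the riverMoE API): how soft and top-k
# gating turn per-expert predictions into a single output.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

def soft_moe(gate_logits, expert_preds):
    """Soft MoE: a weighted sum over *all* experts."""
    w = softmax(gate_logits)
    return float(w @ expert_preds)

def top_k_moe(gate_logits, expert_preds, k=2):
    """Top-K MoE: only the k highest-weighted experts contribute."""
    w = softmax(gate_logits)
    keep = np.argsort(w)[-k:]      # indices of the k largest weights
    mask = np.zeros_like(w)
    mask[keep] = w[keep]
    w = mask / mask.sum()          # renormalise over the kept experts
    return float(w @ expert_preds)

gate_logits = np.array([0.2, 1.5, -0.3])     # one logit per expert
expert_preds = np.array([41.0, 43.5, 40.2])  # per-expert regression outputs
print(soft_moe(gate_logits, expert_preds))   # blend of all three experts
print(top_k_moe(gate_logits, expert_preds))  # blend of the two strongest
```

In riverMoE the gate logits come from a trainable network (such as the GenericNNClassifier used below), so the mixing weights adapt to the stream instead of staying fixed.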
MoE with three river experts on the TrumpApproval dataset:
>>> from rivermoe.utils.generic_nn import GenericNNClassifier
>>> from river import datasets, metrics, preprocessing, evaluate, tree, linear_model, dummy, stats
>>> from rivermoe.regression.soft_moe import SoftMoERegressor
>>>
>>> # Build the gating network
>>> gate = GenericNNClassifier(
...     layer_configs=[10,],
...     activation_fn="relu",
...     loss_fn="mse",
...     output_dim=3,  # one gate output per expert
...     output_activation=None,
...     optimizer_fn="sgd"
... )
>>>
>>> # Build the MoE model
>>> model = SoftMoERegressor(
...     gate=gate,
...     experts=[
...         dummy.StatisticRegressor(stats.Mean()),
...         tree.HoeffdingTreeRegressor(),
...         linear_model.LinearRegression(),
...     ]
... )
>>>
>>> dataset = datasets.TrumpApproval()
>>> metric = metrics.MAE()
>>> evaluate.progressive_val_score(dataset, preprocessing.StandardScaler() | model, metric)
MAE: 0.923203

The same evaluation with each expert on its own:
>>> for expert in [tree.HoeffdingTreeRegressor(), linear_model.LinearRegression(), dummy.StatisticRegressor(stats.Mean())]:
...     metric = metrics.MAE()
...     model_pipeline = preprocessing.StandardScaler() | expert
...     print(expert.__class__.__name__)
...     print(evaluate.progressive_val_score(dataset, model_pipeline, metric))
...
HoeffdingTreeRegressor
MAE: 0.956103
LinearRegression
MAE: 1.314548
StatisticRegressor
MAE: 1.567555

The ensemble result is better than any single expert on its own:
| Dataset | Model | MAE ↓ |
|---|---|---|
| TrumpApproval | riverMoE | 0.923203 |
| TrumpApproval | HoeffdingTreeRegressor | 0.956103 |
| TrumpApproval | LinearRegression | 1.314548 |
| TrumpApproval | StatisticRegressor | 1.567555 |
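progressive_val_score drives the stream for you. If you want to process instances by hand, a minimal test-then-train loop looks like the sketch below, assuming the MoE pipeline follows river's usual predict_one/learn_one protocol (use a freshly built model, since the example above has already trained it):

```python
# Manual test-then-train loop over the stream; `model` is a freshly
# built SoftMoERegressor as in the example above.
from river import datasets, metrics, preprocessing

pipeline = preprocessing.StandardScaler() | model
metric = metrics.MAE()

for x, y in datasets.TrumpApproval():
    y_pred = pipeline.predict_one(x)  # predict on the incoming instance ...
    metric.update(y, y_pred)          # ... score the prediction ...
    pipeline.learn_one(x, y)          # ... then learn from the true label

print(metric)
```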
Plot the MoE network:

>>> model.draw()

Install riverMoE from PyPI:

pip install -U rivermoe

or install it locally with Poetry:
- Clone the project:

  git clone https://github.com/bitnulleins/rivermoe.git

- Run Poetry:

  poetry add rivermoe

- Install the pre-commit hooks and run pre-commit manually:

  poetry run pre-commit install
  poetry run pre-commit run --all-files

Development features:

- `Poetry` as the dependency manager. See the configuration in `pyproject.toml` and `setup.cfg`.
- Automatic code style with `black`, `isort` and `pyupgrade`.
- Ready-to-use `pre-commit` hooks with code formatting.
- Type checks with `mypy`; docstring checks with `darglint`; security checks with `safety` and `bandit`.
- Testing with `pytest`.
This project is licensed under the terms of the Apache Software License 2.0. See LICENSE for more details.
If you use riverMoE, please cite:

@phdthesis{rivermoe_2025,
address = {Hamburg},
type = {Master’s thesis},
title = {Adaptive {Machine} {Learning} with {Mixture} of {Experts}},
shorttitle = {Adaptive {ML} with {MoE}},
url = {https://www.minds-hh.de/mastersthesis/adaptives-maschinelles-lernen-mit-mixture-of-experts/},
language = {de},
school = {Hamburg University of Applied Sciences},
author = {Dohrn, Finn},
month = mar,
year = {2025},
}