Merged

33 commits
aec1745
feat: improve bounds in minimize method, it should not require using …
WojtAcht Mar 16, 2025
c780f37
feat: add default values to simplify configuration
WojtAcht Mar 16, 2025
89353cd
feat: use plotly_white as default template for plots
WojtAcht Mar 16, 2025
6f9d7fb
feat: add new interactive plot presenting population
WojtAcht Mar 19, 2025
6d872a8
feat: refine population plot
WojtAcht Mar 22, 2025
31e1228
feat: add new visualization method for the tree
WojtAcht Mar 22, 2025
7756e99
feat: add diagram
WojtAcht Mar 22, 2025
b21143d
feat: diagram improvements
WojtAcht Mar 22, 2025
afef030
feat: add notebook for GECCO EvoOSS
WojtAcht Mar 22, 2025
6e4572c
feat: use new interface for counting evaluations
WojtAcht Mar 22, 2025
736a961
fix: remove test
WojtAcht Mar 22, 2025
acb098d
feat: update notebook, use pre commit
WojtAcht Mar 22, 2025
202eadc
feat: filter out individuals with inf value
WojtAcht Mar 22, 2025
4c9caab
feat: improve plot_fitness_value_by_distance
WojtAcht Mar 22, 2025
fa0965e
feat: improve example
WojtAcht Mar 22, 2025
4bf3815
feat: refine kriging visualization
WojtAcht Mar 24, 2025
3d07566
feat: add example of landscape approximation (HMS + MWEA + Kriging)
WojtAcht Mar 24, 2025
aea90a4
feat: add missing transformation to the definition of C-shaped function
WojtAcht Mar 24, 2025
7cf9a90
feat: add plot_plateau_contour
WojtAcht Mar 27, 2025
c484e65
feat: add instruction how to run it on GoogleColab
WojtAcht Mar 27, 2025
4f75f98
feat: improve `plot_population` style
WojtAcht Mar 27, 2025
0450fcf
Merge branch 'main' into improve-docs
WojtAcht Mar 27, 2025
3827196
feat: improve landscape approximation notebook
WojtAcht Mar 29, 2025
031cb10
feat: simplify initialization
WojtAcht Mar 30, 2025
b655a1a
feat: add new parameter to inject custom deme classes
WojtAcht Mar 30, 2025
042505f
feat: add docs for adding custom demes
WojtAcht Mar 30, 2025
414916c
fix: remove ea_algorithms from index
WojtAcht Mar 30, 2025
40cb4ab
fix: improve whitepsaces
WojtAcht Mar 30, 2025
025050d
feat: refine docs for Problem class
WojtAcht Mar 30, 2025
befb4ec
feat: improve basic usage example by providing more comprehensive des…
WojtAcht Mar 30, 2025
5f2f5c0
feat: use correct name of the library (finally we decided to use pyHM…
WojtAcht Mar 30, 2025
06b9f9d
feat: add DemeInitArgs to the doc page
WojtAcht Mar 30, 2025
5e377ef
feat: add logo
WojtAcht Mar 30, 2025
7 changes: 5 additions & 2 deletions README.md
Original file line number Diff line number Diff line change
@@ -1,4 +1,7 @@
# pyhms
# pyHMS

<img src="docs/_static/images/pyhms.png" alt="pyHMS Logo" width="200"/>

![GitHub Test Badge][1] [![codecov][2]](https://codecov.io/gh/agh-a2s/pyhms) [![Documentation Status][3]](https://pyhms.readthedocs.io/en/latest/?badge=latest) [![pypi.org][4]][5] [![versions][6]][7] ![license][8]

[1]: https://github.com/agh-a2s/pyhms/actions/workflows/pytest.yml/badge.svg "GitHub CI Badge"
@@ -10,7 +13,7 @@
[7]: https://github.com/agh-a2s/pyhms
[8]: https://img.shields.io/github/license/agh-a2s/pyhms

`pyhms` is a Python implementation of Hierarchic Memetic Strategy (HMS).
`pyHMS` is a Python implementation of Hierarchic Memetic Strategy (HMS).

The Hierarchic Memetic Strategy is a stochastic global optimizer designed to tackle highly multimodal problems. It is a composite global optimization strategy consisting of a multi-population evolutionary strategy and some auxiliary methods. The HMS makes use of a dynamically-evolving data structure that provides an organization among the component populations. It is a tree with a fixed maximal height and variable internal node degree. Each component population is governed by a particular optimization engine. This package provides a simple python implementation.

Binary file added docs/_static/images/pyhms.png
165 changes: 165 additions & 0 deletions docs/custom_demes.rst
@@ -0,0 +1,165 @@
Adding Custom Demes to pyHMS
============================

This guide explains how to create your own custom deme implementations for pyHMS.

Overview
--------

pyHMS allows you to extend the system with your own custom deme implementations. To create a custom deme, you need to:

1. Define a new config class that inherits from ``BaseLevelConfig``
2. Create a new deme class that inherits from ``AbstractDeme``
3. Register your custom deme by passing a ``config_class_to_deme_class`` mapping to the ``hms`` function

Step 1: Define Your Config Class
--------------------------------

Start by creating a config class that inherits from ``BaseLevelConfig``. This class should:

- Accept a ``problem`` and a stop condition (``lsc``) as required parameters
- Include any additional parameters your deme implementation needs
- Call the parent class's ``__init__`` method

.. code-block:: python

from pyhms.config import BaseLevelConfig
from pyhms.core.problem import Problem
from pyhms.stop_conditions import LocalStopCondition, UniversalStopCondition

class RandomSearchConfig(BaseLevelConfig):
def __init__(
self,
problem: Problem,
lsc: LocalStopCondition | UniversalStopCondition,
pop_size: int,
) -> None:
super().__init__(problem, lsc)
self.pop_size = pop_size

Step 2: Create Your Deme Class
------------------------------

Next, create a deme class that inherits from ``AbstractDeme``. This class must implement the required interface:

.. code-block:: python

import numpy as np
from pyhms.core.individual import Individual
from pyhms.demes.abstract_deme import AbstractDeme, DemeInitArgs

class RandomSearchDeme(AbstractDeme):
def __init__(
self,
deme_init_args: DemeInitArgs,
) -> None:
super().__init__(deme_init_args)
config: RandomSearchConfig = deme_init_args.config # type: ignore[assignment]
self._pop_size = config.pop_size
self.lower_bounds = config.bounds[:, 0]
self.upper_bounds = config.bounds[:, 1]
self.rng = np.random.RandomState(deme_init_args.random_seed)
self.run()

def run(self) -> None:
genomes = self.rng.uniform(  # use the seeded generator for reproducibility
self.lower_bounds,
self.upper_bounds,
size=(self._pop_size, len(self.lower_bounds))
)
population = [Individual(genome, problem=self._problem) for genome in genomes]
Individual.evaluate_population(population)
self._history.append([population])

def run_metaepoch(self, tree) -> None:
# This method is called in each metaepoch
self.run()

# Check if stopping conditions are met
if (gsc_value := tree._gsc(tree)) or self._lsc(self):
self._active = False
message = "Random Search Deme finished due to GSC" if gsc_value else "Random Search Deme finished due to LSC"
self.log(message)
return
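
The ``run_metaepoch`` contract above can be illustrated with a toy driver loop. This is a simplified sketch of how a metaepoch loop might drive demes until they deactivate; ``ToyDeme`` and ``run_loop`` are hypothetical names, not the actual pyHMS tree logic:

```python
# Toy illustration of the metaepoch contract: the driver repeatedly calls
# run_metaepoch on every active deme until all demes have deactivated
# themselves (or a global metaepoch limit is hit).
# ToyDeme and run_loop are hypothetical, not part of pyHMS.

class ToyDeme:
    def __init__(self, budget: int) -> None:
        self._active = True
        self.metaepochs_run = 0
        self._budget = budget  # stand-in for a local stop condition

    def run_metaepoch(self) -> None:
        self.metaepochs_run += 1
        if self.metaepochs_run >= self._budget:
            self._active = False  # deme deactivates itself, as in the guide

def run_loop(demes: list[ToyDeme], metaepoch_limit: int) -> int:
    metaepoch = 0
    while metaepoch < metaepoch_limit and any(d._active for d in demes):
        for deme in demes:
            if deme._active:
                deme.run_metaepoch()
        metaepoch += 1
    return metaepoch

demes = [ToyDeme(budget=3), ToyDeme(budget=5)]
print(run_loop(demes, metaepoch_limit=10))  # loop ends once all demes deactivate
```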

Understanding DemeInitArgs
--------------------------

When implementing a custom deme, you'll receive a ``DemeInitArgs`` object in the constructor. This dataclass contains all the necessary initialization parameters for your deme:

.. code-block:: python

@dataclass
class DemeInitArgs:
id: str
level: int
config: BaseLevelConfig
logger: FilteringBoundLogger
started_at: int = 0
sprout_seed: Individual | None = None
random_seed: int | None = None
parent_deme: AbstractDeme | None = None

Understanding these fields:

- ``id``: A unique string identifier for your deme
- ``level``: The hierarchical level in the HMS tree (starts at 0 for root)
- ``config``: Your custom configuration class instance that inherits from ``BaseLevelConfig``
- ``logger``: A structured logger for outputting debug information
- ``started_at``: The metaepoch number when this deme was created
- ``sprout_seed``: For non-root demes, this is the first individual that sprouted this deme
- ``random_seed``: A seed for random number generation to ensure reproducibility
- ``parent_deme``: Reference to the parent deme that sprouted this deme (None for root demes)

In your custom deme implementation, you'll typically:

1. Pass the ``DemeInitArgs`` object to the parent constructor
2. Cast the ``config`` field to your specific config class type
3. Access the configuration parameters you need
4. Use the provided random seed for any randomized operations
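
Point 4 matters for reproducibility: two generators built from the same seed draw identical populations, which is why a deme should use the seed from ``DemeInitArgs`` rather than the global numpy RNG. A quick illustration, independent of any pyHMS class:

```python
import numpy as np

# Two RandomState generators seeded identically produce the same draws.
rng_a = np.random.RandomState(42)
rng_b = np.random.RandomState(42)

pop_a = rng_a.uniform(-5, 5, size=(4, 2))
pop_b = rng_b.uniform(-5, 5, size=(4, 2))
print(np.array_equal(pop_a, pop_b))  # True
```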

Step 3: Register and Use Your Custom Deme
-----------------------------------------

Finally, register your custom deme by creating a mapping from your config class to your deme class and passing it to the ``hms`` function:

.. code-block:: python

from pyhms import hms
from pyhms.stop_conditions import DontStop, MetaepochLimit

# Create your deme configuration
random_search_config = RandomSearchConfig(
problem=your_problem,
lsc=DontStop(),
pop_size=100
)

# Define the mapping from config class to deme class
config_class_to_deme_class = {
RandomSearchConfig: RandomSearchDeme
}

# Use your custom deme in pyHMS
result = hms(
level_config=[random_search_config],
gsc=MetaepochLimit(10),
sprout_cond=your_sprout_condition,
config_class_to_deme_class=config_class_to_deme_class
)

Important AbstractDeme Properties and Methods
---------------------------------------------

When implementing your custom deme, you can use the following properties and methods from the ``AbstractDeme`` base class:

- ``self._problem``: The optimization problem
- ``self._bounds``: The bounds of the search space
- ``self._active``: A flag indicating if the deme is active
- ``self._history``: History of populations (list of lists of individuals)
- ``self.log(message)``: Log a message with additional meta information
- ``self.centroid``: Compute the centroid of the current population
- ``self.best_individual``: Get the best individual found by the deme
- ``self.current_population``: Get the current population

The most important method you must implement is ``run_metaepoch(self, tree)``, which is called in each metaepoch of the HMS algorithm.
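
As a rough sketch of how helpers like ``current_population`` and ``best_individual`` can be derived from a history shaped like ``self._history`` (a list of metaepochs, each holding a list of populations). Individuals are plain ``(genome, fitness)`` tuples here; the real library uses an ``Individual`` class, and its implementation may differ:

```python
# Hypothetical sketch of history-derived helpers (minimization assumed).
history = [
    [[(0.9, 0.81), (-1.2, 1.44)]],  # metaepoch 1: one population
    [[(0.4, 0.16), (0.7, 0.49)]],   # metaepoch 2
]

# Latest population appended in the latest metaepoch:
current_population = history[-1][-1]

# Best individual across the whole history, by fitness value:
all_individuals = [ind for epoch in history for pop in epoch for ind in pop]
best_individual = min(all_individuals, key=lambda ind: ind[1])

print(current_population)  # [(0.4, 0.16), (0.7, 0.49)]
print(best_individual)     # (0.4, 0.16)
```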
12 changes: 9 additions & 3 deletions docs/index.rst
@@ -1,9 +1,14 @@
.. include:: ../README.rst

Welcome to pyhms's documentation!
Welcome to pyHMS's documentation!
===================================

**pyhms** is a Python implementation of Hierarchic Memetic Strategy (HMS).
.. image:: _static/images/pyhms.png
:width: 200px
:alt: pyHMS Logo
:align: center

**pyHMS** is a Python implementation of Hierarchic Memetic Strategy (HMS).

The Hierarchic Memetic Strategy is a stochastic global optimizer designed to tackle highly multimodal problems. It is a composite global optimization strategy consisting of a multi-population evolutionary strategy and some auxiliary methods. The HMS makes use of a dynamically-evolving data structure that provides an organization among the component populations. It is a tree with a fixed maximal height and variable internal node degree. Each component population is governed by a particular optimization engine. This package provides a simple python implementation.

@@ -23,6 +28,7 @@ Contents
algorithm
usage
inspecting
stop
sprout
custom_demes
stop
problem
2 changes: 1 addition & 1 deletion docs/inspecting.rst
@@ -6,7 +6,7 @@ Visualizing and inspecting the results of evolutionary strategies (ES) are crucial
Visualization helps in understanding how solutions evolve over generations.
Through visual inspection, one can observe if the population is converging towards a global optimum or if it is stuck in local optima.
Visualization of the population distribution over time can also highlight issues with diversity, indicating whether the evolutionary strategy is exploring the solution space adequately.
`pyhms` provides different methods for `DemeTree` object that enable inspecting results.
`pyHMS` provides different methods for `DemeTree` object that enable inspecting results.
Let's consider Sphere function as an example for `N=5`.

.. code-block:: python
43 changes: 42 additions & 1 deletion docs/problem.rst
@@ -1,12 +1,53 @@
Problem
=======

`pyhms` provides different `Problem` wrappers. These wrappers are used to wrap the problem and provide additional functionality such as counting the number of evaluations (`EvalCountingProblem`, `EvalCutoffProblem`), or stopping the evaluation when a certain precision is reached (`PrecisionCutoffProblem`).
The `Problem` class hierarchy in `pyHMS` provides the foundation for defining optimization problems and wrapping them with additional functionality.

Base Classes
------------

.. autoclass:: pyhms.core.problem.Problem
:members:

.. autoclass:: pyhms.core.problem.FunctionProblem
:members:

.. autoclass:: pyhms.core.problem.ProblemWrapper
:members:

Problem Wrappers
----------------

`pyHMS` provides various `Problem` wrappers. These wrappers enhance problem instances with additional functionality without modifying their core behavior. Common use cases include:

1. Monitoring optimization performance (counting evaluations, measuring time)
2. Enforcing constraints (maximum evaluations, precision thresholds)
3. Collecting statistics for analysis
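
The wrapper pattern behind these classes can be sketched in plain Python. ``CountingWrapper`` below is a hypothetical illustration of the idea (evaluation counting plus an evaluation cutoff), not the actual ``EvalCountingProblem``/``EvalCutoffProblem`` API:

```python
import math

# Illustrative wrapper: adds evaluation counting and a cutoff around a
# plain objective function without modifying it.
# CountingWrapper is a hypothetical name, not the pyHMS API.
class CountingWrapper:
    def __init__(self, fn, max_evals=None):
        self._fn = fn
        self._max_evals = max_evals
        self.n_evals = 0

    def __call__(self, x):
        if self._max_evals is not None and self.n_evals >= self._max_evals:
            return math.inf  # cutoff reached: "worst" value for minimization
        self.n_evals += 1
        return self._fn(x)

sphere = CountingWrapper(lambda x: sum(xi * xi for xi in x), max_evals=2)
print(sphere([1.0, 2.0]))  # 5.0
print(sphere([0.0, 0.0]))  # 0.0
print(sphere([3.0, 0.0]))  # inf (budget of 2 evaluations exhausted)
print(sphere.n_evals)      # 2
```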

Available Wrappers
^^^^^^^^^^^^^^^^^^

.. autoclass:: pyhms.core.problem.EvalCountingProblem
:members:

A wrapper that counts the number of function evaluations performed.

.. autoclass:: pyhms.core.problem.EvalCutoffProblem
:members:

A wrapper that stops evaluations after a specified limit is reached, returning infinity (or negative infinity for maximization problems).

.. autoclass:: pyhms.core.problem.PrecisionCutoffProblem
:members:

A wrapper that tracks when the solution reaches a specified precision threshold relative to the known global optimum.

.. autoclass:: pyhms.core.problem.StatsGatheringProblem
:members:

A wrapper that collects statistics about evaluation times, useful for performance analysis.

Helper Functions
----------------

.. autofunction:: pyhms.core.problem.get_function_problem
48 changes: 29 additions & 19 deletions docs/usage.rst
@@ -40,14 +40,14 @@ The output of the function is an OptimizeResult object:

@dataclass
class OptimizeResult:
x: np.ndarray
nfev: int
fun: float
nit: int
x: np.ndarray # Best solution found
nfev: int # Number of function evaluations
fun: float # Function value at the best solution
nit: int # Number of iterations (metaepochs)
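
To make the field semantics concrete, here is the same dataclass reconstructed as runnable Python with an access example (the values are made up for illustration):

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class OptimizeResult:
    x: np.ndarray  # best solution found
    nfev: int      # number of function evaluations
    fun: float     # function value at the best solution
    nit: int       # number of iterations (metaepochs)

# A made-up result, just to show field access:
result = OptimizeResult(x=np.zeros(2), nfev=1200, fun=0.0, nit=10)
print(result.fun, result.nit)  # 0.0 10
```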


Usage
-----
Detailed Usage
--------------

Let's begin by defining a problem that we want to solve. We will use the following example:

@@ -74,40 +74,50 @@ To use HMS we need to define global stop condition, in this case we want to run
from pyhms import MetaepochLimit
global_stop_condition = MetaepochLimit(limit=10)

Now we need to decide what should be the height of our tree (maximum number of levels) and what optimization algorithms to run on each level. We will use the following configuration:
Now we need to configure the structure of our HMS tree by defining the optimization algorithms for each level. Each level configuration specifies the following:

1. The optimization algorithm to use (`EALevelConfig`, which can run several different GAs)
2. The number of iterations per metaepoch (`generations`)
3. The problem to solve (it can differ per level, e.g. less accurate at higher levels)
4. Population size and other algorithm-specific parameters
5. Local stop condition (`lsc`)

.. code-block:: python

from pyhms import EALevelConfig, DontStop, SEA

config = [
EALevelConfig(
ea_class=SEA,
generations=2,
problem=square_problem,
pop_size=20,
mutation_std=1.0,
lsc=DontStop(),
ea_class=SEA, # Use Simple Evolutionary Algorithm (GA)
generations=2, # Number of generations per metaepoch
problem=square_problem, # The problem to solve (problems can be different for each level)
pop_size=20, # Population size
mutation_std=1.0, # Standard deviation for mutation
lsc=DontStop(), # Local stop condition (never stop)
),
EALevelConfig(
ea_class=SEA,
generations=4,
generations=4, # More generations for deeper exploration
problem=square_problem,
pop_size=10,
mutation_std=0.25,
sample_std_dev=1.0,
pop_size=10, # Smaller population size at lower levels
mutation_std=0.25, # Smaller mutations for local refinement
sample_std_dev=1.0, # Standard deviation for sampling around parent
lsc=DontStop(),
),
]

Next step is to define sprout condition for our tree. We will use Nearest Better Clustering (NBC) sprout condition.
The HMS algorithm creates a tree-like structure where demes (populations) at higher levels perform broad exploration, while demes at lower levels refine promising solutions. The configuration above defines two levels in our tree.

Next, we need to define a sprouting condition that determines when and where to create new demes at lower levels. We'll use Nearest Better Clustering (NBC) sprouting:

.. code-block:: python

from pyhms import get_NBC_sprout
sprout_condition = get_NBC_sprout(level_limit=4)

Finally we can run the algorithm:
The NBC sprouting condition identifies promising points in the search space by clustering solutions based on their fitness and proximity. See :doc:`sprout` for more details on sprouting mechanisms.
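
The nearest-better idea can be illustrated in a few lines of numpy: for each solution, compute the distance to the nearest solution with a strictly better (lower) fitness; points whose nearest-better distance is unusually large are likely optima of separate basins. This is a simplified illustration of the concept, not pyHMS's NBC implementation:

```python
import numpy as np

# Simplified nearest-better computation (minimization). For each point,
# the distance to the closest point with strictly better fitness; the
# best point overall has no better neighbour (inf).
def nearest_better_distances(points: np.ndarray, fitness: np.ndarray) -> np.ndarray:
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    out = np.full(len(points), np.inf)
    for i in range(len(points)):
        better = fitness < fitness[i]
        if better.any():
            out[i] = dists[i, better].min()
    return out

points = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
fitness = np.array([0.0, 0.01, 0.02])
print(nearest_better_distances(points, fitness))
# d[0]=inf (global best), d[1]=0.1, d[2]≈7.0 — the large gap hints at a separate basin
```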

Finally, we can run the algorithm:

.. code-block:: python

from pyhms import hms
result = hms(
level_config=config,
gsc=global_stop_condition,
sprout_cond=sprout_condition,
)