An HTN-based plan-recognition playground with multiple self-contained demos, ranging from clean prefix matching to noisy, incremental, and segmentation-based recognition in both logistics and search-and-rescue domains.
- Logistics PGR demos (`demos/pgr_demo*.py`): Inject observed action prefixes with locking fluents (Höller-style) and recognise the most likely high-level transport goal(s).
- Top-k / feasibility variants (`pgr_demo_top_k.py`, `pgr_demo_top_k_noisy.py`): Enumerate multiple plausible goals or goal subsets when observations are ambiguous.
- Dialogue + cost fusion (`pgr_demo2.py`): Combine keyword-derived Dirichlet priors with Boltzmann-cost likelihoods to rank strategies.
- Noise-robust SAR recognisers (`noisy_prg_*.py`): Incremental, edit-distance, and segmentation-based recognisers for search-and-rescue missions with noisy or missing observations.
- ASIST-style heuristic recogniser (`usar_pgr.py`): Two-layer strategy + target prediction over individual victims.
- Utility exercise (`demos/algo.py`): Standalone array-product kata (unrelated to PGR).
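The prefix-injection idea behind the logistics demos can be sketched independently of any planner. The `compile_locks` helper and the `obs_done_i` fluent names below are hypothetical (not taken from the repo's code); they only illustrate how Höller-style ordered locking fluents force any valid plan to explain the observed prefix in order:

```python
# Hypothetical sketch of observation compilation: each observed action
# gets a "lock" fluent. Explaining observation i requires obs_done_{i-1},
# which enforces the observation order; the goal additionally requires
# the last lock, so every plan must embed the full observed prefix.
observations = ["load(pkg1, truck1)", "drive(truck1, depotA)"]

def compile_locks(observations):
    """Return extra fluents, per-action patches, and extra goal atoms."""
    fluents = [f"obs_done_{i}" for i in range(len(observations))]
    patches = {}
    for i, act in enumerate(observations):
        pre = [f"obs_done_{i - 1}"] if i > 0 else []
        patches[act] = {"extra_pre": pre, "extra_eff": [f"obs_done_{i}"]}
    goal_extra = [fluents[-1]] if fluents else []
    return fluents, patches, goal_extra

fluents, patches, goal_extra = compile_locks(observations)
```

A real compilation (e.g. on top of `unified-planning` problem objects) would patch actual action preconditions and effects; the dictionary patches here just make the bookkeeping visible.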
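For the top-k variants, one common recipe (in the Ramírez–Geffner spirit) is to Boltzmann-weight each candidate goal by how much extra plan cost the observations impose on it, then keep the k most probable goals. The goal names and cost numbers below are illustrative only, not outputs of the demos:

```python
import math

# Hypothetical cost table per candidate goal:
# (optimal plan cost, cheapest cost of a plan that also explains the
# observed prefix). A goal whose optimal plan already passes through
# the observations pays zero extra cost.
costs = {
    "deliver(pkg1, depotA)": (10.0, 10.0),
    "deliver(pkg1, depotB)": (11.0, 16.0),
    "deliver(pkg2, depotA)": (9.0, 9.5),
}

def top_k_goals(costs, k=2, beta=1.0):
    """Rank goals by exp(-beta * extra_cost) and return the top k
    with normalised probabilities."""
    w = {g: math.exp(-beta * (obs - opt)) for g, (opt, obs) in costs.items()}
    z = sum(w.values())
    ranked = sorted(w, key=w.get, reverse=True)
    return [(g, w[g] / z) for g in ranked[:k]]

top2 = top_k_goals(costs, k=2, beta=1.0)
```

Raising `beta` sharpens the distribution towards the single cheapest explanation; lowering it keeps more goal hypotheses alive, which is what makes the top-k view useful under ambiguity.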
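The dialogue + cost fusion in `pgr_demo2.py` can be sketched as a two-factor posterior: a Dirichlet-smoothed prior from keyword counts, multiplied by a cost-based likelihood. The keyword counts, likelihood values, and strategy names below are made up for illustration:

```python
# Hypothetical fusion sketch: keyword counts mined from dialogue give a
# Dirichlet-smoothed prior; a Boltzmann-cost term gives the likelihood.
keyword_hits = {"evacuate": 3, "deliver": 1}      # dialogue keyword counts
alpha = 1.0                                       # symmetric Dirichlet smoothing
likelihood = {"evacuate": 0.30, "deliver": 0.70}  # e.g. exp(-beta * extra cost)

# Posterior-mean Dirichlet prior: (count + alpha) / (total + alpha * K).
total = sum(keyword_hits.values()) + alpha * len(keyword_hits)
prior = {g: (c + alpha) / total for g, c in keyword_hits.items()}

# Fuse and renormalise.
unnorm = {g: prior[g] * likelihood[g] for g in prior}
z = sum(unnorm.values())
posterior = {g: p / z for g, p in unnorm.items()}
```

Note how the two signals can disagree: here the dialogue favours "evacuate" but the cost evidence is strong enough to tip the posterior towards "deliver", which is exactly the kind of trade-off the fusion is meant to surface.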
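The edit-distance flavour of the noisy SAR recognisers reduces to a standard Levenshtein comparison between the observed action trace and each candidate plan's action sequence. The action names and traces below are hypothetical; a minimal sketch:

```python
def edit_distance(observed, plan):
    """Levenshtein distance between two action-name sequences."""
    m, n = len(observed), len(plan)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if observed[i - 1] == plan[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # missed observation
                           dp[i][j - 1] + 1,        # spurious/noisy action
                           dp[i - 1][j - 1] + sub)  # mis-recognised action
    return dp[m][n]

# Hypothetical SAR traces: the noisy observation is a prefix of the
# rescue plan (distance 1) but far from the sweep plan.
obs = ["move_n", "triage", "move_e"]
plan_rescue = ["move_n", "triage", "move_e", "evacuate"]
plan_sweep = ["move_e", "move_e", "move_s"]
best = min((plan_rescue, plan_sweep), key=lambda p: edit_distance(obs, p))
```

Ranking candidate plans by this distance tolerates missing, spurious, and mislabelled observations uniformly, which is why it pairs well with the incremental and segmentation-based variants.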
Dependencies are listed in `requirements.txt` (key ones: `unified-planning`, `numpy`, `scipy`). Use a virtualenv or the provided `venv/` (local to this repo).
```bash
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

Each script contains an executable `__main__` with printed outputs.
```bash
# Logistics prefix-guided recognition
python demos/pgr_demo.py
python demos/pgr_demo_top_k.py

# Dialogue-aware variant
python demos/pgr_demo2.py

# Noisy / incremental SAR recognisers
python demos/noisy_prg_full_obs.py
python demos/noisy_prg_on_the_fly.py
python demos/noisy_prg_partial_obs.py
python demos/noisy_prg_segments.py

# ASIST-style two-layer recogniser
python demos/usar_pgr.py
```

- `demos/`: All runnable examples and small utilities.
- `requirements.txt`: Python dependencies.
- `venv/`: Local virtual environment (can be ignored or replaced).
- All demos are illustrative and operate on small, in-memory domains; no external data is required.
- Scripts log to stdout; no state is persisted.