Companion optimization service for the Diet project. It exposes a FastAPI endpoint that builds and solves a multi-day diet planning model using Pyomo + SCIP.
diet-dietitian is the optimization layer of the broader diet planning system.
Its responsibilities are:
- receive user diet constraints via HTTP,
- fetch meals/dishes data from a configurable provider,
- convert domain data into Pyomo input structures,
- solve a mixed-integer optimization model,
- return a JSON diet plan and optimization metadata.
By default, this service consumes data from diet-backend over gRPC.
This service solves a constrained diet planning problem over a user-defined date range.
The model aims to:
- minimize total diet cost,
- satisfy daily nutritional bounds (calories, protein, carbs, fat),
- enforce dietary preferences (vegetarian, vegan),
- choose dishes compatible with each meal type,
- limit repeated dish selections across the whole plan.
The result is a day-by-day plan organized by meal.
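The day-by-day structure of the plan can be pictured as a nested mapping. A minimal sketch, assuming illustrative field names (the service's actual response schema lives in `src/app/api` and may differ):

```python
# Illustrative only: one possible shape for a day-by-day plan keyed by
# meal. Key names here are assumptions, not the service's actual schema.
from datetime import date, timedelta


def empty_plan(start: date, days: int, meals: list[str]) -> dict:
    """Build a skeleton plan: each day maps every meal to a dish list."""
    return {
        (start + timedelta(days=i)).isoformat(): {meal: [] for meal in meals}
        for i in range(days)
    }


plan = empty_plan(date(2024, 6, 1), 2, ["breakfast", "lunch", "dinner"])
plan["2024-06-01"]["lunch"].append("lentil soup")
```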
The codebase follows a layered organization:

- **API** (`src/app/api`)
  - FastAPI router and request/response schemas.
  - Input validation (date range and bounds consistency).
- **Execution** (`src/app/execution`)
  - Process orchestration pipeline:
    - load provider data,
    - build problem data,
    - convert to Pyomo format,
    - instantiate model,
    - solve,
    - extract response.
- **Domain** (`src/app/domain`)
  - Problem entities (`Dish`, `NutritionalRequirements`, `DietRestrictions`, `ProblemData`).
- **Model** (`src/app/model`)
  - Pyomo `AbstractModel` definition:
    - sets, parameters, variables,
    - constraints,
    - objective function.
- **Solver** (`src/app/solver`)
  - SCIP discovery and execution.
  - Result status normalization and plan extraction.
- **Services** (`src/app/services`)
  - Pluggable data providers: `diet_backend` (gRPC, async), `simulator` (local JSON).
  - Data transformer (`pyomo`) to shape model input.
- **Config** (`src/app/settings.py`)
  - Pydantic settings for provider and solver configuration.
Request flow:

- `POST /optimize-diet` receives constraints.
- The process gathers days/meals/dishes data.
- Data is transformed into Pyomo dictionary format.
- A model instance is created and dumped to `concrete_model_dump.txt`.
- SCIP solves the model.
- The service returns solver status + objective + plan.
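The steps above can be condensed into a sketch of the orchestration pipeline, with each stage stubbed out. Function names and return shapes are illustrative; the real signatures live in `src/app/execution`:

```python
# A condensed sketch of the orchestration steps; each stage below is a
# stand-in, not the project's actual implementation.
def run_pipeline(constraints: dict) -> dict:
    data = load_provider_data(constraints)   # days/meals/dishes
    problem = build_problem_data(data)       # domain entities
    pyomo_input = to_pyomo_dict(problem)     # Pyomo data dictionary
    result = solve(pyomo_input)              # SCIP via Pyomo
    return extract_response(result)          # status + objective + plan


# Minimal stand-ins so the sketch runs end to end:
def load_provider_data(c):
    return {"dishes": ["soup"], **c}

def build_problem_data(d):
    return d

def to_pyomo_dict(p):
    return {None: p}

def solve(i):
    return {"status": "optimal", "objective": 2.5, "plan": i}

def extract_response(r):
    return {"status": r["status"], "objective": r["objective"]}


response = run_pipeline({"days": 3})
```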
```
.
├── main.py
├── pyproject.toml
├── tox.ini
├── conda-env.yaml
├── src/
│   └── app/
│       ├── api/
│       ├── domain/
│       ├── execution/
│       ├── model/
│       ├── services/
│       │   ├── data_provider/
│       │   └── data_transformer/
│       ├── solver/
│       ├── settings.py
│       └── asgi_app.py
└── tests/
```
The model is implemented in src/app/model/model.py with Pyomo.
For the full mathematical formulation, see modeling.pdf.
- Decision variable: binary `use_dish_meal_day[dish, meal, day]`, indicating whether a dish is selected for a meal/day.
- Objective: minimize total cost of selected dishes.
- Constraints:
  - At least one dish per meal/day.
  - Meal suitability (a dish can only appear in a suitable meal type).
  - Vegetarian/vegan compliance.
  - Daily min/max bounds for calories, protein, carbs, fat.
  - Global maximum number of selections per dish.
Uses an async gRPC client (`grpc.aio`) against:

- `DishService.ListDishes`
- `MealService.ListMeals`
Provider behavior:
- pulls dishes/meals from backend,
- adapts protobuf messages into domain entities,
- feeds optimization pipeline.
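The adaptation step can be sketched with a stand-in message class, since the real stubs are generated from the diet-backend protos. All field names below are assumptions:

```python
# Sketch of adapting a provider message into a domain entity. DishMessage
# mimics a generated protobuf class; field names are assumptions.
from dataclasses import dataclass


@dataclass
class DishMessage:
    """Stand-in for a protobuf dish message off the wire."""
    name: str
    cost: float
    calories: float
    is_vegan: bool


@dataclass
class Dish:
    """Domain entity (illustrative subset of fields)."""
    name: str
    cost: float
    calories: float
    is_vegan: bool


def adapt_dish(msg: DishMessage) -> Dish:
    # Copy only the fields the optimization pipeline needs.
    return Dish(name=msg.name, cost=msg.cost,
                calories=msg.calories, is_vegan=msg.is_vegan)


dish = adapt_dish(DishMessage("tofu bowl", 4.0, 550.0, True))
```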
Reads the local file `src/app/services/data_provider/simulator_dishes_db.json`.
Useful for isolated local testing without backend dependency.
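A local-JSON provider boils down to parsing and shape-checking a dish list. The inline document below stands in for `simulator_dishes_db.json`, whose actual schema is not shown here, so the field names are assumptions:

```python
# Sketch of parsing a local dishes JSON document; the inline string
# stands in for simulator_dishes_db.json (field names are assumptions).
import json

RAW = '[{"name": "oatmeal", "cost": 1.2, "calories": 350}]'


def load_dishes(raw: str) -> list[dict]:
    dishes = json.loads(raw)
    # Basic shape check before feeding the optimization pipeline.
    for d in dishes:
        assert {"name", "cost", "calories"} <= d.keys()
    return dishes


dishes = load_dishes(RAW)
```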
- Python 3.12
- Poetry
- SCIP solver executable
- Optional: Conda (recommended for SCIP binary provisioning)
The repo includes `conda-env.yaml` to install SCIP only:

```shell
conda env create -f conda-env.yaml
conda activate scip-solver
poetry install
python main.py
```

Server endpoints:

- API: http://localhost:8000
- Docs: http://localhost:8000/docs
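A hedged example of what a request body for `POST /optimize-diet` might look like; the authoritative schema lives in `src/app/api`, so every field name here is an assumption:

```python
# Illustrative request body for POST /optimize-diet; all field names
# are assumptions, not the service's actual schema.
import json

payload = {
    "start_date": "2024-06-01",
    "end_date": "2024-06-07",
    "restrictions": {"vegetarian": True, "vegan": False},
    "daily_bounds": {
        "calories": {"min": 1800, "max": 2400},
        "protein": {"min": 60, "max": 180},
    },
}

body = json.dumps(payload)  # what an HTTP client would send
```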
- Black, Flake8, Isort
- Mypy + Pyright configuration included
- Tox environments for standardized checks
```shell
tox
tox -e black
tox -e isort
tox -e black-check,mypy,flake8,pytest
```

- Test coverage is currently minimal and should be expanded (API, transformer, solver integration scenarios).
- Process execution is synchronous from the API route perspective; solver execution is CPU-bound and blocks the request lifecycle.
- gRPC proto stubs are committed/generated in-repo; regeneration workflow is not yet documented as an explicit script.
- Optimization feasibility depends on input bounds and available dish catalog.
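One way to address the blocking-solver limitation is to offload the solve to a worker thread with `asyncio.to_thread`, which works here because Pyomo invokes SCIP as an external process. A minimal sketch with a stand-in `solve()`, not the project's actual solver call:

```python
# Sketch: keep the event loop free by running a blocking solve in a
# worker thread. solve() below is a stand-in for a SCIP run.
import asyncio
import time


def solve(seconds: float) -> str:
    time.sleep(seconds)  # stands in for a blocking SCIP subprocess
    return "optimal"


async def optimize() -> str:
    # The event loop can serve other requests while the thread waits.
    return await asyncio.to_thread(solve, 0.01)


status = asyncio.run(optimize())
```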