This repository was archived by the owner on Mar 10, 2026. It is now read-only.
Merged
24 changes: 24 additions & 0 deletions .github/copilot-instructions.md
@@ -151,6 +151,30 @@ For DevContainers and Codespaces, the `.devcontainer/` configuration and `bootst
4. **Format**: Run `make fmt` before committing.
5. **Verify**: Run `make deptry` to check dependencies.

## GitHub Agentic Workflows (gh-aw)

This repository uses GitHub Agentic Workflows for AI-driven automation.
Agentic workflow files are Markdown files in `.github/workflows/` with
`.lock.yml` compiled counterparts.

**Key Commands:**
- `make gh-aw-compile` or `gh aw compile` — Compile workflow `.md` files to `.lock.yml`
- `make gh-aw-run WORKFLOW=<name>` or `gh aw run <name>` — Run a specific workflow locally
- `make gh-aw-status` — Check status of all agentic workflows
- `make gh-aw-setup` — Configure secrets and engine for first-time setup

**Important Rules:**
- **Never edit `.lock.yml` files directly** — Always edit the `.md` source and recompile
- Workflows must be compiled before they can run in GitHub Actions
- After editing any `.md` workflow, always run `make gh-aw-compile` and commit both files

**Available Starter Workflows:**
- `daily-repo-status.md` — Daily repository health reports
- `ci-doctor.md` — Automatic CI failure diagnosis
- `issue-triage.md` — Automatic issue classification and labeling

For more details, see `docs/GH_AW.md`.
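The "never edit `.lock.yml` directly, always recompile" rule above can be sketched as a small staleness checker — a hypothetical helper (not part of this repo; `gh aw compile` does the real work) that flags `.md` workflow sources whose compiled counterpart is missing or older:

```python
from pathlib import Path

def stale_locks(workflow_dir: str) -> list[str]:
    """Return .md workflow sources whose .lock.yml is missing or out of date.

    Illustrative sketch only; gh-aw performs the authoritative check.
    """
    stale = []
    for md in Path(workflow_dir).glob("*.md"):
        lock = md.with_suffix(".lock.yml")  # foo.md -> foo.lock.yml
        if not lock.exists() or lock.stat().st_mtime < md.stat().st_mtime:
            stale.append(md.name)
    return sorted(stale)
```

Any name it returns signals a workflow that needs `make gh-aw-compile` before commit.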

## Key Files

- `Makefile`: Main entry point for tasks.
29 changes: 22 additions & 7 deletions .github/hooks/session-end.sh
@@ -6,16 +6,31 @@ set -euo pipefail

echo "[copilot-hook] Running post-work quality gates..."

# Format code
echo "[copilot-hook] Formatting code..."
make fmt || {
echo "[copilot-hook] WARNING: Formatting check failed."
if ! make fmt; then
echo "[copilot-hook] ❌ ERROR: Formatting check failed"
echo "[copilot-hook] 💡 Remediation: Review the formatting errors above"
echo "[copilot-hook] 💡 Common fixes:"
echo "[copilot-hook] - Run 'make fmt' locally to see detailed errors"
echo "[copilot-hook] - Check for syntax errors in modified files"
echo "[copilot-hook] - Ensure all files follow project style guidelines"
exit 1
}
fi
echo "[copilot-hook] ✓ Code formatting passed"

# Run tests
echo "[copilot-hook] Running tests..."
make test || {
echo "[copilot-hook] WARNING: Tests failed."
if ! make test; then
echo "[copilot-hook] ❌ ERROR: Tests failed"
echo "[copilot-hook] 💡 Remediation: Review the test failures above"
echo "[copilot-hook] 💡 Common fixes:"
echo "[copilot-hook] - Run 'make test' locally to see detailed output"
echo "[copilot-hook] - Check if new code broke existing functionality"
echo "[copilot-hook] - Verify test assertions match expected behavior"
echo "[copilot-hook] - Review test logs in _tests/ directory"
exit 1
}
fi
echo "[copilot-hook] ✓ Tests passed"

echo "[copilot-hook] All quality gates passed."
echo "[copilot-hook] All quality gates passed"
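The `if ! make …; then` gate pattern this hook adopts can be sketched in Python — an illustrative runner under invented names (the real gates are `make fmt` and `make test`):

```python
import subprocess

def run_gate(name: str, cmd: list[str]) -> bool:
    """Run one quality gate and report pass/fail, mirroring the hook's
    `if ! make ...; then` structure. Sketch only; names are hypothetical."""
    print(f"[copilot-hook] Running {name}...")
    result = subprocess.run(cmd, capture_output=True)
    if result.returncode != 0:
        print(f"[copilot-hook] ERROR: {name} failed")
        return False
    print(f"[copilot-hook] {name} passed")
    return True
```

A caller would exit non-zero on the first `False`, just as the hook does with `exit 1`.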
18 changes: 14 additions & 4 deletions .github/hooks/session-start.sh
@@ -9,19 +9,29 @@ echo "[copilot-hook] Validating environment..."

# Verify uv is available
if ! command -v uv >/dev/null 2>&1 && [ ! -x "./bin/uv" ]; then
echo "[copilot-hook] ERROR: uv not found. Run 'make install' to set up the environment."
echo "[copilot-hook] ❌ ERROR: uv not found"
echo "[copilot-hook] 💡 Remediation: Run 'make install' to set up the environment"
echo "[copilot-hook] 💡 Alternative: Ensure uv is in PATH or ./bin/uv exists"
exit 1
fi
echo "[copilot-hook] ✓ uv is available"

# Verify virtual environment exists
if [ ! -d ".venv" ]; then
echo "[copilot-hook] ERROR: .venv not found. Run 'make install' to set up the environment."
echo "[copilot-hook] ❌ ERROR: .venv not found"
echo "[copilot-hook] 💡 Remediation: Run 'make install' to create the virtual environment"
echo "[copilot-hook] 💡 Details: The .venv directory should contain Python dependencies"
exit 1
fi
echo "[copilot-hook] ✓ Virtual environment exists"

# Verify virtual environment is on PATH (activated via copilot-setup-steps.yml)
if ! command -v python >/dev/null 2>&1 || [[ "$(command -v python)" != *".venv"* ]]; then
echo "[copilot-hook] WARNING: .venv/bin is not on PATH. The agent may not use the correct Python."
echo "[copilot-hook] ⚠️ WARNING: .venv/bin is not on PATH"
echo "[copilot-hook] 💡 Note: The agent may not use the correct Python version"
echo "[copilot-hook] 💡 Remediation: Ensure .venv/bin is added to PATH before running the agent"
else
echo "[copilot-hook] ✓ Virtual environment is activated"
fi

echo "[copilot-hook] Environment validated successfully."
echo "[copilot-hook] Environment validated successfully"
15 changes: 10 additions & 5 deletions .github/workflows/rhiza_release.yml
@@ -136,11 +136,16 @@ jobs:
TAG_VERSION=${TAG_VERSION#v}
PROJECT_VERSION=$(uv version --short)

if [[ "$PROJECT_VERSION" != "$TAG_VERSION" ]]; then
echo "::error::Version mismatch: pyproject.toml has '$PROJECT_VERSION' but tag is '$TAG_VERSION'"
# Normalize tag version to PEP 440 format for comparison.
# Tags use semver format (e.g., 0.11.1-beta.1) while uv version --short
# returns PEP 440 normalized format (e.g., 0.11.1b1).
NORMALIZED_TAG=$(uv run --with packaging --no-project python3 -c "from packaging.version import Version; print(Version('$TAG_VERSION'))")

if [[ "$PROJECT_VERSION" != "$NORMALIZED_TAG" ]]; then
echo "::error::Version mismatch: pyproject.toml has '$PROJECT_VERSION' but tag is '$NORMALIZED_TAG' (from tag '$TAG_VERSION')"
exit 1
fi
echo "Version verified: $PROJECT_VERSION matches tag"
echo "Version verified: $PROJECT_VERSION matches tag (normalized: $NORMALIZED_TAG)"
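The normalization the workflow delegates to `packaging.version.Version` can be approximated with a stdlib-only sketch — a rough illustration for the tag shapes used here, not the full PEP 440 grammar:

```python
import re

def normalize_semver_pre(tag: str) -> str:
    """Approximate PEP 440 normalization of semver-style pre-release tags,
    e.g. 0.11.1-beta.1 -> 0.11.1b1. Hypothetical helper; the workflow
    uses packaging.version.Version for the real thing."""
    short = {"alpha": "a", "beta": "b", "rc": "rc"}  # PEP 440 short forms
    m = re.fullmatch(r"(\d+\.\d+\.\d+)(?:-([a-z]+)\.(\d+))?", tag)
    if not m:
        raise ValueError(f"unrecognized tag: {tag}")
    base, rel, n = m.groups()
    return base if rel is None else f"{base}{short.get(rel, rel)}{n}"
```

With this, a semver tag and `uv version --short` output compare equal, which is exactly the mismatch the step guards against.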

- name: Detect buildable Python package
id: buildable
@@ -174,8 +179,8 @@
run: |
printf "[INFO] Generating SBOM in CycloneDX format...\n"
# Note: uvx caches the tool environment, so the second call is fast
uvx --from 'cyclonedx-bom>=7.0.0' cyclonedx-py environment --of JSON -o sbom.cdx.json
uvx --from 'cyclonedx-bom>=7.0.0' cyclonedx-py environment --of XML -o sbom.cdx.xml
uvx --from 'cyclonedx-bom>=7.0.0' cyclonedx-py environment --pyproject pyproject.toml --of JSON -o sbom.cdx.json
uvx --from 'cyclonedx-bom>=7.0.0' cyclonedx-py environment --pyproject pyproject.toml --of XML -o sbom.cdx.xml
printf "[INFO] SBOM generation complete\n"
printf "Generated files:\n"
ls -lh sbom.cdx.*
4 changes: 4 additions & 0 deletions .gitignore
@@ -16,6 +16,7 @@ _marimushka
_mkdocs
_benchmarks
_jupyter
_site

# temp file used by Junie
.output.txt
@@ -80,6 +81,9 @@ coverage.json
.pytest_cache/
cover/

# Security scanning baselines (regenerate as needed)
.bandit-baseline.json

# Translations
*.mo
*.pot
9 changes: 3 additions & 6 deletions .rhiza/.cfg.toml
@@ -1,5 +1,5 @@
[tool.bumpversion]
parse = "(?P<major>\\d+)\\.(?P<minor>\\d+)\\.(?P<patch>\\d+)(?:-(?P<release>[a-z]+)\\.(?P<pre_n>\\d+))?(?:\\+build\\.(?P<build_n>\\d+))?"
parse = "(?P<major>\\d+)\\.(?P<minor>\\d+)\\.(?P<patch>\\d+)(?:[-]?(?P<release>[a-z]+)[\\.]?(?P<pre_n>\\d+))?(?:\\+build\\.(?P<build_n>\\d+))?"
serialize = ["{major}.{minor}.{patch}-{release}.{pre_n}+build.{build_n}", "{major}.{minor}.{patch}+build.{build_n}", "{major}.{minor}.{patch}-{release}.{pre_n}", "{major}.{minor}.{patch}"]
search = "{current_version}"
replace = "{new_version}"
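The loosened `parse` pattern — optional `-` before the release word, optional `.` before its number — can be exercised directly to confirm it accepts both the semver and PEP 440 spellings:

```python
import re

# The updated bump-my-version parse pattern, verbatim from the config above.
PARSE = re.compile(
    r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"
    r"(?:[-]?(?P<release>[a-z]+)[\.]?(?P<pre_n>\d+))?"
    r"(?:\+build\.(?P<build_n>\d+))?"
)

for v in ("0.11.1-beta.1", "0.11.1b1", "0.11.2"):
    m = PARSE.fullmatch(v)
    print(v, m.groupdict() if m else None)
```

Matching `0.11.1b1` (release `b`, pre_n `1`) is why the short forms `a` and `b` are added to the allowed release values below.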
@@ -21,7 +21,9 @@ optional_value = "prod"
values = [
"dev",
"alpha",
"a", # PEP 440 short form for alpha
"beta",
"b", # PEP 440 short form for beta
"rc",
"prod"
]
@@ -30,8 +32,3 @@ values = [
filename = "pyproject.toml"
search = 'version = "{current_version}"'
replace = 'version = "{new_version}"'

# [[tool.bumpversion.files]]
# filename = ".rhiza/template-bundles.yml"
# search = 'version: "{current_version}"'
# replace = 'version: "{new_version}"'
1 change: 0 additions & 1 deletion .rhiza/.env
@@ -1,6 +1,5 @@
MARIMO_FOLDER=book/marimo/notebooks
SOURCE_FOLDER=src
SCRIPTS_FOLDER=.rhiza/scripts

# Book-specific variables
BOOK_TITLE=Project Documentation
2 changes: 1 addition & 1 deletion .rhiza/.rhiza-version
@@ -1 +1 @@
0.11.0
0.11.2
4 changes: 1 addition & 3 deletions .rhiza/docs/CONFIG.md
@@ -1,6 +1,6 @@
# Rhiza Configuration

This directory contains platform-agnostic scripts and utilities for the repository that can be used by GitHub Actions, GitLab CI, or other CI/CD systems.
This directory contains platform-agnostic utilities for the repository that can be used by GitHub Actions, GitLab CI, or other CI/CD systems.

## Important Documentation

@@ -14,8 +14,6 @@ This directory contains platform-agnostic scripts and utilities for the reposito

## Structure

- **scripts/** - Shell scripts for common tasks (book generation, release, etc.)
- **scripts/customisations/** - Repository-specific customisation hooks
- **utils/** - Python utilities for version management

GitHub-specific composite actions are located in `.github/rhiza/actions/`.
16 changes: 13 additions & 3 deletions .rhiza/history
@@ -1,7 +1,7 @@
# Rhiza Template History
# This file lists all files managed by the Rhiza template.
# Template repository: jebel-quant/rhiza
# Template branch: v0.8.0
# Template branch: v0.8.3
#
# Files under template control:
.editorconfig
@@ -14,7 +14,9 @@
.github/hooks/hooks.json
.github/hooks/session-end.sh
.github/hooks/session-start.sh
.github/secret_scanning.yml
.github/workflows/copilot-setup-steps.yml
.github/workflows/renovate_rhiza_sync.yml
.github/workflows/rhiza_benchmarks.yml
.github/workflows/rhiza_book.yml
.github/workflows/rhiza_ci.yml
@@ -48,6 +50,7 @@
.rhiza/make.d/custom-env.mk
.rhiza/make.d/custom-task.mk
.rhiza/make.d/docs.mk
.rhiza/make.d/gh-aw.mk
.rhiza/make.d/github.mk
.rhiza/make.d/marimo.mk
.rhiza/make.d/quality.mk
@@ -59,10 +62,10 @@
.rhiza/requirements/tests.txt
.rhiza/requirements/tools.txt
.rhiza/rhiza.mk
.rhiza/scripts/.gitkeep
.rhiza/templates/minibook/custom.html.jinja2
.rhiza/tests/README.md
.rhiza/tests/api/conftest.py
.rhiza/tests/api/test_gh_aw_targets.py
.rhiza/tests/api/test_github_targets.py
.rhiza/tests/api/test_makefile_api.py
.rhiza/tests/api/test_makefile_targets.py
@@ -75,6 +78,13 @@
.rhiza/tests/integration/test_sbom.py
.rhiza/tests/integration/test_test_mk.py
.rhiza/tests/integration/test_virtual_env_unexport.py
.rhiza/tests/security/test_security_patterns.py
.rhiza/tests/shell/test_scripts.sh
.rhiza/tests/stress/README.md
.rhiza/tests/stress/__init__.py
.rhiza/tests/stress/conftest.py
.rhiza/tests/stress/test_git_stress.py
.rhiza/tests/stress/test_makefile_stress.py
.rhiza/tests/structure/test_lfs_structure.py
.rhiza/tests/structure/test_project_layout.py
.rhiza/tests/structure/test_requirements.py
@@ -89,6 +99,7 @@ CODE_OF_CONDUCT.md
CONTRIBUTING.md
LICENSE
Makefile
SECURITY.md
book/marimo/notebooks/rhiza.py
docs/ARCHITECTURE.md
docs/BOOK.md
@@ -100,7 +111,6 @@ docs/QUICK_REFERENCE.md
docs/SECURITY.md
docs/TESTS.md
pytest.ini
renovate.json
ruff.toml
tests/benchmarks/conftest.py
tests/benchmarks/test_benchmarks.py
49 changes: 30 additions & 19 deletions .rhiza/make.d/book.mk
@@ -4,7 +4,16 @@
# and compiling a companion book (minibook).

# Declare phony targets (they don't produce files)
.PHONY: marimushka mkdocs-build book
.PHONY: marimushka mkdocs-build book test benchmark stress hypothesis-test docs

# Define default no-op targets for test-related book dependencies.
# These are used when test.mk is not available or tests are not installed,
# ensuring 'make book' succeeds even without a test environment.
test:: ; @:
benchmark:: ; @:
stress:: ; @:
hypothesis-test:: ; @:
docs:: ; @:

# Define a default no-op marimushka target that will be used
# when book/marimo/marimo.mk doesn't exist or doesn't define marimushka
@@ -40,6 +49,9 @@ BOOK_SECTIONS := \
"API|_pdoc/index.html|pdoc/index.html|_pdoc|pdoc" \
"Coverage|_tests/html-coverage/index.html|tests/html-coverage/index.html|_tests/html-coverage|tests/html-coverage" \
"Test Report|_tests/html-report/report.html|tests/html-report/report.html|_tests/html-report|tests/html-report" \
"Benchmarks|_tests/benchmarks/report.html|tests/benchmarks/report.html|_tests/benchmarks|tests/benchmarks" \
"Stress Tests|_tests/stress/report.html|tests/stress/report.html|_tests/stress|tests/stress" \
"Hypothesis Tests|_tests/hypothesis/report.html|tests/hypothesis/report.html|_tests/hypothesis|tests/hypothesis" \
"Notebooks|_marimushka/index.html|marimushka/index.html|_marimushka|marimushka" \
"Official Documentation|_mkdocs/index.html|docs/index.html|_mkdocs|docs"

@@ -49,27 +61,10 @@
# 1. Aggregates API docs, coverage, test reports, notebooks, and MkDocs site into _book.
# 2. Generates links.json to define the book structure.
# 3. Uses 'minibook' to compile the final HTML site.
book:: test docs marimushka mkdocs-build ## compile the companion book
book:: test benchmark stress hypothesis-test docs marimushka mkdocs-build ## compile the companion book
@printf "${BLUE}[INFO] Building combined documentation...${RESET}\n"
@rm -rf _book && mkdir -p _book

@if [ -f "_tests/coverage.json" ]; then \
printf "${BLUE}[INFO] Generating coverage badge JSON...${RESET}\n"; \
mkdir -p _book/tests; \
${UV_BIN} run python -c "\
import json; \
data = json.load(open('_tests/coverage.json')); \
pct = int(data['totals']['percent_covered']); \
color = 'brightgreen' if pct >= 90 else 'green' if pct >= 80 else 'yellow' if pct >= 70 else 'orange' if pct >= 60 else 'red'; \
badge = {'schemaVersion': 1, 'label': 'coverage', 'message': f'{pct}%', 'color': color}; \
json.dump(badge, open('_book/tests/coverage-badge.json', 'w'))"; \
printf "${BLUE}[INFO] Coverage badge JSON:${RESET}\n"; \
cat _book/tests/coverage-badge.json; \
printf "\n"; \
else \
printf "${YELLOW}[WARN] No coverage.json found, skipping badge generation${RESET}\n"; \
fi

@printf "{\n" > _book/links.json
@first=1; \
for entry in $(BOOK_SECTIONS); do \
Expand All @@ -91,6 +86,22 @@ json.dump(badge, open('_book/tests/coverage-badge.json', 'w'))"; \
printf "${YELLOW}[WARN] Missing $$name, skipping${RESET}\n"; \
fi; \
done; \
if [ -n "$$GITHUB_REPOSITORY" ]; then \
CF_REPO="$$GITHUB_REPOSITORY"; \
else \
CF_REPO=$$(git remote get-url origin 2>/dev/null | sed 's|.*github\.com[:/]||' | sed 's|\.git$$||'); \
fi; \
if [ -n "$$CF_REPO" ]; then \
CF_URL="https://www.codefactor.io/repository/github/$$CF_REPO"; \
HTTP_CODE=$$(curl -s -o /dev/null -w "%{http_code}" --max-time 5 "$$CF_URL" 2>/dev/null || echo "000"); \
if [ "$$HTTP_CODE" = "200" ]; then \
if [ $$first -eq 0 ]; then printf ",\n" >> _book/links.json; fi; \
printf " \"CodeFactor\": \"$$CF_URL\"" >> _book/links.json; \
printf "${BLUE}[INFO] Adding CodeFactor...${RESET}\n"; \
else \
printf "${YELLOW}[WARN] CodeFactor page not accessible (HTTP $$HTTP_CODE), skipping${RESET}\n"; \
fi; \
fi; \
printf "\n}\n" >> _book/links.json

@printf "${BLUE}[INFO] Generated links.json:${RESET}\n"
4 changes: 2 additions & 2 deletions .rhiza/make.d/docs.mk
@@ -74,7 +74,7 @@ mkdocs-build:: install-uv ## build MkDocs documentation site
@if [ -f "$(MKDOCS_CONFIG)" ]; then \
rm -rf "$(MKDOCS_OUTPUT)"; \
MKDOCS_OUTPUT_ABS="$$(pwd)/$(MKDOCS_OUTPUT)"; \
${UVX_BIN} --with mkdocs-material --with "pymdown-extensions>=10.0" mkdocs build \
${UVX_BIN} --with "mkdocs-material<10.0" --with "pymdown-extensions>=10.0" --with "mkdocs<2.0" mkdocs build \
-f "$(MKDOCS_CONFIG)" \
-d "$$MKDOCS_OUTPUT_ABS"; \
else \
@@ -85,7 +85,7 @@
# Useful for local development and previewing changes.
mkdocs-serve: install-uv ## serve MkDocs site with live reload
@if [ -f "$(MKDOCS_CONFIG)" ]; then \
${UVX_BIN} --with mkdocs-material --with "pymdown-extensions>=10.0" mkdocs serve \
${UVX_BIN} --with "mkdocs-material<10.0" --with "pymdown-extensions>=10.0" --with "mkdocs<2.0" mkdocs serve \
-f "$(MKDOCS_CONFIG)"; \
else \
printf "${RED}[ERROR] $(MKDOCS_CONFIG) not found${RESET}\n"; \
2 changes: 1 addition & 1 deletion .rhiza/make.d/marimo.mk
@@ -61,7 +61,7 @@ marimushka:: install-uv ## export Marimo notebooks to HTML
OUTPUT_DIR="$$CURRENT_DIR/${MARIMUSHKA_OUTPUT}"; \
cd "${MARIMO_FOLDER}" && \
UVX_DIR=$$(dirname "$$(command -v uvx || echo "${INSTALL_DIR}/uvx")") && \
"${UVX_BIN}" "marimushka>=0.1.9" export --notebooks "." --output "$$OUTPUT_DIR" --bin-path "$$UVX_DIR" && \
"${UVX_BIN}" "marimushka>=0.3.3" export --notebooks "." --output "$$OUTPUT_DIR" --bin-path "$$UVX_DIR" && \
cd "$$CURRENT_DIR"; \
fi; \
fi